OpenCV Q&A Forum - RSS feed
http://answers.opencv.org/questions/
OpenCV answers (en). Copyright <a href="http://www.opencv.org">OpenCV foundation</a>, 2012-2018. Sun, 08 Mar 2020 11:57:47 -0500

measure the CieLab color of a square
http://answers.opencv.org/question/227301/measure-the-cielab-color-of-a-square/

Hi,
To compute the CIELab color of a square region of an image, should we compute the mean of each RGB channel and then convert that mean color to CIELab, or is there another method?
Thank you !
Bye!

SylvainArd, Sun, 08 Mar 2020 11:57:47 -0500
http://answers.opencv.org/question/227301/

calculate the Euclidean distances among one pixel selected from the object of interest and the rest of the points in the image
http://answers.opencv.org/question/226393/calculate-the-euclidean-distances-among-one-pixel-selected-from-the-object-of-interest-and-the-rest-of-the-points-in-the-image/

An image is taken as input and converted to the CIE-Lab colour space. Now I have to select the object of interest in the image and find the Euclidean distance between one pixel selected from that object and the rest of the points in the image. My problems are:
1. Selecting my object of interest (in the image below I want to select the red chair).
2. Measuring the Euclidean distance from one pixel of that red chair to the rest of the points of the image.
3. Computing the delta values of the image, yielding only one chromatic difference per pixel, referred to as the delta image.
4. Since the chromatic difference between two pixels gives a low value when both pixels are similar and a high value otherwise, such a difference can be interpreted as a probability value of belonging to a particular colour.
![image description](/upfiles/15819160093552865.jpg)
**I have coded this far:**
1. Taken input of the image
2. Performed homography
3. Transformed to CIE-Lab space

After this I am instructed to measure the Euclidean distance.
Please help
titli, Sun, 16 Feb 2020 23:17:00 -0600
http://answers.opencv.org/question/226393/

determining chromatic components from an image in CIE-Lab space to compute Euclidean distance
http://answers.opencv.org/question/226097/determining-chromatic-components-from-an-image-in-cie-lab-space-to-compute-euclidian-distance/

Here is the code that I have:
```python
# -*- coding: utf-8 -*-
"""
Created on Sun Feb  9 00:09:31 2020

@author: Shrouti
"""
import cv2
import numpy as np
from skimage import color

f = 500
rotXval = 90
rotYval = 90
rotZval = 90
distXval = 500
distYval = 500
distZval = 500

def onFchange(val):
    global f
    f = val

def onRotXChange(val):
    global rotXval
    rotXval = val

def onRotYChange(val):
    global rotYval
    rotYval = val

def onRotZChange(val):
    global rotZval
    rotZval = val

def onDistXChange(val):
    global distXval
    distXval = val

def onDistYChange(val):
    global distYval
    distYval = val

def onDistZChange(val):
    global distZval
    distZval = val

if __name__ == '__main__':
    # Read input image and create output image
    src = cv2.imread(r"D:\SHROUTI\Testpictures\handgesture2.png")
    src = cv2.resize(src, (640, 480))
    dst = np.zeros_like(src)
    h, w = src.shape[:2]

    # Create user interface with trackbars that allow modifying the
    # parameters of the transformation (must be created before the loop)
    wndname1 = "Source:"
    wndname2 = "WarpPerspective: "
    cv2.namedWindow(wndname1, 1)
    cv2.namedWindow(wndname2, 1)
    cv2.createTrackbar("f", wndname2, f, 1000, onFchange)
    cv2.createTrackbar("Rotation X", wndname2, rotXval, 180, onRotXChange)
    cv2.createTrackbar("Rotation Y", wndname2, rotYval, 180, onRotYChange)
    cv2.createTrackbar("Rotation Z", wndname2, rotZval, 180, onRotZChange)
    cv2.createTrackbar("Distance X", wndname2, distXval, 1000, onDistXChange)
    cv2.createTrackbar("Distance Y", wndname2, distYval, 1000, onDistYChange)
    cv2.createTrackbar("Distance Z", wndname2, distZval, 1000, onDistZChange)

    # Show original image
    cv2.imshow(wndname1, src)

    k = -1
    while k != 27:  # loop until ESC is pressed
        if f <= 0:
            f = 1
        rotX = (rotXval - 90) * np.pi / 180
        rotY = (rotYval - 90) * np.pi / 180
        rotZ = (rotZval - 90) * np.pi / 180
        distX = distXval - 500
        distY = distYval - 500
        distZ = distZval - 500

        # Camera intrinsic matrix
        K = np.array([[f, 0, w / 2, 0],
                      [0, f, h / 2, 0],
                      [0, 0, 1, 0]])

        # K inverse
        Kinv = np.zeros((4, 3))
        Kinv[:3, :3] = np.linalg.inv(K[:3, :3]) * f
        Kinv[-1, :] = [0, 0, 1]

        # Rotation matrices around the X, Y, Z axes
        RX = np.array([[1, 0, 0, 0],
                       [0, np.cos(rotX), -np.sin(rotX), 0],
                       [0, np.sin(rotX), np.cos(rotX), 0],
                       [0, 0, 0, 1]])
        RY = np.array([[np.cos(rotY), 0, np.sin(rotY), 0],
                       [0, 1, 0, 0],
                       [-np.sin(rotY), 0, np.cos(rotY), 0],
                       [0, 0, 0, 1]])
        RZ = np.array([[np.cos(rotZ), -np.sin(rotZ), 0, 0],
                       [np.sin(rotZ), np.cos(rotZ), 0, 0],
                       [0, 0, 1, 0],
                       [0, 0, 0, 1]])

        # Composed rotation matrix (RX, RY, RZ)
        R = np.linalg.multi_dot([RX, RY, RZ])

        # Translation matrix
        T = np.array([[1, 0, 0, distX],
                      [0, 1, 0, distY],
                      [0, 0, 1, distZ],
                      [0, 0, 0, 1]])

        # Overall homography matrix
        H = np.linalg.multi_dot([K, R, T, Kinv])

        # Apply the transformation and show the result
        cv2.warpPerspective(src, H, (w, h), dst, cv2.INTER_NEAREST, cv2.BORDER_CONSTANT, 0)
        cv2.imshow(wndname2, dst)
        k = cv2.waitKey(1)

    # Convert to Lab colour space (rgb2lab expects RGB, so convert from BGR first)
    Lab = color.rgb2lab(cv2.cvtColor(dst, cv2.COLOR_BGR2RGB))
    print(Lab)
```
What I have tried to do here is apply a homography transformation to an RGB image and also transform it to the CIE-L*a*b* colour space. What I need is to calculate the Euclidean distance between one pixel selected from the object of interest and the rest of the points in the image. This can be done by calculating the level of similarity among the pixels. We also need to compute the delta values of the image, yielding only one chromatic difference per pixel.
There are delta values associated with this colour scale: ΔL, Δa and Δb indicate how much a standard and a sample differ from one another in L, a, and b, respectively. Please help.

titli, Tue, 11 Feb 2020 01:11:40 -0600
http://answers.opencv.org/question/226097/

error using Lab color space
http://answers.opencv.org/question/116917/error-using-lab-color-space/

When I move an object from one position to another, consider two pixels at distance d1 with colors (1,a1,b1) and (1,a2,b2). Why can't I find these two pixels at the same previous distance d1, with the same colors in Lab format? E.g. let the color of p1 be (1,a1,b1) and of p2 be (1,a2,b2), and the distance between them be d1. In the next image, when the rigid object containing these two pixels moves without any change in its orientation, why is the distance between them not the same as before, and why are their a,b values not the same? In Lab space, if a pixel changes from one light intensity to another, its a,b values should still remain the same, shouldn't they? The two given points are voxels.

dineshlama, Sun, 04 Dec 2016 08:48:56 -0600
http://answers.opencv.org/question/116917/

separating lab image into 3 components
http://answers.opencv.org/question/64108/separating-lab-image-into-3-components/

Hello everyone, I want to divide a BGR image into 3 images containing the 3 different CIE Lab color components (each with just one channel). I know that I can do something like this:
```cpp
#include <opencv2/opencv.hpp>

using namespace std;
using namespace cv;

int main( int argc, char** argv )
{
    Mat src = imread( "/home/diego/Documents/sunset.jpg", -1 ), lab;
    cvtColor(src, lab, CV_BGR2Lab);
    imshow("lab", lab);
    waitKey(0);  // wait for a key press so the window stays visible
    return 0;
}
```
And this gives me a result like this:
![image description](/upfiles/14342779333732375.jpeg)
However, I have read that the luminance should be treated as a value between 0 and 100, and a and b as values between -127 and 127 (so I don't know if this image is right or wrong). I would like to ask what is meant by these values: are they the raw pixel values, or are they normalized from the original 0-255 range? Do the three channels of this image represent L (lab[0]), a (lab[1]), and b (lab[2])?
I have checked several webpages, but I only find very theoretical information about how the 3-axis Lab color space is composed and nothing that I can turn into practice.
Thanks and sorry for so many questions.

diegomez_86, Sun, 14 Jun 2015 05:43:12 -0500
http://answers.opencv.org/question/64108/

L*u*v* and L*a*b* Color Space are Non-Euclidean in 8U
http://answers.opencv.org/question/32581/luv-and-lab-color-space-are-non-euclidean-in-8u/

From the docs for Luv:
> In case of 8-bit and 16-bit images, R, G, and B are converted to the floating-point format and scaled to fit 0 to 1 range. [...]
> This outputs 0 <= L <= 100, -134 <= u <= 220, -140 <= v <= 122.
> The values are then converted to the destination data type:
> 8-bit images: L := 255/100 L, u := 255/354 (u + 134), v := 255/262 (v + 140)

A similar re-scaling happens for Lab.
Both Lab and Luv are perceptually Euclidean.
From Wikipedia:
> .. [The] relative perceptual differences between any two colors in Lab can be approximated by treating each color as a point in a three-dimensional space (with three components: L, a, b) and taking *the Euclidean distance between them*.
This Euclidean perceptual property is the main reason for using Lab and Luv.
The problem with the 8-bit re-scaling is that the Euclidean property is not preserved (in `8U`).
Has anyone noticed this before?
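I believe the distortion follows directly from the per-channel scale factors being unequal. A small NumPy sketch (with made-up Luv triples) shows that the 8U map does not preserve distance ratios:

```python
import numpy as np

# Per-channel 8U scale factors from the docs (the offsets cancel in differences).
S = np.diag([255 / 100, 255 / 354, 255 / 262])

# Three hypothetical float-Luv colours: p1 and p2 are equally far from p3.
p1 = np.array([50.0, 10.0, 0.0])
p2 = np.array([50.0, 0.0, 10.0])
p3 = np.array([60.0, 0.0, 0.0])

def dist(a, b):
    return float(np.linalg.norm(a - b))

print(dist(p1, p3), dist(p2, p3))                   # equal in float Luv
print(dist(S @ p1, S @ p3), dist(S @ p2, S @ p3))   # unequal after 8U scaling
```

A uniform scaling would only multiply every distance by the same constant; because the L, u, v axes are stretched by different factors, equal perceptual distances become unequal in 8U, which is exactly the loss of the Euclidean property described above.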
Adi, Wed, 30 Apr 2014 02:10:52 -0500
http://answers.opencv.org/question/32581/

CIE Lab conversion error
http://answers.opencv.org/question/37339/cie-lab-conversion-error/

I am converting a cv::Mat(1,1,CV_32FC3) matrix using cvtColor and CV_BGR2Lab.
The answer I get back does not conform to the formula given in the documentation, though it does at some points, so I am baffled as to what I am doing wrong. I can hardly believe this is wrong in OpenCV.
It *appears* that the cube-root part of the CIE Lab formula is not implemented. So, for example, black (0,0,0) is converted correctly, as is white (1,1,1), and other colors such as blue (1,0,0), green (0,1,0), and red (0,0,1).
However, all in-between colors seem wrong. For example, gray (128/255, 128/255, 128/255), or approximately 0.5020 for each entry, gives me Lab = (53.5850, 0, 0), when it should return the value (76.1895, 0, 0).
I have plotted the entire range of gray, and the cube-root part of OpenCV's implementation seems to be missing.

Greg Walsh, Wed, 16 Jul 2014 17:20:00 -0500
http://answers.opencv.org/question/37339/
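A possible explanation worth checking: the two numbers in the question are both reproduced by the standard L* formula, differing only in whether the gray value is treated as linear reflectance or first linearised as sRGB, which would mean the cube root is present and the discrepancy is a gamma-handling question. A sketch (pure NumPy, reimplementing the CIE formula rather than calling OpenCV):

```python
import numpy as np

def f(t):
    # CIE cube-root function with the linear toe for small t.
    return np.cbrt(t) if t > 0.008856 else 7.787 * t + 16.0 / 116.0

def L_star(Y):
    return 116.0 * f(Y) - 16.0

g = 128.0 / 255.0                    # the gray value from the question

L_linear = L_star(g)                 # input treated as linear: ~76.19
Y = ((g + 0.055) / 1.055) ** 2.4     # sRGB gamma linearisation
L_srgb = L_star(Y)                   # input treated as sRGB: ~53.59

print(L_linear, L_srgb)
```

Since 53.585 matches the sRGB-linearised path and 76.1895 matches the linear path, the conversion may be applying (or the expectation omitting) the sRGB transfer function rather than dropping the cube root.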