Pixel intensities are different between OpenCV and MATLAB
Hello, I am trying to convert MATLAB code to C++ with OpenCV, and it's killing me. The pixel intensities of the same image differ between OpenCV and MATLAB. The code I use in OpenCV to print the pixel intensities:
Mat img1 = imread("image1.jpg");
Mat gray_image1;
cvtColor(img1, gray_image1, CV_BGR2GRAY); // imread in OpenCV reads channels in BGR order
int Height = gray_image1.rows;
int Width = gray_image1.cols;
for (int m = 0; m < Height; m++) {
    for (int n = 0; n < Width; n++) {
        // gray_image1 is CV_8UC1 here, so uchar is the correct element type;
        // at<float> would reinterpret the bytes and print garbage
        printf("%d ", gray_image1.at<uchar>(m, n));
    }
    printf("\n");
}
and the code used in MATLAB:
Im1 = imread('image1.jpg');
In MATLAB I just set a breakpoint to inspect the pixel intensities of Im1. The difference at the same pixel is large; for example, the last pixel is about 180 in OpenCV but about 200 in MATLAB. Could anyone tell me how to solve this problem? Also, what is the datatype of a gray_image1 pixel in my code, float or int? And how do I convert it to float to improve the accuracy?
Please re-try this comparison with BMP or PNG images, not with lossy JPEGs.
Also, either load directly as grayscale:
imread(name, 0) // 0 == IMREAD_GRAYSCALE
or access the 3-channel image with image.at<Vec3b>(m,n)
Hi berak, I made a mistake earlier using cv::Rect, so I didn't crop the image as intended. Thanks for your input.
Hi, I'm facing the same issue. I tried with BMP and PNG as well but still get different results. @jason, can you please tell me how you got the two to match?