Pixel intensities are different between OpenCV and MATLAB

Hello, I am trying to convert MATLAB code to C++ code with OpenCV, and it's killing me. The pixel intensities of the same image are different between OpenCV and MATLAB. The code I use in OpenCV to print the pixel intensities:

    Mat img1 = imread("image1.jpg");           // imread loads the image in BGR channel order
    Mat gray_image1;
    cvtColor(img1, gray_image1, CV_BGR2GRAY);  // gray_image1 is CV_8UC1: one uchar per pixel
    int Height = gray_image1.rows;
    int Width  = gray_image1.cols;
    for (int m = 0; m < Height; m++) {
        for (int n = 0; n < Width; n++) {
            printf("%d ", gray_image1.at<uchar>(m, n));
            // gray_image1.at<float>(m, n) would be wrong here: the data is uchar, not float
        }
        printf("\n");
    }

and the code used in MATLAB:

Im1 = imread('image1.jpg');

In MATLAB I just set a breakpoint to inspect the pixel intensities of Im1. The difference for the same pixel is large: for example, the last pixel is about 180 in OpenCV while it is 200 in MATLAB. Could anyone tell me how to solve this problem? Also, what is the data type of a gray_image1 pixel in my code, float or int? And how do I convert it to float to improve the accuracy?
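
To be concrete about the last two questions, here is a minimal sketch of what I think the type check and float conversion would look like; it assumes the standard Mat::type and Mat::convertTo API, and gray_float is just a name I made up for the converted matrix (I have not verified whether this helps with the discrepancy):

    // Sketch (assumption): verify the element type, then convert to 32-bit float.
    // CV_8UC1 means 8-bit unsigned, single channel, so at<uchar> is the correct accessor.
    CV_Assert(gray_image1.type() == CV_8UC1);

    Mat gray_float;
    gray_image1.convertTo(gray_float, CV_32F);              // values stay in [0, 255], stored as float
    // or, scaled to [0, 1] like MATLAB's im2double:
    // gray_image1.convertTo(gray_float, CV_32F, 1.0 / 255.0);

    printf("%f\n", gray_float.at<float>(0, 0));             // at<float> is valid on the converted Mat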