Matrix subtraction normalization

Hello everyone.

Recently I had to implement an algorithm that subtracts a blurred version of an image from the original image. I first tried:

Mat img = imread("some_image", IMREAD_GRAYSCALE);  // single channel, to match the uchar access below
Mat blurred;
GaussianBlur(img, blurred, Size(7, 7), 0, 0);
Mat result = img - blurred;

But my output (result) was displayed as an almost entirely black image.

So I found this normalization step to solve the problem:

result_pixel = (pixel_image - pixel_blurred_image) / 2 + 127, applied to each pixel of the image:

void sub(Mat & src, Mat & src2, Mat & result) {

        for (int i = 0; i < src.cols; i++) {
                for (int j = 0; j < src.rows; j++) {

                        int px1 = int(src.at<uchar>(j, i));
                        int px2 = int(src2.at<uchar>(j, i));
                        int px  = ((px1 - px2) / 2) + 127;

                        result.at<uchar>(j, i) = px;

                }
        }
}

This kind of normalization seems trivial to me, so I was wondering: doesn't OpenCV already provide an option to apply this normalization automatically?
