
cv::normalize not working for 16UC1

asked 2017-02-14 01:35:36 -0600

Nbb

I am trying to scale the values of my matrix ucm, which is of type 16UC1. The current min and max of ucm are 0 and 1, respectively.

cv::normalize(ucm, ucm, 0, 65535, cv::NORM_L2, CV_16UC1);

double min, max;
cv::minMaxLoc(ucm, &min, &max);
cout << min << " " << max << endl;

The above code, however, results in a min and max of 0. I never experienced this problem before when scaling matrices of type 8UC1.
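For reference, here is a minimal self-contained version of what I am running, using a small made-up 2x2 matrix in place of my real ucm:

#include <opencv2/opencv.hpp>
#include <iostream>
using namespace cv;
using namespace std;

int main()
{
    // Made-up 16UC1 matrix whose min is 0 and max is 1, like my real ucm
    Mat ucm = (Mat_<ushort>(2, 2) << 0, 1, 1, 0);

    cv::normalize(ucm, ucm, 0, 65535, cv::NORM_L2, CV_16UC1);

    double min, max;
    cv::minMaxLoc(ucm, &min, &max);
    cout << min << " " << max << endl;   // prints "0 0" instead of the expected "0 65535"
}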


2 answers


answered 2017-02-14 02:34:02 -0600

LBerger

updated 2017-02-14 03:59:53 -0600

There is no problem: you just need to swap the parameters 0 and 65535:

Mat ucm = (Mat_<ushort>(2, 2) << 1, 2, 3, 4);
cout << ucm << endl;
cv::normalize(ucm, ucm, 65535, 0, cv::NORM_L2, CV_16UC1); // alpha = 65535 is the target L2 norm
cout << ucm << endl;
double min, max;
cv::minMaxLoc(ucm, &min, &max);
cout << min << " " << max << endl;

and the result:

[1, 2;
 3, 4]
[11965, 23930;
 35895, 47860]
11965 47860

47860 = 4 * 65535 / sqrt(1*1 + 2*2 + 3*3 + 4*4)
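In general, NORM_L2 scales every element by alpha / ||src||_2, where alpha is the third argument, which is why passing alpha = 0 as in the question zeroes the whole matrix. A rough sketch of the same computation in plain C++, just to illustrate the formula:

#include <cmath>
#include <iostream>

int main()
{
    // Same values as the 2x2 example above
    double src[] = {1, 2, 3, 4};
    double alpha = 65535;                       // requested L2 norm

    double norm = 0;                            // ||src||_2 = sqrt(1 + 4 + 9 + 16) = sqrt(30)
    for (double v : src) norm += v * v;
    norm = std::sqrt(norm);

    for (double v : src)                        // each element becomes v * alpha / ||src||_2
        std::cout << v * alpha / norm << " ";   // prints roughly 11965 23930 35895 47860
    std::cout << std::endl;
}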


answered 2017-02-14 02:37:13 -0600

kbarni

If you want to normalize between two values, use NORM_MINMAX. Your code should be:

cv::normalize(ucm, ucm, 0, 65535, cv::NORM_MINMAX, CV_16UC1);

Otherwise the image is rescaled so that its L2 norm equals alpha = 0 (so your whole image becomes 0).

See the docs for more info
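A small self-contained sketch (using a made-up 2x2 matrix rather than the real ucm) showing that NORM_MINMAX stretches the values to the requested [0, 65535] range:

#include <opencv2/opencv.hpp>
#include <iostream>
using namespace cv;
using namespace std;

int main()
{
    // Made-up 16UC1 matrix with min 0 and max 1, like the ucm in the question
    Mat ucm = (Mat_<ushort>(2, 2) << 0, 1, 1, 0);

    // NORM_MINMAX maps the smallest value to 0 and the largest to 65535
    cv::normalize(ucm, ucm, 0, 65535, cv::NORM_MINMAX, CV_16UC1);

    double min, max;
    cv::minMaxLoc(ucm, &min, &max);
    cout << min << " " << max << endl;   // should print "0 65535"
}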

