HSV-Range iOS vs. Mac OS [closed]

asked 2012-12-19 03:08:38 -0600 by iEng

updated 2012-12-19 03:21:17 -0600

I have written a short piece of code that detects regions with a certain hue in an image (12-17 on OpenCV's 0-180 hue scale). When I run it on iOS it works perfectly fine. When I run the same code on Mac OS as a C++ project, the output consists only of pixels whose hue is around 150 on the 0-180 scale. How is this possible? Am I missing something? Both projects use OpenCV 2.4.3. The only difference is the way the images are loaded into memory: the C++ version reads the file directly, while the Objective-C version loads it as a UIImage first. But this really should not make a difference, since the original image is displayed correctly.

This is my code:

#include <opencv2/opencv.hpp>

int main(int argc, char** argv)
{
    const char* filename = argc >= 2 ? argv[1] : "/opencvimgs/1.jpg";
    cv::Mat img = cv::imread(filename, 1); // load as 3-channel colour image
    cv::imshow("Original", img);

    // Convert to HSV and isolate the hue channel
    cv::Mat hsv;
    cv::cvtColor(img, hsv, CV_RGB2HSV);
    std::vector<cv::Mat> channels;
    cv::split(hsv, channels);

    // Mask of all pixels whose hue lies in [12, 17] (0-180 scale)
    cv::inRange(channels[0], 12, 17, channels[0]);
    cv::Mat fg = channels[0];
    cv::imshow("Foreground", fg);

    cv::waitKey(0);
    return 0;
}

Edit: I just discovered that the 12-17 range on iOS roughly, but not exactly, maps to (90+12)-(90+17) on Mac OS, the scale still being 0-180.
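
A shift like that looks to me like a red/blue channel swap: swapping R and B maps a hue h to (120 - h) mod 180 on OpenCV's 0-180 scale, so 12-17 would land around 103-108, close to what I measured. Since imread returns BGR data, while the UIImage path apparently yields RGB, my working guess is that CV_RGB2HSV is only correct on one of the two platforms. A tiny sanity check (the orange sample pixel is made up for illustration):

// Sanity check (drop into main() above): one orange pixel stored
// in BGR order, exactly as imread() would store it.
cv::Mat bgr(1, 1, CV_8UC3, cv::Scalar(0, 128, 255)); // B=0, G=128, R=255
cv::Mat hsv1, hsv2;
cv::cvtColor(bgr, hsv1, CV_BGR2HSV); // data treated as BGR -> hue ~15
cv::cvtColor(bgr, hsv2, CV_RGB2HSV); // data treated as RGB -> hue ~105 (about 120 - 15)
std::cout << "BGR2HSV hue: " << (int)hsv1.at<cv::Vec3b>(0, 0)[0]
          << ", RGB2HSV hue: " << (int)hsv2.at<cv::Vec3b>(0, 0)[0] << std::endl; // needs <iostream>

If that really is the cause, using CV_BGR2HSV in the Mac build (and keeping CV_RGB2HSV only where the data actually is RGB) should give both platforms the same mask.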


Closed for the following reason: question is not relevant or outdated, by sturkmen
close date 2020-10-08 11:40:35.860713