Method of 10-bit operation
I want to display a UHD 10-bit image on a monitor, using Visual Studio 2010 (C++, Unicode mode) and OpenCV 2.4.10.

First I created an 8-bit IplImage:

    pCvMainImage = cvCreateImage(cvSize(3840, 2160), IPL_DEPTH_8U, 3);

When I write values 0 ~ 255 into pCvMainImage->imageData, the monitor outputs 0 ~ 1023:

    imageData input : 0 ~ 255
    monitor output  : 0 ~ 1023

so I think the graphics card or the monitor converts to 10 bits automatically. Then I made a 16-bit IplImage instead:

    pCvMainImage = cvCreateImage(cvSize(3840, 2160), IPL_DEPTH_16U, 3);

and wrote values 0 ~ 1023 into pCvMainImage->imageData, but the monitor output repeats every 256 input values:

    imageData input : 0 ~ 255  | 256 ~ 511 | 512 ~ 767 | 768 ~ 1023
    monitor output  : 0 ~ 1023 | 0 ~ 1023  | 0 ~ 1023  | 0 ~ 1023
I create the window with cvNamedWindow("Example", CV_WINDOW_AUTOSIZE) and display with cvShowImage("Example", pCvMainImage). The cause is unknown to me. Thanks for your concern.
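For reference, a minimal sketch that reproduces the 16-bit setup described above; the horizontal grey-ramp fill is my own test pattern, since the question only shows the create/show calls:

    #include <opencv2/core/core_c.h>
    #include <opencv2/highgui/highgui_c.h>

    int main()
    {
        // Same image layout as in the question: UHD, 16 bits per channel, 3 channels (BGR).
        IplImage* pCvMainImage = cvCreateImage(cvSize(3840, 2160), IPL_DEPTH_16U, 3);

        for (int y = 0; y < pCvMainImage->height; ++y)
        {
            // widthStep is in bytes, so cast each row pointer to unsigned short.
            unsigned short* row = (unsigned short*)
                (pCvMainImage->imageData + y * pCvMainImage->widthStep);
            for (int x = 0; x < pCvMainImage->width; ++x)
            {
                // Horizontal ramp covering the 10-bit range 0..1023 (assumed test data).
                unsigned short v = (unsigned short)((x * 1024) / pCvMainImage->width);
                row[3 * x + 0] = v;   // B
                row[3 * x + 1] = v;   // G
                row[3 * x + 2] = v;   // R
            }
        }

        cvNamedWindow("Example", CV_WINDOW_AUTOSIZE);
        cvShowImage("Example", pCvMainImage);
        cvWaitKey(0);
        cvReleaseImage(&pCvMainImage);
        return 0;
    }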
unrelated, but please avoid OpenCV's deprecated C API. development moved away from it in 2010 already, and you should use cv::Mat (C++) today.
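To illustrate the comment, the same display code with the C++ API might look like the sketch below; the variable and window names are placeholders, not code from the question. Note that according to the OpenCV 2.4 documentation, imshow divides 16-bit pixel values by 256 before display, so a 0 ~ 1023 range will appear nearly black in the window:

    #include <opencv2/core/core.hpp>
    #include <opencv2/highgui/highgui.hpp>

    int main()
    {
        // 16-bit unsigned, 3-channel UHD image (CV_16UC3 replaces IPL_DEPTH_16U + 3 channels).
        cv::Mat img(2160, 3840, CV_16UC3);

        for (int y = 0; y < img.rows; ++y)
        {
            cv::Vec3w* row = img.ptr<cv::Vec3w>(y);  // Vec3w = 3 x unsigned short
            for (int x = 0; x < img.cols; ++x)
            {
                // Same assumed 0..1023 ramp as the C-API sketch above.
                unsigned short v = (unsigned short)((x * 1024) / img.cols);
                row[x] = cv::Vec3w(v, v, v);
            }
        }

        // imshow rescales 16-bit data (divides by 256), so this is for inspection only.
        cv::imshow("Example", img);
        cv::waitKey(0);
        return 0;
    }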
somehow it's quite unclear what the error / problem is. can you be a bit more explicit?
In my opinion, internally you will have to use an appropriate data type (CV_16U rather than CV_16S, since the 0 ~ 1023 values are unsigned) and work in the corresponding value range. Externally, you will have to save your image in a format that keeps more than 8 bits per channel (for example, 16-bit PNG or TIFF, both of which OpenCV's imwrite can write).
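A minimal sketch of that suggestion, assuming the 10-bit data lives in a CV_16U Mat; the scale factor 64 (which spreads 0 ~ 1023 over the full 16-bit range), the mid-grey dummy content, and the filename are all illustrative:

    #include <opencv2/core/core.hpp>
    #include <opencv2/highgui/highgui.hpp>

    int main()
    {
        // 10-bit values held in an unsigned 16-bit Mat (mid-grey 512 as dummy content).
        cv::Mat img10(2160, 3840, CV_16UC3, cv::Scalar(512, 512, 512));

        // Scale 0..1023 up to the full 16-bit range: 1023 * 64 = 65472.
        cv::Mat img16;
        img10.convertTo(img16, CV_16U, 64.0);

        // PNG preserves all 16 bits per channel, unlike an 8-bit window blit.
        cv::imwrite("output_10bit.png", img16);
        return 0;
    }

Keep in mind that this only preserves the precision in the file; as far as I know, highgui renders through an ordinary 8-bit window, so actually driving the monitor at 10 bits generally needs a 10-bit-capable GPU/driver and an OpenGL/Direct3D output path, which cvShowImage does not provide.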