Method of 10-bit operation

asked 2016-10-12 22:44:15 -0600

Ltg

I am trying to display a UHD 10-bit image on a monitor using Visual Studio 2010 (C++, Unicode mode) and OpenCV 2.4.10. With the IplImage structure pCvMainImage = cvCreateImage(cvSize(3840, 2160), IPL_DEPTH_8U, 3), writing values 0 ~ 255 to pCvMainImage->imageData gives this monitor output:

imageData input: 0 ~ 255 → monitor output: 0 ~ 1023

I think the graphics card or the monitor expands to 10-bit automatically. But when I created the IplImage structure pCvMainImage = cvCreateImage(cvSize(3840, 2160), IPL_DEPTH_16U, 3) and wrote values 0 ~ 1023 to pCvMainImage->imageData, the monitor output was:

imageData input: 0 ~ 255 → monitor output: 0 ~ 1023
imageData input: 256 ~ 512 → monitor output: 0 ~ 1023
imageData input: 513 ~ 767 → monitor output: 0 ~ 1023
imageData input: 768 ~ 1023 → monitor output: 0 ~ 1023

I display the image with cvNamedWindow("Example", CV_WINDOW_AUTOSIZE) and cvShowImage("Example", pCvMainImage). The cause is unknown to me. Thanks for your concern.
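
For reference, a minimal, self-contained repro of the 16-bit case described above (a sketch only; the gradient fill is illustrative, not the actual image data):

    #include <opencv2/core/core_c.h>
    #include <opencv2/highgui/highgui_c.h>

    int main()
    {
        // 16-bit unsigned, 3-channel UHD image, as in the question
        IplImage* pCvMainImage = cvCreateImage(cvSize(3840, 2160), IPL_DEPTH_16U, 3);

        // Fill a horizontal gradient over the 10-bit range 0..1023
        for (int y = 0; y < pCvMainImage->height; ++y)
        {
            unsigned short* row =
                (unsigned short*)(pCvMainImage->imageData + y * pCvMainImage->widthStep);
            for (int x = 0; x < pCvMainImage->width; ++x)
            {
                unsigned short v = (unsigned short)(x * 1024 / pCvMainImage->width);
                row[3 * x + 0] = v;  // B
                row[3 * x + 1] = v;  // G
                row[3 * x + 2] = v;  // R
            }
        }

        cvNamedWindow("Example", CV_WINDOW_AUTOSIZE);
        cvShowImage("Example", pCvMainImage);
        cvWaitKey(0);
        cvReleaseImage(&pCvMainImage);
        return 0;
    }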


Comments

Unrelated, but please avoid using OpenCV's deprecated C API; it was superseded back in 2010, and you should use cv::Mat (C++) today.

berak ( 2016-10-13 00:33:49 -0600 )
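
For example, a cv::Mat-based equivalent of the 16-bit code in the question might look like this (a sketch; the constant mid-range fill is just a placeholder for real data):

    #include <opencv2/core/core.hpp>
    #include <opencv2/highgui/highgui.hpp>

    int main()
    {
        // C++-API equivalent of the IplImage code above:
        // 16-bit unsigned, 3 channels, filled with a mid-range 10-bit value
        cv::Mat img(2160, 3840, CV_16UC3, cv::Scalar::all(512));

        cv::namedWindow("Example", cv::WINDOW_AUTOSIZE);
        cv::imshow("Example", img);
        cv::waitKey(0);
        return 0;
    }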

Somehow, it's quite unclear what the error / problem is. Can you be a bit more explicit?

berak ( 2016-10-13 00:35:44 -0600 )

In my opinion, internally you will have to use the appropriate data type (maybe CV_16S) and apply the corresponding color range. Externally, you will have to save your image in a compatible format (something like JPEG HDR?).

Eduardo ( 2016-10-13 07:06:27 -0600 )
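
A related point: the highgui documentation says imshow / cvShowImage divides 16-bit unsigned pixel values by 256 before display, so raw 10-bit values (0 ~ 1023) would render as 0 ~ 3, nearly black. Whether that explains the exact output reported above is unclear, but stretching the 10-bit data to the full 16-bit range before display is a reasonable first step. A sketch (the constant fill is a placeholder):

    #include <opencv2/core/core.hpp>
    #include <opencv2/highgui/highgui.hpp>

    int main()
    {
        // 10-bit data stored in a 16-bit Mat (values 0..1023);
        // the constant fill stands in for real image data.
        cv::Mat img10(2160, 3840, CV_16UC3, cv::Scalar::all(1023));

        // imshow divides 16-bit pixels by 256 for display, so stretch
        // 0..1023 to the full 16-bit range 0..65535 before showing.
        cv::Mat img16;
        img10.convertTo(img16, CV_16U, 65535.0 / 1023.0);

        cv::imshow("Example", img16);
        cv::waitKey(0);
        return 0;
    }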