Hello all,
I'm running into a problem reading a 10-bit, 3-channel encoded video saved from a high-definition camera we have. I checked the file with GSpot, and it is encoded using the "Optibase VideoPump 10-bit 4:2:2 Component Y'CbCr" codec (v210).
I can read frames from the file using OpenCV's VideoCapture class, and I can even display them just fine with OpenCV's imshow function. Unfortunately, the cv::Mat that each frame is read into is of an 8-bit type (CV_8UC3).
Here is the code that I am using to read the video file:
std::cout<< "Attempting to open file: "<< filename<< std::endl;
cv::VideoCapture videoIn;
videoIn.open( filename);
if( !videoIn.isOpened())
throw "Error when reading stream!";
// Set up the ViBe-based background segmenter.
if( !videoIn.read( frame))
return 0;
int type= frame.type(); // Returns 16 (CV_8UC3)
Like I said earlier, frames are being read and displayed, but frame.type() returns 16 (CV_8UC3). Since the whole reason for recording at 10 bits is the extra precision, this is suboptimal.
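One idea I've been considering, though I haven't been able to confirm the FFmpeg backend honors it for v210, is disabling VideoCapture's automatic conversion so that read() hands back unconverted frame data (cv::CAP_PROP_CONVERT_RGB is the OpenCV 3.x name for the property; older versions call it CV_CAP_PROP_CONVERT_RGB):

cv::VideoCapture rawIn(filename);
rawIn.set(cv::CAP_PROP_CONVERT_RGB, false); // ask the backend not to convert to 8-bit BGR
cv::Mat raw;
if (rawIn.read(raw))
    std::cout << "raw frame type: " << raw.type()
              << ", size: " << raw.cols << "x" << raw.rows << std::endl;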
Does anyone know how to actually read the frames at their full 10-bit depth?
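If VideoCapture simply can't do it, my fallback plan is to demux the raw v210 stream and unpack it myself. As I understand the v210 layout, each 16-byte group packs six pixels as four little-endian 32-bit words, with three 10-bit components per word (bits 0-9, 10-19, and 20-29). If I go that route, I imagine the unpacking would look roughly like this sketch (unpackV210Group is just a hypothetical helper name):

#include <cstdint>

// Sketch: unpack one 16-byte v210 group (six pixels) into 10-bit values.
// Component order across the four words is:
// Cb0 Y0 Cr0 | Y1 Cb1 Y2 | Cr1 Y3 Cb2 | Y4 Cr2 Y5
void unpackV210Group(const uint8_t* src, uint16_t y[6], uint16_t cb[3], uint16_t cr[3])
{
    // Assemble the four little-endian 32-bit words.
    uint32_t w[4];
    for (int i = 0; i < 4; ++i)
        w[i] = src[4*i] | (src[4*i + 1] << 8) | (src[4*i + 2] << 16)
             | (uint32_t(src[4*i + 3]) << 24);

    // Mask out the three 10-bit components of each word.
    cb[0] = w[0] & 0x3FF;  y[0]  = (w[0] >> 10) & 0x3FF;  cr[0] = (w[0] >> 20) & 0x3FF;
    y[1]  = w[1] & 0x3FF;  cb[1] = (w[1] >> 10) & 0x3FF;  y[2]  = (w[1] >> 20) & 0x3FF;
    cr[1] = w[2] & 0x3FF;  y[3]  = (w[2] >> 10) & 0x3FF;  cb[2] = (w[2] >> 20) & 0x3FF;
    y[4]  = w[3] & 0x3FF;  cr[2] = (w[3] >> 10) & 0x3FF;  y[5]  = (w[3] >> 20) & 0x3FF;
}

But I'd much rather get the decoded 10-bit frames out of OpenCV directly, if that's possible.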
Thanks!