# Can StereoSGBM support 16-bit single channel?

Hello guys,

When I tried to use StereoSGBM's compute to produce a disparity map from 16-bit single-channel images via Python + OpenCV 3, OpenCV raised an error like "the left image's depth doesn't match 8 bit single channel". Looking at the code [1], the current OpenCV restricts the input depth to CV_8U (i.e. uint8), which is why I got the error.

Then, when I rebuilt OpenCV with the assertion changed to CV_16U (i.e. uint16, a 16-bit channel), it seemed to work well, even though I expected problems such as a compile error or a core dump at runtime.

Admittedly, I'm a newbie to this area and I'm not sure whether it really works correctly, so I'm wondering if someone could tell me whether this change could still hide problems or issues (or, yes, that it's no problem!).
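For reference, a common workaround with an unmodified OpenCV build is to scale the 16-bit pair down to 8 bit before calling compute. This is only a sketch with made-up image shapes; the shift-based scaling simply keeps the top 8 bits of each sample:

```python
import numpy as np

# Hypothetical 16-bit stereo pair (random data; shapes chosen arbitrarily).
left16 = np.random.randint(0, 65536, (240, 320), dtype=np.uint16)
right16 = np.random.randint(0, 65536, (240, 320), dtype=np.uint16)

# Stock StereoSGBM asserts CV_8U input, so reduce the depth first.
# A right shift by 8 keeps the most significant 8 bits of each sample.
left8 = (left16 >> 8).astype(np.uint8)
right8 = (right16 >> 8).astype(np.uint8)
```

The resulting left8/right8 arrays can then be fed to cv2.StereoSGBM_create(...).compute(...) without tripping the depth assertion, at the cost of the discarded low-order bits.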

Thanks and Best,

Kota


Yes, that's strange but possible. Afterwards you have to check whether it's OK with both the computeDisparity3WaySGBM and computeDisparitySGBM methods. You can try to call StereoSGBM on a synthetic image pair with a known disparity map and check the result.

You can also open an issue for the OpenCV developers.
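The synthetic-pair idea above can be sketched in pure NumPy (the image size and disparity value are arbitrary assumptions): the right view is the left view shifted horizontally, so every valid pixel has a known, constant disparity:

```python
import numpy as np

DISP = 4  # known ground-truth disparity, in pixels

# Random "left" view; the "right" view is the same image shifted
# DISP pixels to the left, so each valid pixel's true disparity is DISP.
rng = np.random.default_rng(0)
left = rng.integers(0, 256, (64, 128), dtype=np.uint8)
right = np.zeros_like(left)
right[:, :-DISP] = left[:, DISP:]
```

Running a (modified) StereoSGBM on such a pair should yield values close to DISP * 16 in the valid region, since OpenCV returns fixed-point disparities scaled by 16.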

( 2016-05-25 01:35:46 -0500 )

Why are you trying to load a depth map into a stereo algorithm? The depth (disparity) map is the output of the algorithm.

( 2016-05-31 02:36:55 -0500 )

Ah, sorry for my poor English. I meant that I'd like to use 16-bit stereo image files as input, and I expect the output to be the depth map you mentioned.

( 2016-05-31 03:43:44 -0500 )


It does not work well. Please take a look at the PixType typedef, which is used to iterate over the image pixels: https://github.com/Itseez/opencv/blob...

You do not catch any compile-time errors because the type is checked at runtime (on the line you changed); this is the common way the cv::Mat class is used. You do not catch any runtime errors because invalid memory accesses are not detected on some platforms and in some build configurations. If you run a debug build of your program, or use a tool such as valgrind to detect memory errors, you should see them.
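To illustrate the point about runtime checking: a cv::Mat carries its element type as a runtime value, so a CV_Assert on the depth behaves roughly like the following Python mock-up (the function name is invented for illustration; the message echoes the error from the question):

```python
import numpy as np

def check_input_depth(img):
    """Mock of OpenCV's runtime CV_Assert on input depth (illustrative only)."""
    # The element type is a runtime property of the array/matrix, so a
    # wrong depth is only rejected when this is called, never at
    # compile/import time.
    if img.dtype != np.uint8:
        raise TypeError("the left image's depth doesn't match 8 bit single channel")
```

This is why simply relaxing the assertion compiles fine: nothing else in the build verifies that the rest of the code can actually handle the wider type.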

That said, you can try to change it to ushort and recompile OpenCV; then, maybe, it will work fine.


Thanks, mshabunin, for your suggestion. Indeed, valgrind showed a bunch of memory errors.

Quick report for my result:

I tried making PixType ushort, but it didn't work. The first compile failed around [1], which looks like SIMD-acceleration code. To figure out whether it is possible without SIMD, I dropped the SSE compile option; the build then succeeded, but I got a segmentation fault at runtime around the loop here [2].

I'm still digging into this, but if you know anything about that area, it would definitely help me understand what's wrong.

( 2016-05-26 03:28:27 -0500 )

It looks like one problem is in const PixType* tab, which is meant to be a pixel-value translation table with a fixed size: TAB_SIZE = 256 + TAB_OFS*2 = 9 * 256. In this place the table size is hardcoded with the u8 type in mind, and the same goes for the SIMD implementations. There may be even more such places.
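Just to spell out the table arithmetic above (TAB_OFS = 256*4 is inferred from the 9 * 256 figure; treat it as an assumption about the source): with ushort pixels the value range grows from 256 to 65536, so the hardcoded table would be far too small and indexing it with 16-bit values reads out of bounds.

```python
# Table sizing as described above, which assumes 8-bit pixels:
TAB_OFS = 256 * 4             # inferred from TAB_SIZE = 9 * 256
TAB_SIZE = 256 + TAB_OFS * 2  # 2304 entries, i.e. 9 * 256

# The same layout for ushort pixels would need a much larger table;
# with the 8-bit-sized table, 16-bit pixel values index past the end.
USHORT_TAB_SIZE = 65536 + TAB_OFS * 2
```

That out-of-bounds indexing is consistent with the valgrind memory errors and the segfault reported in the comments.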

So I'd recommend sticking with the CV_8U type unless you want to rework the whole algorithm.

If you need better quality, please take a look at the stereo module in opencv_contrib, which contains BM and SGBM algorithms implemented with a binary cost type, and the disparity-smoothing filter in the ximgproc module.

( 2016-05-30 04:18:52 -0500 )
