# From 3D point cloud to disparity map

Using a stereo-calibrated camera rig, I've obtained a disparity map. Using the reprojectImageTo3D() function, I have the 3D point cloud. I want to do some filtering and segmentation on the point cloud and then re-render it as a disparity image.

Is there an elegant way to do this? I know about projectPoints(), and I found this rendering example http://opencv.jp/opencv2-x-samples/point-cloud-rendering, but it offers a free-viewpoint view of the point cloud. I would like to fix the viewpoint so that the output I get is actually the same as the disparity map.



Hi AgentCain,

I don't know if there is a function that implements this for you, but going back from the 3D point cloud to disparity is relatively easy if you know the reprojection matrix Q (one of the arguments to reprojectImageTo3D()).

The matrix Q has the following structure:

Q = | 1  0  0  -Cx |
    | 0  1  0  -Cy |
    | 0  0  0   f  |
    | 0  0  a   b  |


So assuming that your 3D point has the coordinates (X', Y', Z'), the disparity d and its position in the disparity image (Ix, Iy) can be calculated as follows:

d = (f - Z' * b ) / ( Z' * a)

Ix = X' * ( d * a + b ) + Cx

Iy = Y' * ( d * a + b ) + Cy

I attach a picture of the math, just in case you want to know the details. I hope this helps.
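The formulas above can be sketched in a few lines of numpy. The calibration values below (f, Cx, Cy, and a = -1/Tx with a 0.12 m baseline, b = 0) are made-up placeholders; substitute the entries of your own Q matrix:

```python
import numpy as np

# Hypothetical calibration values -- read these out of your own Q matrix.
f, Cx, Cy = 700.0, 320.0, 240.0   # focal length and principal point
a, b = 1.0 / 0.12, 0.0            # a = -1/Tx (here Tx = -0.12 m), b = (Cx - Cx')/Tx

def point_to_disparity(X, Y, Z):
    """Invert reprojectImageTo3D(): map one 3D point back to (Ix, Iy, d)."""
    d = (f - Z * b) / (Z * a)
    w = d * a + b                 # homogeneous scale W = a*d + b
    Ix = X * w + Cx
    Iy = Y * w + Cy
    return Ix, Iy, d

# Round trip: pixel (400, 260) with disparity 35 -> 3D point -> back again.
x, y, d = 400.0, 260.0, 35.0
W = a * d + b
X, Y, Z = (x - Cx) / W, (y - Cy) / W, f / W
print(point_to_disparity(X, Y, Z))   # approximately (400.0, 260.0, 35.0)
```

The round trip at the end is a quick sanity check that the inversion matches the forward mapping used by reprojectImageTo3D().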


This really helps, thank you very much! Do you also know where the center of the coordinate system is? I think it's the dead center of the reprojectImageTo3D() output.

(2012-11-22 04:00:07 -0500)

If I am not mistaken, the center of the coordinate system is located at the left camera's center of projection, with X pointing right, Y pointing down, and Z (depth) pointing out of the camera.

(2012-11-22 04:29:31 -0500)

I think you can calculate the distance between each point in the cloud and its projection onto the image plane, multiply it by the proper factor, and obtain your disparity value.
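Another way to look at this: since reprojectImageTo3D() applies Q to the homogeneous pixel (x, y, d, 1), you can go back by applying Q⁻¹ to each 3D point and dividing by the last homogeneous coordinate. A vectorized sketch, using placeholder Q entries (substitute the Q returned by your stereoRectify() call):

```python
import numpy as np

# Hypothetical Q matrix -- use the one returned by your stereoRectify() call.
f, Cx, Cy = 700.0, 320.0, 240.0
a, b = 1.0 / 0.12, 0.0
Q = np.array([[1, 0, 0, -Cx],
              [0, 1, 0, -Cy],
              [0, 0, 0,   f],
              [0, 0, a,   b]], dtype=np.float64)
Q_inv = np.linalg.inv(Q)

def cloud_to_disparity(points):
    """Map an (N, 3) array of 3D points back to (N, 3) rows of (Ix, Iy, d)."""
    homog = np.hstack([points, np.ones((len(points), 1))])  # (N, 4) homogeneous
    back = homog @ Q_inv.T                                  # apply Q^-1 per point
    return back[:, :3] / back[:, 3:4]                       # normalize by W

# Forward check: push (x, y, d, 1) through Q, normalize, then invert.
pix = np.array([[400.0, 260.0, 35.0, 1.0]])
pt = pix @ Q.T
pt = pt[:, :3] / pt[:, 3:4]
print(cloud_to_disparity(pt))   # recovers roughly [[400., 260., 35.]]
```

The same per-point transform could also be done with cv2.perspectiveTransform and Q⁻¹, which keeps everything in OpenCV.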


The thing is, I know that the output of reprojectImageTo3D() is the same size as the disparity image, so every pixel of the output holds the X, Y, Z values of the corresponding pixel in the disparity image. But how should I calculate the distance of that pixel relative to the camera so that I get the disparity? I don't know how the X, Y, Z axes are oriented or where the center of my XYZ system is relative to the camera.

(2012-11-21 11:16:17 -0500)

Hello,

I need to know whether you worked out your coordinate system and translated the output of reprojectImageTo3D(), as I am working on the same topic. I get results like this:

- X coordinate values always between 350 and 450
- Y coordinate values always between -80 and -30
- Z coordinate values always between 230 and 280, and sometimes 10000 when very far away

