3D coordinates of a colored tracked object with stereo vision
http://answers.opencv.org/question/57859/3d-coordinates-of-a-colored-tracked-object-with-stereo-vision/
Thu, 19 Mar 2015 13:25:31 -0500

Hello, there are a lot of topics about 2D to 3D, but I couldn't find my problem covered in them.
So I used the stereo_camera_calibration sample to find the parameters of my cameras. (I followed this blog: [http://blog.martinperis.com/2011/01/opencv-stereo-camera-calibration.html])
Then I use this relation to deduce the 3D coordinates of my object:
vect = [[x], [y], [dx], [1]]
result = dot(self.Q, vect)
print "X =", result[0]/result[3], "Y =", result[1]/result[3], "Z =", result[2]/result[3]
where x and y are the coordinates on the image, dx is the difference between the x coordinates in the two cameras (the disparity), and Q is the OpenCV reprojection matrix.
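For reference, here is a minimal NumPy sketch of that reprojection with a simplified Q matrix of the kind cv2.stereoRectify produces. The focal length f, principal point (cx, cy), baseline B, and the sample pixel/disparity values are all made-up numbers, and the sign conventions of a real calibrated Q may differ:

```python
import numpy as np

# Hypothetical intrinsics: focal length in pixels, principal point, baseline in metres
f, cx, cy, B = 700.0, 320.0, 240.0, 0.12

# Idealized reprojection matrix (real Q from stereoRectify may differ in signs)
Q = np.array([
    [1.0, 0.0, 0.0,    -cx],
    [0.0, 1.0, 0.0,    -cy],
    [0.0, 0.0, 0.0,      f],
    [0.0, 0.0, 1.0 / B, 0.0],
])

x, y, d = 400.0, 300.0, 35.0        # pixel position and disparity (assumed values)
vect = np.array([x, y, d, 1.0])     # homogeneous input [x, y, disparity, 1]
result = Q @ vect                   # 4-vector [X', Y', Z', W]

# Divide by the homogeneous coordinate W to get metric 3D coordinates
X, Y, Z = result[:3] / result[3]
print("X =", X, "Y =", Y, "Z =", Z)  # Z comes out as f * B / d = 2.4
```

The homogeneous division by result[3] is what turns the raw 4-vector into camera-frame coordinates, which is why the printed values are in the units of the baseline used during calibration.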
What I get:
X= [-81.16746711] Y= [ 87.00418513] Z= [-826.69658138]
I don't understand how to use these results.
When I move the object, the coordinates "follow" the increase or decrease.
At the moment I am just focusing on trying to get the Z right.
How can I find a relation between my results and the coordinates of the object in "the world"?
EDIT: The relation between disparity and real depth is not linear, which explains why simply fitting a coefficient didn't solve my problem. Is it possible to calculate the absolute distance between the camera and an object? Or maybe I need to use a landmark near my object to deduce the relative distance between the object and the landmark?

scarlett
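To illustrate that non-linearity: under the pinhole stereo model, depth is inversely proportional to disparity (Z = f * B / d), so no single multiplicative coefficient can map disparity to depth across the whole range. A small sketch, with f and B as assumed values:

```python
# Depth is inversely proportional to disparity, so equal disparity steps
# correspond to unequal depth steps. f and B are hypothetical values.
f, B = 700.0, 0.12   # focal length in pixels, baseline in metres

for d in (70.0, 35.0, 17.5):
    Z = f * B / d
    print("disparity = %5.1f px -> Z = %.2f m" % (d, Z))
# Halving the disparity doubles the depth: 1.20 m, 2.40 m, 4.80 m
```

This is why fitting one scale factor works only near a single depth; the Q-matrix reprojection handles the 1/d relationship correctly, provided the calibration baseline is in real-world units.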