2015-03-24 08:16:33 -0600 | commented question | 3D coordinates of a colored tracked object with stereo vision Thank you for your message. reprojectImageTo3D is useful, but it is more understandable to me to write the calculation out explicitly ... and because of that I get an error. |
2015-03-24 08:13:02 -0600 | answered a question | 3D coordinates of a colored tracked object with stereo vision OK, so it seems a division by zero was hidden somewhere. I changed my calculation to: with self.ratio = 105.32. Then I get some accurate results ... don't ask me about self.ratio; I think it is linked to the resolution, because I don't need to change it even when my calibration parameters change, but I am still trying to figure it out. |
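A division by zero in this pipeline typically comes from applying the reprojection matrix Q to a point whose disparity is zero (the homogeneous coordinate W becomes zero for points at infinity). A minimal sketch of the explicit Q-matrix calculation with that case guarded; the Q values here (f = 700 px, baseline B = 0.10 m, principal point 320, 240) are hypothetical stand-ins, not the asker's calibration output:

```python
import numpy as np

def point_3d_from_disparity(x, y, d, Q):
    """Apply the 4x4 reprojection matrix Q to image point (x, y) with
    disparity d, guarding against the zero-disparity division by zero."""
    if abs(d) < 1e-6:
        return None  # zero disparity: point at (near-)infinite depth
    X, Y, Z, W = Q @ np.array([x, y, d, 1.0])
    return np.array([X, Y, Z]) / W  # homogeneous -> Euclidean

# Hypothetical Q for a rectified pair (sign conventions vary with the
# calibration output; this one yields positive Z in front of the camera).
f, B, cx, cy = 700.0, 0.10, 320.0, 240.0
Q = np.array([[1.0, 0.0, 0.0,   -cx],
              [0.0, 1.0, 0.0,   -cy],
              [0.0, 0.0, 0.0,     f],
              [0.0, 0.0, 1.0 / B, 0.0]])

print(point_3d_from_disparity(350, 260, 35.0, Q))  # Z = f*B/d = 2.0 m
print(point_3d_from_disparity(350, 260, 0.0, Q))   # None: guarded case
```

Returning None (or masking such pixels, as reprojectImageTo3D does with large sentinel values) is one way to keep the explicit calculation from blowing up on textureless or occluded regions where the matcher reports zero disparity.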
2015-03-20 08:18:11 -0600 | received badge | ● Editor (source) |
2015-03-19 13:32:23 -0600 | asked a question | 3D coordinates of a colored tracked object with stereo vision Hello, there are a lot of topics about 2D to 3D, but I couldn't find my problem among them. I use the stereo camera calibration to find the parameters of my cameras (I followed this blog: [http://blog.martinperis.com/2011/01/o...]). Then I use this relation to deduce the 3D coordinates of my object: where x and y are the coordinates on the image, dx is the difference between the x of the two cameras (the disparity), and Q is the OpenCV reprojection matrix. What I get: X = [-81.16746711], Y = [87.00418513], Z = [-826.69658138]. I don't understand how to use these results. When moving the object, the coordinates follow the increases or decreases. At the moment I am just focusing on trying to set up the Z. How can I find a relation between my results and the coordinates of the object in "the world"? EDIT: The relation between disparity and real depth is not linear, which explains why just fitting a coefficient didn't solve my problem. Is it possible to calculate the absolute distance between the camera and an object? Or maybe I need to use a landmark near my object to deduce the relative distance between the object and the landmark? |
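The non-linearity mentioned in the edit can be seen numerically: depth along the optical axis is Z = f·B/d (f focal length in pixels, B baseline, d disparity), so halving the disparity doubles the depth. A short sketch with assumed, hypothetical values f = 700 px and B = 0.10 m:

```python
# Disparity-to-depth relation Z = f * B / d.
# f (focal length in px) and B (baseline in m) are assumed values here;
# in practice they come from the stereo calibration.
f, B = 700.0, 0.10

def depth_from_disparity(d):
    """Depth in metres for a disparity of d pixels."""
    return f * B / d

for d in (70.0, 35.0, 14.0, 7.0):
    print(f"disparity {d:5.1f} px -> depth {depth_from_disparity(d):.2f} m")
# disparity  70.0 px -> depth 1.00 m
# disparity  35.0 px -> depth 2.00 m
# disparity  14.0 px -> depth 5.00 m
# disparity   7.0 px -> depth 10.00 m
```

Because depth is inversely proportional to disparity, no single multiplicative coefficient can map disparity to depth, which is consistent with the observation in the edit; an absolute camera-to-object distance follows directly from Z once f and B are known from calibration.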