Calculate distance between two objects in an image using a single camera
I am calculating the distance between two balls in an image. First I detected the balls using the Hough circle transform and got their center coordinates, then applied the distance formula, but the result is far off: if the two balls are 13 cm apart, I get 5.6 cm...
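Roughly what I'm doing, as a sketch (the Hough parameter values below are illustrative placeholders, not my actual ones, and `ball_centers` is just a name I made up here):

```python
import math


def ball_centers(image_path):
    # OpenCV imported locally so the geometry helper below works without it.
    import cv2

    # Detect circles with the Hough transform; all parameter values here
    # are placeholders that need tuning for the real image.
    img = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    img = cv2.medianBlur(img, 5)
    circles = cv2.HoughCircles(img, cv2.HOUGH_GRADIENT, dp=1, minDist=50,
                               param1=100, param2=30,
                               minRadius=10, maxRadius=100)
    if circles is None:
        return []
    # Each detection is (center_x, center_y, radius) in pixels.
    return [(x, y, r) for x, y, r in circles[0]]


def pixel_distance(c1, c2):
    # Euclidean distance between two circle centers, in pixels.
    return math.hypot(c2[0] - c1[0], c2[1] - c1[1])
```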
Can you post some equations (or thoughts) about the algorithm that is computing the distance? Maybe you forgot a division by 2, since 5.6 × 2 = 11.2, which may be within measurement error of 13?
I have the center coordinates of the two detected circles, (x1, y1) and (x2, y2). By the Pythagorean theorem, d = sqrt((x2-x1)^2 + (y2-y1)^2), and I convert the result into cm (converting pixels to cm).
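In code, the formula plus the conversion looks like this (`cm_per_pixel` is the calibration factor measured from a known-size object; the numbers in the usage line are just an example):

```python
import math


def distance_cm(p1, p2, cm_per_pixel):
    # Pythagorean distance in pixels, scaled by the calibration factor.
    # This is only valid when both balls lie in the same plane parallel
    # to the image plane, at the depth where cm_per_pixel was measured.
    d_px = math.hypot(p2[0] - p1[0], p2[1] - p1[1])
    return d_px * cm_per_pixel


# Example: centers 500 px apart, 0.026 cm per pixel -> 13 cm.
d = distance_cm((100, 200), (400, 600), 0.026)
```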
Are the balls on the same plane, parallel to the image plane? Have you taken a shot with a known-size object at the same distance, to know how many mm/pixel you have in the image?
If you've done this, then you have all the info to get the distance between the 2 balls. Some error may creep in if the balls lie in different planes parallel to the image plane, but if they don't, you should get the correct distance between them.
I guess the problem is in the pixel-to-cm conversion. How have you done it?
There is no general way to convert pixels to a distance. If your balls are aligned with the camera's optical axis, the pixel distance would be zero regardless of the actual distance between the balls (you probably couldn't see the second ball, but I guess you get the point). You need to compute the full 3D position of each ball to get the distance. And this is possible as long as you know the radius of your balls (and you have a calibrated camera).
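A sketch of that idea with a simple pinhole model (`f` is the focal length in pixels and `(cx, cy)` the principal point, both from calibration; `R_cm` is the known ball radius; the function names are made up for illustration):

```python
import math


def ball_3d(u, v, r_px, R_cm, f, cx, cy):
    # Pinhole model: the apparent radius is r_px = f * R_cm / Z,
    # so the depth of the ball is Z = f * R_cm / r_px.
    Z = f * R_cm / r_px
    # Back-project the detected center (u, v) to a 3D point.
    X = (u - cx) * Z / f
    Y = (v - cy) * Z / f
    return (X, Y, Z)


def distance_3d(p, q):
    # Euclidean distance between the two reconstructed 3D centers.
    return math.dist(p, q)
```

With both balls' 3D positions in hand, the distance no longer depends on how the pair is oriented relative to the camera.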
@FooBar: he's not trying to get the distance between the camera and the balls, but the relative distance between the 2 balls. Since the balls define a single plane, this can be done without calibration: just place the camera perpendicular to the plane defined by the balls' pose and count pixels in the image using a known object size (one of the balls could serve as well).
I know. But this is also not possible in general. Let's say we attach the balls to the ends of a stick with a length of 1 m. If we align the stick with the direction to the camera, the balls are observed at the same pixel location (if you can somehow see both balls), so the pixel distance is zero. Now you rotate the stick so that it's perpendicular to the optical axis: the balls are seen at different pixel locations even though their distance is still the same. [Now I get your point: you want to move the camera into a special configuration. That could simplify the problem, but it strongly restricts the use cases.]
Well, it depends on the context he's dealing with. If the balls are on a table, then you just need a camera looking at the table from above, and you're done with a single shot and no calibration. In the most general case, where the camera is fixed and the balls can be at different depths, you're right: you'll need to compute the full 3D positions to get the relative distance, but I think regular stereo would be more accurate (so at least 2 shots).
@FooBar: I think it is possible to get the distance between his balls if the balls move only in a plane parallel to the camera plane (always the same one), and/or if you know the balls' sizes, so that you can get the size of a pixel. Moreover, if you know that at ball A a pixel covers X cm and at ball B a pixel covers Y cm, then you are also able to compute the distance based on the Pythagorean theorem.
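One way to flesh that out, assuming a pinhole camera with focal length `f` in pixels and principal point `(cx, cy)` (these names and all the numbers below are illustrative): the cm-per-pixel scale `s` at a ball satisfies s = Z / f, which gives each ball's depth, and the lateral offsets scale by the same factor:

```python
import math


def ball_position(u, v, s, f, cx, cy):
    # The cm-per-pixel scale s at depth Z satisfies s = Z / f,
    # so Z = f * s; lateral offsets scale by the same factor s.
    Z = f * s
    X = (u - cx) * s
    Y = (v - cy) * s
    return (X, Y, Z)


a = ball_position(200, 0, 0.05, 1000, 0, 0)   # X cm/px measured at ball A
b = ball_position(-200, 0, 0.10, 1000, 0, 0)  # Y cm/px measured at ball B
d = math.dist(a, b)  # 3D Pythagorean distance between the balls, in cm
```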
Even on a table he can't simply get from pixels to cm, as the pixel distance will decrease as the balls get farther from the optical center.