# calculate distance between two objects in an image using a single camera

I am calculating the distance between two balls in an image. First I detected the balls using the Hough circle transform and got their center coordinates, then applied the distance formula, but the result is nowhere near the true value: if the two balls are 13 cm apart, I get about 5.6 cm...



Can you post the equations (or your reasoning) for the algorithm that computes the distance? Maybe you forgot a division by 2: 5.6 × 2 = 11.2, which could be 13 within measurement error.

I have the center coordinates of the two detected circles, (x1, y1) and (x2, y2). By the Pythagorean theorem, d = sqrt((x2 - x1)^2 + (y2 - y1)^2), and then I convert the result from pixels to cm.
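As a sanity check, the pixel-space part of that computation can be sketched like this. The center coordinates and the `CM_PER_PIXEL` factor below are made-up placeholders; the scale must come from your own calibration shot of a known-size object:

```python
import math

def pixel_distance(c1, c2):
    """Euclidean (Pythagorean) distance between two circle centers, in pixels."""
    (x1, y1), (x2, y2) = c1, c2
    return math.hypot(x2 - x1, y2 - y1)

# Hypothetical centers, e.g. as returned by cv2.HoughCircles:
d_px = pixel_distance((120, 240), (420, 640))  # 500 px here

# The pixel count only becomes cm after scaling; this factor is a
# placeholder and must be measured from a known-size reference object.
CM_PER_PIXEL = 0.026
d_cm = d_px * CM_PER_PIXEL  # 13.0 cm with these made-up numbers
```

Note that any error in the scale factor (e.g. if it was measured at a different camera-to-scene distance) multiplies straight into the result, which is one plausible explanation for getting 5.6 cm instead of 13 cm.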


Are the balls on the same plane, parallel to the image plane? Have you taken a shot of a known-size object at the same distance, so that you know how many mm/pixel you have in the image?

If you've done this, then you have all the information needed to get the distance between the two balls. Some error may creep in if the balls lie in different planes parallel to the image plane; if they don't, you should get the correct distance between them.


I guess the problem is in the pixels-to-cm conversion. How have you done it?

There is no general way to convert pixels to a distance. If your balls are aligned with the camera's optical axis, the pixel distance would be zero regardless of the actual distance between the balls (you probably couldn't see the second ball, but you get the point). You need to compute the full 3D position of each ball to get the distance, and that is possible as long as you know the radius of your balls (and you have a calibrated camera).
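A minimal sketch of that idea under a pinhole-camera model: given calibrated intrinsics (fx, fy, cx, cy) and the known real ball radius, the apparent radius gives the depth and the pixel coordinates give the lateral position. All numbers and parameter names below are illustrative, and the depth-from-apparent-size formula is an approximation that is reasonable when the ball is not very close to the camera or the image border:

```python
import math

def ball_center_3d(u, v, r_px, R_cm, fx, fy, cx, cy):
    """Back-project a detected ball into camera coordinates (in cm).

    u, v   : pixel coordinates of the circle center
    r_px   : apparent (detected) circle radius in pixels
    R_cm   : true ball radius in cm -- must be known beforehand
    fx, fy, cx, cy : intrinsics from camera calibration
    """
    Z = fx * R_cm / r_px          # depth from apparent size (approximation)
    X = (u - cx) * Z / fx         # lateral offset from the optical axis
    Y = (v - cy) * Z / fy
    return (X, Y, Z)

def distance_3d(p, q):
    """Euclidean distance between two 3D points."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))
```

With both balls back-projected this way, `distance_3d` gives their separation even when they sit at different depths, which the pure pixel-counting approach cannot handle.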


@FooBar: he's not trying to get the distance between the camera and the balls, but the relative distance between the two balls. Since the balls define a single plane, this can be done without calibration: just place the camera perpendicular to the plane defined by the balls' pose and count pixels in the image using a known object size (which could be one of the balls as well).

I know, but even that is not possible in general. Say we attach the balls to the ends of a 1 m stick. If we align the stick with the direction to the camera, the balls are observed at the same pixel location (assuming you can somehow see both), so the pixel distance is zero. Now rotate the stick so it's perpendicular to the optical axis: the balls appear at different pixel locations, even though their real distance is unchanged. [Now I get your point: you want to move the camera into a special configuration. That simplifies the problem, but strongly restricts the use cases.]


Well, it depends on the context he's dealing with. If the balls are on a table, then you just need a camera looking at the table from above, and you're done with a single shot and no calibration. In the most general case, where the camera is fixed and the balls can be at different depths, you're right that you'd need to compute full 3D positions to get the relative distance; but I think regular stereo would be more accurate (so at least two shots).

@FooBar: I think it is possible to get the distance between his balls if the balls move only in a plane parallel to the camera plane (always the same one), and/or if you know the ball sizes, since then you can recover the size of a pixel. Moreover, if you know that at ball A a pixel covers X cm and at ball B a pixel covers Y cm, then you can still get the distance based on the Pythagorean theorem.

Even on a table he can't just go from pixels to cm, as the pixel distance will decrease with the distance of the balls from the optical center.

The problem can be divided into two parts:

Plane containing the balls: the constraint here is that the plane containing the balls must be perpendicular to the optical axis of your camera. If that is not the case, the detected distance between the balls will always be less than the actual distance.

Distance of the objects from the camera: one case is having the plane containing the balls at a fixed distance from the camera. In this case, converting a distance from pixels to cm is possible by calibrating the setup once: place two objects at a known distance from each other (say 10 cm) and measure the distance between them in pixels (say 500 pixels). Converting any later measurement is then simple proportion: if the pixel distance is 300, the actual distance is x = 300 * 10 / 500 = 6 cm. In this case it won't be necessary to know the size of the object.
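The fixed-distance case is a single proportion; a small sketch using the answer's own numbers (10 cm spanning 500 px as the one-off calibration; function names are mine):

```python
def calibrate_scale(known_cm, known_px):
    """cm-per-pixel ratio from one reference measurement at the working distance."""
    return known_cm / known_px

def px_to_cm(d_px, scale):
    """Convert a pixel distance to cm using the calibrated scale."""
    return d_px * scale

scale = calibrate_scale(10.0, 500.0)  # 10 cm spans 500 px at this distance
result = px_to_cm(300.0, scale)       # 300 * 10 / 500 = 6.0 cm
```

This scale is only valid at the calibration distance; if the camera or the plane moves, it must be measured again.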

The second case is having the plane containing the balls at a variable distance from the camera. In this case you must know the size of the object beforehand; knowing the object size lets you calibrate in every frame. Say an object (in your case, a ball) that is 5 cm in the real world is represented by 30 pixels; then a separation of 300 pixels in the image corresponds to 300 * 5 / 30 = 50 cm. Knowing the object size is thus the key factor when the distance to the plane containing the objects can vary.
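In this variable-distance case the ball itself acts as the ruler, so the scale has to be recomputed from its apparent size in every frame. A sketch with the answer's numbers (5 cm ball spanning 30 px in one particular shot); note that whether your detector reports a radius or a diameter is something to verify against your detection code (`cv2.HoughCircles` returns the radius):

```python
def scale_from_object(object_cm, object_px):
    """Per-frame cm-per-pixel ratio from a known-size object in the same frame."""
    return object_cm / object_px

# In this particular frame the 5 cm ball spans 30 px:
s = scale_from_object(5.0, 30.0)
d_cm = 300.0 * s  # 300 px separation -> 300 * 5 / 30 = 50.0 cm
```

Because the scale is re-derived each frame, this keeps working as the camera-to-plane distance changes, as long as both balls stay in one plane parallel to the image plane.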

Hope this helps you!


In the first case I would say "the detected distance between your balls will always be greater than the actual distance". Also, I don't really understand "having the plane containing the ball at a fixed distance from your camera": isn't it rather that the ball is at a fixed distance from the camera, i.e. on a "curved plane" (a sphere)?


That's never really possible, irrespective of anything else. Even the problem statement clearly says the detected distance is less than the actual distance, which contradicts your prediction.

If the object size is known, then with a single shot you don't need a calibrated camera (in fact, in your first case you're calibrating the camera with a known object size, which plays the role of the calibration pattern; why not just skip this step and do the same directly on the ball?), no matter the distance: in every case you get the distance between the two objects by simply counting pixels, provided they lie on the same plane parallel to the image plane. If they're on different planes, you're not going to solve the problem this way anyway.


Yes, that's correct, but I wanted to point out that if the distance of the objects from the camera keeps changing, you have to know the size of the object beforehand.

If the distance to the objects' plane changes, but the objects remain in the same plane parallel to the camera (image) plane, the problem is still solvable...

Sure, as long as you know the size of an object on that plane, you can solve the problem for any given distance.

Solved this problem by calculating the distance between the two center points in pixels using the Pythagorean theorem and then converting it into measurement units (cm); this solved my issue.

