# measuring distance between two balls in millimeters - how to improve accuracy [closed]

Hi!

I'm currently learning OpenCV and my current task is to measure the distance between two balls lying on a plate. My next step is to compare several cameras and resolutions to get a feeling for how important resolution, noise, distortion, etc. are and how strongly these parameters affect the accuracy. If the community is interested in the results, I'm happy to share them when they are ready! The camera is placed above the plate and uses a wide-angle lens. The width and height of the plate (1500 x 700 mm) and the radius of the balls (40 mm) are known.

My steps so far:

1. camera calibration
2. undistorting the image (the distortion is high due to the wide-angle lens)
3. findHomography: I use the corner points of the plate as input (4 points in pixels in the undistorted image) together with the corresponding corner points in millimeters (starting at 0,0 in the lower left corner, up to 1500,700 in the upper right corner)
4. using HoughCircles to find the balls in the undistorted image
5. applying perspectiveTransform on the circle center points => circle center points now exist in millimeters
6. calculating the distance between the two center points: d = sqrt((x1-x2)^2+(y1-y2)^2)

The results: an error of around 4 mm at a distance of 300 mm, and an error of around 25 mm at a distance of 1000 mm. But if I measure a rectangle that is printed on the plate, the error is smaller than 0.2 mm, so I guess the calibration and undistortion are working well.