# Distance from camera to object: the error increases linearly!

I compute the distance to an object (a chessboard) using OpenCV and my camera parameters, following these steps:

1- findChessboardCorners

2- solvePnP

3- Distance computed from the translation vector returned by solvePnP

The issue is that the error increases linearly as the chessboard moves farther from the camera.

The error is simply the difference between the real distance and the one reported by my program.

The camera is calibrated (2.8 mm focal length, wide-angle lens with distortion).

I want to know why the error increases. Is that normal?

like in this link

@LBerger So what do you think causes that error? Is it normal, given that I'm using a single camera?

Yes, with one camera the error should be something like d * pixel_size / focal / 2.

Yep. Basically, your error in finding the chessboard corners grows with distance, because a 0.1-pixel (or whatever) error now corresponds to a larger physical distance. So a higher uncertainty in your world points means a higher uncertainty in your camera location.
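This can be checked numerically. Using the thread's 2.8 mm focal length, plus an assumed 3 µm pixel pitch and an assumed 0.5 px corner-detection uncertainty (which reproduces the d * pixel_size / focal / 2 rule of thumb quoted above):

```python
# A constant sub-pixel corner error maps to a world-space error that
# grows linearly with distance d.
focal_mm = 2.8       # from the thread
pixel_mm = 0.003     # assumed sensor pixel pitch (3 um)
err_px = 0.5         # assumed corner-detection uncertainty

def world_error_mm(d_mm):
    # One pixel at distance d spans d * pixel_size / focal in the world,
    # so an err_px corner error spans err_px times that.
    return d_mm * pixel_mm / focal_mm * err_px

for d in (500.0, 1000.0, 2000.0):
    print(d, round(world_error_mm(d), 3))  # doubles when d doubles
```

Doubling the distance doubles the metric uncertainty, exactly the linear growth reported in the question.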

@LBerger @Tetragramm Do you have any references on that? I'd like to understand that error better so I can correct for it using my camera parameters :)

Reference: "Multiple View Geometry in Computer Vision", R. Hartley and A. Zisserman.

@LBerger I mean that error, d * pixel_size / focal / 2: how did you get it? It's not linear!

pixel_size is constant, the focal length is constant, and 2 is constant; only d changes, so the error is linear in d.

What that equation means is basically what I said: as you move farther away, each pixel covers more of the object. So if you double the distance, a patch that covered 4 pixels now covers one.
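The pixel-footprint argument can be made concrete with the pinhole model. With an assumed focal length of 800 px and a 100 mm edge, the edge covers f * W / d pixels, so doubling d halves the pixel count along the edge, and the covered area shrinks to a quarter (the "4 pixels to one" reading):

```python
# How many pixels a 100 mm edge covers at distance d, for an assumed
# focal length of 800 px (focal_mm / pixel_size_mm).
f_px = 800.0
W_mm = 100.0

def pixels_covered(d_mm):
    return f_px * W_mm / d_mm  # pinhole projection: w = f * W / d

near = pixels_covered(1000.0)  # edge length in pixels at 1 m
far = pixels_covered(2000.0)   # at 2 m: half the edge length
print(near, far, (far / near) ** 2)  # area ratio = (1/2)^2 = 1/4
```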

OK, got it! I'll write an answer. Thanks, both of you.