# camera calibration - tracking object distance

Hello,

I'm trying to calibrate my camera so that I can eventually work out how far an object is from the camera.

I've been following this question, which details the equation for working out an object's distance (I don't have enough reputation to comment there): http://stackoverflow.com/questions/14038002/opencv-how-to-calculate-distance-between-camera-and-object-using-image

There are a couple of things I'm unsure about. Below is my camera matrix after running calibrate.py with 30 or so photos; I ran it again with a different set of photos and got pretty much the same results:

```
RMS: 0.230393020863
camera matrix:
[[ 294.17185696    0.          153.23247818]
 [   0.          295.43662344  119.46194893]
 [   0.            0.            1.        ]]
distortion coefficients:
[ 0.14143871 -0.76981318 -0.01467287 -0.00334742  0.88460406]
```
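For reference, the focal lengths and principal point can be read straight off that matrix. A minimal sketch, using the values from my calibrate.py output above:

```python
import numpy as np

# Camera matrix copied from the calibrate.py output above
K = np.array([[294.17185696, 0.0, 153.23247818],
              [0.0, 295.43662344, 119.46194893],
              [0.0, 0.0, 1.0]])

f_x, f_y = K[0, 0], K[1, 1]  # focal lengths, expressed in pixel units
c_x, c_y = K[0, 2], K[1, 2]  # principal point (image centre), in pixels

print(f_x, f_y, c_x, c_y)
```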

In the linked question, the matrix was built for an iPhone 5S camera. The f_x and f_y results came out as 2.8036, but were written down as:

```
f_x = 2803
f_y = 2805
c_x = 1637
c_y = 1271
```

**Why were they multiplied by 1000? Should mine then be as follows?**

```
f_x = 294171
f_y = 295436
c_x = 153232
c_y = 119461
```

Further down in that answer, the object's size in a lower-resolution image is calculated to be 41 pixels. I've got code working to track a blue ball, shown below. **What do I need to do to calculate the size of the ball in pixels?**

```
if len(cnts) > 0:
    # Use the largest contour, assumed to be the ball
    c = max(cnts, key=cv2.contourArea)
    ((x, y), radius) = cv2.minEnclosingCircle(c)
    M = cv2.moments(c)
    if M["m00"] > 0:
        center = (int(M["m10"] / M["m00"]), int(M["m01"] / M["m00"]))
        if radius > 10:
            # Draw the enclosing circle and its centroid
            cv2.circle(frame, (int(x), int(y)), int(radius),
                       (0, 255, 255), 2)
            cv2.circle(frame, center, 5, (0, 0, 255), -1)
```
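In case it helps frame the question: `minEnclosingCircle` already gives the ball's size in pixels (its diameter is `2 * radius`), and under the pinhole model from the linked answer, distance = focal length × real size / pixel size. A hedged sketch, assuming the real ball diameter is known (the 0.065 m here is a made-up example value, and `294.17` is the f_x from my matrix above):

```python
def distance_to_ball(focal_px, real_diameter_m, pixel_diameter):
    """Pinhole-camera estimate: D = f * real_size / pixel_size.

    focal_px        -- focal length in pixels (f_x from the camera matrix)
    real_diameter_m -- true ball diameter in metres (must be measured)
    pixel_diameter  -- ball diameter in the image, in pixels (2 * radius)
    """
    return focal_px * real_diameter_m / pixel_diameter

# Hypothetical example: a 6.5 cm ball appearing 41 px wide, f_x ~ 294 px
d = distance_to_ball(294.17, 0.065, 41.0)
print(d)  # roughly 0.47 metres under these made-up numbers
```

Whether this is the right approach (versus scaling the matrix values) is exactly what I'm unsure about.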

Thanks for your help!!