# Is it possible to calculate a point's 3D X and Y coordinates by inverting the camera matrix?

I have read Camera Calibration Explained.

I was able to do the calibration, and just to test it I used this; everything is working perfectly.

I am just curious: given the equation below, and assuming I know the Z of a 3D point and its x, y pixel coordinates in the image, would I be able to recover the X and Y coordinates of that 3D point?

In other words, if I multiply both sides of the equation by the inverse of the camera matrix, and assume I know the Z = w of the 3D point, can I get its 3D X and Y in space?

![image description](https://docs.opencv.org/2.4/_images/m...)

Any comments are much appreciated.



Yes, you can. There is no need to invert the camera intrinsics matrix.

Perspective projection equations:
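In the standard pinhole model (ignoring lens distortion), with focal lengths $f_x, f_y$ and principal point $(c_x, c_y)$, these are:

$$u = f_x \frac{X}{Z} + c_x, \qquad v = f_y \frac{Y}{Z} + c_y$$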

Reverse perspective projection equations assuming a known Z:
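Solving the pinhole equations for X and Y when Z is known:

$$X = \frac{(u - c_x)\,Z}{f_x}, \qquad Y = \frac{(v - c_y)\,Z}{f_y}$$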

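A minimal sketch of this back-projection under the pinhole model with known depth; the function name and the intrinsics values below are illustrative, not from the original post:

```python
def back_project(u, v, z, fx, fy, cx, cy):
    """Recover the 3D X, Y of a point from its pixel (u, v) and known depth Z,
    using the pinhole model: u = fx*X/Z + cx, v = fy*Y/Z + cy."""
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return x, y

# Round-trip check with made-up intrinsics: project a 3D point, then back-project it.
fx, fy, cx, cy = 800.0, 800.0, 320.0, 240.0
X, Y, Z = 0.5, -0.25, 2.0
u = fx * X / Z + cx          # forward projection
v = fy * Y / Z + cy
print(back_project(u, v, Z, fx, fy, cx, cy))  # recovers (0.5, -0.25)
```

Note this only works in the camera frame; if your Z is expressed in a world frame, you first have to fold in the extrinsics [R|t].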


Thank you for the valuable comment.

I was wondering how a small error in the measurement of Z would distort the X and Y values, assuming there is no error in measuring u and v.

Could you confirm that it correlates with 1/fx and 1/fy in the X and Y directions? In my case that would be approximately 1/200.

I guess what I am trying to say is that the higher fx and fy are, the smaller the error we would see from measuring Z/depth. Or, simply put, the higher the resolution, the more error we can tolerate in the Z measurement.
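As a quick sanity check under the same pinhole assumptions: from $X = (u - c_x)Z/f_x$, differentiating with respect to Z gives

$$\frac{\partial X}{\partial Z} = \frac{u - c_x}{f_x}, \qquad \frac{\partial Y}{\partial Z} = \frac{v - c_y}{f_y}$$

so the sensitivity to a depth error does scale with $1/f_x$ and $1/f_y$, but multiplied by the pixel's offset from the principal point; points near the image border are hit harder than points near the center.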

I hope it doesn't sound confusing. :)

Regards

(2019-01-19 11:56:50 -0500)
