Hi guys,
I am having some trouble "re-projecting" a 2D image point into the 3D world coordinate system. I have already done a camera calibration with 15 pictures (using the sample provided by OpenCV) to find the intrinsic parameters. For the extrinsic parameters I did the following:
I created a pattern where the metric positions (in millimeters) of the feature points are known. Then I used cv::solvePnP with the image points (feature points) and the corresponding world coordinates of those feature points (z = 0, since I used a planar pattern). This gave me a rotation vector and a translation vector, and with cv::Rodrigues I converted the rotation vector into a 3x3 rotation matrix. The values look fine, but when I try to recover the 3D point corresponding to a 2D image point, only the z-coordinate fits. I tried the following to get my 3D point: 3dP = R^T((K^(-1) * 2dP) - t)
R := 3x3 rotation matrix
K := 3x3 camera matrix (intrinsic parameters)
t := 3x1 translation vector
3dP := world coordinates
2dP := image coordinates (homogeneous, i.e. (u, v, 1))
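To make the setup concrete, here is a small numpy sketch of exactly the computation I described (cv::Rodrigues is reimplemented by hand so the snippet runs without OpenCV; all numbers are made-up placeholder values, not my real calibration results):

```python
import numpy as np

# Placeholder values -- NOT from a real calibration.
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])      # intrinsic camera matrix
rvec = np.array([0.1, -0.2, 0.05])         # rotation vector from cv::solvePnP
t = np.array([10.0, -5.0, 500.0])          # translation vector (mm)

def rodrigues(rvec):
    """Rotation vector -> 3x3 rotation matrix (what cv::Rodrigues computes)."""
    theta = np.linalg.norm(rvec)
    if theta < 1e-12:
        return np.eye(3)
    k = rvec / theta
    Kx = np.array([[   0.0, -k[2],  k[1]],
                   [  k[2],   0.0, -k[0]],
                   [ -k[1],  k[0],   0.0]])  # skew-symmetric cross-product matrix
    return np.eye(3) + np.sin(theta) * Kx + (1.0 - np.cos(theta)) * (Kx @ Kx)

R = rodrigues(rvec)

# The back-projection formula from above: 3dP = R^T((K^(-1) * 2dP) - t),
# with 2dP as a homogeneous pixel (u, v, 1).
p2d = np.array([400.0, 300.0, 1.0])
p3d = R.T @ (np.linalg.inv(K) @ p2d - t)
print(p3d)
```

Here 2dP is the homogeneous pixel (u, v, 1); projecting p3d back through K(R * 3dP + t) returns the original pixel, so the snippet reproduces the formula as written.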
Is there some stupid mistake in here?
Thanks a lot! Juergeb