OpenCV Q&A Forum - RSS feed
http://answers.opencv.org/questions/
Copyright <a href="http://www.opencv.org">OpenCV foundation</a>, 2012-2018. Sat, 01 Aug 2015 20:50:46 -0500

Epipolar geometry pose estimation: Epipolar lines look good but wrong pose
http://answers.opencv.org/question/67540/epipolar-geometry-pose-estimation-epipolar-lines-look-good-but-wrong-pose/

I am trying to use OpenCV to estimate the pose of one camera relative to another, using SIFT feature detection, FLANN matching, and subsequent computation of the fundamental and essential matrices. After decomposing the essential matrix, I check for degenerate configurations and obtain the "right" R and t.
The problem is, they never seem to be right. I am including a couple of image pairs:
1. Image 2 taken with a 45 degree rotation about the Y axis and the same position w.r.t. Image 1.
<a href="http://i.imgur.com/lEsdjFn.jpg">Image pair</a>
<a href="http://i.imgur.com/hCYV2kN.jpg">Result </a>
2. Image 2 taken from approximately a couple of meters away along the negative X direction, with a slight displacement in the negative Y direction and an approximately 45-60 degree rotation of the camera pose about the Y axis.
<a href="http://i.imgur.com/zO1hwh3.jpg">Image pair</a>
<a href="http://i.imgur.com/nn803lk.jpg">Result</a>
The translation vector in the second case seems to overestimate the movement in Y and underestimate the movement in X. The rotation matrices, when converted to Euler angles, give wrong results in both cases. This happens with many other datasets as well. I have tried switching the fundamental matrix computation technique between RANSAC, LMEDS, etc., and am now doing an initial RANSAC estimate followed by a second computation using only the inliers with the 8-point method. Changing the feature detection method does not help either. The epipolar lines seem to be proper, and the fundamental matrix satisfies x'^T F x = 0.
Am I missing something fundamental here? Given that the program appears to understand the epipolar geometry properly, what could possibly be happening that results in a completely wrong pose? I do check to make sure the points lie in front of both cameras. Any thoughts/suggestions would be very helpful. Thanks!
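For concreteness, the decomposition step described above can be sketched in plain NumPy. This is a minimal, self-contained illustration (all names here are made up for the example, not taken from my linked code): build E from a known pose, recover the four candidate (R, t) pairs via SVD, and confirm the true pose is among them. The in-front-of-both-cameras check is what should select the right candidate out of the four.

```python
import numpy as np

def skew(v):
    """3x3 cross-product matrix: skew(v) @ x == np.cross(v, x)."""
    return np.array([[0., -v[2], v[1]],
                     [v[2], 0., -v[0]],
                     [-v[1], v[0], 0.]])

def decompose_essential(E):
    """Return the four candidate (R, t) pairs from an essential matrix."""
    U, _, Vt = np.linalg.svd(E)
    # Force proper rotations (det = +1)
    if np.linalg.det(U) < 0:
        U = -U
    if np.linalg.det(Vt) < 0:
        Vt = -Vt
    W = np.array([[0., -1., 0.],
                  [1., 0., 0.],
                  [0., 0., 1.]])
    R1 = U @ W @ Vt
    R2 = U @ W.T @ Vt
    t = U[:, 2]  # left null vector of E, defined only up to sign and scale
    return [(R1, t), (R1, -t), (R2, t), (R2, -t)]

# Synthetic ground truth: 45-degree rotation about Y, unit translation along X
theta = np.pi / 4
R_true = np.array([[np.cos(theta), 0., np.sin(theta)],
                   [0., 1., 0.],
                   [-np.sin(theta), 0., np.cos(theta)]])
t_true = np.array([1., 0., 0.])
E = skew(t_true) @ R_true

# Exactly one candidate matches the true pose (t only up to sign/scale);
# the cheirality check has to pick it out among the four.
candidates = decompose_essential(E)
found = any(np.allclose(Rc, R_true) and np.isclose(abs(tc @ t_true), 1.0)
            for Rc, tc in candidates)
print(found)  # True
```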
<a href="http://pastebin.com/42PTHPP6">Code</a> for reference

Fri, 31 Jul 2015 13:35:56 -0500

Answer by Eduardo:
Not really an answer, but my comment doesn't fit.

I don't know Python well, but I will try to help a little:

- In the function `in_front_of_both_cameras`, do you find the z coordinate using the intersection of the two light rays? If you have not already done so, you could check the formula on simple cases.
- I don't understand this line: `first_3d_point = np.array([first[0] * first_z, second[0] * first_z, first_z])`; to me it should be: `first_3d_point = np.array([first[0] * first_z, first[1] * first_z, first_z])`.
- Normally it should be OK, but you could try to print the rotation and translation values for each configuration to see whether the pose check is OK or not (and whether there is one configuration that gives correct values).
- For the Euler angles, which [configuration](https://en.wikipedia.org/wiki/Euler_angles#Rotation_matrix) did you use? For roll, pitch, yaw I usually use the configuration [Z1Y2X3](https://en.wikipedia.org/wiki/Euler_angles#Rotation_matrix). You can also use [RQDecomp3x3](http://docs.opencv.org/modules/calib3d/doc/camera_calibration_and_3d_reconstruction.html#rqdecomp3x3) directly.
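To illustrate the first and third points, here is a minimal NumPy sketch (helper names are illustrative, not taken from the poster's code) of linear (DLT) triangulation from normalized image coordinates, followed by the positive-depth (cheirality) check that should hold for exactly one of the four candidate poses:

```python
import numpy as np

def triangulate(x1, x2, R, t):
    """Linear (DLT) triangulation of one correspondence in normalized
    image coordinates, with P1 = [I|0] and P2 = [R|t].
    Returns the 3D point in camera-1 coordinates."""
    P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = np.hstack([R, t.reshape(3, 1)])
    # Each view contributes two linear constraints on the homogeneous point
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]                # null vector of A, up to scale
    return X[:3] / X[3]

def in_front_of_both(X, R, t):
    """Cheirality check: positive depth in both cameras."""
    return X[2] > 0 and (R @ X + t)[2] > 0

# Synthetic check: a point at depth 5, second camera offset along X
R = np.eye(3)
t = np.array([1., 0., 0.])
X_true = np.array([0.5, -0.2, 5.0])
x1 = X_true[:2] / X_true[2]          # projection into camera 1
x2_h = R @ X_true + t
x2 = x2_h[:2] / x2_h[2]              # projection into camera 2
X = triangulate(x1, x2, R, t)
print(np.allclose(X, X_true), in_front_of_both(X, R, t))  # True True
```

Printing the recovered depth for all four candidate poses this way makes it obvious whether the pose check is discriminating correctly.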
Sat, 01 Aug 2015 09:39:39 -0500

Comment by saihv:
Hi Eduardo, thanks for the insight. About the first two points: I did make a mistake in point 2, thanks for spotting it. I was referring to this answer when I coded it; I believe that formula is from Hartley & Zisserman: http://answers.opencv.org/question/27155/from-fundamental-matrix-to-rectified-images/

About point 3: I have had cases where none of the four configurations made proper sense, so I was confused about whether I am expecting too much from the method, or whether my features are not good enough. But again, the epipolar lines looked proper. I am going to check with the Euler angle conversion changed; I was using X1Y2Z3 (again, from a response to another question on this forum).

Sat, 01 Aug 2015 20:50:46 -0500
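For reference, the Z1Y2X3 (yaw-pitch-roll) extraction mentioned in the answer can be sketched in plain NumPy. These formulas assume R = Rz(yaw) @ Ry(pitch) @ Rx(roll) and break down at the pitch = +/-90 degree gimbal-lock singularity (function names here are illustrative):

```python
import numpy as np

def euler_zyx_from_R(R):
    """Recover (yaw, pitch, roll) from R = Rz(yaw) @ Ry(pitch) @ Rx(roll).

    Valid away from the pitch = +/-90 degree singularity."""
    pitch = np.arcsin(-R[2, 0])
    roll = np.arctan2(R[2, 1], R[2, 2])
    yaw = np.arctan2(R[1, 0], R[0, 0])
    return yaw, pitch, roll

def Rx(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1., 0., 0.], [0., c, -s], [0., s, c]])

def Ry(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0., s], [0., 1., 0.], [-s, 0., c]])

def Rz(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.], [s, c, 0.], [0., 0., 1.]])

# Round trip on a known pose
yaw, pitch, roll = np.deg2rad([30., 45., -10.])
R = Rz(yaw) @ Ry(pitch) @ Rx(roll)
print(np.rad2deg(euler_zyx_from_R(R)))  # approximately [ 30.  45. -10.]
```

Running the same R through an X1Y2Z3 extraction instead would give different (and misleading) angles, which is one way a correct rotation matrix can still look "wrong".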