Extracting the Essential matrix from the Fundamental matrix (Mon, 04 Mar 2019)
http://answers.opencv.org/question/209787/extracting-the-essential-matrix-from-the-fundamental-matrix/

Hello everybody,
today I have a question for you all.
First of all, I've searched across this forum, the OpenCV documentation and so on. The answer is probably in there somewhere, but at this point I need some clarification, which is why I'm here with my question.
**INTRODUCTION**
I'm implementing an algorithm that recovers the **calibration** of the cameras well enough to rectify the images properly (to be more clear: it estimates the extrinsic parameters). Most of my pipeline is pretty easy and can be found around the web. Obviously, I don't want to recover the full calibration, just most of it. For instance, since I'm currently working with the KITTI dataset (http://www.cvlibs.net/publications/Geiger2013IJRR.pdf), I assume that **K_00**, **K_01**, **D_00**, **D_01** are known (the camera intrinsics, given in the KITTI calibration file), so the camera matrices and the distortion coefficients are known.
I do the following:
- Starting from the raw distorted images, I apply the undistortion using the intrinsics.
- Extract corresponding points from the **Left** and **Right** images
- Match them using a matcher (FLANN or BFMatcher or whatever)
- Filter the matched points with an outlier rejection algorithm (I checked the result visually)
- Call **findFundamentalMat** to retrieve the fundamental matrix (I use LMedS, since I've already filtered most of the outliers in the previous step)
If I compute the error of the point correspondences by applying `x'^T * F * x = 0`, the result seems good (less than 0.1), so I suppose everything is ok; there are a lot of examples around the web doing exactly that, so nothing new.
Since I want to rectify the images, I need the essential matrix.
**THE PROBLEM**
First of all, I obtain the Essential matrix simply applying the formula (9.12) in HZ book (page 257):
cv::Mat E = K_01.t() * fundamentalMat * K_00;
I then normalize the coordinates to verify the quality of E.
Given two corresponding points (matched1 and matched2), I do the normalization as follows (I obviously apply this to both sets of inliers I've found; this is an example of what I do):
cv::Mat _1 = cv::Mat(3, 1, CV_32F);          // homogeneous pixel coordinate
_1.at<float>(0,0) = matched1.x;
_1.at<float>(1,0) = matched1.y;
_1.at<float>(2,0) = 1;
cv::Mat normalized_1 = (K_00.inv()) * _1;    // normalized image coordinate
So now I have the Essential matrix and the normalized coordinates (which I can convert to Point3f or other structures if needed), so I can verify the relationship `x'^T * E * x = 0` *(HZ page 257, formula 9.11)*, iterating over all the normalized coordinates:
cv::Mat residual = normalized_2.t() * E * normalized_1;
residual_value += cv::sum(residual)[0];
On every execution of the algorithm, the value of the Fundamental matrix **slightly** changes, as expected (the mean error, as mentioned above, is always around 0.01), while the Essential matrix... changes a lot!
I tried to decompose the matrix using the OpenCV SVD implementation (I understand it's not the best; for that reason I'll probably switch to LAPACK, any suggestions?) and here again the constraint that the two nonzero singular values must be equal is not respected, which drives my whole algorithm into a completely wrong estimation of the rectification.
I would also like to test this algorithm with images produced by my own cameras (I have two Allied Vision cameras), but I'm waiting for a high-quality chessboard, so the KITTI dataset is my starting point.
**EDIT** One previous error was in the formula: I computed the residual of E as `x^T * E * x' = 0` instead of `x'^T * E * x = 0`. This is now fixed and the residual error of E seems good, but the Essential matrix I get is very different every time... And after the SVD, the two nonzero singular values don't look as similar as they should.
**EDIT** These are the differing SVD singular values.
cv::SVD produces this result:
>133.70399
>127.47910
>0.00000
while Eigen's SVD produces the following:
>1.00777
>0.00778
>0.00000
Okay, maybe this is not an OpenCV-related problem, but any help is more than welcome.

HYPEREGO