OpenCV Q&A Forum RSS feed (http://answers.opencv.org/questions/), OpenCV answers. Copyright OpenCV foundation (http://www.opencv.org), 2012-2018. Last build: Mon, 04 Mar 2019 11:56:14 -0600

**Extracting the Essential matrix from the Fundamental matrix**
(http://answers.opencv.org/question/209787/extracting-the-essential-matrix-from-the-fundamental-matrix/)

Hello everybody,
today I have a question for you all.
First of all, I've searched across this forum, the OpenCV documentation, and elsewhere. The answer is probably buried in one of those places, but at this point I need some clarification, which is why I'm asking here.
**INTRODUCTION**
I'm implementing an algorithm that recovers part of the cameras' **calibration** so that the images can be properly rectified (more precisely, it estimates the extrinsic parameters). Most of my pipeline is fairly standard and can be found around the web. I don't want to recover the full calibration, only part of it. Since I'm working with the KITTI dataset (http://www.cvlibs.net/publications/Geiger2013IJRR.pdf), I assume that **K_00**, **K_01**, **D_00**, **D_01** are known (camera intrinsics, given in the KITTI calibration files), so the camera matrices and the distortion coefficients are known.
I do the following:
- Starting from the raw distorted images, apply undistortion using the intrinsics.
- Extract feature points from the **Left** and **Right** images.
- Match them using a matcher (FLANN, BFMatcher, or whatever).
- Filter the matched points with an outlier-rejection algorithm (I checked the result visually).
- Call **findFundamentalMat** to retrieve the fundamental matrix (I use LMedS, since I've already filtered most of the outliers in the previous step).
If I evaluate the epipolar constraint `x'^T * F * x = 0` over the point correspondences, the residual seems good (less than 0.1), and I assume everything is fine; there are plenty of examples around the web that do exactly this, so nothing new.
Since I want to rectify the images, I need the essential matrix.
**THE PROBLEM**
First of all, I obtain the Essential matrix simply by applying formula (9.12) in the HZ book (page 257):
cv::Mat E = K_01.t() * fundamentalMat * K_00;
I then normalize the coordinates to verify the quality of E.
Given two corresponding points (matched1 and matched2), I normalize as follows (I apply this to both sets of inliers that I've found; this is an example of what I do):
cv::Mat _1 = cv::Mat(3, 1, CV_32F);
_1.at<float>(0,0) = matched1.x;
_1.at<float>(1,0) = matched1.y;
_1.at<float>(2,0) = 1;
cv::Mat normalized_1 = (K_00.inv()) * _1;
So now I have the Essential Matrix and the normalized coordinates (I can eventually convert to Point3f or other structures), so I can verify the relationship `x'^T * E * x = 0` *(HZ page 257, formula 9.11)*, iterating over all the normalized coordinates:
cv::Mat residual = normalized_2.t() * E * normalized_1;
residual_value += cv::sum(residual)[0];
On every execution of the algorithm, the value of the Fundamental Matrix changes **slightly**, as expected (and the mean error, as mentioned above, stays around 0.01), while the Essential Matrix... changes a lot!
I tried to decompose the matrix using the OpenCV SVD implementation (I understand it is not the best; for that reason I'll probably switch to LAPACK, any suggestion?) and, again, the constraint that the two non-zero singular values must be equal is not respected, and this drives my whole algorithm to a completely wrong estimate of the rectification.
I would like to test this algorithm also on images produced with my own cameras (I have two Allied Vision cameras), but I'm waiting for a high-quality chessboard, so the KITTI dataset is my starting point.
**EDIT** One earlier mistake was in the formula: I computed the residual of E as `x^T * E * x' = 0` instead of `x'^T * E * x = 0`. This is now fixed and the residual error of E looks good, but the Essential matrix I get every time is very different... And after the SVD, the two singular values are not as similar as they should be.
**EDIT** This is the different SVD singular value result:
cv::SVD produces this result:
>133.70399
>127.47910
>0.00000
while Eigen's SVD produces the following:
>1.00777
>0.00778
>0.00000
Okay, maybe this is not an OpenCV-related problem, but any help is more than welcome.

(Asked by HYPEREGO, Mon, 04 Mar 2019 11:56:14 -0600, http://answers.opencv.org/question/209787/)

**Essential matrix 6x3 (expecting 3x3)**
(http://answers.opencv.org/question/67951/essential-matrix-6x3-expecting-3x3/)

After calling the [findEssentialMat()](http://docs.opencv.org/3.0-rc1/d9/d0c/group__calib3d.html#ga0c86f6478f36d5be6e450751bbf4fec0) function:
essMatrix = findEssentialMat( points1, points2, 1.0, Point2d( 0, 0 ), RANSAC, 0.999, 3.0 );
I get a 6x3 matrix. Next I would like to use the [recoverPose()](http://docs.opencv.org/3.0-rc1/d9/d0c/group__calib3d.html#ga40919d0c7eaf77b0df67dd76d5d24fa1) function, which requires a 3x3 essential matrix as input, so an "assertion failed" error shows up. Why is my essential matrix not 3x3, and what can I do?
Console output:
points1:
[496.24762, 215.24762;
496, 503.5;
783.99518, 503.50491;
783.75238, 215.24762;
639.99756, 359.25223]
points2:
[256.5, 256.5;
256.5, 256.5;
256.5, 256.5;
256.5, 256.5;
0, 0]
essMatrix.rows() = 6
essMatrix.cols() = 3
essMatrix:
[1.495439070665867e-08, 0.0004217709137173954, 0.3456540852386693;
0.0004236403216190259, 2.044583490213094e-06, 0.6168655278608869;
0.346120572048197, 0.6166038972684622, 2.663582069025173e-06;
1.286638357093461e-09, 0.001502552560190315, 0.3456533986887974;
0.001504059448579667, 2.06088959516535e-06, 0.6168642186647175;
0.3461198575633839, 0.6166026176096243, 2.118298495767899e-07]
OpenCV Error: Assertion failed (E.cols == 3 && E.rows == 3) in decomposeEssentialMat, file /.../opencv3.0.0rc1_stable/modules/calib3d/src/fivepoint.cpp, line 597
terminate called after throwing an instance of 'cv::Exception' what(): /.../opencv3.0.0rc1_stable/modules/calib3d/src/fivepoint.cpp:597: error: (-215) E.cols == 3 && E.rows == 3 in function decomposeEssentialMat
(Asked by SteveBorges, Wed, 05 Aug 2015 08:09:59 -0500, http://answers.opencv.org/question/67951/)

**How to use fivepoint.cpp**
(http://answers.opencv.org/question/17584/how-to-use-fivepointcpp/)

Hi there,
I noticed that a nice little file has appeared in OpenCV: fivepoint.cpp.
However, I can't figure out how to use the various functions inside. Is there an example available?
If the functions are not mature yet, I would be glad to test them.
Best regards,
Guido

(Asked by Guido, Fri, 26 Jul 2013 07:27:27 -0500, http://answers.opencv.org/question/17584/)

**How to enable findEssentialMat in opencv 2.4.9?**
(http://answers.opencv.org/question/9590/how-to-enable-findessentialmat-in-opencv-249/)

Hi, I've been trying to use the new findEssentialMat() function in OpenCV 2.4.9, but when I compile my program it says that findEssentialMat is not defined. I include calib3d and I also link the proper library.
How should I compile OpenCV to enable the function?
This is my program:
#include "opencv2/opencv.hpp"
using namespace std;
using namespace cv;
Mat getEssential(const vector<KeyPoint>& keypoints1,const vector<KeyPoint>& keypoints2,vector<DMatch>& matches){
vector<Point2f> p1, p2;
for (vector<DMatch>::const_iterator it= matches.begin();it!= matches.end(); ++it) {
float x=keypoints1[it->queryIdx].pt.x;
float y=keypoints1[it->queryIdx].pt.y;
p1.push_back(Point2f(x,y));
x=keypoints2[it->trainIdx].pt.x;
y=keypoints2[it->trainIdx].pt.y;
p2.push_back(Point2f(x,y));
}
Mat output;
Mat essen = findEssentialMat(p1,p2,focal,pp,CV_RANSAC,0.99,1,output);
vector<DMatch> inliers;
for(int i=0;i<output.rows;i++){
int status=output.at<char>(i,0);
if(status==1){
inliers.push_back(matches[i]);
}
}
matches=inliers;
return essen;
}
int main(){
Ptr<FeatureDetector> fast = new FastFeatureDetector(10,true);
Ptr<FeatureDetector> detector = new PyramidAdaptedFeatureDetector(fast,3);
FREAK freak(true,true,22.0f,0);
BFMatcher matcher(NORM_HAMMING,true);
vector<DMatch> matches;
vector<KeyPoint> kp0,kp1;
Mat d0, d1;
Mat im0 = imread("/home/Chini/im0.png",0);
Mat im1 = imread("/home/Chini/im1.png",0);
detector->detect(im0,kp0,Mat());
detector->detect(im1,kp1,Mat()); // note: the original had kp0 here, which looks like a typo
freak.compute(im0,kp0,d0);
freak.compute(im1,kp1,d1);
matcher.match(d0,d1,matches);
Mat e = getEssential(kp0,kp1,matches);
}
When I try to compile it I receive the following message:
example.cpp: In function ‘cv::Mat getEssential(const std::vector<cv::KeyPoint>&, const std::vector<cv::KeyPoint>&, std::vector<cv::DMatch>&)’:
example.cpp:18:62: error: ‘findEssentialMat’ is not defined
Thanks in advance.

(Asked by RaulPL, Mon, 18 Mar 2013 23:29:57 -0500, http://answers.opencv.org/question/9590/)

**OpenCV Stereo Matching Essential Matrix weird values**
(http://answers.opencv.org/question/6285/opencv-stereo-matching-essential-matrix-weird-values/)

I have a stereo setup using OpenCV and two webcams (the one from the book Learning OpenCV by Bradski & Kaehler). I computed the essential and fundamental matrices, intrinsics, extrinsics, etc. using the BM correspondence algorithm. Now I want to find the matching point, in the right image, of a pixel in the left image. To do this I have defined the following function, which is incomplete since my primary aim is to calculate real-world distance.
void StereoVision::findEpipolarLineForXY(int x, int y ,int lr)
{
if(calibrationDone)
{
CvPoint3D32f p1={x,y,1};
qDebug("%d,_,_,%d",p1.x,p1.y);
CvMat pt1=cvMat(3,1,CV_64FC1,&p1);
qDebug("");
CvMat e=_E;
qDebug("pt1:");
PrintMat(&pt1);
qDebug("e:");
PrintMat(&e);
//CvMat * corLine;
//CvMat* pt2=e*pt1;
CvMat *pt2 = cvCreateMat( e.rows, pt1.cols, CV_64FC1);
qDebug("pt2:");
PrintMat(pt2);
qDebug("%d->%d",pt2->rows,pt2->cols);
cvMatMul( &e, &pt1, pt2 );
qDebug("%d->%d",pt2->cols,pt2->data);
//const CvMat* f=&_F;
qDebug("");
//cvComputeCorrespondEpilines(&mat,lr,f,corLine);
qDebug("");
//qDebug("%d,,,%d",corLine->height,corLine->rows);
}
}
void StereoVision::PrintMat(CvMat *A)
{
int i, j;
for (i = 0; i < A->rows; i++)
{
QDebug dbg(QtDebugMsg);
dbg<<"\n";
switch (CV_MAT_DEPTH(A->type))
{
case CV_32F:
case CV_64F:
for (j = 0; j < A->cols; j++)
dbg <<"%8.3f "<< ((float)cvGetReal2D(A, i, j));
break;
case CV_8U:
case CV_16U:
for(j = 0; j < A->cols; j++)
dbg <<"%6d"<<((int)cvGetReal2D(A, i, j));
break;
default:
break;
}
dbg.~QDebug();
}
qDebug("");
}
I want to know why the essential matrix is a bad one. All the output is below:
350,,,317
0,,,1081466880

pt1:
%8.3f 350
%8.3f 317
%8.3f 1
e:
%8.3f 0 %8.3f inf %8.3f 0
%8.3f 0 %8.3f 0 %8.3f 0
%8.3f 0 %8.3f 0 %8.3f 0
pt2:
%8.3f inf
%8.3f inf
%8.3f inf
3>1
1>44201616
Also I'd like to know if I'm on the right path to find the 3D distance of the pixel in real-world coordinates.

(Asked by dramaticlook, Sun, 20 Jan 2013 03:55:13 -0600, http://answers.opencv.org/question/6285/)

**5-point algorithm in opencv?**
(http://answers.opencv.org/question/2603/5-points-algorithm-in-opencv/)

Hi there,
I was wondering whether it is planned to add the 5-point algorithm for essential matrix computation to opencv? If not, are there people around here interested in cooperating to add this feature to opencv?
Bests,
Guido

(Asked by Guido, Tue, 25 Sep 2012 09:59:00 -0500, http://answers.opencv.org/question/2603/)