OpenCV Q&A Forum - RSS feed
http://answers.opencv.org/questions/
Copyright OpenCV foundation, 2012-2018.
Mon, 26 Nov 2018 11:04:12 -0600

Wrong rank in Fundamental Matrix
http://answers.opencv.org/question/204100/wrong-rank-in-fundamental-matrix/

Hi guys,
I'm using OpenCV for Python 3 and, following the Mastering OpenCV book, I'm trying to compute the epipoles from many images (a Structure from Motion pipeline).
Many books say that the Fundamental matrix has rank 2, but the OpenCV function returns a rank-3 matrix.
How can I make this right?
import numpy as np
import cv2

orb = cv2.ORB_create()
# find the keypoints and descriptors with ORB
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)
# create BFMatcher object (ORB descriptors are binary, so NORM_HAMMING, not NORM_L2)
bf = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
# Match descriptors.
matches = bf.match(des1, des2)
# Sort them in the order of their distance.
matches = sorted(matches, key=lambda x: x.distance)
pts1 = []
pts2 = []
for m in matches:
    pts2.append(kp2[m.trainIdx].pt)
    pts1.append(kp1[m.queryIdx].pt)
# findFundamentalMat expects point arrays, not Python lists
pts1 = np.float32(pts1)
pts2 = np.float32(pts2)
F, mask = cv2.findFundamentalMat(pts1, pts2, cv2.FM_RANSAC)
# keep only the inlier matches
pts1 = pts1[mask.ravel() == 1]
pts2 = pts2[mask.ravel() == 1]
# F is the Fundamental Matrix
From that code, the output looks like:
Processing image 0 and image 1
rank of F: 3
Processing image 0 and image 2
rank of F: 3
Processing image 0 and image 3
rank of F: 3
Processing image 0 and image 4
rank of F: 2
[...]
Could someone help me? Does anyone have working SfM code using OpenCV?
Thanks in advance.
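For reference, a common post-processing step is to project the estimated matrix onto the nearest rank-2 matrix by zeroing its smallest singular value. A minimal sketch (the 3x3 matrix below is an arbitrary stand-in, not an actual estimate from the code above):

```python
import numpy as np

def enforce_rank2(F):
    # Zeroing the smallest singular value gives the Frobenius-nearest
    # rank-2 matrix to F.
    U, S, Vt = np.linalg.svd(F)
    S[2] = 0.0
    return U @ np.diag(S) @ Vt

# Arbitrary full-rank 3x3 matrix standing in for an estimated F:
F = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 7.0],
              [2.0, 1.0, 5.0]])
F2 = enforce_rank2(F)
print(np.linalg.matrix_rank(F2))  # 2
```

Note also that a numerically tiny (but nonzero) third singular value can make a rank test report 3 even when the matrix is rank 2 for practical purposes, so it is worth inspecting the singular values themselves.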
Lucas Amparo Barbosa
Mon, 26 Nov 2018 11:04:12 -0600
http://answers.opencv.org/question/204100/

Wrong Epipolar lines, No Visual sanity
http://answers.opencv.org/question/181730/wrong-epipolar-lines-no-visual-sanity/

Hi,
I've tried the code given at https://docs.opencv.org/3.2.0/da/de9/tutorial_py_epipolar_geometry.html to find the epipolar lines, but instead of getting the output shown in the link, I am getting the following output.
![image description](/upfiles/15151486894383214.png)
But when changing the line `F, mask = cv2.findFundamentalMat(pts1,pts2,cv2.FM_LMEDS)` to
`F, mask = cv2.findFundamentalMat(pts1,pts2,cv2.FM_RANSAC)`, i.e. using the `RANSAC` algorithm instead of `LMEDS` to find the Fundamental matrix, this is the output:
![image description](/upfiles/15151489062631591.png)
When the same line is replaced with `F, mask = cv2.findFundamentalMat(pts1,pts2,cv2.FM_8POINT)`, i.e. using the eight-point algorithm, this is the output:
![image description](/upfiles/15151505056498559.png)
None of the above outputs has any visual sanity, nor is any of them anywhere close to the output given in the OpenCV documentation for finding epipolar lines. But ironically, if the same code is executed while changing the algorithm used to find the fundamental matrix in this particular sequence
1. FM_LMEDS
2. FM_8POINT
3. FM_7POINT
4. FM_LMEDS
the most accurate results are generated. This is the output:
![image description](/upfiles/1515152387699454.png)
I thought we were supposed to get the above output in one run of any of the algorithms (with variations in the matrix values and error). Am I running the code incorrectly? What do I have to do to get the correct epipolar lines (i.e. visually sane ones)? I am using OpenCV version 3.3.0 and Python 2.7.
Looking forward to a reply.
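Whichever method is used, one quick way to sanity-check the estimated F before drawing lines is to look at the algebraic epipolar residuals |x2^T F x1| of the matches: they should be small for inliers. A self-contained sketch with made-up points (a pure horizontal shift, whose exact fundamental matrix is known):

```python
import numpy as np

def epipolar_residuals(F, pts1, pts2):
    ones = np.ones((len(pts1), 1))
    x1 = np.hstack([pts1, ones])   # N x 3 homogeneous points, image 1
    x2 = np.hstack([pts2, ones])   # N x 3 homogeneous points, image 2
    # residual_i = x2_i^T @ F @ x1_i
    return np.abs(np.einsum('ij,jk,ik->i', x2, F, x1))

# Example: a pure horizontal shift x2 = x1 + (5, 0) has the exact
# fundamental matrix [e]_x with epipole e = (1, 0, 0):
pts1 = np.array([[10.0, 20.0], [200.0, 150.0], [320.0, 240.0]])
pts2 = pts1 + np.array([5.0, 0.0])
F = np.array([[0.0, 0.0,  0.0],
              [0.0, 0.0, -1.0],
              [0.0, 1.0,  0.0]])
print(epipolar_residuals(F, pts1, pts2))  # ~[0, 0, 0]
```

If the residuals of the masked inliers are large, the problem is in the matches or point arrays fed to findFundamentalMat, not in the choice of method.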
Thank you.
salmankhhcu
Fri, 05 Jan 2018 06:00:03 -0600
http://answers.opencv.org/question/181730/

Inverse normalization in 8-point algorithm for fundamental matrix
http://answers.opencv.org/question/178961/inverse-normalization-in-8-point-algorithm-for-fundamental-matrix/

The 8-point algorithm for the Fundamental matrix normalizes the pixel points before solving the linear system of equations, and the solution is inverse-normalized to get the Fundamental matrix.
I am referring to the source at
https://github.com/opencv/opencv/blob/master/modules/calib3d/src/fundam.cpp
Lines 615 to 620
// apply the transformation that is inverse
// to what we used to normalize the point coordinates
Matx33d T1( scale1, 0, -scale1*m1c.x, 0, scale1, -scale1*m1c.y, 0, 0, 1 );
Matx33d T2( scale2, 0, -scale2*m2c.x, 0, scale2, -scale2*m2c.y, 0, 0, 1 );
F0 = T2.t()*F0*T1;
This appears to be the same as the normalization procedure. I am not able to understand how it is supposed to be the inverse of the normalization. Any help is appreciated.
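One way to see why this step undoes the normalization: the normalization maps pixels to x_hat = T x, and if F0 satisfies x2_hat^T F0 x1_hat = 0 on normalized points, then substituting gives x2^T (T2^T F0 T1) x1 = 0, so `T2.t()*F0*T1` is the fundamental matrix valid on the original pixel coordinates. A numeric sketch with made-up values (not OpenCV's own data):

```python
import numpy as np

# Transforms of the same form as T1/T2 in fundam.cpp: x_hat = T @ x
# (isotropic scaling plus translation, in homogeneous coordinates).
def make_T(scale, cx, cy):
    return np.array([[scale, 0.0, -scale * cx],
                     [0.0, scale, -scale * cy],
                     [0.0, 0.0, 1.0]])

T1 = make_T(0.01, 320.0, 240.0)
T2 = make_T(0.02, 300.0, 200.0)
# Arbitrary F0 standing in for the solution on normalized points:
F0 = np.array([[ 0.2, -1.1,  0.5],
               [ 0.7,  0.3, -0.9],
               [-0.4,  0.8,  0.1]])

x1 = np.array([100.0, 50.0, 1.0])              # original pixel (homogeneous)
l2 = F0 @ (T1 @ x1)                            # epipolar line, normalized image 2
x2_hat = np.array([0.0, -l2[2] / l2[1], 1.0])  # a point on that line
x2 = np.linalg.inv(T2) @ x2_hat                # back to original pixels

F = T2.T @ F0 @ T1
print(abs(x2 @ F @ x1))  # ~0: F satisfies the constraint on un-normalized points
```

So the code is not normalizing again; it composes F0 with T1 and T2^T so that the resulting F absorbs the normalization and can be applied directly to raw pixel coordinates.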
thens
Wed, 22 Nov 2017 22:00:34 -0600
http://answers.opencv.org/question/178961/

Fundamental Matrix Accuracy
http://answers.opencv.org/question/57267/fundamental-matrix-accuracy/

I am working on 3D reconstruction of a scene. I matched the features of two images, so I have keypoints1 and keypoints2, as well as the Fundamental matrix F and the Essential matrix E. Now I have to check X'FX = 0. I know X' is a coordinate in the second image and X is a coordinate in the first image. I have a few questions; can anyone please help me?
Are X' and X the keypoints1 and keypoints2?
Is it the (x, y) coordinate of those keypoints?
The product X'FX will be in Mat format, if I am not wrong. How can this be equal to 0?
Please, can anyone help? Sorry if my question is silly.
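One way to read the constraint: X' and X are a matched pair from keypoints2 and keypoints1, written as homogeneous 3-vectors (x, y, 1). Then X'^T F X is a single number (a 1x1 Mat in C++), not a matrix, and it is exactly 0 only for perfect, noise-free matches; for real inliers it is merely small. A sketch with hypothetical values (F here is the exact matrix of a pure horizontal shift, so the residuals come out as 0):

```python
import numpy as np

F = np.array([[0.0, 0.0,  0.0],
              [0.0, 0.0, -1.0],
              [0.0, 1.0,  0.0]])          # F of a pure horizontal shift
pts1 = [(100.0, 50.0), (220.0, 80.0)]     # keypoint (x, y) in image 1
pts2 = [(105.0, 50.0), (225.0, 80.0)]     # matching keypoint in image 2

for (x, y), (xp, yp) in zip(pts1, pts2):
    X = np.array([x, y, 1.0])             # homogeneous coordinate, image 1
    Xp = np.array([xp, yp, 1.0])          # homogeneous coordinate, image 2
    residual = Xp @ F @ X                 # a scalar, ideally ~0
    print(residual)                       # prints 0.0 for these exact matches
```

With noisy detections the residuals are small but nonzero, which is why accuracy is usually judged by how close they are to 0, not by exact equality.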
Thanks in advance.
SUHAS
Wed, 11 Mar 2015 06:06:43 -0500
http://answers.opencv.org/question/57267/

findFundamentalMat not correctly filtering outliers
http://answers.opencv.org/question/54824/findfundamentalmat-not-correctly-filtering-outliers/

After detecting keypoints and matching them between two images, I run findFundamentalMat to estimate the Fundamental matrix and also to filter the outliers. When I draw the matches using the mask I get from findFundamentalMat, sometimes some matches are not filtered out even though they clearly don't fit the transform.
Here is an example of good filtering (left image from the robot's camera, right image static):
![image description](/upfiles/14235395351249148.png)
But without moving the robot, the matches change a lot from one picture to the next (due to the flickering of the light?), and often one or two wrong matches are left. I suspect those matches cause the inconsistency in my estimated Fundamental matrix, which can look totally different from one image to the next, even without moving the robot.
![image description](/upfiles/14235396878588863.png)
Here the yellow and blue lines clearly don't fit the model. Could they cause the fundamental matrix to go totally wrong?
Mehdi
Mon, 09 Feb 2015 21:43:12 -0600
http://answers.opencv.org/question/54824/