2019-04-16 01:11:32 -0600 | received badge | ● Popular Question (source) |
2015-08-22 11:05:07 -0600 | asked a question | recoverPose translation values Hi, I use findEssentialMat and recoverPose (OpenCV 3). I have a small problem with the translation value t. The rotation values (Euler angles) are very accurate, but the t values do not match; maybe I am expecting something else. The t variable gives me three values. Are these three values the relative distances (x, y, z) from camera 1 to camera 2? Is that right? |
2015-08-20 16:22:41 -0600 | commented answer | Position and Rotation of two cameras. Which functions do I need? In what order? Thank you very much. After a long time I was able to continue testing the code. That was a great help. |
2015-06-20 11:18:05 -0600 | received badge | ● Supporter (source) |
2015-06-20 11:14:11 -0600 | answered a question | Position and Rotation of two cameras. Which functions do I need? In what order? Hi, thanks for the answers.
I don't know the (relative) 3D positions of the points. I know the two images, the focal length, and the sensor size of the cameras. I tested with Blender whether this is possible. Blender motion tracking can compute the relative position and rotation of the two cameras. Blender needs 8 points in both images. The result is very accurate, so it is possible. But how? I have found the OpenCV function findFundamentalMat. findFundamentalMat also requires at least 8 points* in both images. This is the same rule as in Blender. *CV_FM_8POINT for the 8-point algorithm, N >= 8; CV_FM_RANSAC for the RANSAC algorithm, N >= 8; CV_FM_LMEDS for the LMedS algorithm, N >= 8. And I found the function stereoCalibrate.
UPDATE 2015.06.29: Hi, thanks for the comments. The animation program Blender can compute the points and the position of the camera. Blender says that you need at least 8 points. I took a look at the function computeCorrespondEpilines. Here is the result. Image 2 looks very good. Image 1 Image 2 I have copied the image into Blender. Here is the result. But what happens next? How can I get the following values: X, Y, Z, roll, pitch, and yaw? Here is my code. |
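Once R and t are available (e.g. from recoverPose), the roll/pitch/yaw angles and the second camera's position can be read off. The sketch below assumes the Z-Y-X (yaw-pitch-roll) convention, R = Rz(yaw) @ Ry(pitch) @ Rx(roll); Blender may order the axes differently, so the convention is an assumption to verify:

```python
import numpy as np

def euler_zyx(R):
    """Roll, pitch, yaw (radians), assuming R = Rz(yaw) @ Ry(pitch) @ Rx(roll).
    One common convention; check which order your target tool uses."""
    pitch = np.arcsin(-R[2, 0])
    roll = np.arctan2(R[2, 1], R[2, 2])
    yaw = np.arctan2(R[1, 0], R[0, 0])
    return roll, pitch, yaw

def camera_center(R, t):
    """Position of camera 2 in camera-1 coordinates, given x2 = R @ x1 + t."""
    return -R.T @ t

# Round-trip check with made-up angles
r, p, y = np.deg2rad([10.0, 20.0, 30.0])
Rx = np.array([[1, 0, 0], [0, np.cos(r), -np.sin(r)], [0, np.sin(r), np.cos(r)]])
Ry = np.array([[np.cos(p), 0, np.sin(p)], [0, 1, 0], [-np.sin(p), 0, np.cos(p)]])
Rz = np.array([[np.cos(y), -np.sin(y), 0], [np.sin(y), np.cos(y), 0], [0, 0, 1]])
R = Rz @ Ry @ Rx
print(np.rad2deg(euler_zyx(R)))  # recovers roll, pitch, yaw of about 10, 20, 30 degrees
```

X, Y, Z would then be the components of `camera_center(R, t)`, but only up to the unknown global scale of the two-view reconstruction.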
2015-06-18 12:27:17 -0600 | received badge | ● Student (source) |
2015-06-18 11:06:42 -0600 | received badge | ● Editor (source) |
2015-06-18 10:50:22 -0600 | asked a question | Position and Rotation of two cameras. Which functions do I need? In what order? I would like to compute the relative position and rotation of two cameras with OpenCV. I use two images with dots. Here are the two images. http://www.bilder-upload.eu/show.php?... http://www.bilder-upload.eu/show.php?... I know the horizontal value (X) and vertical value (Y) of each point in the two images. I would like to compute the relative position and rotation of the two cameras. I tested with Blender whether this is possible. With motion tracking, Blender was able to compute the relative position and rotation. It takes 8 points or more. The result is very accurate. Here is my Blender test 3D view. http://www.bilder-upload.eu/show.php?... I have found many OpenCV functions, but I do not know which ones I need. stereoCalibrate? findFundamentalMat? Which functions do I need? In what order? |