
mnchapel's profile - activity

2017-05-12 03:41:55 -0500 received badge  Self-Learner (source)
2017-05-12 03:23:49 -0500 received badge  Scholar (source)
2017-05-12 03:21:29 -0500 answered a question findEssentialMat give different results according to the number of feature points

I took a look at the OpenCV code, and it seems that only five points are randomly chosen from all the feature points to compute the essential matrix. So I suppose the error depends on which points are chosen: cv::RNG::uniform(0, count) is used to pick them (with count equal to the number of feature points given to findEssentialMat). A priori there is no real solution. I randomly choose six points and, if the essential matrix is not good, I compute it again. (Thanks LBerger for your time.)

2017-05-11 11:29:55 -0500 commented answer findEssentialMat give different results according to the number of feature points

Thanks for your answer. I actually hadn't noticed that I misused the Rodrigues function, so thank you for that. But if you look at the result of "Essai 4", you can see there is still a problem with the translation vector: for all the other tests the translation vector is about [-0.8, 0.6, 0.0], while for Essai 4 it is [0.0, 0.0, -1].

2017-05-11 05:34:29 -0500 received badge  Student (source)
2017-05-11 05:23:34 -0500 commented question findEssentialMat give different results according to the number of feature points

No, I didn't. I added the data and changed the code to read from the .yml files. If you run the code with all the points it works, but if you remove the last two points in static_feature_point_t and static_feature_point_tmdelta, the result is totally different (there are 1241 feature points).

2017-05-10 03:12:53 -0500 commented question findEssentialMat give different results according to the number of feature points

I added it.

2017-05-09 12:30:55 -0500 asked a question findEssentialMat give different results according to the number of feature points

Hello,

I use the cv::findEssentialMat function on a set of feature points (~1200 points) and then cv::triangulatePoints to recover the 3D positions of those feature points. But I have a problem with findEssentialMat: the result seems to change according to the number of points.

For example, if I use 1241 points for one frame, the result is quite good (R = 0.5, 0.5, 0.5 and t = 1, 0, 0), but if I remove only one point the result is totally different (R = 3.0, 2.0, 2.0 and t = 0, 0, 1). I tried removing other feature points: sometimes it works and sometimes it doesn't, and I don't understand why. Is there a reason for that?

#include <opencv2/opencv.hpp>
#include <cmath>
#include <iostream>
#include <vector>

int main()
{
    std::vector<cv::Point2d> static_feature_point_t;
    std::vector<cv::Point2d> static_feature_point_tmdelta;

    // Read the two point sets from file.
    cv::FileStorage fs_t("static_feature_point_t.yml", cv::FileStorage::READ);
    cv::FileStorage fs_tmdelta("static_feature_point_tmdelta.yml", cv::FileStorage::READ);

    cv::FileNode feature_point_t       = fs_t["feature_point"];
    cv::FileNode feature_point_tmdelta = fs_tmdelta["feature_point"];

    read(feature_point_t, static_feature_point_t);
    read(feature_point_tmdelta, static_feature_point_tmdelta);

    fs_t.release();
    fs_tmdelta.release();

    // Camera intrinsics: focal length and principal point.
    double focal = 300.;
    cv::Point2d camera_principal_point(320, 240);

    // Estimate the essential matrix with LMedS.
    cv::Mat essential_matrix = cv::findEssentialMat(static_feature_point_t, static_feature_point_tmdelta, focal, camera_principal_point, cv::LMEDS);

    // Recover the relative pose, then convert the rotation matrix
    // to an axis-angle vector (in degrees) for printing.
    cv::Mat rotation, translation;
    cv::recoverPose(essential_matrix, static_feature_point_t, static_feature_point_tmdelta, rotation, translation, focal, camera_principal_point);
    cv::Mat rot(3, 1, CV_64F);
    cv::Rodrigues(rotation, rot);
    std::cout << "rotation "    << rot * 180. / M_PI << std::endl;
    std::cout << "translation " << translation       << std::endl;

    return 0;
}

The two lists of feature points are here (I didn't find how to upload files to the forum, or whether it is possible).

Thanks,