
percepticat's profile - activity

2018-08-16 02:11:50 -0600 received badge  Popular Question
2015-02-25 01:52:59 -0600 received badge  Student
2013-04-26 12:57:47 -0600 asked a question Inconsistent results from SolvePnp

I run solvePnP with 4 points on very similar inputs and get very different outputs. Here's my code snippet, with the alternative inputs inline:

=============================================================

const Mat ML = cameras.getLeftCameraMatrix();

Mat rvecL;

Mat_<float> tvecL;

Mat rauxL, tauxL;

vector<Point3f> pattern3D;

pattern3D.push_back(Point3f(0, 0, 0));

pattern3D.push_back(Point3f(99.5, 0, 0));

pattern3D.push_back(Point3f(0, 99.5, 0));

pattern3D.push_back(Point3f(148.5, 99.5, 0));

vector<Point2f> points2D;

INPUT OPTION1: points2D.push_back(Point2f(251.2490, 226.5540)); points2D.push_back(Point2f(330.2810, 231.8510)); points2D.push_back(Point2f(246.1260, 304.0430)); points2D.push_back(Point2f(363.5740, 310.1830));

INPUT OPTION2: points2D.push_back(Point2f(251.3380, 226.5890)); points2D.push_back(Point2f(330.2660, 231.8370)); points2D.push_back(Point2f(245.9680, 304.0390)); points2D.push_back(Point2f(363.5830, 310.1890));

INPUT OPTION3: points2D.push_back(Point2f(251.2710, 226.5610)); points2D.push_back(Point2f(330.2800, 231.8220)); points2D.push_back(Point2f(246.0760, 304.0240)); points2D.push_back(Point2f(363.3580, 310.1760));

solvePnP(pattern3D, points2D, ML, Mat(), rauxL, tauxL);

rauxL.convertTo(rvecL, CV_32F); tauxL.convertTo(tvecL, CV_32F); Mat_<float> rotMat(3,3); Rodrigues(rvecL, rotMat);

float tx = tvecL.at<float>(0); float ty = tvecL.at<float>(1); float tz = tvecL.at<float>(2); float rx = rvecL.at<float>(0); float ry = rvecL.at<float>(1); float rz = rvecL.at<float>(2);

Mat_<float> pt(3,4); for(int i=0; i<4; i++){ pt.at<float>(0,i) = pattern3D[i].x; pt.at<float>(1,i) = pattern3D[i].y; pt.at<float>(2,i) = pattern3D[i].z; }

Mat_<float> positions = rotMat*pt; for(int i=0; i<4; i++){ positions.at<float>(0,i) += tx; positions.at<float>(1,i) += ty; positions.at<float>(2,i) += tz; } cout << positions << endl;

===========================================================================

And the three respective outputs:

[-93.012138, 6.0772247, -100.11108, 47.77607; -23.703554, -17.838806, 73.649681, 82.402596; 1379.6711, 1386.538, 1398.9637, 1409.2122]

[-94.372253, 4.9526367, -100.26565, 47.973022; -24.142147, -18.317234, 73.015259, 81.708725; 1413.5717, 1412.6304, 1392.933, 1391.5281]

[-92.997803, 6.0429764, -100.23169, 47.582932; -23.737469, -17.849192, 73.660095, 82.44812; 1380.3485, 1387.8654, 1399.3652, 1410.5839]

As you can see, the reconstructed set of 3D points is similar for the 1st and 3rd inputs and very different for the 2nd one (everything is in millimeters), even though the inputs are almost identical (they differ by less than 0.5 pixel).
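For reference, this is the kind of reprojection check I could run on each result (just a sketch, assuming OpenCV's C++ projectPoints API and the same pattern3D / points2D / ML / rauxL / tauxL as in the snippet above):

#include <opencv2/opencv.hpp>
#include <cmath>
using namespace cv;
using namespace std;

// Mean pixel distance between the measured 2D points and the 3D pattern
// reprojected with the pose (rvec, tvec) returned by solvePnP.
double meanReprojectionError(const vector<Point3f>& pattern3D,
                             const vector<Point2f>& points2D,
                             const Mat& cameraMatrix,
                             const Mat& rvec, const Mat& tvec)
{
    vector<Point2f> reprojected;
    projectPoints(pattern3D, rvec, tvec, cameraMatrix, Mat(), reprojected);

    double err = 0;
    for (size_t i = 0; i < points2D.size(); i++)
    {
        Point2f d = points2D[i] - reprojected[i];
        err += sqrt(double(d.x * d.x + d.y * d.y));
    }
    return err / points2D.size();
}

// e.g.  cout << meanReprojectionError(pattern3D, points2D, ML, rauxL, tauxL) << endl;

If both the "good" and the "bad" poses reproject with sub-pixel error, that would point to a genuine near-ambiguity in the point configuration rather than a numerical bug.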

Any thoughts? Suggestions on how to solve this?

2013-04-18 16:06:17 -0600 commented answer SolvePnp: similar input returns very different output

I repeated the test by translating the camera (very accurately) while keeping the object stationary. I see the same results.

I am thinking that it's maybe related to the configuration of the points used. Imagine that I have 4 points on a plane, laid out as follows:

A(0,0,0) ========== B(10,0,0)
||                    \
C(0,10,0) ================ D(15,10,0)

I can see how a rotation of the object about a line parallel to AB and CD and in between them would still make all the points project to the same 2D positions. So this can create an ambiguity.... right? And if you agree, what would be a good set of points to use?
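To make that concrete, here is a rough numerical check of the intuition (a sketch only, assuming OpenCV's C++ API; the rotation axis, the 30-degree angle, and the use of the pattern centroid are illustrative assumptions, and rvec/tvec are the CV_64F outputs of solvePnP):

#include <opencv2/opencv.hpp>
using namespace cv;
using namespace std;

// Project the pattern with the pose found by solvePnP and with the same pose
// composed with a rotation about an object-frame axis through the pattern
// centroid, then print both projections so they can be compared.
void compareWithRotatedPose(const vector<Point3f>& pattern3D,
                            const Mat& cameraMatrix,
                            const Mat& rvec, const Mat& tvec,  // CV_64F, from solvePnP
                            double angleDeg)
{
    Mat R;
    Rodrigues(rvec, R);  // object-to-camera rotation

    // Rotation of the object about an axis parallel to AB (the object X axis).
    Mat axis = (Mat_<double>(3, 1) << 1, 0, 0);
    Mat rotVec = axis * (angleDeg * CV_PI / 180.0);
    Mat Rdelta;
    Rodrigues(rotVec, Rdelta);

    // Rotate about the pattern centroid: X' = Rdelta*(X - c) + c.
    Scalar m = mean(pattern3D);
    Mat c = (Mat_<double>(3, 1) << m[0], m[1], m[2]);

    // Composed pose: X_cam = R*X' + t = (R*Rdelta)*X + (t + R*(I - Rdelta)*c).
    Mat R2 = R * Rdelta;
    Mat t2 = tvec + R * (Mat::eye(3, 3, CV_64F) - Rdelta) * c;
    Mat rvec2;
    Rodrigues(R2, rvec2);

    vector<Point2f> projOriginal, projRotated;
    projectPoints(pattern3D, rvec, tvec, cameraMatrix, Mat(), projOriginal);
    projectPoints(pattern3D, rvec2, t2, cameraMatrix, Mat(), projRotated);

    for (size_t i = 0; i < projOriginal.size(); i++)
        cout << projOriginal[i] << "  vs  " << projRotated[i] << endl;
}

If the two columns come out nearly identical for the trapezoid above, that would support the ambiguity explanation; if they differ clearly, the jump probably has another cause.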

2013-04-18 14:35:01 -0600 asked a question SolvePnp: similar input returns very different output

Hello,

I am using a planar object (composed of 16 3D points, all with Z=0). When I use solvePnP with the corresponding 16 2D points, I usually get results that make sense. However, if I move my planar object, for example translating it along the Y axis (up-down), at a certain position the output changes abruptly. If I keep moving in the same direction, the reconstructed 3D pose then continues to change smoothly from the new values.

I plotted the 3D positions of my 16 points, and I notice that when the abrupt change happens, the reconstruction is still plausible, except that it is rotated by about 30 degrees about some axis through the middle of the object.

This is highly repeatable, and always happens at about the same point (i.e. the same Y axis location).

Any thoughts why this happens and how it can be avoided?
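One thing I have not tried yet is feeding the previous frame's pose back in as the initial guess via solvePnP's useExtrinsicGuess parameter, so the iterative refinement stays near the last solution instead of re-solving from scratch. A rough, untested sketch (the variable names are placeholders, not from my actual code):

#include <opencv2/opencv.hpp>
using namespace cv;
using namespace std;

// Estimate the pose for one frame, reusing the previous frame's pose as the
// starting point once one is available.
void estimatePose(const vector<Point3f>& objectPoints16,
                  const vector<Point2f>& imagePoints16,
                  const Mat& cameraMatrix,
                  Mat& rvec, Mat& tvec,   // carried over between frames
                  bool& havePreviousPose)
{
    // With useExtrinsicGuess == true, solvePnP refines the pose already stored
    // in rvec/tvec instead of computing an initial estimate on its own.
    solvePnP(objectPoints16, imagePoints16, cameraMatrix, Mat(),
             rvec, tvec, havePreviousPose);
    havePreviousPose = true;
}

This would not remove the ambiguity itself, but it might keep the solver on the same branch as I sweep along the Y axis.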

Thanks!