
Rotating target changes distances computed with solvePnP

asked 2016-02-06 18:35:58 -0600 by augustt198

updated 2016-02-08 11:02:10 -0600

I've been working on finding the distance to a target (of known real-world dimensions). The distance seems consistent enough when viewed from a forward-facing angle, but when I rotate the target (without translating it) the distance changes:

[image: distance readout at different target angles]

Here is the relevant code:

vector<Point3f> objectPoints;
objectPoints.push_back(Point3f(-1, -1, 0));
objectPoints.push_back(Point3f(-1, 1, 0));
objectPoints.push_back(Point3f(1, 1, 0));
objectPoints.push_back(Point3f(1, -1, 0));
Mat objectPointsMat(objectPoints);

Mat rvec;
Mat tvec;
solvePnP(objectPointsMat, inputContour, cameraMatrix, distortionCoeffs, rvec, tvec);

double tx = tvec.at<double>(0, 0);
double ty = tvec.at<double>(1, 0);
double tz = tvec.at<double>(2, 0);
double dist = std::sqrt(tx*tx + ty*ty + tz*tz);

char str[32];
snprintf(str, sizeof(str), "DIST = %7.3f", dist);
putText(original, str, Point(0, original.size().height), CV_FONT_HERSHEY_PLAIN,
    3, Scalar(255, 255, 255), 3);

EDIT: As Eduardo suggested, I have drawn the computed pose. I have also changed the object points to match the width/height ratio of the actual target using:

objectPoints.push_back(Point3f(-1, -0.7, 0));
objectPoints.push_back(Point3f(-1, 0.7, 0));
objectPoints.push_back(Point3f(1, 0.7, 0));
objectPoints.push_back(Point3f(1, -0.7, 0));

The issue is that while yawing seems to work now, rolling makes the distance jump:

[image: distance jumping as the target rolls]


Comments

Maybe you could try to display the object frame to see where the computed pose is reprojected in the image. An example of how to do this here and here.

You could also check if the correspondences 3D object points and 2D image points are correct, check if the cameraMatrix and the distortion coefficients are correct.

Eduardo (2016-02-07 11:15:49 -0600)

What units are you using? The width of the rect is 2, but it's not meters. How did you specify the spacing of the features during the intrinsic calibration? Could you also show the reprojection error? (Just apply cv::projectPoints to the 3D points and compare the result to the features you found in the image.)

FooBar (2016-02-08 11:43:34 -0600)

Looking at the rotation vector you rendered in the second image it appears to shift the normal from toward the screen to straight up. I am also working on the same problem (although in python) and noticing the same issue even when describing the whole U shape rather than the bounding corners.

B-money (2016-02-08 12:08:44 -0600)

I can't find a way to send a private message, but could I take a look at your code for how you were able to detect this rectangle? I'm trying to do something similar with a hexagon shape, but I'm having problems.

Pomagalski (2016-02-12 08:44:09 -0600)

1 answer


answered 2016-02-08 12:00:43 -0600

updated 2016-02-08 12:09:56 -0600

The ordering of your extracted points is jumping. If the rect is in the left part of the image, the green axis points straight out of the plane and everything looks fine. In the other cases, the green and red axes look as if they both make a 45° angle with your pattern: solvePnP then thinks the pattern is held with its longer side vertical but tilted.

Could you add labels for the points in inputContour (cv::putText)? If the 2D and 3D points are assigned to each other incorrectly, the pose will just flip.

In one configuration, the height of the object is assumed to be 1.4 at a distance of about 5.4; in the other, the height is assumed to be 2 at a distance of about 7.2. 7.2/5.4 ≈ 2/1.4, which is another hint that this is really the problem.

However, the units really look strange.


Comments

I think the unit is feet?

Eduardo (2016-02-09 05:06:01 -0600)

The contour points are displayed in this gif: http://i.imgur.com/Au6tDQ4.gif Is there a method for consistently ordering the points?

I've also changed the object points to match the size of the real-life target in inches.

augustt198 (2016-02-09 19:07:52 -0600)

@augustt198 Ran into the same issue here. I took my first point (which is assumed to be the top left or top right from the contour detection) and the centroid of the contour. If the first point fell to the right of the centroid, you need to flip the array so you start at the top left.

Next, I looked at the second and last points to determine whether the points were going clockwise or counterclockwise. If the y of the last point was less than that of the second point, the order needs to be reversed while retaining the first point.

After both steps the points were ordered clockwise starting from the top left. Also note that I defined all 8 points, not just the outer 4 corners.

Alternatively, you can detect these two cases and pass in altered versions of the target descriptor to compensate.

B-money (2016-02-12 21:55:58 -0600)
