solvePnP issue with sudden rotation change, with occluded point/s

asked 2017-04-03 17:47:27 -0600 by kyranf

updated 2017-04-04 06:17:40 -0600

I have a problem where I am getting drastically different poses from solvePnP if the points list includes a point that in reality should be occluded, but sometimes peeks through (it's an optical sensor, very sensitive).

I have a camera matrix that is just an identity matrix, and a bunch of known 3D model coordinate points for the sensors on a solid object. The sensors provide their own positions in the 'camera's' image perspective.
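
For context, the call looks roughly like this (illustrative variable names, not my exact code):

    std::vector<cv::Point3f> modelPoints; // known 3D sensor coordinates on the object (mm)
    std::vector<cv::Point2f> imagePoints; // matching positions reported in the 'camera' image
    cv::Mat K = cv::Mat::eye(3, 3, CV_64F); // identity camera matrix
    cv::Mat rvec, tvec;
    cv::solvePnP(modelPoints, imagePoints, K, cv::noArray(), rvec, tvec);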

My sensors are arranged in two parallel rings, 440 mm apart. Our 'camera' sees the sensors from one direction only, meaning only the sensors on the side of the rings facing the 'camera' are normally visible.

My data below shows that cycle '6' has the following image points and 3D points (mm units); the unique sensor ID is the first number:

Format: ID, image point, model point.

    1,[0.122001, 0.0337334],[-56.31, -27.12, 0]
    2,[0.135507, 0.0344581],[-38.97, -48.86, 0]
    3,[0.0428851, 0.0347298],[13.91, 60.93, 0]
    4,[0.0472973, 0.0344505],[-13.91, 60.93, 0]
    5,[0.0595242, 0.0333484],[-38.97, 48.86, 0]
    6,[0.0791165, 0.0331144],[-56.31, 27.12, 0]
    8,[0.0790406, 0.033673],[56.31, 27.12, 0]
    15,[0.141493, 0.389969],[-13.91, -60.93, -440]
    16,[0.136751, 0.397388],[-38.97, -48.86, -440]
    17,[0.101998, 0.407393],[-62.5, 0, -440]
    26,[0.0415029, 0.387616],[13.91, 60.93, -440]
    Sensors ready: 11

T-vec:

[121.287298603187;
 43.82786025370395;
 1268.803812947211]

R-vec after conversion to Euler angles using Rodrigues, doing this:

cv::Rodrigues(RotVec, rot); // rotation vector -> 3x3 rotation matrix
rvec_euler = cv::RQDecomp3x3(rot, output_a, output_b); // returns the Euler angles in degrees

rvec_euler = [-2.22604, -86.8052, 92.9033]

solvePnP output pose (units in metres and degrees); note that I also applied a negative sign to roll and yaw:

    x: 0.121287, y: -0.043828, z: 1.268804, r: 2.226044, p: -86.805202, y: -92.903265

Then I have cycle '7'. In this cycle the data doesn't contain sensor ID 8, which happens to be "behind" the other sensors, on the other side of the object facing away from the 'camera' in this scenario.

    1,[0.122055, 0.0337258],[-56.31, -27.12, 0]
    2,[0.135553, 0.0344731],[-38.97, -48.86, 0]
    3,[0.0430438, 0.0347223],[13.91, 60.93, 0]
    4,[0.0471538, 0.0344656],[-13.91, 60.93, 0]
    5,[0.0595696, 0.0333635],[-38.97, 48.86, 0]
    6,[0.0790861, 0.0330465],[-56.31, 27.12, 0]
    15,[0.141408, 0.389986],[-13.91, -60.93, -440]
    16,[0.136812, 0.397423],[-38.97, -48.86, -440]
    17,[0.101968, 0.407419],[-62.5, 0, -440]
    26,[0.0415104, 0.387521],[13.91, 60.93, -440]
    Sensors ready: 10

T-vec:

[116.5373520148447;
 44.7917891685647;
 1274.362770182497]

rvec_euler = [-58.9083, -82.1685, 149.584]

Pose for cycle 7 (units in metres and degrees):

x: 0.116537, y: -0.044792, z: 1 ...

Comments

You should add some figures to help understand your configuration. I don't know the details of how the different solvePnP methods work, but it is plausible that a point behind the camera could lead to false results. Some tests (see the sketch after this list):

  • try with solvePnPRansac() to try to eliminate this outlier
  • if you have an assumption about the camera pose (e.g. the previous camera pose), you can try useExtrinsicGuess=true with flags = SOLVEPNP_ITERATIVE, passing rvec and tvec values close to the current camera pose
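
A minimal sketch of both suggestions (the point vectors are placeholders for your matched data, and the RANSAC threshold is only a guess to tune):

    std::vector<cv::Point3f> objectPoints;   // your matched 3D model points
    std::vector<cv::Point2f> imagePoints;    // your matched 2D image points
    cv::Mat K = cv::Mat::eye(3, 3, CV_64F);  // identity camera matrix, as in the question
    cv::Mat rvec, tvec;
    std::vector<int> inliers;

    // 1. RANSAC can reject an outlier such as the occasionally visible sensor 8.
    // The reprojection threshold is in the same units as the image points; with
    // normalized coordinates (identity K) it must be far smaller than the default
    // 8.0 pixels, hence 0.01 here.
    cv::solvePnPRansac(objectPoints, imagePoints, K, cv::noArray(),
                       rvec, tvec, false, 100, 0.01f, 0.99, inliers);

    // 2. Iterative refinement seeded with a pose estimate: rvec and tvec must
    // already hold values close to the current camera pose.
    cv::solvePnP(objectPoints, imagePoints, K, cv::noArray(),
                 rvec, tvec, true, cv::SOLVEPNP_ITERATIVE);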

Yes, in my opinion you can exclude points that should not be visible from the camera (points behind the camera, not only points occluded by another object), if you can.

You assume a light ray to project the points; a point behind the camera should not be visible, in my opinion.

Eduardo ( 2017-04-04 03:58:40 -0600 )

@Eduardo thanks, I'd like to add some images that show the 'image points' and show you visually the difference between cycles 6 and 7, where sensor 8 is seen basically 'behind' one of the other sensor points. In no situation can a sensor point exist behind the camera, by the way; sensor points only exist in front, within the field of view (theoretically 0-180 degrees in both x and y FOV). I can't add images directly to the question because I don't have enough karma yet. I will upload to Imgur.

kyranf ( 2017-04-04 06:06:24 -0600 )

Images: sensor setup, to show what I mean by rings.
http://imgur.com/dPQJUVU

Cycle 6 debug image, which includes sensor 8 (it is shown basically right on top of sensor 6 in the image plot): http://imgur.com/t3x3sY4

Cycle 7 debug image, which does not have sensor 8 and shows a vastly different rotation vector: http://imgur.com/i1mk1Ur

kyranf ( 2017-04-04 06:17:46 -0600 )

It is much clearer with the figures. So your model is 2 * 11 points. Be sure that the 3D points are correctly matched with the 2D image points. If so:

  • maybe try another method, SOLVEPNP_EPNP (see the one-liner after this list)
  • maybe try an initial guess with SOLVEPNP_ITERATIVE
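
For instance (same placeholder variables as the sketch in my earlier comment):

    cv::solvePnP(objectPoints, imagePoints, K, cv::noArray(),
                 rvec, tvec, false, cv::SOLVEPNP_EPNP); // EPnP instead of the default iterative method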

The pose estimation problem (PnP problem) is a non-trivial, non-linear problem. To better debug, I would suggest you draw the object frame (like in this tutorial); it makes the estimated rotation much easier to understand.
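
Something along these lines (drawObjectFrame is a hypothetical helper, not an existing OpenCV function):

    #include <opencv2/calib3d.hpp>
    #include <opencv2/imgproc.hpp>
    #include <vector>

    // Draw the object frame by projecting the origin and the three axis
    // endpoints into the image with the estimated pose.
    void drawObjectFrame(cv::Mat& image, const cv::Mat& K, const cv::Mat& distCoeffs,
                         const cv::Mat& rvec, const cv::Mat& tvec, float length)
    {
        std::vector<cv::Point3f> axes = {
            {0, 0, 0}, {length, 0, 0}, {0, length, 0}, {0, 0, length}};
        std::vector<cv::Point2f> proj;
        cv::projectPoints(axes, rvec, tvec, K, distCoeffs, proj);
        cv::line(image, proj[0], proj[1], cv::Scalar(0, 0, 255), 2); // X axis in red
        cv::line(image, proj[0], proj[2], cv::Scalar(0, 255, 0), 2); // Y axis in green
        cv::line(image, proj[0], proj[3], cv::Scalar(255, 0, 0), 2); // Z axis in blue
    }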

If you want to try SOLVEPNP_P3P, note that this method requires exactly 4 points. In this case I would suggest you take 2 points from the top ring and 2 from the bottom ring (widely spread in x). The other flags are ignored in that case. For example:
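
A sketch picking sensors 1 and 3 from the top ring and 15 and 26 from the bottom ring of your cycle-6 data (the ordering must match between the two vectors; K is the identity camera matrix as before):

    std::vector<cv::Point3f> obj4 = {
        {-56.31f, -27.12f, 0.f},    {13.91f, 60.93f, 0.f},      // sensors 1, 3
        {-13.91f, -60.93f, -440.f}, {13.91f, 60.93f, -440.f}};  // sensors 15, 26
    std::vector<cv::Point2f> img4 = {
        {0.122001f, 0.0337334f}, {0.0428851f, 0.0347298f},
        {0.141493f, 0.389969f},  {0.0415029f, 0.387616f}};
    cv::Mat rvec, tvec;
    cv::solvePnP(obj4, img4, K, cv::noArray(), rvec, tvec, false, cv::SOLVEPNP_P3P);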

Eduardo ( 2017-04-04 08:25:30 -0600 )

Thanks Eduardo. And yeah, technically it's 2 * 14, for a total of 28 sensors; my sketch was not accurate about the exact number of sensors per ring. I've double-checked the 2D -> 3D coordinate matching, and that all looks fine. I'll check out the P3P method; it shouldn't be too much effort to find the top and bottom sensors furthest apart to provide to it.

kyranf ( 2017-04-04 08:54:30 -0600 )