
Projecting points using OpenCV

asked 2017-10-18 03:38:17 -0600 by Ahmed (updated 2017-10-18 03:52:27 -0600)

I'm trying to do ray-plane intersection for several points: I cast rays through image points, intersect them with a plane in world coordinates, and then project the intersection points from world coordinates back into image coordinates. However, the projected points come out in the range 0.4, 0.1, 0.5, etc., rather than in pixels.

Here is what I'm doing; I hope you can spot the error:

      Mat cameraIntrinsics( 3, 3, CV_32F );   // camera intrinsic matrix K

      cameraIntrinsics.at<float>( 0, 0 ) = 1.6003814935684204;
      cameraIntrinsics.at<float>( 0, 1 ) = 0;
      cameraIntrinsics.at<float>( 0, 2 ) = -0.0021958351135253906;
      cameraIntrinsics.at<float>( 1, 0 ) = 0;
      cameraIntrinsics.at<float>( 1, 1 ) = 1.6003814935684204;
      cameraIntrinsics.at<float>( 1, 2 ) = -0.0044271680526435375;
      cameraIntrinsics.at<float>( 2, 0 ) = 0;
      cameraIntrinsics.at<float>( 2, 1 ) = 0;
      cameraIntrinsics.at<float>( 2, 2 ) = 1;


      Mat invCameraIntrinsics = cameraIntrinsics.inv();

      std::vector<cv::Point3f> points3D;
      std::vector<Ray> rays;
      for ( int i = 0; i < corners.size(); i++ )
      {
        cv::Point3f pt;

        pt.z = -1.0f;

        pt.x = corners[i].x;
        pt.y = corners[i].y;

        points3D.push_back( pt );

        Ray ray;

        ray.origin = Vec3f( 0, 0, 0);
        ray.direction = Vec3f( pt.x, pt.y, pt.z );

        rays.push_back( ray );
      }

      std::vector<cv::Point3f> pointsTransformed3D;


      cv::transform( points3D, pointsTransformed3D, invCameraIntrinsics );


      std::vector<cv::Vec3f> contacts;

      for ( int i = 0; i < pointsTransformed3D.size(); i++ )
      {
        Vec3f pt( pointsTransformed3D[i].x, pointsTransformed3D[i].y, pointsTransformed3D[i].z );

        cv::Vec3f contact;
        std::pair<bool, double> test = linePlaneIntersection( contact, rays[i].direction, rays[i].origin, Vec3f( 0, 1, 0 ), pt );
        if ( test.first )
        {
          contact = rays[i].origin + rays[i].direction * test.second;   // point at parameter t along the ray
          contacts.push_back( contact );
        }
      }


      Mat rotationMatrix( 3, 3, CV_32F );

      rotationMatrix.at<float>( 0, 0 ) = 0.9115078799790896;
      rotationMatrix.at<float>( 0, 1 ) = -0.1883612409043686;
      rotationMatrix.at<float>( 0, 2 ) = -0.3656137684237178;
      rotationMatrix.at<float>( 1, 0 ) = -0.3046835686704949;
      rotationMatrix.at<float>( 1, 1 ) = 0.2878667580409447;
      rotationMatrix.at<float>( 1, 2 ) = -0.9079100465339108;
      rotationMatrix.at<float>( 2, 0 ) = 0.2762631132059388;
      rotationMatrix.at<float>( 2, 1 ) = 0.9389636694462479;
      rotationMatrix.at<float>( 2, 2 ) = 0.2050022432604093;

      cv::Mat rVec( 3, 1, CV_32F ); // Rotation vector
      Rodrigues( rotationMatrix, rVec );
      double norm = cv::norm( rVec );                  // rotation angle in radians

      float theta = (float)( norm * 180.0 / CV_PI );   // rotation angle in degrees (unused below)

      cv::Mat tVec( 3, 1, CV_32F ); // Translation vector
      tVec.at<float>( 0 ) = 21.408294677734375;
      tVec.at<float>( 1 ) = 531.1319580078125;
      tVec.at<float>( 2 ) = 705.74224853515625;

      cv::Mat distCoeffs = cv::Mat::zeros( 5, 1, CV_32F );   // no lens distortion

      std::vector<cv::Point2d> projectedPoints;
      std::vector < cv::Point3d> ContactPoints;

      for ( int i = 0; i < contacts.size(); i++ )
      {
        cv::Point3d pt;

        pt.x = contacts[i][0];
        pt.y = contacts[i][1];
        pt.z = contacts[i][2];

        ContactPoints.push_back( pt );
      }


      cv::projectPoints( ContactPoints, rVec, tVec, cameraIntrinsics, distCoeffs, projectedPoints );

      for ( size_t i = 0; i < projectedPoints.size(); i++ )
      {
        cv::Point2d pt;

        pt.x = projectedPoints[i].x;
        pt.y = projectedPoints[i].y;

        cv::circle( src, pt, 10, cv::Scalar( 255, 0, 255 ), -1 );
      }

      imshow( "My window", src );

      cv::waitKey( 0 );
      return 0;
    }
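The Ray struct and the linePlaneIntersection helper are not shown in the question; the following is a minimal sketch of what they might look like, reconstructed from the call site above (an assumption, not the asker's actual code):

    #include <opencv2/core.hpp>
    #include <cmath>
    #include <utility>

    struct Ray                      // as used in the question
    {
      cv::Vec3f origin;
      cv::Vec3f direction;
    };

    // Intersect the ray origin + t * direction with the plane through
    // planePoint that has the given normal. Returns {hit, t}; on a hit,
    // the intersection point is written to contact.
    std::pair<bool, double> linePlaneIntersection( cv::Vec3f& contact,
                                                   const cv::Vec3f& direction,
                                                   const cv::Vec3f& origin,
                                                   const cv::Vec3f& normal,
                                                   const cv::Vec3f& planePoint )
    {
      const double denom = normal.dot( direction );
      if ( std::fabs( denom ) < 1e-6 )          // ray parallel to the plane
        return std::make_pair( false, 0.0 );

      const double t = normal.dot( planePoint - origin ) / denom;
      if ( t < 0.0 )                            // plane is behind the ray origin
        return std::make_pair( false, t );

      contact = origin + direction * t;
      return std::make_pair( true, t );
    }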

Comments

Got some sample input that we can run through?

Tetragramm ( 2017-10-18 22:44:22 -0600 )

1 answer


answered 2017-10-20 16:31:29 -0600 by Tetragramm

OK, the obvious problem is that you're doing something weird with the camera intrinsics. You've definitely set those up wrong.

The focal length should be expressed in pixels, and the numbers you have are definitely not in pixels. I would expect focal-length values of roughly 1e3 to 1e4, and the principal point should be close to (cols/2, rows/2).
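For illustration only, a pixel-unit intrinsics matrix would look something like this (the numbers are made up for a hypothetical 640x480 camera with a ~800 px focal length; real values must come from calibration):

    // Hypothetical intrinsics for a 640x480 image; real values come from
    // calibration, e.g. cv::calibrateCamera().
    cv::Mat K = ( cv::Mat_<float>( 3, 3 ) << 800.f,   0.f, 320.f,    // fx,  0, cx (~cols/2)
                                               0.f, 800.f, 240.f,    //  0, fy, cy (~rows/2)
                                               0.f,   0.f,   1.f );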

If you fix that, the part where you multiply by invCameraIntrinsics gets simpler, because you can just use (x, y, 1) as your pt and drop the extra math there, which I think you've also done wrong.
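In other words, the back-projection loop could shrink to something like this sketch (assuming corners holds pixel coordinates and cameraIntrinsics is now in pixel units):

    std::vector<cv::Point3f> points3D;
    for ( size_t i = 0; i < corners.size(); i++ )
    {
      // Homogeneous pixel coordinate (u, v, 1); multiplying by K^-1 maps it
      // onto the normalized image plane, giving the ray direction.
      points3D.push_back( cv::Point3f( corners[i].x, corners[i].y, 1.0f ) );
    }
    std::vector<cv::Point3f> rayDirs;
    cv::transform( points3D, rayDirs, cameraIntrinsics.inv() );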

Then you'll get the results you expect out of projectPoints, because it also expects the camera intrinsics to be in units of pixels.


Comments

Can you post the fixed code, please? Thanks so much for your time and for your help!

Ahmed ( 2017-10-21 09:34:10 -0600 )

No, because I can't calibrate your camera for you. Re-calibrate the camera using the OpenCV functions, as shown in THIS TUTORIAL.

Then get rid of that math around line 269 of the pastebin and just make it pt[0] = corner[i].x, and the same for y.

That should be it.

Tetragramm ( 2017-10-21 11:35:37 -0600 )

I have done what you suggested, but the projected points still don't match the original points: https://pastebin.com/agz7LJmT

Ahmed ( 2017-10-21 11:55:23 -0600 )

Ah, I see. You are passing an rvec and tvec into projectPoints, but you never apply them to pointsTransformed3D. So if you set rvec and tvec to zero, you get back the same points.

Basically, you are asking: "If I moved the camera by rvec and tvec, where would those points be?" Since rvec and tvec are not zero, that is not the same as the values you put in.

Tetragramm ( 2017-10-21 12:25:53 -0600 )
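To make that round trip concrete, here is a sketch reusing the rayDirs and cameraIntrinsics names from the earlier sketch (assumptions, not code from the thread). With rvec = tvec = 0, projectPoints simply applies K again, undoing the earlier K^-1:

    cv::Mat rVecZero = cv::Mat::zeros( 3, 1, CV_32F );   // no rotation
    cv::Mat tVecZero = cv::Mat::zeros( 3, 1, CV_32F );   // no translation
    std::vector<cv::Point2f> reprojected;
    cv::projectPoints( rayDirs, rVecZero, tVecZero, cameraIntrinsics,
                       cv::noArray(), reprojected );
    // reprojected[i] should now match corners[i] up to floating-point error.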

Thanks, it's fixed now. Now I would like to know which points (the cube points) lie on the ground plane, i.e. the table, which has normal (0, 1, 0) and passes through the point (0, 0, 0).

Ahmed ( 2017-10-21 13:38:06 -0600 )
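For that last question: a point p lies on a plane with normal n through point p0 exactly when dot(n, p - p0) = 0; with n = (0, 1, 0) and p0 = (0, 0, 0) this reduces to p.y = 0. A minimal sketch (the function name and tolerance are illustrative, not from the thread):

    #include <cmath>

    // True when p lies on the table plane: normal (0,1,0) through (0,0,0),
    // i.e. when |p.y| is within a small tolerance of zero.
    bool onGroundPlane( const cv::Point3f& p, float eps = 1e-4f )
    {
      return std::fabs( p.y ) < eps;
    }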
