Stereo correspondence between left and right images

asked 2014-02-14 01:45:01 -0600

Julien Seinturier

updated 2014-02-14 01:55:34 -0600

Hello,

I'm using OpenCV to set up a stereo odometry module. The whole module is written in Java, but I'm using OpenCV to detect tracks across successive stereo pairs. Here is my algorithm:

  1. Rectify images using the stereo calibration parameters

  2. Detect matches between the previous and current stereo left images using optical flow

  3. Compute the disparity between the current stereo left and right images

  4. Compute the 3D positions of the 2D points in the current stereo left image that were matched with the previous stereo left image

  5. Project the computed 3D points onto the current stereo right image.

After step 5, I should have 2D matched points in both the current stereo left and current stereo right images.

Step 4 computes the 3D points using this code:

 vector<cv::Point3d> point3dInput;
 point3dInput.reserve(stereoLeftFeatureLocations->size());

 // Build an (x, y, disparity) triple for each tracked left-image feature.
 // NOTE: CV_16S disparity maps from the OpenCV block matchers are fixed-point,
 // scaled by 16.
 for(size_t i = 0; i < stereoLeftFeatureLocations->size(); i++){

   point3dInput.push_back(cv::Point3d((*stereoLeftFeatureLocations)[i].x,
                                      (*stereoLeftFeatureLocations)[i].y,
                                      disp.at<short>((*stereoLeftFeatureLocations)[i].y,
                                                     (*stereoLeftFeatureLocations)[i].x)));

 }

// Compute 3D points from disparity
vector<cv::Point3d> points3d;

perspectiveTransform(point3dInput, points3d, stereoRectQ);

where:

  • stereoLeftFeatureLocations is a pointer to a vector of Point2f that holds the 2D points in the stereo left image obtained from the optical flow computation

  • disp is the disparity map computed between the current rectified stereo left and right images

  • stereoRectQ is the Q matrix produced by the rectification parameter computation.

The obtained 3D points seem valid at this point.

In step 5, I project the 3D points onto the current stereo right image using this code:

cv::Mat decomposedCameraMatrix;

cv::Mat decomposedRotMatrix;

cv::Mat decomposedRotVector;

cv::Mat decomposedTransVector;

cv::Mat decomposedTransVectorEuclidian = cv::Mat(3, 1, CV_64F);

// Decompose stereo right projection matrix
cv::decomposeProjectionMatrix(getStereoRightProjectionMatrix(), decomposedCameraMatrix,
                              decomposedRotMatrix, decomposedTransVector);

// From rotation matrix to rotation vector
cv::Rodrigues(decomposedRotMatrix, decomposedRotVector);

// From homogeneous to Euclidean translation vector (decomposedTransVector is 4x1)
decomposedTransVectorEuclidian.at<double>(0, 0) =   decomposedTransVector.at<double>(0, 0) 
                                                  / decomposedTransVector.at<double>(3, 0);

decomposedTransVectorEuclidian.at<double>(1, 0) =   decomposedTransVector.at<double>(1, 0) 
                                                  / decomposedTransVector.at<double>(3, 0);

decomposedTransVectorEuclidian.at<double>(2, 0) =   decomposedTransVector.at<double>(2, 0) 
                                                  / decomposedTransVector.at<double>(3, 0);

cv::projectPoints(points3d, decomposedRotVector, decomposedTransVectorEuclidian, 
                  decomposedCameraMatrix, distortion, rightProjectedPoints);

My problem is that the projected points seem to lie at the same coordinates in the left and right images (as if the translation between the cameras were not being applied).

Could anyone help me? Maybe my method for stereo matching is flawed, but I have no other ideas at this time.

Thanks,

Julien
