Update Projection matrices for triangulation

I have a stereo vision application that does the following:

Load stereo calibration data from XML.
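Roughly like this (a simplified sketch; the file name and the "P1"/"P2" node names are placeholders for my actual calibration file):

    #include <opencv2/core.hpp>

    // Read the calibrated 3x4 projection matrices from the XML file
    cv::FileStorage fs("stereo_calibration.xml", cv::FileStorage::READ);
    cv::Mat P1, P2;
    fs["P1"] >> P1; // left camera projection matrix
    fs["P2"] >> P2; // right camera projection matrix
    fs.release();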

Find and match features in the left/right frames.
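For example, with ORB and a cross-checked brute-force matcher (a sketch; leftFrame/rightFrame stand in for the grabbed images, and any detector/matcher combination would do):

    #include <opencv2/features2d.hpp>

    cv::Ptr<cv::ORB> orb = cv::ORB::create();
    std::vector<cv::KeyPoint> kpL, kpR;
    cv::Mat descL, descR;
    orb->detectAndCompute(leftFrame, cv::noArray(), kpL, descL);
    orb->detectAndCompute(rightFrame, cv::noArray(), kpR, descR);

    // Hamming norm for binary ORB descriptors; cross-check prunes bad matches
    cv::BFMatcher matcher(cv::NORM_HAMMING, true);
    std::vector<cv::DMatch> matches;
    matcher.match(descL, descR, matches);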

Using the calibration matrices, triangulate points.
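The triangulation itself looks like this (sketch; ptsL/ptsR are the matched pixel coordinates gathered from the matches above):

    #include <opencv2/calib3d.hpp>

    std::vector<cv::Point2f> ptsL, ptsR;
    for (const cv::DMatch& m : matches) {
        ptsL.push_back(kpL[m.queryIdx].pt);
        ptsR.push_back(kpR[m.trainIdx].pt);
    }

    cv::Mat points4D; // 4xN homogeneous points
    cv::triangulatePoints(P1, P2, ptsL, ptsR, points4D);

    cv::Mat points3D; // Euclidean 3D points
    cv::convertPointsFromHomogeneous(points4D.t(), points3D);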

Run solvePnP on the 2D/3D correspondences.
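Something like this (sketch; objectPoints are the triangulated 3D points, imagePoints their 2D observations in the new frame, and K/distCoeffs come from the calibration):

    #include <opencv2/calib3d.hpp>

    cv::Mat rvec, tvec;
    bool ok = cv::solvePnP(objectPoints, imagePoints, K, distCoeffs, rvec, tvec);
    // rvec/tvec map world coordinates into the camera frame;
    // cv::Rodrigues(rvec, RMat) gives the 3x3 rotation matrix used below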

Using the results from solvePnP, UPDATE the projection matrices:

//Update Projection matrices with tvec and rvec

cv::Mat MatrixFunctions::UpdateProjectionMatrix(cv::Mat P, cv::Mat tvec, cv::Mat R)
{
    // Decompose the calibrated 3x4 projection matrix
    cv::Mat Kd(3, 3, CV_32F); // intrinsic parameter matrix
    cv::Mat Rd(3, 3, CV_32F); // rotation matrix
    cv::Mat Td(4, 1, CV_32F); // camera position, homogeneous coordinates
    cv::Mat newP1(3, 4, CV_32F);

    cv::decomposeProjectionMatrix(P, Kd, Rd, Td);

    // Convert the homogeneous position to Euclidean (from cv::sfm)
    cv::Mat T1 = cv::Mat(3, 1, CV_32F);
    homogeneousToEuclidean(Td, T1);

    // Invert the solvePnP pose (world->camera) to get camera->world
    R = R.t();
    tvec = -R * tvec;

    // Add the new t and compose the new R (note the /10 scale factor on tvec)
    T1 = T1 + (tvec / 10);
    Rd = Rd * R;

    projectionFromKRt(Kd, Rd, T1, newP1); // from cv::sfm

    return newP1;
}

I call it like this (tvec and rvec come from solvePnP; P1 and P2 are the calibration P matrices):

    // Create new projection matrices for left and right using the new pose
    cv::Mat updatedP1(3, 4, CV_32F);
    cv::Mat updatedP2(3, 4, CV_32F);
    cv::Mat RMat;
    cv::Rodrigues(rvec, RMat);
    updatedP1 = MatFunc->UpdateProjectionMatrix(P1, tvec, RMat);
    updatedP2 = MatFunc->UpdateProjectionMatrix(P2, tvec, RMat);

Triangulate again.
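Concretely, the second pass is the same call with the updated matrices (reusing the matched points from the sketch above):

    cv::Mat points4Dnew, points3Dnew;
    cv::triangulatePoints(updatedP1, updatedP2, ptsL, ptsR, points4Dnew);
    cv::convertPointsFromHomogeneous(points4Dnew.t(), points3Dnew);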

What I want to see is that the triangulated points stay in the same world space once the camera has moved. Instead, the points move by the amount I would expect the camera to have moved, and they come out warped and very wrong. Am I thinking about this wrong? How can I triangulate the points into the same place with a known camera position?