I am trying to triangulate points from a calibrated pair of fisheye cameras.
I have a calibration that looks pretty good (checked with viewing rectified images).
I have my P1 and P2 matrices from the stereo calibration process. I have my matched orb points in left and right frames (checked with cv::drawMatches).
i run the following:
std::vector<cv::Point2f> pntsMatchedLeft;
std::vector<cv::Point2f> pntsMatchedRight;
DetectKeypointsOrb(left, d1, kp1);
DetectKeypointsOrb(right, d2, kp2);
std::vector<cv::DMatch> matches;
matches = RobustMatching(d1, d2, kp1, kp2);
//check matches by drawing them.
for (int x = 0; x < matches.size(); x++)
{
cv::Point2f pnt2dL = kp1[matches[x].queryIdx].pt;
pntsMatchedLeft.push_back(pnt2dL);
cv::Point2f pnt2dR = kp1[matches[x].trainIdx].pt;
pntsMatchedRight.push_back(pnt2dR);
}
std::vector<cv::Point2f> pntsMatchedLeftUD;
std::vector<cv::Point2f> pntsMatchedRightUD;
cv::fisheye::undistortPoints(pntsMatchedLeft, pntsMatchedLeftUD, K1, D1);
cv::fisheye::undistortPoints(pntsMatchedRight, pntsMatchedRightUD, K2, D2);
cv::Mat point3d_homo;
cv::triangulatePoints(P1,P2,
pntsMatchedLeftUD, pntsMatchedRightUD,
point3d_homo);
assert(point3d_homo.cols == triangulation_points1.size());
// create point cloud
std::vector< Eigen::Vector3f> cloud;
for (int i = 0; i < point3d_homo.cols; i++) {
Eigen::Vector3f point;
cv::Mat p3d;
cv::Mat _p3h = point3d_homo.col(i);
convertPointsFromHomogeneous(_p3h.t(), p3d);
point.x() = p3d.at<double>(0);
point.y() = p3d.at<double>(1);
point.z() = p3d.at<double>(2);
cloud.push_back(point);
}
But the points that i get are, for example:
1.65398e+22 0 0
3.68707e+12 0 0
5.02082e+15 1.42875e-27 0
2.26795e+17 0 0
-3.87201e+15 0 0
-4.601e+24 0 0
-1.35347e+20 0 0
3.52328e+15 -1.29677e-25 0
4.1635e+12 -1.29677e-25 0
-1.08779e+15 -1.29677e-25 0
1-5.17322e+24 0 0
(I would expect no zeros)
What am I doing wrong here? Do i need to undistort the points as I am doing? Skipping that step seems to make minimal difference.
Thank you!