2017-11-17 06:42:12 -0500 received badge ● Notable Question (source)

2017-02-28 13:22:01 -0500 received badge ● Popular Question (source)

2015-07-19 05:33:58 -0500 received badge ● Critic (source)

2015-07-19 05:30:42 -0500 commented question I'm very disappointed about opencv 3
This is not a question. If you submitted a pull request and your build passed all checks, it will have been merged. At least give links to draw attention to your contribution.

2015-07-19 05:16:06 -0500 received badge ● Enthusiast

2015-07-13 15:37:00 -0500 commented question Why do we pass R and P to undistortPoints() fcn (calib3d module)?
I haven't given thought to R, but the purpose of the last parameter is to control whether the resulting image coordinates are normalised (i.e. in the range [-1,1]) or not. If your goal is simply to remove the distortion, but still have the points in pixel coordinates with the origin at the top left (as when you obtain them e.g. with a feature detector), you would pass your original camera matrix for P. Essentially, this argument describes which camera matrix should be valid for the undistorted points. If you leave it blank, the identity matrix is assumed, which is valid for undistorted normalised coordinates. This is important if you use findEssentialMat and recoverPose, which are new in OCV3 and take this parameter as well; it should be equal for all 3 calls.

2015-07-13 03:36:05 -0500 received badge ● Citizen Patrol (source)

2015-07-11 00:49:24 -0500 received badge ● Scholar (source)

2015-07-11 00:49:20 -0500 answered a question undistortPoints, findEssentialMat, recoverPose: What is the relation between their arguments?
As it turns out, my data seemingly is off. By using manually labelled correspondences I determined that Possibilities 1 and 2 are indeed the correct ones, as one would expect.

2015-07-10 06:37:32 -0500 commented answer Strange epipolar lines and 3d reconstruction [OpenCV for Java]
But that's still not right, is it?
I mean, it makes sense in the right image, but in the left image the lines should converge to the right, as the second frame's camera center is further to the right.

2015-07-10 03:46:42 -0500 received badge ● Teacher (source)

2015-07-10 03:37:50 -0500 commented question opencv 3 essentialmatrix and recoverpose
One question is whether the 8 correspondences you have are actually noise-free. In theory, only 5 are required (which should not be coplanar or close to it), but in practice you need many more because the measurements are imprecise.

2015-07-10 03:35:42 -0500 answered a question openCV 3.0 recoverPose wrong results
Well, my impression so far is that translation is computed rather well, but rotation is almost always bullshit, unless there is none between the two frames. But rotation around the z axis like in your case seems to work occasionally. I'm not sure about the equations, but you can use the RQDecomp3x3 function to decompose the matrix into Euler angles:

    Mat mtxR, mtxQ;
    Vec3d angles = RQDecomp3x3(R, mtxR, mtxQ);
    cout << "Translation: " << t.t() << endl;
    cout << "Euler angles [x y z] in degrees: " << angles.t() << endl;

I'm not sure though whether x, y, z is really the order of axes here. Also, you need to pay attention to whether feature coordinates are normalised or not, and possibly try various combinations of input parameters when performing undistortion and then finding E (I've outlined the issue here), so something like this may be required depending on your specific case:

    Mat E = findEssentialMat(imgpts1, imgpts2, 1.0, Point2d(0,0), RANSAC, 0.999, 3, mask);
    correctMatches(E, imgpts1, imgpts2, imgpts1, imgpts2);
    recoverPose(E, imgpts1, imgpts2, R, t, 1.0, Point2d(0,0), mask);

2015-07-10 03:26:15 -0500 commented question How To Import External Library To OpenCV
@openman Compiling opencv is a tedious business; you need to set a lot of options correctly depending on your system, so don't expect it to work out of the box.
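Such a from-source build can be sketched as follows. This is a rough illustration only; the ~/opencv and ~/opencv_contrib paths are hypothetical placeholders for wherever the two repositories were cloned, and the exact options depend on your system:

```shell
# Hypothetical layout: opencv and opencv_contrib cloned side by side.
cd ~/opencv
mkdir -p build
cd build
# Point the OpenCV build at the contrib modules directory.
cmake -D OPENCV_EXTRA_MODULES_PATH=~/opencv_contrib/modules ..
make -j4
```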
2015-07-09 14:24:06 -0500 answered a question How To Import External Library To OpenCV
For version 3.0, you will need to compile from source, including the contrib modules. In CMake, you would set the OPENCV_EXTRA_MODULES_PATH variable to the modules folder of the cloned contrib repo.

2015-07-09 01:03:59 -0500 commented answer Licence of the source code in this book "Mastering OpenCV with Practical Computer Vision Projects"
Please do not use answers to post comments.

2015-07-09 01:02:35 -0500 received badge ● Editor (source)

2015-07-09 01:02:13 -0500 answered a question how to identify excess includes
Does your IDE or editor of choice not show errors when it cannot find a header? If it does, why not just delete the suspects and check whether it still compiles? You could also prepend #warning "I am being included" or some such thing to every header; doing that automatically from the command line is not so straightforward, so you might be better off inserting the line by hand. The #warning directive prints a message when the header is processed, so you will see one message for each header that actually gets included.

2015-07-08 14:38:52 -0500 commented question triangulate to 3-D on corresponding 2-D points
Yup, same here. After much research, it seems to me that almost no one gets this to work.

2015-07-08 06:15:20 -0500 received badge ● Student (source)

2015-07-08 05:45:21 -0500 asked a question undistortPoints, findEssentialMat, recoverPose: What is the relation between their arguments?
TL;DR: What relation should hold between the arguments passed to undistortPoints, findEssentialMat and recoverPose?
I have code like the following in my program:

    Mat mask; // inlier mask
    undistortPoints(imgpts1, imgpts1, K, dist_coefficients, noArray(), K);
    undistortPoints(imgpts2, imgpts2, K, dist_coefficients, noArray(), K);
    Mat E = findEssentialMat(imgpts1, imgpts2, 1.0, Point2d(0,0), RANSAC, 0.999, 3, mask);
    correctMatches(E, imgpts1, imgpts2, imgpts1, imgpts2);
    recoverPose(E, imgpts1, imgpts2, R, t, 1.0, Point2d(0,0), mask);

I undistort the points before finding the essential matrix. The doc states that one can pass the new camera matrix as the last argument; when it is omitted, points are in normalized coordinates (between -1 and 1). In that case, I would expect to pass 1 for the focal length and (0,0) for the principal point to findEssentialMat, as the points are normalized. So I would think this to be the way:

Possibility 1 (normalize coordinates)

    Mat mask; // inlier mask
    undistortPoints(imgpts1, imgpts1, K, dist_coefficients);
    undistortPoints(imgpts2, imgpts2, K, dist_coefficients);
    Mat E = findEssentialMat(imgpts1, imgpts2, 1.0, Point2d(0,0), RANSAC, 0.999, 3, mask);
    correctMatches(E, imgpts1, imgpts2, imgpts1, imgpts2);
    recoverPose(E, imgpts1, imgpts2, R, t, 1.0, Point2d(0,0), mask);

Possibility 2 (do not normalize coordinates)

    Mat mask; // inlier mask
    undistortPoints(imgpts1, imgpts1, K, dist_coefficients, noArray(), K);
    undistortPoints(imgpts2, imgpts2, K, dist_coefficients, noArray(), K);
    double focal = K.at<double>(0, 0);
    Point2d principalPoint(K.at<double>(0, 2), K.at<double>(1, 2));
    Mat E = findEssentialMat(imgpts1, imgpts2, focal, principalPoint, RANSAC, 0.999, 3, mask);
    correctMatches(E, imgpts1, imgpts2, imgpts1, imgpts2);
    recoverPose(E, imgpts1, imgpts2, R, t, focal, principalPoint, mask);

However, I have found that I only get reasonable results when I tell undistortPoints that the old camera matrix shall still be valid (I guess in that case only distortion is removed) and pass arguments to findEssentialMat as if the points were normalized, which they are not.
Is this a bug, insufficient documentation or user error?

Update: It might be that correctMatches should be called with (non-normalised) image/pixel coordinates and the fundamental matrix, not E; this may be another mistake in my computation. It can be obtained by F = K^(-T) * E * K^(-1).