OpenCV Q&A Forum, RSS feed. Copyright OpenCV foundation, 2012-2018.

Is cv::triangulatePoints() returning 3D points in the world coordinate system?
http://answers.opencv.org/question/118966/

Considering a moving camera with a fixed calibration matrix (intrinsic parameters), I am triangulating tracked feature points from two non-consecutive views. The view poses are given in the camera coordinate system, and the images are undistorted before features are detected and tracked.
Can you please confirm whether the triangulated points are in the world coordinate system after applying the cv::triangulatePoints() and cv::convertPointsFromHomogeneous() functions?

ale_Xompi, Wed, 14 Dec 2016 08:54:28 -0600

Interpret results from triangulatePoints
http://answers.opencv.org/question/104557/

Hi,
I am detecting ArUco markers with a stereo setup and would like to know the 3D coordinates of their corners. I am using triangulatePoints to achieve that (the rig is fully calibrated and I call undistortPoints before triangulating), but I do not understand how to interpret the results. Here is an example:
![left image](/upfiles/14768718144111549.png)
![right image](/upfiles/14768718303093121.png)
As you can see, the markers are detected fine. The results of triangulatePoints are the following:
0.247877 0.0300715 0.501093
0.254448 0.0923606 0.518614
0.181621 0.0959466 0.508083
0.176167 0.0358917 0.50486
0.00881887 0.0501222 0.502481
0.00898725 0.00313973 0.520062
0.0636986 -0.00419561 0.526967
0.0654933 0.0450242 0.509843
0.166304 -0.163573 0.579394
0.225936 -0.172218 0.58141
0.230371 -0.112224 0.581557
0.170264 -0.104482 0.576754
0.0295858 -0.132247 0.574503
0.0318483 -0.16779 0.591691
0.0783909 -0.178207 0.602229
0.0787127 -0.13994 0.583072
0.151794 -0.236149 0.629165
0.102628 -0.229509 0.624732
0.0989285 -0.286064 0.634169
0.151599 -0.29231 0.637837
I tried to plot these in Matlab, and the result looks good:
![Matlab plot](/upfiles/1476872189605449.png)
But I just cannot find any relation to the cameras' positions. I would like the coordinates to be relative to one of the cameras, so that that camera is the origin of the coordinate system. How can I do that? I read that the origin is the optical center of the first camera, but then my x-values should not all be positive, right?

benT, Thu, 13 Oct 2016 09:10:36 -0500

Triangulation origin with stereo system
http://answers.opencv.org/question/90124/

I am using a stereo system, so I am trying to get the world coordinates of some points.
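Both questions above hinge on the same convention: cv2.triangulatePoints returns points in whatever frame the projection matrices are expressed in, which is the first (left) camera's frame when P1 = K[I|0]. All-positive x-values then simply mean the markers sit to the right of that camera's optical axis. A numpy-only sketch of the underlying DLT math, with made-up intrinsics and baseline (none of these numbers come from the posts):

```python
import numpy as np

def triangulate_dlt(P1, P2, x1, x2):
    """Linear (DLT) triangulation; the same math cv2.triangulatePoints uses.
    The result is expressed in whatever frame P1 and P2 are expressed in."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)   # null vector of A is the homogeneous point
    X = Vt[-1]
    return X[:3] / X[3]           # dehomogenize

# Made-up intrinsics and a 10 cm baseline along +X (assumptions for the sketch)
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])                  # left camera at the origin
P2 = K @ np.hstack([np.eye(3), np.array([[-0.1], [0.0], [0.0]])])  # right camera

X_true = np.array([0.05, -0.02, 2.0])   # a point in LEFT-camera coordinates
h = np.append(X_true, 1.0)
x1 = (P1 @ h)[:2] / (P1 @ h)[2]         # its projection in the left image
x2 = (P2 @ h)[:2] / (P2 @ h)[2]         # its projection in the right image

X_rec = triangulate_dlt(P1, P2, x1, x2)
print(X_rec)  # recovers X_true: the output lives in the left camera's frame
```

Because the output is in the left camera's frame, moving it to the right camera's frame is just applying the stereo extrinsics, X_right = R @ X_left + t.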
I calibrate each camera individually, then compute the rotation matrix and translation vector between them, and finally triangulate, but I am not sure where the origin of the world coordinates is. As you can see in my figure, the values correspond to depth, but they should all be close to 400 since the surface is flat. So I suppose the origin is the left camera, which is why the values vary...
![image description](/upfiles/14580381341122406.png)
A piece of my code with my projection matrices and the triangulation call:
import numpy as np
import cv2

# C1 and C2 are the camera matrices (left and right)
# R_0 and T_0 are the rotation and translation between the cameras
# Coord1 and Coord2 are the corresponding left and right image points
P1 = np.dot(C1, np.hstack((np.identity(3), np.zeros((3, 1)))))
P2 = np.dot(C2, np.hstack((R_0, T_0)))
for i in range(Coord1.shape[0]):
    z = cv2.triangulatePoints(P1, P2, Coord1[i,], Coord2[i,])
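Note that cv2.triangulatePoints returns 4x1 homogeneous coordinates, so `z` above still needs to be divided by its fourth component (or passed through cv2.convertPointsFromHomogeneous) before the values can be read as metric depth. A small sketch with hypothetical values:

```python
import numpy as np

# Hypothetical 4x1 homogeneous output, shaped as triangulatePoints returns it
z = np.array([[0.4], [0.2], [2.0], [2.0]])
X = (z[:3] / z[3]).ravel()  # divide by the scale component
print(X)  # Euclidean point, expressed in the left-camera frame
```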
My cameras are at an angle, so the Z axis (the depth direction) is not normal to my surface, and I want the depth measured along the direction normal to the baseline. Do I have to rotate my points?
![image description](/upfiles/14580784518525808.png)
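One way to do that, assuming you know (or measure) the surface normal in the left-camera frame: build the rotation that maps the normal onto the z-axis and apply it to every triangulated point; the rotated z-component is then the depth along the normal. A numpy sketch with a hypothetical 20-degree tilt (the angle and the point are made up):

```python
import numpy as np

def rotation_onto(a, b):
    """Rotation matrix taking unit vector a onto unit vector b (a != -b)."""
    a = a / np.linalg.norm(a)
    b = b / np.linalg.norm(b)
    v = np.cross(a, b)
    c = a @ b
    Vx = np.array([[0.0, -v[2], v[1]],
                   [v[2], 0.0, -v[0]],
                   [-v[1], v[0], 0.0]])
    return np.eye(3) + Vx + (Vx @ Vx) / (1.0 + c)

# Hypothetical surface normal, tilted 20 degrees from the camera's z-axis
n = np.array([0.0, -np.sin(np.radians(20)), np.cos(np.radians(20))])
R = rotation_onto(n, np.array([0.0, 0.0, 1.0]))

X = np.array([0.1, 0.05, 0.4])  # a triangulated point in the left-camera frame
X_aligned = R @ X               # its z-component is now depth along the normal
print(X_aligned[2], X @ n)      # the two depths agree
```

The equality holds because rotations preserve dot products: the z-component after rotation equals the projection of X onto the normal before rotation.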
Thanks for the help ;)

Bilou563, Tue, 15 Mar 2016 05:41:44 -0500

Coordinate axis with triangulatePoints
http://answers.opencv.org/question/71653/

So, I have the projection matrix of my left camera:
![image description](/upfiles/14431709882413459.png)
and the projection matrix of my right camera:
![image description](/upfiles/14431710436527588.png)
And when I perform `triangulatePoints` on the two vectors of corresponding points, I get the collection of points in 3D space.
All of the points in 3D space have a **negative Z coordinate**. I **assume** that the initial orientation of each camera is directed **in the positive Z axis direction**.
My assumption was that OpenCV uses a right-handed coordinate system, like this:
![image description](/upfiles/14431714989453053.png)
![image description](https://upload.wikimedia.org/wikipedia/commons/b/b2/3D_Cartesian_Coodinate_Handedness.jpg)
So, when I positioned my cameras with projection matrices, the complete picture would look like this:
![image description](/upfiles/14431719796004324.png)
But my experiment leads me to believe that OpenCV uses a left-handed coordinate system:
![image description](/upfiles/14431715414653933.png)
And that my projection matrices have effectively messed up the left and right concept:
![image description](/upfiles/14431720286658647.png)
Is everything I've said correct? Is the latter coordinate system really the one that is used by OpenCV?
If I assume that it is, everything seems to work fine. But when I want to visualize things using the `viz` module, its `WCoordinateSystem` widget is right-handed.

acajic, Fri, 25 Sep 2015 04:11:22 -0500
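For what it is worth, OpenCV's camera model is right-handed, with x pointing right, y pointing down, and z pointing forward into the scene; consistently negative triangulated Z usually means the projection matrices place the scene behind the cameras (for example, a sign flip in the extrinsics), not that OpenCV is left-handed. A trivial check that the x-right, y-down, z-forward frame is right-handed:

```python
import numpy as np

# A frame is right-handed iff x cross y = z.
x = np.array([1.0, 0.0, 0.0])  # image x, to the right
y = np.array([0.0, 1.0, 0.0])  # image y, downward
z = np.cross(x, y)
print(z)  # z points forward, so the frame is right-handed
```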