2020-11-12 10:25:03 -0600 | received badge | ● Popular Question (source) |
2020-09-24 01:22:27 -0600 | marked best answer | Mat structure for depth map? I am reading an image into a cv::Mat. The Mat is then passed to this function (which gets xyz coordinates from a depth map), which causes an assertion error. I assume this means the Mat I am using is incompatible with the function, but what Mat structure WILL work? Or do I need to convert it somehow before I pass it through? |
2020-07-10 11:35:42 -0600 | received badge | ● Popular Question (source) |
2020-05-07 12:46:47 -0600 | received badge | ● Popular Question (source) |
2019-08-30 04:47:04 -0600 | received badge | ● Popular Question (source) |
2019-04-04 02:41:56 -0600 | received badge | ● Famous Question (source) |
2017-11-21 16:46:44 -0600 | received badge | ● Popular Question (source) |
2017-10-04 14:00:22 -0600 | edited question | 'Mapping' Aruco markers? I have a calibrated camera, and an application that tracks Aruco markers using opencv 3.2. Wha |
2017-10-04 13:43:11 -0600 | asked a question | 'Mapping' Aruco markers? I have a calibrated camera, and an application that tracks Aruco markers using opencv 3.2. Wha |
2017-09-22 07:45:24 -0600 | asked a question | Aruco markers with openCv, get the 3d corner coordinates? I am detecting a printed Aruco marker using opencv 3.2: aruco |
2017-09-17 06:26:59 -0600 | commented answer | aruco giving strange rotations. Hi, yup I did. I ended up using the built-in OpenCV function 'cv2eigen', which works great. Convert 'rvec' to a Mat, then use |
2017-08-20 11:21:04 -0600 | received badge | ● Organizer (source) |
2017-08-20 11:17:16 -0600 | asked a question | Call a cv::Mat from a C++ DLL to a C# picturebox? As above. I have this function in a C++ DLL: If I show the image there, it is correct. The type returns as '16'. In a C# application, I have: and the function to load the images is: This loads the image to the box, but it looks like this: What am I missing? The image size is 640w * 360h; I have tried a couple of other PixelFormats, but see an equally broken image. Thanks! |
2017-06-25 14:27:27 -0600 | asked a question | Manually set up stereo projection matrices? I have a stereo camera system, and need to get my projection matrices in order to triangulate points. As I already have the intrinsics, extrinsics and distortion values, I would like to just plug these in manually to get the P1 and P2 matrices. My images are rectified, so the rotation between the cameras is the identity, and the translation is just the 12cm baseline. I have the following code: which gives me the following: Before I go any further I want to check: am I doing this correctly? Do the returned matrices look correct? Thank you! |
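The code and output in the entry above were lost in scraping, but for an already-rectified pair with identity rotation and a pure x-axis baseline, the projection matrices reduce to P1 = K[I|0] and P2 = K[I|t], with the only nonzero extra term being P2(0,3) = -fx * baseline (OpenCV's stereoRectify convention). A minimal sketch with a hypothetical `makeP` helper and placeholder intrinsics:

```cpp
// Build 3x4 projection matrices for an already-rectified stereo pair,
// assuming identity rotation and a baseline purely along the x-axis.
// fx, fy, cx, cy are the shared intrinsics; tx is the camera's x-offset
// in world units (0 for the left camera, -baseline for the right one,
// following OpenCV's stereoRectify sign convention).
struct Proj { double p[3][4]; };

Proj makeP(double fx, double fy, double cx, double cy, double tx) {
    Proj P = {{{fx, 0.0, cx, fx * tx},
               {0.0, fy, cy, 0.0},
               {0.0, 0.0, 1.0, 0.0}}};
    return P;
}

// Usage (placeholder values, not the asker's calibration):
//   Proj P1 = makeP(fx, fy, cx, cy, 0.0);
//   Proj P2 = makeP(fx, fy, cx, cy, -0.12);  // 12cm baseline, in meters
```

The resulting 3x4 arrays can be loaded into cv::Mat and passed straight to cv::triangulatePoints.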
2017-05-25 10:44:29 -0600 | asked a question | Error using cv::cuda::StreamAccessor::wrapStream I am trying to build some third-party code that uses the CUDA modules. I am down to one last unresolved external symbol error. Yay! This one, though, I am stuck with. All the CUDA code is in a lib, which builds fine. But when I try to compile a sample function that calls the lib, I get: in It seems like it cannot find This seems to live in: which I have linked to. What else could I be missing? My full libs list is: |
2017-05-21 15:16:24 -0600 | commented answer | How does resizing an image affect the intrinsics? Explained perfectly. thank you! |
2017-05-21 14:37:11 -0600 | asked a question | How does resizing an image affect the intrinsics? Hi there, I have calibrated a camera using images at 1280 * 720. The intrinsic data is: So the focal length is: and the center pixels are: Now, to speed up my algorithm, I have resized the images by half, to give a resolution of: My question is: how do I need to adjust my intrinsic matrix to match? I assume I need to halve the cx and cy values, but the focal length stays the same? Is this correct? Thank you. |
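The accepted answer isn't preserved above, but the standard result is that the asker's assumption was half right: fx and fy are expressed in pixels, so they scale with the image just like cx and cy. A minimal sketch (`scaleIntrinsics` and the struct are illustrative names, and the values below are placeholders, not the asker's calibration):

```cpp
// Scale pinhole intrinsics for a resized image. Focal lengths are in
// PIXELS, so they scale along with the principal point: halving the
// image resolution halves fx, fy, cx and cy alike.
struct Intrinsics { double fx, fy, cx, cy; };

Intrinsics scaleIntrinsics(Intrinsics k, double s) {
    return { k.fx * s, k.fy * s, k.cx * s, k.cy * s };
}

// Usage: Intrinsics half = scaleIntrinsics(full, 0.5);
```

Distortion coefficients are dimensionless in OpenCV's model and do not change when the image is resized.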
2017-05-09 14:24:45 -0600 | asked a question | Aruco module, does it have 'Markermap'? Hi, I was using the original aruco code, from here: https://sourceforge.net/projects/aruc... but I am now using the opencv contrib module version, which works great. In the original code, though, there is this concept of a 'markermap', which means you can place markers all over a room, film them, then run the video through aruco to calculate the positions. Once you have this map, you can use the 3d marker positions to localize. My question is: does this exist in the same way in the opencv version? Thanks! |
2017-05-04 15:24:58 -0600 | received badge | ● Notable Question (source) |
2017-05-04 14:52:12 -0600 | asked a question | Aruco module, estimatePoseSingleMarkers looks great, estimatePoseBoard does not. Hi, as the title says... I have a calibrated camera; the calibration is tested by undistorting a frame, and it looks good. When I run the aruco module and detect a single marker, estimating the pose and drawing the axis, it looks great. When I do the same thing with a charuco board, it finds all the markers, but the axis, and therefore the pose, jumps around almost every frame. It is very, very noisy. I am basically using the straight sample code; why would the single markers track well, and the board not? What can I look at to improve the stability of the board pose? Thank you! |
2017-04-26 15:18:52 -0600 | asked a question | interactive charuco application fails... Hi, I have built the sample app. I have a higher-resolution video that I am trying to run, but when I run: in a cmd prompt, I get: The mov is in the same folder as the exe. I have tried converting to an avi, and I get the same issue. Can anyone help me out here? Thanks! |
2017-04-02 14:03:12 -0600 | asked a question | cv::Matx, copy section to another Matx I am porting some code from one project to another. One uses With cvImuMatrix and imuPoseOut are both: I have not used What do these two lines need to be when using cv::Matx44d? (Neither copyTo nor rowRange seems to be available on Matx.) Thank you! |
2017-02-17 13:39:52 -0600 | commented answer | realsense R200, difference between saved images and live stream? Thank you! The conversion to 8bit first solved it for me. |
2017-02-17 10:29:16 -0600 | commented answer | realsense R200, difference between saved images and live stream? Thank you for taking the time to look at this. It was as simple as converting to 8bit. |
2017-02-17 04:57:16 -0600 | commented question | realsense R200, difference between saved images and live stream? Hi, thanks for your response. I am just wondering if there is anything obvious that it could be. Is there any obvious difference between the Mat types in the example code and the code I am using? Is there any other 16-bit Mat type I could try? Thanks again! |
2017-02-16 14:07:38 -0600 | commented question | StereoRectify of non-parallel cameras? Thank you for your response. I do need rectification, but the overlap turned out to be too small for what I need anyway. I have moved to using a different camera setup. Thanks again. |
2017-02-16 14:07:38 -0600 | received badge | ● Commentator |
2017-02-16 14:06:27 -0600 | asked a question | realsense R200, difference between saved images and live stream? Hi, I have a realsense R200 camera, and am using it with stereo odometry. The images are converted to cv::Mat like this: which looks fine, and is from the docs. When I save images out using: and run them through the third-party odometry library by loading the images, like this: everything works great. BUT, when I use a live stream of images, like this: it does not work at all. I suspect this is due to a mismatch in I have tried: But I see no difference. Does anyone have any thoughts on what might be missing here? ...if it helps... I have found an example by the author of the library. They used: Thank you! |
2017-02-13 15:18:49 -0600 | asked a question | StereoRectify of non-parallel cameras? Hi, I have a factory-calibrated camera, whose images I am trying to rectify. The cameras are not parallel to each other; they are at a 72-degree angle, but there is still around a 20% overlap between the frames. Is this why it is failing? I fill the matrices manually from the calibration data, then run stereoRectify as follows: cv::Mat R1, P1, Q, map1x, map1y, R2, P2, map2x, map2y, imgU1, imgU2; Mat CM1; Mat CM2; Mat D1, D2; Mat R, T, E, F; The results are: What can I look at to get a better result? Thank you! |
2017-02-13 14:28:45 -0600 | asked a question | StereoRectify, what type of Mats are needed? Hi, I am trying to run stereo rectification. I have a factory-calibrated camera, and so am using the values provided. I fill the Mats manually, but stereoRectify crashes with the error: My code is: I obviously have an incorrectly formatted Mat, but cannot figure out which one. Thank you! |
2017-02-13 08:47:03 -0600 | commented answer | aruCo module, world space coordinates? Thank you. Exactly what i need to know. |
2017-02-12 13:29:56 -0600 | asked a question | aruCo module, world space coordinates? Hi, I need to move a camera, and know exactly how far it has traveled. I need to do this with a mono camera, or it would be simple using stereo odometry. I am looking into the aruco module in OpenCV, which returns the camera pose. My question is: since the size of the board / marker is known, is the camera translation returned in accurate world-space coordinates, or not? E.g., if I move a meter, will the tvec value reflect that in a repeatable way? Thanks! |