2018-09-11 06:27:26 -0600 | asked a question | Using OpenCL causes application to hang at startup Hi, when building an application on Windows using OpenCV, the appli |
2014-12-04 03:24:25 -0600 | asked a question | Please support our Kickstarter campaign: it makes OpenCV easy. It provides a user-interface wrapper that allows nodal real-time connections and running of OpenCV. No need to code! Loads of features. Please support our Kickstarter campaign! Thanks so much! |
2014-11-19 07:01:56 -0600 | answered a question | Error during Static Library Creation : " undefined reference to symbol 'v4l2_munmap' " |
2014-11-11 13:42:57 -0600 | answered a question | Opencv3 and Algorithm (create, getList, etc.) I'd also like to know what is happening here. I can't find "initModule..." anywhere, and Algorithm::getList() only shows 2 algorithms. What is going on? |
2014-11-11 13:42:52 -0600 | received badge | ● Supporter (source) |
2014-11-10 08:45:30 -0600 | commented question | Has anyone seen this software? It says in the performance section (http://www.apulus.com/performance/) that it uses OpenCV. |
2014-11-09 19:39:29 -0600 | asked a question | Has anyone seen this software? |
2014-06-13 06:21:27 -0600 | asked a question | Stereo calibration using known locations How do you calibrate a camera using known 3D points and their corresponding 2D image locations? Calibrating with a chessboard is easy; is the method the same for real-world points? How do you then get the cameras to output data in that coordinate system, so that a point in the images yields, via triangulation, the same 3D point that was used in the calibration? |
2014-05-15 16:58:31 -0600 | asked a question | OpenCV Stereo Calibration and triangulation in a user defined coordinate system How do you calibrate stereo cameras so that the output of triangulation is in a real-world coordinate system defined by known points? OpenCV stereo calibration returns results with the pose of the left camera as the reference coordinate system. I am currently doing the following: (1) Intrinsically calibrating both the left and right cameras using a chessboard, which gives the camera matrix A and the distortion coefficients for each camera. (2) Running stereo calibration, again using the chessboard, for both cameras. This returns the extrinsic parameters, but they are relative to the cameras, not the coordinate system I would like to use. How do I calibrate the cameras so that known 3D point locations, together with their corresponding 2D pixel locations in both images, provide a way of extrinsically calibrating such that the output of triangulation is in my coordinate system? |
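One common way to answer the question in the last two entries: triangulate in the left-camera frame as OpenCV normally does, then map the result into the desired world frame with a rigid transform estimated from the known 3D correspondences (Kabsch algorithm). Below is a minimal numpy-only sketch of that alignment step; the function name `rigid_transform` and the toy data are illustrative assumptions, not OpenCV API. (Within OpenCV itself, `cv::solvePnP` can also recover a camera's pose directly from 3D-2D correspondences.)

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares rigid transform (R, t) such that R @ src + t ~ dst,
    computed with the Kabsch algorithm. src, dst: (N, 3) corresponding points."""
    src_c = src - src.mean(axis=0)          # center both point sets
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c                     # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t

# Toy check: known world points and the same points expressed in a
# made-up "left camera" frame (rotated 30 deg about Z, then translated).
world = np.array([[0.0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1], [1, 1, 1]])
theta = np.deg2rad(30)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1]])
t_true = np.array([0.5, -0.2, 1.0])
camera = (R_true @ world.T).T + t_true      # world -> camera frame

R, t = rigid_transform(camera, world)       # camera -> world
recovered = (R @ camera.T).T + t
print(np.allclose(recovered, world))        # True
```

With `R` and `t` in hand, every point triangulated in the left-camera frame (e.g. from `cv::triangulatePoints`) can be transformed into the user-defined coordinate system with the same `R @ p + t`.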