2017-03-21 04:33:18 -0600 | commented answer | The program to count number of white and black pixels in a binary image is not working.Here I paste the code i used.Please help me with a solution. The image mentioned in the question is binary, so each pixel is either (255,255,255) or (0,0,0) |
2017-03-21 04:30:30 -0600 | commented question | The program to count number of white and black pixels in a binary image is not working.Here I paste the code i used.Please help me with a solution. Can you share how you obtain your binary image? Even so, I totally agree with berak's comment above; your loop works perfectly with an appropriate image. |
2017-03-21 03:20:18 -0600 | commented answer | The program to count number of white and black pixels in a binary image is not working.Here I paste the code i used.Please help me with a solution. Correct. The image should be converted to a single channel first |
2017-03-21 03:10:35 -0600 | answered a question | The program to count number of white and black pixels in a binary image is not working.Here I paste the code i used.Please help me with a solution. You can use the countNonZero function: http://docs.opencv.org/2.4/modules/co... It gives the number of white (non-zero) pixels; for black pixels, do the same with an inverted version of the image (or subtract from the total pixel count) |
2017-03-15 01:41:40 -0600 | received badge | ● Scholar (source) |
2017-03-15 01:41:20 -0600 | received badge | ● Enthusiast |
2017-03-14 11:04:32 -0600 | asked a question | UndistortPoints odd results Hi all, I've calibrated my camera and here are my distortion parameters: [ 7.0576386285112147e-02, -5.0734456409579369e+00,-1.1508247483618957e-02, -3.9730820350519589e-03, 8.0251688016585078e+01 ] My problem is that when I undistort the point (552,320), I get (0.146645,-0.104564). What can be the cause of this? How can I get the undistorted point in pixel coordinates, for example (560,315)? |
2016-03-29 09:50:44 -0600 | answered a question | How do i access UMat pixel by pixel? Have you tried this? umat.getMat(cv::ACCESS_READ).at<uchar>(row, column); |
2016-03-29 09:42:12 -0600 | answered a question | accuracy of *stereoCalibrate()* Try these flags: CV_CALIB_CB_ADAPTIVE_THRESH | CV_CALIB_CB_FILTER_QUADS | CV_CALIB_CB_FAST_CHECK |
2016-03-18 11:07:37 -0600 | commented answer | Stereo vision - Tilted camera and triangulation landmark Not sure. If correct, you can use rvecs1 from the left camera, I guess |
2016-03-18 10:15:03 -0600 | commented answer | Stereo vision - Tilted camera and triangulation landmark If you're using a stereo system, just use cv::stereoCalibrate() to find the rotation matrix and translation vector between the cameras, which will be used in triangulation. cv::calibrateCamera() gives you the relationship between the camera and the outside world, not the relationship between the two cameras |
2016-03-18 10:01:59 -0600 | commented answer | Stereo vision - Tilted camera and triangulation landmark I just realized that you're calibrating your cameras separately using the cv::calibrateCamera() function, aren't you? |
2016-03-16 09:43:25 -0600 | answered a question | what's the coordinate relations between disparity image and the rectified images(the left or right view). The disparity map is calculated w.r.t. the left image, i.e. the (x,y) pixel in the disparity map corresponds to the (x,y) pixel in the left image. On the other hand, the matching pixel in the right image is not directly exposed by OpenCV's Semi-Global Block Matching algorithm as far as I know. You would need to dig into the source code. |
2016-03-16 08:52:44 -0600 | answered a question | Triangulation origin with stereo system The origin is the optical center of Camera 1. See http://www.mathworks.com/help/vision/... |
2016-03-16 08:50:16 -0600 | answered a question | Stereo vision - Tilted camera and triangulation landmark Yes, it is correct. To get constant depth, you could make your cameras parallel, which is not good for depth precision. Another way is to fit a plane to the points you get from triangulation and align this plane with the xy plane, so that its normal points along the z direction. |
2016-03-11 08:24:42 -0600 | commented answer | Is there any way to decrease CPU usage of feature detector? That works. Thank you. |
2016-03-10 10:18:25 -0600 | commented answer | Is there any way to decrease CPU usage of feature detector? Thank you for your answer. However, it didn't solve the problem. Actually, even if it had, it would not be appropriate for my project. Let me give you the big picture: I am trying to implement a project in a pipes & filters fashion using the TBB library. Feature detection is one of the filters. Since OpenCV's feature detection occupies more than 50% of CPU resources, I cannot run more than two filters in parallel, which defeats the main idea of the pipes & filters design. This destroys the performance of my pipeline. Any other recommendations? |
2016-03-10 03:28:54 -0600 | asked a question | Is there any way to decrease CPU usage of feature detector? Hi everybody, The feature detector in OpenCV C++ launches 14 threads and makes heavy use of the CPU (more than 50%). Is there any way to decrease this usage? Simple code: Environment: Intel i7 3770K, 8 GB RAM, Windows 8.1 Pro, Visual Studio 2013 Pro |