
Issue with calcOpticalFlowPyrLK()

asked 2016-10-08 21:01:26 -0500

patrchri

updated 2016-10-08 21:03:48 -0500


I am trying to implement visual odometry via my webcam and the steps I have followed so far are the following:

  1. Capture video from the webcam (two consecutive frames each loop).
  2. Undistort the frames with the calibration parameters I obtained.
  3. Convert frames to gray scale.
  4. Apply FAST algorithm for feature detection.
  5. And now I am at the calcOpticalFlowPyrLK() step.

My issue is that at step 4, when I print the number of keypoints found in the two consecutive frames, I get different numbers for the two frames (which is logical); but when I print the same counts after the call to calcOpticalFlowPyrLK(), I get exactly the same number.

To be more specific, I have written:

 vector<KeyPoint> kpointsframe1;
 vector<KeyPoint> kpointsframe2;
 vector<Point2f> pointsframe1;
 vector<Point2f> pointsframe2;
 vector<uchar> status;
 vector<float> err;
 Size winSize = Size(21,21);
 TermCriteria termcrit = TermCriteria(TermCriteria::COUNT+TermCriteria::EPS,30,0.01);
 Mat Essential;
 for(int i=0;i<kpointsframe1.size();i++)    pointsframe1.push_back(kpointsframe1[i].pt);
 for(int i=0;i<kpointsframe2.size();i++)    pointsframe2.push_back(kpointsframe2[i].pt);

and I print the following in each loop:


If I comment the line


I get different numbers, but if I leave it as is I get three numbers which are all the same in each loop. All of the above code is inside a while loop. Why does this happen? Shouldn't calcOpticalFlowPyrLK() try to find the matches between the keypoints of the two frames and report the status of their differences, or am I misunderstanding something?

I am new to image processing, so please bear with me for any questions that may arise.

Thank you in advance for your answers and your time,



1 answer


answered 2016-10-08 21:31:54 -0500

Tetragramm

You are supposed to leave pointsframe2 empty; Optical Flow finds the locations of the old points in the new frame. The second point list can be pre-filled, but then it only serves as a starting guess for where the algorithm begins looking. Check out the tutorial for a better explanation.

If you are trying to match lists of points, you should use Descriptor Matching.



Firstly, thank you for your quick reply. So, in other words, Optical Flow gives me the error of the keypoint/point matching, since it searches for the match of the points given from the first frame; am I correct? If it indeed does that, shouldn't it be easy, knowing the fps of my webcam, to estimate from their differences the speed I am moving at in a static environment? I apologize in advance for the many questions, but I am trying to make sure that my thoughts on this are correct and that I have a clear understanding of what is going on as I proceed with monocular odometry.

patrchri ( 2016-10-08 22:19:40 -0500 )

"the Optical Flow gives me the error of the keypoint/point matching, since it searches for the match of the points given from the first frame" Not as such, no. It is important that you realize that in the code you have posted there is nothing resembling keypoint matching. Optical flow merely follows points from one image to another. They need not be keypoints. Keypoint matching can find matches regardless of where the keypoint is in the second image, whereas optical flow is a local process.

"Easy" Ahahaha. No. Nothing involving a monocular camera and the real world is easy. Speed is, in fact, impossible without at least one external reference of distance. You can get it up to a scaling factor, but without a known external distance, you don't know that factor.

Tetragramm ( 2016-10-08 23:26:40 -0500 )

-"Optical Flow finds the location in the new frame, of the old points".

Since I find the keypoints with FAST in the previous frame, convert them to points and then call Optical Flow, wouldn't the function try to find those points in the next frame? By "local process", I guess you mean that it searches locally around each point rather than over the entire image; in other words, it is meant for small displacements and might lose some points or produce false matches?

-"Keypoint matching can find matches regardless of where the keypoint is in the second image, whereas optical flow is a local process."

So you suggest getting rid of Optical Flow and using a global matcher, like brute force for example?

-""Easy" Ahahaha. No."

Lol, what I meant was not the implementation, but the idea of the estimation :)

patrchri ( 2016-10-09 07:55:14 -0500 )

HERE is an explanation of how the Optical Flow method you're using works. In this case, "local" is extended by the maxLevel parameter, which runs the process on resized copies of the image to get a coarse-to-fine answer.

I suggest you continue using Optical Flow. It does what you want, and you can get rid of the entire second FAST call in your code. But it is important that you know what you're doing, or when you look for details or help you will get the wrong answers.

Sure, this is one of the simpler problems. Most of the complicated parts are hidden away by OpenCV calls, but it's still complicated. Good luck.

Tetragramm ( 2016-10-09 08:05:06 -0500 )

Ok, thank you for your help.

patrchri ( 2016-10-09 08:18:21 -0500 )
