For visual odometry, I want to track features over a sequence of frames.
I tried tracking features with cv::calcOpticalFlowPyrLK in OpenCV 2.4, but the features drift (e.g. as the camera moves left, the tracked features also slowly move to the left). Because I start the search at the last known feature location, I am biased towards finding the feature close to where it was in the previous frame rather than where it actually is.
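For context, a common way to detect this kind of drift with pyramidal LK is a forward-backward consistency check: track each feature from frame t to t+1, then back from t+1 to t, and drop any feature whose round trip does not land near its starting point. Below is a minimal sketch of just the filtering step, assuming the forward- and backward-tracked points are already available; the `Pt` struct (standing in for cv::Point2f), the function name, and the error threshold are illustrative, not part of any OpenCV API:

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Minimal 2-D point; stands in for cv::Point2f.
struct Pt { float x, y; };

// Keep only the tracks whose forward-backward round trip returns close to
// the original location. 'orig' holds the points in frame t, 'back' holds
// the same points after tracking t -> t+1 -> t; indices correspond.
std::vector<std::size_t> fbConsistentTracks(const std::vector<Pt>& orig,
                                            const std::vector<Pt>& back,
                                            float maxFbError) {
    std::vector<std::size_t> keep;
    for (std::size_t i = 0; i < orig.size(); ++i) {
        float dx = orig[i].x - back[i].x;
        float dy = orig[i].y - back[i].y;
        if (std::sqrt(dx * dx + dy * dy) <= maxFbError)
            keep.push_back(i);  // round trip consistent: keep this track
    }
    return keep;
}
```

This does not remove the search bias itself, but it at least rejects tracks once they start drifting instead of letting the error accumulate.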
Thus, I want to use a feature descriptor such as BRISK.
The problem I have is that with BRISK, or any other feature descriptor, I have to detect features in both images and then match them. However, I already have one feature descriptor (as well as an approximate location where the feature might be) and would like to search for just that feature.

What I do now is extract features in a region of interest around the most recent feature location and take the best match. Unfortunately, the best match is not always actually the same feature, even if the subsequent images are identical. The underlying problem is that I have to hope that the feature detector (I use FAST) finds the same point interesting in the new image. If it doesn't, I end up with a match that is a few pixels off, or with no match at all. As a result I lose features very often, even when they are still clearly visible and the viewpoint has not changed at all.
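One way to make the "take the best match in the ROI" step less fragile is to reject weak or ambiguous matches instead of always accepting the nearest descriptor: require the best Hamming distance to be below an absolute threshold, and clearly better than the second-best candidate (a Lowe-style ratio test). Here is a sketch over raw binary descriptors; the 64-byte descriptor size (BRISK's default), the function name, and the two thresholds are illustrative assumptions:

```cpp
#include <array>
#include <bitset>
#include <climits>
#include <cstddef>
#include <cstdint>
#include <vector>

// BRISK descriptors are binary; 64 bytes (512 bits) is assumed here.
using Descriptor = std::array<std::uint8_t, 64>;

// Hamming distance between two binary descriptors (count of differing bits).
int hamming(const Descriptor& a, const Descriptor& b) {
    int d = 0;
    for (std::size_t i = 0; i < a.size(); ++i)
        d += static_cast<int>(std::bitset<CHAR_BIT>(a[i] ^ b[i]).count());
    return d;
}

// Return the index of the candidate matching 'query', or -1 when the best
// candidate is too far away or not clearly better than the runner-up.
int matchWithRejection(const Descriptor& query,
                       const std::vector<Descriptor>& candidates,
                       int maxDistance = 90, float maxRatio = 0.8f) {
    int best = -1, bestDist = INT_MAX, secondDist = INT_MAX;
    for (std::size_t i = 0; i < candidates.size(); ++i) {
        int d = hamming(query, candidates[i]);
        if (d < bestDist) {
            secondDist = bestDist;
            bestDist = d;
            best = static_cast<int>(i);
        } else if (d < secondDist) {
            secondDist = d;
        }
    }
    if (best < 0 || bestDist > maxDistance) return -1;       // too far: no match
    if (secondDist != INT_MAX &&
        static_cast<float>(bestDist) >
            maxRatio * static_cast<float>(secondDist)) return -1;  // ambiguous
    return best;
}
```

With this, a frame where FAST misses the true corner tends to yield "no match" (which the tracker can handle, e.g. by keeping the feature alive for a few frames) rather than a silently wrong match a few pixels off.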
What is the recommended approach for this? Ideally, I would like to search for a specific feature descriptor directly, rather than blindly detecting features and hoping that one of them is the one I am looking for.
Thank you for your help.