How rotation invariant is FAST + FREAK in practice?

I've been doing some experiments with FAST and FREAK because I read that FREAK is rotation invariant. However, when I compare two images that are rotated 180° relative to each other, I get really poor results.
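
For reference, this is roughly how I produce the rotated test image (a minimal sketch against the OpenCV 2.4 Java API; the file names are just placeholders, and the native library is assumed to be loaded already, e.g. via OpenCVLoader on Android):

    import org.opencv.core.Core;
    import org.opencv.core.Mat;
    import org.opencv.highgui.Highgui;

    public class RotateTest {
        public static void main(String[] args) {
            // Assumes the OpenCV native library has already been loaded.
            Mat original = Highgui.imread("scene.png", Highgui.CV_LOAD_IMAGE_GRAYSCALE);
            Mat rotated = new Mat();
            // Flipping around both axes is equivalent to a 180° rotation.
            Core.flip(original, rotated, -1);
            Highgui.imwrite("scene_rot180.png", rotated);
        }
    }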

Some context that might also be important: I only use the best 500 keypoints from each image (the ones with the largest response). When I match the images with FAST (or PYRAMID FAST, it doesn't make a difference) and FREAK, I get 0 matches. When I use all the keypoints I get 23 matches, but they are simply wrong (see the attached image).
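
The whole pipeline looks roughly like this (a minimal sketch against the OpenCV 2.4 Java factory API; img1/img2 and the class name are placeholders, not my actual code):

    import java.util.Collections;
    import java.util.Comparator;
    import java.util.List;

    import org.opencv.core.Mat;
    import org.opencv.core.MatOfDMatch;
    import org.opencv.core.MatOfKeyPoint;
    import org.opencv.features2d.DescriptorExtractor;
    import org.opencv.features2d.DescriptorMatcher;
    import org.opencv.features2d.FeatureDetector;
    import org.opencv.features2d.KeyPoint;

    public class FastFreakMatcher {

        // Keep only the n keypoints with the largest response.
        static MatOfKeyPoint bestKeypoints(MatOfKeyPoint kps, int n) {
            List<KeyPoint> list = kps.toList();
            Collections.sort(list, new Comparator<KeyPoint>() {
                @Override
                public int compare(KeyPoint a, KeyPoint b) {
                    return Float.compare(b.response, a.response); // descending by response
                }
            });
            MatOfKeyPoint best = new MatOfKeyPoint();
            best.fromList(list.subList(0, Math.min(n, list.size())));
            return best;
        }

        static MatOfDMatch matchFastFreak(Mat img1, Mat img2) {
            FeatureDetector detector = FeatureDetector.create(FeatureDetector.FAST); // or PYRAMID_FAST
            DescriptorExtractor extractor = DescriptorExtractor.create(DescriptorExtractor.FREAK);
            DescriptorMatcher matcher = DescriptorMatcher.create(DescriptorMatcher.BRUTEFORCE_HAMMING);

            MatOfKeyPoint kp1 = new MatOfKeyPoint();
            MatOfKeyPoint kp2 = new MatOfKeyPoint();
            detector.detect(img1, kp1);
            detector.detect(img2, kp2);

            // Keep only the 500 strongest keypoints per image.
            kp1 = bestKeypoints(kp1, 500);
            kp2 = bestKeypoints(kp2, 500);

            // FREAK estimates the keypoint orientation itself while computing the descriptor.
            Mat desc1 = new Mat();
            Mat desc2 = new Mat();
            extractor.compute(img1, kp1, desc1);
            extractor.compute(img2, kp2, desc2);

            // Binary descriptors, so brute-force matching with Hamming distance.
            MatOfDMatch matches = new MatOfDMatch();
            matcher.match(desc1, desc2, matches);
            return matches;
        }
    }

Switching FeatureDetector.FAST to FeatureDetector.PYRAMID_FAST is the only change I make for the pyramid variant.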

BRISK does quite a good job, but it's slow.

My question is: just how rotation invariant is FAST + FREAK in practice? (In theory I would expect it to handle even 180°.) And is there a way to improve the results without changing my detector and descriptor? Both are quite fast; I know BRISK gives better results, but it's too slow for the application I'm building.

EDIT: It might be important to add that I am doing this on an Android device with the Java wrapper for the OpenCV functions. It is critical for me that everything runs as fast as possible (hence not using BRISK).