How to use SURF to detect features in an Android application?

asked 2014-01-12 18:37:02 -0600

habisravi

I am developing an augmented reality application and I am trying to use SURF to detect objects. I have developed a custom Java camera activity, but I don't know how to add an OpenCV SURF detector to my application. Do I need to implement SURF through NDK native development? I want to detect features in real time.
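For context, this is roughly the shape of the camera activity I mean, sketched here under the assumption that it uses the OpenCV Android SDK's CameraBridgeViewBase / CvCameraViewListener2 (the class name is made up; the original post does not include code). Whatever detector is used would have to run inside onCameraFrame:

    import org.opencv.android.CameraBridgeViewBase.CvCameraViewFrame;
    import org.opencv.android.CameraBridgeViewBase.CvCameraViewListener2;
    import org.opencv.core.Mat;

    // Hypothetical skeleton of the custom camera activity described above.
    // Each preview frame arrives in onCameraFrame; any real-time feature
    // detection has to fit into this per-frame budget.
    public class ArCameraListener implements CvCameraViewListener2 {

        @Override
        public void onCameraViewStarted(int width, int height) {
            // detector initialisation would go here
        }

        @Override
        public void onCameraViewStopped() {
        }

        @Override
        public Mat onCameraFrame(CvCameraViewFrame inputFrame) {
            Mat gray = inputFrame.gray();   // grayscale frame for detection
            // run the feature detector on 'gray' here
            return inputFrame.rgba();       // color frame returned for display
        }
    }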


Comments

For detecting features in real time, look at ORB or FAST. I haven't tried them on an Android device, but I made a few comparisons on a PC: FAST is far faster (around 25 ms on average, and there are a few parameters you can tune) and its accuracy is the same if not better. The downside of FAST is that it is only a detector, so if you need descriptors you have to pair it with a separate extractor; because of a bug in the ORB descriptor class at the time, I paired it with SURF instead. My suggestion: try FAST or ORB for feature detection, and ORB or FREAK for the descriptor (I haven't used FREAK myself, just read a bit about it). A rough sketch of the FAST + ORB combination follows.
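A minimal sketch of that combination, assuming the OpenCV 2.4 Java bindings shipped with the Android SDK (the wrapper class is made up for illustration; FeatureDetector and DescriptorExtractor are the factory classes in that API):

    import org.opencv.core.Mat;
    import org.opencv.core.MatOfKeyPoint;
    import org.opencv.features2d.DescriptorExtractor;
    import org.opencv.features2d.FeatureDetector;

    public class FastOrbExample {
        // Detect keypoints with FAST and describe them with ORB on a
        // grayscale frame (e.g. the Mat returned by inputFrame.gray()).
        public static Mat describe(Mat grayFrame) {
            FeatureDetector detector = FeatureDetector.create(FeatureDetector.FAST);
            DescriptorExtractor extractor = DescriptorExtractor.create(DescriptorExtractor.ORB);

            MatOfKeyPoint keypoints = new MatOfKeyPoint();
            detector.detect(grayFrame, keypoints);                 // fast corner detection

            Mat descriptors = new Mat();
            extractor.compute(grayFrame, keypoints, descriptors);  // binary ORB descriptors
            return descriptors;
        }
    }

Swapping FeatureDetector.FAST for FeatureDetector.ORB, or DescriptorExtractor.ORB for DescriptorExtractor.FREAK, only changes the constants passed to the factories.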

andrei.toader ( 2014-02-10 02:56:07 -0600 )

1 answer

answered 2014-01-13 01:02:31 -0600

Moster

updated 2014-01-13 01:03:11 -0600

"I want to detect features in real time" - then dont use surf and check out binary descriptors. I cant imagine that any modern mobile device can use surf in real time. Also, the nonfree module with sift and surf is not included in the android package. You can build it on your own though. Search for "Using OpenCV Nonfree Module android" on google. There is no java wrapper though. So you need to use JNI


Stats

Asked: 2014-01-12 18:37:02 -0600

Seen: 1,666 times

Last updated: Jan 13 '14