
Clarify about BOW Module

asked 2015-03-17 02:58:13 -0500

Abu Gaseem

I read about the BoW (bag of words) algorithm, which can be used in object recognition and classification. I'm working on an application to recognize shops and buildings based on local feature matching and geo-location. The dataset consists of 26 POIs: 23 of them are shops and three are buildings.

The algorithm says that, after extracting the descriptors for every feature detected in the dataset (training images), you cluster them into k groups/clusters. My first question: is the number of clusters I should use equal to the number of POIs in my application, which is 26? My second question: after clustering into k clusters, the algorithm represents each cluster Ki by one center descriptor, whose size depends on the descriptor extractor used (for example, 64 reals for SURF). How can that one center descriptor be used in matching? And are the rest of the descriptors that belong to that center Ki related to each other in some data structure, like a tree? That would make sense to me.

I'm sorry if my question is not clear; I'm confused by this topic. I would appreciate any help.


1 answer


answered 2015-03-17 03:15:56 -0500

Guanta
  • Question: "Is the number of clusters that I should use equal to the number of POIs in my application, which is 26?"

    Answer: No, it is not related to the number of classes at all. Typical values range from k = 100 to 10000 (or more).


  • Question: "How can that one center descriptor be used in matching, and are the rest of the descriptors that belong to that center Ki related to each other in some data structure like a tree?"

    Answer: The rest of the descriptors belonging to that cluster are not needed for matching (i.e. you only need to store the cluster centers). The goal is to have a global descriptor; you achieve that by computing a histogram of all local descriptors (of one image) against their nearest cluster centers.

    Example: Let's work through a small example to make it clearer. Say you have chosen 100 clusters and you cluster the local descriptors of your training set (you don't need to use all of them; say 500,000 random ones). Now, given a new image, you again compute local descriptors. For each of these descriptors you search for the nearest cluster center and count the occurrences in a histogram (i.e. you initialize a vector of 100 zeros, each entry standing for a cluster id, and count how often each cluster center was chosen as the nearest one). After normalizing the histogram (typically L1), you have your bag-of-words descriptor (of dimension 100), which you can now compare. In other words, you have transformed all your local descriptors into one global descriptor, which you can now use for classification, etc.
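    The encoding step above can be sketched in a few lines of plain Python. This is a toy illustration, not the real pipeline: in practice you would use k-means (e.g. cv2.kmeans) to build the vocabulary and cv2.BOWImgDescriptorExtractor for the histogram, and the 2-D "descriptors" and centers below are made-up values just to show the nearest-center counting.

    ```python
    def nearest_center(desc, centers):
        """Index of the cluster center closest to one local descriptor."""
        def sqdist(a, b):
            return sum((x - y) ** 2 for x, y in zip(a, b))
        return min(range(len(centers)), key=lambda i: sqdist(desc, centers[i]))

    def bow_histogram(descriptors, centers):
        """L1-normalized histogram of nearest-center assignments."""
        hist = [0.0] * len(centers)
        for d in descriptors:
            hist[nearest_center(d, centers)] += 1.0
        total = sum(hist)
        return [h / total for h in hist] if total else hist

    # Toy vocabulary of k=3 centers and one image's local descriptors:
    centers = [(0.0, 0.0), (1.0, 1.0), (5.0, 5.0)]
    image_descriptors = [(0.1, 0.2), (0.9, 1.1), (5.2, 4.8), (4.9, 5.1)]
    print(bow_histogram(image_descriptors, centers))  # → [0.25, 0.25, 0.5]
    ```

    Note that the histogram has one bin per cluster (dimension k), regardless of how many local descriptors the image produced — that is what makes it a fixed-size global descriptor.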

I hope it's clearer now!


Comments

You're the man! Thank you so much; you uncovered my misunderstanding.

Abu Gaseem ( 2015-03-17 03:34:32 -0500 )

Is SVM a binary classifier, or can it classify n classes?

Abu Gaseem ( 2015-03-17 10:17:47 -0500 )

SVM is a binary classifier. However, there are two strategies for multi-class classification with SVMs. One is one-vs-rest, i.e. for each class you train an SVM using the instances of the current class as positives and all others as negatives (resulting in n_classes classifiers). The other strategy is called one-vs-one; here you train an SVM for each pair of classes (resulting in n_classes * (n_classes - 1) / 2 classifiers). The first variant is more commonly used.
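A quick sketch of what those classifier counts mean for the 26 POIs in the question (the function names are illustrative only, not an OpenCV API):

```python
def one_vs_rest_count(n_classes):
    # One binary SVM per class: that class vs. everything else.
    return n_classes

def one_vs_one_count(n_classes):
    # One binary SVM per unordered pair of classes.
    return n_classes * (n_classes - 1) // 2

print(one_vs_rest_count(26))  # → 26
print(one_vs_one_count(26))   # → 325
```

So with 26 classes, one-vs-rest trains 26 SVMs while one-vs-one trains 325; at prediction time one-vs-rest picks the class whose SVM gives the highest score, and one-vs-one picks the class that wins the most pairwise votes.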

Guanta ( 2015-03-17 10:53:45 -0500 )

Thanks for the response. http://answers.opencv.org/question/57... Can you guide me in the right direction, please?

Abu Gaseem ( 2015-03-17 10:57:27 -0500 )