2017-03-17 13:13:12 -0600 | received badge | ● Self-Learner (source) |
2017-03-17 12:20:21 -0600 | received badge | ● Editor (source) |
2017-03-17 12:10:11 -0600 | answered a question | Explanation about LDA class / methods I've looked into Fisher LDA today, and understood what I missed:
J = (distance between the projected class means)² / (sum of the within-class scatters, s₁² + s₂²), which tends to be large when the projected groups are well separated. The eigenvectors therefore give the directions that best separate those groups.
Can anyone confirm :) ? Etienne |
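The criterion above can be sketched in a few lines of NumPy. This is a minimal two-class illustration, not the OpenCV implementation: the toy data, variable names, and the closed-form shortcut `w_direct` are all made up for demonstration. The directions maximizing J(w) = wᵀS_b w / wᵀS_w w are the eigenvectors of S_w⁻¹S_b.

```python
import numpy as np

# Two toy 2-D classes (hypothetical data, for illustration only)
X1 = np.array([[4.0, 2.0], [2.0, 4.0], [2.0, 3.0], [3.0, 6.0], [4.0, 4.0]])
X2 = np.array([[9.0, 10.0], [6.0, 8.0], [9.0, 5.0], [8.0, 7.0], [10.0, 8.0]])

m1, m2 = X1.mean(axis=0), X2.mean(axis=0)

# Within-class scatter S_w = sum over classes of (x - m)(x - m)^T
Sw = (X1 - m1).T @ (X1 - m1) + (X2 - m2).T @ (X2 - m2)

# Between-class scatter S_b = (m1 - m2)(m1 - m2)^T
d = (m1 - m2).reshape(-1, 1)
Sb = d @ d.T

# Eigenvectors of S_w^{-1} S_b give the best-separating directions;
# for two classes only one eigenvalue is non-zero.
eigvals, eigvecs = np.linalg.eig(np.linalg.inv(Sw) @ Sb)
w = eigvecs[:, np.argmax(eigvals.real)].real

# Equivalent closed form for the two-class case: w ∝ S_w^{-1}(m1 - m2)
w_direct = np.linalg.solve(Sw, m1 - m2)
w_direct /= np.linalg.norm(w_direct)
```

Both routes give the same direction (up to sign), which is a handy sanity check when comparing against a library's output.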
2017-03-17 11:17:45 -0600 | commented question | Explanation about LDA class / methods Thank you berak, that's true, sorry I missed that... |
2017-03-15 14:07:21 -0600 | asked a question | Explanation about LDA class / methods Hello, I'm about to start a project using LDA classification. I have started reading about OpenCV and the LDA class, but there are still some grey areas for me compared to the theory I have read here and the associated example (1):
I have seen in this thread (2) that they perform a classic L2-norm nearest-neighbour search in the LDA subspace to predict, but I can't find any theoretical explanation of LDA that covers this part.
If anyone could enlighten me about how to use this LDA :) Thank you! Etienne LINKS : (sorry for removing the 'http' part, I've no right for direct links...) (1) : people.revoledu.com/kardi/tutorial/LDA/LDA.html (1) : people.revoledu.com/kardi/tutorial/LDA/Numerical%20Example.html (2) : answers.opencv.org/question/64165/how-to-perform-linear-discriminant-analysis-with-opencv/ (3) : github.com/opencv/opencv/blob/master/modules/core/src/lda.cpp |
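The L2-norm prediction step mentioned in thread (2) can be sketched as follows. This is a hedged illustration of the general idea (project training data and the query into the LDA subspace, then take the label of the nearest training sample), not the OpenCV source; the function name, signatures, and test data are all made up.

```python
import numpy as np

def lda_predict(W, train_X, train_y, x):
    """Nearest-neighbour prediction in the LDA subspace (L2 norm).

    W       : (n_features, n_components) LDA projection matrix
    train_X : (n_samples, n_features) training data
    train_y : (n_samples,) training labels
    x       : (n_features,) query sample
    """
    proj_train = train_X @ W          # project training set into subspace
    proj_x = x @ W                    # project the query the same way
    dists = np.linalg.norm(proj_train - proj_x, axis=1)
    return train_y[np.argmin(dists)]  # label of the closest neighbour

# Hypothetical usage: project onto the first feature axis only
W = np.array([[1.0], [0.0]])
train_X = np.array([[0.0, 5.0], [10.0, -3.0]])
train_y = np.array([0, 1])
label = lda_predict(W, train_X, train_y, np.array([1.0, 100.0]))
```

Nothing about this step is specific to LDA theory, which is probably why it is hard to find in the tutorials: LDA only supplies the subspace, and nearest-neighbour under the L2 norm is just one simple classifier you can run inside it.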