I've looked into Fisher LDA today and understood what I missed:

J = (distance between the projected class means)² / (sum of the projected within-class scatters, s1² + s2²)

which measures how well the projection separates the different classes. So the eigenvectors give the directions that best separate those groups.
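For the two-class case, here is a minimal NumPy sketch of that criterion (the synthetic data, the variable names and the closed-form direction w ∝ Sw⁻¹(m1 − m2) are just my own illustration, not taken from the OpenCV code discussed above):

    import numpy as np

    # Two small synthetic clusters (made-up data, just for illustration)
    rng = np.random.default_rng(0)
    X1 = rng.normal(loc=[0.0, 0.0], scale=0.5, size=(50, 2))
    X2 = rng.normal(loc=[2.0, 1.0], scale=0.5, size=(50, 2))

    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)

    # Within-class scatter: Sw = sum over classes of (x - m)(x - m)^T
    Sw = (X1 - m1).T @ (X1 - m1) + (X2 - m2).T @ (X2 - m2)

    # Fisher direction: w proportional to Sw^-1 (m1 - m2); projecting
    # onto w maximizes J(w) = |w·(m1 - m2)|^2 / (w^T Sw w)
    w = np.linalg.solve(Sw, m1 - m2)
    w /= np.linalg.norm(w)

    # Value of the criterion for this direction
    num = (w @ (m1 - m2)) ** 2
    den = w @ Sw @ w
    print("J(w) =", num / den)

Projecting the data onto w and comparing with any other direction should show that w gives the largest ratio of between-class distance to within-class scatter.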
The comparison I made between OpenCV and the first two links is irrelevant, because one is LDA and the other Fisher LDA...
Finally, as berak pointed out, there is indeed a covariance matrix I didn't see...
Thank you! If anyone can approve :) ?
Etienne