OpenCV Q&A Forum
PCA in thousands of dimensions
http://answers.opencv.org/question/135452/pca-in-thousands-of-dimensions/

I have 4096-dimensional vectors ([VLAD codes][1]), each one representing an image.
I have to run PCA on them *without reducing their dimension*, on learning datasets with fewer than 4096 images (e.g., the [Holiday dataset][2] with <2k images), obtaining the rotation matrix `A`.
In [this](http://search.ieice.org/bin/summary.php?id=e99-d_10_2656) paper (which also explains why I need to run PCA without dimensionality reduction), they solve the problem with this approach:
> For efficient computation of A, we compute at most the first 1,024 eigenvectors by eigendecomposition of the covariance matrix, and the remaining orthogonal complements up to D-dimensional are filled using Gram-Schmidt orthogonalization.
Now, how do I implement this using C++ libraries? I'm using OpenCV, but [cv::PCA](http://docs.opencv.org/trunk/d3/d8d/classcv_1_1PCA.html) does not seem to offer such a strategy. Is there any way to do this?
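For context, the step cv::PCA does not cover is completing the truncated eigenvector basis to a full D×D rotation. A minimal sketch of that Gram-Schmidt completion in plain C++ (toy dimensions, stdlib only; `completeBasis` and the helper names are illustrative, not part of any library):

```cpp
#include <cassert>
#include <cmath>
#include <vector>

using Vec = std::vector<double>;

double dot(const Vec& a, const Vec& b) {
    double s = 0.0;
    for (size_t i = 0; i < a.size(); ++i) s += a[i] * b[i];
    return s;
}

// Complete a set of orthonormal rows `basis` (e.g. the leading PCA
// eigenvectors) to a full D x D orthonormal basis: project each
// canonical basis vector e_i onto the orthogonal complement of the
// current basis and keep it whenever the residual is non-negligible.
std::vector<Vec> completeBasis(std::vector<Vec> basis, size_t D) {
    for (size_t i = 0; i < D && basis.size() < D; ++i) {
        Vec e(D, 0.0);
        e[i] = 1.0;
        for (const Vec& b : basis) {            // subtract projections
            double p = dot(e, b);
            for (size_t j = 0; j < D; ++j) e[j] -= p * b[j];
        }
        double n = std::sqrt(dot(e, e));
        if (n > 1e-8) {                         // residual survives: keep it
            for (size_t j = 0; j < D; ++j) e[j] /= n;
            basis.push_back(e);
        }
    }
    return basis;
}
```

Starting from the 1,024 eigenvectors, this would fill the remaining D−1,024 rows; for numerical robustness at D=4096 a modified Gram-Schmidt (re-orthogonalizing against the already-kept rows) is usually preferred.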
[1]: http://www.vlfeat.org/overview/encodings.html
[2]: http://lear.inrialpes.fr/people/jegou/data.php

Comment by lovaj (Mon, 27 Mar 2017):
@StevenPuttemans thanks for the reference, but how could this help me? No Gram-Schmidt method is cited there, nor how to compute PCA in thousands of dimensions :)
Comment by LBerger (Thu, 23 Mar 2017):
The answer is in your question: "we compute at most the first 1,024 eigenvectors by eigendecomposition of the covariance matrix". Use the Eigen library: #include <Eigen/Eigenvalues>
Comment by StevenPuttemans (Fri, 24 Mar 2017):
https://eigen.tuxfamily.org/dox/classEigen_1_1EigenSolver.html
Comment by lovaj (Thu, 23 Mar 2017):
@LBerger thanks for your comment. However, I don't fully understand this. How do you obtain the covariance matrix? How do you compute the first 1024 eigenvectors? And finally, what about the D - 1024 remaining components?
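For what it's worth, a toy sketch of the first two steps asked about here: forming the covariance of mean-centered samples and extracting a leading eigenvector. Power iteration stands in for a real eigensolver (in practice Eigen's SelfAdjointEigenSolver or cv::eigen would be used on the symmetric covariance matrix); `covariance` and `topEigenvector` are illustrative names, not library calls:

```cpp
#include <cassert>
#include <cmath>
#include <vector>

using Vec = std::vector<double>;
using Mat = std::vector<Vec>;

// Covariance of N samples (rows of X) in D dimensions, after
// subtracting the per-dimension mean.
Mat covariance(const Mat& X) {
    size_t N = X.size(), D = X[0].size();
    Vec mean(D, 0.0);
    for (const Vec& x : X)
        for (size_t j = 0; j < D; ++j) mean[j] += x[j] / N;
    Mat C(D, Vec(D, 0.0));
    for (const Vec& x : X)
        for (size_t i = 0; i < D; ++i)
            for (size_t j = 0; j < D; ++j)
                C[i][j] += (x[i] - mean[i]) * (x[j] - mean[j]) / N;
    return C;
}

// Leading eigenvector by power iteration: repeatedly multiply a
// vector by C and renormalize; it converges to the dominant eigenvector.
Vec topEigenvector(const Mat& C, int iters = 200) {
    size_t D = C.size();
    Vec v(D, 1.0);
    for (int t = 0; t < iters; ++t) {
        Vec w(D, 0.0);
        for (size_t i = 0; i < D; ++i)
            for (size_t j = 0; j < D; ++j) w[i] += C[i][j] * v[j];
        double n = 0.0;
        for (double x : w) n += x * x;
        n = std::sqrt(n);
        for (double& x : w) x /= n;
        v = w;
    }
    return v;
}
```

The remaining D - k components are exactly what the Gram-Schmidt completion from the paper fills in: any orthonormal extension of the computed eigenvectors keeps `A` a valid rotation, since those directions are not used for reduction.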