
I'm trying to implement the Local Coordinate System (LCS) of this paper.

It's all clear to me how it works, but the only thing I don't understand is the "rotation" mechanism. Quoting the paper (Sect. 4.2):

> This processing actually encompasses three distinct operations: centering (C), **rotation with PCA basis (R)** and dimensionality reduction by a factor of 2 (D).

I don't understand what the bold part means. And later in the same section:

> we learn off-line (e.g., on Flickr60K for Holidays) a rotation matrix Q_i from training descriptors mapped to this word.

I understand the sentence except for the "rotation matrix" part, again.

From my knowledge, PCA consists of obtaining two matrices: the eigenvector matrix of the covariance matrix of the centered data and the corresponding diagonal matrix of eigenvalues. I thought that if we have an n×p data matrix and want to reduce a vector v from 1×p to 1×d (where d<p), all we have to do is multiply v by the p×d diagonal matrix of the most relevant eigenvalues. But I've never heard of this "rotation" stuff.

Can someone please explain this to me?

    Rotation matrix = eigenvector matrix. You should read some intuitive explanations of PCA where "rotation" is made explicit, see e.g. my answer in http://stats.stackexchange.com/questions/2691 and perhaps some other answers there too. – amoeba Jan 13 '17 at 12:48
    Almost surely they mean eigenvector matrix. PCA _is_ orthogonal rotation. – ttnphns Jan 13 '17 at 12:49
  • @amoeba Oh I see! Is it correct that when we perform dimensionality reduction of v, we take the product of v with the eigenvalue matrix obtained by PCA? Here it seems instead that we have to multiply v by the eigenvector matrix! – user6321 Jan 13 '17 at 13:09
  • No. What you wrote in your post after "I know that" is wrong. You always multiply with eigenvectors (which is the same as rotating). – amoeba Jan 13 '17 at 13:11
  • @amoeba Ok, I understand. One last thing (I don't think I need to open another question for it). Since the eigenvectors are ordered according to the most significant eigenvalues, do I have to order `v` in the same way before doing the product with the eigenvector matrix? For example, if the third eigenvector is swapped with the first one, do I have to swap `v[3]` with `v[1]` before doing the product with the eigenvector matrix? – user6321 Jan 13 '17 at 15:00
    No you don't need to swap. This question shows that you have very little understanding of PCA, so I would strongly recommend you to do some basic reading on it. – amoeba Jan 17 '17 at 11:53
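To make the comments above concrete, here is a minimal NumPy sketch of what the commenters describe: the "rotation matrix" is simply the matrix of eigenvectors of the covariance matrix, and dimensionality reduction means rotating a centered descriptor with that matrix and keeping only the first d coordinates. The data, dimensions, and variable names are made up for illustration; this is not the paper's exact per-word procedure.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 8))           # toy n x p matrix of training descriptors

# Centering (C)
mu = X.mean(axis=0)
Xc = X - mu

# Rotation with PCA basis (R): eigenvectors of the covariance matrix
cov = np.cov(Xc, rowvar=False)          # p x p covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)  # eigh returns ascending eigenvalues
order = np.argsort(eigvals)[::-1]       # reorder descending by eigenvalue
Q = eigvecs[:, order]                   # p x p orthogonal "rotation matrix"

# Dimensionality reduction by a factor of 2 (D): keep the first d coordinates
d = 4
v = rng.normal(size=8)                  # a new descriptor to project
v_reduced = (v - mu) @ Q[:, :d]         # rotate, then truncate to 1 x d
```

Note that the input `v` is never reordered; the eigenvector columns of `Q` are sorted once, and multiplying by `Q[:, :d]` both rotates and truncates. The eigenvalues only decide which columns to keep; they never multiply `v`.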

0 Answers