
I'm currently working with kernels and kernel PCA, and for this purpose I've been reading a few papers on these topics, in particular "Kernel Principal Component Analysis" by Schölkopf et al. While reading the paper, several questions emerged:

  • When performing kernel PCA, we compute a kernel matrix K. In 'standard' PCA we obtain eigenvalues and eigenvectors of the covariance matrix. How do we get the eigenvectors and eigenvalues from the kernel matrix?

The dual eigenvalue problem is:

$m \cdot \lambda \cdot \alpha = K \cdot \alpha$

where $m$ is the number of data points, $\alpha$ is a vector of coefficients, and $K$ is our kernel matrix.
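To make this concrete, here is a minimal NumPy sketch of how I understand the computation: center the kernel matrix, solve the dual eigenproblem, and rescale the coefficient vectors according to the paper's normalization condition $\lambda_k (\alpha^k \cdot \alpha^k) = 1$. The degree-2 polynomial kernel, the toy data, and all variable names are my own illustrative choices, not from the paper.

```python
import numpy as np

def poly_kernel(X, Y, degree=2):
    """Homogeneous polynomial kernel k(x, y) = (x . y)^degree."""
    return (X @ Y.T) ** degree

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 2))   # m = 30 toy data points in R^2
m = X.shape[0]

K = poly_kernel(X, X)          # m x m kernel (Gram) matrix

# Center the data implicitly in feature space:
# K' = K - 1_m K - K 1_m + 1_m K 1_m, where (1_m)_ij = 1/m.
one_m = np.full((m, m), 1.0 / m)
Kc = K - one_m @ K - K @ one_m + one_m @ K @ one_m

# Solve the dual problem K' alpha = m * lambda * alpha.
# eigh returns the eigenvalues mu = m * lambda in ascending order.
mu, alpha = np.linalg.eigh(Kc)
mu, alpha = mu[::-1], alpha[:, ::-1]          # sort descending

# Keep components with numerically nonzero eigenvalues and rescale
# each alpha^k so that lambda_k * (alpha^k . alpha^k) = 1, which makes
# the corresponding eigenvector in feature space have unit length.
keep = mu > 1e-10
alpha = alpha[:, keep] / np.sqrt(mu[keep])
lambdas = mu[keep] / m                        # feature-space eigenvalues

# The projection of training point x_i onto component k is
# sum_j alpha^k_j * k'(x_j, x_i), i.e. simply a matrix product:
projections = Kc @ alpha                      # shape (m, n_components)
```

If I understand the paper correctly, the $\alpha^k$ are the expansion coefficients of the feature-space eigenvectors in terms of the mapped data points, $V^k = \sum_i \alpha^k_i \Phi(x_i)$, which is why every projection can be written purely in terms of kernel evaluations.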

  • How are the coefficients $\alpha$ determined, and what do they represent?

  • Did I understand correctly that the PCA itself is performed implicitly in the feature space?

  • Having the eigenvalues and eigenvectors: what is the intuition behind them in feature space? What do they mean if we use, e.g., a polynomial kernel of degree 2 and keep the top $k$ eigenvalues? (See the sketch after this list.)

  • What is the geometric/visual intuition behind the obtained eigenvalues in feature space?

  • How can we reconstruct the original dataset from the determined eigenvalues and eigenvectors?

  • To what extent can the eigenvalues and eigenvectors be compared to those of 'standard'/linear PCA? (The sketch below also compares the two.)
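To make the polynomial example from the list above concrete: a homogeneous degree-2 kernel on $\mathbb{R}^2$ has the explicit feature map $\Phi(x) = (x_1^2, \sqrt{2}\,x_1 x_2, x_2^2)$, so one can verify numerically that the kernel PCA eigenvalues are exactly the eigenvalues of ordinary linear PCA carried out in that explicit feature space. A minimal sketch, again with my own toy data and naming:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 2))   # m = 30 toy data points in R^2
m = X.shape[0]

# Explicit feature map of the degree-2 homogeneous polynomial kernel:
# Phi(x) = (x1^2, sqrt(2)*x1*x2, x2^2), so Phi(x) . Phi(y) = (x . y)^2.
Phi = np.column_stack([X[:, 0] ** 2,
                       np.sqrt(2) * X[:, 0] * X[:, 1],
                       X[:, 1] ** 2])

# 'Standard' PCA in the explicit feature space (1/m covariance, as in the paper).
Phi_c = Phi - Phi.mean(axis=0)
C = Phi_c.T @ Phi_c / m
evals_linear = np.sort(np.linalg.eigvalsh(C))[::-1]

# Kernel PCA eigenvalues from the centered Gram matrix, divided by m.
K = (X @ X.T) ** 2
one_m = np.full((m, m), 1.0 / m)
Kc = K - one_m @ K - K @ one_m + one_m @ K @ one_m
evals_kernel = np.sort(np.linalg.eigvalsh(Kc))[::-1][:3] / m

print(np.allclose(evals_linear, evals_kernel))   # True: identical spectra
```

If this is right, each kernel PCA eigenvalue is simply the variance of the data along the corresponding principal direction in feature space, exactly as in linear PCA; the difference is only that the eigenvectors live in feature space and are never written down explicitly.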

  • There are a lot of questions here. Have you checked [the existing threads on kernel PCA](http://stats.stackexchange.com/questions/tagged/pca+kernel-trick?sort=votes&pageSize=50)? In particular, e.g., http://stats.stackexchange.com/questions/94463, http://stats.stackexchange.com/questions/131140, http://stats.stackexchange.com/questions/101344. – amoeba Dec 07 '16 at 13:28
  • Thank you for your links! Yes, I've checked them; they did not provide the answers I've been looking for, but the links were highly useful for answering further questions :-) – Daniyal Dec 07 '16 at 14:03
  • Were you able to answer your question after reading those links? If so, you could answer it yourself! – kjetil b halvorsen Jul 31 '18 at 10:56
  • @kjetilbhalvorsen, unfortunately not. The closest posts I found are https://stats.stackexchange.com/questions/126014/how-to-project-a-new-vector-onto-the-pc-space-using-kernel-pca and https://stats.stackexchange.com/questions/8182/is-it-possible-to-use-kernel-pca-for-feature-selection. However, I still lack any intuition for what the eigenvectors "represent". If, after kernel PCA on a 10×10 Gram matrix, I have 10 eigenvectors sorted by their eigenvalues and choose, for example, the first two (those with the highest and second-highest eigenvalues), what do these two eigenvectors mean? – Daniyal Sep 04 '19 at 06:55

0 Answers