
My attempt:

Let $x,y\in\mathbb{R}^d$. We already know that the Fourier transform of a Gaussian function is again a Gaussian. Writing the Gaussian kernel as the inverse Fourier transform of a Gaussian and substituting $x-y$ for the variable (after rescaling the frequency variable by $\sigma$), we have $$ \exp\left(-\frac{\|x-y\|^2}{2\sigma^2}\right)=\frac{1}{(2\pi)^{d/2}}\int_{\mathbb{R}^d}\exp\left(\frac{i \xi^\mathrm{T} x}{\sigma}\right)\exp\left(-\frac{i \xi^\mathrm{T} y}{\sigma}\right)\exp\left(-\frac{\|\xi\|^2}{2}\right)\,\mathrm{d}\xi, $$ which means $\exp[-\|x-y\|^2/(2\sigma^2)]$ is an inner product between the functions $\xi\mapsto\exp(i\xi^\mathrm{T}x/\sigma)$ and $\xi\mapsto\exp(i\xi^\mathrm{T}y/\sigma)$ with weight $\exp(-\|\xi\|^2/2)$.
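Not essential to the question, but here is a quick numerical sanity check of the identity above (the values of `d`, `sigma`, and the sample size are arbitrary choices for illustration): since $(2\pi)^{-d/2}\exp(-\|\xi\|^2/2)$ is the standard normal density, the right-hand side is an expectation over $\xi\sim N(0,I_d)$ and can be estimated by Monte Carlo.

```
import numpy as np

rng = np.random.default_rng(0)
d, sigma = 3, 1.5                     # arbitrary illustration values
x, y = rng.normal(size=d), rng.normal(size=d)

# Left-hand side: the Gaussian kernel itself.
lhs = np.exp(-np.linalg.norm(x - y) ** 2 / (2 * sigma ** 2))

# Right-hand side: E_{xi ~ N(0, I_d)}[exp(i xi^T (x - y) / sigma)];
# the imaginary part vanishes by symmetry, leaving E[cos(xi^T (x - y) / sigma)].
xi = rng.normal(size=(500_000, d))
rhs = np.cos(xi @ (x - y) / sigma).mean()

print(lhs, rhs)  # the two values agree up to Monte Carlo error (~1e-3)
```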

Secondly, the functions $\{\exp[(i \xi^\mathrm{T} x)/\sigma]\}_{\xi\in\mathbb{R}^d}$ are the basis functions of the Fourier transform, which implies they are linearly independent, so they can be seen as a basis of the implicit feature space (recall that they are exactly the functions entering the inner product above). They are uncountable because the index $\xi$ ranges over a continuum.
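To make the weighted inner product explicit (this is just my own bookkeeping, splitting the weight symmetrically between the two factors), each $x$ can be mapped to an ordinary element of $L^2(\mathbb{R}^d,\mathbb{C})$: $$ \phi_x(\xi) := \frac{1}{(2\pi)^{d/4}}\exp\left(-\frac{\|\xi\|^2}{4}\right)\exp\left(\frac{i \xi^\mathrm{T} x}{\sigma}\right), \qquad \langle\phi_x,\phi_y\rangle_{L^2} = \int_{\mathbb{R}^d}\phi_x(\xi)\,\overline{\phi_y(\xi)}\,\mathrm{d}\xi = \exp\left(-\frac{\|x-y\|^2}{2\sigma^2}\right), $$ which is exactly the displayed identity above.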

The problem is that, by Mercer's theorem, a continuous symmetric positive semidefinite kernel should admit an expansion in countably many eigenfunctions, i.e. countably many basis vectors of the feature space. So there should not be uncountably many basis vectors.
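For reference, the statement of Mercer's theorem I have in mind is: for a continuous, symmetric, positive semidefinite kernel $k$ on a compact set $X\subset\mathbb{R}^d$, $$ k(x,y)=\sum_{i=1}^{\infty}\lambda_i\,e_i(x)\,e_i(y), $$ where the $e_i$ form a countable orthonormal family of eigenfunctions of the associated integral operator on $L^2(X)$ and $\lambda_i\ge 0$, so the feature map $x\mapsto(\sqrt{\lambda_i}\,e_i(x))_{i\ge 1}$ has countably many coordinates.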

What is wrong in my attempt of deriving the basis of the implicit feature space of Gaussian kernel?

ziyuang
  • Maybe this helps: http://stats.stackexchange.com/questions/80398/how-can-svm-find-an-infinite-feature-space-where-linear-separation-is-always-p/168309#168309 – fcop Apr 05 '16 at 13:00
  • @fcop That looks like a general introduction to the kernel trick. – ziyuang Apr 05 '16 at 13:44
  • Well, that may be, but if you look at the theory of Reproducing Kernel Hilbert Spaces (RKHS), where they show this, you will see that this is how they do it. – fcop Apr 05 '16 at 14:15

0 Answers