As a follow-up to the question How to obtain decision boundaries from linear SVM in R?: is it possible to do the same with a non-linear SVM (radial, for example)? What do the weights represent?
see http://stats.stackexchange.com/questions/164935/how-to-calculate-decision-boundary-from-support-vectors/167245#167245 – Mar 07 '17 at 13:31
1 Answer
That depends. In the radial basis function (RBF) case, it is generally impossible to obtain the weights explicitly.
The kernel trick is applied as I outlined in this answer.
Basically, we define new weights $\mathbf{w} = \phi(x)\cdot\mathbf{u}$ so that $\mathbf{w}^T\phi(x)=\mathbf{u}^T\phi(x)^T\phi(x)=\mathbf{u}^TK$ and $\mathbf{w}^T\mathbf{w}=\mathbf{u}^TK\mathbf{u}$.
We minimize with respect to $\mathbf{u}$, so to recover the weights in kernel space we simply apply the back transformation $\mathbf{w}_{p_K\times 1} = \phi(x)_{p_K\times N}\cdot\mathbf{u}_{N\times 1}$.
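When the feature space is finite-dimensional, you can carry out this back transformation in practice. Here is a minimal sketch in R using e1071 (my choice of package; the data, gamma, and coef0 values are made up for illustration) with a degree-2 polynomial kernel, whose feature map $\phi$ can be written out explicitly; the recovered $\mathbf{w}$ reproduces the fitted decision values.

```r
## Sketch: recover explicit weights for a finite-dimensional kernel
## (degree-2 polynomial). Data and parameter values are arbitrary.
library(e1071)

set.seed(1)
x <- matrix(rnorm(200), ncol = 2)
y <- factor(ifelse(x[, 1]^2 + x[, 2]^2 > 1, 1, -1))

gamma <- 0.5; coef0 <- 1
fit <- svm(x, y, kernel = "polynomial", degree = 2,
           gamma = gamma, coef0 = coef0, scale = FALSE)

## Explicit feature map of K(u, v) = (gamma * <u, v> + coef0)^2 for 2-D inputs
phi <- function(z) c(gamma * z[1]^2,
                     gamma * z[2]^2,
                     sqrt(2) * gamma * z[1] * z[2],
                     sqrt(2 * gamma * coef0) * z[1],
                     sqrt(2 * gamma * coef0) * z[2],
                     coef0)

## Back transformation w = phi(SV)^T u, where u = coefs (alpha_i * y_i)
Phi_sv <- t(apply(fit$SV, 1, phi))
w <- drop(t(Phi_sv) %*% fit$coefs)
b <- -fit$rho

## Sanity check: w^T phi(x) + b reproduces the svm decision values
Phi_all  <- t(apply(x, 1, phi))
f_manual <- drop(Phi_all %*% w) + b
f_svm    <- drop(attr(predict(fit, x, decision.values = TRUE), "decision.values"))
all.equal(f_manual, f_svm, check.attributes = FALSE)
```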
Some kernel spaces are infinite-dimensional, though, as with the RBF kernel, and so the resulting weight vector is infinite-dimensional as well (i.e., you cannot explicitly represent it, nor the feature-space representation $\phi(x)$, since $p_K\rightarrow+\infty$). Kernels with finite-dimensional feature spaces do admit this solution, though.
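Even in the infinite-dimensional RBF case, the decision function itself is still computable from the dual expansion $f(x)=\sum_i u_i K(x_i, x) - \rho$, without ever forming $\mathbf{w}$. Continuing the sketch above (same x, y, and e1071; the gamma value is again arbitrary):

```r
## Sketch: with an RBF kernel there is no finite w to recover, but the
## decision function can be evaluated through its support-vector expansion.
fit_rbf <- svm(x, y, kernel = "radial", gamma = 1, scale = FALSE)

rbf <- function(u, v, gamma) exp(-gamma * sum((u - v)^2))

## f(x) = sum_i coefs_i * K(SV_i, x) - rho
f_dual <- apply(x, 1, function(z)
  sum(fit_rbf$coefs * apply(fit_rbf$SV, 1, rbf, v = z, gamma = 1)) - fit_rbf$rho)

f_pred <- drop(attr(predict(fit_rbf, x, decision.values = TRUE), "decision.values"))
all.equal(f_dual, f_pred, check.attributes = FALSE)
```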