
As far as I understand, the value of epsilon determines which data points (the support vectors) get included in the computation: these are the points that lie outside the 'tube' of width 2*epsilon.

[Figure: see the Smola & Schölkopf tutorial cited below]
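The effect described above can be checked directly in sklearn: widening the tube leaves more points inside it, so fewer of them become support vectors. A minimal sketch on synthetic data (the dataset and parameter values are purely illustrative):

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.RandomState(0)
X = np.sort(rng.uniform(0, 5, 200)).reshape(-1, 1)
y = np.sin(X).ravel() + rng.normal(scale=0.1, size=200)

# A wider tube (larger epsilon) absorbs more points,
# so fewer points end up as support vectors.
for eps in (0.01, 0.1, 0.5):
    model = SVR(kernel="rbf", C=1.0, epsilon=eps).fit(X, y)
    print(f"epsilon={eps}: {len(model.support_)} support vectors")
```

On this data the support-vector count drops sharply as epsilon grows, matching the "tube" picture above.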

On the other hand, the C in the minimization problem seems to penalize deviations beyond the epsilon margin. Does it therefore also determine the number of support vectors? Doesn't increasing it decrease them?

[Figure: see the Smola & Schölkopf tutorial cited below]

This contradicts what I experience when using sklearn.svm.SVR: I notice a distinct increase in computation time when I increase the value of C, which conflicts with the reasoning above. Why does increasing the constant C result in longer computation time?
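The timing effect can be reproduced with a small experiment. This is a sketch on synthetic data; the exact times are machine-dependent and the dataset is invented for illustration:

```python
import time
import numpy as np
from sklearn.svm import SVR

rng = np.random.RandomState(0)
X = rng.uniform(0, 5, (1000, 1))
y = np.sin(X).ravel() + rng.normal(scale=0.3, size=1000)

# libsvm's SMO solver typically needs more iterations to converge
# when C is large, so fit time tends to grow with C.
for C in (1.0, 100.0, 10000.0):
    t0 = time.perf_counter()
    SVR(kernel="rbf", C=C, epsilon=0.1).fit(X, y)
    print(f"C={C:>8}: fit took {time.perf_counter() - t0:.3f} s")
```

Note that sklearn's SVR also exposes a `max_iter` parameter if runaway fits at large C become a problem.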

*The images are from "A Tutorial on Support Vector Regression" by Alex J. Smola and Bernhard Schölkopf.

DexzMen
  • Check this answer (even though it uses classification example, the effect of C is same) - https://stats.stackexchange.com/questions/31066/what-is-the-influence-of-c-in-svms-with-linear-kernel – wololo Aug 28 '18 at 23:09
  • Increasing $C$ means you place a higher priority on avoiding mistakes. This doesn't necessarily mean you will have more support vectors. Generally speaking, when one looks at the optimization time one tends to look at $\lambda = \frac{1}{C}$ and how it impacts the strong convexity (i.e., increasing $C$ means we're still strongly convex, but with a "worse" constant) - this in turn does tend to mean that increasing $C$ results in longer optimization times. – MotiNK Aug 31 '18 at 18:02
  • Thank you for your response. What is lambda in your comment? – DexzMen Sep 03 '18 at 16:28
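Regarding the follow-up question: the $\lambda$ in the comment above is the regularization coefficient of the equivalent penalized form of the objective. Dividing the usual SVR objective through by $C$ makes the correspondence explicit (a sketch, with $\ell_\varepsilon$ denoting the $\varepsilon$-insensitive loss):

```latex
\min_{w,b}\ \tfrac{1}{2}\lVert w\rVert^2 + C\sum_{i=1}^{n}\ell_\varepsilon\bigl(y_i - f(x_i)\bigr)
\;\Longleftrightarrow\;
\min_{w,b}\ \tfrac{\lambda}{2}\lVert w\rVert^2 + \sum_{i=1}^{n}\ell_\varepsilon\bigl(y_i - f(x_i)\bigr),
\qquad \lambda = \tfrac{1}{C}.
```

A large $C$ corresponds to a small $\lambda$, i.e. a weaker strong-convexity constant on the objective, which is why solvers tend to need more iterations as $C$ grows.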

0 Answers