
I am trying to understand SVMs from a very basic level. I came across the statement that a large margin reduces the test error, i.e. improves the generalization ability of the classifier. I read about it further in this, but I still don't get the point.

Can someone please explain this to me?

3 Answers


Small margins allow the classifier to draw very complex decision boundaries that take many sharp turns. Larger margins don't allow that, because the margin wouldn't fit between the data points you are trying to turn around.

The more complex the decision boundary is, and the more it merely records where individual points of your training data lie, the less likely it is to be representative of data outside your training set: over-fitting.
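For a concrete feel, here is a minimal scikit-learn sketch (toy data chosen just for illustration; the exact numbers will vary): a large C approximates a hard, narrow margin, while a small C enforces a wide soft margin. The narrow-margin model typically fits the noisy training points more closely but does no better, and often worse, on held-out data.

```python
# Minimal sketch (assumes scikit-learn is installed): fit the same RBF-kernel
# SVM on noisy two-moons data with a large C (narrow margin) and a small C
# (wide soft margin), then compare training vs. test accuracy.
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_moons(n_samples=400, noise=0.3, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

for C in (1000.0, 0.1):  # large C ~ narrow margin, small C ~ wide margin
    clf = SVC(kernel="rbf", C=C, gamma=2.0).fit(X_train, y_train)
    print(f"C={C}: train acc={clf.score(X_train, y_train):.2f}, "
          f"test acc={clf.score(X_test, y_test):.2f}")
```

Plotting the two decision boundaries also makes the "sharp turns" mentioned above visible.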

David Ernst
  • Thanks for the answer, David. Does "memory capacity has been decreased", as mentioned in the linked post, mean the same thing as: the ability to learn complex decision boundaries has decreased because the margin is large? – Ashutosh Mishra Sep 22 '17 at 19:55
  • Yes, less memory capacity if the margin is large. – David Ernst Sep 22 '17 at 20:06

One answer to this question comes from SRM (Structural Risk Minimization). In fact, it has been proven in (Vapnik, 2000) that the VC dimension of large-margin separating hyperplanes is bounded in terms of the margin: the larger the margin, the smaller the bound.
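Up to the exact constants and conditions (this is only a sketch of the standard statement; see Vapnik 2000 or Burges' SVM tutorial for the precise version), the bound says that hyperplanes separating data lying inside a ball of radius $R$ with margin $\rho$ have VC dimension

```latex
% sketch of the margin-based VC bound, up to exact constants/conditions
h \;\le\; \min\!\left( \left\lceil \frac{R^2}{\rho^2} \right\rceil,\; d \right) + 1
```

so a larger margin gives a smaller capacity term in the SRM risk bound, independently of the input dimension $d$ once $R^2/\rho^2$ drops below $d$.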

A more in-depth discussion can be found here.

Tonca

Because a larger soft margin produces a more stable result: it accepts a little extra bias in exchange for less variance error on the training data. It helps to avoid overfitting as well.
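To see the noise-sensitivity point concretely, here is a minimal sketch with made-up toy data (assumes scikit-learn; the outlier placement and the C values are arbitrary choices for illustration): add one mislabeled point and compare how much the learned boundary direction moves for a nearly hard margin (large C) versus a wide soft margin (small C).

```python
# Minimal sketch (hypothetical toy data, assumes scikit-learn): measure how far
# the linear SVM's boundary direction rotates when a single mislabeled outlier
# is added, for a nearly hard margin (large C) vs. a wide soft margin (small C).
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 1, (50, 2)), rng.normal(2, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
X_noisy = np.vstack([X, [[1.5, 1.5]]])  # one point sitting inside class 1 ...
y_noisy = np.append(y, 0)               # ... but labeled as class 0

def direction(C, X, y):
    # unit normal vector of the fitted separating hyperplane
    w = SVC(kernel="linear", C=C).fit(X, y).coef_.ravel()
    return w / np.linalg.norm(w)

for C in (100.0, 0.01):
    shift = np.linalg.norm(direction(C, X, y) - direction(C, X_noisy, y_noisy))
    print(f"C={C}: boundary direction shift caused by the outlier = {shift:.3f}")
```

Typically the large-C (narrow-margin) boundary moves noticeably more, which is the variance part of the trade-off described above.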

Tonca
  • Thanks Tonca, but could you please put it in more general, layman's language? – Ashutosh Mishra Sep 22 '17 at 19:50
  • In other words, a larger margin is less sensitive to noise or outliers. But you should really read about the bias-variance trade-off; it is one of the most important concepts in ML. – Tonca Sep 22 '17 at 19:55