
Is the loss the same as the error in deep learning?

I feel they're the same, but maybe I'm wrong...

Fractale

1 Answer


Usually loss and error are different concepts, but people sometimes conflate the two because, conceptually, they're similar.

Loss functions measure the misfit of the model -- how much the model is wrong.

Error usually is shorthand for "error rate," the proportion of samples misclassified.

These two concepts are not necessarily the same. For example, cross-entropy loss can be any non-negative number, while the error rate is always a number between 0 and 1.

Moreover, the error rate is not a differentiable function, so it is not suitable for use in the back-propagation algorithm. Cross-entropy loss, by contrast, is differentiable and perfectly reasonable to use in back-prop.
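To make the distinction concrete, here is a minimal sketch (with made-up toy labels and predicted probabilities, using NumPy) that computes both quantities for a binary classifier and shows they need not agree:

```python
import numpy as np

y_true = np.array([1, 0, 1, 1])          # ground-truth labels (toy data)
p_pred = np.array([0.9, 0.4, 0.6, 0.2])  # predicted P(y = 1) (toy data)

# Cross-entropy (log) loss: any non-negative number,
# differentiable with respect to the predicted probabilities.
eps = 1e-12  # guard against log(0)
cross_entropy = -np.mean(
    y_true * np.log(p_pred + eps) + (1 - y_true) * np.log(1 - p_pred + eps)
)

# Error rate: proportion of samples misclassified at a 0.5 threshold.
# Always between 0 and 1, and not differentiable (it jumps at the threshold).
error_rate = np.mean((p_pred >= 0.5).astype(int) != y_true)

print(f"cross-entropy loss: {cross_entropy:.4f}")  # ~0.6841
print(f"error rate:         {error_rate:.4f}")     # 0.2500
```

Note how the last sample (true label 1, predicted probability 0.2) contributes heavily to the loss, while the error rate only records that it was misclassified; nudging 0.2 slightly higher lowers the loss but leaves the error rate unchanged, which is exactly why the loss, not the error rate, is what gradient descent optimizes.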

Sycorax
  • To see whether I under-fit or over-fit my data, should I look at the total loss or at the error rate? error rate = nbSamplesMisclassified / nbSample, right? – Fractale Nov 14 '18 at 05:05
  • General questions about neural networks and overfitting have a number of threads on this website. Here's one that seems like a good place to start: https://stats.stackexchange.com/questions/131233/neural-network-over-fitting You can find more using the search feature. – Sycorax Nov 14 '18 at 08:18