
I read the material on the difference between accuracy and precision, but it left me confused. Can I define accuracy as: \begin{equation} accuracy=\frac{TruePositive+TrueNegative}{TruePositive+TrueNegative+FalsePositive+FalseNegative} \end{equation}
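In code, the definition I have in mind would be something like this (just a minimal sketch with made-up counts):

```python
# Sketch of the accuracy formula above; the counts are made-up example values.
def accuracy(tp, tn, fp, fn):
    """Fraction of all cases that are classified correctly."""
    return (tp + tn) / (tp + tn + fp + fn)

print(accuracy(tp=40, tn=45, fp=5, fn=10))  # (40 + 45) / 100 = 0.85
```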

So in machine learning, what is the difference between accuracy and precision?

GoingMyWay
  • [This picture](https://www.google.com/#q=accuracy+precision+bullseye) is a good way to internalize the difference. (In machine learning, accuracy vs. precision is actually analogous to bias vs. variance, if you are familiar with that.) – GeoMatt22 Oct 14 '16 at 04:04
  • @GeoMatt22, thank you, can I define accuracy as: \begin{equation} accuracy=\frac{TruePositive+TrueNegative}{TruePositive+TrueNegative+FalsePositive+FalseNegative} \end{equation} – GoingMyWay Oct 14 '16 at 04:25
  • Aaah, now I see your confusion. "Accuracy" and "precision" are general terms throughout science (and have the sense indicated by the bullseye diagrams I linked to before). However, in the *particular* context of [Binary Classification](https://en.wikipedia.org/wiki/Evaluation_of_binary_classifiers) these terms have very specific definitions. The chart at that Wikipedia page gives these. (Note that this context is more specialized than just "machine learning".) – GeoMatt22 Oct 14 '16 at 04:34
  • Just for reference, I made my comments into an answer. (We have far too many "unanswered questions" already that are answered in the comments!) – GeoMatt22 Oct 14 '16 at 04:57

1 Answer


(Just for reference, I am posting my comments as an answer. Note that the first version of the question did not include the formula.)

"Accuracy" and "precision" are general terms throughout science. A good way to internalize the difference are the common "bullseye diagrams". In machine learning/statistics as a whole, accuracy vs. precision is analogous to bias vs. variance.

However, in the particular context of [Binary Classification](https://en.wikipedia.org/wiki/Evaluation_of_binary_classifiers)* these terms have very specific definitions. The chart at that Wikipedia page gives these, which are $$\mathrm{Accuracy}=\frac{\mathrm{True}}{\mathrm{Total}} \text{ , } \mathrm{Precision}=\frac{\mathrm{True\;Positive}}{\mathrm{All\;Positive}} $$ i.e. the fraction of all cases that are correctly classified vs. the fraction of predicted positives that are truly positive.

(*Note that this context is much more specialized than simply "machine learning".)
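In code, the contrast looks like this (a minimal sketch on made-up labels, not from any particular dataset):

```python
# Made-up predictions to illustrate the two binary-classification definitions.
y_true = [1, 1, 1, 0, 0, 0, 0, 0]
y_pred = [1, 1, 0, 1, 0, 0, 0, 0]

tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))  # true positives  = 2
tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))  # true negatives  = 4
fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))  # false positives = 1
fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))  # false negatives = 1

accuracy = (tp + tn) / (tp + tn + fp + fn)  # correct / total          = 6/8 = 0.75
precision = tp / (tp + fp)                  # true pos / predicted pos = 2/3 ≈ 0.67
print(accuracy, precision)
```

Note that when positives are rare, a classifier can score high accuracy while its precision is poor (a few false positives barely dent accuracy but drag precision down), which is one reason the two are reported separately.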

GeoMatt22