I know how Naive Bayes works for binary classification problems. I just need to know the standard way to apply NB to multi-class classification problems. Any ideas, please?
1 Answer
Unlike some classifiers, multi-class labeling is trivial with Naive Bayes.
For each test example $i$ and each class $k$, you want to find: $$\arg \max_k P(\textrm{class}_k \mid \textrm{data}_i)$$
In other words, you compute the probability of each class label in the usual way, then pick the class with the largest probability.
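The "compute each class posterior, pick the largest" recipe can be sketched as a minimal multinomial Naive Bayes in plain NumPy. The toy word-count data, the three class labels, and the `predict` helper are illustrative assumptions, not part of the original post:

```python
import numpy as np

# Hypothetical toy data: count features for 6 examples, 3 classes (assumption)
X = np.array([[2, 1, 0],
              [1, 2, 0],
              [0, 1, 2],
              [0, 2, 1],
              [2, 0, 2],
              [1, 0, 2]])
y = np.array([0, 0, 1, 1, 2, 2])

classes = np.unique(y)
# log P(class_k), estimated from class frequencies
log_prior = np.log(np.array([(y == k).mean() for k in classes]))

# Per-class feature counts with Laplace (add-one) smoothing,
# normalized into log P(feature_j | class_k)
counts = np.array([X[y == k].sum(axis=0) for k in classes]) + 1.0
log_lik = np.log(counts / counts.sum(axis=1, keepdims=True))

def predict(x):
    # log P(class_k | x) is proportional to
    # log P(class_k) + sum_j x_j * log P(feature_j | class_k);
    # the argmax over k is the predicted class
    log_post = log_prior + x @ log_lik.T
    return int(np.argmax(log_post))

print(predict(np.array([2, 1, 0])))
```

Note that nothing in the scoring step is specific to two classes: the same argmax works for any number of class labels, which is why multi-class NB is "trivial."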


Matt Krause
- Thank you, Matt. As you said, it is pretty straightforward. I think this would not be the case with SVMs, for example. – Mohammadreza Mar 25 '15 at 03:30
- My pleasure. For other methods, there are (many) ways of combining two-way classifiers (like SVMs) into a multi-class system. I think there has also been some work on extending SVMs to do this "natively." – Matt Krause Mar 25 '15 at 15:52