
## Naive Bayes Classifier in Data Mining

**Step 1. Calculate the prior P(C_i) for each class**

P(buys_computer = “no”) = 5/14 = 0.357.

P(buys_computer = “yes”) = 9/14 = 0.643.
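Each prior is just a class count divided by the training-set size. A quick check in Python, with the 9 “yes” / 5 “no” class counts from this example hard-coded:

```python
# Class counts from the 14-tuple training set used in this example
n_total = 14
n_yes, n_no = 9, 5

p_yes = n_yes / n_total  # prior P(buys_computer = "yes")
p_no = n_no / n_total    # prior P(buys_computer = "no")

print(round(p_yes, 3))  # 0.643
print(round(p_no, 3))   # 0.357
```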

**Step 2. Calculate P(X|C_i) for each class**

P(age = “<=30” | buys_computer = “no”) = 3/5 = 0.6.

P(age = “<=30” | buys_computer = “yes”) = 2/9 = 0.222.

P(income = “medium” | buys_computer = “no”) = 2/5 = 0.4.

P(income = “medium” | buys_computer = “yes”) = 4/9 = 0.444.

P(student = “yes” | buys_computer = “no”) = 1/5 = 0.2.

P(student = “yes” | buys_computer = “yes”) = 6/9 = 0.667.

P(credit_rating = “fair” | buys_computer = “no”) = 2/5 = 0.4.

P(credit_rating = “fair” | buys_computer = “yes”) = 6/9 = 0.667.
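Every conditional above is a within-class count divided by the class size. A minimal sketch of that bookkeeping, with the counts from this example hard-coded (the dictionary layout is my own):

```python
# Attribute-value counts within each class, matching the fractions
# used in this example (e.g. 2 of the 9 "yes" tuples have age <= 30)
counts = {
    "yes": {"age<=30": 2, "income=medium": 4, "student=yes": 6, "credit=fair": 6, "total": 9},
    "no":  {"age<=30": 3, "income=medium": 2, "student=yes": 1, "credit=fair": 2, "total": 5},
}

for cls, c in counts.items():
    for attr in ("age<=30", "income=medium", "student=yes", "credit=fair"):
        # P(attr | buys_computer = cls) = count(attr, cls) / count(cls)
        print(f"P({attr} | buys_computer={cls}) = {c[attr]}/{c['total']} = {c[attr] / c['total']:.3f}")
```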

**Step 3.** Select the unknown tuple X that you want to classify.

**X = (age <=30, income = medium, student = yes, credit_rating = fair)**

**Step 4.** Calculate P(X|C_i) by multiplying the conditional probabilities from Step 2:

P(X|buys_computer = “no”) = 0.6 x 0.4 x 0.2 x 0.4 = 0.019

P(X|buys_computer = “yes”) = 0.222 x 0.444 x 0.667 x 0.667 = 0.044
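Under the naive (class-conditional independence) assumption, P(X|C_i) is simply the product of the four conditionals, so the arithmetic can be checked directly:

```python
# Product of the four conditional probabilities for each class
p_x_given_no = 0.6 * 0.4 * 0.2 * 0.4
p_x_given_yes = 0.222 * 0.444 * 0.667 * 0.667

print(round(p_x_given_no, 3))   # 0.019
print(round(p_x_given_yes, 3))  # 0.044
```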

**Step 5.** Calculate P(X|C_i) * P(C_i) for each class and pick the class with the larger product:

P(X|buys_computer = “no”) * P(buys_computer = “no”) = 0.007

P(X|buys_computer = “yes”) * P(buys_computer = “yes”) = 0.028

**Therefore, since 0.028 > 0.007, X belongs to the class (“buys_computer = yes”).**
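The whole calculation can be reproduced end to end. The sketch below assumes the classic 14-tuple “buys_computer” training set from Han and Kamber's textbook, whose counts match every fraction used above; the `naive_bayes_classify` helper name is my own:

```python
from collections import Counter

# The classic 14-tuple training set (Han & Kamber) whose counts
# match every fraction used in this example.
# Columns: age, income, student, credit_rating, buys_computer
data = [
    ("<=30",   "high",   "no",  "fair",      "no"),
    ("<=30",   "high",   "no",  "excellent", "no"),
    ("31..40", "high",   "no",  "fair",      "yes"),
    (">40",    "medium", "no",  "fair",      "yes"),
    (">40",    "low",    "yes", "fair",      "yes"),
    (">40",    "low",    "yes", "excellent", "no"),
    ("31..40", "low",    "yes", "excellent", "yes"),
    ("<=30",   "medium", "no",  "fair",      "no"),
    ("<=30",   "low",    "yes", "fair",      "yes"),
    (">40",    "medium", "yes", "fair",      "yes"),
    ("<=30",   "medium", "yes", "excellent", "yes"),
    ("31..40", "medium", "no",  "excellent", "yes"),
    ("31..40", "high",   "yes", "fair",      "yes"),
    (">40",    "medium", "no",  "excellent", "no"),
]

def naive_bayes_classify(x):
    """Return (label, scores): the class maximizing P(X|Ci) * P(Ci)."""
    class_counts = Counter(row[-1] for row in data)
    scores = {}
    for cls, n_cls in class_counts.items():
        score = n_cls / len(data)           # prior P(Ci)
        for j, value in enumerate(x):       # conditional P(xj | Ci)
            n_match = sum(1 for row in data if row[-1] == cls and row[j] == value)
            score *= n_match / n_cls
        scores[cls] = score
    return max(scores, key=scores.get), scores

X = ("<=30", "medium", "yes", "fair")
label, scores = naive_bayes_classify(X)
print(label)  # yes
```

Note that this plain frequency-counting version gives a zero score whenever an attribute value never co-occurs with a class in the training data; in practice a Laplacian (add-one) correction is applied to avoid that.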

