Bayes’ theorem and Naive Bayes are important techniques in machine learning. I am going to cover the Naive Bayes classifier, which is widely used in machine learning, but before that, in this post I will explain Bayes’ theorem through examples.


A brief introduction to Bayes’ theorem:

Let A and B denote two events. In Bayes’ theorem, P(A|B), the probability that A occurs given that B has already occurred, can be computed by:

$P(A|B) = \frac{P(B|A)P(A)}{P(B)}$

where P(B|A) is the probability of observing B given that A occurs, and P(A) and P(B) are the probabilities that A and B occur, respectively.
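
As a quick sanity check, here is a minimal Python sketch of the formula; the function name `bayes` and its argument names are just illustrative choices:

```python
def bayes(p_b_given_a, p_a, p_b):
    """Return P(A|B) = P(B|A) * P(A) / P(B)."""
    return p_b_given_a * p_a / p_b
```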


Examples:

I am adapting these examples from several books and tutorial sites, and I am listing the references here:


Example 1: Coin flip

Given two coins, one is unfair, with 90% of flips coming up heads and 10% coming up tails, while the other is fair. We randomly pick one coin and flip it. What is the probability that this coin is the unfair one if we get a head?

Let’s denote by U the event of picking the unfair coin and by H the event of getting a head. The probability that the coin is the unfair one given that a head is observed, P(U|H), can be calculated as:

$P(U|H) = \frac{P(H|U)P(U)}{P(H)}$

P(H|U) = 90%

P(U) = 50%, as we randomly pick one of the two coins.

P(H) can occur via two paths: the unfair coin is picked (U) or the fair coin is picked (F), so
P(H) = P(H|U)P(U) + P(H|F)P(F) = 0.9 × 0.5 + 0.5 × 0.5 = 0.7

Therefore P(U|H) = 0.9 × 0.5 / 0.7 = 0.45 / 0.7 ≈ 64.3%.
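
Here is a short Python sketch of this calculation (the variable names are my own):

```python
# Coin-flip example: P(U|H) = P(H|U)P(U) / P(H)
p_h_given_u = 0.9    # unfair coin lands heads 90% of the time
p_h_given_f = 0.5    # fair coin lands heads 50% of the time
p_u = p_f = 0.5      # each coin is picked with equal probability

p_h = p_h_given_u * p_u + p_h_given_f * p_f    # total probability of a head
p_u_given_h = p_h_given_u * p_u / p_h

print(p_u_given_h)   # 0.642857... -> about 64.3%
```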


Example 2: Cancer screening

Suppose a physician reported the following cancer screening test scenario:

|               | Cancer | No Cancer | Total |
|---------------|--------|-----------|-------|
| Test Positive | 80     | 900       | 980   |
| Test Negative | 20     | 9000      | 9020  |
| Total         | 100    | 9900      | 10000 |

The problem: if the result of this screening test on a person is positive, what is the probability that they actually have cancer?

Let’s assign the event of having cancer as C and a positive test as Pos, and calculate P(C|Pos):

$P(C|Pos) = \frac{P(Pos|C)P(C)}{P(Pos)} = \frac{0.8 \times 0.01}{0.098} \approx 8.2\%$

From the table, P(Pos|C) = 80/100 = 0.8, P(C) = 100/10000 = 0.01, and P(Pos) = 980/10000 = 0.098, so only about 8% of the people who test positive actually have cancer.
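
A short Python sketch of the same calculation, working directly from the counts in the table (the variable names are mine):

```python
# Cancer-screening example, computed from the table counts
true_pos, false_pos = 80, 900      # test positive: cancer / no cancer
false_neg, true_neg = 20, 9000     # test negative: cancer / no cancer
total = true_pos + false_pos + false_neg + true_neg    # 10000

p_pos_given_c = true_pos / (true_pos + false_neg)      # 80 / 100    = 0.8
p_c = (true_pos + false_neg) / total                   # 100 / 10000 = 0.01
p_pos = (true_pos + false_pos) / total                 # 980 / 10000 = 0.098

print(p_pos_given_c * p_c / p_pos)    # 0.0816... -> about 8.2%
```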


Example 3: Email spam detector

Let us start with this training data

|               | ID | Terms in email              | Is spam |
|---------------|----|-----------------------------|---------|
| Training data | 1  | Click win prize             | Yes     |
|               | 2  | Click meeting setup meeting | No      |
|               | 3  | Prize free prize            | Yes     |
|               | 4  | Click prize free            | Yes     |
| Testing case  | 5  | Free setup meeting free     | ?       |

First we define two events: spam ($S$) and not spam ($NS$).
From the training set:
$P(S) = \frac{3}{4}$
$P(NS) = \frac{1}{4}$

Or we can impose a prior assumption instead, for example $P(S) = 1\%$.

To calculate $P(S|x)$, where $x = (free, setup, meeting, free)$ is the testing email, we start by calculating the likelihoods $P(x_i|S)$ and $P(x_i|NS)$ with Laplace (add-one) smoothing: the spam emails contain 9 word occurrences in total, the non-spam email contains 4, and the vocabulary has 6 distinct words, which gives the +1 in each numerator and the +6 in each denominator.
$P(free|S) = \frac{2+1}{9+6} = \frac{3}{15}$
$P(free|NS) = \frac{0+1}{4+6} = \frac{1}{10}$


$P(setup|S) = \frac{0+1}{9+6} = \frac{1}{15}$
$P(setup|NS) = \frac{1+1}{4+6} = \frac{2}{10}$


$P(meeting|S) = \frac{0+1}{9+6} = \frac{1}{15}$
$P(meeting|NS) = \frac{2+1}{4+6} = \frac{3}{10}$

So, the solution (note that free appears twice in the testing email):

$P(S|x) \propto P(S)\,P(free|S)^2\,P(setup|S)\,P(meeting|S) = \frac{3}{4} \times \left(\frac{3}{15}\right)^2 \times \frac{1}{15} \times \frac{1}{15} = \frac{1}{7500}$

$P(NS|x) \propto P(NS)\,P(free|NS)^2\,P(setup|NS)\,P(meeting|NS) = \frac{1}{4} \times \left(\frac{1}{10}\right)^2 \times \frac{2}{10} \times \frac{3}{10} = \frac{6}{40000}$

Normalizing so the two sum to 1 gives $P(S|x) = \frac{8}{17} \approx 47\%$, so with the priors from the training set this email is classified as not spam.
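
To double-check the arithmetic, here is a small Python sketch of the whole calculation on the toy data above; the data structures and the `likelihood` helper are my own illustration, not the implementation built later in this series:

```python
from collections import Counter

# Toy training data from the table above
emails = [
    ("click win prize", True),
    ("click meeting setup meeting", False),
    ("prize free prize", True),
    ("click prize free", True),
]
test_email = "free setup meeting free"

# Priors estimated from the training set
p_spam = sum(is_spam for _, is_spam in emails) / len(emails)   # 3/4
p_ham = 1 - p_spam                                             # 1/4

# Word counts per class and the vocabulary (needed for Laplace smoothing)
spam_counts = Counter(w for text, is_spam in emails if is_spam for w in text.split())
ham_counts = Counter(w for text, is_spam in emails if not is_spam for w in text.split())
vocab = {w for text, _ in emails for w in text.split()}        # 6 distinct words

def likelihood(word, counts):
    """P(word | class) with add-one (Laplace) smoothing."""
    return (counts[word] + 1) / (sum(counts.values()) + len(vocab))

# Unnormalized posteriors under the naive independence assumption
post_spam, post_ham = p_spam, p_ham
for word in test_email.split():
    post_spam *= likelihood(word, spam_counts)
    post_ham *= likelihood(word, ham_counts)

print(post_spam / (post_spam + post_ham))   # 0.4705... -> about 47%, so not spam
```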


List of posts

This post is part of a series of posts

  1. Preparation and introduction
  2. Naive Bayes by example (this post)
  3. Scrubbing natural language text
  4. Naive Bayes Classifier
  5. Writing Naive Bayes from scratch
  6. Using the Scikit-learn library