Tuesday, July 21, 2015

Day 32 Banneker Institute

Bayes' Theorem


Hola mundo,

Today I'll be explaining Bayes' Theorem and its use in predicting events. Bayes' Theorem starts from a prior: a hypothesis about a set of parameters, built from what you already know about the system you are trying to predict. Note that you build a hypothesis about the parameters, not about the result itself. The theorem then uses a likelihood, which is the probability of the event happening given that your parameters are true. The prior and the likelihood are completely independent of each other. Bayes' Theorem also includes a term called the evidence (E), which I am not using in my case because it's just a normalization term.

$$P(a|D) = \frac{{\Pi}(a)L(D|a)}{E}$$
$\Pi(a)$ is the prior, where $a$ is the set of parameters that make up the hypothesis.

$L(D|a)$ is the likelihood, where $D$ is the data from the system in question.
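To make the formula concrete, here is a minimal sketch of the arithmetic in Python. The numbers for the prior, likelihood, and evidence are hypothetical, chosen only to show how the three terms combine; they are not values from my project.

```python
# Hypothetical values (not from the post), just to show the arithmetic
# of Bayes' Theorem: posterior = prior * likelihood / evidence.
prior = 0.2        # Pi(a): prior belief in the hypothesis a
likelihood = 0.9   # L(D|a): probability of the data D given a
evidence = 0.3     # E: normalization term

posterior = prior * likelihood / evidence  # P(a|D)
print(posterior)  # 0.6
```

Since the evidence only rescales the result, dropping it (as I do) still lets you compare hypotheses against each other; you just lose the overall normalization.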

I used this theorem to find the probability of getting heads when flipping a fair coin. I ran a few trials and plotted the results of the formula below.

$$P \, H^{k}(1-H)^{n-k}$$
Where $P$ is the prior, $H$ is the probability of getting heads, $n$ is the number of trials, and $k$ is the number of times we get heads. Below is a plot with the probability density distributions of my trials. As you can see, the maximum probability in my trials is close to 0.5, which is the probability of getting heads with a fair coin if you flip it an infinite number of times.
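The calculation behind the plot can be sketched like this: evaluate the formula on a grid of candidate values of $H$ and look for the peak. The trial counts below are hypothetical stand-ins for my actual runs, and I use a flat prior as a simplifying assumption.

```python
import numpy as np

# Grid of candidate values for H, the probability of getting heads.
H = np.linspace(0.0, 1.0, 1001)

# Hypothetical trial: n flips with k heads (illustrative numbers only).
n, k = 100, 48

# Flat prior P over the grid, and the H^k (1-H)^(n-k) likelihood term.
prior = np.ones_like(H)
likelihood = H**k * (1.0 - H)**(n - k)

# Unnormalized posterior: prior times likelihood (the evidence term
# is skipped, as in the post, since it only rescales the curve).
posterior = prior * likelihood

# The peak sits at k/n, which approaches 0.5 as the fair coin is
# flipped more and more times.
H_best = H[np.argmax(posterior)]
print(H_best)
```

Plotting `posterior` against `H` (for example with matplotlib) reproduces the kind of probability density curves shown in the figure, with the peak tightening around 0.5 as $n$ grows.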

"Happiness is when what you think, what you say, and what you do are in harmony."

-Mahatma Gandhi
