Probability

Rules

Addition Rule

`P(A ∪ B) = P(A) + P(B) − P(A ∩ B)`

If A and B are disjoint (mutually exclusive) then:

`P(A ∪ B) = P(A) + P(B)`
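
A quick sketch with a made-up example (one roll of a fair die; A = "roll is even", B = "roll is greater than 3"):

```python
from fractions import Fraction

# Made-up example: one roll of a fair six-sided die.
outcomes = {1, 2, 3, 4, 5, 6}
A = {2, 4, 6}   # roll is even
B = {4, 5, 6}   # roll is greater than 3

def p(event):
    """Probability of an event under equally likely outcomes."""
    return Fraction(len(event), len(outcomes))

# Addition rule: P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
assert p(A | B) == p(A) + p(B) - p(A & B)   # 4/6 = 3/6 + 3/6 − 2/6
print(p(A | B))                             # 2/3
```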

Product Rule

`P(A ∩ B) = P(A|B) * P(B)`

If A and B are independent then:

`P(A ∩ B) = P(A) * P(B)`
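
A minimal sketch of the independent case, using a made-up example of two fair coin flips:

```python
from fractions import Fraction
from itertools import product

# Made-up example: two independent flips of a fair coin.
outcomes = set(product("HT", repeat=2))      # {('H','H'), ('H','T'), ('T','H'), ('T','T')}
A = {o for o in outcomes if o[0] == "H"}     # first flip is heads
B = {o for o in outcomes if o[1] == "H"}     # second flip is heads

def p(event):
    return Fraction(len(event), len(outcomes))

# Independence: P(A ∩ B) = P(A) * P(B)
assert p(A & B) == p(A) * p(B)               # 1/4 = 1/2 * 1/2
```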

Rearranging the product rule gives the conditional probability (Bayes’ theorem, baby version):

`P(A|B) = (P(A ∩ B)) / (P(B))`
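
A small worked example of the rearranged form, using a made-up draw of one card from a standard 52-card deck (A = "king", B = "face card"):

```python
from fractions import Fraction

# Made-up example: draw one card from a standard 52-card deck.
# A = "card is a king" (4 cards), B = "card is a face card" (J, Q, K: 12 cards).
p_B = Fraction(12, 52)
p_A_and_B = Fraction(4, 52)   # every king is also a face card

# P(A|B) = P(A ∩ B) / P(B)
p_A_given_B = p_A_and_B / p_B
print(p_A_given_B)            # 1/3
```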

Event Relationships

Two events A and B can be:

A and B disjoint =>

`P(A ∩ B) = 0`

A and B complementary =>

`P(A) + P(B) = 1`
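
A made-up die example to keep the two ideas apart: complementary events are disjoint and together cover the whole sample space, while disjoint events need not.

```python
from fractions import Fraction

# Made-up example: one roll of a fair six-sided die.
outcomes = {1, 2, 3, 4, 5, 6}

def p(event):
    return Fraction(len(event), len(outcomes))

# Disjoint but NOT complementary: never happen together, yet don't cover everything.
A, B = {1}, {6}
assert p(A & B) == 0
assert p(A) + p(B) != 1

# Complementary (and therefore also disjoint): B is "not A".
A = {1, 2, 3}
B = outcomes - A
assert p(A & B) == 0
assert p(A) + p(B) == 1
```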

Bayesian Inference

Posterior probability - P(hypothesis | data), the probability of the hypothesis after seeing the data.

TODO - Bayesian inference
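
In the meantime, a minimal sketch of a posterior calculation via Bayes’ theorem, P(H | D) = P(D | H) * P(H) / P(D); all numbers below are invented for illustration.

```python
# Hypothetical diagnostic-test style example (all numbers invented).
prior = 0.01                 # P(H): prior probability the hypothesis is true
p_data_given_h = 0.95        # P(D | H): probability of the data if H is true
p_data_given_not_h = 0.05    # P(D | not H): probability of the data if H is false

# Total probability of the data: P(D) = P(D|H)P(H) + P(D|not H)P(not H)
p_data = p_data_given_h * prior + p_data_given_not_h * (1 - prior)

# Posterior: P(H | D) = P(D | H) * P(H) / P(D)
posterior = p_data_given_h * prior / p_data
print(round(posterior, 3))   # ≈ 0.161
```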

Binomial Distribution

A random variable has a binomial distribution when:

  1. Trials are independent
  2. The number of trials is fixed
  3. Only two possible outcomes (success / failure)
  4. P(success) is the same for each trial
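
A small simulation sketch that meets all four conditions (a fixed number of independent success/failure trials with constant p; the values of n and p are arbitrary):

```python
import random

# Arbitrary illustration values: n independent trials, success probability p each.
n, p = 10, 0.5
random.seed(0)

def one_binomial_draw():
    """Count successes in n independent success/failure trials (conditions 1-4)."""
    return sum(random.random() < p for _ in range(n))

draws = [one_binomial_draw() for _ in range(100_000)]
print(sum(draws) / len(draws))   # close to n * p = 5
```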

Probability of k successes in n trials

`P(X = k) = ((n),(k)) * p^k * (1 - p)^(n - k)`
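
A direct translation of the formula (n, k, and p below are arbitrary illustration values), using the standard library's math.comb for the binomial coefficient:

```python
from math import comb

def binomial_pmf(k, n, p):
    """P(exactly k successes in n trials) = C(n, k) * p^k * (1 - p)^(n - k)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Arbitrary example: 3 successes in 10 trials with p = 0.5.
print(binomial_pmf(3, 10, 0.5))   # 0.1171875
```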

Binomial coefficient

n choose k

`((n),(k)) = (n!) / (k! (n-k)!)`
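
The same coefficient written out from the factorial definition and checked against math.comb (arbitrary values):

```python
from math import comb, factorial

n, k = 10, 3   # arbitrary illustration values
n_choose_k = factorial(n) // (factorial(k) * factorial(n - k))
assert n_choose_k == comb(n, k) == 120
```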

Expected number of successes

Mean

`μ = n * p`

Standard deviation

`σ = sqrt(n * p * (1-p))`
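
Plugging arbitrary values of n and p into both formulas:

```python
from math import sqrt

n, p = 100, 0.3                  # arbitrary illustration values
mu = n * p                       # μ = n * p
sigma = sqrt(n * p * (1 - p))    # σ = sqrt(n * p * (1 - p))
print(mu, round(sigma, 3))       # 30.0 4.583
```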

Normal Distribution Approximation to Binomial

When n is sufficiently large, the binomial distribution can be approximated by a normal distribution with mean `μ = n * p` and standard deviation `σ = sqrt(n * p * (1-p))`.

Rule of thumb for “sufficiently large”:

`n * p ≥ 10, n * (1 − p) ≥ 10`
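
A rough check of the approximation using only the standard library (the normal CDF is built from math.erf; n, p, and the cutoff are arbitrary illustration values):

```python
from math import comb, erf, sqrt

n, p = 100, 0.3                      # satisfies n * p ≥ 10 and n * (1 − p) ≥ 10
mu, sigma = n * p, sqrt(n * p * (1 - p))

def normal_cdf(x, mu, sigma):
    """Φ((x − μ)/σ) via the error function."""
    return 0.5 * (1 + erf((x - mu) / (sigma * sqrt(2))))

# P(X ≤ 35): exact binomial sum vs. normal approximation (with continuity correction).
exact = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(36))
approx = normal_cdf(35.5, mu, sigma)
print(round(exact, 4), round(approx, 4))   # both close to 0.88
```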

© Will Robertson