Law of Total Probability and Bayes' Theorem
Master the law of total probability and Bayes' theorem for probabilistic inference.
25 min read
Intermediate
Introduction
This lesson covers two of the most powerful tools in probability: the Law of Total Probability (breaking problems into cases) and Bayes' Theorem (updating beliefs with evidence).
Learning Objectives:
- Apply the law of total probability
- Use Bayes' theorem for inference
- Solve real-world problems
Law of Total Probability
If events B_1, B_2, ..., B_n partition the sample space S (the B_i are disjoint and together cover S), then:

P(A) = P(A | B_1) P(B_1) + P(A | B_2) P(B_2) + ... + P(A | B_n) P(B_n)

Strategy: Break P(A) into cases based on which B_i occurs.
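As a quick numeric illustration of the case-splitting strategy (the supplier shares and defect rates below are made-up numbers, not from the lesson):

```python
# Hypothetical: a plant buys parts from two suppliers (assumed numbers)
p_s1, p_s2 = 0.60, 0.40            # P(B1), P(B2): the partition
p_def_s1, p_def_s2 = 0.02, 0.05    # P(A | B1), P(A | B2): defect rates

# Law of total probability: P(A) = P(A|B1)P(B1) + P(A|B2)P(B2)
p_def = p_def_s1 * p_s1 + p_def_s2 * p_s2
print(f"P(defective) = {p_def:.3f}")  # 0.012 + 0.020 = 0.032
```

Each term is one case: "the part came from supplier i AND is defective"; summing over the partition gives the overall defect rate.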
Bayes' Theorem
For events A and B with P(A) > 0:

P(B | A) = P(A | B) P(B) / P(A)

Interpretation:
- P(B): Prior probability (before seeing A)
- P(B | A): Posterior probability (after seeing A)
- P(A | B): Likelihood (how likely A is under B)
```python
# Medical test example
def bayes_medical_test():
    # Prior: 1% have the disease
    p_disease = 0.01
    p_no_disease = 0.99

    # Likelihood: test accuracy
    p_pos_given_disease = 0.95     # sensitivity
    p_pos_given_no_disease = 0.05  # false positive rate

    # Law of total probability: P(+)
    p_pos = (p_pos_given_disease * p_disease
             + p_pos_given_no_disease * p_no_disease)

    # Bayes' theorem: P(disease | +)
    p_disease_given_pos = (p_pos_given_disease * p_disease) / p_pos

    print(f"P(disease) = {p_disease:.2%} (prior)")
    print(f"P(+ | disease) = {p_pos_given_disease:.2%}")
    print(f"P(+ | no disease) = {p_pos_given_no_disease:.2%}")
    print(f"\nP(disease | +) = {p_disease_given_pos:.2%} (posterior)")

bayes_medical_test()
```

Even with a 95%-sensitive test, the posterior is only about 16%: because the disease is rare, false positives from the healthy 99% outnumber true positives from the sick 1%.

Key Takeaways
- Law of total probability: Sum over partition cases
- Bayes' theorem: Update beliefs with evidence
- Applications: Diagnostics, spam filtering, ML
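"Updating beliefs with evidence" works repeatedly: each posterior becomes the prior for the next observation. A minimal sketch reusing the medical-test numbers above (treating a second test, assumed conditionally independent of the first given disease status, as new evidence):

```python
def update(prior, p_pos_given_d=0.95, p_pos_given_no_d=0.05):
    """One Bayesian update on a positive test result."""
    # Total probability of a positive result under the current prior
    p_pos = p_pos_given_d * prior + p_pos_given_no_d * (1 - prior)
    # Bayes' theorem: posterior probability of disease
    return p_pos_given_d * prior / p_pos

posterior_1 = update(0.01)          # after one positive test, ~16.1%
posterior_2 = update(posterior_1)   # posterior becomes the new prior, ~78.5%
print(f"After one positive test:  {posterior_1:.2%}")
print(f"After two positive tests: {posterior_2:.2%}")
```

Two independent positive results push the probability from about 16% to about 78%, which is why diagnostics often confirm a positive screen with a second test.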
Next: Independence of events!