Bayes' Theorem
For programmers
Joke
● What's the difference between a mathematician, an engineer and a programmer?
Punchline
● Mathematicians use natural log (base e)
● Engineers use decibels (10 times log base 10)
● Programmers use bits (log base 2)
Useful functions
● odds(p) = p/(1-p)
– Gambler talk: probability 1/3 → odds of “1-to-2”
● logit(p) = log(odds(p))
– Remember: logs are base 2, so the unit is bits
● expit(b) = 2^b/(1+2^b)
– Inverse of logit
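A minimal sketch of these helpers in Python, keeping the slides' base-2 convention throughout (note that scipy's logit/expit use base e, so these are hand-rolled):

    import math

    def odds(p):
        # Probability to odds: 1/3 -> 0.5, i.e. "1-to-2".
        return p / (1 - p)

    def logit(p):
        # Log-odds in base 2; the result is in bits.
        return math.log2(odds(p))

    def expit(b):
        # Inverse of logit: bits of belief back to a probability.
        return 2**b / (1 + 2**b)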
What is belief?
● Belief(X) = logit(probability you assign to X)
– Measured in bits
● Fun fact: Belief(not X) = -Belief(X)
– Because odds(1-p) = 1/odds(p), and the log of a reciprocal flips sign
Examples
● Belief(X)=0: probability 0.5, zero knowledge
● Belief(X)=1: probability is 2/3
● Belief(X)=-1: probability is 1/3
● Belief(X)=5: probability is about 0.97
● Belief(X)=10: “I’m 99.9% certain about this!”
● Belief(X)=-10: “There’s a 0.001 chance of that!”
● Belief(X)=infinity: probability 1, or “The religious belief”…
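These numbers are quick to check with the expit helper sketched above:

    for b in (0, 1, -1, 5, 10, -10):
        print(b, round(expit(b), 4))
    # 0 0.5, 1 0.6667, -1 0.3333, 5 0.9697, 10 0.999, -10 0.001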
Accuracy of belief
● Overconfidence: for some B>0, much more than a fraction 1-expit(B) of your beliefs of strength >B turn out wrong
● Underconfidence: for some B>0, much less than a fraction 1-expit(B) of your beliefs of strength between 0 and B turn out wrong
● Well-calibrated: Neither overconfident nor underconfident
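One way to test this on yourself (a hypothetical check, not from the slides): record beliefs and outcomes, then compare the hit rate above some strength cutoff with what expit predicts.

    def hit_rate_above(records, b_cutoff):
        # records: (belief_in_bits, turned_out_true) pairs, with beliefs
        # phrased so that belief_in_bits > 0 (flip X to not-X if needed).
        strong = [ok for b, ok in records if b > b_cutoff]
        return sum(strong) / len(strong)

    # Well-calibrated: for any B > 0, hit_rate_above(records, B) should be
    # at least expit(B). Much lower means overconfident.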
Evidence
● Event E happened. Is X true?
● E is helpful only when P(E given X) != P(E given not X). But how much?
● Likelihood(E given X) = P(E given X)/P(E given not X)
● Evidence(E about X) = log(Likelihood(E given X))
● Evidence is measured in bits!
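Continuing the Python sketch, with invented numbers for illustration: a test that fires 90% of the time when X holds and 10% of the time when it doesn't carries log2(9), about 3.17 bits of evidence.

    def evidence(p_e_given_x, p_e_given_not_x):
        # Bits of evidence that observing E carries about X.
        return math.log2(p_e_given_x / p_e_given_not_x)

    print(evidence(0.9, 0.1))  # ~3.17 bits in favor of X
    print(evidence(0.5, 0.5))  # 0.0 bits: this E tells you nothing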
THE FORMULA
Belief(X after seeing E) = Belief(X)+Evidence(E about X)
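Putting the pieces together on a made-up scenario: start at Belief(X) = -1 (probability 1/3), observe the 3.17-bit event above, and land near Belief(X) = 2.17 (probability about 0.82).

    prior = logit(1/3)                    # -1.0 bit
    posterior = prior + evidence(0.9, 0.1)
    print(posterior)                      # ~2.17 bits
    print(expit(posterior))               # ~0.818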
Bayes' Theorem
● “If you are well-calibrated, and update beliefs according to THE FORMULA, you remain well-calibrated”
● Corollary: if you sometimes count evidence twice, or sometimes count it only partially, you FALL OUT OF CALIBRATION!
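Why THE FORMULA really is Bayes' Theorem (a derivation sketch, not on the original slides): write Bayes' rule for X and for not X, divide so that P(E) cancels, and take log base 2.

    P(X after E) / P(not X after E)
        = [P(X) / P(not X)] * [P(E given X) / P(E given not X)]
    odds(X after E) = odds(X) * Likelihood(E given X)
    Belief(X after E) = Belief(X) + Evidence(E about X)    (log2 of both sides)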
Remember, Kids!
Bayes’ Theorem is math, not a suggestion.
If you care about being right, you can’t afford to ignore it!