
Introduction to Bayesian Statistics

Christopher David Desjardins
University of Minnesota, Department of Educational Psychology
27 April, 2009


Bayesian Introduction, AKA Why are you doing this to us?

- We are in a paradigm shift.
- Bayesian methods have been exploding in popularity over the last 15 years.
- It is important to have a conceptual understanding of another theory.
- QME does not teach Bayesian statistics ... huh?
- I am interested in Bayesian statistics!


Small Group Discussion Questions

Thinking about your readings by Gelman and your own research:

1. What kind of statistical analyses will you do in your own research?
2. Will you use theory and prior information to guide your research? How?
3. Will you use theory and prior information to guide your statistical analyses? Why or why not?
4. Do you believe that statistical analyses are objective?


Manuscript Question (Carlin & Louis, 2008)

- Suppose you submit your first manuscript to a journal.
- The journal has a 25% acceptance rate for manuscripts like yours.
- And your manuscript is accepted! Congratulations!
- Now what is your assessment of the probability of your next submission to that journal, on a related topic, being accepted?
- 100%? If not 100%, why not? (A small worked sketch follows this slide.)
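One way to formalize the intuition, as a minimal sketch that is not part of the original slides: treat the journal's 25% acceptance rate as a Beta prior on your own acceptance probability and update it with the single observed acceptance. The Beta(1, 3) prior below is an assumed choice whose mean is 0.25; any prior with that mean would tell a similar story.

```python
# Hypothetical illustration (not from the slides): Beta-binomial updating of an
# acceptance probability. Beta(1, 3) has mean 1 / (1 + 3) = 0.25, the journal's base rate.
prior_a, prior_b = 1.0, 3.0        # assumed prior pseudo-counts
accepted, rejected = 1, 0          # the one manuscript you just had accepted

post_a, post_b = prior_a + accepted, prior_b + rejected
print(f"Prior mean:     {prior_a / (prior_a + prior_b):.2f}")   # 0.25
print(f"Posterior mean: {post_a / (post_a + post_b):.2f}")      # 0.40
```

Your assessment moves up from 0.25 toward, but nowhere near, 100%: one acceptance is encouraging evidence about your work, not a guarantee about the next submission.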


Goats and Cars

- Now a problem for you!
- Suppose you're on a game show, and you're given the choice of three doors: behind one door is a car; behind the others, goats. You pick a door, say No. 1, and the host, who knows what's behind the doors, opens another door, say No. 3, which has a goat. He then says to you, "Do you want to pick door No. 2?" Is it to your advantage to switch your choice?


Trends in Statistics

- And the answer is ...
- You swap, but why?
- The Monty Hall Problem Answer
- What was the important piece there? What would have made swapping irrelevant?
- If the host didn't know!
- This is important information. (A quick simulation below makes it concrete.)
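The following simulation is my addition, not part of the original slides; it is a minimal sketch using only Python's standard library. It plays the game many times under two rules: the classic version, where the host knowingly reveals a goat, and a variant where the host opens a random unchosen door and games that accidentally reveal the car are thrown out.

```python
import random

def switch_win_rate(n_trials=100_000, host_knows=True):
    """Estimate how often the 'always switch' strategy wins the car."""
    wins = games = 0
    for _ in range(n_trials):
        car, pick = random.randrange(3), random.randrange(3)
        if host_knows:
            # Host deliberately opens a goat door that is not the player's pick.
            opened = random.choice([d for d in range(3) if d not in (pick, car)])
        else:
            # Host opens a random unchosen door; discard games where the car is revealed.
            opened = random.choice([d for d in range(3) if d != pick])
            if opened == car:
                continue
        games += 1
        switched = next(d for d in range(3) if d not in (pick, opened))
        wins += (switched == car)
    return wins / games

print(f"Host knows (classic game): {switch_win_rate(host_knows=True):.3f}")   # about 0.667
print(f"Host opens a random door:  {switch_win_rate(host_knows=False):.3f}")  # about 0.500
```

When the host's choice carries information (he will never show the car), switching wins about two thirds of the time; when he is just as ignorant as you, the reveal carries no information and switching is irrelevant.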


Frequentist vs. Bayesian

Frequentist

- Traditional statistical methods
- Rooted in Neyman, Pearson, and Fisher
- Probability statements are about long-run frequency
- p-values, 95% confidence intervals, H0 and Ha testing
- Asks: what is the chance of getting these data given the parameters?
- Parameters are fixed


Bayesian

- The new old kid on the street
- Rooted in Bayes' Theorem (Bayes, 1763)
- Kept alive by Birnbaum (1962), De Finetti (1972), Good (1950), Lindley (1965), and Savage (1954)
- Probability statements are conditional on the data and prior beliefs
- Bayesian p-values, 95% credible intervals, Bayes factors
- Asks: what is the chance of these parameter values given the data?
- Parameters are random

[Figure slides, images not preserved in this transcript: "Frequentist", "Bayesian", and "Why Are They Bayesian?"]

Bayesian

Bayes' Theorem

P(A|B) = P(B|A) P(A) / P(B)

In English:

Posterior = (Likelihood × Prior) / Marginal

Simpler still ...

Posterior ∝ Likelihood × Prior

- The posterior gives the probability of your parameter values given the data.
- All inferences are based on the posterior.
- Priors let you specify your beliefs, and they can be continually updated.
- Bayesian inference is cohesive and coherent. (A small numerical sketch follows this slide.)
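To make "posterior ∝ likelihood × prior" concrete, here is a minimal grid-approximation sketch, added for illustration and not part of the original slides: a coin's heads probability gets a Beta(2, 2)-shaped prior (an assumed, mildly informative choice), the likelihood comes from observing 7 heads in 10 flips, and dividing the product by its sum plays the role of the marginal.

```python
import math

# Grid of candidate values for theta, the probability of heads.
theta = [i / 100 for i in range(1, 100)]

# Prior: proportional to a Beta(2, 2) density (assumed for illustration).
prior = [t * (1 - t) for t in theta]

# Likelihood: binomial probability of 7 heads in 10 flips at each theta.
heads, flips = 7, 10
likelihood = [math.comb(flips, heads) * t**heads * (1 - t)**(flips - heads) for t in theta]

# Posterior ∝ likelihood × prior; dividing by the total (the marginal) normalizes it.
unnormalized = [lik * pri for lik, pri in zip(likelihood, prior)]
marginal = sum(unnormalized)
posterior = [u / marginal for u in unnormalized]

posterior_mean = sum(t * p for t, p in zip(theta, posterior))
print(f"Posterior mean: {posterior_mean:.3f}")   # about 0.64, between the prior mean (0.5) and the data (0.7)
```

Every summary (mean, interval, probability of a hypothesis) then comes from this one posterior, which is what the "cohesive and coherent" bullet is pointing at.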


Reasons for Bayesian Inference (Berger, 1985)

- You can use prior knowledge.
- Inferences are based only on the data actually observed.
- The reason for stopping the experiment doesn't affect your inferences.
- Easier to interpret.
- More coherent, since all analyses are based on the posterior.
- Any question can be answered through Bayesian analysis.
- Bayes procedures possess numerous optimality properties.
- You can calculate the actual probability of the null hypothesis being true.
- Ability to test logical hypotheses.
- Ability to fit more complex models.


Why aren't we all Bayesian?

- Statisticians actively resisted it for years.
- Calculating the marginal distribution required serious calculus!
  - Solution: Markov chain Monte Carlo (MCMC) simulation, a fancy, quick way to approximate that nasty calculus (a toy sketch follows this slide).
- Subjectivity of prior information: what if I'm way wrong?
  - Solution: use no priors or weak priors.
- More uncertainty; decision makers like black-and-white solutions.
  - There are guidelines that can be used in model selection.
  - Really, when are we ever 100% certain about a model?
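A toy sketch of what MCMC buys you, my addition rather than anything from the slides: the random-walk Metropolis sampler below draws from a posterior that is known only up to a constant, so the marginal (the "serious calculus") is never computed. The target reuses a coin example with 7 heads in 10 flips and a flat prior, for which the exact posterior mean is 8/12 ≈ 0.667.

```python
import math
import random

def log_post(theta, heads=7, flips=10):
    """Log of likelihood × flat prior, known only up to a constant; no marginal needed."""
    if not 0 < theta < 1:
        return float("-inf")
    return heads * math.log(theta) + (flips - heads) * math.log(1 - theta)

def metropolis(n_samples=50_000, step=0.1, start=0.5):
    """Random-walk Metropolis: propose a nearby value, accept or reject, repeat."""
    samples, current = [], start
    for _ in range(n_samples):
        proposal = current + random.uniform(-step, step)
        # Accept with probability min(1, posterior(proposal) / posterior(current)).
        log_ratio = log_post(proposal) - log_post(current)
        if random.random() < math.exp(min(0.0, log_ratio)):
            current = proposal
        samples.append(current)
    return samples[5_000:]          # discard burn-in draws

draws = metropolis()
print(f"MCMC estimate of the posterior mean: {sum(draws) / len(draws):.3f}")   # near 0.667
```

In practice this job is handed to software, but the accept/reject loop above is the whole trick: only ratios of the unnormalized posterior are ever needed.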


Why we should be Bayesian in the social sciences

- Statistics is not subjective, nor are our analyses.
- We have theories and ideas; why not incorporate them?
- You can update your priors as you get more information.
- Easier interpretation.
- Becoming easier and easier to run, requiring less understanding of the underlying probability theory.
- The way of the future? We don't want to be left behind.
- We're in a paradigm shift . . .


Things to ponder

- Are you sold on Bayesian statistics? Why or why not?
- Is knowledge of Bayesian methods relevant to your studies?
- Where do you place yourself?
  - A frequentist
  - A Bayesian
  - Don't care. It's an esoteric distinction relevant only to pedantic QMErs.
