
Framing, biases, heuristics, the normative/descriptive gap, and

ecologically rational choice

W. Troy Tucker

Applied Biomathematics, Inc.

Troy@ramas.com

A cognitive illusion

From Pinker 2002

Hunting-gathering lifestyle

• Deep time

– 1st tools: 2.5 mya

– Paleolithic: 1.5 mya

– Anatomically modern: 500 kya

– Fully modern: 40 kya

Axioms of “rational” choice

• Transitivity, ordering – if A>B and B>C, then A>C (checked in the sketch below)
• Monotonicity – more is better
• Invariance, stochastic dominance – expected value determines preference
• Reduction, substitutability – applying the probability calculus doesn’t affect decisions
• Continuity – p can be anything between 0 and 1
• Finiteness – no infinite values
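
To make the transitivity axiom concrete, here is a minimal Python sketch (with hypothetical preferences, not taken from the talk) that checks whether a set of pairwise preferences satisfies it:

    # Minimal sketch: does a set of pairwise preferences satisfy transitivity?
    # The preferences below are hypothetical, chosen to show a violation.
    from itertools import permutations

    prefers = {("A", "B"), ("B", "C"), ("C", "A")}  # a preference cycle

    def is_transitive(prefers, items):
        # Transitivity: whenever X > Y and Y > Z, X > Z must also hold.
        return all(
            (x, z) in prefers
            for x, y, z in permutations(items, 3)
            if (x, y) in prefers and (y, z) in prefers
        )

    print(is_transitive(prefers, {"A", "B", "C"}))  # False: the cycle violates the axiom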

Biases and heuristics

• Anchoring and adjustment heuristic
• Availability heuristic
• Representativeness heuristic
• Simulation heuristic
• Base rate fallacy
• Conjunction fallacy
• Framing
• Status quo bias

Availability heuristic

• Ease of recall (bias), vividness (bias)
– Megan’s law
– Violent crime
• Declined for 10 straight years, even as Fox News-style coverage spread, along with 3-strikes laws, mandatory sentences, etc.

Representativeness heuristic

• Regression to the mean (bias)
– Good sticks and bad carrots?
• Peak-end rule (bias)
– Recall the average, not the sum
• Perceived happiness
• Conjunction fallacy (bias)
• Base rate fallacy (bias)

Linda is 31 years old, single, outspoken, and very bright. She majored in philosophy. As a student she was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations.

Which is more likely?

A. Linda is a bank teller.

B. Linda is a bank teller and is active in the feminist movement.

Conjunction fallacy

• Linda problem:

P(A ∧ B) ≤ P(A) and P(A ∧ B) ≤ P(B) – a conjunction can never be more probable than either of its conjuncts

• Another Linda problem:
Linda is 31 years old, single, outspoken, and very bright. She majored in philosophy. As a student she was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations.

Which is more likely?

A. Linda is a bank teller.

B. Linda is active in the feminist movement.
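
The conjunction rule behind the Linda problem is that P(A and B) can never exceed P(A) or P(B). A minimal Python sketch with hypothetical numbers (not from the talk) makes the point:

    # Conjunction rule: P(A and B) <= min(P(A), P(B)) for any events A and B.
    # The probabilities below are hypothetical, for illustration only.
    p_teller = 0.05                  # P(Linda is a bank teller)
    p_feminist_given_teller = 0.30   # P(active feminist | bank teller)

    p_teller_and_feminist = p_teller * p_feminist_given_teller
    print(p_teller, round(p_teller_and_feminist, 3))  # 0.05 0.015
    assert p_teller_and_feminist <= p_teller          # the conjunction is never the more likely option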

Base rate fallacy

• Green cabs:
A cab was involved in a hit-and-run accident. 85% of cabs are green, and 15% of cabs are blue. A witness says the cab was blue. In a reliability test, the witness correctly identifies the cab color 80% of the time and is wrong 20% of the time. What is the probability that the cab involved in the accident was blue, as the witness stated?

Base rate fallacy

In 100 accidents: 85 cabs are green, 15 are blue.
The witness sees 17 of the 85 green cabs as blue (20% error).
The witness sees 12 of the 15 blue cabs as blue (80% correct).
So 17 of the 29 cabs reported as blue are really green.
That is roughly a 60% chance the witness was wrong, simply because there are so many green cabs.
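
The same frequency reasoning, written as Bayes' rule in a short Python sketch (the 85/15 split and 80% witness accuracy are the numbers given in the problem):

    # Green/blue cab problem via Bayes' rule.
    p_blue, p_green = 0.15, 0.85
    p_says_blue_given_blue = 0.80    # witness correct
    p_says_blue_given_green = 0.20   # witness wrong

    p_says_blue = p_blue * p_says_blue_given_blue + p_green * p_says_blue_given_green
    p_blue_given_says_blue = p_blue * p_says_blue_given_blue / p_says_blue
    print(round(p_blue_given_says_blue, 2))  # 0.41 -- so roughly a 60% chance the witness was wrong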

• HIV:
About 0.01 percent of men with no risk behavior are infected with HIV (base rate). If such a man has the virus, there is a 99.9 percent chance that the test result will be positive (sensitivity). If a man is not infected, there is a 99.99 percent chance that the test result will be negative (i.e., a false positive rate of 0.01 percent). What is the chance that a man who tests positive has the disease?

Base rate fallacy

Imagine 10,000 men who are not in any known risk category (blood donors). One is infected (base rate) and will test positive with practical certainty (sensitivity). Of the 9,999 donors not infected, another one will also test positive (false positive rate). Thus, two will test positive, one of whom actually has the disease.
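
The same natural-frequency argument as a Bayes calculation, using the rates stated in the problem (0.01% base rate, 99.9% sensitivity, 99.99% specificity):

    # HIV screening: probability of infection given a positive test.
    base_rate = 0.0001      # 0.01% of low-risk men are infected
    sensitivity = 0.999     # P(positive | infected)
    specificity = 0.9999    # P(negative | not infected), i.e. a 0.01% false positive rate

    p_positive = base_rate * sensitivity + (1 - base_rate) * (1 - specificity)
    p_infected_given_positive = base_rate * sensitivity / p_positive
    print(round(p_infected_given_positive, 2))  # 0.5 -- about 1 in 2, matching the frequency version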

Anchoring/Adjustment heuristic(Tversky and Kahneman)

• Anchoring effect (bias)
– Excimer laser engineer salary?

• $99,790 vs. $68,860

• Effect size of anchor is 45%

• 13 trials over 10 yrs.

– Last 4 digits of SSN and the number of doctors in New York
• r = 0.4

Framing (1)

• Asian disease problem

Imagine that the U.S. is preparing for the outbreak of an unusual Asian disease, which is expected to kill 600 people. Two alternative programs to combat the disease have been proposed. Assume that the exact scientific estimates of the consequences of the programs are as follows:

Program A: "200 people will be saved"

Program B: "there is a one-third probability that 600 people will be saved, and a two-thirds probability that no people will be saved"

From Tversky and Kahneman 1981

Framing (2)

• Asian disease problem

Imagine that the U.S. is preparing for the outbreak of an unusual Asian disease, which is expected to kill 600 people. Two alternative programs to combat the disease have been proposed. Assume that the exact scientific estimates of the consequences of the programs are as follows:

Program C: "400 people will die"

Program D: "there is a one-third probability that nobody will die, and a two-thirds probability that 600 people will die"

From Tversky and Kahneman 1981

Framing (3)

• Asian disease problem

Program A: "200 people will be saved"

Program B: "there is a one-third probability that 600 people will be saved, and a two-thirds probability that no people will be saved"

Program C: "400 people will die"

Program D: "there is a one-third probability that nobody will die, and a two-thirds probability that 600 people will die"

From Tversky and Kahneman 1981
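
A quick arithmetic check that the two frames describe the same options: in expected lives saved (out of the 600 at risk), A equals C and B equals D; only the wording differs.

    # Expected number of lives saved under each program (600 at risk).
    at_risk = 600
    ev_a = 200                  # "200 will be saved"
    ev_b = at_risk / 3          # one-third chance that all 600 are saved
    ev_c = at_risk - 400        # "400 will die" = 200 saved
    ev_d = at_risk / 3          # "1/3 chance nobody dies" = 1/3 chance all 600 saved
    print(ev_a, ev_b, ev_c, ev_d)  # 200 200.0 200 200.0 -- A==C and B==D, yet choices flip with the frame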

Framing (4)

• Framing effects are widely observed
– Wason selection task – details of the story

– Bayesian reasoning – existence of % or decimals

– Positive and negative affect – prospect theory

– Endowment – discount vs. surcharge – prospect theory

• Some theories I like
– Cosmides - social contract theory

– Gigerenzer - natural frequencies

– Wang - group size signals kith and kin

Framing (4)

• Framing effects are widely observed
– Wason selection task

– Bayesian reasoning

– Positive and negative affect

– Endowment – discount vs. surcharge

• Some theories I like
– Cosmides - social contract theory

– Gigerenzer - natural frequencies

– Wang - group size signals kith and kin

Bayesian reasoning (1)

12-18% correct Bayesian reasoning

If a test to detect a disease whose prevalence is 1/1000 has a false positive rate of 5%, what is the chance that a person found to have a positive result actually has the disease, assuming that you know nothing about the person’s symptoms or signs? ___%

Casscells et al. 1978 replicated in Cosmides and Tooby 1996

Bayesian reasoning (2)

If a test to detect a disease whose prevalence is 1/1000 has a false positive rate of 50/1000, what is the chance that a person found to have a positive result actually has the disease, assuming that you know nothing about the person’s symptoms or signs? ___ out of ___.

Casscells et al. 1978 replicated in Cosmides and Tooby 1996

76-92% correct Bayesian reasoning

Answer: 1 out of 51.
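
The arithmetic behind "1 out of 51", assuming (as the problem leaves implicit) that the test catches every true case:

    # Casscells et al. problem in natural frequencies, per 1000 people tested.
    # Assumes perfect sensitivity, which the problem statement leaves implicit.
    sick = 1                          # prevalence 1/1000
    false_positives = 0.05 * 999      # 5% of the 999 healthy people, i.e. about 50
    p_disease_given_positive = sick / (sick + false_positives)
    print(round(p_disease_given_positive, 3))  # ~0.02 -- about 1 out of 51 positives is a true case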

Framing (4)

• Framing effects are widely observed
– Wason selection task

– Bayesian reasoning

– Positive and negative affect

– Endowment – discount vs. surcharge

• Some theories I like
– Cosmides - social contract theory

– Gigerenzer - natural frequencies

– Wang - group size signals kith and kin

• Asian disease problem

Program A: "200 people will be saved"

Program B: "there is a one-third probability that 600 people will be saved, and a two-thirds probability that no people will be saved"

Program C: "400 people will die"

Program D: "there is a one-third probability that nobody will die, and a two-thirds probability that 600 people will die"

From Tversky and Kahneman 1981

Kith and Kin (Wang 2007)

Framing (4)

• Framing effects are widely observed
– Wason selection task – details of the story

– Bayesian reasoning – existence of % or decimals

– Positive and negative affect – prospect theory

– Endowment – discount vs. surcharge – prospect theory

• Some theories I like
– Cosmides - social contract theory

– Gigerenzer - natural frequencies

– Wang - group size signals kith and kin

Prospect theory

Value function plot over gains and losses, illustrating loss aversion
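
The value function sketched on the slide can be written down explicitly. A minimal Python sketch using the standard Tversky and Kahneman (1992) parameterization; the parameter values below are the commonly cited estimates, not numbers from this talk:

    # Prospect-theory value function: concave for gains, convex for losses,
    # and steeper for losses than for gains (loss aversion).
    def value(x, alpha=0.88, beta=0.88, lam=2.25):
        # alpha/beta control curvature; lam is the loss-aversion coefficient.
        return x ** alpha if x >= 0 else -lam * ((-x) ** beta)

    for outcome in (-100, -50, 50, 100):
        print(outcome, round(value(outcome), 1))
    # A loss of 100 is felt roughly 2.25 times as strongly as a gain of 100.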

Emotion and Reason

• Risk premium

• Ambiguity premium

• Brain processes them differently

• When ambiguous, assume the worst case
– Emotion and reason

A. Rustichini, Science 310, 1624 (2005)

Ellsberg paradox

Glimcher and Rustichini, Science 306, 447 (2004)

Anticipatory skin conductance

Bechara et al., Science 275, 1293 (1997)

Ultimatum game

• Selfishness axiom
– Utility maximization

• Homo economicus

– Definition of “rationality”
– What are the “rational” optimum strategies? (see the sketch below)

• Proposer: always offer the smallest possible amount

• Responder: always accept
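
For concreteness, a minimal sketch of the "rational" (subgame-perfect) prediction for purely self-interested players; the $10 stake and $1 increments are hypothetical, for illustration only:

    # Ultimatum game with purely selfish players (hypothetical $10 stake, $1 steps).
    stake, step = 10, 1

    def responder_accepts(offer):
        # A selfish responder prefers any positive amount to nothing.
        return offer > 0

    # A selfish proposer offers the smallest amount the responder will accept.
    offers = range(0, stake + 1, step)
    best_offer = min(o for o in offers if responder_accepts(o))
    print(best_offer, stake - best_offer)  # 1 9 -- the prediction that real players routinely reject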

Ultimatum game

• 15 years of research in 25 western societies
– More than 100 studies published
– Homo reciprocans:
• Cares about fairness and reciprocity
• Is willing to change the distribution of material outcomes among others at a personal cost to themselves
• Rewards those who act in a prosocial manner
• Punishes those who act selfishly, even when punishing is costly
• Is this pattern a human universal?
– Designed by natural selection to engage in reciprocal altruism

Ultimatum game

• Evolutionary anthropology
• 11 anthropologists and one economist
• Twelve countries on four continents and New Guinea
• Fifteen small-scale societies
– Three foraging societies
– Six slash-and-burn horticulturalists
– Four nomadic herding groups
– Two sedentary, small-scale agricultural societies
• All games played for real stakes
– Equal to one day's local wages

Ultimatum game

Ultimatum game

• Results
– The selfishness-based model is always wrong
– Cross-cultural variability
• Economic organization explains much of this

– Market integration

– Payoff to cooperation

– Individual-level variables do not explain game behavior

– Experimental play reflects common interactional patterns

Ultimatum game

Neuroscience of risk perception

The brain has many domain-specific calculators
(Marr 1982; Barkow et al. 1992; Pinker 1997, 2002)

Information format triggers specific calculators
(e.g. Cosmides & Tooby 1996; Gigerenzer 1991)

Different calculators give contrasting solutions, or calculate different components of total risk
(e.g. Glimcher & Rustichini 2004 and references therein)

List of mental calculators(after Pinker 2002)

• Language (grammar and memorized dictionary)

• Practical physics (pre-Newtonian)

• Intuitive biology (animate differs from inanimate)

• Intuitive engineering (tools designed for a purpose)

• Intuitive psychology (theory of mind, autism, deception)

• Spatial sense (dead reckoner and mental maps)

• Number sense (1, 2, 3, many)

• Probability sense (frequentist Bayes)

• Intuitive economics (reciprocity, trust, equity, fairness)

• Mental database and logic (assertions linked with logical and causal operators)

People are bad risk calculators

… or often said to be bad when:
1. Presented with percentages, large numbers, or single-event probabilities
2. Experts tell them the risk
3. Presented with incertitude (versus variability)
4. Risk is seen to be imposed
5. Risk is out of personal control
6. Rare events are observed (representativeness)
7. Children are at risk
8. etc.


When risk is imposed

…people perceive more risk, even when the risk is smaller than voluntary risks.

Multiple mental risk calculators perceive risk.
Some perceive risk of disease, death, or economic cost.
Some perceive risk of social contract violation (e.g. Cosmides 1989, Guth 1995, Henrich et al. 2005).

Bilateral anterior insula: disgust (e.g. Sanfey et al. 2003)

The problem of “altruism”

• Kin-directed altruism (Hamilton 1964; Haldane)

– Explains social insects, parenting, nepotism

• How can non-relatives cooperate?

Cooperation with non-relatives

Reciprocal altruism (Trivers 1971; Axelrod & Hamilton 1981)

Two problems: how cooperation evolves and how it is maintained
• How it evolves is still not well understood

Prisoner’s dilemma: the tragedy of the commons

• Maintaining cooperation requires at least five emotions
– Friendship, moralistic aggression, forgiveness, guilt/shame, sympathy/gratitude

Risk of being cheated

• Wason selection task and logic (card logic sketched below)
– Evidence of a cheater-detection module: patterned violation of logical deduction
• (Cosmides & Tooby; Gigerenzer & Hug)
– Cheaters are looked at longer and remembered better
• (Chiappe, Brown, Dow, Koontz, Rodriguez, & McCulloch 2004; Mealey, Daood, & Krage 1996; Oda 1997)
– Neuropsychology: bilateral limbic-system damage to the temporal pole and amygdala impairs detection
• (Stone, Cosmides, Tooby, Kroll, & Knight 2002)
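
For reference, the logic being tested: for a rule "if P then Q", only the P card and the not-Q card can reveal a violation. A minimal Python sketch with the classic letter/number version of the task (the specific cards are a hypothetical example):

    # Wason selection task, rule: "if a card shows a vowel, its other side shows an even number".
    # Only the P card (vowel) and the not-Q card (odd number) can falsify the rule.
    cards = ["E", "K", "4", "7"]

    def must_turn(card):
        shows_p = card in "AEIOU"                              # vowel: hidden side might be odd
        shows_not_q = card.isdigit() and int(card) % 2 == 1    # odd number: hidden side might be a vowel
        return shows_p or shows_not_q

    print([c for c in cards if must_turn(c)])  # ['E', '7'] -- the combination most subjects miss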

Strong Reciprocity

• Ernst Fehr and Simon Gächter
– The team earns money when all cooperate
– Punishers (moralistic aggression)
• Spend money to ensure freeloaders don’t prosper
• Note – this is “irrational”.
– People do pursue their own self-interest
• But the definition of “self-interest” includes fairness, equity, justice, prudence, generosity, etc.

Strong reciprocity (2)

• The human emotional constitution embraces prosocial and altruistic notions of in-group and out-group identification, and reciprocity
– A direct result of evolutionary history
• (Gintis 2005; Bowles and Gintis 2003)
– Moral principles are “evolved facts in the world”
• Evolved and transformed according to natural laws

Ecologically rational choice

• The normative/descriptive gap
– Panglossians vs. meliorists
• Stanovich and West 2000
– Evolutionary social sciences are panglossian
• Expect that people are generally well designed.
• Deviation from a normative model signals a deficiency of the model or an ecological mismatch.
• Two categories of error: “pardonable errors by subjects and unpardonable ones by psychologists” – Kahneman 1981.
• The real trick is to disentangle the two.
– Novel environmental factors

END

Anthropology of risk

6 Risks:

– Accidents
– Subsistence failure
– Disease
– Inter-group competition (war)
– Cooperation failure (free riders)
– Paternity

Pueblo Bonito in Chaco Canyon, NM

Economic Decisions and Foraging Theory

• Harvesting resources from the environment is a problem all organisms must solve
• Natural selection should lead to efficient harvesting and use of scarce resources
• Two early models
– Patch choice
– Prey choice

Prey choice

• Fine-grained environments
– Composed of randomly distributed prey
– Need to know which prey types to pursue
• Decision: pursue or not upon encounter
• Currency: long-term net rate of energy capture
• Constraints:
– Search and handling are mutually exclusive
– Encounters are sequential, random, and in proportion to abundance
– No short-term effect of foraging on prey abundance
– The forager has complete information

Construction of foraging models

• Decisions - what we want to explain
– How long to stay in a patch, what subset of foods to pursue
• Currencies - what we assume to correlate with fitness
– This is what is maximized or minimized
– Time, net calories, calories per hour, protein per hour
• Constraints - what limits the forager
– Cognitive capability (rules of thumb)
– Physiological limitations - pigs can't fly

The Prey-choice model

ei = average net energy from prey type i upon encounter

hi = average handling time per encounter for prey type i

λi = abundance of prey type i

pi = probability of pursuing prey type i upon encounter
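
These ingredients yield the standard diet-breadth solution: rank prey types by profitability ei/hi and add them to the diet as long as the next type's profitability exceeds the overall return rate of the diet so far, R = Σ λi ei / (1 + Σ λi hi). A minimal Python sketch with hypothetical prey values (not the Ache data):

    # Diet-breadth (prey-choice) model: which prey types should always be pursued?
    # e = net energy per encounter (kcal), h = handling time (hr), lam = encounters per hour of search.
    # The prey values below are hypothetical, for illustration only.
    prey = [("peccary", 15000, 2.0, 0.02),
            ("armadillo", 3000, 0.5, 0.10),
            ("palm fiber", 200, 1.0, 0.50)]

    prey.sort(key=lambda p: p[1] / p[2], reverse=True)   # rank by profitability e/h
    diet = []
    for name, e, h, lam in prey:
        rate = (sum(l * en for _, en, _, l in diet) /
                (1 + sum(l * hn for _, _, hn, l in diet)))   # current long-term return rate
        if e / h > rate:
            diet.append((name, e, h, lam))    # pursue on encounter (p_i = 1); otherwise ignore (p_i = 0)
    print([name for name, *_ in diet])        # ['peccary', 'armadillo'] -- low-ranked prey drop out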

Prey choice model test

• The Ache of Paraguay
– First contacted around 1978
– About 200 individuals
• 15-50 people in a foraging group

Data

• Data gathered about Ache foraging

From Hawkes et al. 1982

Result

From Hawkes et al. 1982

Time discounting

• Equity premium puzzle
– A 50:50 gamble of $50k or $100k valued at only $51,209?!?
– Glimcher and Kable - hyperbolic discount function (sketched below)
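
A minimal sketch of the hyperbolic discount function associated with Kable and Glimcher's work, V = A / (1 + kD), compared with exponential discounting; the amounts, delays, and rates below are hypothetical:

    # Hyperbolic vs. exponential discounting of a delayed reward (hypothetical numbers).
    amount = 100.0
    k = 0.05   # hyperbolic discount rate per day
    r = 0.05   # exponential discount rate per day

    for delay_days in (0, 10, 30, 100):
        hyperbolic = amount / (1 + k * delay_days)
        exponential = amount * (1 + r) ** -delay_days
        print(delay_days, round(hyperbolic, 1), round(exponential, 1))
    # Hyperbolic curves drop steeply at short delays but flatten at long ones,
    # producing the preference reversals that exponential discounting rules out.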