PSYCHOLOGY OF DECISION MAKING II.
Nov 12, 2010
Agenda
Ultimatum game
Social dilemmas and moral aspects of group decision making
Game theory and group decision making
Principles of negotiating
Ultimatum game
Draw numbers from the pile. There is a letter and a number on each piece of paper. Those who drew letter A will receive CZK 10 and will divide it between themselves and one person from group B. Write your proposal on the piece of paper.
Now draw again, except this time, if the person from group B does not accept the amount, neither of you will receive any money. Write your proposal on the piece of paper.
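The two classroom variants can be sketched as payoff functions (a minimal illustration; the acceptance threshold for player B is a hypothetical parameter, amounts in CZK as in the exercise):

```python
POT = 10  # CZK 10 to divide, as in the classroom exercise

def dictator_payoffs(offer_to_b):
    """Variant 1: B cannot reject, so any split stands."""
    return POT - offer_to_b, offer_to_b

def ultimatum_payoffs(offer_to_b, b_threshold):
    """Variant 2: if the offer is below B's (hypothetical) minimum
    acceptable amount, B rejects and both players get nothing."""
    if offer_to_b >= b_threshold:
        return POT - offer_to_b, offer_to_b
    return 0, 0

print(dictator_payoffs(2))      # (8, 2): the split stands regardless
print(ultimatum_payoffs(2, 3))  # (0, 0): B rejects a low offer
print(ultimatum_payoffs(5, 3))  # (5, 5): an even split is accepted
```

The sketch makes the strategic difference explicit: in the first variant a low offer costs the proposer nothing, while in the second a rejected low offer wipes out both payoffs.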
Social dilemmas
Social dilemma = one is better off doing A (defection) but all are better off doing B (cooperation). Examples of defection: health insurance abuse, not paying the TV licence fee, etc.
Defection is common, as shown by social psychology research, especially due to envy but also through a mere failure to think of others' needs, especially needs far away in time or space.
Normative and prescriptive theories of social dilemmas
Cooperative theory - you should always cooperate, regardless of how many others did not. However, this often does not work: we end up with no overall good, only hurting ourselves.
Utilitarian theory - cooperate so as to keep around 50% of cooperators. Demanding to apply, since we do not know the number of contributors.
Self-interest theory - always do what is best for yourself and ignore everyone else; look after your own interest and everyone will thereby benefit.
Weighted utilitarianism - weight self-interest more than others' interests, so that you act for others at a constant rate of self-sacrifice.
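The tension these theories address can be made concrete with a linear public-goods game (an illustrative sketch; the endowment and multiplier values are assumptions, not from the lecture): whatever the others do, an individual earns more by defecting, yet universal cooperation beats universal defection.

```python
def payoff(my_contrib, others_contribs, endowment=10, multiplier=1.6):
    """Each player keeps what they do not contribute; contributions are
    multiplied and shared equally among all players (assumed values)."""
    pot = (my_contrib + sum(others_contribs)) * multiplier
    share = pot / (1 + len(others_contribs))
    return endowment - my_contrib + share

others = [10, 10, 10]  # three cooperators contributing everything
print(payoff(10, others))    # 16.0 if I cooperate too
print(payoff(0, others))     # 22.0 if I defect -> defection pays individually
print(payoff(0, [0, 0, 0]))  # 10.0 if everyone defects -> all worse off than 16.0
```

This is exactly the structure in the definition above: defecting dominates individually, but all-cooperate beats all-defect.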
Justice
Joseph Henrich compared ultimatum game outcomes between American students at UCLA and the indigenous Machiguenga of the Peruvian Amazon.
Unlike the US students, the Machiguenga accepted almost any small amount offered, which in purely logical terms is more optimal (higher utility).
The Machiguenga described their behavior as common sense.
Where does our sense of justice come from?
Motives in social dilemmas
Altruism - based on empathy
Competition - prevents cooperation
Fairness, equality and envy
Fear and greed - defect because others might, or to get ahead
Do as others do - we tend to cooperate when we believe others will cooperate, and defect when we believe they will defect
Trust - a relationship based on trust reinforces cooperation
Voter’s illusion
Voter's illusion = self-interest as a motive to vote is an illusion; some vote from moral obligation, others mistakenly believe they affect others' choices (one's vote is diagnostic - the outcome reflects one's own voting patterns - but it does not cause others to vote the same way).
Shafir and Tversky (1992) showed in the prisoner dilemma that over 80% of people defected when they knew what the other person had done (regardless of whether s/he had defected or cooperated), but only 32% defected when they were uncertain about the other person.
Bornstein and Ben-Yossef (1994) showed that we tend to sacrifice our own profits for our group at the expense of another group - parochialism.
Intuitions, heuristics and naive theories
It is unclear where they come from, and they are difficult to change. Four principles seem to be at work:
Equality - even distribution
Contribution (equity) - in proportion to each party's input to an overall enterprise
Maximization (efficiency) - applies when options differ in total amount; the ultimatum game presents a choice between equality and maximization
Compensation - compensating misfortunes; intuition suggests compensation should be greater when the misfortune could easily have been avoided, or was caused by humans as opposed to nature
Biases
Punishment is often used not as a deterrent, as shown by research (Baron & Ritov, 1993), and seems more like a heuristic of retribution.
Kahneman, Knetsch and Thaler (1986) asked students to divide $20 with another anonymous student, either 10/10 or 18/2. Not all students were chosen to receive money, and the other student could not reject the offer. 76% divided the money equally. In the second part of the experiment, students who were not receiving money could divide money between two students (one who had previously divided equally, one unequally): either 5/5/0 or 6/0/6. 74% sacrificed $1 to punish the unequally dividing student.
Punishment is also used to undo harm even when another penalty would benefit others more (Baron & Ritov, 1993) - a "clean up your own mess" heuristic.
Lerner and Simmons (1966) found that students punish - by giving low ratings of social attractiveness - participants who gave electric shocks to subjects learning nonsense syllables, unless they were sure they could put the subjects into a reward-receiving group in the next round. Punishment is due when equity cannot be restored.
Very few people showed the ability to maximize utility in a moral dilemma (distribution of kidneys; Ubel, 1996) - an ex ante equity heuristic for uneven distribution.
Van Avermaet (1974) showed a preference for whichever rule favors us when we are concerned with a distribution: while filling in questionnaires and dividing money with another participant, subjects took any excuse (amount of work or time spent) depending on whether it was in their favor. So we are selfish, but always on fair grounds.
Opponent
Solomon Asch and conformity: one person with an opposing standpoint is the key to avoiding the social conformity phenomenon, or at least to decreasing its impact.
Even the least useful opponent helps: with Vernon Allen's "almost blind" opponent in Asch's task, 64% instead of 97% of participants conformed.
Game theory
von Neumann, J., Morgenstern, O. Theory of Games and Economic Behavior. Princeton University Press, 1944.
Games are used as simulations of real-world decisions in the individual, social, and political sense.
Prisoner dilemma
Two suspects are arrested by the police. The police have insufficient evidence for a conviction, and, having separated the prisoners, visit each of them to offer the same deal.
If one testifies for the prosecution against the other (defects) and the other remains silent (cooperates), the defector goes free and the silent accomplice receives the full 10-year sentence. If both remain silent, both prisoners are sentenced to only 1 year in jail for a minor charge. If each betrays the other, each receives a five-year sentence.
Each prisoner must choose to betray the other or to remain silent. Each one is assured that the other would not know about the betrayal before the end of the investigation. How should the prisoners act?
Prisoner dilemma outcome matrix
(years in prison: Player 1, Player 2)

                        Player 2
                        Cooperate    Defect
Player 1   Cooperate    1, 1         10, 0
           Defect       0, 10        5, 5
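A short check of the payoffs (the prison years from the description above, so lower is better) shows why each prisoner should defect even though mutual defection is worse for both than mutual cooperation:

```python
# (player1_move, player2_move) -> (player1_years, player2_years),
# taken from the scenario: 1 year each if both stay silent, 5 each if
# both betray, 0 for the lone defector and 10 for the lone cooperator.
YEARS = {
    ("cooperate", "cooperate"): (1, 1),
    ("cooperate", "defect"):    (10, 0),
    ("defect",    "cooperate"): (0, 10),
    ("defect",    "defect"):    (5, 5),
}

# Whatever player 2 does, player 1 serves less time by defecting:
for p2 in ("cooperate", "defect"):
    assert YEARS[("defect", p2)][0] < YEARS[("cooperate", p2)][0]

# Yet both defecting (5, 5) is worse for each than both cooperating (1, 1).
print("defection strictly dominates; mutual defection is still worse "
      "for both than mutual cooperation")
```

This dominance argument is what makes the one-shot dilemma a dilemma: individually rational play leads both players to the collectively worse cell.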
Iterated prisoner dilemma
The game is repeated and players remember the previous steps; this is fundamental to certain theories of human cooperation and trust: the evolution of cooperation (Robert Axelrod, 1984).
Nobel Prize winner Robert Aumann (2005, shared with Schelling) showed in his 1959 paper that rational players repeatedly interacting in indefinitely long games can sustain the cooperative outcome (often different from the one-shot equilibrium!).
In game theory, folk theorems are a class of theorems implying that in repeated games any outcome is a feasible solution concept if, under that outcome, the players' minimax conditions are satisfied. The minimax condition states that a player will minimize the maximum possible loss he could face in the game. An outcome is said to be feasible if it satisfies this condition for each player of the game.
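The minimax condition can be computed directly for the prisoner dilemma above (a sketch in years of prison, where each player minimizes the worst case over the opponent's moves). Note that mutual cooperation (1 year each) beats each player's 5-year security level, which is what allows repeated play to sustain it:

```python
# Payoffs in years of prison from the scenario (lower is better).
YEARS = {
    ("cooperate", "cooperate"): (1, 1),
    ("cooperate", "defect"):    (10, 0),
    ("defect",    "cooperate"): (0, 10),
    ("defect",    "defect"):    (5, 5),
}
MOVES = ("cooperate", "defect")

def worst_case(my_move):
    """Longest sentence player 1 can get with this move, over all
    possible moves of the opponent."""
    return max(YEARS[(my_move, other)][0] for other in MOVES)

security_move = min(MOVES, key=worst_case)
print(security_move, worst_case(security_move))  # defect 5

# Mutual cooperation gives each player 1 year, better than the 5-year
# security level -> the outcome is feasible in the folk-theorem sense.
assert YEARS[("cooperate", "cooperate")][0] < worst_case(security_move)
```

The game is symmetric, so the same computation holds for player 2; this is the "minimax condition satisfied for each player" that the folk theorem requires.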
Strategies for the prisoner dilemma
Axelrod: The Evolution of Cooperation (1984)
Aumann: mathematician, Nobel prize laureate (2005) for contributions to the analysis of conflict and cooperation through game theory
Strategies for the iterated prisoner dilemma
By analyzing the top-scoring strategies, Axelrod stated several conditions necessary for a strategy to be successful.
Nice: The most important condition is that the strategy must be "nice", that is, it will not defect before its opponent does (this is sometimes referred to as an "optimistic" algorithm). Almost all of the top-scoring strategies were nice; therefore a purely selfish strategy will not "cheat" on its opponent, for purely utilitarian reasons first.
Retaliating: However, Axelrod contended, the successful strategy must not be a blind optimist. It must sometimes retaliate. An example of a non-retaliating strategy is Always Cooperate. This is a very bad choice, as "nasty" strategies will ruthlessly exploit such players.
Forgiving: Successful strategies must also be forgiving. Though players will retaliate, they will once again fall back to cooperating if the opponent does not continue to defect. This stops long runs of revenge and counter-revenge, maximizing points.
Non-envious: The last quality is being non-envious, that is, not striving to score more than the opponent (impossible for a "nice" strategy, i.e., a "nice" strategy can never score more than the opponent).
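The four properties can be seen in a toy round-robin where tit-for-tat plays unconditional strategies (a minimal sketch; the 3/1/5/0 point values are the conventional iterated-game payoffs, an assumption here, since the lecture's one-shot matrix is stated in prison years):

```python
# (my_move, their_move) -> (my_points, their_points); higher is better.
POINTS = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def tit_for_tat(opponent_history):
    # Nice: cooperate first; retaliating and forgiving: copy the
    # opponent's last move, whatever it was.
    return opponent_history[-1] if opponent_history else "C"

def always_cooperate(opponent_history):
    return "C"  # non-retaliating: exploitable by nasty strategies

def always_defect(opponent_history):
    return "D"  # nasty: defects first

def play(strategy_a, strategy_b, rounds=20):
    history_a, history_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = strategy_a(history_b)  # each sees the opponent's past moves
        move_b = strategy_b(history_a)
        pa, pb = POINTS[(move_a, move_b)]
        score_a += pa
        score_b += pb
        history_a.append(move_a)
        history_b.append(move_b)
    return score_a, score_b

print(play(tit_for_tat, always_cooperate))  # (60, 60): sustained cooperation
print(play(tit_for_tat, always_defect))     # (19, 24): exploited once, then even
```

Note the non-envy in the second result: tit-for-tat scores slightly less than its opponent in that pairing, yet in Axelrod's tournaments such strategies accumulated the highest totals overall.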
Negotiating
Simple when there are two people and a single pie, but usually there are various pies and more people, as in the inheritance example of Dr. Uhlar.
Commonly it is the division of the gains and losses from trade or exchange of some sort that is being negotiated.
Negotiators cannot reach the maximum-utility solution unless they fully trust each other; therefore Pareto-optimal outcomes are attempted with the aid of theoretical concepts, although research often shows we fail to reach even a Pareto-optimal solution.
References
Baron J. Thinking and deciding. 2nd ed. Cambridge, United Kingdom: Cambridge University Press, 1994.
Brafman, O., Brafman, R. Houpačka: proč se chováme iracionálně? [Sway: why do we behave irrationally?] Dokořán, 2009.
Gigerenzer, G. Gut feelings: The intelligence of the unconscious. 2008.
Janis, I.L., Mann, L. Decision-making: A psychological analysis of conflict, choice and commitment. The Free Press, 1977.
von Neumann, J., Morgenstern, O. Theory of Games and Economic Behavior. Princeton University Press, 1944.