SAASS Comps Prep Wiki

Robert Axelrod, The Evolution of Cooperation (1984)

Author. Robert Axelrod (1943- )

University of Michigan professor of political science and public policy; previously at the University of California, Berkeley (1968-74)

Complexity theory and international security work for the World Bank, the United Nations, and the Department of Defense

Context.

Cold War arms-control debates and the rise of computer simulation

Scope.

  • make some assumptions about individual motives and then deduce consequences for the behavior of the entire system … to develop a theory of cooperation that can be used to discover what is necessary for cooperation to emerge

Evidence.

Results of two round-robin computer tournaments in which submitted programs played the iterated Prisoner’s Dilemma against one another

Central Proposition.

  • four properties which tend to make a decision rule successful: avoidance of unnecessary conflict by cooperating as long as the other player does, provocability in the face of an uncalled for defection by the other, forgiveness after responding to a provocation, and clarity of behavior so that the other player can adapt to your pattern of action.
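A minimal sketch of how these four properties play out in a round-robin tournament like Axelrod's. This is illustrative Python, not the actual tournament code: the payoffs (T=5, R=3, P=1, S=0) and the 200-move game length match the first tournament, but the rival strategies here (ALL D, ALL C, a grudger, and a "suspicious" tit for tat) are simple stand-ins for the real entries.

  # Minimal iterated Prisoner's Dilemma round-robin (illustrative, not Axelrod's code).
  # Payoffs: T=5 (temptation), R=3 (reward), P=1 (punishment), S=0 (sucker).
  from itertools import combinations_with_replacement

  PAYOFF = {('C', 'C'): (3, 3), ('C', 'D'): (0, 5),
            ('D', 'C'): (5, 0), ('D', 'D'): (1, 1)}

  def tit_for_tat(my_history, their_history):
      # Nice: cooperate first.  Provocable: defect right after a defection.
      # Forgiving: return to cooperation as soon as the other does.
      # Clear: the rule is trivially predictable.
      return 'C' if not their_history else their_history[-1]

  def all_defect(my_history, their_history):
      return 'D'

  def all_cooperate(my_history, their_history):
      return 'C'

  def grudger(my_history, their_history):
      # Cooperate until the other defects once, then defect forever.
      return 'D' if 'D' in their_history else 'C'

  def suspicious_tft(my_history, their_history):
      # Defect on the first move, then mirror the other player.
      return 'D' if not their_history else their_history[-1]

  def play(strat_a, strat_b, rounds=200):
      hist_a, hist_b, score_a, score_b = [], [], 0, 0
      for _ in range(rounds):
          move_a = strat_a(hist_a, hist_b)
          move_b = strat_b(hist_b, hist_a)
          pay_a, pay_b = PAYOFF[(move_a, move_b)]
          score_a += pay_a
          score_b += pay_b
          hist_a.append(move_a)
          hist_b.append(move_b)
      return score_a, score_b

  strategies = {'TIT FOR TAT': tit_for_tat, 'ALL D': all_defect,
                'ALL C': all_cooperate, 'GRUDGER': grudger,
                'SUSPICIOUS TFT': suspicious_tft}
  totals = {name: 0 for name in strategies}
  # Each strategy plays every other strategy and its own twin, as in the tournament.
  for (name_a, a), (name_b, b) in combinations_with_replacement(strategies.items(), 2):
      sa, sb = play(a, b)
      totals[name_a] += sa
      if name_a != name_b:
          totals[name_b] += sb
  for name, total in sorted(totals.items(), key=lambda kv: -kv[1]):
      print(name, total)

In this toy lineup, TIT FOR TAT finishes first even though it never outscores any single opponent: being nice, provocable, forgiving, and clear lets it draw mutual cooperation out of every strategy that can be drawn into it.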

Other Major Propositions.

  • If the discount parameter w is sufficiently high (i.e. the future matters enough), there is no best strategy independent of the strategy used by the other player
  • Tit for Tat: reply in kind (nice, retaliatory, forgiving)
  • Since players (states) meet again, there is a need for norms, order, and some rules; this works to both parties’ benefit
  • Social structure and cooperation depend on labels (how you are viewed and classified), reputation (your record of past responses), regulation (the rules/norms the majority follows), and territoriality (closer interaction with neighbors; relationships weighted by proximity)
  • Provocations can lead to a spiral of defection
  • Echo effect: responses mirror prior acts, so a single defection can reverberate back and forth
  • In real life, more than in a game, ideology, politics, commitments, alliances, and leadership must be taken into account
  • Four simple suggestions are offered for individual choice: do not be envious of the other player’s success; do not be the first to defect; reciprocate (return in kind) both cooperation and defection; and do not be too clever
  • If everyone in a population is cooperating with everyone else … no one can do better using any other strategy, provided that the future casts a large enough shadow onto the present
  • Any strategy which may be the first to cooperate can be collectively stable only when w is sufficiently large (i.e. future interactions matter; a worked threshold follows after this list)
  • For a nice strategy to be collectively stable, it must be provoked by the very first defection of the other player (i.e. retribution is immediate for any infraction)
  • If the other player is certain to defect, there is no point in your ever cooperating.
  • The strategies which can invade ALL D (always defect) in a cluster with the smallest value of p are those which are maximally discriminating (able to tell the difference between strategies)
  • If a nice strategy cannot be invaded by a single individual, it cannot be invaded by any cluster of individuals either.
  • If a rule is collectively stable, it is territorially stable.
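To make "w is sufficiently large" concrete: against TIT FOR TAT, the only tempting deviations are defecting on every move or alternating defection and cooperation, and ruling both out gives the condition Axelrod derives for TIT FOR TAT, w >= max((T-R)/(T-P), (T-R)/(R-S)). A quick check in Python, assuming the standard tournament payoffs:

  # Collective stability of TIT FOR TAT: the shadow of the future (w) must be
  # long enough that neither permanent defection nor alternating D, C, D, C ...
  # pays better than continued mutual cooperation against it.
  # Standard tournament payoffs assumed: T=5, R=3, P=1, S=0.
  T, R, P, S = 5, 3, 1, 0

  w_min = max((T - R) / (T - P),   # rules out ALL D (defect every move)
              (T - R) / (R - S))   # rules out alternating defection and cooperation
  print(f"TIT FOR TAT is collectively stable for w >= {w_min:.3f}")  # 0.667

With these payoffs the threshold is w >= 2/3; below that, exploiting a cooperator now is worth more than the discounted stream of future mutual cooperation.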


- Key concept: the iterated Prisoner’s Dilemma computer tournament, won in both rounds by TIT FOR TAT

- Cooperation Theory – how cooperation can emerge among individuals who pursue self-interest, without a central authority to force cooperation

- Tit for Tat’s success – nice, retaliatory, forgiving, and clear; a strategy is collectively stable if no other strategy can invade it (see the cluster calculation below)
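How a small cluster of cooperators gets started in a world of meanies is also easy to work out. A TIT FOR TAT newcomer whose fraction p of interactions are with fellow newcomers invades ALL D once its expected score beats the natives' score of P/(1-w). A sketch, assuming the standard payoffs and w = 0.9 (the discount value I believe the book uses for this example):

  # How big must a cluster of TIT FOR TAT players be to invade ALL D?
  # Assumed values: standard payoffs T=5, R=3, P=1, S=0 and discount w = 0.9.
  T, R, P, S, w = 5, 3, 1, 0, 0.9

  v_tft_vs_tft   = R / (1 - w)          # mutual cooperation forever
  v_tft_vs_alld  = S + w * P / (1 - w)  # suckered once, then mutual defection
  v_alld_vs_alld = P / (1 - w)          # natives defecting on each other

  # A p-cluster invades when its members' average score beats the natives':
  #   p * v_tft_vs_tft + (1 - p) * v_tft_vs_alld > v_alld_vs_alld
  p_min = (v_alld_vs_alld - v_tft_vs_alld) / (v_tft_vs_tft - v_tft_vs_alld)
  print(f"Cluster invades ALL D once p > {p_min:.3f}")  # about 0.048, i.e. ~5%

With those numbers the threshold is roughly 5 percent: even a tiny cluster of reciprocators can gain a foothold, which is the book's point about cooperation emerging in a hostile environment.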


Notes from Gloves: Axelrod, ch1-2, 4, 8-9

Is cooperation based solely on reciprocity possible? viii / the concept is strategic rather than genetic evolution x / Darwinian survival of the fittest resolves into amicable brotherhood xi / the setup is the same as the international system - an anarchic world of sentient states 3 / Hobbes said we cooperate because otherwise life would be "solitary, poor, nasty, brutish, and short" - but that cooperation was impossible without a central authority (thus Leviathan, or the central govt) 4 / assumption - life is not a zero-sum game 5 / "reciprocity is a way of life in the Senate" - but it wasn't always so 5 / 'cooperation can be explained as a consequence of individual senators pursuing their own interests' 6 / self-interest can include interest in benefitting others, but that benefit to others is not the prime motivator of cooperation 7 / Prisoner's Dilemma cooperation requires that there is no known end to the interaction and no communication (but there is a history) 10-11 / the history of interaction casts a shadow on the present, affecting the current strategic choice 12 / there is no best rule in a game with another sentient player - it depends 14 / continuing interaction stabilizes the strategy 16 / PrisDil parameters are not that restrictive - just a few basic requirements and the PrisDil can be applied to many interactions (the two payoff conditions are sketched after these notes) 17 / lots of complicating factors left out 19 / four elements of a winning PrisDil strategy: cooperation first, provocability in case of nefarious behavior, forgiveness, and clarity (predictability) 20 / three stages of cooperation: repeated interactions, even in a hostile environment, can lead to clusters who attempted trust and were rewarded; even in a world filled with competitive strategies, these clusters can grow; once reciprocity is established, it is resilient 21 / cooperation is visible in non-sentient biological systems, meaning that rationality isn't required 22 / individual recommendations from the four winning elements: don't envy, don't defect first, reciprocate cooperation AND defection, and don't be too clever 23 / if cooperation theory is known to participants, they are more likely to adopt cooperative strategies 24

"when a single defection can set off a long string of recriminations and counter-recriminations, both sides suffer" 38 / systematic error - being too competitive for your own good 40 / evolutionary processes make it more likely that effective strategies will become greater and greater parts of the environment 50 / "not being nice may look promising at first, but in the long run it can destroy the very environment it needs for its own success" 52 / three conditions that result in a strategy's nonexploitability: you know you will encounter the strategy, you recognize it when you encounter it, and it is easy to comprehend its nonexploitability (predictable punishment) 53-4 / the good strategy: niceness prevents getting into trouble; retaliation deters the other from continuing to defect; forgiveness restores mutual cooperation; clarity makes it predictable, setting the stage for future trust 54

ch4 / PrisDil in the WWI trenches - the two players are small units facing each other, and the choices are shoot to kill or shoot to avoid causing damage 75 / at the national level, equal losses meant a win for the Allies due to their greater resources - but at the local level, mutual restraint was preferred 76-7 / due to misses, there was an inherent tendency toward deescalation 80 / raids ended the live-and-let-live system by removing from the actors their freedom of action 82-3 / the trench example shows the element of morality (sorry about that attack, we had to do it though we didn't want to) 85 / rituals also grew from this example, extending the value of predictability 86

four additional factors of social structure: 1) labels, which provide stereotyping and status hierarchies 146-50; 2) reputation, which provides some predictability - but there are considerations whether it would be better to have your strategy known (if not exploitable) or unknown (if exploitable) 150-4; 3) regulation, or the way the govt elicits the cooperation of the governed (tougher standards decrease voluntary compliance, e.g.) 155-8; 4) territoriality, which weights relationships based on proximity (geography, or along a liberal-conservative scale, etc.) - think of the evolution game on the computer - this leads to intricate and complex patterns of interaction that sometimes showed benefits of strategies other than the nice, reciprocal, forgiving, predictable strategy 158-67 / conversion is an evolutionary tool that's not life or death 169

stable cooperation strategies require that previous interaction cast a sizeable shadow on future decisions, that there be some varied strategies against which to compete, and a chance for success 174-5 / "the overall level of cooperation tends to go up and not down" 177 / an expectation of future interaction mellows reprisals - but they jump right back up if interaction is, or appears to be, imminently terminated 179 / preventing collusion requires the opposite of enabling cooperation - hard to do, because cooperation is robust 180 / it is better to be quick to anger but quick to forgive than to be slow to anger and quick to forgive 184 / automatic responses are predictable - which is good 187
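The "few basic requirements" noted at 17 are just two inequalities on the payoffs. A minimal check, with the standard tournament values shown as an example:

  # The Prisoner's Dilemma needs only two conditions on the payoffs.
  # The standard tournament values are used here as an example.
  T, R, P, S = 5, 3, 1, 0    # temptation, reward, punishment, sucker's payoff

  assert T > R > P > S       # defection tempts, mutual cooperation beats mutual
                             # defection, and being suckered is worst of all
  assert R > (T + S) / 2     # taking turns exploiting each other cannot beat
                             # steady mutual cooperation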


Sugar's Tips on Axelrod

pg 3-5: Why do people cooperate without a central authority to force it? We create norms around behaviors that provide mutual benefit (like Senate “folkways”). Competition is not a zero-sum game, which is what makes the Prisoner’s Dilemma more complicated than chess: the Prisoner’s Dilemma presents chances for mutual benefit through cooperation (or mutual loss), while chess is purely adversarial.

pg 7: Self-interest and interest in others (siblings, coalition partners) are not mutually exclusive.

pg 20-21: Lists four properties and three stages by which interacting players tend to form “decision rules” that become mutually beneficial. These depend on both players responding to the past actions of the other and forming norms based on them, resting largely on the “Tit for Tat” tendency in tacit bargaining. Funny, kinda sounds like Confucius (“Do not do to others what you would not like them to do to you”) and Jesus (“Do unto others what you would have them do unto you”) were onto something…

pg 150: If your past actions determine how others react to you in the absence of a central authority to compel them to compete or cooperate, then your reputation becomes important: your track record will most often determine the other side’s response to you in a Prisoner’s Dilemma situation.

pg 152: The best reputation according to this theory is to be a bully. Machiavelli would agree (better to be feared than loved), and this would support the theory that violence is the… you know, whatever that Latin quote was in the test question. But if the other guy does the same, you’ve got problems, like two new guys in prison each trying to beat up the other to avoid becoming the cell b*tch (not that I’d know anything about that). Theoretically, it’s best to talk tough while avoiding throwing the first punch; if you do hit first and he hits back harder, don’t beg profusely for forgiveness if you want to pick your own bunk at the end of the day.
