Author & Context

Jervis is a respected and influential scholar of international relations. His work on misperception in politics is considered groundbreaking.

Thesis/Argument

Decision makers usually perceive the world inaccurately, but they do so according to patterns we can detect and understand. The argument is that by studying past examples of divergence between perception and reality in decision making, we can compensate for misperception and improve the decisions that underpin international relations. Jervis argues that misperception is not the exception but the normal condition of human psychology. He outlines models and conditions that explain how misperceptions occur, chiefly by showing how we filter incoming data so that it fits the way we already think (cognitive consistency). Jervis seeks to educate the reader by building awareness of these patterns. The book sits at the nexus of history, international relations, and psychology.

Themes

  • To help us behave more like the Rational Actor Model (RAM) and move away from Allison's Models II and III. This is never fully possible; it ultimately comes down to what is in the decision maker's head.
  • Deterrence and spiral models are not adequate to explain the intentions of other states
  • With deterrence, much is assumed about the enemy's intent, rationality, and cost-tolerance. It resembles a game of Chicken: the best outcome goes to the side that stays on course while the other yields.
  • With the spiral model, any act is read as an act of aggression, creating a spiraling dynamic. It resembles a Prisoner's Dilemma: each action drives the other side to react, and that reaction fuels the next one.
  • Hence, decision makers must attempt to apply a variety of models and entertain multiple images of the adversary, any one of which could be the true representation.
  • Cognitive consistency represents the interaction between theory and data; this causes incoming data to be assimilated into existing mental models
  • Innovation is prevented by commitment to an existing image; those most involved in the policies attached to the old image will be least able to innovate
  • Learning from history; decision makers form images from life lessons
  • Overgeneralization is a pitfall of learning from history; causality is obscured
  • Centralization is rarely as present as it may seem; decision makers will often view adversaries as more centralized and calculating than they are truly capable of being
  • Premature cognitive closure; knowing that you can never get all the information, you tend to lock in on an interpretation too early and get it wrong.
  • Overestimating one's own importance as influential or as the target of adversary action; decision makers mentally exaggerate the degree to which they play a central role in others' policies.
  • Influence of fear and desire; likely plays a role, though there is not enough evidence to conclude anything meaningful about the role of emotion
  • Cognitive dissonance; decision makers seek to justify their own behavior, reassuring themselves that they made the best possible use of all information and have been consistent; this dissonance shapes both the original decision and subsequent ones. (Dissonance is a musical term: tones that are not in harmony.)
  • Strategies to minimize misperceptions
  • Assumptions and predictions should be made explicit so they can be evaluated
  • Encourage the formulation and application of alternative images
  • Organizations should not allow prospects and identities to become tied to specific theories and images of other actors
  • Be aware of and guard against common misperceptions (e.g., the mistake of viewing others as more centralized, planned, and coordinated than they really are).


Applications to Strategy

  • Strategists must interpret the world; seek accuracy by understanding perceptive pitfalls
  • Leader makeup matters; a leader's misperceptions will guide the planning and execution of strategy
  • Consideration of alternative points of view can diminish poor strategy based on faulty images
  • Look for mental patterns from history, but avoid literal or generic interpretations


Sugar's Tips on Jervis


Yet another example of a "classic" work on topics that are very much in vogue today as we look at adapting to complex environments, and recognizing how our own mindsets contribute further to that complexity.



Ch 3: With the recognition that we tend to use the rational actor model (especially at the time Jervis writes, only a few years after Allison's Essence of Decision opened eyes to the other models), there are different ways to look at how a rational actor might process and interpret the same signals as each side tries to judge the intentions of others and is swayed in varying degrees by the eternal elements of "fear, honor, and interest," in Thucydides' formulation. On the deterrence side, strength is the best argument, and demonstrations of capability, will, and intent to defeat military strategies are seen as the best way to guarantee security by dissuading others from using force against you (competition, a game of Chicken). In the spiral model proponents' view, which is an inherently cynical realist one (an assumption of a Hobbesian universe), building up only guarantees that others will perceive your defensive capabilities as offensive ones, and attempts to show strength will only start an arms race, a race for resources, etc.; thus cooperation is better than a "death spiral" of competition. The trick is to balance how much capability you show, maintaining credibility to prevent conflict (if you're trying to preserve the status quo, that is) while not pushing over the line toward the conflict you're trying to avoid (see how the Athenians tried to do this against Corinth when we get to the opening chapters of the Peloponnesian War).
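For those who want the game-theory analogy made concrete, here is a minimal sketch of the two payoff structures. The numbers are hypothetical, chosen only to illustrate the ordering of outcomes each model assumes; they are not taken from Jervis. Payoffs are listed as (row player, column player).

```latex
% Chicken (deterrence analogy): mutual "stay on course" is the worst outcome for both,
% so the best result goes to the side that holds firm while the other yields.
\[
\text{Chicken:}\quad
\begin{array}{c|cc}
              & \text{Swerve} & \text{Stay} \\ \hline
\text{Swerve} & (3,3)         & (2,4)       \\
\text{Stay}   & (4,2)         & (1,1)
\end{array}
\]
% Prisoner's Dilemma (spiral analogy): defecting (arming, reacting) is each side's
% dominant move, so both end up worse off than if both had cooperated.
\[
\text{Prisoner's Dilemma:}\quad
\begin{array}{c|cc}
                 & \text{Cooperate} & \text{Defect} \\ \hline
\text{Cooperate} & (3,3)            & (1,4)         \\
\text{Defect}    & (4,1)            & (2,2)
\end{array}
\]
```

In Chicken neither side has a dominant move, so the contest turns on credibility and nerve; in the Prisoner's Dilemma, defecting dominates for each side, which is why both arm even though both would be better off cooperating.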



Ch 4: This chapter is about recognizing one's own biases in interpreting information, biases that are heavily shaped by our past experience and the worldviews we're comfortable with, and about how we can often overcome them only with great difficulty and conscious effort, even in the face of incontrovertible evidence that our views are in error. This helps to explain why being a "combat veteran" of a previous war does not necessarily guarantee success in the next one, especially if one erroneously applies the lessons of the last war to the present one, which might be why many innovators are either young or new to their fields (p197). If you're really good at using a hammer, and have even earned lots of awards and promotions for it in the past, you tend to see lots of nails out there the next time you're asked to fix things, whether you're trying to or not. On the other side, if you know that your opponent is a hammer kind of guy, you might be able to use that against him and lead him down the primrose path (kind of like the Roadrunner switching out the nail with a detonator plunger as the Coyote swings away with his ACME mallet).




Ch 6: History can give great context and clue you in to the major considerations that must be addressed when you're dealing with a similar situation (no need to reinvent the wheel, though we often do, as a comparison of current roles-and-missions debates with historical ones, or of the new COIN manual with '63 Galula and the '40 USMC Small Wars Manual, makes clear). The trick is to get the most out of the sameness while recognizing the critical differences that may make the previous example inapplicable or even harmful if imitated. Again, it is tough to overcome the lessons of prior experience, especially a searing one; such experiences ("No more Verduns," "No more Vietnams," etc.) can have a significant influence on how you react to new situations.




Ch 8: This gets to what Rittel describes in his treatise on Wicked Problems, which came out shortly before Jervis published (http://en.wikipedia.org/wiki/Wicked_problems). We tend to see a symptom and infer a single central theme or cause, when in fact multiple, competing causes (including chance and chaos) may be manifesting themselves in the observable phenomena we're discerning. Thus we often get it wrong, or read in meanings that aren't there. Allison's Models 2 and 3 help us deal with this, but not completely.




Ch 9: We tend to take credit for things we did not do and to deflect blame for the bad things we did. That's a pretty human tendency, but on the flip side, we may not be making the impression on the other side that we think we are, for better or worse.




Ch 10: We tend to see what we want or expect to see, especially if we don't design checks and balances to keep ourselves honest. But the author also notes that people tend to avoid extremes when they have the ability to shape the situation, and that they tend to be more open to conflicting information while they can still act on what they believe will happen (p380).




Ch 11: Boyd believed in this: for him, the art of the engagement was to induce strategic paralysis, overwhelming your foe by creating a disconnect between what is perceived and what is real until the foe becomes ineffective. Our desire to avoid cognitive dissonance can cause us to stick with our previous positions longer than we should when the facts begin to dispute our predictions, creating an "inertia" that can hinder successful adaptation. Salesmen and fundraisers commonly use this tendency to their advantage: studies have shown that if they get you to agree with something a priori, and then put you in a position where you have to affirm that position with action, you will often follow through just to avoid contradicting your earlier stance, even if you wouldn't have bought or donated otherwise.





Cheers,




Sugar
