Kindle Notes & Highlights
Read between December 31, 2019 and May 22, 2020
how the framing of a concept or situation can change the perception of it, such as how a newspaper headline can frame the same event in dramatically different ways, leading readers to draw different conclusions. This change in perspective can be used as an effective influence model for good or bad, especially in moments of conflict.
hope for a peaceful reconciliation with their home country.
In any case, perceived unfairness triggers strong emotional reactions.
FUD, which stands for fear, uncertainty, and doubt. FUD is commonly used in marketing (“Our competitor’s product is dangerous”), political speeches (“We could suffer dire consequences if this law is passed”), religion
A related practice is the use of a straw man, where instead of addressing your argument directly, an opponent misrepresents (frames) your argument by associating it with something else (the straw man) and tries to make the argument about that instead.
Another related mental model is ad hominem (Latin for “to the person”), where the person making the argument is attacked without addressing the central point they made.
Like straw man and appeal to emotion, these models attempt to frame a situation away from an important issue and toward another that is easier to criticize.
When you are in a conflict, you should consider how its framing is shaping the perception of it by you and others.
Influence models like those we’ve been discussing in the past two sections can also be dark patterns when they are used to manipulate you for someone else’s benefit (like at the casino).
Trojan horse can refer to anything that persuades you to lower your defenses by seeming harmless or even attractive, like a gift. It often takes the form of a bait and switch,
a Potemkin village, which is something specifically built to convince people that a situation is better than it actually is.
The term is derived from a historically questionable tale of a portable village built to impress Empress Catherine II on her 1787 visit to Crimea. Nevertheless, there are certainly real instances of Potemkin villages, including a village built by North Korea in the 1950s near the DMZ to lure South Korean soldiers to defect, and, terribly, a Nazi-designed concentration camp in World War II fit to show the Red Cross, which actually disguised a way station to Auschwitz.
Considering a conflict through a game-theory lens helps you identify what you have to gain and what you have to lose. We have just looked at models that increase your chances of a good outcome through influencing other players.
If diplomacy by itself doesn’t work, though, there is another set of models to turn to, starting with deterrence, or using a threat to prevent (deter) an action by an adversary.
carrot-and-stick model, which uses a promise of a reward (the carrot) and at the same time a threat of punishment (the stick) to deter behavior.
Containment acknowledges that an undesirable occurrence has already happened, that you cannot easily undo it, and so instead you’re going to try to stop it from spreading or occurring again in the future.
Another containment tactic is quarantine, the restriction of the movement of people or goods in order to prevent the spread of disease.
flypaper theory, which calls for you to deliberately attract enemies to one location where they are more vulnerable, like attracting flies to flypaper, usually also directing them far away from your valuable assets.
more vulnerable, like attracting flies to flypaper
In a computing context, this is known as a honeypot, which is used to attract and trap malicious actors for study, in the same way honey lures bears.
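The honeypot idea can be made concrete with a minimal sketch. The class and banner below are invented for illustration: a fake service that accepts every login attempt, never grants real access, and records everything the attacker does for later study.

```python
# Minimal honeypot sketch (illustrative names): a decoy service that
# logs attacker behavior while always refusing access.

class Honeypot:
    def __init__(self, banner="SSH-2.0-OpenSSH_7.4"):
        self.banner = banner   # advertises a plausible, slightly dated service
        self.log = []          # every probe is recorded for study

    def handle_login(self, source_ip, username, password):
        # Credentials are never checked against anything real,
        # so the attacker can do no harm; we just observe.
        self.log.append({"ip": source_ip, "user": username, "pass": password})
        return "Access denied"  # always refuse, but keep them probing

pot = Honeypot()
pot.handle_login("203.0.113.9", "root", "123456")
pot.handle_login("203.0.113.9", "admin", "admin")
print(len(pot.log))  # → 2: both attempts captured
```

A real honeypot would sit behind an actual network socket, but the core design choice is the same: the decoy absorbs attention and yields intelligence at no real cost.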
Without containment, bad circumstances can spread, possibly leading to a domino effect, where more negative consequences unfold in inevitable succession like falling dominoes.
In the game-theory context, this effect could be a series of player choices that lands you in a bad outcome.
While in each turn it is attractive to betray the other players because you get outsized yields that turn, doing so, especially repeatedly, in most cases leads to everyone else following suit, leaving you and...
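Why repeated betrayal backfires can be seen in a small simulation of a two-player iterated prisoner's dilemma, using the standard payoff values (mutual cooperation: 3 each; lone defection: 5 vs. 0; mutual defection: 1 each). The strategy names are illustrative.

```python
# Iterated prisoner's dilemma: defecting wins one round but loses the game
# once the other player retaliates ("tit for tat").

PAYOFF = {("C", "C"): (3, 3), ("D", "C"): (5, 0),
          ("C", "D"): (0, 5), ("D", "D"): (1, 1)}

def play(strategy_a, strategy_b, rounds=10):
    history_a, history_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = strategy_a(history_b)   # each sees the opponent's past moves
        move_b = strategy_b(history_a)
        pa, pb = PAYOFF[(move_a, move_b)]
        score_a += pa
        score_b += pb
        history_a.append(move_a)
        history_b.append(move_b)
    return score_a, score_b

always_defect = lambda opp_history: "D"
tit_for_tat = lambda opp_history: opp_history[-1] if opp_history else "C"

# The defector gets the outsized yield (5) only once; after that the
# opponent follows suit and both grind along at 1 per round.
print(play(always_defect, tit_for_tat))  # → (14, 9)
print(play(tit_for_tat, tit_for_tat))    # → (30, 30)
```

Mutual cooperation (30 each) dominates the defector's 14, which is the domino effect in miniature: one betrayal triggers a chain of retaliation that leaves everyone worse off.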
because people are generally bad at determining both the likelihood that events might occur and
the causal relationship between events.
The first is the slippery slope argument: arguing that one small thing leads to an inevitable chain of events and a terrible final outcome
the slippery slope argument: arguing that one small thing leads to an inevitable chain of events and a terrible final outcome (in the eyes of the person making the argument).
The second model is broken windows theory, which proposes that visible evidence of small crimes, for example broken windows in a neighborhood, creates an environment that encourages worse crimes, such as murder. The thinking goes that broken windows are a sign that lawlessness is tolerated, and so there is a perceived need to hold the line and prevent a descent into a more chaotic state (see herd immunity in Chapter 2).
loss leader strategy, where one product is priced low (the gateway drug) to increase demand for complementary products with higher margins.
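The loss leader arithmetic is simple enough to sketch with invented numbers (the classic printers-and-ink example; the figures below are made up for illustration).

```python
# Loss leader sketch: sell one product at a loss to drive sales of a
# high-margin complement. All numbers are invented.

printer_margin = -20.00     # lose $20 on each printer sold
cartridge_margin = 25.00    # earn $25 on each ink cartridge

def lifetime_profit(cartridges_per_customer):
    # Total profit per customer: the upfront loss plus complement margins.
    return printer_margin + cartridge_margin * cartridges_per_customer

print(lifetime_profit(0))  # → -20.0: the loss, with no follow-on sales
print(lifetime_profit(4))  # → 80.0: complements more than recoup the loss
```

The strategy only works if the low-priced product reliably pulls customers into the higher-margin complements, which is exactly why it acts as a gateway.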
If you are in no position to meaningfully deter or contain an emerging conflict that you’d like to avoid, appeasement may be a necessary evil. This involves appeasing opponents by making concessions in order to avoid direct or further conflict with them.
worry with appeasement: you may just be delaying the inevitable.
“the only winning move is not to play.”
a war of attrition, where a long series of battles depletes both sides’ resources, eventually leaving vulnerable the side that starts to run out of resources first.
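A war of attrition can be captured in a toy model: each round both sides burn resources, and whoever runs out first becomes vulnerable. The starting stocks and per-round costs below are invented for illustration.

```python
# Toy war-of-attrition model: both sides deplete resources each round;
# the first side to hit zero loses. Numbers are invented.

def war_of_attrition(res_a, res_b, cost_a, cost_b):
    rounds = 0
    while res_a > 0 and res_b > 0:
        res_a -= cost_a   # each round costs both sides something
        res_b -= cost_b
        rounds += 1
    loser = "A" if res_a <= 0 else "B"
    return rounds, loser

# Side A starts richer but bleeds twice as fast, so A still exhausts first.
print(war_of_attrition(res_a=100, res_b=80, cost_a=10, cost_b=5))  # → (10, 'A')
```

The point the model makes: what matters is not who starts with more, but whose burn rate outlasts the other's.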
describing any situation where circumstances have changed significantly, leaving the status quo unequipped to deal with new threats.
Joy’s law is a mental model named after Sun Microsystems cofounder Bill Joy, who remarked at an event in 1990, “No matter who you are, most of the smartest people work for someone else.”
as Rumsfeld’s Rule: “You go to war with the army you have. They’re not the army you might want or wish to have at a later time.”
Joy’s law further stresses that great people are unlikely to be concentrated in a single organization.
nature versus nurture. Nature refers to traits being explained by genetics, and nurture refers to traits being explained by all the environmental factors that don’t come from your genes
A similar model from author Robert X. Cringely in his book Accidental Empires describes three types of people required in different phases of an organization’s life cycle—commandos, infantry, and police.
Whether invading countries or markets, the first wave of troops to see battle are the commandos
They work hard, fast, and cheap, though often with a low level...
Grouping offshore as the commandos do their work is the second wave of soldiers, the infantry.
they require an infrastructure of rules and procedures for getting things done—all the stuff that commandos hate ….
But there is still a need for a military presence in the territory they leave behind, which they have liberated. These third-wave troops hate change. They aren’t troops at all but police. They want to fuel growth not by planning more invasions and landing on more beaches but by adding people and building economies and empires of scale.
foxes versus hedgehogs, derived from a lyric by the Greek poet Archilochus, translated as “The fox knows many things, but the hedgehog knows one big thing.”
Hedgehogs are big picture; foxes appreciate the details.
Those who built the good-to-great companies were, to one degree or another, hedgehogs. They used their hedgehog nature to drive toward what we came to call a Hedgehog Concept for their companies. Those who led the comparison companies tended to be foxes, never gaining the clarifying advantage of a Hedgehog Concept, being instead scattered, diffused, and inconsistent.
Hedgehogs tend to have a focused worldview, an ideological leaning, strong convictions; foxes are more cautious, more centrist, more likely to adjust their views, more pragmatic, more prone to self-doubt, and inclined to see complexity and nuance. And it turns out that while foxes don’t give great sound bites, they are far more likely to get things right.