Kindle Notes & Highlights
by Sam Kyle
Read between December 17, 2018 and December 13, 2019
Think about Napoleon deciding to invade Russia, leaders at NASA ignoring the O-ring issues on the Challenger, Margaret Thatcher deciding to get behind a “poll tax” that ended up getting her ousted by her own party, or George W. Bush making the decision to invade Iraq. These people were professional decision makers, with all the relevant information available to them, and yet they still made poor decisions. If they can’t get it right, what hope is there for the rest of us?
When it comes to making decisions, or doing anything, really, the person with the fewest blind spots wins. The more blind spots we eliminate when making decisions, the more we improve our ability to decide correctly, and thus improve our organizations.
For our purposes in this book, a good decision is one made using a good process, applied consistently. While a good process does not guarantee a good outcome, it certainly improves your chances!
One cause of poor decisions is that we don’t know what we don’t know. It’s not only that we don’t understand the problem, but we fail to understand ourselves. We lack the knowledge, and the self-knowledge, to approach a decision in a smart way. Do you know HOW and WHY you make decisions the way you do? Do you have the self-knowledge to know when your biases may be getting in the way? Do you really understand the problem? Do you use a consistent process when you make decisions? Do you take full responsibility for your decisions?
In fact, many accomplished professionals utilize journaling as a daily practice to help them understand and improve themselves.
First, our minds tend to rationalize the outcome of our decisions, because it’s hard to admit when we are wrong.
Second, it’s hard to draw cause-and-effect relationships we can learn from, because of the time that elapses between a decision and its results. Rarely do we see that the problems we’re facing today were caused by decisions we made months or years ago.
When I first started to use a decision journal, it was clear to me that I had optimism bias. This means I was focusing on the possible positive outcomes while not anticipating and preparing for potential negative ones.
It’s counterintuitive, but if you want to make better decisions and increase your productivity, you need to take the time to think about your decisions.
I encourage you to use a decision journal throughout this book. In each chapter, you will have a list of exercises that will help you to think about and improve your decisions.
“Whenever you’re making a consequential decision . . . just take a moment to think, write down what you expect to happen, why you expect it to happen and then actually, and this is optional, but probably a great idea, is write down how you feel about the situation, both physically and even emotionally. Just, how do you feel?”
Keeping a decision journal prevents something called hindsight bias: our tendency to look back on our decision-making process and skew it so that our judgment appears better than it actually was.
Using a decision journal
- The situation or context.
- The variables that govern the situation.
- The complications or complexity as you see it.
- Alternatives that were seriously considered, and why they were not chosen.
- A paragraph explaining the range of outcomes you deem possible, with probabilities.
- A paragraph explaining what you expect to happen, and the reasoning. (The degree of confidence matters, a lot.)
- The time of day you’re making the decision, and how you feel physically and mentally. (If you’re tired, for example, write it down.)
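If you keep the journal digitally, the fields above can be sketched as a simple record. The class and field names below are illustrative assumptions, not taken from the book:

```python
from dataclasses import dataclass, field
from datetime import datetime

# Illustrative sketch of one decision-journal entry; field names are
# assumptions, not the book's own template.
@dataclass
class DecisionEntry:
    situation: str            # the situation or context
    variables: list           # variables that govern the situation
    complications: str        # the complexity as you see it
    alternatives: list        # options seriously considered, and why rejected
    possible_outcomes: dict   # outcome -> probability you assign it
    expectation: str          # what you expect to happen, and the reasoning
    confidence: float         # degree of confidence, 0.0 to 1.0
    timestamp: datetime = field(default_factory=datetime.now)
    physical_state: str = ""  # tired, alert, etc.
    emotional_state: str = "" # anxious, optimistic, etc.

entry = DecisionEntry(
    situation="Choosing whether to expand into a new market",
    variables=["budget", "competition", "timing"],
    complications="Demand estimates are highly uncertain.",
    alternatives=["Wait a year (rejected: competitor may move first)"],
    possible_outcomes={"break even": 0.5, "loss": 0.3, "strong growth": 0.2},
    expectation="Modest growth; the team is experienced.",
    confidence=0.6,
    physical_state="well rested",
    emotional_state="cautiously optimistic",
)
```

Writing the probabilities and confidence level down as numbers, rather than vague phrases, is what makes the entry reviewable later.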
Chapter 1 checklist:
Charles Frankel, who was an American philosopher and founding director of the National Humanities Center, once wrote, “A system is responsible in proportion to the degree that the people who make the decisions bear the consequences.”
Chapter 2 checklist:
Chapter 3 checklist:
The first is Planck knowledge. People who have Planck knowledge really know what they are talking about; they’ve done the work, and paid their dues. Then there are people who have chauffeur knowledge: they’ve merely learned the talk. (The name comes from Munger’s story of Max Planck’s chauffeur, who had heard Planck’s lecture so many times that he could deliver it verbatim, but could not field a single follow-up question.)
Charlie Munger is not only one of the best investors in the world; he’s also one of the best thinkers. Munger popularized the concept of a latticework of mental models in the speech he gave at USC.
As Munger put it, “If the facts don’t hang together on a latticework of theory, you don’t have them in a usable form.”
Chapter 4 checklist:
Chapter 6 checklist:
Chapter 7 checklist:
Chapter 8 checklist:
The UCSB ecologist/economist Garrett Hardin once said that “The scientific mind is not closed: it is merely well-guarded by a conscientious and seldom sleeping gatekeeper.” That gatekeeper guards the mind with the concept of the default status: the “resting position” of common sense, whereby the burden of proof falls on assertions to the contrary.
Harder to trace in its origin, Hanlon’s Razor states that we should not attribute to malice that which is more easily explained by stupidity.
In all human systems and most complex systems, the second layer of effects often dwarfs the first layer, yet frequently goes unconsidered. In other words, we must consider that effects have effects. Second-order thinking is best illustrated by the idea of standing on your tiptoes at a parade: once one person does it, everyone else must do it in order to see, negating the first tiptoer’s advantage. Now the whole parade audience suffers on its toes rather than standing comfortably on flat feet.
We tend to most easily recall what is salient, important, frequent, and recent.
The English economist David Ricardo had an unusual and non-intuitive insight: Two individuals, firms, or countries could benefit from trading with one another even if one of them was better at everything. Comparative advantage is best seen as an applied opportunity cost: If it has the opportunity to trade, an entity gives up free gains in productivity by not focusing on what it does best.
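Ricardo’s insight can be checked with a toy calculation. The numbers below are invented for illustration: producer B is slower at both goods, yet total output still rises when each side specializes according to opportunity cost:

```python
# Hours of labor needed per unit (invented numbers). A is absolutely
# better at both goods, but B's opportunity cost of a gadget
# (4/3 widgets) is lower than A's (2 widgets), so B should make gadgets.
hours = {"A": {"widget": 1, "gadget": 2},
         "B": {"widget": 3, "gadget": 4}}
budget = 12  # labor hours available to each producer

# No trade: each producer splits its hours evenly between the goods.
autarky_widgets = budget / 2 / hours["A"]["widget"] + budget / 2 / hours["B"]["widget"]
autarky_gadgets = budget / 2 / hours["A"]["gadget"] + budget / 2 / hours["B"]["gadget"]

# With trade: B makes only gadgets (its comparative advantage);
# A spends 8 hours on widgets and the remaining 4 on gadgets.
trade_widgets = 8 / hours["A"]["widget"]
trade_gadgets = 4 / hours["A"]["gadget"] + budget / hours["B"]["gadget"]

print(autarky_widgets, autarky_gadgets)  # 8.0 4.5
print(trade_widgets, trade_gadgets)      # 8.0 5.0
```

With the same total labor, specialization yields as many widgets and strictly more gadgets, even though A outproduces B at everything.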
The usefulness of additional units of any good tends to vary with scale. Marginal utility allows us to understand the value of one additional unit, and in most practical areas of life, that utility diminishes at some point. On the other hand, in some cases, additional units are subject to a “critical point” where the utility function jumps discretely up or down. As an example, giving water to a thirsty man has diminishing marginal utility with each additional unit, and can eventually kill him with enough units.
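Diminishing marginal utility can be made concrete with any concave utility function; the square root below is just an assumed stand-in, not a measured curve:

```python
import math

def utility(units):
    # A concave function stands in for total utility (an assumption,
    # not a law): doubling consumption less than doubles satisfaction.
    return math.sqrt(units)

# Marginal utility = the value added by one more unit.
marginal = [utility(n) - utility(n - 1) for n in range(1, 6)]
print([round(m, 3) for m in marginal])  # [1.0, 0.414, 0.318, 0.268, 0.236]
# Each additional unit adds less than the one before it.
```

A “critical point” like the drowning case in the text would show up as a utility function that jumps or turns negative at some quantity, which no smooth concave curve captures.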
Though asymmetric insurgent warfare can be extremely effective, over time competitors have also developed counterinsurgency strategies. Recently and famously, General David Petraeus of the United States led the development of counterinsurgency plans that involved no additional force but substantial additional gains. Tit-for-tat warfare or competition will often lead to a feedback loop that demands insurgency and counterinsurgency.
Fat-Tailed Processes (Extremistan)
A process can often look like a normal distribution but have a large “tail,” meaning that seemingly outlier events are far more likely than an actual normal distribution predicts. A strategy or process may be far riskier than a normal distribution is capable of describing if the fat tail is on the negative side, or far more profitable if the fat tail is on the positive side. Much of the human social world is said to be fat-tailed rather than normally distributed.
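A quick simulation shows the difference. The Pareto distribution below is one common stand-in for a fat-tailed process, and the threshold for “extreme” is chosen arbitrarily:

```python
import random

random.seed(0)  # reproducible draws
N = 100_000
THRESHOLD = 10  # arbitrary cutoff for an "extreme" observation

# Thin-tailed process: a standard normal. A value above 10 is a
# ten-sigma event and essentially never occurs.
normal_extremes = sum(random.normalvariate(0, 1) > THRESHOLD for _ in range(N))

# Fat-tailed process: Pareto with shape 1.5. Here P(X > 10) is about
# 3%, so "outliers" are routine rather than impossible.
pareto_extremes = sum(random.paretovariate(1.5) > THRESHOLD for _ in range(N))

print(normal_extremes, pareto_extremes)
```

Judging the fat-tailed process by its typical (median) draw, as a normal model implicitly does, badly understates how much of the total risk or reward sits in those few extreme observations.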
Also popularized by Nassim Taleb, a Black Swan is a rare and highly consequential event that is invisible to a given observer ahead of time. The concept is a result of applied epistemology: if you have seen only white swans, you cannot categorically state that there are no black swans, but the inverse is not true: seeing one black swan is enough for you to state that black swans exist.