Kindle Notes & Highlights
by Dan Heath
Read between April 21 and April 25, 2020
It turns out that emergencies follow predictable patterns.
There are patterns in time (more 911 calls during the day than at night) and patterns in geography (more calls from areas with older citizens than younger ones).
And then there are the nuances: Curiously, mealtimes at nursing homes create a spike.
Not because of the meals themselves; those are the times when a caregiver is guaranteed to check on a patient and discover that something bad has happened.
The ambulances are parked at the local fire stations, and when a 911 call comes in, EMTs or paramedics will drive out to help the person. It’s a reactive system.
The real-time location of all the ambulances is pinpointed on the maps, and each one is surrounded by a halo that shows the area it could reach within 10 minutes.
This is the model of an early-warning story: Data warns us of a problem we wouldn’t have seen otherwise—say, needing ambulances deployed closer to nursing homes at mealtimes.
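To make the halo idea concrete, here is a minimal sketch (my illustration, not the actual dispatch software from the story) that flags predicted hotspots sitting outside every ambulance's 10-minute reach. The straight-line travel model, the 50 km/h average speed, and all coordinates are assumptions for the example.

```python
import math

AVG_SPEED_KMH = 50        # assumed average ambulance speed
RESPONSE_LIMIT_MIN = 10   # the 10-minute halo from the story

def minutes_to_reach(ambulance, hotspot):
    """Straight-line travel time in minutes between two (x, y) points in km."""
    return math.dist(ambulance, hotspot) / AVG_SPEED_KMH * 60

def uncovered_hotspots(ambulances, hotspots):
    """Predicted hotspots that fall outside every ambulance's halo."""
    return [h for h in hotspots
            if all(minutes_to_reach(a, h) > RESPONSE_LIMIT_MIN for a in ambulances)]

# Hypothetical map: two ambulances, two predicted mealtime hotspots.
ambulances = [(0.0, 0.0), (6.0, 6.0)]
hotspots = [(2.0, 1.0), (14.0, 3.0)]
print(uncovered_hotspots(ambulances, hotspots))  # -> [(14.0, 3.0)]
```

A real system would use road networks and historical travel times, but the proactive logic is the same: if tomorrow's predicted demand sits outside today's halos, move a unit before the calls come in.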
Japan has one of the world’s best early-detection systems for earthquakes, including an observation center that collects information from more than 3,200 seismographs and seismic intensity meters around the country, according to a 2012 article by Alex Greer, a professor who specializes in emergency preparedness.
“One of the most important things that an online connection to the cloud gives you is the ability to spot trends in advance before they start creating problems,” John Macleod, an IBM Watson IoT technical specialist, told Computerworld.
With the rise of the Internet of Things, this kind of advance-warning solution will become more and more common.
The anti-terrorism “If You See Something, Say Something” campaign is another example of early-detection work that hinges on human beings.
But in recent years, doctors’ ideas about cancer have changed. No one thinks anymore that “it’s only a matter of time” before cancer kills a patient.
The health system’s goal is to keep the animals from escaping the pen (an escape is the equivalent of a cancer that becomes deadly), and the pen represents our system of early detection and treatment.
The turtles are incredibly slow, so the pen is kind of pointless. They never would have escaped anyway. Turtles represent sluggish, nonlethal cancers, of which there are many.
From the perspective of public health, then, the only animal that matters is the rabbit. It represents a potentially lethal form of cancer. It can hop out of the pen at any time, but if we act quickly, we can stop it before it escapes.
Some early-warning systems work wonders: They can keep elevators from failing and customers from churning. Other times, they may cause more harm than benefit, as in the thyroid cancer “epidemic” in South Korea.
Have you ever rolled your eyes when you heard a fire alarm? That’s alarm fatigue, and it’s a critical problem.
As we design early-warning systems, we should keep these questions in mind: Will the warning give us enough time to act effectively? (If not, why bother?) What rate of false positives can we expect?
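The false-positive question is worth working through with numbers, because rare events make even an accurate detector misleading. A minimal sketch of the base-rate arithmetic, using made-up rates (none of these figures come from the book):

```python
# Why false positives breed alarm fatigue: for rare events,
# even a good detector fires mostly in error.

def warning_precision(base_rate: float, sensitivity: float,
                      false_positive_rate: float) -> float:
    """Fraction of alarms that are true, via Bayes' rule."""
    true_alarms = base_rate * sensitivity
    false_alarms = (1 - base_rate) * false_positive_rate
    return true_alarms / (true_alarms + false_alarms)

# Illustrative assumptions: a 1-in-1,000 event, a detector that catches
# 99% of real cases and wrongly fires on 5% of non-cases.
print(f"{warning_precision(0.001, 0.99, 0.05):.1%} of alarms are real")
# -> 1.9% of alarms are real
```

With those assumed rates, roughly 98 percent of alarms are false, which is exactly the soil in which alarm fatigue grows.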
Most mass shootings are planned at least six months in advance. Typically, 8 in 10 shooters tell at least one other person of their plans. Many actually post threats on social media. Many of these attacks could have been prevented if the right people had been paying attention or had taken the threats seriously.
Sandy Hook Promise launched a training program to educate students on the warning signs, which include: a strong fascination with firearms, acting aggressively for seemingly minor reasons, extreme feelings of social isolation, and bragging about access to guns.
The Sandy Hook team realized that they needed to broaden their focus to include students vulnerable to bullying and self-harm (especially suicidal tendencies and cutting).
To make matters worse, it’s the curse of preventing rare problems that we may never really know when we’ve succeeded.
“When I think back to the Sandy Hook school tragedy, I know that there was a sequence of events—a chain—that had to link up perfectly for events to unfold as they did,” said Hockley in a TEDx talk.
“When we look at it, we don’t see the dominoes, we see the spaces in between,” said Hockley in the talk, “when someone could have done something or said something to stop the next domino from falling over.”
In this chapter, we’ll scrutinize three kinds of ghost victories.
In the first kind of ghost victory, your measures show that you’re succeeding, but you’ve mistakenly attributed that success to your own work.
In the second, you’ve succeeded on your short-term measures, but they didn’t align with your long-term mission.
That first type of ghost victory reflects the old expression “A rising tide lifts all boats.”
You might ask, well, why didn’t the low-income people call?
The rich people believed they would get served, so they called, and they were served. The poor people believed they’d be neglected, so they didn’t call, and they were neglected. Boston had created two self-fulfilling prophecies.
And the places where walkability was most needed were the places that had been historically neglected.
The bulk of the repair budget now goes to strategic, proactive efforts to overhaul damaged sidewalks in the areas where it will make the most difference.
In his book Thinking, Fast and Slow, the psychologist Daniel Kahneman wrote that our brains, when confronted with complexity, will often perform an invisible substitution, trading a hard question for an easy one.
“This is the essence of intuitive heuristics: When faced with a difficult question, we often answer an easier one instead, usually without noticing the substitution.”
This substitution—of easy questions for hard ones—is something that happens with both downstream and upstream efforts.
Choosing the wrong short-term measures can doom upstream work.
The truth is, though, that short-term measures are indispensable. They are critical navigational aids.
Getting short-term measures right is frustratingly complex.
There is also a third kind of ghost victory that’s essentially a special case of the second. It occurs when measures become the mission.
In England in the early 2000s, the Department of Health had grown concerned about long wait times in hospital emergency rooms, according to a paper by Gwyn Bevan and Christopher Hood.
So the department instituted a new policy that penalized hospitals with wait times longer than four hours.
In some hospitals, patients had been left in ambulances parked outside the hospital, where the four-hour clock had not yet started, until the staffers believed the patients could be seen within the prescribed window. Then they wheeled the patients inside.
People “gaming” measures is a familiar phenomenon.
We need to escalate the rhetoric: People aren’t “gaming metrics,” they’re defiling the mission.
When people’s well-being depends on hitting certain numbers, they get very interested in tilting the odds in their favor.
And so if they couldn’t make crime go down, they would just stop reporting crime.
You could refuse to take crime reports from victims, or you could write down something different from what had actually happened.
“The chiefs felt like they were keeping the crime rate down for the commissioner. The commissioner felt like he was keeping the crime rate down for the mayor. And the mayor, the mayor had to keep the crime rate down because otherwise real estate prices would crash, tourists would go away. It was like the crime rate itself became the boss.”
Think about this: An NYPD official is held accountable for rape statistics. There are two ways to make those numbers look better.
The first way is to actually prevent rape—to project the police’s presence into dangerous areas and thereby deter the violent acts.