Right Kind of Wrong Quotes

2,398 ratings, 3.91 average rating, 244 reviews
Showing 31-60 of 137
“A bad system will beat a good person every time. —W. Edwards Deming”
― Right Kind of Wrong: The Science of Failing Well
“We don’t like beeps going forward, but innovation won’t happen without them.”
― Right Kind of Wrong: The Science of Failing Well
“What if a team on the Electric Maze methodically stepped to find beeps as quickly as possible, eliminating the hesitation? A solution can be found in under seven minutes. A team’s inability to accomplish this task in twenty minutes can be seen as a direct result of misconstruing the context. This context calls for experimentation, and it helps to team up and support one another through the inevitable failures. Instead, students react emotionally to beeps—as if they had been engaged in a routine task with a playbook for exactly where and when to step. They’ve spontaneously viewed the maze as a test they were supposed to get right the first time. They’ve brought an execution mindset to a learning task.”
― Right Kind of Wrong: The Science of Failing Well
“What people don’t easily put into practice is that the way to perform well in the maze is to collect as quickly as possible information about which squares beep. Logically, teams should applaud their colleagues for discovering both quiet squares and beeping squares. Both provide vital new information about the path. Instead, people experience the tiny intelligent failure of a new beep as a mistake and feel embarrassed by it—an embarrassment that’s amplified by others’ reactions. It shows lack of appreciation of context. A new beep is the right kind of wrong. Let’s call it a “beep going forward.” It’s a metaphor for the missteps in our lives in unfamiliar situations. Just as the maze presents a trial-and-failure task that cannot be solved without stepping on beeping squares, when we face novel contexts in our lives, we must be prepared for failures as we navigate the new terrain. If feeling ashamed of or anxious about a new beep in the maze is irrational (albeit human), so, too, is it irrational to feel embarrassed by the “beeps going forward” in our lives.”
― Right Kind of Wrong: The Science of Failing Well
“Dalio learned to change his thinking, explaining, “I just want to be right—I don’t care if the right answer comes from me.” No longer protecting his need to be right, he could then make decisions that were more effective.”
― Right Kind of Wrong: The Science of Failing Well
“Larry Wilson put it simply: Are you playing to win? Or playing not to lose? Playing to win meant a willingness to take risks in pursuit of challenging goals and satisfying relationships. Playing not to lose, which most of us do most of the time, meant avoiding situations where failure was possible. Playing to win, Larry maintained, was the stuff of great advances and great joy alike but necessarily brought setbacks along the way. Playing not to lose meant playing it safe, settling for activities, jobs, or relationships where you feel in control. The decision, Larry would be quick to explain, was essentially cognitive. You could make up your mind to play to win and thus start on the path to changing your thinking.”
― Right Kind of Wrong: The Science of Failing Well
“How we think about events is what matters, not the events themselves. Unfortunately, most of the time our thinking is what Maxie called “irrational but believable.” That thinking is harmful, he pointed out, because when we think events cause our feelings directly, we’re victims.”
― Right Kind of Wrong: The Science of Failing Well
“Sixty-odd years ago a young insurance salesman in Minneapolis named Larry Wilson was miserable. Every time he was rejected by a prospective customer he felt like a terrible failure, an anxious loser unwilling to make the next telephone call. You might say he had a fixed mindset: Why bother to make a call if he was only going to fail again? He was ready to quit his job. But then his boss taught him a simple trick: he could change how he thought about those rejections. Because it took a beginning salesperson about twenty calls before making one sale and the average commission was $500, that meant on average a call was worth $25. Now, whenever Larry was told no, he forced himself to cheerfully think, “Thanks for the twenty-five dollars.” This simple change not only made him feel better, it also allowed him to do his job better because he could focus on customers instead of on how miserable he felt. Soon, he was averaging ten calls for each commission of $1,000, and whenever he was rejected, he would think, “Thanks for the one hundred dollars.” Essentially, he had reframed his thinking about failure.”
― Right Kind of Wrong: The Science of Failing Well
“Modern psychologists have identified a handful of opposing cognitive frames in which one frame is healthier and more constructive but the other is more common. Essentially, the more constructive frames embrace learning and accept setbacks as necessary and meaningful life experiences. The more common and natural frames, in contrast, interpret mistakes and failures as painful evidence that we’re not good enough.”
― Right Kind of Wrong: The Science of Failing Well
“As Brené Brown says about parents, “When you hold those perfect little babies in your hand, our job is not to say, ‘Look at her, she’s perfect. My job is just to keep her perfect—make sure she makes the tennis team by fifth grade and Yale by seventh.’ That’s not our job. Our job is to look and say, ‘You know what? You’re imperfect, and you’re wired for struggle, but you are worthy of love and belonging.’ ”
― Right Kind of Wrong: The Science of Failing Well
“But if I see my actions as bad (guilt), it fosters accountability. It is thus better to feel guilty than ashamed; as Brown tells us, “Shame is highly, highly correlated with addiction, depression, violence, aggression, bullying, suicide, eating disorders… [while] guilt [is] inversely correlated with those things.”
― Right Kind of Wrong: The Science of Failing Well
“When we see failures as shameful, we try to hide them. We don’t study them closely to learn from them. Brown distinguishes between shame and guilt. Shame is a belief that “I am bad.” Guilt, in contrast, is a realization that “what I did is bad.”
― Right Kind of Wrong: The Science of Failing Well
“She defines shame as “an intensely painful feeling or experience of believing we are flawed and therefore unworthy of acceptance and belonging.” Some researchers see shame as “the preeminent cause of emotional distress in our time.”
― Right Kind of Wrong: The Science of Failing Well
“What you can take away from this research is that framing matters. For instance, how did you think about that close call? Did you see it as a failure (a miss that almost happened) or as a success (a good catch)? If you’ve framed the close call as a success, you’re more likely to tell your colleagues or family about it, making all of you more able to learn from it.”
― Right Kind of Wrong: The Science of Failing Well
“In a very different study with similar conclusions, my colleagues Bradley “Brad” Staats and Francesca Gino—then professors at the University of North Carolina—studied how seventy-one surgeons learned from failure versus success on a total of 6,516 cardiac surgeries in ten years. The surgeons learned more from their own successes than from their own failures, but learned more from others’ failures than from others’ successes. This effect—again ego protecting—was less pronounced if a surgeon had a history of personal success. Failures presumably stung less sharply with that cushion of prior success.”
― Right Kind of Wrong: The Science of Failing Well
“The researchers concluded that failure is “ego threatening, which causes people to tune out.” Further support for this explanation came from the fifth study, where participants observed others take similar tests—rather than taking them themselves. This time they learned equally from the failures (and the failure feedback) as from the successes. Without the ego threat, the shortcomings of failure feedback were erased. It seems we’re pretty good at learning from other people’s failures! In real life, however, we often don’t hear about them.”
― Right Kind of Wrong: The Science of Failing Well
“Over and over, people learned less from being given information about what they got wrong than about what they got right.”
― Right Kind of Wrong: The Science of Failing Well
“By way of contrast, if you’re driving down the street and a car suddenly appears in an intersection, you’ll slam on your brakes to avoid an accident, in part aided by an amygdala-triggered intense fear reaction. In this case, the fast pathway was lifesaving. But today, the chances are that you’re more often activated by a perceived threat than a true threat. The amygdala, which protected us from many real threats in prehistoric times, operates according to a “better safe than sorry” logic.”
― Right Kind of Wrong: The Science of Failing Well
“Confirmation biases are fueled by our natural motivation to maintain self-esteem, which helps us tune out signals that we might be wrong. Those who score high in narcissism experience a greater confirmation bias. Alas, as my colleague Tomas Chamorro-Premuzic notes, “Narcissism levels have been rising for decades.” But everyone—not just the irrationally self-centered and overconfident—is prone to letting ego get in the way of something that is clearly rational and in our best interest: learning to improve. Rational, yes, but effortful.”
― Right Kind of Wrong: The Science of Failing Well
“Failing well, perhaps even living well, requires us to become vigorously humble and curious—a state that does not come naturally to adults. Psychologists and neuroscientists have discovered that, far too often for our health and success, a kind of automatic sense that we’re right blinds us—the confirmation bias again. We literally fail to see disconfirming evidence. Other times, we’re privately aware that we’ve failed but reluctant to admit it.”
― Right Kind of Wrong: The Science of Failing Well
“I have to ensure that everyone has a voice,” Aaron said when I asked him about his most important responsibility as a team leader. “There were times when it was awesome to have the flight engineer’s opinion, but there were a couple times where he treated his perspective as the end-all be-all.” That was when Aaron intervened. He asked others on the crew to offer their view. “Tom, what do you think?” “Petty Officer Robbins, what about you?” This is an important point about psychological safety: it needs to be cultivated lest crucial voices be lost. Making sure that everyone is heard is not a matter of good manners or inclusivity for its own sake. Rather, it’s what helps to keep an aircraft in the air and to safely land it.”
― Right Kind of Wrong: The Science of Failing Well
“Aaron explained that in each of the instances where something had gone wrong, he and his team were able to “think beyond the thing.” Instead of getting stuck in “the thing,” or the immediate error, they were able to “think beyond” and work together to do what’s called catch and correct.”
― Right Kind of Wrong: The Science of Failing Well
“First, it made it less likely that nurses—frontline workers with relatively less clout in the hospital hierarchy—would be ignored if they reported an early warning signal about, for example, changes in a patient’s breathing or cognition. RRTs legitimized such calls. Second, even inexperienced nurses felt more secure about speaking up if something about a patient didn’t look or feel right—even a change in mood might be enough.”
― Right Kind of Wrong: The Science of Failing Well
“We began to see RRTs as a tool for amplifying ambiguous threats. Just as people speaking to crowds through a megaphone amplify their voice, so do RRTs and Andon Cords amplify ambiguous signals of a complex failure. Amplify does not mean exaggerate; it just helps a quiet signal be heard. Amplifying an ambiguous threat that something might be amiss for a patient ultimately led to a reduction in heart failures.”
― Right Kind of Wrong: The Science of Failing Well
“If only one of twelve pulls of the Andon Cord stops the assembly line for a genuine problem, you might think the company would be upset by wasting supervisors’ time chasing the eleven false alarms. It turns out that the opposite is true. A pulled Andon Cord that does not identify an actual error is framed as a useful drill. The false alarm is instead experienced as a valuable learning moment, a welcome education on how things go wrong and how to adjust so as to reduce that possibility. This is not a cultural nuance. It’s a practical approach. Every Andon Cord pull is seen as a valuable episode that in the long run saves time and promotes quality.”
― Right Kind of Wrong: The Science of Failing Well
“My engineering background had made me a fan of Perrow’s groundbreaking book Normal Accidents, first published in 1984, which had a lasting influence on experts’ thinking about safety and risk. Perrow focused on how systems, rather than individuals, produce consequential failures. The importance of that distinction cannot be underestimated. Understanding how systems produce failures—and especially which kinds of systems are especially failure-prone—helps take blame out of the equation. It also helps us to focus on reducing failure by changing the system rather than by changing or replacing an individual who works in a faulty system.”
― Right Kind of Wrong: The Science of Failing Well
“The silver lining in every perfect storm is this: each complex failure contains multiple opportunities for prevention.”
― Right Kind of Wrong: The Science of Failing Well
“It’s tempting to chalk up the two crashes to software bugs that led automatic sensors to malfunction. Idiosyncratic failures in complex technology. But as before, look more closely and you will see some of the usual culprits defining complex failure: multiple causes in a reasonably familiar setting, with its false sense of security; missed signals; and interactive complexity in a shifting business environment.”
― Right Kind of Wrong: The Science of Failing Well
“Complex failures have more than one cause, none of which created the failure on its own. Usually a mix of internal factors, such as procedures and skills, collides with external factors, such as weather or a supplier’s delivery delay. Sometimes, the multiple factors interact to exacerbate one another; sometimes they simply compound, as with the straw that broke the camel’s back.”
― Right Kind of Wrong: The Science of Failing Well