Right Kind of Wrong Quotes

Right Kind of Wrong: The Science of Failing Well by Amy C. Edmondson
2,398 ratings, 3.91 average rating, 244 reviews
Right Kind of Wrong Quotes Showing 61-90 of 137
“complex failures occur in settings where you can find plenty of prior knowledge and experience.”
Amy C. Edmondson, Right Kind of Wrong: The Science of Failing Well
“What about accountability? Executives in industries as varied as hospitals and investment banks have asked me this question. Surely individuals must face consequences for failure to avoid an overly lax culture? If people aren’t blamed for failures, how can they be motivated to improve? This concern is based on a false dichotomy. In actuality, a culture that makes it safe to admit failure can (and in high-risk environments must) coexist with high performance standards. A blame culture primarily serves to ensure that people don’t speak up about problems in time to correct them, which obviously doesn’t help performance. This is why blameless reporting is so valuable. As you will see, uninhibited, rapid reporting of anomalies is vital for high performance in any dynamic context.”
Amy C. Edmondson, Right Kind of Wrong: The Science of Failing Well
“It’s easy and natural to look for single causes and single culprits, but for complex failures this instinct is not only unhelpful, it’s inaccurate. And it makes it harder to talk openly and logically about what really happened and how to do better next time. Later I’ll talk about approaches to reducing complex failure, but for now I want to emphasize that a psychologically safe environment in which people know they will not be blamed for mistakes or disappointing results is the bedrock that allows organizations and families alike to experience less of the wrong kind of failure and more of the right kind.”
Amy C. Edmondson, Right Kind of Wrong: The Science of Failing Well
“The reflex to blame someone, to pin the fault on a single individual or cause, is nearly universal. Unfortunately, it reduces the psychological safety needed to practice the science of failing well.”
Amy C. Edmondson, Right Kind of Wrong: The Science of Failing Well
“Unfortunately, most warning systems do not warn us that they can no longer warn us. —Charles Perrow”
Amy C. Edmondson, Right Kind of Wrong: The Science of Failing Well
“Equally valuable is embracing preventive tactics of all kinds—from training to error-proofing. This is not the sexy part of failing well—not the part that gets social media likes or hailed as the latest management fad. Given its enormous value (just ask Alcoa stockholders or commercial airline passengers!), this is a shame. A vital part of failing well is preventing basic failures. If you aspire to zero harm and failure-free work at the point of delivery, it’s essential to make friends with human error. Yes, to err is human. And to forgive (ourselves, especially) is indeed divine. But adopting simple practices to prevent basic failures in our lives and organizations is both possible and worthwhile. You might even say it’s empowering.”
Amy C. Edmondson, Right Kind of Wrong: The Science of Failing Well
“Mulally argued that transparency increased performance pressure. “You can imagine the accountability!” he exclaimed in an interview, presenting a hypothetical scenario to clarify: “Are you going to be red on an item, and then are you going to go through the week and come back and say to all your colleagues, ‘I was really busy last week, I didn’t have a chance to work on that.’ ” Mulally understood that blameless reporting does not mean low standards, nor does it lower the pressure to get the job done. Quite the opposite. With greater transparency comes a sense of mutual accountability, which drives people to solve problems together.”
Amy C. Edmondson, Right Kind of Wrong: The Science of Failing Well
“Psychological safety both enables and is enabled by blameless reporting. The policy sends the message “We understand that things will go wrong, and we want to hear from you quickly so we can solve problems and prevent harm.”
Amy C. Edmondson, Right Kind of Wrong: The Science of Failing Well
“As golf champion Yani Tseng puts it, “You always learn something from mistakes.” What can we take away from the practices of elite athletes? It seems to me that they learn how to confront their mistakes by focusing instead on possibility—on the achievements palpably within reach even if they eluded you today. They show us how to care more about tomorrow’s goal than today’s ego gratification.”
Amy C. Edmondson, Right Kind of Wrong: The Science of Failing Well
“Although elegant and practical, the Andon Cord, for me, embodies simple leadership wisdom. It conveys the message “We want to hear from you.” You refers to those closest to the work—those best positioned to judge its quality. Not only are employees not reprimanded or punished for reporting error, they are thanked and recognized for their close observation.”
Amy C. Edmondson, Right Kind of Wrong: The Science of Failing Well
“The genius of the Andon Cord lies both in how it functions as a quality-control device to prevent defects and in its embodiment of two essential facets of error management: (1) catching small mistakes before they compound into substantial failures, and (2) blameless reporting, which plays a vital role in ensuring safety in high-risk environments.”
Amy C. Edmondson, Right Kind of Wrong: The Science of Failing Well
“psychological bias known as the fundamental attribution error exacerbates the problem. Stanford psychologist Lee Ross identified this fascinating asymmetry: when we see others fail, we spontaneously view their character or ability as the cause. It’s almost amusing to realize that we do exactly the opposite in explaining our own failures—spontaneously seeing external factors as the cause. For example, if we show up late for a meeting, we blame traffic. If a colleague is late for a meeting, we may conclude he is uncommitted or lazy.”
Amy C. Edmondson, Right Kind of Wrong: The Science of Failing Well
“This is why we cannot afford to ignore mistakes. Basic failure’s ubiquity serves as an invitation to strive to minimize it. My goal is to make basic failures fewer and further between. (It’s the opposite of how we think about intelligent failures, which I believe we should strive to increase, to accelerate innovation, learning, and personal growth.) But behaviors and systems that prevent basic failure can save lives, create immense economic value, and bring personal satisfaction.”
Amy C. Edmondson, Right Kind of Wrong: The Science of Failing Well
“Assumptions are taken-for-granted beliefs that feel like facts. Because we aren’t consciously aware of them, we don’t hold them up for scrutiny.”
Amy C. Edmondson, Right Kind of Wrong: The Science of Failing Well
“Assumptions, by definition, take shape in our minds without explicit thought. When we assume something, we’re not directly focusing on it. We fail to challenge assumptions because they seem to us self-evidently true. Assumptions thus leave us with erroneous confidence that our model or our way of thinking is correct,”
Amy C. Edmondson, Right Kind of Wrong: The Science of Failing Well
“two characteristic features of basic failures: They occur in known territory. They tend to have a single cause.”
Amy C. Edmondson, Right Kind of Wrong: The Science of Failing Well
“Never making a mistake is not a realistic or even desirable goal for any of us. Yet all basic failures are caused by mistakes.”
Amy C. Edmondson, Right Kind of Wrong: The Science of Failing Well
“Nearly all basic failures can be averted with care and without need of ingenuity or invention. The important thing to remember about errors is that they are unintended—and punishing them as a strategy for preventing failure will backfire. It encourages people not to admit errors, which ironically increases the likelihood of preventable basic failure.”
Amy C. Edmondson, Right Kind of Wrong: The Science of Failing Well
“the best reason to learn how basic failures work is to prevent as many of them as possible. A few insights and practices drawn from an extensive research literature on errors and error management can help you do just that.”
Amy C. Edmondson, Right Kind of Wrong: The Science of Failing Well
“Remember that errors—synonymous with mistakes—are by definition unintended.”
Amy C. Edmondson, Right Kind of Wrong: The Science of Failing Well
“The only man who never makes a mistake is the man who never does anything. —Theodore Roosevelt”
Amy C. Edmondson, Right Kind of Wrong: The Science of Failing Well
“Expanding its strategy services, the company started to engage clients in the innovation process. The Simmons failure was partly explained by IDEO’s lack of appreciation for what the client’s manufacturing organization was set up to produce. The behind-closed-doors approach that served the company so well in product innovation services, shielding clients from failures along the way, backfired for strategic innovation services. To do better, IDEO began to hire more people with business degrees to complement the skills of the design, engineering, and human-factors experts. IDEO started to collaborate with clients to help them become better failure practitioners.”
Amy C. Edmondson, Right Kind of Wrong: The Science of Failing Well
“Put it this way: The corporate clients paying for those failures at IDEO aren’t hovering over the designers’ shoulders to watch the failures unfold. By the time the project is delivered to its eager customer, it’s poised for success. This is part of the risk-mitigation strategy of any successful innovation department.”
Amy C. Edmondson, Right Kind of Wrong: The Science of Failing Well
“But how does IDEO keep up its enviable reputation with all of that failure? The answer is simple. Most of IDEO’s failures happen behind closed doors. And they happen through disciplined, iterative teamwork that draws on multiple areas of expertise. IDEO’s also a place where company leaders—starting with the visible David Kelley exhorting teams to fail fast and often—have worked hard to build an environment of psychological safety for risk-taking.”
Amy C. Edmondson, Right Kind of Wrong: The Science of Failing Well
“Employees are valued for their technical expertise but even more for their willingness to try new things that might not work. To encourage teams, one of the mottoes at IDEO is “Fail often, in order to succeed sooner,” and David Kelley, CEO until 2000, was known to routinely wander the Palo Alto studio cheerfully saying, “Fail fast to succeed sooner.”
Amy C. Edmondson, Right Kind of Wrong: The Science of Failing Well
“Over time, he became a failure evangelist, viewing these behind-closed-doors sessions as a chance to “fail as much as you want, as long as you do it 100 percent.”
Amy C. Edmondson, Right Kind of Wrong: The Science of Failing Well
“Elite failure practitioners, as you will see again and again, are flexible in their thinking, willing to let go of one line of inquiry to consider another.”
Amy C. Edmondson, Right Kind of Wrong: The Science of Failing Well
“It starts with curiosity. Elite failure practitioners seem to be driven by a desire to understand the world around them—not through philosophic contemplation, but by interacting with it. Testing things out. Experimenting. They’re willing to act! This makes them vulnerable to failure along the way—about which they seem unusually tolerant.”
Amy C. Edmondson, Right Kind of Wrong: The Science of Failing Well
“Taking the time to learn from what went wrong is often the most cringe-inducing aspect of intelligent failure. Not all of us can remain as cheerful as Thomas Edison. You’re not alone if you feel disappointed or embarrassed, and it’s easy to want to push those feelings away. That’s why it’s important to reframe and resist blame and push yourself to be curious. It’s natural to fall prey to self-serving analysis—“I was right, but someone in the lab must have altered something”—which takes us away from discovery. But a true desire to learn from failure forces us to confront facts more fully and rationally. You’ll also want to avoid superficial analysis—“It didn’t work. Let’s try something else”—which generates random rather than considered action. Finally, avoid the glib answer “I’ll do better next time,” which circumvents real learning. What’s necessary is to stop and think carefully about what went wrong so as to inform the next act.”
Amy C. Edmondson, Right Kind of Wrong: The Science of Failing Well
“By not doing the work to discover the vulnerabilities that needed to be fixed before a full-scale launch, the pilot failed the company and its customers. The solution is to create incentives that motivate pilots not to succeed but rather to fail well. An effective pilot is littered with the right kind of wrong—numerous intelligent failures, each generating valuable information.”
Amy C. Edmondson, Right Kind of Wrong: The Science of Failing Well