Modern technology has now reached a point where improved safety can only be achieved through a better understanding of human error mechanisms. In its treatment of major accidents, the book spans the disciplinary gulf between psychological theory and those concerned with maintaining the reliability of hazardous technologies. Much of the theoretical structure is new and original, and of particular importance is the identification of cognitive processes common to a wide variety of error types.
James Tootle Reason was a professor of psychology at the University of Manchester, from where he graduated in 1962 and where he was a tenured professor from 1977 until 2001. He wrote books on human error, including such aspects as absent-mindedness, aviation human factors, maintenance errors, and risk management for organizational accidents. In 2003, he was awarded an honorary DSc by the University of Aberdeen. He was a Fellow of the British Academy, the British Psychological Society, the Royal Aeronautical Society, and the Royal College of General Practitioners. He received a CBE in 2003 for his services in the reduction of the risks in health care. In 2011 he was elected an honorary fellow of the Safety and Reliability Society. Among his many contributions is the introduction of the Swiss cheese model, a conceptual framework for the description of accidents based on the notion that accidents will happen only if multiple barriers fail, thus creating a path from an initiating cause all the way to the ultimate, unwanted consequences, such as harm to people, assets, the environment, etc. Reason also described the first fully developed theory of a just culture in his 1997 book, Managing the Risks of Organizational Accidents.
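The barrier idea behind the Swiss cheese model is easy to make concrete. The sketch below is purely my own illustration, not anything taken from Reason's work; the layer names and failure probabilities are invented. It simply shows why a bad outcome requires the holes in every defensive layer to line up at once, and why that is rare but never impossible.

```python
import random

# Invented layers and failure probabilities, purely to illustrate the Swiss
# cheese idea described above: an initiating event only becomes an accident
# if every defensive layer happens to fail on the same occasion.
LAYERS = {
    "design review": 0.05,
    "automatic safety device": 0.02,
    "operator check": 0.10,
    "physical containment": 0.01,
}

def accident(rng, layers=LAYERS):
    """True only when the holes in all layers line up for this one event."""
    return all(rng.random() < p for p in layers.values())

rng = random.Random(42)
trials = 1_000_000
hits = sum(accident(rng) for _ in range(trials))
# The expected rate is the product of the layer probabilities, about 1 in a million.
print(f"{hits} accidents in {trials:,} simulated events")
```

With these made-up numbers the product of the four probabilities is about one in a million, which is the intuition behind adding independent layers of defence rather than trying to perfect any single one.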
I read this book because, working in a pharmacy, I hoped that an understanding of the psychological basis of error might help me avoid it.
The book starts with some history. In the early twentieth century Freud was pondering on apparent slips and “accidents” having a basis in the subconscious. I suppose according to Freud, if someone made a slip in dispensing medication it would be because they had some deep-seated dislike of a patient, or harboured unconscious opinions about their treatment! Thankfully, other views from the early twentieth century have aged better. In 1905 Ernst Mach wrote that “knowledge and error flow from the same mental sources, only success can tell one from the other.” Mach is referring to the fact that certain helpful types of behaviour can also cause problems. For example, people have the ability to learn skills which involve a high level of automatic facility, allowing musicians to play musical instruments, typists to type, drivers to drive cars - all without thinking about the mechanics of every string plucked, key pressed or gear changed. But this automatic facility, so useful in many situations, can be a liability when circumstances alter. Step from a manual car into an automatic and you can run into problems when your left foot wants to press a clutch that isn’t there. In a pharmacy, if you have dispensed hundreds of boxes of a medication in a particular strength, there is an opening for error when you come to dispense an unusual strength of that same medication.
I suppose an awareness of this kind of situation does potentially help guard against times when routine brings the possibility of diminished conscious control. But Human Error is not the book to go to if you want simple answers. First there are those bad outcomes arising from useful behaviour. Then there’s the sense that an error is rarely confined to one person. When things go wrong it is usually the result of many people making many decisions in varied circumstances, a chain that finally leads down to the unfortunate individual who makes a blunder - the last piece in a malign jigsaw puzzle. Then there are the traps in all the means we employ to guard against error - automated systems leading to loss of skills in dealing with problems; or systems protected by layers of defence tending to soak up hidden deficiencies until there is a sudden failure. Oddly, I came away from this book with a greater acceptance of error, even in trying to find a way to avoid it. Error is inevitable, and if you make error a forbidden sin, then you can never discuss or learn from things going wrong.
It is perhaps ironic that Human Error is a highly academic book, which leaves nothing to chance in its numbered sections, subsections and sub-subsections. It does not flow. Concepts have to be nailed down into endless acronyms, leaving me floundering amongst SLIMs, SLIs, THERPs, PSFs, PIFs and SUs. Even the name Three Mile Island gets turned into TMI. I did not enjoy wading through this academic acronym code. I can’t see any problem with calling Three Mile Island by that name as many times as required.
Nevertheless, if you can live with the style, and accept that you won’t find an easy prescription that will make you a more accurate, less error-prone person, this is a very interesting book. I would recommend it to anyone working in a job where a small slip can have serious consequences; or to anyone making big decisions, where small, unintended consequences in those decisions can store up serious problems for the future.
A difficult read but worth it to understand human error from the perspective of the cognitive psychologist. Reason is the author of the "Swiss Cheese Model", which holds that accidents are the result of the layering of latent failures on top of unsafe acts and local triggers. The book also details the difference between skill-based, rule-based, and knowledge-based error types and presents a theory of error forms which include frequency gambling and similarity matching. While there is little in the way of application, knowing the fundamentals is always a good first step. I suggest reading "Why We Make Mistakes" by Joe Hallinan first as it is a light, fun read which provides an overview of many of the topics in "Human Error". Then when you read this book you can draw connections to stories in Hallinan's text which make the psychology easier to understand.
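Frequency gambling and similarity matching are easier to grasp with a toy example. The sketch below is my own way of making them concrete, not code or data from the book: candidate schemata are scored by how well their cues match the current situation, and near-ties go to the most frequently used schema. The pharmacy-style names, cue sets and frequencies are invented.

```python
# Toy sketch (mine, not from the book) of similarity matching plus frequency
# gambling: score each stored schema by how well its cues match the situation,
# then let past frequency of use break the tie between equally good matches.

def similarity(situation, cues):
    """Fraction of a schema's cues that are present in the current situation."""
    return len(situation & cues) / len(cues)

def retrieve(situation, schemata):
    """Similarity matching first; frequency gambling breaks the tie."""
    return max(schemata, key=lambda s: (similarity(situation, s["cues"]), s["frequency"]))

# Invented example: the routinely dispensed 20 mg schema and the rare 40 mg
# schema both match the situation perfectly, so frequency decides.
schemata = [
    {"name": "dispense 20 mg", "cues": {"drug X", "adult", "tablet"}, "frequency": 500},
    {"name": "dispense 40 mg", "cues": {"drug X", "adult", "tablet", "high dose"}, "frequency": 3},
]
situation = {"drug X", "adult", "tablet", "high dose"}
print(retrieve(situation, schemata)["name"])  # -> dispense 20 mg
```

With those made-up numbers the familiar 20 mg schema wins even though the "high dose" cue is present, which is roughly how a strong-but-wrong habit captures behaviour.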
When I bought this book I did not expect it to be such a gripping read. It covers some pretty interesting classifications of error forms and types and provides a lot of theoretical information and references. I particularly enjoyed the skill-based, rule-based and knowledge-based classifications.
The case studies were interesting and gave a nice final touch to the topics being discussed.
I would have enjoyed some information on techniques for error prevention, but this was not the intent of the author, so it's all good.
For those who do not have English as a first language and are not used to it, this book can be a challenge, given that the style is closer to an academic paper than to casual reading - although the author cracks a few jokes here and there. For me it was a pleasant read, but I saw some people on Amazon complaining about the style.
If you are involved with safety, QA or any other area where human errors can have a significant impact, consider checking this book out to understand the nature of your team's issues (stress, too much distraction, stubbornness, overconfidence, lack of training, etc.).
This book is a very academic look at the sources and types of human error. I read this partly out of general curiosity and partly for professional development. After all, addressing human error is key in just about any manufacturing field.
I'll start off the review itself by noting that this book is not designed as a quick read. It is much more like a psychology textbook, so be warned before you dive in. With that said, the book is very strong in terms of clearly defining what is known and what is assumed, and it provides plenty of references for those who want to dig deeper into a subject.
The author starts off as you might expect by defining error and establishing certain categories that will be used later to discuss general aspects of errors. He next moves into descriptions of task types and how errors typically show up in each of these tasks. He breaks down most tasks into skill-based (most automatic), rule-based, and knowledge-based (most cognitive load). I hadn't seen this particular breakdown before, and I think it does a good job of delineating different types of tasks in a reasonable way.
Next up is a description of how errors happen in each of the three task types. He also spends a good amount of time discussing detection of errors, which I also found interesting. The author spends a LOT of time on nuclear power plants for examples, which I think is just because there is more data there, but he also mentions things like chemical plants specifically from time to time, which was directly useful for me.
Finally, the author spends some time talking about how to reduce and eliminate errors. I was hopeful that this section would be the most useful, but I unfortunately found it lacking anything concrete or even directional.
Overall, the book was just OK. Part of the problem is that it was written in 1990, which to folks like me doesn't sound that old, but is getting up there in terms of any scientific discipline. Also, in addition to the low readability somewhat inherent to the textbook style, the author doesn't provide a lot of examples that I think would help illustrate his points more effectively. Perhaps for someone more versed in the field these wouldn't be necessary, but as a layperson I think it would have helped me grasp the content more fully.
Overall, I wouldn't recommend this book to most folks. If you are really interested in basic error theories go for it, but as someone in industry I didn't feel like it is directly useful in its current form. I do have another book or two on my list that might be more what I'm looking for, so hopefully one of those will fill the void a bit better.
I work a lot with Excel models and slides on due diligence and other projects. I encounter errors very often. A few observations from my experience:
Factors contributing to errors
-Challenging timeline / approaching deadline
-Stress created by your manager or client (in most cases, stress comes from your manager)
-Lack of sleep
-Multitasking
-Lack of independent review. I noticed that it’s very important to split the execution from the review roles. A person who does the analysis is more biased and therefore can’t spot errors as effectively as an external reviewer
-Relationship bias. When I ask a junior person to review my own work, in most cases the junior person can't find an error. I think it's due to the junior team member's assumption that I do better work than her (which may not be the case)
Factors that reduce errors:
-Sufficient time
-Low stress level
-Curiosity and critical thinking
-Good model design (e.g., design that includes multiple built-in safety checks; see the sketch after this list)
-Independent review by a more senior / experienced person
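To make the "safety checks" point concrete: what I mean is building independent cross-checks into the model so that a disagreement is flagged instead of propagating silently. A rough Python sketch of the idea; purely illustrative, and the figures, labels and tolerance are made up:

```python
# Purely illustrative cross-check of the kind I mean by built-in safety checks:
# derive the same figure two independent ways and flag any disagreement rather
# than letting it slip through. All numbers and names here are invented.

def cross_check(label, value_a, value_b, tolerance=1e-6):
    """Warn when two independently derived values disagree beyond a tolerance."""
    if abs(value_a - value_b) > tolerance:
        print(f"CHECK FAILED [{label}]: {value_a} vs {value_b}")
        return False
    return True

# Example: revenue built bottom-up from segments vs. the top-down total
# quoted in the source document.
segments = {"retail": 120.0, "wholesale": 80.0, "online": 55.0}
revenue_bottom_up = sum(segments.values())   # 255.0
revenue_top_down = 250.0                     # figure taken from the source
cross_check("revenue", revenue_bottom_up, revenue_top_down)  # flags the 5.0 gap
```

In Excel the equivalent is a checks tab with difference formulas and conditional formatting; the point is that the check is independent of the calculation it polices.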
It’s important to distinguish two error types: a mistake is a planning error; a slip is an execution error.
Active errors: the effect is visible almost immediately. Latent errors: the effect is not visible right away. Latent errors are more dangerous than active errors.
“One of the important lessons of these case studies is that disasters are very rarely the product of a single monumental blunder. Usually they involve the concatenation of several, often quite minor, errors committed either by one person or, more often, by a number of people”
“In one training study concerned with word processing, subjects who were denied the opportunity to commit errors performed worse than other groups who were allowed to make errors”
“Training should teach and support an active, exploratory approach”
“Most adults approach training with the belief that errors are undesirable. Moreover, they do not like to be made to feel stupid. To counteract this, it is helpful to present heuristics (‘It is good to make mistakes: they help learning’, etc.). The goal of such heuristics is to change the attitude of trainees from ‘I mustn’t make errors’ to ‘let me see what I can learn from this error’”
“It is now widely held among human reliability specialists that the most productive strategy for dealing with active errors is to focus upon controlling their consequences rather than upon striving for their elimination”
I read "Managing the Risk of Organizational Accidents" and "Organizational Accidents Revisited" prior to starting James Reason's "Human Error". This book is referenced in the other two, either in similar discussion topics or directly as a reference. It's also referenced in other books on the same topic, so I wanted to read it at some point.
It was worth the read for me. I liked the lengthier discussion of Rasmussen's skill-, rule-, and knowledge-based thinking schemes and their implications. The survey of accidents (current to the time of publication, 1990) and the brief discussion of error reduction techniques and risk management were also enlightening.
Overall though, this is one I wish I had only read a portion of, as advised by the author himself early in the book. I did not need to read the chapters on the history of error research (though they were interesting) and probably could have skipped some of the lengthier sections detailing certain cognitive research studies.
If you work in nuclear power generation or another complex technology industry, this is worth your time. Probably not cover to cover though. I'd advise skimming the book and diving into what seems more interesting and/or useful.
An interesting, but dry, book describing human errors and discussing many findings on how and why humans make them.
It is at its most helpful in the attention it gives to classifying human errors affecting complex systems: active errors, whose effects are recognized almost immediately; and latent errors, whose adverse effects may lie dormant for a long time until they combine with other factors to cause a system breach.
Latent errors have been shown to pose the greatest threat to the safety of complex systems. The book uses fascinating discussions of six famous catastrophes to illustrate how important it is for reliability engineers to focus on design and maintenance issues to help avoid these latent errors.
While active errors and equipment failures are important and deserve attention, it is shown time and time again that latent errors are behind most disasters.
By far the best part of the book is the discussion of the six famous disasters: Three Mile Island, Chernobyl, the space shuttle Challenger, Bhopal, the capsizing of the Herald of Free Enterprise, and the King's Cross fire in 1987.
A scholarly, but highly readable, examination of the state of error detection and prevention, which at the time was cutting edge. His case studies are very illuminating and demonstrate that when there are disasters in complex systems like nuclear power stations, it is rarely, if ever, simply a case of human error, but rather an alignment of latent faults which, while none of them would cause the disaster by itself, combine to make it both possible and unpredictable.
It's a surprisingly easy read for such a dry subject. I learned a lot about the way we form knowledge and skills, and how we make mistakes. The last few chapters were very specific to nuclear and industrial applications, so only about 2/3 of the book was of any use to me. I highly recommend it to anyone interested in the subject.
This one is not as easy to read as regular prose; it is a dry scientific text with details of experiments and models. It is a good reference if you're researching human error.
I read 2+ chapters of this book and decided it was too technical for me. The author appears very knowledgeable, but this is a textbook and definitely beyond my knowledge level.
Activation: specific activators - the mental scratchpad; general activators - frequency of prior use.
Boundary categories - Was there a prior intention to commit this particular violation? If no, then it is an erroneous or unintended violation; if yes, then sabotage.
Routine violations - (a) a natural tendency to take the path of least effort; and (b) a relatively indifferent environment (i.e., one that rarely punishes violations or rewards observance).
Exceptional violations - tasks or circumstances that make violations inevitable no matter how well-intentioned the operators might be.
Distinguishing between errors and violations - violations: drinking and driving, close following, disregarding speed limits; erroneous behavior: failing to see signs, failing to check mirrors, misjudging the speed of oncoming vehicles.
Violations declined with age; errors did not.
Defences: The limited window of accident opportunity - personal safety equipment, guards preventing direct contact with dangerous materials or moving parts; engineering controls such as automatic safety devices and levels of containment
Unsafe acts - more than just an error or violation - one that is committed in the presence of a potential hazard: some mass, energy or toxicity that, if not properly controlled, could cause injury or damage.
I'm very interested in exploring the origins of human errors. And this book by James Reason seemed like a good read. I was about to travel and I didn't want to buy a paperback or a hardcover book--there are severe weight limitations for air travel. But I have a Kindle, and this book was available in the Kindle edition.
I should say that I tend not to write negative reviews. But in this case, I have to make an exception. Now my complaints are not with the content of the book. I wish it was written a bit better, but I knew what I was buying when I got the book. But I'm horrified at the number of typos in the Kindle edition! I assume the real-world book doesn't have these (I hope). But every single page on the Kindle had a typo, or missing text, or strange characters, and so on. It was VERY frustrating reading the book (and nearly impossible to read the Appendixes). Clearly, Dr. Reason hasn't given even a first look at how his book on Human Error was being rendered on the Kindle. To write a book about individual and systems failures and then to publish such a poor-quality book is amazing to me.
If Amazon doesn't figure out quality standards for Kindle editions, then there really isn't any point in taking a risk and buying these books in this format.
Ultimately, it is the author who is responsible for the quality of his book. I hope Dr. Reason will see this review and move his editors to make some much-needed changes. Until then, DO NOT buy the Kindle version of this book--it's not worth the money!
This is a difficult read, unless you're studying for a doctorate in sociology or psychology, which I am not. I picked this book up after reading another one of the author's books on accidents, which was a far easier read. However, I did get the basics on types of human errors and a general overview of the analyses used in the field, which are inconclusive. Humans will always make errors. We have moved along since this book was first published in 1990 in finding the root causes of accidents in nuclear power and other hazardous industries. With modern tech, the march continues on.
This book is probably one of the worst reads I have encountered. It is the basis of the most awkward workplace incident investigation and analysis methods I have ever used. After opening this book I understand why this analysis process is so cumbersome and ineffective. Visit the link for related information. http://www.asasi.org/2004_PPTs/Young%...
Very detailed analysis of the human components of industrial accidents. The material has been around a while but it's still highly applicable to today's world of work. The generic error-modelling system (GEMS) theory has been picked up fairly recently and has become an important part of the approach at many progressive companies. Great book for practitioners focused on the human aspects of accident prevention.
Proposes small theories on how we make mistakes that cause accidents. The causes of accidents are far simpler than we might expect, but not so simple that we can just write them off as human error and be done with it.