How Cognitive Biases Contribute to Human Error
Sometimes it appears that workers make mistakes because of irrational thinking. To minimize human error, however, it helps to understand a little of the psychology behind the factors that contribute to it. Cognitive biases shed light on how mistakes happen: workers operating within their biases believe they are making rational decisions. Knowing a little about biases helps us see how a worker can believe they are making the right decision even when it is the wrong one, and sometimes the one that leads to an accident.
What are Cognitive Biases?
Cognitive biases are systematic errors in thinking that occur when people process information, and they influence the decisions and judgments we make. Is everyone affected by biases? Yes! Having a brain means you have biases. Biases are evolutionary, and human beings have them for good reason: they allow us to use prior knowledge and experience to make quick decisions for our survival. Think of biases as shortcuts in thinking. Without shortcuts, our brains could not handle the cognitive load of weighing everything there is to consider, and decisions would take far too long, which is not good in an emergency. So there is no way to eliminate biases, but understanding them helps us better understand how workers make mistakes.
Biases that Contribute to Human Error
There are hundreds of cognitive biases. Below are three common biases that affect performance in the workplace; each is suggested to be the dominant bias contributing to skill-based, rule-based, or knowledge-based behavior, respectively.
Frequency Bias and Skill-Based Behavior
When a worker performs a task so frequently that it becomes intuitive, they may develop frequency bias. Because they perform the task so often, they may become inattentive to it, grow comfortable with its risks, and become less sensitive to the hazards that are present. Frequency bias is suggested to be the dominant bias contributing to skill-based behavior. Skill-based behaviors are mostly intuitive, physical actions, such as operating a hand drill, recording information from a gauge, or clicking valve positions.
Similarity Bias and Rule-Based Behavior
A worker’s previous experience with a task can lead them to wrongly apply that knowledge to a similar task. For example, a maintenance worker may miss subtle differences when repairing a newer model of a machine because their knowledge is limited to an older, very similar model. Similarity bias is suggested to be the dominant bias contributing to rule-based behavior. Rule-based behaviors are “thinking” behaviors. The maintenance worker may think, “I’ve repaired similar machinery using this procedure and this set of rules,” and apply the procedure and rules without referring to the manual.
Confirmation Bias and Knowledge-Based Behavior
A worker may not seek evidence against a belief and may instead collect only information that supports it. The problem, then, is not what the worker doesn’t know, but what they think they know that isn’t true. A widely known example is the drill crew on the Deepwater Horizon, who decided the pressure gauge must be faulty when they noticed anomalies in the pressure response, rather than investigating further. They looked only for information supporting the idea that the gauge was malfunctioning. Confirmation bias is suggested to be the dominant bias contributing to knowledge-based behavior. Workers become overconfident in their knowledge and beliefs and do not look for information that contradicts what they want to be true.
How Can We Minimize Human Error?
When workers make decisions driven by cognitive biases, they believe they are making rational decisions. We all have trouble recognizing our own biases, and simply trying harder to notice them while performing a task isn’t the answer. It’s better to look for error-likely situations at your facility and implement best practices that improve human reliability.
Join us in Austin, Texas, October 25-26, 2022, to learn more about stopping human error. You’ll learn tried and true strategies as well as how to design a human performance improvement program. It’s the last time this course will be offered this year!
Research referenced for this post:
Psychological Biases Affecting Human Cognitive Performance in Dynamic Operational Environments