Cognitive Biases

It is increasingly recognized in behavioural psychology that cognitive biases, also known as systematic errors in cognition, are important contributors to flawed clinical decision making, diagnostic errors and safety incidents. A cognitive bias can be described as a systematic error in thinking that distorts the perception, processing and interpretation of evidence, favours certain viewpoints over others, affects judgement, and often leads to erroneous decisions.

A predominant theory explaining the occurrence of cognitive bias is the dual process theory of human cognition, which asserts that decision making is a function of two different modes of thinking, known as System 1 and System 2. The terms were initially proposed by the psychologists Keith Stanovich and Richard West. System 1 involves intuitive, subconscious, fast, effortless, and automatic thought processes that are appropriate for routine decisions. In contrast, System 2 is characterized by deliberate, conscious, slow, effortful, and controlled reasoning that may also include complex calculations. According to Kahneman (2011), System 1 operates continuously with no sense of voluntary control, generates complex patterns of ideas, and creates impressions, intuitions, intentions, and feelings, but only the analytical System 2 can construct thoughts in an orderly series of steps. Systematic errors of intuitive thought are often difficult to identify, understand and prevent, which makes System 1 particularly prone to cognitive biases. System 2 can play a significant role in monitoring and overriding incorrect judgements; however, it requires continuous vigilance, effort and attention.

Cognitive biases are often associated with the use of instinctive rules of thumb and mental shortcuts called heuristics. These shortcuts, associated with the System 1 mode of thinking, allow people to cope with the limitations of working memory, simplify available information, make judgements and decisions rapidly, and solve problems efficiently without spending excessive time analyzing details. Although helpful and effective in most everyday situations, heuristics may result in systematic biases, faulty synthesis of information and suboptimal decisions. Some of the common cognitive biases contributing to patient harm are outlined below.

Mitigation Strategies

Given the highly complex socio-technical healthcare system, multidisciplinary clinical care processes, unpredictable process inputs, and dependence on human intervention, a multifaceted approach is required to mitigate the effects of cognitive biases. There is, however, no simple, universally applicable solution that works in every situation. Mitigation strategies that support better decision making, optimized performance and a reduced likelihood of human error involve the systematic application of process design and human factors engineering principles, tools and techniques, including:

  • Design robust systems and processes that are resilient to unavoidable human errors
  • Reduce excessive reliance on sustained attention and alertness
  • Minimize cognitive workload and make it easy for people to do the right thing
  • Apply methods to detect, correct or mitigate errors before further processing
  • Optimize complex interactions between people and technical subsystems
  • Minimize the need for maintaining or transforming information in working memory
  • Prevent information overload by controlling the timing and volume of information
  • Understand and mitigate the influence of heuristics and cognitive biases on decision making
  • Create, promote and sustain a strong organizational safety culture
  • Establish clear behaviour expectations for safety, high reliability and process excellence
  • Identify and remove inadvertent incentives for risky behaviours
  • Minimize the number of handoffs and follow a standard handoff process
  • Use standardized work to support sustainable improvements
  • Implement structured communication protocols between care providers
  • Use repeat-back and read-back communication techniques coupled with clarifying questions
  • Apply innovative approaches to minimize distractions and interruptions
  • Ensure the use of forcing functions or constraints in safety-critical tasks (see the first sketch after this list)
  • Use differentiation to eliminate “look-alikes and sound-alikes” (see the second sketch after this list)
  • Reduce the need for training and instructions by using affordances, checklists and visual cues
  • Reduce system complexity and increase situational awareness
  • Design tasks that are compatible with the skills, abilities and limitations of people
  • Identify and address organizational factors that contribute to vulnerabilities in a process
  • Examine and address effects of multiple performance shaping factors on human performance
  • Actively engage System 2 mode of thinking in the diagnostic process
  • Gather relevant data from all available sources and verify the accuracy of information
  • Develop a differential diagnosis until the final diagnosis can be established with confidence
  • Maintain questioning attitude and consider the worst-case scenario
  • Investigate contradictory findings and consult with an independent expert source
  • Use STAR (Stop, Think, Act, Review) self-checking technique before responding to a situation
  • Implement evidence-based bundles to improve critical care processes and clinical outcomes
  • Recognize and mitigate limitations of human cognitive systems for multitasking
  • Ensure that all system components operate in a predictable and consistent manner
  • Identify and control key process inputs to increase the predictability of outcomes
  • Reduce variation in process performance and outputs (see the third sketch after this list)
  • Modify physical environment with the aim of reducing the probability of human errors
  • Design workstations through the application of biomechanical concepts and principles
  • Focus on both the physical and cognitive elements of the user-system interface
  • Design the user interface to accurately reflect the dynamics of the physical system
  • Ensure that display representations are compatible with the user’s mental model
  • Consider sociocultural characteristics, needs, expectations, and preferences of people
  • Determine implications of the alternative design solutions on human performance
  • Emphasize the importance of testing before full implementation of the new process
  • Perform usability evaluations throughout the design cycle
  • Leverage the use of simulation to test, evaluate and improve the process
  • Apply signal detection theory in designing alert and alarm systems (see the fourth sketch after this list)
  • Ensure that information and data displays are free from visual clutter
  • Reduce the learning curve associated with new technologies, tools and tasks
  • Automate only improved, verified and validated processes
  • Provide alignment among system components and leverage interdependencies
  • Ensure complementary allocation of functions between humans and technology
  • Use appropriate levels of automation for different human information processing stages
  • Apply 5S methods to create and maintain an organized, safe and efficient workplace
  • Perform and maintain Failure Mode and Effects Analysis (FMEA), as illustrated in the fifth sketch after this list
  • Take a broader view of the organization to reduce the risk of system sub-optimization
  • Be aware that design changes may create new opportunities for errors
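
To make the forcing-function item concrete, the first sketch below is a minimal Python illustration. The order-entry step, the `Order` fields and the `witness_id` parameter are all hypothetical; the point is the defining property of a forcing function: the unsafe path is blocked outright rather than merely warned about.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Order:
    drug: str
    dose: str
    allergy_checked: bool  # True once allergy screening is on record

class ForcingFunctionError(Exception):
    """Raised when a safety-critical precondition has not been met."""

def administer_high_risk_medication(order: Order,
                                    witness_id: Optional[str] = None) -> str:
    """Release a high-risk medication only after mandatory checks pass.

    Unlike a dismissible warning, these constraints block the unsafe
    path outright: the step cannot proceed until the safe
    preconditions are satisfied.
    """
    if not order.allergy_checked:
        raise ForcingFunctionError("Complete allergy screening first.")
    if witness_id is None:
        # The independent double check is mandatory, not optional.
        raise ForcingFunctionError("Independent second check required.")
    return f"{order.drug} {order.dose} released; witnessed by {witness_id}"

# Usage: the call fails loudly until both constraints are satisfied.
order = Order(drug="insulin", dose="10 units", allergy_checked=True)
print(administer_high_risk_medication(order, witness_id="RN-4412"))
```

A dismissible pop-up would be a weaker control; the value of a forcing function lies in making the safe path the only executable path.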
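
For the differentiation item, one simple screening idea is to flag look-alike names by string similarity before they appear side by side on a formulary or order screen. The second sketch uses Python's standard difflib; the names and the 0.7 threshold are illustrative assumptions, not validated values.

```python
from difflib import SequenceMatcher

# Illustrative formulary; hydroxyzine/hydralazine is a frequently
# cited look-alike, sound-alike pair.
formulary = ["hydroxyzine", "hydralazine", "metformin", "metronidazole"]

def flag_lookalikes(names, threshold=0.7):
    """Return name pairs whose similarity ratio meets the threshold.

    The 0.7 cutoff is an assumption for illustration; a production
    screen would be tuned against a validated reference list.
    """
    pairs = []
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            ratio = SequenceMatcher(None, a, b).ratio()
            if ratio >= threshold:
                pairs.append((a, b, round(ratio, 2)))
    return pairs

print(flag_lookalikes(formulary))
# [('hydroxyzine', 'hydralazine', 0.73)]
```

Flagged pairs are then candidates for differentiation, for example through distinctive labelling or separated storage.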
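
The variation-reduction item is commonly operationalized with statistical process control, a technique the list does not name explicitly. The third sketch builds a basic individuals control chart from made-up turnaround-time data; the 2.66 moving-range constant is the standard Shewhart value for this chart type.

```python
from statistics import mean

# Hypothetical daily turnaround times (minutes) for a lab process.
samples = [42, 45, 41, 44, 43, 42, 46, 43, 44, 62]

# Individuals (I) chart: short-term variation is estimated from the
# average moving range; 2.66 = 3 / 1.128, the d2 constant for n = 2.
moving_ranges = [abs(b - a) for a, b in zip(samples, samples[1:])]
centre = mean(samples)
mr_bar = mean(moving_ranges)
ucl = centre + 2.66 * mr_bar  # upper control limit
lcl = centre - 2.66 * mr_bar  # lower control limit

print(f"centre = {centre:.1f}, LCL = {lcl:.1f}, UCL = {ucl:.1f}")

# Points beyond the limits signal special-cause variation that should
# be investigated at its source rather than adjusted away reactively.
signals = [(i, x) for i, x in enumerate(samples) if not lcl <= x <= ucl]
print("special-cause signals:", signals)  # flags the 62-minute day
```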
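
For the signal detection theory item, the classic indices are the sensitivity d' and the response criterion c, computed from an alarm system's hit and false-alarm rates. The fourth sketch uses only the Python standard library; the 90% and 20% rates are hypothetical.

```python
from statistics import NormalDist

def detection_indices(hit_rate, false_alarm_rate):
    """Compute the sensitivity index d' and the response criterion c.

    d' = z(H) - z(F) measures how well the alarm separates true events
    from noise; c = -(z(H) + z(F)) / 2 describes where the threshold
    sits. Rates of exactly 0 or 1 need a correction before the
    z-transform (not handled in this sketch).
    """
    z = NormalDist().inv_cdf
    d_prime = z(hit_rate) - z(false_alarm_rate)
    criterion = -(z(hit_rate) + z(false_alarm_rate)) / 2
    return d_prime, criterion

# Hypothetical alarm performance: 90% of true events detected,
# false alarms on 20% of quiet periods.
d, c = detection_indices(0.90, 0.20)
print(f"d' = {d:.2f}, c = {c:.2f}")  # d' ≈ 2.12, c ≈ -0.22
```

A higher d' means the alarm separates true events from noise more cleanly; a negative c indicates a threshold biased toward alarming, trading more false alarms for fewer missed events.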
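
Finally, for the FMEA item, a conventional way to prioritize failure modes is the Risk Priority Number, the product of severity, occurrence and detection ratings. The failure modes and ratings in the fifth sketch are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    description: str
    severity: int    # 1 (negligible) .. 10 (catastrophic)
    occurrence: int  # 1 (rare) .. 10 (frequent)
    detection: int   # 1 (always caught) .. 10 (undetectable)

    @property
    def rpn(self) -> int:
        """Risk Priority Number: severity x occurrence x detection."""
        return self.severity * self.occurrence * self.detection

# Hypothetical failure modes for a medication dispensing step.
modes = [
    FailureMode("Wrong drug selected from look-alike packaging", 8, 4, 6),
    FailureMode("Dose calculation error", 9, 3, 5),
    FailureMode("Label printed illegibly", 4, 5, 3),
]

# Address the highest-risk failure modes first, and re-score after
# each mitigation so the analysis stays current.
for m in sorted(modes, key=lambda m: m.rpn, reverse=True):
    print(f"RPN {m.rpn:>3}: {m.description}")
```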