Diagnostic Error
Models of Cognitive Reasoning and Bias
- Reference: Croskerry 2013
- The popularised model is Kahneman’s dual process theory of fast and slow thinking, which designates cognitive processes as either Type 1 (fast, heuristic-driven) or Type 2 (slow, analytical) thinking
- Type 1 thinking appears to be by far the most commonly utilised mode (~95% of decisions); it is largely unconscious and more prone to error
- Type 2 thinking is slow and methodical but appears to be fairly reliable, safe and effective
- Biases are systematic errors in thinking and have been described as ‘predictable deviations from rationality’
- Cognitive biases – Errors in thinking
- Affective biases – Ways in which our feelings influence our decision-making
- Quality of decision-making is also affected by:
- Environmental factors – context, team factors, patient factors, workload, ergonomic design
- Individual factors – affective state, fatigue, cognitive load, decision fatigue, sleep deprivation, interruptions
- The theory is that by alerting the analytical part of the brain (toggling into ‘executive override’ mode) to situations with potential for bias, clinicians can initiate a debiasing intervention to prevent the error from occurring
- In addition, addressing the ambient factors above may prove crucial in optimising decision-making capacity
High-risk situations
How big is the problem?
Diagnostic error occurs in an estimated 7–17% of primary care presentations.
In the ED, around 3% of cases involve a medical error, roughly 70% of which are diagnostic errors.
A substantial proportion of malpractice suits relate to misdiagnosis.
It remains unclear how much of this error is attributable to cognitive error versus system issues.
Examples of cognitive biases
Aka Cognitive Dispositions to Respond (CDRs)
Aggregate bias – Physicians believing population-level data does not apply to their exceptional patient.
Anchoring bias – Tendency to lock onto salient features of the presentation too early in the diagnostic process and fail to adjust this in light of new information.
Ascertainment bias – Thinking shaped by prior expectation or previous experience, e.g. gender bias or stereotyping
Availability bias – Diagnoses that come to mind more readily (recent or memorable cases) are judged more likely
Base rate neglect – The underlying incidence rate of disease is ignored (see the worked sketch after this list)
Commission bias – Tendency towards action to help the patient when in fact mindful inactivity would be better
Confirmation bias – Looking for information gained through consultation to confirm initial preconceived ideas, often in tandem with anchoring bias
Conjunction rule – The incorrect belief that the probability of multiple events co-occurring is greater than that of a single unifying cause; the conjoint probability can never exceed that of either event alone, in keeping with Occam’s razor
Overconfidence – Confidence in judgement outweighs the actual probability of accuracy
Search satisficing – Calling off the search once a cause is ‘found’, thereby missing co-existing pathology
Diagnostic momentum – Once attached, a diagnostic label gathers momentum and the diagnosis or plan is not altered even when subsequent information comes to light
Framing – Reacting differently depending on how information is presented to you
Omission bias – Tendency towards inaction; harms arising from the natural course of disease are perceived as more acceptable than those caused directly by intervention (the reverse of commission bias)
Feedback sanction – Slow or absent feedback on decisions leaves clinicians ignorant of the consequences of previous wrong decisions/actions
Fundamental attribution error – Tendency to blame patients for their illnesses (disposition) rather than situational factors
Gambler’s fallacy – If a coin is tossed 10 times and lands heads each time, the belief that the 11th toss is more likely to be tails, despite each toss being independent
Hindsight bias – Knowing the outcome may profoundly influence perception of past events leading to over- or under-estimation of decision-making capabilities
Multiple alternatives bias – Reverting to fewer options in the differential to ease cognitive load
Order effects – The order in which information is received or given alters what is recalled (primacy and recency effects)
Outcome bias – Tendency to opt for diagnoses that will result in good outcomes
Playing the odds – Tendency to opt for benign diagnoses in equivocal cases (opposite of ‘worst-case scenario’ thinking)
Posterior probability error – Physician gambles on previous events recurring (opposite of gambler’s fallacy)
Premature closure – Accounts for a high proportion of diagnostic errors: “When the diagnosis is made, the thinking stops”
Representativeness restraint – Looking only for prototypical presentations of disease, thus missing atypical presentations
Sutton’s slip – Only looking for the obvious (from Sutton’s law: ‘go where the money is’), thus missing more subtle presentations of illness
Sunk costs – The more a clinician invests in a diagnosis or management plan, the less likely they are to change this decision/path.
Triage cueing – E.g. Assuming ambulatory care patients are well and acute care patients are unwell. “Geography is destiny”
Unpacking principle – Failure to elicit all relevant information, often caused by interrupting the patient while they give their history
Vertical line failure – Predictable, routine thinking patterns are used, with failure to think laterally: ‘What else could it be?’
Visceral bias – The influence of affective responses to the patient (positive or negative) on decision-making, e.g. countertransference
Yin-yang out – Assuming that patients who have already been worked up extensively (‘worked up the yin-yang’) will not benefit from further diagnostic effort
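Base rate neglect is the most readily quantified of these biases. The sketch below is a minimal illustration using Bayes’ theorem; the prevalence and test characteristics are hypothetical figures chosen only to make the point, not values from the source.

```python
# Minimal illustration of base rate neglect using Bayes' theorem.
# All figures are hypothetical, chosen only to make the point.

def posterior_probability(prevalence: float, sensitivity: float, specificity: float) -> float:
    """P(disease | positive test) from prevalence and test characteristics."""
    true_positives = prevalence * sensitivity
    false_positives = (1 - prevalence) * (1 - specificity)
    return true_positives / (true_positives + false_positives)

# A test that is 95% sensitive and 95% specific feels near-definitive
# (Type 1 thinking), but if the disease affects only 1 in 1,000 patients
# the base rate dominates:
p = posterior_probability(prevalence=0.001, sensitivity=0.95, specificity=0.95)
print(f"P(disease | positive test) = {p:.1%}")  # ~1.9%
```

Despite the apparently impressive test, fewer than 1 in 50 positive results would represent true disease; ignoring the 1-in-1,000 base rate is exactly the error the bias describes.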
Strategies (Croskerry 2013)
Educational
- Clinical reasoning theory sessions
- Bias inoculation – Develop specific tools to assess for affective and cognitive bias and for subsequent debiasing
- ‘Consider the opposite’ marginally reduced anchoring in judgements of personality traits
- ‘Cognitive forcing strategies’ show variable minor benefits
- Simulation training incorporating bias scenarios
- Learning by observation of others’ biases/mistakes
- Specific training for specific content areas e.g. radiology training to identify SAH
Workplace
- Get more information
- Structured data acquisition – e.g. differential diagnosis checklist tools
- Affective debiasing – Deliberately consider emotional context of decisions
- Metacognition, reflection and mindful practice
- Slowing down strategies
- Encourage scepticism
- Recalibrate – i.e. identify high-risk situations and alter thinking patterns accordingly
- Group decisions – Second opinions may improve rationality
- Personal accountability – People make better decisions when accountable for them
- Supportive environments – Avoiding fatigue, hunger, cognitive overload and sleep deprivation. Make protocols/guidance readily accessible
- Exposure control – Limit availability or use of information prior to case review e.g. avoid looking at previous/nursing/resident notes before clinical assessment to avoid inherent biases in them
- Sparklines – Infographics showing prevalence of certain illnesses can reduce base rate bias
- Decision support systems – e.g. ISABEL
- Ensuring feedback is built-in to systems
- Combined approach to ECG interpretation e.g. first-look and systematic
- Decision support tools built into workflow
Nudge
- Low-cost signals in the environment that alter people’s behaviour in a predictable way without forbidding any options or significantly changing economic incentives
- Example: Putting fruit at eye level in the supermarket to predictably increase purchase
Boost
- Seek to improve decision-making power of the individual
- Work with heuristics to make them smarter and more intuitive
- Guided by principle that heuristics work better when the individual’s cognitive skillset and environment work in tandem
- Boosts can therefore exist in the environment OR cognitive skills domains
Nudge plus
- Classical nudges take advantage of the inherent biases in Type 1 thinking while nudge plus builds on this by incorporating a self-reflective component
- The theory is that the classical nudge alters behaviour while the reflective component extends the autonomy of the individual and may promote longer-term behavioural change
The future of cognitive debiasing
Serious Game Development
- Sellier et al. 2019
- Training interventions reliably improve reasoning in specific domains, but have failed to prove beneficial in generalised novel problems unless extensive (e.g. formal statistics courses) or trainees knew they were being tested
- This works in the lab but NOT in the field, where cues to bias are absent
- At worst, training is a Hawthorne effect or could interfere with useful heuristics
- At best, debiasing training may be domain-specific
- Morewedge et al. 2015 showed that a single-shot, game-based training intervention exhibited large and long-lasting debiasing effects in the laboratory context
- Incorporated 4 debiasing strategies
- Warnings about bias
- Teaching its directionality
- Providing feedback
- Extensive coaching and training
- Benefits seemed to be due to personalised feedback across multiple bias-eliciting paradigms and domains
- The authors of this study delivered one game-based intervention targeting confirmation bias to business students either before or after they completed an entirely separate business case analysis; participants were unaware the two tasks were linked, and the case sat in a different paradigm/sphere
- The intervention reduced confirmation bias by ~30%
- Details of the game itself (Missing: The Pursuit of Terry Hughes) are described in:
- Symborski, C.W., Barton, M., Quinn, M.M., Korris, J.H., Kassam, K.S., & Morewedge, C.K. (2017). The Design and Development of Serious Games Using Iterative Evaluation. Games and Culture, 12(3), 252-268
Clinical Decision Support Tools
- Digital support built into the EMR to assist identification of blind spots, broaden differentials and aid decision-making
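As a purely hypothetical sketch of the idea (not the design of any real product such as ISABEL; the complaint-to-diagnosis mapping below is invented for illustration), a minimal blind-spot prompt could compare the documented differential against a ‘must not miss’ checklist for the presenting complaint:

```python
# Hypothetical sketch of an EMR blind-spot prompt.
# A real decision support tool would draw on a curated, maintained
# knowledge base rather than this illustrative hard-coded mapping.

MUST_NOT_MISS = {
    "chest pain": {"acute coronary syndrome", "pulmonary embolism", "aortic dissection"},
    "headache": {"subarachnoid haemorrhage", "meningitis", "giant cell arteritis"},
}

def blind_spot_prompts(complaint: str, differential: set[str]) -> set[str]:
    """Return 'must not miss' diagnoses absent from the documented differential."""
    return MUST_NOT_MISS.get(complaint, set()) - differential

# Usage: a narrow documented differential for headache triggers a prompt,
# nudging against premature closure and search satisficing.
missing = blind_spot_prompts("headache", {"migraine", "tension headache"})
if missing:
    print("Consider also:", ", ".join(sorted(missing)))
```

The design intent mirrors the nudge strategies above: the prompt broadens the differential without forbidding any option.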
Cognitive Load Theory
Tasks broken down into:
- Intrinsic load – The inherent demands of the task itself
- Extraneous (extrinsic) load – Everything not essential to the task itself, e.g. background noise
- Germane load – The background work the brain does to organise information and make sense of it
The aim is to manage intrinsic load, minimise extraneous load and maximise germane load.
ED physicians are interrupted roughly 10 times per hour, with frequent task changing and multi-tasking attempts.
Tools to address cognitive load
- Delegation
- Call for help
- Write more, remember less
- Complete tasks one at a time
- Batch tasks
- Avoid decision density, i.e. spread decisions out to flatten the decision curve
- Close the loop on all directives/decisions
- Manage interruptions
- Make time for quiet e.g. handover, thinking