Patient Safety Focus: Patient Safety Concepts


The Monday Morning Quarterback
Hindsight Bias in Medical Decision Making

By Karen Appold

Have you ever heard someone say, “I knew that was going to happen”? Predicting the future always becomes much easier once the future is in the past. Sayings like “hindsight is 20/20” and “Monday morning quarterback” are popular ways of describing such predictions. These expressions refer to what psychologists call hindsight bias: the tendency to regard past events as expected or obvious, even when, in real time, they perplexed the people involved.

[Figure caption] Monday morning quarterbacking in medical decision making can be minimized if decisions are evaluated based on the information available at the time of the decision, and not after the outcome is known.

Safety analysis experts are now applying the hindsight bias model to medical decision making. Here hindsight bias describes the tendency to judge the events leading up to an accident as errors because the bad outcome is known (1).

Take, for example, the way in which clinicians interpret a lab result. A clinician has a certain amount of information that allows him to assess the probability of a diagnosis or the likelihood that a condition will worsen, and he makes decisions based on those probabilities. When authorities, such as a consulting clinician or an expert witness, assess whether the treating physician made a good decision without knowing the outcome, they will often support the decision. But if they learn that the patient had a negative outcome, they are more likely to think the decision was poor. In other words, knowing the outcome in the present affects the experts’ analysis of events in the past.

This thinking process is a universal human phenomenon. “Cognitive psychologists have realized that hindsight bias is almost unavoidable. When a negative outcome occurs, the tendency is to think that if a different action had been taken, the outcome could have been prevented,” explained Rodney A. Hayward, MD, director of the Robert Wood Johnson Foundation Clinical Scholars Program and professor of medicine and public health at the University of Michigan in Ann Arbor. “Ultimately, a decision is made with the intent to reduce the risk of something bad happening. But any action bears certain costs and risks.”

Hayward recalls a particular scenario involving a lab test result that shows how hindsight bias can easily happen. A patient’s creatinine level increased from 1.3 to 1.6 units over 3 months. The patient appeared to be fairly stable, so the clinicians weren’t too concerned and scheduled a follow-up visit in 3 months. Three weeks later, however, the patient was hospitalized because his creatinine level spiked. Upon reflection, the clinicians questioned whether they should have re-checked the patient sooner.

“But the reality is that the difference between those two creatinine levels is not that unusual and is almost within the range of lab error,” Hayward said. “In addition, it is unclear if that slight bump was a warning sign. If we re-checked everyone with such a slight increase, we would constantly inconvenience patients and it would be costly.”
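Hayward’s appeal to “the range of lab error” can be made concrete with the reference change value (RCV), a standard laboratory statistic for deciding whether the difference between two serial results exceeds what analytical and within-subject biological variation alone would produce. The Python sketch below is illustrative only: the coefficients of variation are assumed values chosen for the example, not figures from this article or from Hayward.

```python
import math

def reference_change_value(cv_analytical, cv_biological, z=1.96):
    """Two-sided 95% reference change value, in percent: the smallest
    difference between two serial results that is unlikely to be
    explained by analytical plus within-subject biological variation."""
    return math.sqrt(2) * z * math.hypot(cv_analytical, cv_biological)

# Illustrative CVs only -- assumptions for this sketch, not sourced values.
cv_a = 4.0   # assumed analytical CV for creatinine, percent
cv_i = 7.0   # assumed within-subject biological CV, percent

rcv = reference_change_value(cv_a, cv_i)    # about 22.3%
observed = (1.6 - 1.3) / 1.3 * 100          # about 23.1%, Hayward's example

print(f"RCV: {rcv:.1f}%   observed change: {observed:.1f}%")
```

With these assumed CVs, the rise from 1.3 to 1.6 barely crosses the threshold; slightly different assumptions would put it below. That ambiguity is Hayward’s point: judged prospectively, the bump is a borderline signal, not an obvious warning.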

This is a good example of hindsight bias in medical decision making: when something negative happened (the patient was hospitalized), the original decision (not to re-check him earlier) was questioned. But if the decision had been evaluated in real time, most likely everyone would have agreed that a routine 3-month follow-up appointment was appropriate and that the slightly higher creatinine level was not of concern.

To minimize hindsight bias in this instance, clinicians should ask the question prospectively: should every patient with such a small change in creatinine level over a 3-month period be rechecked weekly? In general, a good technique for minimizing hindsight bias is to formally consider alternative explanations that do not involve errors.

Examples of Hindsight Bias

  • “I knew they were going to lose.”
  • “That’s exactly what I thought was going to happen.”
  • “I saw this coming.”
  • “That’s just common sense.”
  • “I had a feeling you might say that.”

In fact, making too many changes based on small random errors in lab tests can actually have negative outcomes. “Some research suggests that if you check an international normalized ratio (INR) to monitor warfarin therapy too frequently and make constant adjustments, you will actually do less good than if you check it at slightly longer intervals and make smaller adjustments. This is because you may overcorrect for random variations in INR by getting too much information from lab tests,” Hayward explained.
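The overcorrection effect Hayward describes is a general statistical phenomenon: when a measurement is noisy, reacting fully to every observed deviation feeds the measurement noise back into the quantity being controlled. The toy simulation below is a minimal sketch of that effect; it is not a model of warfarin pharmacology, and all parameters (measurement noise, patient drift, correction gain) are invented for illustration.

```python
import random

random.seed(0)

MEAS_SD = 0.25     # assumed SD of measurement noise on each INR check
DRIFT_SD = 0.10    # assumed random drift of the true INR between checks
STEPS = 100_000

def simulate(gain: float) -> float:
    """Return the SD of the true INR around its target when each
    measured deviation triggers a dose change that shifts the true
    INR by -gain * (measured deviation)."""
    err = 0.0        # true INR minus target
    sq_sum = 0.0
    for _ in range(STEPS):
        err += random.gauss(0, DRIFT_SD)            # patient drifts
        measured_dev = err + random.gauss(0, MEAS_SD)
        err -= gain * measured_dev                  # adjust the dose
        sq_sum += err * err
    return (sq_sum / STEPS) ** 0.5

for gain in (0.1, 0.3, 0.5, 1.0):
    print(f"correction gain {gain:.1f}: true-INR SD = {simulate(gain):.3f}")
```

Under these assumptions, fully canceling each measured deviation (gain 1.0) leaves the true INR more variable than making smaller, partial adjustments, which mirrors the research Hayward cites.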

To some degree, hindsight bias can’t be avoided. “When we criticize certain decisions and policies, we need to consider the cost, as well as the risk and consequences of alternative actions,” Hayward said.

One problem with hindsight bias is that it leads us to criticize or even punish healthcare workers whose decisions are followed by poor patient outcomes. To avoid these negative consequences, Hayward advises against adopting new policies or punishing employees without reflecting on the full decision-making process. It is better to evaluate whether the decision was reasonable given the information available at the time and to determine the best action to take in the future.

REFERENCE

  1. Hindsight bias. Available online. Accessed August 13, 2010.

Karen Appold is an editorial consultant for the clinical laboratory industry. 
Email: karenappold@comcast.net


Patient Safety Focus Editorial Board

Chair
Michael Astion, MD, PhD
Department of Laboratory Medicine 
University of Washington, Seattle

Members
Peggy A. Ahlin, BS, MT(ASCP)
ARUP Laboratories
Salt Lake City, Utah

James S. Hernandez, MD, MS
Mayo Clinic Arizona
Scottsdale and Phoenix

Devery Howerton, PhD
Centers for Disease Control and Prevention
Atlanta, Ga.

Sponsored by ARUP Laboratories, Inc.