July 2011 Clinical Laboratory News: The Slippery Slope of Errors

The Slippery Slope of Errors 
Steps Labs Can Take to Avoid the Normalization of Deviance

By Karen Appold and Michael Astion, MD, PhD

People who prefer to sleep in often end up rushing to get to work on time. To make up for the lost time, they may multitask, combing their hair and using their mobile devices while driving. Obviously, these behaviors increase the chances of a serious accident. But late sleepers who have never been in a wreck gradually adopt these risky actions as part of their normal routine. They may even rationalize that the extra sleep improves their work performance. Despite that little voice telling them they should go to bed earlier instead of risking an accident, their unsafe morning routine has become acceptable to them. This is an example of the normalization of deviance.

In the workplace, the normalization of deviance refers to the gradual acceptance of incidents or activities that were initially defined as deviant and unacceptable. Typically this occurs because organizations fail to deploy their mission, vision, and goals at the ground level with employees. Factors that favor the normalization of deviance include inadequate or incompetent supervision, time pressure, and resource scarcity.

Lab Examples of the Normalization of Deviance

  • Repeating quality control until it is in range
  • Allowing relabeling of mislabeled specimens that are replaceable
  • Allowing physicians to opt out of a critical value policy
  • Accepting outdated specimens
  • Labeling specimens in batch at the nurses’ station
  • Accepting a culture of rudeness on the phone
  • Ordering a CBC, TSH, and basic metabolic panel on all ambulatory patients

Lessons Learned from The Challenger

Diane Vaughan, PhD, coined the term “normalization of deviance” in her book, “The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA.” Vaughan is a professor in the Departments of International and Public Affairs and Sociology at Columbia University in New York.

She concluded that the Space Shuttle Challenger disaster on January 28, 1986, which killed all seven crew members, resulted from the normalization of deviance rather than from misconduct. The events leading up to this tragic accident are a good example of how the normalization of deviance transpires. Organizations typically accept an initial deviant occurrence or two because it is unintended and causes little or no harm. However, these initial incidents establish a precedent for accepting more and more deviance.

In the case of the Challenger, when the first deviation from the expected performance of the O-ring pressure seals occurred during flight, the problem was fixed. Over time, more damage occurred on subsequent flights, and eventually more serious damage became acceptable to NASA. “Because failure did not occur and the causes of the problem continued to change, the pattern continued,” Vaughan explained. “It became normal to fly with the flawed design.” Eventually, though, problems with the O-rings caused the Challenger disaster.

The normalization of deviance often results from patterns of information that disguise the seriousness of the problem. For example, as NASA moved forward with launches, each decision seemed rational. “Technical deviations indicating something was wrong were interspersed with information that all was well,” Vaughan said. “What in retrospect seemed like strong warning signals of danger prior to the accident did not have the same meaning to insiders at the time events were occurring. Signals were mixed, weak, and routine.”

Don’t Let Deviance Become the Norm

In the laboratory, normalization of deviance tends to occur when employees become less sensitive to slight changes or take small shortcuts to improve work performance, according to Mark R. Chassin, MD, president of The Joint Commission in Oakbrook Terrace, Ill. The problem is that laboratory professionals don’t realize this behavior can cause unsafe situations that eventually can harm patients. Chassin believes they fall into this behavior when strong safety systems aren’t in place and the lab doesn’t have a clear commitment to follow safety protocols without fail.

As a former commissioner of the New York State Department of Health, Chassin oversaw medical institutions’ quality programs and had the opportunity to observe normalization of deviance events in clinical laboratories. He recalled one instance in which patient and specimen identification labels were slightly cut off as they were dispensed from the label maker. As the problem worsened, some letters and numbers were omitted. “That small abnormality went unreported until it caused several patient misidentifications,” Chassin said. “Had people been more attentive to that slight change and called it to the attention of the right individuals sooner, it wouldn’t have led to the misidentification of patients and specimens.”

Another example involved reporting of highly abnormal test results. A technologist had difficulty reaching clinicians when reporting critical lab values. He became satisfied with leaving the information with a clerk or non-clinician, even though the protocol called for leaving it with a nurse or doctor. Since nothing bad resulted from the technologist’s actions, this deviation continued. He eventually developed a habit of being less careful about adhering to the critical value policy, even though this increased the likelihood of an adverse event.

Taking Action

The Joint Commission has put in place several standards to help labs combat the normalization of deviance, and it has established safety guidelines that identify situations that increase the likelihood of patient harm.

“If there is a problem, no one should feel intimidated about identifying it, regardless of how junior an individual may be,” Chassin said. “For example, a lab technologist asks a doctor, ‘Are you sure you want to order this test?’ The doctor replies, ‘That’s a really stupid question; it is on the ordering sheet.’ This kind of intimidating behavior, although subtle, is likely to result in the technologist not questioning the doctor again, even if he or she feels that the doctor is creating an unsafe situation for patients. Behavior that diminishes communication needs to be eliminated when building a safety culture. Otherwise, such behavior promotes the normalization of deviance.”

Looking for The Joint Commission Standards?

The Joint Commission standards are available in print and electronic formats and can be purchased online from Joint Commission Resources.

SUGGESTED READING

Vaughan D. The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA. Chicago: University of Chicago Press; 1996.

Karen Appold is an editorial consultant for the clinical laboratory industry. 
Email: karenappold@comcast.net

Patient Safety Focus Editorial Board

Chair
Michael Astion, MD, PhD
Seattle Children's Hospital
Seattle, Washington

Members
Peggy A. Ahlin, BS, MT(ASCP)
Consultant
Salt Lake City, Utah 

Corinne Fantz, PhD
Emory University
Atlanta, Georgia

James S. Hernandez, MD, MS
Mayo Clinic Arizona
Scottsdale and Phoenix

Brian R. Jackson
ARUP Laboratories
Salt Lake City, Utah

Sponsored by ARUP Laboratories, Inc.