Today's consumer electronics market resembles an arms race. Companies battle for customers by releasing new devices with an ever-expanding list of features and functions. In advertisements, all these new elements seem very appealing, but in many cases they go beyond what consumers actually need or even want.

To me, Apple stands out as an exception to this rule. Their products generally don't have more features and functions than price-matched competing products.

In fact, they usually have fewer, and are therefore less cluttered with controls, menus, and other distractions. Ask people who purchase iPhones, iPads, and MacBook Airs why they like their devices, and they frequently gush about the overall simplicity and user-friendliness.

Healthcare also has its own arms race. Doctors, hospitals, biotech, and diagnostic companies seemingly conspire to add even more products and services to our arguably already over-featured healthcare system. As laboratory professionals know, laboratory menus are expanding rapidly and physicians tend to adopt new tests much faster than they abandon old ones. On top of that, next-generation sequencing and other genome-based tests threaten to accelerate the volume growth in laboratory data.

In terms of patient safety, I think we need to ask ourselves how the addition of more laboratory data will impact the doctors who must use it to make a diagnosis.

Complexity: A Major Contributor to Diagnostic Errors

In a 2005 study, Graber and colleagues analyzed the contributing factors to 100 cases of diagnostic error in internal medicine (1). Most of the cases involved both system and cognitive factors. Of the system factors, the vast majority were categorized as organizational failures such as faulty communication processes as opposed to purely technical failures like instrument malfunctions. Of the cognitive factors, the largest category was information processing, followed by verification, data gathering, and knowledge.

A reasonable inference from this study is that system and data complexity are major contributors to diagnostic error. It strikes me that system complexity might not be such a problem if our information systems weren't burdened with so much data, and that data complexity might not be such a problem if healthcare systems were better equipped to handle it. The bottom line, though, is that there is simply too much data for doctors to handle safely given healthcare's complicated organizational processes.

Doctors: Do They Use All the Data?

Doctors may think they take advantage of all the available diagnostic data, but studies suggest otherwise. Humans are not very good at metacognition, the intuitive understanding of how our own minds work. So although doctors want to believe that they incorporate multiple data points into their diagnostic decisions in sophisticated cognitive ways, they likely use only a small subset of the available data, and in fairly simple ways.

A 1983 study of rheumatoid arthritis assessment illustrates this nicely (2). The authors provided two rheumatologists with patient case histories containing five pieces of quantitative information: articular index; functional capacity; pain level; time-to-resolution of early morning stiffness; and the patient's global assessment. The physicians were then asked to weight each of the five variables and rate the current disease activity for each patient on a scale of zero to 100. From these ratings, the authors constructed a regression model to determine how much weight the physicians actually placed on each variable. The model showed that although the two physicians believed they integrated all of the available data into their final decisions in a balanced fashion, each of them actually made their assessments almost entirely on the basis of a single variable (Figure 1, below).

Figure 1. Perceived vs. Actual Contributions to Diagnosis (two panels, one per physician)
When given the same data set, two physicians rated their use of the data (perceived) versus what researchers observed (actual). Rather than using all the data, both physicians relied heavily on one data point. Source: Reference 2. Revised with permission.
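The study's judgment-analysis approach can be sketched in a few lines of code. The sketch below uses synthetic data, not the study's data: it simulates a rater who believes they weigh five cues but in practice keys almost entirely on the first, then fits a linear regression of the ratings on the cues to recover the weights the rater actually used.

```python
import numpy as np

rng = np.random.default_rng(0)

# 50 hypothetical patient profiles, each with five cues scored 0-100,
# standing in for the study's five variables (articular index, functional
# capacity, pain, morning stiffness, patient global assessment).
cues = rng.uniform(0, 100, size=(50, 5))

# Simulate a physician who believes they balance all five cues but in
# practice relies almost entirely on the first (plus a little noise).
true_weights = np.array([0.90, 0.03, 0.03, 0.02, 0.02])
ratings = cues @ true_weights + rng.normal(0, 2, size=50)

# Regress the ratings on the cues; the fitted coefficients estimate the
# weight the rater actually placed on each cue.
coef, *_ = np.linalg.lstsq(cues, ratings, rcond=None)
relative = coef / coef.sum()

print(np.round(relative, 2))  # first weight dominates the other four
```

The gap between `true_weights` as the rater imagines them and the regression's recovered weights is exactly the perceived-versus-actual gap the study measured.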
In another study, this one outside the medical field, researchers gave horse race handicappers data sets of varying sizes and asked them to predict the winners (3). When given five relevant pieces of information for each horse, the subjects predicted the winners with approximately 17% accuracy. Their confidence in their predictions was about 19%; in other words, it correlated well with their accuracy.

As the researchers provided more pieces of information, however, the handicappers believed their predictions were becoming more accurate. In fact, their accuracy stayed about the same. At 40 data points, their confidence exceeded their accuracy by a factor of nearly two.
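The divergence between confidence and accuracy is easy to quantify as a calibration ratio. The numbers below are illustrative, chosen to approximate the pattern described in reference 3 rather than to reproduce its exact data: accuracy stays flat as data points accumulate while confidence climbs.

```python
# Illustrative calibration check (hypothetical numbers approximating the
# pattern in reference 3): accuracy holds steady while confidence rises.
data_points = [5, 10, 20, 40]
accuracy = [0.17, 0.17, 0.17, 0.17]    # fraction of winners correctly picked
confidence = [0.19, 0.24, 0.29, 0.34]  # self-reported chance of being right

# A ratio of 1.0 means perfect calibration; above 1.0 means overconfidence.
overconfidence = [c / a for c, a in zip(confidence, accuracy)]
for n, ratio in zip(data_points, overconfidence):
    print(f"{n:>2} data points: overconfidence factor = {ratio:.1f}")
```

With these numbers, the factor grows from roughly 1.1 at five data points to 2.0 at forty, matching the "factor of nearly two" pattern.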

Table. Examples of the Negative Consequences of Too Much Laboratory Data

Excess laboratory data: Daily inpatient complete blood count and basic metabolic profile in a stable patient.
Potential negative consequence: Unnecessary data could distract the clinical team from focusing on other data that more accurately reflect changes in the patient's clinical issues.

Excess laboratory data: Complete metabolic profile on an asymptomatic adult during a preventive care visit.
Potential negative consequence: An isolated result outside the reference interval could lead to additional testing and follow-up visits.

Excess laboratory data: HPV testing on a teenage girl.
Potential negative consequence: A positive result could lead to colposcopy and cervical biopsy, despite the fact that most HPV infections resolve on their own (4).

Excess laboratory data: Direct-to-consumer genetic testing in an asymptomatic patient.
Potential negative consequence: Discovery of a low-penetrance mutation could lead to aggressive follow-up and potentially risky anticoagulant therapy (5).

Patient Safety: The Gamble

I believe the extreme complexity of clinical medicine today will likely contribute to the growing problem of medical errors. As laboratory professionals, we need to be aware that laboratory data is a significant and growing contributor to this complexity.

In gambling, overconfidence may lead individuals to place foolish bets and lose a lot of money. But while gamblers suffer only financially from their poor decisions, in medicine, physician overconfidence can be a major patient safety problem.

For example, a doctor may settle on a diagnosis before adequately ruling out competing diagnoses. This puts patients at risk in two ways: not only might they be exposed to unnecessary therapy, but they might also be deprived of therapies they actually need. Other examples are presented in the table.

So what's the solution to this dilemma? I suggest the medical field heed the advice of Apple's legendary CEO Steve Jobs. If we can find ways to focus doctors' attention on just those laboratory data points actually needed for diagnosis, we may be able to improve patient safety.

REFERENCES

  1. Graber ML, Franklin N, Gordon R. Diagnostic error in internal medicine. Arch Intern Med 2005;165:1493–9.
  2. Kirwan JR, Chaput de Saintonge DM, Joyce CR, et al. Clinical judgment in rheumatoid arthritis. II. Judging ‘current disease activity' in clinical practice. Ann Rheum Dis 1983;42:648–51.
  3. Central Intelligence Agency. Psychology of intelligence analysis. The CIA Library website. (Accessed May 2013).
  4. American Society for Colposcopy and Cervical Pathology. 2012 updated consensus guidelines for the management of abnormal cervical cancer screening tests and cancer precursors. J Lower Genital Tract Dis 2013;17:S1–27.
  5. Tenenbaum JD, James A, Paulson-Nuñez K. An altered treatment plan based on direct to consumer genetic testing: Personalized medicine from the patient/pin-cushion perspective. J Personalized Med 2012; 2:192–200.
Brian Jackson



Brian Jackson, MD, MS, is vice president and chief medical informatics officer at ARUP Laboratories, Salt Lake City, Utah, and a member of the Patient Safety Editorial Advisory Board.



Email: [email protected]