Rooting out the cause of medical errors in the laboratory is a daunting, but crucial undertaking. The path that a specimen travels from when it’s drawn to when it reaches an assay is full of opportunities for errors, including mislabeling, contamination, and even loss of the specimen. Laboratory directors who want to pursue an error-free environment are reorganizing their labs according to workflow principles that are the foundations of programs such as Lean and Six Sigma. This month, Strategies looks at how one lab is confronting, resolving, and preventing errors as part of a multi-year effort to improve quality.
Located in Pennsylvania, Geisinger Medical System’s laboratory network is vast, taking in and processing samples across the state at inpatient and rapid response facilities, physician offices, hospitals, and outreach sites. To manage this traffic, the system’s Chairman of Laboratory Medicine, Conrad Schuerch, MD, knew he had to uncover and then tackle errors by tracking specimens’ movements through the lab system’s network.
Schuerch traces the start of his institution’s error reduction effort to 2000, when Geisinger established a centralized call center to handle all lab-related inquiries. The call center representatives had a traditional customer service mission—to resolve and prevent problems, just as customer service reps do at other organizations. “We wanted to improve the service level and, because the operation was big, we wanted to get the calls regarding questions, add-on tests, and orders into a call center so medical technologists wouldn’t have to take time away from the bench to respond to all of these outside phone calls,” he explained.
One of the best ways to address problems identified through the call center was to communicate them to the rest of the staff, prompting Schuerch to institute a daily meeting called the “morning report” in 2001. During the morning report, call center representatives, medical technologists, information systems specialists, and laboratory managers meet not only to present problems, but also to resolve them and identify how to prevent them in the future. Issues such as computer and instrument downtime, as well as specimen mislabeling or processing errors, are all brought out at the meeting.
“At the morning report, we try to fix the problem,” said Schuerch. “If someone comes in and says this specimen was not correctly divided after it arrived, we may unearth three or four problems that contributed to the negative outcome through our morning discussion. We’ll ask questions about an incident—why it went to the wrong part of the lab, why someone didn’t find it and pick it up, and why it wasn’t discovered earlier.” There are often probing questions about whether similar error mechanisms could affect other patients now or in the future. Responsible parties claim their assignments, and the problem remains active in the meeting minutes until it is resolved and preventive measures are in effect, he added.
Of course, not every problem is resolved in a morning meeting; in fact, most problems won’t be resolved within one day, which is why all of these issues are entered into and tracked in an error database. “Recurring problems are recognizable in our error database,” said Schuerch. “These and other particularly complex problems, which may cross many departmental boundaries, become the subject of formal performance improvement projects.”
Creating an Error-Reducing Culture
Four years after beginning the morning report, Schuerch said that problems are resolved faster and better, and some are even prevented, and the meeting usually takes only 10–25 minutes. But he admitted that during the early days, it took some time before an open dialogue on the problems took hold. Many people get defensive and deny responsibility when confronted with errors, but staff have to take ownership of these errors for such a culture to develop. “That’s why we put the data in front of them,” said Schuerch, referring to posting error rates in the different areas of the lab. For example, in the phlebotomy area, each staffing group’s contamination rates are posted, along with their improvement rates. “They own it and they work on it,” he added.
Establishing this error-reducing culture is management’s responsibility, advised Schuerch. For instance, if laboratory directors want their staff committed to error-reduction systems like the morning report, they need to make that commitment part of each employee’s annual performance evaluation. “You have to make these your performance parameters if you are serious about it; then it will begin to percolate,” said Schuerch. “You have to make your staff talk about error reduction at every meeting by making it an agenda item. Even though it’s being driven from the top, people do want to own quality.” But the staff needs to break out of the old ‘we’re doing a good job’ paradigm, he added. “They need to start focusing on the little errors and deficiencies and realize that lots of these can cause problems for a patient.”
Julie McDowell is the Editor of Strategies. She can be reached by email.
From the Strategies Mailbag
Re: New Diagnostic Tools to Assess Stroke Risk: A Look at the Single vs. Multimarker Approaches, By Julie McDowell, July 14, 2005 (http://online.aacc.org/AACC/publications/strategies/071405.htm)
Dear Ms. McDowell:
I read with great interest your article on new diagnostic tools to assess stroke risk, recently published in Strategies. No individual can be an expert on all tests, and the article, though well written, overlooks some relevant information about Lp-PLA2, so I point out the following:
In the two studies directly addressing cerebrovascular accidents (CVA), Lp-PLA2 does predict stroke independently of all other risk factors, including hsCRP. In other, smaller studies, Lp-PLA2 has augmented the predictive accuracy for cardiovascular events. This does not mean that it is not synergistic with other markers, but it does identify a subset of patients at increased risk for stroke. In that conventional sense, Lp-PLA2 does predict stroke alone. However, the labeled indication in the package insert clearly says, “to be used in conjunction with clinical evaluation and patient risk assessment as an aid in predicting risk for coronary heart disease, and ischemic stroke associated with atherosclerosis.” The important fact is that it provides independent information, over and above all traditional risk factors, to assess stroke risk.
The ARIC study was, in fact, a prospective trial. It contained nearly 200 stroke cases and used a selection of over 700 non-cases from the cohort of over 12,000 subjects. ARIC was a study of long-term risk, which is very different from a trial evaluating a marker for acute care, and the suggested use of Lp-PLA2 makes that clear. The Rotterdam study (Oei et al, Circulation 2005) provides a very potent confirmation of the fact that Lp-PLA2 predicts long-term risk of ischemic stroke.
I agree: not only is it “unrealistic,” it is inappropriate to think of Lp-PLA2 as a test similar to troponin. Lp-PLA2 is a more chronic risk assessment tool, far more like LDL in its clinical application than like troponin. None of the published research suggests that Lp-PLA2 should be compared with troponin, nor should it be compared directly to other markers designed to diagnose a stroke acutely, which is why the Triage testing was developed. The PLAC test is, as stated, “the first blood test designed to assess a patient's risk of ischemic stroke.” This is a first-in-class test for ischemic stroke risk stratification and thus has the potential to provide clinicians with an important new tool.
Joseph P. McConnell, Ph.D., DABCC
Co-director: Biochemical Genetics Laboratory
Cardiovascular Risk and Porphyria Sections
The Mayo Clinic and Foundation
Rochester, MN 55905