Artificial intelligence, supercomputers, and predictive analytics may be the wave of the future, but clinical laboratorians shouldn’t be intimidated by these buzzwords. Labs don’t need IBM Watson and a huge research budget to reap the benefits of informatics for improving laboratory efficiency and quality. Plenty can be accomplished working on desktop computers with free or relatively simple software, said Christopher McCudden, PhD, DABCC, FAACC, FCACB, a clinical biochemist at Ottawa Hospital in Ottawa, Ontario, Canada.
McCudden, who taught himself the statistical programming language R, uses data for big research projects and system improvement initiatives, such as tracking down doctors who seem to be ordering unnecessary or expensive tests. He also uses R for simple tasks, such as figuring out whom to contact about updates to a test: he simply runs a report to find out who orders the test in question most frequently.
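A report like the one McCudden describes can be as simple as counting orders per provider. The article mentions R; the minimal sketch below uses Python for illustration, and the order records and provider names are hypothetical:

```python
from collections import Counter

# Hypothetical LIS export: (test_code, ordering_provider) pairs.
orders = [
    ("IFE", "Dr. A"),
    ("IFE", "Dr. B"),
    ("TSH", "Dr. A"),
    ("IFE", "Dr. B"),
]

# Count how often each provider orders the test in question.
ife_counts = Counter(prov for test, prov in orders if test == "IFE")
top_provider, n_orders = ife_counts.most_common(1)[0]
```

In practice the list would come from an LIS or data warehouse extract, but the counting step is no more complicated than this.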
Getting to this point took a lot of work, but his laboratory has become a resource within the wider healthcare system for data analysis and information, McCudden emphasized. “We bring information to the table rather than just pushing out a result,” he said. There are many ways for laboratorians to get started with informatics. McCudden outlined several in an opinion piece he co-authored last year (J Lab Precis Med 2017; doi: 10.21037/jlpm.2017.09.07).
What Data Can Do
A great place to start with lab informatics is in improving test utilization, according to Julia Drees, PhD, DABCC, scientific director of clinical chemistry at Kaiser Permanente Regional Laboratory, Northern California in Richmond. “That’s very simple but very powerful because if you suspect that people are ordering this expensive test wrong, you can say, who’s ordering it?” Drees commented.
One of her colleagues analyzed laboratory data to figure out why clinicians were ordering immunofixation electrophoresis (IFE) multiple times a year. This expensive test is best suited for identifying the paraprotein clone at diagnosis, and the clone rarely changes over the course of disease. But the IFE test at their laboratory also provided levels of IgG, IgA, and IgM.
It turned out that it was this quantitation the providers wanted with their frequent orders. They used it to determine if levels were decreasing with treatment. The lab was able to redirect the majority of repeat orders to the less expensive test that simply quantifies IgG, IgA, and IgM.
Another of Drees’ colleagues, a physician, performed an analysis to show his fellow clinicians that they were frequently ordering vitamin D tests on patients older than age 70, even though the data showed those patients were among the least likely to have insufficient vitamin D levels, probably because they were taking multivitamins.
A second valuable use for laboratory data is in identifying populations for reference ranges. Using data compiled from several sources, Drees’ laboratory can select a healthy reference population by excluding samples from certain patients whose charts contain any diagnosis codes, prescriptions, or results related to the disease or condition of interest. This gives the staff confidence that the reference range is derived from a truly healthy population, she said.
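The exclusion logic Drees describes amounts to a simple filter over chart data. The sketch below is an illustration of the approach, not her laboratory's actual pipeline; the diagnosis-code prefixes, drug names, and field names are hypothetical:

```python
# Hypothetical exclusion criteria for a glucose reference range:
# drop anyone whose chart shows diabetes-related codes or prescriptions.
exclusion_dx_prefixes = ("E10", "E11")      # illustrative ICD-10 prefixes
exclusion_rx = {"metformin", "insulin"}

patients = [
    {"id": 1, "dx": ["I10"],   "rx": ["lisinopril"], "glucose": 92},
    {"id": 2, "dx": ["E11.9"], "rx": ["metformin"],  "glucose": 160},
    {"id": 3, "dx": [],        "rx": [],             "glucose": 88},
]

def is_eligible(patient):
    """True if the chart shows no evidence of the condition of interest."""
    no_dx = not any(code.startswith(exclusion_dx_prefixes) for code in patient["dx"])
    no_rx = exclusion_rx.isdisjoint(patient["rx"])
    return no_dx and no_rx

# Only results from the filtered, presumed-healthy population remain.
reference_values = [p["glucose"] for p in patients if is_eligible(p)]
```

A real implementation would also exclude on related laboratory results, as described above, but the pattern is the same: define the exclusion sets, then filter.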
Another simple but powerful use of laboratory informatics is to monitor and improve turnaround time. Daniel S. Herman, MD, PhD, assistant professor of pathology and laboratory medicine at the Hospital of the University of Pennsylvania in Philadelphia, monitors how quickly the lab returns troponin results for emergency patients. The laboratory has been monitoring this with monthly reports and is now collaborating with data specialists to see if they can produce daily and, ultimately, live dashboards.
“There are various factors that impede our ability to quickly perform our testing and return [results] to clinicians,” Herman said. “Building these reports will allow us to identify which ones we could actually improve upon.”
His laboratory also used data to improve hemoglobin A1c turnaround time. The staff started by setting a goal to return results by the end of the day for all specimens received by noon and by noon the next working day for the rest. Technologists used the turnaround time reports to identify problematic testing factors, such as higher volume on Mondays and delays for repeat tests, and suggested operational changes. They are now meeting their goal for more than 97% of test orders, Herman said.
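The A1c goal described above translates directly into a due-time rule that a script can check against timestamps. A minimal sketch, using hypothetical specimen timestamps and ignoring weekends and holidays for brevity:

```python
from datetime import datetime, timedelta

def due_time(received):
    """Due by end of day if received by noon, else by noon the next day."""
    if received.hour < 12:
        return received.replace(hour=23, minute=59, second=0)
    next_day = received + timedelta(days=1)
    return next_day.replace(hour=12, minute=0, second=0)

# Hypothetical (received, reported) timestamp pairs for A1c specimens.
specimens = [
    (datetime(2018, 3, 5, 9, 30), datetime(2018, 3, 5, 15, 0)),  # on time
    (datetime(2018, 3, 5, 14, 0), datetime(2018, 3, 6, 10, 0)),  # on time
    (datetime(2018, 3, 5, 10, 0), datetime(2018, 3, 6, 8, 0)),   # late
]

on_time = sum(reported <= due_time(received) for received, reported in specimens)
pct_on_time = 100 * on_time / len(specimens)
```

Run over a month of orders, a calculation like this yields the kind of percentage Herman's laboratory tracks against its goal.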
While Herman and Drees both work at larger institutions and have support from data analytics experts, projects on a smaller scale still produce actionable results, according to experts.
Even before digging into data projects, one of the most foundational informatics tasks a laboratory should undertake is making sure the laboratory information system (LIS) and electronic medical record system are working effectively, noted Brian R. Jackson, MD, MS, associate professor of pathology at the University of Utah and medical director of IT and pre-analytic services at ARUP Laboratories in Salt Lake City. “That’s not sexy, but it’s what every lab needs to do,” said Jackson, who also serves on CLN’s Patient Safety Focus editorial board.
This includes actively managing the test menu options so that clinicians can quickly find the appropriate tests. For example, if the hospital infectious diseases team has decided that a certain respiratory virus panel is the appropriate first-line test for influenza A in hospitalized patients, a clinician likely will search the menu for “influenza” and may not find the right test if it’s listed under “r” for “respiratory.”
Jackson also recommends performing chart reviews based on what clinicians see online, not just how results look on paper. In addition, labs should ensure they have a good test directory website and that stakeholders know about it and can find it. Finally, he suggests that labs make it easy for clinicians to contact them by getting into the loop with whichever systems clinicians use to communicate, such as secure text messaging.
“If the lab is not doing that stuff really well, then it’s not usually going to be well positioned to go beyond that in terms of interesting informatics,” Jackson said.
Beyond the Basics
For laboratorians who are ready to take the next step, the first hurdle is getting access to data, McCudden said. LISs generally are built for inputting data—not taking data out. Access will require help from the information technology team and often support from administrators, who may worry about the security and privacy of the data.
Then there is the question of tools. Basic spreadsheet programs, such as Excel, can be useful for small datasets, McCudden said. However, laboratorians who expect to analyze large datasets regularly and who want automated reports should consider third-party software such as Tableau or Microsoft’s Power BI, or even learn to program in R (the software is free, and there are many free online educational resources).
Though necessary, tools are secondary to the business problem at hand, said Jackson at ARUP. “It’s important for labs to remember, it’s not about technology,” Jackson said. “It’s really fundamentally about the problem you’re trying to solve. … The technology makes it easier, but it doesn’t do it for you.”
Often the data are only a starting point, McCudden agreed. “It’s not a substitute for talking to people, holding meetings, and implementing change management,” he said.
A first look at the data should always include a reality check to make sure it’s not garbage, McCudden noted. Is one doctor ordering all the tests for an entire division? Is it taking 6 hours to report a troponin result? Something may be wrong with the data. Check your findings using knowledge of the laboratory. Talk to people in other departments. The last thing the laboratory needs is a glossy report based on bad data. “It’s one of the biggest risks,” he said. “You can get 10,000 rows of data really easily, but it could be a massive pile of junk if it’s not validated.”
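Reality checks like the ones McCudden mentions are easy to automate before any report goes out. The sketch below shows the idea; the field names and the six-hour turnaround threshold are illustrative assumptions, not rules from the article:

```python
from datetime import datetime

def sanity_check(rows):
    """Flag patterns that suggest the extract, not the lab, is the problem."""
    problems = []
    # A single provider on every order may mean a mis-mapped field.
    providers = {r["provider"] for r in rows}
    if len(providers) == 1 and len(rows) > 100:
        problems.append("one provider appears on every order")
    # Negative or many-hour turnaround times often point to timestamp errors.
    for r in rows:
        tat_min = (r["reported"] - r["received"]).total_seconds() / 60
        if tat_min < 0 or tat_min > 360:
            problems.append("implausible TAT: %.0f min" % tat_min)
    return problems

# Hypothetical extract with one suspect row (an eight-hour turnaround).
rows = [
    {"provider": "Dr. A", "received": datetime(2018, 3, 5, 9, 0),
     "reported": datetime(2018, 3, 5, 9, 45)},
    {"provider": "Dr. B", "received": datetime(2018, 3, 5, 10, 0),
     "reported": datetime(2018, 3, 5, 18, 0)},
]
issues = sanity_check(rows)
```

Flags like these are prompts to go talk to people, as McCudden advises, not verdicts on their own.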
Armed with good data, there are many business problems laboratories can take on. For example, data on how often a test is performed can be used to decide how often to run quality control, whether a test should be batched, whether it should be sent to a reference lab or performed in-house, and whether a new instrument is warranted. Data on test volume helps with scheduling staff and couriers, too. Turnaround time data can be used to monitor whether a laboratory is meeting clinical goals, to discover bottlenecks, or to see if changes have been effective.
The key, Herman emphasized, is to choose a project that is actionable and to talk to your colleagues to make sure the data makes sense. “If you don’t understand where the data’s coming from, it’s really tough to draw the right conclusions from it,” he said, “and very easy to make mistakes.”
Julie Kirkwood is a freelance journalist who lives in Rochester, New York. Email: firstname.lastname@example.org.