It is no secret among clinical pathologists that our laboratories generate tremendous quantities of healthcare data. How to mine this wealth of information to improve efficiency, reduce costs, and enhance patient care, however, is not always so clear.
A scientific session on Wednesday, December 16 at 11:00 am Central, titled “How Clinical Laboratory Data is Impacting the Future of Healthcare,” will shine a light on the path to changing this narrative, empowering laboratorians to be key participants in shaping the use of healthcare data.
“Ultimately, the goal of this session is to help lab professionals understand what artificial intelligence is, describe the types of problems data analytics can solve, and provide a framework for evaluating and developing new algorithms,” says Christopher McCudden, PhD, an associate professor of pathology and laboratory medicine at the University of Ottawa.
At first glance, applying data analytics to tackle even simple laboratory problems seems complex. However, for most laboratorians the data and tools are readily available and often easily implementable. According to McCudden, “laboratorians should start by clearly defining a problem. From there, they should assess the available data (volume, type, and quantity), evaluate available tools to solve the problem, and then focus on developing a solution, be it simple data visualization or an AI algorithm.”
By developing an understanding of the tools as they are needed—and seeking help from those who have experience solving similar problems—the use of large datasets becomes far less daunting.
In fact, the use of large datasets, including laboratory data, is becoming increasingly common in healthcare settings. For example, many hospitals use sepsis prediction alerts. The algorithms driving these alerts run in the background of the electronic health record, warning providers when a patient may become septic.
In labs, the use of moving averages has also become common. With this analytic method, a rolling mean of patient results is monitored continuously to detect shifts or drift in instrument performance. Such an approach may reduce the use of liquid quality control material while monitoring assay performance in real time, decreasing costs and potentially limiting the number of erroneously reported patient results.
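The moving-average idea can be sketched in a few lines of code. In this illustrative example, the window size, target value, and alert limit are arbitrary assumptions; in practice, each laboratory would derive them from its own historical patient population and the analyte in question.

```python
from collections import deque

def moving_average_monitor(results, window=20, target=140.0, limit=2.0):
    """Flag possible instrument drift when the moving average of
    sequential patient results strays from an expected target.

    Parameters here are illustrative, not clinical guidance:
    `target` and `limit` would come from a lab's own data.
    Returns a list of (index, moving_average) tuples for each
    result at which the alert condition was met.
    """
    recent = deque(maxlen=window)  # holds only the last `window` results
    alerts = []
    for i, value in enumerate(results):
        recent.append(value)
        if len(recent) == window:  # wait until the window is full
            avg = sum(recent) / window
            if abs(avg - target) > limit:
                alerts.append((i, round(avg, 2)))
    return alerts
```

For example, feeding in a stable run of results near the target produces no alerts, while a sustained shift in patient results gradually pulls the moving average past the limit and triggers one, mimicking how such monitoring can catch drift between liquid QC events.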
However, labs should also think beyond the relatively narrow scope of using lab data to monitor instrument performance. Ideally, laboratorians should serve on interdisciplinary teams with other clinical colleagues to build algorithms that offer insight across other domains of patient care.
For example, large clinical datasets can be used to back up opinions, provide a foundation for a project, and demonstrate whether an intervention was effective. Data analytics is an ever-growing part of our instrumentation and increasingly will be used on the front lines of patient care to help make decisions.
The plea from McCudden mirrors that of Michael Laposata, PhD, from his plenary on Sunday evening, as both urge that laboratorians serve on interdisciplinary teams to improve the delivery of quality healthcare.
According to McCudden, it is important for laboratorians to “…demonstrate value, be open and collaborative, and be proactive by reaching out to people doing the work. The data analytics train has left or is in the process of leaving the station. It will leave without us unless we can get on board and bring value to this area and benefit greatly from the process and the results.”
The famous poem The Rime of the Ancient Mariner laments, "Water, water, every where, nor any drop to drink," describing a sailor adrift on an ocean he cannot use. Laboratorians, however, do not have to suffer the same fate in the sea of data. With a framework for tackling large datasets and a little practical guidance from others, we can rest assured that solutions to these problems, and improved patient care, are within our grasp.