
In this second of a two-part examination of total lab automation (TLA), I explore what TLA can and can’t do, as well as prerequisites for and considerations in installing TLA systems.

Expectations are very high when it comes to TLA, but I offer a few words of caution based on our experience here at Inselspital–Bern University Hospital in Bern, Switzerland. When relocating the routine laboratory, we decided to simultaneously renew the core lab instrumentation and install a TLA solution. We did so to reduce unnecessary manual work and to achieve a high level of standardization and documentation for both the clinical lab and institutional biobanking by merging and automating their workflows. The main lesson from these efforts is that if introducing a TLA system is treated as a trigger and lever to streamline lab processes and IT structures, eliminate unnecessary variability, introduce binding standards, and reconsider all the deviations from standard processes that have accumulated over decades, then it will be a success story, albeit one that would have been possible without installing even a single TLA module.

Conversely, TLA deployed solely to render existing, complicated processes faster and less laborious will fail. TLA is not necessarily faster or more flexible than a committed worker. Moreover, TLA definitely isn’t a must for every laboratory: the more diverse a lab’s samples and requests, the less efficiency automation brings. The mantra, therefore, must be simplify, then automate. And, as everywhere, the Pareto principle applies: Labs that connect the 20% of their instrumentation that processes 80% of their samples already have an efficient solution. The difficulty lies in deciding which processes should reasonably be automated.
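To make that Pareto reasoning concrete, here is a minimal sketch that ranks instruments by annual sample volume and reports the smallest set covering 80% of samples. The instrument names and counts are invented for illustration and are not drawn from any real lab or TLA product.

```python
# Hypothetical Pareto check: which instruments handle 80% of sample volume?
# Instrument names and annual sample counts are invented for illustration.
annual_samples = {
    "chemistry_1": 410_000,
    "chemistry_2": 395_000,
    "immunoassay_1": 180_000,
    "hematology_1": 150_000,
    "coagulation_1": 60_000,
    "urinalysis_1": 35_000,
    "osmometry": 4_000,
    "blood_gas_poct": 2_500,
}

total = sum(annual_samples.values())
cumulative, to_connect = 0, []
for name, count in sorted(annual_samples.items(), key=lambda kv: kv[1], reverse=True):
    to_connect.append(name)
    cumulative += count
    if cumulative / total >= 0.80:
        break

print(f"Connect {len(to_connect)} of {len(annual_samples)} instruments "
      f"to cover {cumulative / total:.0%} of samples: {to_connect}")
```

In this invented example, connecting only the four highest-volume instruments already covers more than 90% of samples, which is the kind of result that argues against automating every last analyzer.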

Prerequisites

Introducing TLA in a lean, highly standardized lab environment is much easier than mirroring complex procedures based on manual workflows, making upfront standardization an indispensable prerequisite. A good starting point is a lab’s table of all analyses. Labs should scrutinize every test and consider which are obsolete, whether an analysis truly requires a specific tube type or a more common one will do, whether separate aliquots are needed, and similar questions. Working through the table will reveal areas for improvement to tackle early on.
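If the test catalog can be exported to a spreadsheet, even a short script can flag candidates for this review. The sketch below is purely illustrative; the file name, column names, and thresholds are assumptions rather than a prescribed format.

```python
import csv

# Hypothetical review of an exported test catalog (column names are assumptions).
# Flags rarely ordered tests and tests that demand a dedicated tube or extra
# aliquot, so they can be discussed before automation fixes them in place.
with open("test_catalog.csv", newline="") as f:
    for row in csv.DictReader(f):
        flags = []
        if int(row["orders_per_year"]) < 50:
            flags.append("rarely ordered: obsolete or send-out candidate?")
        if row["tube_type"] not in ("serum", "li-heparin", "edta", "citrate"):
            flags.append(f"non-standard tube: {row['tube_type']}")
        if row["separate_aliquot"].lower() == "yes":
            flags.append("needs separate aliquot: still justified?")
        if flags:
            print(row["test_name"], "->", "; ".join(flags))
```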

Following this process, I recommend that labs critically review their system’s master data, keeping a sharp eye out for dependencies that may have been introduced in the past to ease manual processing but that now hamper straightforward automated workflows.

If possible, labs also should reduce the variety of sample types, as having many different tube formats renders automation technically challenging. This effort could be combined nicely with a patient blood management initiative and lead to overall smaller sample tubes. If a lab defines (and orders) the sample tubes to be used, this process can be fast and simple. However, making this change could be more challenging if a lab depends on external submitters.

Another major TLA change concerns lab employees. TLA is a highly technical process that needs staff to manage robotics and embedded IT, and it is not yet as user-friendly as fully automated analytical instruments. At the same time, if a TLA system does not work properly for any reason, staff must at least be able to run its emergency processes manually, depending on the service-level agreement with the TLA vendor. This requires considerable flexibility in staff knowledge and planning.

Cost is of course a key factor in planning to buy and install a TLA system. TLA does not reduce staffing costs, but it delivers standardization and documentation at a very high level.

A Question of Space

TLA needs considerable space. A large space without walls and columns offers the easiest setup, but other possibilities exist. Narrow wall openings enable sample transport to other rooms, and sample escalators let samples climb or descend to other floors.

Different types of conveyors allow flexibility in rearranging the line when new modules or analyzers are added. Notably, conveyors are convenient for sample transport, but they do not necessarily leave easy access to maintenance openings and the like. Accessibility should therefore be a priority, and it may be more efficient to let samples travel a few meters farther if that makes daily maintenance easier. It might even be reasonable to split automation lines if one sample type is processed only on one specific analyzer, or if the analytical solution consists of connected instruments too large to fit in the dedicated area. A secondary benefit of conveyor systems is that cables, tubing, IT connections, and the like can be routed through the base of the conveyor to the respective modules and analyzers, keeping them easy to access and maintain. If a laboratory has remote premises, optimal sample retrieval points can be designated.

Middleware is another important aspect of TLA. Some TLA applications enable direct links between the laboratory information management system and the routing engine without having to interpose middleware. This frees labs from depending on the routing engine provider to attach instruments from another vendor, resulting in much more flexible and independent solutions.

Labs handling samples at a given biosafety level will need to think through the contamination and decontamination issues that affect planning of the TLA.

Although TLA systems are set up to run smoothly and without manual errors, labs must have procedures in place for breakdowns and other emergencies. This means analyzers should be loadable manually, and sufficient centrifugation capacity should be easily accessible.

Side Benefits

In providing high process quality and extensive documentation, TLA also offers several side benefits. Having time stamps, and in some cases photos of samples, enables labs to track and document preanalytical errors in a way that fosters discussions with sample submitters, for example when barcodes are not attached properly or samples are underfilled or contaminated. For laboratory customers, TLA’s automated sample storage significantly improves how labs handle add-on requests on already processed and stored samples. Last but not least, TLA is a big leap toward healthcare-integrated biobanking and enables on-the-fly generation of aliquots for cryostorage in both research and routine use.

Alexander B. Leichtle, MD, is associate professor at the University Institute of Clinical Chemistry and Directorate of Teaching and Research at Inselspital–Bern University Hospital in Bern, Switzerland. +Email: [email protected]