May 2008: Volume 34, Number 5
Stuck in the Middleware
Labs Look for Answers as New Consortium Aims to Untangle the Mess
By Deborah Levenson
Mention “middleware” in a room full of lab directors, and you are likely to hear varied definitions of what it is and a wide array of uses for it. But generally speaking, the term has come to describe any hardware and software that acts as an intermediary between instruments and lab information systems (LIS) and allows them to exchange data or perform data management tasks the LIS can’t. Middleware’s popularity took off in the early 1990s when personal computers that generated reports and buffered results became more common. Since then, middleware has become mainstream in POCT and a common, relatively inexpensive solution for core labs grappling with a wide range of problems including shortages of med techs, increases in test volume, more complex devices and lab operations, a federal push for electronic health records, and an overall trend toward ensuring quality of care.
But all too often, making health data flow from instruments to the LIS and, ultimately, the hospital information system has meant making the lab home to a hodgepodge of hardware and software, with an attendant and frequently unruly mix of vendors, contracts, and training needs.
Now a group of companies intends to help ease the flow of health information through the core lab and beyond. Building on the success of the Connectivity Industry Consortium (CIC), which produced the POCT1 standard for connectivity of middleware used with POC devices, the new consortium aims to issue similar standards specific to the core lab (See Box, below). Formerly known as CIC-2, the new group has changed its name to the IVD Industry Connectivity Consortium (IICC) and is now in the process of choosing officers, according to Mojgan Lefebvre, Chief Information Officer for bioMérieux and nominee for President of the group. Eric Olson, Siemens’ Vice President for Informatics and eBusiness, has been nominated as Chief Technology Officer.
The group plans to draw on several existing standards for healthcare IT with the goal of “creating a digitally connected world,” Lefebvre noted. “But this initiative goes beyond software interfaces and technical protocols. It’s about ensuring efficient transmission, parsing, and validation of clinical data. Getting the right information into the right person’s hands, at the right place and the right time is key.”
The IICC standards will ultimately save time, a commodity labs hold dear, according to IICC insiders. They emphasized that better connectivity within the core lab will also reduce the number of workstations labs need, the time spent on training, and the myriad other tasks crucial to making middleware programs work.
Connectivity: Success in POCT
Two major data management systems for POCT now comply with the CLSI’s connectivity standard, easing the transfer of data in institutions with labs that use these systems from Telcor (Lincoln, Neb.) and Medical Automation Systems (Charlottesville, Va.). Connectivity enabled by the standard and compliant systems not only improves patient care, but also eases billing and compliance documentation.
With the forthcoming IICC standard, core labs may one day enjoy the sort of connectivity POC users and lab staff at Carolinas Medical Center-NorthEast in Concord, N.C. now have. With the aid of her RALS+ system from Medical Automation Systems and its ability to connect to Roche Accu-Chek Inform glucose meters, Laboratory Education and POC Coordinator Deb Norkett, MT(ASCP), has seen marked improvements in compliance and billing. She interfaces all manual tests through the Accu-Chek, which also allows her to enter pregnancy test results. After taking advantage of that function, she now achieves a 100% billing rate, while in the past she grappled with a 15% deficit in billing because of lack of documentation in the emergency room. “Before using RALS, we couldn’t capture figures on the volume of certain manual tests, so they were built into the room charge,” she recalled.
Middleware that connects easily to RALS+ has helped Norkett comply with a CAP standard that requires her to track lab tests from collection to result, creating the need to identify who reports a particular test. Using the Accu-Chek, she can always capture that information, compared with a rate of 65% before she had the glucometer and data management system combination.
The new consortium—organized with AACC’s help—currently includes several IVD and instrument manufacturers among its members, according to Emery Stephans, CEO of Enterprise Analysis Corporation (EAC) in Stamford, Conn. EAC is a consortium member that advises diagnostic and pharmaceutical company clients on information technology issues. Stephans hopes to recruit more companies into IICC. Within the next 12–15 months, the consortium will establish a set of connectivity standards that deal with middleware’s troublesome, basic problems. These include the lack of bidirectional connectivity—the ability to move data from instruments to the LIS and HIS and back—as well as a dearth of interoperability with commercial software and a lack of standardized device connections.
The consortium will not be starting from scratch, Lefebvre emphasized. It will draw on several existing standards, including the POCT1 standard, which is now available from CLSI and serves as a model for the process. The consortium will also incorporate aspects of the Health Level 7 (HL7) standard, which was created to improve connectivity and workflow across the entire hospital.
The forthcoming IICC standard, in contrast, will tackle information technology issues specific to the core lab. “HL7 is a standard language,” explained Stephans. “HL7 does not give companies who make instrumentation and LISs guidance about how to define those messages.” The IICC standard will help companies agree upon the manner in which instruments, middleware, and the LIS will exchange information, resulting in more efficient flow of data through the lab. That exchange depends on details such as which fields will carry crucial information like date, time, and patient ID.
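The field-level agreement Stephans describes can be illustrated with a small sketch. The segment names and field positions below follow HL7 v2 conventions (PID-3 for patient ID, OBR-7 for observation date/time, OBX-5 and OBX-6 for value and units), but the message content—patient, test code, and result—is hypothetical.

```python
# Minimal sketch: pulling the crucial fields (patient ID, date/time, result)
# out of a simplified HL7 v2-style ORU result message. Segments are separated
# by carriage returns and fields by pipes, per HL7 v2 convention.

def parse_hl7_result(message: str) -> dict:
    """Extract patient ID, timestamp, and result fields from an ORU message."""
    result = {}
    for segment in message.strip().split("\r"):
        fields = segment.split("|")
        if fields[0] == "PID":
            result["patient_id"] = fields[3]   # PID-3: patient identifier
        elif fields[0] == "OBR":
            result["observed_at"] = fields[7]  # OBR-7: observation date/time
        elif fields[0] == "OBX":
            result["test_code"] = fields[3]    # OBX-3: observation identifier
            result["value"] = fields[5]        # OBX-5: observation value
            result["units"] = fields[6]        # OBX-6: units
    return result

# Hypothetical glucose result message for illustration
msg = ("MSH|^~\\&|ANALYZER|LAB|LIS|HOSP|200805011200||ORU^R01|0001|P|2.3\r"
       "PID|1||123456||DOE^JANE\r"
       "OBR|1|||GLU^Glucose|||200805011155\r"
       "OBX|1|NM|GLU^Glucose||105|mg/dL|70-110|N|||F")
```

When vendors disagree on which field carries which datum, every instrument-to-LIS link needs a custom mapping like this one; a shared standard would make the mapping uniform.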
The Rise of Lab Middleware
Problems with connectivity and middleware are largely products of their history, as well as the factors that made middleware popular in the first place, Stephans recounted. Back in the 1990s, LIS companies commonly had close business relationships with diagnostic instrument companies. Abbott, for example, sold instruments, reagents, and software. As labs’ data needs became more complex, so did LIS systems. But their price and the associated training and instrument connection costs were prohibitive for many because LIS companies often charged as much as $5,000 to connect an instrument. As a result, labs turned to middleware, and LIS companies found they could make more money by focusing on technology for other parts of hospitals, such as image archiving systems for radiology departments, Stephans recalled.
“Meanwhile, hospital lab expectations for transmission of data increased because physicians wanted their data quickly and conveniently. And hospitals, to improve quality, created protocols that led to additional testing, especially for oddball results. So they created rules that said if A happens, do B,” Stephans said. LISs couldn’t handle all the rules inherent in these new lab processes, so labs turned to middleware.
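The “if A happens, do B” protocols Stephans describes map naturally onto a rules table of the kind middleware evaluates. Below is a minimal sketch; the test names, thresholds, and follow-up orders are hypothetical examples, not clinical recommendations.

```python
# A minimal reflex-testing rules sketch: each rule pairs a test and a
# condition on its value with a follow-up test to order automatically.
# All names and cutoffs here are illustrative, not clinical guidance.

REFLEX_RULES = [
    # (test, condition on numeric value, follow-up test to order)
    ("TSH", lambda v: v > 4.5 or v < 0.4, "FREE_T4"),
    ("GLUCOSE", lambda v: v > 400, "REPEAT_GLUCOSE"),
]

def reflex_orders(test: str, value: float) -> list:
    """Return any follow-up tests triggered by this result."""
    return [follow_up for t, cond, follow_up in REFLEX_RULES
            if t == test and cond(value)]
```

An LIS without a rules engine would need every such decision hand-coded into its core; middleware keeps the table editable by the lab.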
Middleware’s installation was also far easier to accomplish than an LIS upgrade. Rodney Markin, PhD, MD, Professor of Pathology and Microbiology at University of Nebraska Medical Center in Omaha, likened the degree of difficulty in setting up middleware and changing the LIS to “the difference between turning a car and turning the Queen Mary,” referring to the giant ocean liner. “It’s harder to change an LIS with 1,000 clients than a smaller piece of software. So technology used at the bench level progresses much faster than the LIS does, compounding connectivity problems.”
Problems and Promise
One underlying problem with middleware is that companies selling it are not organized as an industry, according to Stephans. They are “odds and ends in a patchwork,” much like the way their products function in some labs. The companies aren’t entirely at fault for this situation. “Hospital and laboratory needs are ill-defined because every lab, hospital, and health system has its own notion of what middleware should be,” Stephans explained. So there was never an incentive for the companies to join forces to make sure their products could efficiently transfer data to one another.
Another factor in problems with middleware, however, lies within the lab and strategic planning by lab staff. Without forethought and planning, middleware can become “a band-aid solution” that “results in ancillary systems, which create more work,” noted Christopher Fetters, Global Manager of Upstream Marketing for Abbott Point of Care and an editor of a CLSI document that translates the POCT connectivity standard for healthcare workers. “With each new piece of software, you create a need for additional purchasing, installing, training, support, maintenance, and troubleshooting,” he pointed out.
Helping with the Med Tech Shortage
But with proper planning, middleware is a major boon to labs facing a dearth of lab techs as quality protocols become more complex and physicians need more convenient access to patient data. “Middleware is so important now that there’s a medical technologist shortage,” according to James H. Nichols, PhD, Medical Director, Clinical Chemistry for Baystate Health in Springfield, Mass., and Associate Professor of Pathology, Tufts University School of Medicine. He helped write the document that explained the POCT1 standard to healthcare workers and sits on Abbott’s Medical Advisory Committee and on speakers’ bureaus for both Roche and Abbott.
Middleware can perform mundane tasks and save med techs’ time for work requiring their expertise and training, Nichols explained. “More and more routine work is going to middleware. For example, it’s often programmed to automate the manual verification of test results to the LIS. Having it automated saves the med tech for higher level review. He or she can focus on flagged samples that require expertise and training,” Nichols explained.
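The autoverification Nichols describes can be sketched as a simple gate: results inside a validation range and within delta-check limits auto-release to the LIS, and everything else is held for a technologist. The ranges and limits below are hypothetical, for illustration only.

```python
# Autoverification sketch: decide whether a result auto-releases to the LIS
# or is flagged for technologist review. Limits here are illustrative.

from typing import Optional

def autoverify(value: float, low: float, high: float,
               previous: Optional[float] = None,
               max_delta: float = 50.0) -> str:
    """Return 'release' to send the result to the LIS, or 'hold' for review."""
    if not (low <= value <= high):
        return "hold"    # outside the validation range
    if previous is not None and abs(value - previous) > max_delta:
        return "hold"    # fails delta check against the patient's prior result
    return "release"
```

Under rules like these, only the held results reach a technologist, which is how middleware frees staff for the flagged samples that need their expertise.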
One of the directors of a core lab that runs 5 million tests per year noted that his lab’s middleware and complete automation yields big savings on personnel costs. “I’d probably need an additional 15 to 16 lab techs without it,” noted Kenneth Blick, PhD, Director of Chemistry, POCT, and Automation at University of Oklahoma Health Sciences Center, in Oklahoma City (See Box, below). Blick, who is also Professor in the Department of Pathology, predicted that middleware’s role in the lab will continue to grow as it increasingly takes on specimen management, QC, and autovalidation roles that were once the domain of the LIS.
A federal push for adoption of electronic health records (EHRs) is also driving more use of middleware in the core lab, according to the lab directors and Fetters, who noted that the federal government really seeks better flow of data. “Hospitals must comply with HIPAA, which most people hear about in discussions about security of health information. But the law’s purpose was to ensure that your EHR could be portable and that different physicians and hospitals could see all of your information in one place and add to it,” Fetters said. “The problem is that health information has been stored in different places. HIPAA aims to encapsulate that information and make it transportable.”
Linking Middleware and Total Automation
Unlike many hospital lab directors and their staffs, the University of Oklahoma Health Science Center’s Kenneth Blick, PhD, Director of Chemistry, POCT and Automation, is very happy with his middleware and the support he receives for it.
An advocate of careful, strategic planning for use of both middleware and total automation, he emphasized that it isn’t cheap. “You get what you pay for. If it’s inexpensive and not well-supported, you must keep it running and support it yourself. You’ll have problems if you just go out and buy something.”
Blick avoided this scenario by making middleware part of what he terms a partnership in total lab automation with his vendor. “We chose a vendor with software and experience with networking middleware software to instruments. We wanted support for QC and autoverification and autorelease programs that didn’t run well on our particular LIS.” He deals with one company for his instruments, middleware, automated specimen tracking system, and all software for chemistry and hematology. “The middleware in the core lab is well-supported because it’s part of the overall plan.”
Many lab directors, especially those in smaller institutions, might wonder how they can get the money for all of this. Blick offered some advice: Emphasize just how much physicians need you. “I told my institution that my lab is actually for the physicians, and that they want their information in real time. If a patient is sitting in the ER because his or her specimen is sitting on a rack, that patient is not getting quality of care.”
The Value of Planning
On a practical level, IICC’s purpose is to provide the interoperability that simplifies organization of middleware and planning for future purchases, “so it doesn’t take on a life of its own,” explained Jay Jones, PhD, who is among the AACC leaders working on the IICC. He is Director of Chemistry and Regional Laboratories and the Division of Laboratory Medicine at Geisinger Health System in Danville, Pa., and a recognized leader in efficient use of information technology. But labs need to do their part with strategic planning regarding where particular middleware applications belong as IICC works on laying the groundwork for more “plug and play” instrumentation, he emphasized. “You need to figure out how to consolidate and provide interoperability so that in the future, you avoid having silos that can’t communicate with one another.”
While thinking about future middleware and interoperability needs, Jones recommended thinking about the flow of information through applications as going into various “buckets” in two main function areas: process and clinical practice (See Box, below). On the process side, he suggests grouping into one bucket the applications that filter rules, such as autoverification, quality control, and specimen management software. Similarly, he recommends putting into a second bucket the software that controls instruments, including middleware that manages reagents, sets test parameters, and optimizes workload. Technical support middleware—such as applications that can help solve technical problems remotely, issue alerts about such problems, or perform peer calibration—should form a third bucket of functions.
On the clinical practice side, labs should organize middleware in separate buckets for decision support systems and data mining. Decision support systems include applications that help physicians interpret and order lab tests. For example, Jones says, the lab might use the software to suggest a second test when initial assays yield abnormal results. Data mining applications are especially useful for hospitals that are moving toward EHRs and for practicing evidence-based medicine, he noted. These might include programs that can pick out patients’ PSA scores and track how rapidly they rise in certain groups of patients.
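A data-mining task in the spirit of Jones’s PSA example is computing a patient’s PSA velocity (ng/mL per year) from dated results. The sketch below fits a least-squares slope; the dates and values used are illustrative.

```python
# Data-mining sketch: PSA velocity as the least-squares slope of PSA (ng/mL)
# versus time (years). Input is a chronological list of (date, value) pairs.

from datetime import date

def psa_velocity(results):
    """Return the slope of PSA vs. time in ng/mL per year."""
    t0 = results[0][0]
    xs = [(d - t0).days / 365.25 for d, _ in results]  # years since first draw
    ys = [v for _, v in results]
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den
```

A program like this, run across a patient population pulled from the EHR, is what lets a lab flag the groups whose scores are rising fastest.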
An Organized Approach to Middleware
Jay Jones, PhD, a member of the new IVD Industry Connectivity Consortium and Director of Chemistry and Regional Laboratories and the Division of Laboratory Medicine at Geisinger Health System in Danville, Pa., recommends that lab directors think strategically about middleware. He suggests clustering applications related to process functions into “buckets” for filtering rules, controlling instruments, and technical support. Likewise, practice-related applications for real-time decision support systems and data mining should be placed in their own “buckets.”
The View from the Bench
The core lab at the University of Iowa is a microcosm that illustrates both the promise of middleware and its problems. Denny Drees, MT(ASCP), Clinical Laboratory Scientist in the Automated Chemistry Section of the core lab, works in an environment where middleware and the LIS make the vast majority of decisions about what to do with samples, leaving med techs to deal with just the 3% of results that truly require their attention.
Drees appreciates the power of artificial intelligence afforded him by the Data Innovations Instrument Manager product he uses. “Before middleware, we needed the IT department to write the rules that directed specific functions,” he said. “With middleware, lab staff has control over that process.” Noting that rule writing by med techs is not yet common, he added that responsibility generally falls to someone in the lab.
To ease the process of establishing rules in the hematology lab, staff are now implementing a new Sysmex middleware program called Molis WAM, which includes a template of about 100 common rules that can be activated on demand.
Companies aren’t entirely to blame for inadequate rule-writing capability and support for the task, Drees said. “Companies are concerned about liability. Instruments and labs can be very different, so there isn’t one set of universal rules they can use.” Nichols added that more company involvement in rule writing and increased support of the process “could be construed as medical decision making, bringing the process under regulatory purview.”
Concerns about liability, and companies’ reluctance to become involved in lab rule-writing aren’t likely to diminish any time soon. But at the University of Iowa lab bench, Drees and his colleagues look forward to the forthcoming IICC standard and the prospect of vendors agreeing on the manner in which data can be exchanged. “If Data Innovations and Sysmex were standardized, we could have ‘plug and play.’ And training would be a lot easier for users.”
Until the IICC comes up with some solutions to the current middleware problems, lab directors and staff at different institutions will have to cope by sharing their successes and failures.