Clinical Laboratory News


January 2010: Volume 36, Number 1


Lab Medicine Outlook
What Will the Coming Decade Bring?
By Genna Rollins

The first decade of the 21st century has, by any measure, been transformative for lab medicine. Pharmacogenetics (PGx), point-of-care testing (POCT), lab automation, and Lean went beyond buzzwords to become mainstream realities. Evidence mounted for certain biomarkers, pushing them into the limelight, while others fell out of favor. Still others that showed great promise have yet to make their mark on clinical practice. The field of lab medicine also sharpened its focus on standardizing and improving the analytical performance of assays, and on the methods used to report lab-related research.

As with any period of great change, the decade was not without its controversies. How tightly should glucose be controlled in critically ill patients? What’s the appropriate screening cutoff for prostate-specific antigen (PSA)? Should CYP2C9 and VKORC1 genotyping be performed on all patients receiving warfarin therapy? And the list goes on.

Looking out over the next 10 years, how will lab medicine change? Will biomarkers for Alzheimer’s disease become routine lab tests? Will POCT move even more tests out of the central lab? Will the vitamin D testing bubble burst? These and other questions may be resolved in the coming decade, but then again, maybe not. As the Nobel laureate physicist Niels Bohr observed, “Prediction is very difficult, especially about the future.”

As the second decade of the 21st century begins, CLN asked lab medicine experts to reflect on the years ahead.

Genes and More

Arguably the seminal scientific event of the decade was the mapping of the human genome, completed two years ahead of schedule in 2003 to coincide with the 50th anniversary of James Watson and Francis Crick’s first report on the structure of DNA. That milestone laid the groundwork for the expansion of molecular diagnostics in ways that might not have been predicted 10 years ago, according to Wayne Grody, MD, PhD, professor in the departments of Pathology and Laboratory Medicine, Pediatrics, and Human Genetics at the UCLA School of Medicine. “To have the entire human genome at your fingertips so there's no mystery about the sequence of any gene has helped a lot in the maturation of molecular testing,” he observed.

And yet, the genomics field in many ways is full of unrealized potential. In the mid-1990s, Grody's lab was involved in the effort to offer cystic fibrosis carrier screening nationally. “We showed that it was possible to implement a molecular test on a nationwide scale, and that we had the ability to convey the information to huge numbers of patients via genetic counselors, videos, and brochures,” he explained. To Grody's disappointment, other molecular screening tests have not taken off, due to a variety of ethical, cost, logistical, and technical challenges.

At the same time, an emerging theme of the decade has been direct-to-consumer (DTC) genetic testing, a development Grody views warily. “For some reason DNA is considered a different type of analyte than, say, sodium, and it's seen as a subject of fun and recreation. I don't know where that idea came from because it's actually the most complex of all analytes,” he contended. “I'd like to see us take a step back from DNA as fun and put it back into the clinical lab as a very serious and complex analyte.”

A more positive development has been the advance of molecular diagnostic techniques, including high-throughput microarrays, next-generation sequencing, and genome-wide association studies (GWAS). “Until this decade I couldn't conceive of a clinical application for these things—it seemed like such overkill,” he explained. “Now, I'm amazed that there actually is a use for this very sophisticated technology.” For instance, Grody's genetics clinic orders several array comparative genomic hybridization studies per week on children with developmental delays, autism, and non-specific malformations.

Looking into the future, Grody sees these advanced technologies moving out of tertiary care labs like UCLA into community hospitals. Changes in the regulatory climate may also be in the offing, particularly for DTC tests. Grody also speculates that the largely unrealized potential of GWAS actually may put the spotlight on proteomics and large networks of interacting genes.

 

CLN Survey
A Look Back, a Peek into the Future

As part of our look back at lab advances over the past decade and look forward to the next 10 years, CLN surveyed readers about key developments in the field. More than 600 individuals responded.

Hands down, the biomarker that participants identified as having both the greatest impact on volume and the most clinical significance was cTnI/cTnT, followed by HbA1c and BNP/NT-proBNP. Vitamin D, which rated highly in terms of impact on volume, did not fare as well when it came to clinical significance. Survey participants ranked Alzheimer’s disease and stroke markers as the tests with the most potential for the future.

Nearly half of respondents ranked lab automation as the most significant technological issue of the 2000s, followed by combined chemistry/immunoassay platforms and point-of-care testing. The need to cut costs and to recruit and retain staff topped the list of lab management concerns.

Looking forward, participants predicted standardization of lab tests, multimarker assays, and pharmacogenetics would be major trends of the coming decade.

The Search for Biomarkers

In the latter part of the decade, proteomic studies heralding associations between X biomarkers and Y conditions populated the scientific literature and made headlines in the popular press. Yet for the most part, these potential biomarkers did not clear the considerable hurdle of replication in larger or different populations, or in labs other than those first reporting the association. This has proven disappointing to many, but other experts, including James Ritchie, PhD, see it differently. “A lot of people have been thinking there's a failure in the pipeline, but to some extent they're forgetting history. It takes quite a while to get a biomarker from the phase of discovery to clinical usage, typically five to ten years,” he explained. “We're expecting more from proteomics now, and want it to produce tangible changes at the bedside sooner.” Ritchie is professor of pathology and laboratory medicine at Emory University in Atlanta.

While acknowledging these challenges, Ritchie remains bullish on the biomarker development process, particularly given the technological innovations of the past decade. Two crucial advances have been the introduction of stable tandem mass spectrometry and multiplex proteomic assays. “Mass spec instruments are computer-controlled and very stable now, such that the technology is on its way to becoming a common lab instrument,” he explained. “We’ve been using multiplex assays in general clinical chemistry for a while, but the idea that in proteomics you can have separate reactions all at one time in a single reaction vessel is kind of amazing.”

Ritchie believes that updated regulatory paradigms will be necessary to realize the full potential of proteomic biomarkers in particular. “With some of the new modalities like multiplex proteomic immunoassays, it has been kind of unclear how regulators will evaluate them and what standards they will hold them to if there haven't been any previous assays,” he explained. Clarification of these issues will encourage manufacturers to submit FDA applications, and ultimately speed up the biomarker pipeline, he said.

Ritchie expects a troubling intellectual property trend to accelerate in the 2010s. “Every gene or proteomic test that becomes available is immediately patented, and that inhibits researchers from taking it to the next level of identifying analytical problems and finding solutions to them. When you’re locked into the company that owns the assay being the only one who can modify or experiment with it, that slows rather than speeds the process,” he observed. On the flip side, researchers are becoming savvier about the advantages of including laboratorians early on in molecular studies, a concept promoted by the AACC Proteomics Division. “If that happens and the laboratorian tells you, ‘this protein is so labile you can't get it from the patient to the lab without it breaking down,’ then you shouldn't bother measuring it,” Ritchie explained. “In the end that will speed up the process.”

Personalized Medicine

PGx found its stride in the 2000s, most notably in the oncology realm. “We’ve moved from having just a few research applications to a point where pharmacogenetics is applied routinely in selected areas,” noted Michael Stocum, MS, managing director of Personalized Medicine Partners in Research Triangle Park, N.C. In the late 1990s FDA approved trastuzumab along with a companion diagnostic immunohistochemistry HER2/neu oncoprotein test to select patients for treatment with the drug. This was followed in 2000 by FDA approval of a HER2/neu oncoprotein test to monitor the effect of trastuzumab in women with metastatic breast cancer. Later approvals followed for fluorescence in situ hybridization (FISH) testing for gene amplification and for use of the drug in adjuvant therapy in HER2-overexpressing breast cancer. Tests associated with other drugs followed, including the UGT1A1 molecular assay to identify colorectal cancer patients at increased risk of adverse reactions to irinotecan, and KRAS mutation testing in patients undergoing anti-epidermal growth factor receptor (EGFR) monoclonal antibody therapy for colorectal cancer, among others.

Applications of PGx to drug-metabolizing enzymes have been more limited, with the most prominent example being CYP2C9 and VKORC1 genotyping to guide warfarin dosing. Yet genetic testing associated with warfarin remains controversial, since variation in the two genes accounts for only about 50% of the variability in response to the drug. Stocum attributed the medical community's lukewarm reception of warfarin genotyping to the drug having been on the market for decades. “If physicians are already treating and managing patients without pharmacogenetic information, convincing them of the value of this additional information is a difficult adoption hurdle to overcome. I suspect that if warfarin were a new drug, CYP2C9 and VKORC1 genotyping might have been adopted quickly.”

Looking into the next decade, Stocum sees an explosion of PGx applications in other diseases, similar to what this decade brought for cancer therapies. The 2010s also will be a time when regulators work out new paradigms for PGx-related clinical trial designs that answer crucial safety and efficacy questions while overcoming the field's unique challenges. A good example is KRAS status and anti-EGFR monoclonal antibody therapy in colorectal cancer. Now that wild-type KRAS has been strongly associated retrospectively with positive response to treatment, a prospective clinical trial to examine this association would be problematic, according to Stocum. “It would be difficult to put patients on this type of drug if they have KRAS mutations, and it may even be unethical. Most colorectal cancer patients and their oncologists realize the patients are unlikely to achieve efficacy from these drugs, and that's where FDA and product developers need to think about how to deal with this type of situation.”

As more PGx applications come on the market, labs will need to play an even more prominent role, Stocum contends. “Robust lab results have come to be taken for granted because the platforms are more commonplace and user-friendly than ever. However, the knowledge component of what the results mean will become more important because the average clinician can't keep track of it all. Labs will need to be proactive in reaching out to physicians and sharing their knowledge,” he said.

Lab Automation Gains Speed

Another major theme of the 2000s was lab automation. Fueled by an intractable medical technologist shortage, considerable price decreases, and improved modularity, the systems took off, according to Charles Hawker, PhD, MBA, FACB, scientific director of automation and special projects at ARUP Laboratories in Salt Lake City. “In ten years we went from having about 200 labs automated to more than one thousand. There's been a tremendous rate of progress and there'll be many more automation installations in the future. A driver for many organizations is a simple lack of labor, and automation is almost more of a defensive strategy,” he observed.

Hawker explained that vendors responded to these exigent circumstances by making their systems more modular. “Having options to choose from is very valuable because people have different needs and budgets. Hospitals are strapped for cash and purchasing these systems is not cheap, though prices have come down,” he said. Systems that cost in the neighborhood of $500,000 in 2003 can be had today for about $200,000, he indicated. This has enabled even smaller operations that process between 500 and 1,500 specimens per day to benefit from functionalities like preanalytical workstations that centrifuge, decap, and sort specimens into customizable analyzer-specific racks and receive them in the LIS. Higher-volume facilities may have total automation encompassing all preanalytical, analytical, and postanalytical functions, implemented over time as their budgets allowed.

In the coming decade Hawker expects to see several improvements over automation systems of today. For instance, several manufacturers are developing automated inspection systems that will use a variety of technologies to check specimens for interferences, sufficient volume, and the like. In addition, ARUP Laboratories is working on a system that will use optical character verification to read specimen labels and determine if any are mislabeled. Hawker also sees a day when conventional specimen conveyor systems, the technology for which dates to the 1970s, may be replaced with synchronous linear motor systems with magnetic pucks, a concept somewhat similar to that used with high-speed bullet trains. “The mechanical systems in use today require maintenance of sensors, belts, and motors, and with this new technology, all that goes away, and you have a maintenance-free system,” he explained. Given all these advances and favorable cost-benefit equations, Hawker predicts that up to 80% of labs will be automated during the 2010s.

POCT Sees Major Advances

The pace of change in lab medicine over the past decade was perhaps nowhere faster than in POCT. The field progressed from using POCT in circumstances where a stat result was essential—such as blood gases when a patient's metabolism was changing rapidly—to using the tests to facilitate efficient patient management. “Creatinine is not something that changes hourly in a patient or even by the minute, but using it in a POCT setting can move patients quicker through radiology, for example,” explained James Nichols, PhD, DABCC, FACB, professor of pathology at Tufts University School of Medicine and medical director of clinical chemistry at Baystate Health in Springfield, Mass.

In concert with new testing modalities, POCT quality control also has advanced by leaps and bounds. Lock-out features that enable only designated users to operate devices are standard fare today, and connectivity has improved markedly. In the realm of glucose meters, for example, “ten years ago, data management consisted of ‘sneakernet,’ because a lot of lab techs had to run around the hospital and physically connect the meters to a laptop to collect all the data,” Nichols recalled. But since then, the POC Connectivity Industry Consortium and POCT1 standards have moved the field forward significantly.

Looking ahead, Nichols sees only blue skies for POCT, with connectivity and quality control improving even further, a trend toward smaller, more portable devices, and further development of multiplex molecular testing at the point of care, particularly for infectious diseases. In addition, applications that use minimally invasive or non-invasive specimen collection will gain in prominence. Nichols also believes any tensions that have existed between POCT programs and labs will ease in the future. “I think we're much more realistic now that in certain situations POCT is a better way to go, and in other cases it's better and cheaper to send the sample through the main lab. We're better now at helping staff understand the balance of those two opposing forces,” he observed. In keeping with the expansion of POCT into new diagnostic arenas, there also will be more outcomes-based research related to POCT, Nichols predicted.

Analytics to the Fore

Analytical issues rose to the fore in the 2000s in concert with an increasing emphasis on evidence-based medicine in laboratory science and as a result of the European Union’s IVD directive, effective in 2003, which required that products marketed in E.U. countries be traceable to higher-order reference materials or methods. “The E.U. in large part has been driving the move towards traceability. We’re benefitting in the U.S. from that because it’s a global market and manufacturers want to sell their products as they are in any country,” explained Mary Kimberly, PhD, chief of the lipid reference laboratory in the clinical chemistry branch at CDC.

Standardization efforts that started decades earlier with analytes like cholesterol and its components continued through the 2000s, adding biomarkers such as HbA1c and serum creatinine. Attention shifted in the late 2000s to other analytes like vitamin D and cardiac troponin I (cTnI), efforts that are expected to wrap up in the coming years. Progress in the field is due to a team effort by organizations such as the International Federation of Clinical Chemistry and Laboratory Medicine (IFCC), the Joint Committee for Traceability in Laboratory Medicine (JCTLM), and various governmental entities such as CDC and the National Institute of Standards and Technology (NIST).

In the coming years, “a lot of the effort will continue as it has in recent years on manufacturers and in getting systems calibrated correctly before they’re even on the market. Therefore, the role of labs will be more on accuracy-based proficiency testing,” Kimberly predicted.

The Heart of the Matter

Another area rife with both new discoveries and challenges for labs was cardiovascular disease (CVD)-related testing. Consider, for example, that in the 1990s creatine kinase-MB testing was an essential component of the work-up for acute coronary syndrome, but by the late 2000s, some labs had retired the assay in favor of the much more specific cTnI or cTnT markers. Evidence also mounted that inflammation was a CVD risk factor, and as a result, the high-sensitivity C-reactive protein (hsCRP) test became quite prevalent. Yet use of these markers will continue to evolve as research yields an even greater understanding of atherosclerotic disease, predicted Allan Jaffe, MD, professor of medicine and director of core clinical laboratory services at the Mayo Clinic. “I suspect we'll find much better markers than hsCRP in the future,” he said. “I think the reason it works is because it's such a dirty marker. It integrates a large number of variables, many of which we need to learn more about to understand why inflammation is so important. My feeling is in the next ten years we'll understand inflammation better and develop more sensitive and specific assays.”

Crucial to the advancement of cardiac biomarkers will be laboratorians' role in closely scrutinizing proposed markers and in limiting their clinical use until the evidence for them is solid. Jaffe cites as examples ischemia-modified albumin and homocysteine, two highly touted markers that have not lived up to the hype.

Cholesterol, triglycerides, and lipoproteins also had top billing in the 2000s as markers of CVD risk, but as with hsCRP, cTnI, and cTnT, the evidence base is continuing to evolve. As a result, labs will need to stay abreast of developments, educate clinicians about them, and ensure that these tests are analytically sound. “To be able to use specific cut-points that come from study populations and transfer them to individual treatment decisions, we must have very accurate and precise tests. It's still up to labs to provide the most reliable results possible, especially if healthcare reform includes a strong preventive component,” explained Gary Myers, PhD, FACB, chief of the clinical chemistry branch at CDC.
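
As background on why analytical accuracy matters for these cut-points: in most routine lipid panels, LDL-C is not measured directly but calculated from measured total cholesterol, HDL-C, and triglycerides, so error in any of the three measurements flows into the reported value. The sketch below illustrates the widely used Friedewald estimate; it is offered only as context (the article does not discuss the calculation), the divisor of 5 applies to mg/dL units, and the estimate is generally considered unreliable at very high triglyceride concentrations.

```python
def friedewald_ldl_mgdl(total_chol: float, hdl: float, triglycerides: float) -> float:
    """Estimate LDL-C (mg/dL) as total cholesterol - HDL-C - TG/5.

    The TG/5 term approximates VLDL cholesterol; the estimate is not
    considered reliable when triglycerides are markedly elevated.
    """
    if triglycerides >= 400:
        raise ValueError("Friedewald estimate not valid for TG >= 400 mg/dL")
    return total_chol - hdl - triglycerides / 5.0


# Example: TC 200, HDL-C 50, TG 150 mg/dL -> estimated LDL-C of 120 mg/dL
print(friedewald_ldl_mgdl(200, 50, 150))
```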

Following the 2001 update of the National Cholesterol Education Program guidelines, much of the past decade focused on LDL-C and HDL-C as risk factors and intervention targets. However, toward the end of the decade more evidence emerged about apolipoprotein B (ApoB), suggesting that it may be a better indicator of CVD risk. NACB Laboratory Medicine Practice Guidelines on emerging CV markers issued earlier this year recommended against replacing LDL-C with ApoB and against routine ApoB screening. Still, “it will be interesting to see, based on the data available now, where ApoB falls in the paradigm of CVD risk assessment and whether it will be added or whether LDL-C will remain the primary target,” said Myers. “My feeling is that ApoB is unlikely to replace LDL-C because LDL-C is too entrenched in our prevention strategies.”

Even so, Myers believes that measurement of ApoB will be more commonplace in the 2010s, and he expects evidence to clarify which, if any, of the various methods for measuring lipoprotein subspecies, subfractions, particle concentrations or particle size are better. “The body of evidence is just growing and it will become clearer either that these subspecies do in fact add information, and allow us to better differentiate risk and outcomes, or add nothing more than what we already know. The key will be evidence that shows whether they help us treat patients better or improve outcomes.”

A Dose of Sunshine

Vitamin D measurement seemingly rose out of the mist to become a top-volume test for many labs by the end of the decade. Numerous animal studies had associated vitamin D supplementation with decreased risk of type 1 diabetes, breast cancer, multiple sclerosis, and other diseases, and there had been epidemiologic evidence in humans as well. But in late 2006 a study reported that patients with the highest concentrations of vitamin D had a significantly reduced risk of multiple sclerosis compared to those with the lowest concentrations (JAMA 2006;296:2832–8). The article caught the attention of both the scientific community and the lay press, and from there testing for 25(OH)D took off. “That was the start. It was the first study to have a large enough ‘N’ to be valid, and it's the one that is always referenced,” explained Sylvia Christakos, PhD, professor of biochemistry and molecular biology at UMDNJ-New Jersey Medical School in Newark.

As testing volumes rose, numerous analytical challenges surfaced, and efforts went into teasing out sources of variability and improving analytical methods for the analyte. In mid-2009, NIST made available two standard reference materials, providing both immunoassay and LC-MS users a means to calibrate their assays and thereby reduce the interlaboratory CVs for each method. Also in 2009, CDC developed its own LC-MS/MS method for measuring 25(OH)D and indicated that this method would be used in National Health and Nutrition Examination Surveys. NIST also planned to submit a candidate reference measurement procedure to the JCTLM. These initiatives are expected to improve the analytical performance of 25(OH)D assays and help advance the scientific and clinical utility of the test.
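
For readers less familiar with the metric these standardization efforts target, the interlaboratory CV is simply the standard deviation of the results different labs report for the same material, divided by their mean and expressed as a percentage. A minimal sketch follows; the 25(OH)D values and units are illustrative placeholders, not survey data.

```python
import statistics


def interlab_cv_percent(results):
    """Coefficient of variation (%) across labs: 100 * SD / mean."""
    return 100.0 * statistics.stdev(results) / statistics.mean(results)


# Hypothetical 25(OH)D results (ng/mL) reported by five labs for one shared sample
results_ng_ml = [28.0, 31.5, 26.8, 33.2, 29.4]
print(f"Interlaboratory CV: {interlab_cv_percent(results_ng_ml):.1f}%")
```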

Looking forward, Christakos expects the next decade to be very active and fruitful in terms of further clarifying analytical issues and in understanding the mechanisms of vitamin D action. Her lab is embarking on a GWAS of vitamin D. “Within ten years, labs like ours will know the critical genes, co-activators, and mechanisms of vitamin D: what turns it on and off, and how it prevents or reverses some of these diseases. It’s a very exciting time.”

The Great Glucose Debate

During the 2000s, diabetes was increasingly recognized as a worldwide public health problem that is only expected to worsen in coming years. Consequently, considerable resources went into research, patient education, and management of the disease. Labs were front and center in these efforts. There was a flurry of activity around the standardization and use of HbA1c assays, which, through the 1990s and early 2000s, had considerable variability and were used inconsistently. Thanks to initiatives by the National Glycohemoglobin Standardization Program (NGSP) and IFCC, as well as participation by assay manufacturers, “there seems to be huge movement to reporting HbA1c rather than other forms of glycated hemoglobin and in using NGSP-certified methods, with the assays markedly improved,” according to David Sacks, MB, ChB, FRCPath, associate professor of pathology at Harvard Medical School.

These efforts set the stage for an expert committee to propose in 2009 that HbA1c be used as a means of diagnosing diabetes, a recommendation that Sacks believes soon will be endorsed by professional associations. Also on the horizon are new criteria for defining gestational diabetes based on patient outcomes.

Within the last two years there was considerable controversy about a recommendation for U.S. labs to report both HbA1c and estimated average glucose, a proposal not adopted elsewhere. “That hasn’t caught on in other countries and I don’t expect to see a lot of change in that regard,” Sacks observed.
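
For context on what dual reporting entails, the estimated average glucose value is derived from a published linear regression relating mean glucose to HbA1c (the widely cited ADAG relationship, eAG in mg/dL ≈ 28.7 × HbA1c − 46.7). The sketch below simply applies that regression; the function name and the mg/dL-to-mmol/L conversion factor are illustrative details, not drawn from the article.

```python
def estimated_average_glucose_mgdl(hba1c_percent: float) -> float:
    """Estimated average glucose (mg/dL) from HbA1c (%), per the ADAG regression."""
    return 28.7 * hba1c_percent - 46.7


# A 7% HbA1c corresponds to an eAG of roughly 154 mg/dL (about 8.6 mmol/L)
for a1c in (6.0, 7.0, 8.0):
    eag_mgdl = estimated_average_glucose_mgdl(a1c)
    eag_mmol = eag_mgdl / 18.0  # approximate mg/dL -> mmol/L conversion for glucose
    print(f"HbA1c {a1c:.1f}% -> eAG {eag_mgdl:.0f} mg/dL ({eag_mmol:.1f} mmol/L)")
```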

Controversy also swirled around appropriate glucose targets in critically ill patients. Based on data published in 2001, tight glycemic control in ICU patients was widely implemented throughout the U.S. and numerous other countries. However, by decade’s end there was a growing body of evidence that these restrictive targets might be harmful. Experts attributed at least some of the conflicting evidence to testing issues. “It’s quite clear that the accuracy and precision of glucose meters are probably not adequate for the ICU setting,” observed Sacks. “However, it’s not clear where this is going. A lot of research will continue to be focused on this issue.” Research efforts also will be intensive in identifying and validating other diabetes-related markers, he predicted.

PSA in Flux

The diagnosis and management of prostate cancer was a hot topic throughout the 2000s, and once again, labs were central to the debate. Interim reports from two major U.S. and European studies issued in 2009 that many had hoped would clarify the efficacy of population-based PSA testing did anything but (N Engl J Med 2009;360: 1310–9 and 1320–8). “These really are interim reports, but the data from the European trial demonstrates that PSA-screening reduces prostate cancer mortality although the balance between this benefit versus the harms of over-detection and over-treatment is not clear,” explained Hans Lilja, MD, PhD, attending research clinical chemist in the departments of clinical laboratories, surgery, and medicine at Memorial Sloan-Kettering Cancer Center in New York City. “When there’s longer follow-up from these studies that can be reported, it will guide us somewhat more in the efficacy of this type of intervention.”

Considerable research in the past decade also was devoted to understanding the best clinically actionable PSA score as well as the best combination of PSA components for risk stratification and treatment decisions. Lilja expects that while PSA will remain the foundation of initial prostate cancer risk assessment, the coming years will bring clarification about panels of genetic and serum-based markers currently under intense investigation that will boost specificity in detecting clinically significant cancers. “PSA is the best starting point and it’s very unlikely to be challenged for a long time,” observed Lilja. “But we need something on top of that, and I think we’ll see within three to five years genetic variation, blood and possibly urinary markers being sorted out in an organized fashion, and that kind of modality will move into clinical testing.”

The Coming Decade

As laboratorians bid the 2000s farewell and greet the 2010s, they can be assured of two certainties: change is inevitable, and labs will continue to play a crucial role in medical advances and treatments.

Dr. Lilja holds patents for free PSA and human kallikrein-related peptidase 2 assays, and Dr. Sacks has received honoraria for speaking from Bio-Rad.