Next-generation technologies (NGT) deliver major improvements in cost efficiency, accuracy, robustness, and the amount of information they provide, Ronald F. Carter, PhD, DVM, writes in October’s CLN. Microarrays, high-throughput sequencing platforms, droplet digital PCR, and other technologies each offer a unique combination of desirable performance characteristics.

As stronger evidence of genetic testing’s clinical utility influences patterns of patient care, demand for NGT testing is increasing, observes Carter, a professor emeritus in the department of pathology and molecular medicine at McMaster University in Hamilton, Ontario, Canada.

This growth presents several challenges for clinical laboratories: greater urgency, clinical importance, and breadth of application in molecular oncology, as well as deeper integration of genetic tests into synoptic reporting. Laboratories need to add NGT-based protocols while continuing to offer established tests, and the pace of change is accelerating.

Instrument selection is a critical decision that has to align with intended test applications, sequencing chemistries, and analytical software. Although multiple platforms are available, a mainstream standard has not emerged. Depending on their goals, laboratories might set up NGTs for improved accuracy of mutation detection, massively higher sequencing capacity per test, massively more targets combined in one test (multiplexing), greater range in sequencing read length, much lower cost per base pair assessed, and economy of specimen volume.

In addition, batching, robotics, workload volume patterns, maintenance contracts, software licenses, and platform lifetime affect the cost per analyte and per specimen considerably. Royalties and reagent contracts also factor into the cost of operating NGT.
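The interplay of these cost factors can be illustrated with a toy calculation (all figures below are hypothetical, not drawn from the article): fixed per-run costs are shared across the specimens batched on a run, so under-filled batches raise the effective cost per specimen.

```python
# Toy model of NGT per-specimen cost. All dollar figures are
# hypothetical; real costs also fold in service contracts,
# software licenses, royalties, and platform depreciation.

def cost_per_specimen(fixed_run_cost, reagent_cost_per_specimen, specimens_per_run):
    """Amortize fixed per-run costs over the specimens batched on the run."""
    return fixed_run_cost / specimens_per_run + reagent_cost_per_specimen

# A fully loaded 24-specimen run vs. a half-filled run:
full = cost_per_specimen(fixed_run_cost=1200.0,
                         reagent_cost_per_specimen=50.0,
                         specimens_per_run=24)   # 1200/24 + 50 = 100.0
half = cost_per_specimen(1200.0, 50.0, 12)       # 1200/12 + 50 = 150.0
```

The same arithmetic explains why workload volume patterns matter: a laboratory with low or erratic test demand pays substantially more per specimen on a high-capacity platform than one that can reliably fill each run.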

Laboratories must also deal with the problem of obsolescence.

Investing in a new platform brings the angst of knowing that better machines and chemistries are just around the corner. Laboratories are buying bigger pieces of equipment with shorter service lives.

Beyond choosing a platform, two distinct challenges arise in bringing NGTs into the lab. The first is assembling the resources for validation and quality assurance. The second is keeping tests up-to-date as new analytes are needed.

While options for information technology (IT) pipelines for NGTs are improving rapidly, recent studies still show significant inconsistencies and lack of reproducibility when it comes to interpreting variants in array comparative genomic hybridization, panel testing, tumor expression profiling, and tumor genome sequencing, Carter elaborates.

Tumor biology brings another level of complexity. Bioinformatic analysis must distinguish tumor-specific (somatic) variants from inherited germline variants. Sequencing of paired normal tissue is often performed as a control, but virtual normal controls may have intriguing advantages. One of the biggest challenges is to reproducibly interpret the clinical significance of interactions between different mutations, even among commonly known, well-defined mutations. These complexities mean that IT must be designed into the process from the beginning. Like robotics, IT represents a major ancillary decision.
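The paired-normal control logic reduces, at its simplest, to set subtraction: variants also seen in the patient's normal tissue are treated as germline and removed, leaving candidate somatic variants. The following sketch uses made-up variant tuples and ignores the real-world complications (sequencing error, tumor purity, subclonal allele fractions) that production somatic callers must model.

```python
# Minimal sketch of tumor/normal subtraction for somatic variant
# filtering. Variants are (chrom, pos, ref, alt) tuples; all
# coordinates below are illustrative, not real annotations.

def somatic_variants(tumor_calls, normal_calls):
    """Return tumor variants absent from the paired-normal calls."""
    germline = set(normal_calls)
    return [v for v in tumor_calls if v not in germline]

tumor = [("chr7", 140453136, "A", "T"),
         ("chr17", 7577120, "C", "T")]
normal = [("chr17", 7577120, "C", "T")]   # also in normal -> germline

candidates = somatic_variants(tumor, normal)  # keeps only the chr7 variant
```

A "virtual normal" approach follows the same pattern, except `normal_calls` would come from a panel of unrelated genomes or population databases rather than the patient's own tissue.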

New technical advances are about to change how, where, and for whom testing is performed. As an example, whole-genome and whole-exome tumor sequencing approaches provide a wealth of biologically important information, and will replace individual or multiple gene test panels as the technical cost of sequencing declines.

Pick up the October issue of CLN and read more about the challenges of bringing rapidly evolving NGT into the clinical lab setting.