American Association for Clinical Chemistry
Better health through laboratory medicine
June 2011 Clinical Laboratory News: A New Paradigm for Guideline Development


June 2011: Volume 37, Number 6

A New Paradigm for Guideline Development

A self-described perfectionist, A. Rita Horvath, MD, PhD, is quick to point out what she sees as shortcomings in the process used to develop the National Academy of Clinical Biochemistry’s (NACB) 2011 update of Guidelines and Recommendations for Laboratory Analysis in the Diagnosis and Management of Diabetes Mellitus (LMPG). But in point of fact, Horvath and her eight colleagues on the guideline committee essentially created a new approach to identifying key areas of change in diabetology and assessing the quality of evidence for guidelines related to laboratory medicine.

“It’s not perfect, but it’s a step forward and I’m hoping it will contribute to guideline development in the future,” said Horvath, who is clinical director of the SEALS department of clinical chemistry at Prince of Wales Hospital in Sydney, Australia.

The challenge was to assess evidence that relates specifically to lab testing for screening, diagnosing, and monitoring diabetes, rather than clinical management of the disease, the subject of most existing guidelines. For example, in the 2002 NACB LMPG on Diabetes, “we just accepted the American Diabetes Association grading criteria, which are designed to evaluate clinical information,” recalled guidelines committee chair David Sacks, MD, senior investigator and chief of the clinical chemistry department at the National Institutes of Health in Bethesda, Md. “This time we decided to develop a new grading system because what we had wasn’t a suitable grading system to evaluate lab tests.”

Horvath, an expert on evidence-based medicine, used the international Grading of Recommendations Assessment, Development and Evaluation (GRADE) system for rating guidelines as a resource, but modified this approach to deal with particular issues related to grading diagnostic testing. “We were a little stuck in how to overcome the problem that we didn’t have a perfect system available to use. So I had to develop a kind of hybrid system using internationally accepted grading rules,” she explained. “What we have is a system adapted from existing systems in a pragmatic manner, which was more suited for this group’s expertise.”

In the end, Horvath guided her colleagues through the process of analyzing evidence for each of seven chapters that mirrored their professional expertise. For each guideline recommendation, Horvath asked them to prioritize the key questions that were most relevant in terms of outcomes. Each committee member then explained why a modification to an existing NACB LMPG recommendation was needed, cited the key references supporting the new recommendation, and described the types of study designs used in those references. Committee members also were asked to rank both the level and quality of evidence for each recommendation, and to provide any explanatory comments.

Though Horvath would have liked an even more thorough process—for instance, reviewing every single study that supported each recommendation—she is still satisfied with the result. “The strength of our process is its explicitness, transparency, and reproducibility. This will be important whenever the next update of this guideline occurs. Readers can look back to our evidence tables and see what was behind each recommendation.”

A full description of the methodology, prioritization criteria, and evidence tables is available online at