April 2010 Clinical Laboratory News: Diagnostic Profiles

April 2010: Volume 36, Number 4

Procalcitonin-Guided Strategy Reduces Antibiotic Use in ICU

New research indicates that a procalcitonin-guided strategy to treat suspected bacterial infections in intensive care unit (ICU) patients reduced antibiotic exposure with no apparent adverse outcomes (Lancet 2010;375:463–74). This strategy could help curb the growing rate of antibiotic resistance, according to the authors.

The study involved 621 patients in eight ICUs who had suspected bacterial infection at admission or during their ICU stay. Patients were randomized to either a control or a treatment arm; those in the treatment arm received antibiotics according to procalcitonin cutoffs set out in predefined algorithms, subject to physician discretion. Antibiotic use was strongly encouraged at procalcitonin concentrations ≥1 µg/L, encouraged at concentrations ≥0.5 and <1 µg/L, discouraged at concentrations ≥0.25 and <0.5 µg/L, and strongly discouraged at concentrations <0.25 µg/L. For patients in the control group, physicians received a reminder about the recommended duration of antimicrobial treatment for the most frequent types of infection. In both groups, however, drug selection and the final decision to start or stop antibiotics were left to the treating physician.
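
For illustration only, a minimal Python sketch of the four initiation cutoffs described above; the function name and return labels are ours, and in the trial the cutoffs were applied through predefined algorithms, with the final decision always left to the treating physician.

def antibiotic_start_recommendation(pct_ug_per_l: float) -> str:
    """Map a serum procalcitonin concentration (µg/L) to the trial's
    graded recommendation for starting antibiotics."""
    if pct_ug_per_l >= 1.0:
        return "strongly encouraged"
    if pct_ug_per_l >= 0.5:        # 0.5 <= PCT < 1.0
        return "encouraged"
    if pct_ug_per_l >= 0.25:       # 0.25 <= PCT < 0.5
        return "discouraged"
    return "strongly discouraged"  # PCT < 0.25

For example, antibiotic_start_recommendation(0.7) returns "encouraged".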

After subjects in the treatment arm began receiving antibiotics, investigators used serial serum procalcitonin measurements to guide antibiotic discontinuation. Physicians were encouraged to discontinue antibiotics when the procalcitonin concentration fell below 80% of the peak concentration or when the absolute concentration fell below 0.5 µg/L.
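
Similarly, the stopping rule can be written as a one-line check; again a sketch based on the article's description, not the trial's actual protocol materials.

def discontinuation_encouraged(current_pct: float, peak_pct: float) -> bool:
    """True when either stopping criterion is met (concentrations in µg/L)."""
    return current_pct < 0.8 * peak_pct or current_pct < 0.5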

The investigators found that patients in the treatment arm had significantly more days without antibiotics than those in the control group, reflecting an absolute mean difference of 2.7 days and an estimated 23% relative reduction in antibiotic exposure. In addition, there was no significant mortality difference between the two groups.

Recommended Tests Not Performed for Antipsychotic Therapy

FDA warnings issued in 2003 about the need to monitor glucose levels in patients taking second-generation antipsychotic (SGA) drugs who have an established diagnosis of, risk factors for, or symptoms of diabetes “appear to have had no detectable effect” on glucose or lipid testing rates in a Medicaid population analyzed as part of a recent study (Arch Gen Psychiatry 2010;67:17–24). The findings underscore that “more effort is needed” to ensure that patients who receive this class of drugs are screened and monitored for potential adverse drug events, according to the authors.

In late 2003, FDA required a label change on SGAs describing an increased risk of hyperglycemia and diabetes, and compelled drug manufacturers to mail letters to healthcare providers informing them of the warning. Concurrently, the American Diabetes Association (ADA) and American Psychiatric Association (APA) issued a consensus statement that described the metabolic risks of SGA drugs and outlined a monitoring protocol for patients taking the medications.

In the newly published paper, researchers conducted a retrospective analysis of claims data for 109,451 Medicaid patients who had a new prescription claim for an SGA drug, either before or after the FDA warnings and ADA/APA consensus statement. The investigators compared testing rates with those in a control cohort of 203,527 patients who started albuterol therapy but not SGA medications. The goal of the study was to determine whether glucose and lipid testing increased for this population of patients and whether prescribing patterns shifted to drugs with lower metabolic risk after the warnings.

The study indicated that before the warnings were issued, only 26.9% and 10% of patients initiating SGA therapy had baseline serum glucose and lipid testing performed, respectively. Patients who continued SGA medications did not have higher rates of testing than those who initiated therapy. Further, testing rates in SGA-treated patients were similar to those in the control group, despite the well-characterized increased risk of diabetes and cardiovascular disease in patients taking SGA medications.

Reference Values Established for CSF WBC Count in Infants

Researchers at Children’s Hospital of Philadelphia have developed age-specific cerebrospinal fluid (CSF) white blood cell (WBC) count reference values for neonates and young infants that can be used to accurately interpret the results of lumbar puncture (Pediatrics 2010;125:257–64). The analysis addressed important methodological limitations of the few previous studies aimed at determining reference ranges in this population; by using PCR testing, the researchers were able to detect and exclude children with enterovirus-positive CSF, who had not been excluded from earlier studies.

Reference values for CSF WBC count in infants have been based mostly on older children considered healthy after initial evaluation for central nervous system infection. A limited number of studies examined patients ≤56 days old but did not exclude those with traumatic lumbar puncture, seizures, sepsis, or other conditions that might affect the reference ranges. This retrospective analysis included 380 infants ≤56 days old who presented to the emergency department with an indication for lumbar puncture and who did not meet the study’s exclusion criteria, such as a CSF sample positive for enterovirus by PCR.

Infants 0–28 days old had a median CSF WBC count of 3/µL, with a 95th percentile value of 19/µL. In contrast, infants 29–56 days old had a significantly lower median CSF WBC count of 2/µL, with a 95th percentile value of 9/µL.
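
As an illustration of how these cutoffs might be applied, the hypothetical helper below flags a CSF WBC count against the age-specific 95th percentile; the study reports the reference values themselves, not this function.

CSF_WBC_95TH_PERCENTILE = {
    "0-28 days": 19,   # cells/µL
    "29-56 days": 9,   # cells/µL
}

def csf_wbc_flag(age_days: int, wbc_per_ul: float) -> str:
    """Flag a CSF WBC count against the age-specific 95th percentile
    (valid only for infants <=56 days old, the study population)."""
    key = "0-28 days" if age_days <= 28 else "29-56 days"
    cutoff = CSF_WBC_95TH_PERCENTILE[key]
    return "above reference" if wbc_per_ul > cutoff else "within reference"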

Proteinuria Enhances eGFR Risk Prediction

A study by Canadian researchers indicates that higher levels of proteinuria independently increased the risks of death, myocardial infarction (MI), and progression to kidney failure at a given level of estimated glomerular filtration rate (eGFR) (JAMA 2010;303:423–9). The findings are significant because current guidelines for the classification and staging of chronic kidney disease (CKD) are based on eGFR without specifically considering the severity of coexisting proteinuria. The study also suggests that simple dipstick urinalysis adds “considerable prognostic information” to eGFR alone, according to the authors.

The study involved 920,985 adults in Alberta, Canada, who had at least one outpatient serum creatinine measurement and one proteinuria measurement, and who were not receiving dialysis and had not received a kidney transplant at baseline. The researchers estimated index eGFR from baseline serum creatinine measurements using the Modification of Diet in Renal Disease (MDRD) Study equation. Proteinuria was assessed by either urine dipstick or albumin-creatinine ratio (ACR). Median follow-up was 35 months.
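
For reference, a minimal sketch of the 4-variable MDRD Study equation; we assume the commonly used IDMS-traceable form (coefficient 175) and conventional units, and the paper itself should be consulted for the exact version the researchers applied.

def egfr_mdrd(scr_mg_dl: float, age_years: float, female: bool, black: bool) -> float:
    """Estimated GFR in mL/min/1.73 m^2 from serum creatinine (mg/dL)."""
    egfr = 175.0 * scr_mg_dl ** -1.154 * age_years ** -0.203
    if female:
        egfr *= 0.742
    if black:
        egfr *= 1.212
    return egfr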

The researchers found that age-adjusted rates of MI, death, doubling of serum creatinine, and renal replacement therapy were all higher at lower levels of eGFR and at heavier proteinuria. Notably, the difference in risk associated with moderate or heavy proteinuria versus no proteinuria was clinically relevant in every eGFR stratum and for all four outcomes. For instance, patients with heavy proteinuria but normal eGFR appeared to have worse clinical outcomes than those with moderately reduced eGFR but no proteinuria.

In addition, although ACR generally is favored over dipstick urinalysis for assessing CKD, the researchers found that the magnitude of excess risk associated with heavy proteinuria appeared similar whether assessed by dipstick or ACR. This suggests that dipstick urinalysis is a valid alternative to ACR for risk stratification, particularly in resource-limited settings.