In This Issue...

Lab Results Reporting Change Significantly Reduced Inappropriate Antimicrobial Orders

A pilot proof-of-concept study demonstrated that not routinely reporting urine culture results from non-catheterized inpatients can greatly reduce unnecessary antimicrobial therapy for asymptomatic bacteriuria (Clin Infect Dis 2014;58:980–3).

Researchers at the University of Toronto conducted the study after observing that urine cultures from non-catheterized patients were rarely associated with urinary tract infection (UTI). In addition, prior studies have shown that antimicrobial therapy for asymptomatic bacteriuria (ASB) does not confer benefits except in specific populations, yet is associated with adverse drug reactions and selection for infection with increasingly resistant bacteria.

Realizing that changing urine culture ordering practices would require modifying long-standing clinical protocols and beliefs, the authors instead tested the effect of a change in laboratory reporting practices. During the study's implementation period, they stopped routinely reporting positive results from non-catheterized patients and instead posted a message in the electronic health record advising clinicians that most positive urine cultures from inpatients without catheters were likely due to ASB. The message asked clinicians who strongly suspected UTI in their patients to contact the laboratory for the results.

The researchers found that 23% of urine specimens were positive, but UTI was present in only 2% of non-catheterized patients and 3% of those with catheters. The rate of antimicrobial therapy for ASB was 48% before the intervention but dropped to 12% during the intervention period, an absolute risk reduction of 36 percentage points. Based on these findings, the researchers concluded that withholding positive urine culture results from non-catheterized patients unless physicians called the lab to request them greatly reduced antimicrobial therapy for ASB.
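The risk-reduction arithmetic behind these figures is straightforward; the short Python sketch below reproduces it. Only the 48% and 12% treatment rates come from the study, and the helper functions are illustrative rather than anything the authors used.

def absolute_risk_reduction(rate_before, rate_after):
    """Difference in event rates, expressed in percentage points."""
    return rate_before - rate_after

def relative_risk_reduction(rate_before, rate_after):
    """Proportional drop relative to the baseline rate."""
    return (rate_before - rate_after) / rate_before

before, after = 0.48, 0.12  # antimicrobial therapy for ASB: pre-intervention vs. intervention period
print(f"Absolute risk reduction: {absolute_risk_reduction(before, after):.0%}")  # 36%
print(f"Relative risk reduction: {relative_risk_reduction(before, after):.0%}")  # 75%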

Biomarker-Based Sepsis Risk Model Validated

A multinational, multi-institutional research team reported deriving, testing, and validating a biomarker-based risk model that estimates mortality in adults with septic shock (Crit Care Med 2014;42:781–9). The authors suggested the findings could enhance the design of future septic shock clinical trials and serve as a benchmarking metric for quality improvement efforts.

The authors previously had derived, tested, and validated a biomarker-based model that reliably predicts 28-day mortality in children with septic shock. They hypothesized that the same biomarkers would perform similarly in an adult population.

The biomarker panel included 12 analytes; of these, three (chemokine ligand 3, heat shock protein 70 kDa 1B, and interleukin-8) persisted in the model's upper decision rules in both the pediatric and adult populations. In the current adult study, biomarkers "dominated" the model's upper decision rules, while clinical variables contributed only to lower decision rules or not at all, the authors said. In addition to the biomarker panel and clinical variables, the model also incorporated serum lactate concentration.
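As a rough illustration of what a biomarker-based decision tree of this kind looks like, the Python sketch below fits a shallow CART-style tree on synthetic data; the feature list, the data, and the outcome definition are placeholders and do not reflect the authors' actual model or cohort.

import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
features = ["CCL3", "HSPA1B", "IL8", "lactate", "age"]           # illustrative subset of candidate variables
X = rng.lognormal(size=(200, len(features)))                     # synthetic analyte concentrations
y = (X[:, 0] + X[:, 3] + rng.normal(size=200) > 3).astype(int)   # synthetic 28-day mortality label

tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=20).fit(X, y)
print(export_text(tree, feature_names=features))  # splits near the root correspond to "upper decision rules"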

The calibrated decision tree applied in the validation cohort had a sensitivity of 85%, a specificity of 60%, and positive and negative predictive values of 61% and 85%, respectively, in estimating the probability of mortality in adults with septic shock. Among subjects in the validation cohort classified by the model as low risk, the mortality rate was 15.1%, whereas it was 60.9% among subjects classified as high risk.
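For readers who want to see how these performance figures hang together, the sketch below applies standard Bayes' rule arithmetic to the reported sensitivity and specificity; the prevalence value is an illustrative assumption, since the cohort's overall mortality rate is not given here.

def predictive_values(sensitivity, specificity, prevalence):
    """Convert sensitivity, specificity, and prevalence into PPV and NPV."""
    tp = sensitivity * prevalence
    fn = (1 - sensitivity) * prevalence
    tn = specificity * (1 - prevalence)
    fp = (1 - specificity) * (1 - prevalence)
    return tp / (tp + fp), tn / (tn + fn)

# Reported model performance: sensitivity 85%, specificity 60%.
ppv, npv = predictive_values(0.85, 0.60, prevalence=0.42)  # 0.42 is an assumed mortality rate
print(f"PPV ≈ {ppv:.0%}, NPV ≈ {npv:.0%}")  # close to the 61% and 85% reported above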

IGRA Less Specific Than Tuberculin Skin Testing

A retrospective chart review found that QuantiFERON-TB Gold In-Tube (QFT-GIT) testing was less specific than tuberculin skin testing (TST) in a diverse population of students at the University of Pennsylvania (Clin Infect Dis 2014;58:1260–6). The findings support the use of TST for college students in the United States and risk-stratified result interpretation for students tested with QFT-GIT, according to the authors.

About 750,000 of the 20 million college students in the United States were not born in the U.S., and many come from countries with a high incidence of Mycobacterium tuberculosis infection and have been vaccinated at birth with Bacillus Calmette-Guérin (BCG). Guidelines from the Centers for Disease Control and Prevention suggest that while TST and interferon gamma release assays (IGRA) may be used interchangeably, IGRA is the preferred test for people who have received BCG vaccination because it does not cross-react with BCG. However, the authors suggested cautious interpretation of this proposed benefit of IGRA, because BCG has a variable effect on TST and its cross-reactivity wanes over time.

The researchers sought to determine how well QFT-GIT—an IGRA—and TST performed in detecting tuberculosis infection in students with varied risk profiles.

During the study period, 9,483 students received 15,936 tuberculosis tests. Coming from a tuberculosis-endemic country was the only risk factor significantly associated with a positive test result. TST had higher specificity than QFT-GIT: 99.7% versus 91.4%. When the researchers assessed a higher QFT-GIT cutoff of ≥1.0 IU/mL, the test's specificity improved to 96.1%, still lower than that of TST. At the manufacturer's recommended cutoff of ≥0.35 IU/mL, QFT-GIT yielded results comparable to TST in the high-risk group of students born outside the U.S. in tuberculosis-endemic countries.
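The effect of raising the QFT-GIT cutoff on specificity can be illustrated with a few lines of Python; the interferon gamma values and the presumed-uninfected flags below are invented for the example, and only the 0.35 and 1.0 IU/mL cutoffs come from the article.

results_iu_ml = [0.05, 0.12, 0.40, 0.08, 1.30, 0.02, 0.55, 0.01]         # hypothetical TB antigen minus nil values
presumed_uninfected = [True, True, True, True, False, True, True, True]  # hypothetical low-risk reference standard

def specificity(values, uninfected, cutoff):
    """Fraction of presumed-uninfected subjects who test negative at a given cutoff."""
    negatives = [value < cutoff for value, is_neg in zip(values, uninfected) if is_neg]
    return sum(negatives) / len(negatives)

for cutoff in (0.35, 1.0):  # manufacturer's cutoff vs. the higher cutoff the researchers assessed
    print(f"Cutoff ≥{cutoff} IU/mL: specificity {specificity(results_iu_ml, presumed_uninfected, cutoff):.0%}")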