20-Minute Void Interval Acceptable in C. trachomatis NAAT
Nucleic acid amplification testing (NAAT) for Chlamydia trachomatis after a 20-minute voiding interval is sufficiently sensitive in asymptomatic men that this protocol could be used in opportunistic screening for the organism, according to the authors of a new study (Sex Transm Dis 2012;39:405–406). NAAT manufacturers generally recommend a minimum voiding interval of 1 or 2 hours before testing for C. trachomatis, but these recommendations are not based on substantive evidence and may be a barrier to screening in some men, the researchers suggested. Prior studies in predominantly symptomatic men found no difference in NAAT sensitivity with shorter-than-recommended voiding intervals; however, asymptomatic men typically have a considerably lower organism load and so might be expected to yield false-negative results more often with shorter voiding intervals.
The researchers recruited men returning to a sexual health center for treatment of laboratory-confirmed urethral C. trachomatis infection who had no evidence of symptomatic urethritis. At this recall visit, patients completed a short questionnaire to determine whether they had voided within the past hour, were taking any antibiotics, or had developed any minor urethral symptoms since their initial positive C. trachomatis test. Two first-void urine samples were collected: the first at >1 hour since the last void, and the second 20 minutes later.
Of 31 pairs of first-void samples, all of the initial samples taken >1 hour since the last void were positive, as were 29 (93.5%) of the second samples taken 20 minutes later.
MRSA Outbreak Investigation: Clinically Actionable Data from Whole-Genome Sequencing
British researchers recently used rapid whole-genome sequencing to investigate a neonatal methicillin-resistant Staphylococcus aureus (MRSA) outbreak and concluded that this method can provide clinically relevant data within a timeframe that can influence patient care. However, the need for automated data interpretation and clinically meaningful reports remains a barrier to implementing this method in routine practice (N Engl J Med 2012;366:2267–75).
After a baby transferred from a general hospital to the neonatal intensive care unit of a women’s and children’s hospital developed MRSA bacteremia, the hospital initiated special infection control precautions and an infection control investigation. This included screening patients and staff for MRSA colonization, antimicrobial susceptibility testing, and DNA sequencing and analysis of each MRSA isolate. The sequenced isolates comprised seven associated with the outbreak and seven others associated with MRSA carriage or bacteremia.
The researchers aligned DNA sequences with a reference isolate of the most common hospital-associated MRSA clone in the U.K. to identify single-nucleotide polymorphisms (SNPs), insertions, and deletions. This process revealed a distinct cluster of outbreak isolates and a clear difference between these and non-outbreak isolates. The authors also established the antimicrobial resistance profile of the outbreak strain by searching for SNPs previously reported to confer resistance to ciprofloxacin and rifampin and comparing the result to the resistance pattern defined by standard antimicrobial susceptibility testing.
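The comparison step can be illustrated in miniature. The toy sketch below uses hypothetical sequences far shorter than a real genome; actual whole-genome analyses rely on read alignment and variant-calling pipelines, but the per-position comparison against a reference is the same idea.

```python
# Toy illustration (hypothetical data): calling SNPs by comparing each
# isolate to a reference sequence, position by position.

REFERENCE = "ATGCGTACGTTAGC"  # stand-in for the reference MRSA genome

isolates = {
    "outbreak_1":     "ATGCGTACGTTAGC",  # identical to reference
    "outbreak_2":     "ATGCGAACGTTAGC",  # one SNP at position 5 (T->A)
    "non_outbreak_1": "ATGGGTACGTAAGC",  # SNPs at positions 3 and 10
}

def call_snps(sequence, reference=REFERENCE):
    """Return (position, ref_base, alt_base) for every mismatch."""
    return [
        (i, ref, alt)
        for i, (ref, alt) in enumerate(zip(reference, sequence))
        if ref != alt
    ]

for name, seq in isolates.items():
    print(name, call_snps(seq))
```

Isolates sharing few SNPs relative to one another (here, the two outbreak isolates) cluster together, while isolates with many private variants fall outside the cluster.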
According to the authors, their findings suggest that there could be value in performing whole-genome sequencing in real time as an integral part of MRSA control in hospitals.
Risk Model Predicts Mortality in Acute Heart Failure
A risk model incorporating levels of creatinine, potassium, and troponin along with eight other variables stratifies mortality risk with high discrimination in patients with acute heart failure (HF) presenting to the emergency department (ED) (Ann Intern Med 2012;156:767–775). The findings suggest that this simple clinical risk model can help clinicians improve care and outcomes for acute HF patients.
The authors developed the model because the current standard of care for decompensated acute HF patients presenting to the ED relies predominantly on clinical judgment. This may result in excess hospitalization of low-risk patients and discharge of high-risk patients who may subsequently require further treatment. Previously developed algorithms focused on hospitalized HF patients, not those in the ED.
To develop the model, the researchers studied data from 12,500 patients who presented to the ED with acute HF and were either discharged or hospitalized. Trained nurse abstractors reviewed a random subset of cases selected for chart abstraction, recording an extensive list of covariates. Variables that a series of statistical analyses determined to be the most prognostic for the primary outcome, 7-day mortality after initial presentation, were incorporated into the risk model. Troponin levels above the upper limit of normal were associated with an odds ratio of 2.75; potassium levels ≥4.6 mmol/L and ≤3.9 mmol/L were associated with odds ratios of 1.71 and 1.09, respectively; and each 1 mg/dL increase in creatinine was associated with an odds ratio of 1.35. Other factors included age, whether the patient was transported to the ED by ambulance, blood pressure, heart rate, oxygen saturation, and whether the patient had cancer or had taken metolazone at home.
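The reported odds ratios combine multiplicatively on the odds scale. The sketch below is illustrative only: the choice of reference patient and the restriction to the three laboratory variables are assumptions, and the published model uses its own fitted coefficients across all eleven variables.

```python
# Odds ratios reported in the text for the laboratory variables.
# The reference patient (normal troponin, potassium between 4.0 and
# 4.5 mmol/L, no creatinine increase) is an illustrative assumption.
OR_TROPONIN_HIGH = 2.75         # troponin above upper limit of normal
OR_POTASSIUM_HIGH = 1.71        # potassium >= 4.6 mmol/L
OR_POTASSIUM_LOW = 1.09         # potassium <= 3.9 mmol/L
OR_CREATININE_PER_MG_DL = 1.35  # per 1 mg/dL increase in creatinine

def relative_odds(troponin_high, potassium_mmol_l, creatinine_increase_mg_dl):
    """Odds of 7-day mortality relative to the reference patient,
    combining the reported odds ratios multiplicatively."""
    odds = 1.0
    if troponin_high:
        odds *= OR_TROPONIN_HIGH
    if potassium_mmol_l >= 4.6:
        odds *= OR_POTASSIUM_HIGH
    elif potassium_mmol_l <= 3.9:
        odds *= OR_POTASSIUM_LOW
    odds *= OR_CREATININE_PER_MG_DL ** creatinine_increase_mg_dl
    return odds

# Elevated troponin, potassium 4.8 mmol/L, creatinine 1 mg/dL above
# reference: 2.75 x 1.71 x 1.35, roughly 6.3-fold higher odds.
print(round(relative_odds(True, 4.8, 1.0), 2))
```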
Areas under the receiver-operating characteristic curve for this model were 0.805 and 0.826 in the derivation and validation data sets, respectively.
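The area under the receiver-operating characteristic curve equals the probability that a randomly chosen patient who died received a higher predicted risk than a randomly chosen survivor, which gives a direct way to compute it. The scores and outcomes below are hypothetical, not taken from the study.

```python
def auc(scores, outcomes):
    """Mann-Whitney estimate of the ROC AUC: the fraction of
    (event, non-event) pairs in which the event has the higher
    predicted risk; ties count as half."""
    pos = [s for s, y in zip(scores, outcomes) if y == 1]
    neg = [s for s, y in zip(scores, outcomes) if y == 0]
    wins = sum(
        1.0 if p > n else 0.5 if p == n else 0.0
        for p in pos for n in neg
    )
    return wins / (len(pos) * len(neg))

# Hypothetical predicted risks and 7-day mortality outcomes (1 = died).
print(auc([0.9, 0.7, 0.4, 0.2], [1, 0, 1, 0]))  # -> 0.75
```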
Age-Dependent Cutoffs Increase Specificity of D-Dimer Testing
Combining a low clinical probability of deep vein thrombosis (DVT) with age-dependent D-dimer cutoffs considerably increased the proportion of primary care patients in whom DVT could be safely excluded, compared with the conventional cutoff of 500 µg/L (BMJ 2012;344:e2985).
Risk of venous thromboembolism (VTE) rises with age, but comorbidities can mask symptoms. D-dimer, a degradation product of fibrin, is helpful for ruling out VTE in patients with suspected disease but low pretest probability. However, D-dimer levels also rise with age, decreasing the specificity of D-dimer results in older patients. Recently, age-specific D-dimer cutoffs have been proposed, but they had not been validated in patients with suspected DVT. This led the investigators to evaluate age-adapted cutoffs for excluding DVT in primary care patients with suspected DVT: for patients older than age 50, a cutoff of age in years multiplied by 10 µg/L, and, for those age 60 or older, a fixed cutoff of 750 µg/L.
The study involved 1,374 consecutive patients with clinically suspected DVT, more than two-thirds of whom were older than age 50. Patients were assessed clinically using the Wells clinical probability score, and D-dimer was measured with one of two assays. They also underwent compression ultrasonography of the symptomatic leg, repeated at day 7. In all, 647 patients had an unlikely clinical probability of DVT based on the Wells score. Of these, DVT could be excluded in 47.8% using age-dependent cutoff values, compared with 42% using the conventional 500 µg/L cutoff. The age-dependent cutoff had the greatest impact in patients older than age 80: 21% would have been ruled out for DVT based on the standard cutoff versus 33.9% based on the age-dependent threshold.
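The two proposed cutoffs, and the exclusion rule of an unlikely Wells score combined with a D-dimer below the cutoff, can be written out as a rough sketch; the function names are illustrative, not from the paper.

```python
def ddimer_cutoff_age_times_10(age_years):
    """Age-dependent D-dimer cutoff (ug/L): age x 10 for patients
    older than 50, otherwise the conventional 500 ug/L."""
    return age_years * 10 if age_years > 50 else 500

def ddimer_cutoff_fixed_750(age_years):
    """Alternative proposed cutoff: 750 ug/L for patients age 60 or
    older, otherwise the conventional 500 ug/L."""
    return 750 if age_years >= 60 else 500

def dvt_excluded(wells_unlikely, ddimer_ug_l, age_years,
                 cutoff=ddimer_cutoff_age_times_10):
    """DVT is excluded only when the Wells score is 'unlikely' AND
    the D-dimer result falls below the chosen cutoff."""
    return wells_unlikely and ddimer_ug_l < cutoff(age_years)

# An 80-year-old with a D-dimer of 780 ug/L and an unlikely Wells score
# is excluded under the age x 10 rule (cutoff 800 ug/L) but not under
# the conventional 500 ug/L cutoff.
print(dvt_excluded(True, 780, 80))
```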