Aβ42:P-tau Ratio Predicts Alzheimer's Disease 5–10 Years Before Diagnosis

Swedish researchers recently reported that a low baseline β-amyloid 42 (Aβ42):phosphorylated tau (P-tau) ratio predicted development of Alzheimer's disease within 9.2 years, with a positive predictive value of 91% and a negative predictive value of 86% (Arch Gen Psychiatry 2012;69:98–106). The findings demonstrate that these markers can identify individuals at high risk of Alzheimer's disease at least 5–10 years before they have symptoms of dementia.
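For readers less familiar with these metrics, the sketch below shows how positive and negative predictive values are computed from a 2×2 outcome table. The counts are hypothetical, chosen only to reproduce the reported 91% and 86% figures; they are not the study's actual data.

```python
# Hypothetical 2x2 counts chosen to illustrate the reported predictive
# values; they are NOT taken from the study itself.
def predictive_values(tp, fp, tn, fn):
    """Return (PPV, NPV) for a binary biomarker classification."""
    ppv = tp / (tp + fp)  # P(developed AD | low Abeta42:P-tau ratio)
    npv = tn / (tn + fn)  # P(no AD | normal Abeta42:P-tau ratio)
    return ppv, npv

# e.g., 91 of 100 biomarker-positive patients convert to Alzheimer's
# disease, and 86 of 100 biomarker-negative patients remain disease-free:
ppv, npv = predictive_values(tp=91, fp=9, tn=86, fn=14)
print(f"PPV = {ppv:.0%}, NPV = {npv:.0%}")  # PPV = 91%, NPV = 86%
```

Note that, unlike sensitivity and specificity, both predictive values depend on how common conversion to Alzheimer's disease is in the cohort studied.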

The study involved 137 patients with mild cognitive impairment (MCI) from whom cerebrospinal fluid samples had been collected at baseline, extending a previously published study with a median follow-up of 5.2 years. Overall, 41 patients remained cognitively stable, while 57 developed Alzheimer's disease and 21 developed other types of dementia. In addition, 39 healthy volunteers served as controls. Besides Aβ42 and P-tau, the investigators measured total tau (T-tau) at baseline.

Patients with MCI who later developed Alzheimer's disease had significantly higher baseline levels of T-tau and P-tau and lower levels of Aβ42 than controls, patients with stable MCI, and patients who developed non-Alzheimer's dementias. The researchers also found that baseline T-tau levels were significantly higher in early converters to Alzheimer's disease than in patients diagnosed later in follow-up.

Aldosterone Levels Linked to Outcomes in Coronary Artery Disease

Aldosterone levels are strongly and independently associated with mortality and acute ischemic events in patients with coronary artery disease (CAD) but without heart failure or acute myocardial infarction (AMI) (Eur Heart J 2012;33:191–202). These findings extend the results of previous studies in patients with heart failure and AMI, suggesting that aldosterone levels may provide incremental prognostic information beyond typical markers of CAD such as C-reactive protein (CRP) and brain natriuretic peptide (BNP) levels, age, and left ventricular ejection fraction.

The study involved 799 CAD patients referred for elective coronary angioplasty who were followed for a median of 14.9 months afterwards. In addition to aldosterone, CRP, and BNP levels, the investigators took baseline measurements of other biomarkers, including lipids, glucose, creatinine, and HbA1c. Study endpoints included cardiovascular-related mortality, total mortality, acute ischemic events, and the composite of death and acute ischemic events.

Multivariate analysis showed that higher aldosterone levels were associated with a higher New York Heart Association heart failure class, higher BMI, and hypertension, and inversely correlated with age, beta-blocker therapy, and creatinine clearance. Aldosterone levels also were independently associated with cardiovascular death, total mortality, acute ischemic events, and the composite of death and acute ischemic events.

According to the authors, the association between aldosterone and risk of acute ischemic events suggests that aldosterone may interact with the atherosclerotic process. The authors called for a randomized controlled trial to investigate whether aldosterone-receptor blockade can improve prognosis in patients with CAD, whether or not their condition is complicated by left ventricular dysfunction.

Midstream or First-void Urine Specimens Equally Sensitive for C. trachomatis by NAAT

Research involving nucleic acid amplification testing (NAAT) for Chlamydia trachomatis suggests the timing of specimen collection may not be as important as previously thought (Ann Fam Med 2012; 10:50–53). If confirmed in similar studies, the findings could simplify specimen collection practices for this common sexually transmitted bacterium.

The preferred specimen types for C. trachomatis testing are vaginal swabs and first-void urine samples, the latter on the assumption that urethral bacteria are most likely to be captured in the first urine passing over the epithelial cells. Midstream urine samples, in contrast, are preferred for the work-up of suspected urinary tract infection (UTI), a common reason for women to seek primary care. If midstream urine samples proved sufficiently sensitive for detecting C. trachomatis, compliance with testing might improve, since women could provide a single, easier-to-obtain midstream sample for both UTI and C. trachomatis testing. This led the researchers to evaluate whether NAAT of midstream urine samples is sufficiently sensitive for routine clinical use.

The study involved 100 women who previously had tested positive for C. trachomatis by NAAT of vaginal swabs and who were returning for antibiotic treatment. Participants provided both first-void and midstream urine samples, and the researchers tested both specimen types from all subjects by NAAT. Sensitivity was 96% for first-void samples and 95% for midstream samples.
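Because every participant was a known positive, sensitivity here reduces to the fraction of women whose urine NAAT was also positive. A minimal sketch, with counts inferred from the reported percentages in 100 participants (assumed, not quoted from the paper):

```python
# Sensitivity = detected positives / all known positives.
# With all 100 participants known positive by vaginal swab NAAT, the
# reported 96% and 95% imply 96 and 95 positive urine results, respectively.
def sensitivity(detected, known_positives):
    return detected / known_positives

first_void = sensitivity(96, 100)  # 0.96
midstream = sensitivity(95, 100)   # 0.95
print(first_void, midstream)
```

This design measures sensitivity only; specificity cannot be estimated from a cohort that contains no known negatives.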

Based on these findings, the authors conclude that the sensitivity of the two methods is sufficiently equivalent to consider the use of midstream urine collection in clinical practice.

Real-Time Continuous Glucose Monitoring Produces Long-term Effects in Type 2 Diabetics

In comparison to self-monitoring of blood glucose (SMBG), intermittent short-term use of real-time continuous glucose monitoring (RT-CGM) in type 2 diabetics significantly improved glycemic control at 12 weeks, and this improvement was sustained during the subsequent 40 weeks even without further RT-CGM (Diabetes Care 2012;35:32–38). The findings suggest that periodic use of RT-CGM could help patients improve glycemic control.

Studies have shown that RT-CGM helps both type 1 diabetics and type 2 diabetics taking prandial insulin to improve their glycemic control, but this technology has not been investigated in type 2 diabetics not on prandial insulin. This led the authors to examine the effect of RT-CGM in the latter group.

The randomized controlled trial of 100 type 2 diabetics not on prandial insulin compared 12 weeks of intermittent RT-CGM followed by 40 weeks of SMBG against SMBG alone for the entire 52 weeks. The primary study outcome was HbA1c; secondary outcomes included RT-CGM and SMBG results, weight, blood pressure, and change in diabetes-related stress.

The authors previously reported that RT-CGM used intermittently over 12 weeks was associated with a clinically significant reduction in HbA1c, compared with SMBG alone. Now they have reported the long-term effects of this 12-week intervention.

The authors found that the reductions in HbA1c observed during the first 12 weeks were sustained over the course of the entire study. Although the effect of RT-CGM attenuated after 24 weeks, HbA1c levels had not returned to baseline by the end of the study. Mean unadjusted HbA1c decreased in the RT-CGM group by 1.0, 1.2, 0.8, and 0.8% at weeks 12, 24, 38, and 52, respectively; at the same time points, mean unadjusted reductions in the SMBG group were 0.5, 0.5, 0.5, and 0.2%. This improvement was achieved without intensifying pharmacotherapy relative to the SMBG group. The authors called for further studies to confirm their findings, uncover the mechanisms behind the improvement, and determine the minimum time needed for RT-CGM to be effective.