Showing papers in "JAMA" in 2005
••
TL;DR: In this article, the period prevalence of acute renal failure (ARF) requiring renal replacement therapy (RRT) was found to be between 5% and 6% and was associated with a high hospital mortality rate.
Abstract: Context Although acute renal failure (ARF) is believed to be common in the setting of critical illness and is associated with a high risk of death, little is known about its epidemiology and outcome or how these vary in different regions of the world.
Objectives To determine the period prevalence of ARF in intensive care unit (ICU) patients in multiple countries; to characterize differences in etiology, illness severity, and clinical practice; and to determine the impact of these differences on patient outcomes.
Design, Setting, and Patients Prospective observational study of ICU patients who either were treated with renal replacement therapy (RRT) or fulfilled at least 1 of the predefined criteria for ARF from September 2000 to December 2001 at 54 hospitals in 23 countries.
Main Outcome Measures Occurrence of ARF, factors contributing to etiology, illness severity, treatment, need for renal support after hospital discharge, and hospital mortality.
Results Of 29 269 critically ill patients admitted during the study period, 1738 (5.7%; 95% confidence interval [CI], 5.5%-6.0%) had ARF during their ICU stay, including 1260 who were treated with RRT. The most common contributing factor to ARF was septic shock (47.5%; 95% CI, 45.2%-49.5%). Approximately 30% of patients had preadmission renal dysfunction. Overall hospital mortality was 60.3% (95% CI, 58.0%-62.6%). Dialysis dependence at hospital discharge was 13.8% (95% CI, 11.2%-16.3%) for survivors. Independent risk factors for hospital mortality included use of vasopressors (odds ratio [OR], 1.95; 95% CI, 1.50-2.55; P<.001), mechanical ventilation (OR, 2.11; 95% CI, 1.58-2.82; P<.001), septic shock (OR, 1.36; 95% CI, 1.03-1.79; P = .03), cardiogenic shock (OR, 1.41; 95% CI, 1.05-1.90; P = .02), and hepatorenal syndrome (OR, 1.87; 95% CI, 1.07-3.28; P = .03).
Conclusion In this multinational study, the period prevalence of ARF requiring RRT in the ICU was between 5% and 6% and was associated with a high hospital mortality rate.
3,706 citations
••
TL;DR: The cumulative incidence of stent thrombosis 9 months after successful drug-eluting stent implantation in consecutive "real-world" patients was substantially higher than the rate reported in clinical trials.
Abstract: Context Traditionally, stent thrombosis has been regarded as a complication of percutaneous coronary interventions during the first 30 postprocedural days. However, delayed endothelialization associated with the implantation of drug-eluting stents may extend the risk of thrombosis beyond 30 days. Data are limited regarding the risks and the impact of this phenomenon outside clinical trials.
Objective To evaluate the incidence, predictors, and clinical outcome of stent thrombosis after implantation of sirolimus-eluting and paclitaxel-eluting stents in routine clinical practice.
Design, Setting, and Patients Prospective observational cohort study conducted at 1 academic hospital and 2 community hospitals in Germany and Italy. A total of 2229 consecutive patients underwent successful implantation of sirolimus-eluting (1062 patients, 1996 lesions, 2272 stents) or paclitaxel-eluting (1167 patients, 1801 lesions, 2223 stents) stents between April 2002 and January 2004.
Interventions Implantation of a drug-eluting stent (sirolimus or paclitaxel). All patients were pretreated with ticlopidine or clopidogrel and aspirin. Aspirin was continued indefinitely and clopidogrel or ticlopidine for at least 3 months after sirolimus-eluting and for at least 6 months after paclitaxel-eluting stent implantation.
Main Outcome Measures Subacute thrombosis (from procedure end through 30 days), late thrombosis (>30 days), and cumulative stent thrombosis.
Results At 9-month follow-up, 29 patients (1.3%) had stent thrombosis (9 [0.8%] with sirolimus and 20 [1.7%] with paclitaxel; P = .09). Fourteen patients had subacute thrombosis (0.6%) and 15 patients had late thrombosis (0.7%). Among these 29 patients, 13 died (case fatality rate, 45%). Independent predictors of stent thrombosis were premature antiplatelet therapy discontinuation (hazard ratio [HR], 89.78; 95% CI, 29.90-269.60; P<.001), renal failure (HR, 6.49; 95% CI, 2.60-16.15; P<.001), bifurcation lesions (HR, 6.42; 95% CI, 2.93-14.07; P<.001), diabetes (HR, 3.71; 95% CI, 1.74-7.89; P = .001), and a lower ejection fraction (HR, 1.09; 95% CI, 1.05-1.36; P<.001 for each 10% decrease).
Conclusions The cumulative incidence of stent thrombosis 9 months after successful drug-eluting stent implantation in consecutive “real-world” patients was substantially higher than the rate reported in clinical trials. Premature antiplatelet therapy discontinuation, renal failure, bifurcation lesions, diabetes, and low ejection fraction were identified as predictors of thrombotic events.
3,050 citations
••
TL;DR: Improvement in practitioner performance was associated with CDSSs that automatically prompted users rather than requiring users to activate the system, and with trials in which the authors had also developed the CDSS software, among other factors.
Abstract: Context Developers of health care software have attributed improvements in patient care to these applications. As with any health care intervention, such claims require confirmation in clinical trials.
Objectives To review controlled trials assessing the effects of computerized clinical decision support systems (CDSSs) and to identify study characteristics predicting benefit.
Data Sources We updated our earlier reviews by searching the MEDLINE, EMBASE, Cochrane Library, Inspec, and ISI databases and consulting reference lists through September 2004. Authors of 64 primary studies confirmed data or provided additional information.
Study Selection We included randomized and nonrandomized controlled trials that evaluated the effect of a CDSS compared with care provided without a CDSS on practitioner performance or patient outcomes.
Data Extraction Teams of 2 reviewers independently abstracted data on methods, setting, CDSS and patient characteristics, and outcomes.
Data Synthesis One hundred studies met our inclusion criteria. The number and methodologic quality of studies improved over time. The CDSS improved practitioner performance in 62 (64%) of the 97 studies assessing this outcome, including 4 (40%) of 10 diagnostic systems, 16 (76%) of 21 reminder systems, 23 (62%) of 37 disease management systems, and 19 (66%) of 29 drug-dosing or prescribing systems. Fifty-two trials assessed 1 or more patient outcomes, of which 7 trials (13%) reported improvements. Improved practitioner performance was associated with CDSSs that automatically prompted users compared with requiring users to activate the system (success in 73% of trials vs 47%; P = .02) and with studies in which the authors also developed the CDSS software compared with studies in which the authors were not the developers (74% vs 28% success; P = .001).
Conclusions Many CDSSs improve practitioner performance. To date, the effects on patient outcomes remain understudied and, when studied, inconsistent.
2,875 citations
••
World Health Organization, University of Otago, Columbia University, American Foundation for Suicide Prevention, Ludwig Maximilian University of Munich, National Institute for Health and Welfare, University College Dublin, University of Oslo, Uppsala University, University of Würzburg, National Defense Medical College, Karolinska Institutet, University of Hong Kong
TL;DR: Physician education in depression recognition and treatment and restricting access to lethal methods reduce suicide rates, and other interventions need more evidence of efficacy.
Abstract: Context In 2002, an estimated 877 000 lives were lost worldwide through suicide. Some developed nations have implemented national suicide prevention plans. Although these plans generally propose multiple interventions, their effectiveness is rarely evaluated.
Objectives To examine evidence for the effectiveness of specific suicide-preventive interventions and to make recommendations for future prevention programs and research.
Data Sources and Study Selection Relevant publications were identified via electronic searches of MEDLINE, the Cochrane Library, and PsycINFO databases using multiple search terms related to suicide prevention. Studies, published between 1966 and June 2005, included those that evaluated preventive interventions in major domains: education and awareness for the general public and for professionals; screening tools for at-risk individuals; treatment of psychiatric disorders; restricting access to lethal means; and responsible media reporting of suicide.
Data Extraction Data were extracted on primary outcomes of interest: suicidal behavior (completion, attempt, ideation), intermediary or secondary outcomes (treatment seeking, identification of at-risk individuals, antidepressant prescription/use rates, referrals), or both. Experts from 15 countries reviewed all studies. Included articles were those that reported on completed and attempted suicide and suicidal ideation; or, where applicable, intermediate outcomes, including help-seeking behavior, identification of at-risk individuals, entry into treatment, and antidepressant prescription rates. We included 3 major types of studies for which the research question was clearly defined: systematic reviews and meta-analyses (n = 10); quantitative studies, either randomized controlled trials (n = 18) or cohort studies (n = 24); and ecological, or population-based, studies (n = 41). Heterogeneity of study populations and methodology did not permit formal meta-analysis; thus, a narrative synthesis is presented.
Data Synthesis Education of physicians and restricting access to lethal means were found to prevent suicide. Other methods, including public education, screening programs, and media education, need more testing.
Conclusions Physician education in depression recognition and treatment and restricting access to lethal methods reduce suicide rates. Other interventions need more evidence of efficacy. Ascertaining which components of suicide prevention programs are effective in reducing rates of suicide and suicide attempt is essential in order to optimize use of limited resources.
2,649 citations
••
TL;DR: This paper found that obesity was associated with 111 909 excess deaths (95% confidence interval [CI], 53 754-170 064) and underweight with 33 746 excess deaths.
Abstract: Results Relative to the normal weight category (BMI 18.5 to <25), obesity (BMI ≥30) was associated with 111 909 excess deaths (95% confidence interval [CI], 53 754-170 064) and underweight with 33 746 excess deaths (95% CI, 15 726-51 766). Overweight was not associated with excess mortality (−86 094 deaths; 95% CI, −161 223 to −10 966). The relative risks of mortality associated with obesity were lower in NHANES II and NHANES III than in NHANES I.
Conclusions Underweight and obesity, particularly higher levels of obesity, were associated with increased mortality relative to the normal weight category. The impact of obesity on mortality may have decreased over time, perhaps because of improvements in public health and medical care. These findings are consistent with the increases in life expectancy in the United States and the declining mortality rates from ischemic heart disease.
2,566 citations
••
TL;DR: Substantial evidence supports screening all patients with diabetes to identify those at risk for foot ulceration and recommending certain prophylactic interventions, including patient education, prescription footwear, intensive podiatric care, and evaluation for surgical interventions.
Abstract: Context Among persons diagnosed as having diabetes mellitus, the prevalence of foot ulcers is 4% to 10%, the annual population-based incidence is 1.0% to 4.1%, and the lifetime incidence may be as high as 25%. These ulcers frequently become infected, cause great morbidity, engender considerable financial costs, and are the usual first step to lower extremity amputation.
Objective To systematically review the evidence on the efficacy of methods advocated for preventing diabetic foot ulcers in the primary care setting.
Data Sources, Study Selection, and Data Extraction The EBSCO, MEDLINE, and National Guideline Clearinghouse databases were searched for articles published between January 1980 and April 2004 using database-specific keywords. Bibliographies of retrieved articles were also searched, along with the Cochrane Library and relevant Web sites. We reviewed the retrieved literature for pertinent information, paying particular attention to prospective cohort studies and randomized clinical trials.
Data Synthesis Prevention of diabetic foot ulcers begins with screening for loss of protective sensation, which is best accomplished in the primary care setting with a brief history and the Semmes-Weinstein monofilament. Specialist clinics may quantify neuropathy with biothesiometry, measure plantar foot pressure, and assess lower extremity vascular status with Doppler ultrasound and ankle-brachial blood pressure indices. These measurements, in conjunction with other findings from the history and physical examination, enable clinicians to stratify patients based on risk and to determine the type of intervention. Educating patients about proper foot care and periodic foot examinations are effective interventions to prevent ulceration. Other possibly effective clinical interventions include optimizing glycemic control, smoking cessation, intensive podiatric care, debridement of calluses, and certain types of prophylactic foot surgery. The value of various types of prescription footwear for ulcer prevention is not clear.
Conclusions Substantial evidence supports screening all patients with diabetes to identify those at risk for foot ulceration. These patients might benefit from certain prophylactic interventions, including patient education, prescription footwear, intensive podiatric care, and evaluation for surgical interventions.
2,469 citations
••
TL;DR: It is suggested that adhering to current CPGs in caring for an older person with several comorbidities may have undesirable effects and could create perverse incentives that emphasize the wrong aspects of care for this population and diminish the quality of their care.
Abstract: Context Clinical practice guidelines (CPGs) have been developed to improve the quality of health care for many chronic conditions. Pay-for-performance initiatives assess physician adherence to interventions that may reflect CPG recommendations.
Objective To evaluate the applicability of CPGs to the care of older individuals with several comorbid diseases.
Data Sources The National Health Interview Survey and a nationally representative sample of Medicare beneficiaries (to identify the most prevalent chronic diseases in this population); the National Guideline Clearinghouse (for locating evidence-based CPGs for each chronic disease).
Study Selection Of the 15 most common chronic diseases, we selected hypertension, chronic heart failure, stable angina, atrial fibrillation, hypercholesterolemia, diabetes mellitus, osteoarthritis, chronic obstructive pulmonary disease, and osteoporosis, which are usually managed in primary care, choosing CPGs promulgated by national and international medical organizations for each.
Data Extraction Two investigators independently assessed whether each CPG addressed older patients with multiple comorbid diseases, goals of treatment, interactions between recommendations, burden to patients and caregivers, patient preferences, life expectancy, and quality of life. Differences were resolved by consensus. For a hypothetical 79-year-old woman with chronic obstructive pulmonary disease, type 2 diabetes, osteoporosis, hypertension, and osteoarthritis, we aggregated the recommendations from the relevant CPGs.
Data Synthesis Most CPGs did not modify or discuss the applicability of their recommendations for older patients with multiple comorbidities. Most also did not comment on burden, short- and long-term goals, and the quality of the underlying scientific evidence, nor give guidance for incorporating patient preferences into treatment plans. If the relevant CPGs were followed, the hypothetical patient would be prescribed 12 medications (costing her $406 per month) and a complicated nonpharmacological regimen. Adverse interactions between drugs and diseases could result.
Conclusions This review suggests that adhering to current CPGs in caring for an older person with several comorbidities may have undesirable effects. Basing standards for quality of care and pay for performance on existing CPGs could lead to inappropriate judgment of the care provided to older individuals with complex comorbidities and could create perverse incentives that emphasize the wrong aspects of care for this population and diminish the quality of their care. Developing measures of the quality of the care needed by older patients with complex comorbidities is critical to improving their care.
2,247 citations
••
TL;DR: A leading CPOE system was found to often facilitate medication error risks, many of which were reported to occur frequently; multiple qualitative and survey methods identified and quantified error risks not previously considered.
Abstract: Context Hospital computerized physician order entry (CPOE) systems are widely regarded as the technical solution to medication ordering errors, the largest identified source of preventable hospital medical error. Published studies report that CPOE reduces medication errors by up to 81%. Few researchers, however, have focused on the existence or types of medication errors facilitated by CPOE.
Objective To identify and quantify the role of CPOE in facilitating prescription error risks.
Design, Setting, and Participants We performed a qualitative and quantitative study of house staff interaction with a CPOE system at a tertiary-care teaching hospital (2002-2004). We surveyed house staff (N = 261; 88% of CPOE users); conducted 5 focus groups and 32 intensive one-on-one interviews with house staff, information technology leaders, pharmacy leaders, attending physicians, and nurses; shadowed house staff and nurses; and observed them using CPOE. Participants included house staff, nurses, and hospital leaders.
Main Outcome Measure Examples of medication errors caused or exacerbated by the CPOE system.
Results We found that a widely used CPOE system facilitated 22 types of medication error risks. Examples include fragmented CPOE displays that prevent a coherent view of patients’ medications, pharmacy inventory displays mistaken for dosage guidelines, ignored antibiotic renewal notices placed on paper charts rather than in the CPOE system, separation of functions that facilitates double dosing and incompatible orders, and inflexible ordering formats generating wrong orders. Three quarters of the house staff reported observing each of these error risks, indicating that they occur weekly or more often. Use of multiple qualitative and survey methods identified and quantified error risks not previously considered, offering many opportunities for error reduction.
Conclusions In this study, we found that a leading CPOE system often facilitated medication error risks, with many reported to occur frequently. As CPOE systems are implemented, clinicians and hospitals must attend to errors that these systems cause in addition to errors that they prevent.
2,031 citations
••
TL;DR: Evidence shows that conclusions about nonsocioeconomic causes of racial/ethnic differences in health may depend on which measure (eg, income, wealth, education, occupation, neighborhood socioeconomic characteristics, or past socioeconomic experiences) is used to "control for SES," suggesting that findings from studies that have measured limited aspects of SES should be reassessed.
Abstract: Problems with measuring socioeconomic status (SES)—frequently included in clinical and public health studies as a control variable and less frequently as the variable(s) of main interest—could affect research findings and conclusions, with implications for practice and policy. We critically examine standard SES measurement approaches, illustrating problems with examples from new analyses and the literature. For example, marked racial/ethnic differences in income at a given educational level and in wealth at a given income level raise questions about the socioeconomic comparability of individuals who are similar on education or income alone. Evidence also shows that conclusions about nonsocioeconomic causes of racial/ethnic differences in health may depend on the measure—eg, income, wealth, education, occupation, neighborhood socioeconomic characteristics, or past socioeconomic experiences—used to “control for SES,” suggesting that findings from studies that have measured limited aspects of SES should be reassessed. We recommend an outcome- and social group–specific approach to SES measurement that involves (1) considering plausible explanatory pathways and mechanisms, (2) measuring as much relevant socioeconomic information as possible, (3) specifying the particular socioeconomic factors measured (rather than SES overall), and (4) systematically considering how potentially important unmeasured socioeconomic factors may affect conclusions. Better SES measures are needed in data sources, but improvements could be made by using existing information more thoughtfully and acknowledging its limitations.
1,974 citations
••
TL;DR: Venous thrombosis is a common complication in patients with cancer; the risk is especially high in the first few months after diagnosis and in the presence of distant metastases.
Abstract: Context Venous thrombosis is a common complication in patients with cancer, leading to additional morbidity and compromising quality of life.
Objective To identify individuals with cancer with an increased thrombotic risk, evaluating different tumor sites, the presence of distant metastases, and carrier status of prothrombotic mutations.
Design, Setting, and Patients A large population-based, case-control (Multiple Environmental and Genetic Assessment [MEGA] of risk factors for venous thrombosis) study of 3220 consecutive patients aged 18 to 70 years, with a first deep venous thrombosis of the leg or pulmonary embolism, between March 1, 1999, and May 31, 2002, at 6 anticoagulation clinics in the Netherlands, and 2131 separate control participants (partners of the patients), who reported via a questionnaire on acquired risk factors for venous thrombosis. Three months after discontinuation of the anticoagulant therapy, all patients and controls were interviewed, a blood sample was taken, and DNA was isolated to ascertain the factor V Leiden and prothrombin 20210A mutations.
Main Outcome Measure Risk of venous thrombosis.
Results The overall risk of venous thrombosis was increased 7-fold in patients with a malignancy (odds ratio [OR], 6.7; 95% confidence interval [CI], 5.2-8.6) vs persons without malignancy. Patients with hematological malignancies had the highest risk of venous thrombosis, adjusted for age and sex (adjusted OR, 28.0; 95% CI, 4.0-199.7), followed by lung cancer and gastrointestinal cancer. The risk of venous thrombosis was highest in the first few months after the diagnosis of malignancy (adjusted OR, 53.5; 95% CI, 8.6-334.3). Patients with cancer with distant metastases had a higher risk vs patients without distant metastases (adjusted OR, 19.8; 95% CI, 2.6-149.1). Carriers of the factor V Leiden mutation who also had cancer had a 12-fold increased risk vs individuals without cancer and factor V Leiden (adjusted OR, 12.1; 95% CI, 1.6-88.1). Similar results were indirectly calculated for the prothrombin 20210A mutation in patients with cancer.
Conclusions Patients with cancer have a highly increased risk of venous thrombosis especially in the first few months after diagnosis and in the presence of distant metastases. Carriers of the factor V Leiden and prothrombin 20210A mutations appear to have an even higher risk.
1,673 citations
••
TL;DR: Overall dietary adherence rates were low, although increased adherence was associated with greater weight loss and cardiac risk factor reductions for each diet group, and each popular diet modestly reduced body weight and several cardiac risk factors at 1 year.
Abstract: Context The scarcity of data addressing the health effects of popular diets is an important public health concern, especially since patients and physicians are interested in using popular diets as individualized eating strategies for disease prevention.
Objective To assess adherence rates and the effectiveness of 4 popular diets (Atkins, Zone, Weight Watchers, and Ornish) for weight loss and cardiac risk factor reduction.
Design, Setting, and Participants A single-center randomized trial at an academic medical center in Boston, Mass, of overweight or obese (body mass index: mean, 35; range, 27-42) adults aged 22 to 72 years with known hypertension, dyslipidemia, or fasting hyperglycemia. Participants were enrolled starting July 18, 2000, and randomized to 4 popular diet groups until January 24, 2002.
Intervention A total of 160 participants were randomly assigned to either Atkins (carbohydrate restriction, n=40), Zone (macronutrient balance, n=40), Weight Watchers (calorie restriction, n=40), or Ornish (fat restriction, n=40) diet groups. After 2 months of maximum effort, participants selected their own levels of dietary adherence.
Main Outcome Measures One-year changes in baseline weight and cardiac risk factors, and self-selected dietary adherence rates per self-report.
Results Assuming no change from baseline for participants who discontinued the study, mean (SD) weight loss at 1 year was 2.1 (4.8) kg for Atkins (21 [53%] of 40 participants completed, P = .009), 3.2 (6.0) kg for Zone (26 [65%] of 40 completed, P = .002), 3.0 (4.9) kg for Weight Watchers (26 [65%] of 40 completed, P < .001), and 3.3 (7.3) kg for Ornish (20 [50%] of 40 completed, P = .007). Greater effects were observed in study completers. Each diet significantly reduced the low-density lipoprotein/high-density lipoprotein (HDL) cholesterol ratio by approximately 10% (all P<.05), with no significant effects on blood pressure or glucose at 1 year. Amount of weight loss was associated with self-reported dietary adherence level (r = 0.60; P<.001) but not with diet type (r = 0.07; P = .40). For each diet, decreasing levels of total/HDL cholesterol, C-reactive protein, and insulin were significantly associated with weight loss (mean r = 0.36, 0.37, and 0.39, respectively) with no significant difference between diets (P = .48, P = .57, P = .31, respectively).
Conclusions Each popular diet modestly reduced body weight and several cardiac risk factors at 1 year. Overall dietary adherence rates were low, although increased adherence was associated with greater weight loss and cardiac risk factor reductions for each diet group.
••
TL;DR: Atypical antipsychotic drugs may be associated with a small increased risk for death compared with placebo, and this risk should be considered within the context of medical need for the drugs, efficacy evidence, medical comorbidity, and the efficacy and safety of alternatives.
Abstract: Context Atypical antipsychotic medications are widely used to treat delusions, aggression, and agitation in people with Alzheimer disease and other dementias; however, concerns have arisen about the increased risk for cerebrovascular adverse events, rapid cognitive decline, and mortality with their use.
Objective To assess the evidence for increased mortality from atypical antipsychotic drug treatment for people with dementia.
Data Sources MEDLINE (1966 to April 2005), the Cochrane Controlled Trials Register (2005, Issue 1), meeting presentations (1997-2004), and information from the sponsors were searched using the terms for atypical antipsychotic drugs (aripiprazole, clozapine, olanzapine, quetiapine, risperidone, and ziprasidone), dementia, Alzheimer disease, and clinical trial.
Study Selection Published and unpublished randomized placebo-controlled, parallel-group clinical trials of atypical antipsychotic drugs marketed in the United States to treat patients with Alzheimer disease or dementia were selected by consensus of the authors.
Data Extraction Trials, baseline characteristics, outcomes, all-cause dropouts, and deaths were extracted by one reviewer; treatment exposure was obtained or estimated. Data were checked by a second reviewer.
Data Synthesis Fifteen trials (9 unpublished), generally 10 to 12 weeks in duration, including 16 contrasts of atypical antipsychotic drugs with placebo, met criteria (aripiprazole [n = 3], olanzapine [n = 5], quetiapine [n = 3], risperidone [n = 5]). A total of 3353 patients were randomized to study drug and 1757 were randomized to placebo. Outcomes were assessed using standard methods (with random- or fixed-effects models) to calculate odds ratios (ORs) and risk differences based on patients randomized, and relative risks based on total exposure to treatment. There were no differences in dropouts. Death occurred more often among patients randomized to drugs (118 [3.5%] vs 40 [2.3%]; OR by meta-analysis, 1.54; 95% confidence interval [CI], 1.06-2.23; P = .02; risk difference, 0.01; 95% CI, 0.004-0.02; P = .01). Sensitivity analyses did not show evidence for differential risks for individual drugs, severity, sample selection, or diagnosis.
Conclusions Atypical antipsychotic drugs may be associated with a small increased risk for death compared with placebo. This risk should be considered within the context of medical need for the drugs, efficacy evidence, medical comorbidity, and the efficacy and safety of alternatives. Individual patient analyses modeling survival and causes of death are needed.
••
TL;DR: Oral vitamin D supplementation of 700 to 800 IU/d appears to reduce the risk of hip and any nonvertebral fractures in ambulatory or institutionalized elderly persons, whereas an oral vitamin D dose of 400 IU/d is not sufficient for fracture prevention.
Abstract: ContextThe role and dose of oral vitamin D supplementation in nonvertebral
fracture prevention have not been well established.ObjectiveTo estimate the effectiveness of vitamin D supplementation in preventing
hip and nonvertebral fractures in older persons.Data SourcesA systematic review of English and non-English articles using MEDLINE
and the Cochrane Controlled Trials Register (1960-2005), and EMBASE (1991-2005).
Additional studies were identified by contacting clinical experts and searching
bibliographies and abstracts presented at the American Society for Bone and
Mineral Research (1995-2004). Search terms included randomized
controlled trial (RCT), controlled clinical trial, random allocation,double-blind
method, cholecalciferol,ergocalciferol,25-hydroxyvitamin D, fractures, humans, elderly, falls, and bone
density.Study SelectionOnly double-blind RCTs of oral vitamin D supplementation (cholecalciferol,
ergocalciferol) with or without calcium supplementation vs calcium supplementation
or placebo in older persons (≥60 years) that examined hip or nonvertebral
fractures were included.Data ExtractionIndependent extraction of articles by 2 authors using predefined data
fields, including study quality indicators.Data SynthesisAll pooled analyses were based on random-effects models. Five RCTs for
hip fracture (n = 9294) and 7 RCTs for nonvertebral fracture risk
(n = 9820) met our inclusion criteria. All trials used cholecalciferol.
Heterogeneity among studies for both hip and nonvertebral fracture prevention
was observed; it disappeared after pooling RCTs of low-dose (400 IU/d)
and higher-dose (700-800 IU/d) vitamin D separately. A vitamin D dose of
700 to 800 IU/d reduced the relative risk (RR) of hip fracture by 26% (3 RCTs
with 5572 persons; pooled RR, 0.74; 95% confidence interval [CI], 0.61-0.88)
and any nonvertebral fracture by 23% (5 RCTs with 6098 persons; pooled RR,
0.77; 95% CI, 0.68-0.87) vs calcium or placebo. No significant benefit was
observed for RCTs with 400 IU/d vitamin D (2 RCTs with 3722 persons; pooled
RR for hip fracture, 1.15; 95% CI, 0.88-1.50; and pooled RR for any nonvertebral
fracture, 1.03; 95% CI, 0.86-1.24). Conclusions: Oral vitamin D supplementation between 700 and 800 IU/d appears to reduce
the risk of hip and any nonvertebral fractures in ambulatory or institutionalized
elderly persons. An oral vitamin D dose of 400 IU/d is not sufficient for
fracture prevention.
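The pooled estimates above come from random-effects models. As an illustrative sketch (a standard DerSimonian-Laird estimator, not necessarily the authors' exact implementation), study-level relative risks and their 95% CIs can be pooled as follows; the example inputs are hypothetical, not the trial data:

```python
import math

def pool_random_effects(rrs, cis):
    """Pool study-level relative risks with a DerSimonian-Laird
    random-effects model, recovering each SE from its 95% CI."""
    logs = [math.log(rr) for rr in rrs]
    # SE of log(RR), assuming the 95% CI is symmetric on the log scale
    ses = [(math.log(hi) - math.log(lo)) / (2 * 1.96) for lo, hi in cis]
    w = [1.0 / se ** 2 for se in ses]                  # fixed-effect weights
    fixed = sum(wi * li for wi, li in zip(w, logs)) / sum(w)
    q = sum(wi * (li - fixed) ** 2 for wi, li in zip(w, logs))  # Cochran's Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(rrs) - 1)) / c)          # between-study variance
    w_re = [1.0 / (se ** 2 + tau2) for se in ses]      # random-effects weights
    pooled = sum(wi * li for wi, li in zip(w_re, logs)) / sum(w_re)
    se_p = math.sqrt(1.0 / sum(w_re))
    return (math.exp(pooled),
            math.exp(pooled - 1.96 * se_p),
            math.exp(pooled + 1.96 * se_p))

# Hypothetical inputs (not the trial data): two RCTs with RR 0.70 and 0.85
print(pool_random_effects([0.70, 0.85], [(0.55, 0.89), (0.66, 1.10)]))
```

Recovering the standard error from the CI width is itself an approximation that assumes the interval was computed on the log scale.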
••
TL;DR: The extent of compromised mental health among refugees (including internally displaced persons, asylum seekers, and stateless persons) using a worldwide study sample is meta-analytically established.
Abstract: Context: The global refugee crisis requires that researchers, policymakers, and
clinicians comprehend the magnitude of the psychological consequences of forced
displacement and the factors that moderate them. To date, no empirical synthesis
of research on these issues has been undertaken. Objective: To meta-analytically establish the extent of compromised mental health
among refugees (including internally displaced persons, asylum seekers, and
stateless persons) using a worldwide study sample. Potential moderators of
mental health outcomes were examined, including enduring contextual variables
(eg, postdisplacement accommodation and economic opportunity) and refugee
characteristics. Data Sources: Published studies (1959-2002) were obtained using broad searches of
computerized databases (PsycINFO and PILOTS), manual searches of reference
lists, and interviews with prominent authors. Study Selection: Studies were selected if they investigated a refugee group and at least
1 nonrefugee comparison group and reported 1 or more quantitative group comparisons
on measures of psychopathology. Fifty-six reports met inclusion criteria (4.4%
of identified reports), yielding 59 independent comparisons and including
67 294 participants (22 221 refugees and 45 073 nonrefugees). Data Extraction: Data on study and report characteristics,
and statistical outcomes were extracted using a coding manual and subjected
to blind recoding, which indicated high reliability. Methodological quality
information was coded to assess potential sources of bias. Data Synthesis: Effect size estimates for the refugee-nonrefugee comparisons were averaged
across psychopathology measures within studies and weighted by sample size.
The weighted mean effect size was 0.41 (SD, 0.02; range, −1.36 to 2.91
[SE, 0.01]), indicating that refugees had moderately poorer outcomes. Postdisplacement
conditions moderated mental health outcomes. Worse outcomes were observed
for refugees living in institutional accommodation, experiencing restricted
economic opportunity, displaced internally within their own country, repatriated
to a country they had previously fled, or whose initiating conflict was unresolved.
Refugees who were older, more educated, and female and who had higher predisplacement
socioeconomic status and rural residence also had worse outcomes. Methodological
differences between studies affected effect sizes. Conclusions: The sociopolitical context of the refugee experience is associated with
refugee mental health. Humanitarian efforts that improve these conditions
are likely to have positive impacts.
••
TL;DR: The greatest benefit occurred in women who performed the equivalent of walking 3 to 5 hours per week at an average pace, with little evidence of a correlation between increased benefit and greater energy expenditure.
Abstract: Context: Physical activity has been shown to decrease the incidence of breast
cancer, but the effect on recurrence or survival after a breast cancer diagnosis
is not known. Objective: To determine whether physical activity among women with breast cancer
decreases their risk of death from breast cancer compared with more sedentary
women. Design, Setting, and Participants: Prospective observational study based on responses from 2987 female
registered nurses in the Nurses’ Health Study who were diagnosed with
stage I, II, or III breast cancer between 1984 and 1998 and who were followed
up until death or June 2002, whichever came first. Main Outcome Measure: Breast cancer mortality risk according to physical activity category
(<3, 3-8.9, 9-14.9, 15-23.9, or ≥24 metabolic equivalent task [MET]
hours per week). Results: Compared with women who engaged in less than 3 MET-hours per week of
physical activity, the adjusted relative risk (RR) of death from breast cancer
was 0.80 (95% confidence interval [CI], 0.60-1.06) for 3 to 8.9 MET-hours
per week; 0.50 (95% CI, 0.31-0.82) for 9 to 14.9 MET-hours per week; 0.56
(95% CI, 0.38-0.84) for 15 to 23.9 MET-hours per week; and 0.60 (95% CI, 0.40-0.89)
for 24 or more MET-hours per week (P for trend =
.004). Three MET-hours is equivalent to walking at an average pace of 2 to 2.9
mph for 1 hour. The benefit of physical activity was particularly apparent
among women with hormone-responsive tumors. The RR of breast cancer death
for women with hormone-responsive tumors who engaged in 9 or more MET-hours
per week of activity compared with women with hormone-responsive tumors who
engaged in less than 9 MET-hours per week was 0.50 (95% CI, 0.34-0.74). Compared
with women who engaged in less than 3 MET-hours per week of activity, the
absolute unadjusted mortality risk reduction was 6% at 10 years for women
who engaged in 9 or more MET-hours per week. Conclusions: Physical activity after a breast cancer diagnosis may reduce the risk
of death from this disease. The greatest benefit occurred in women who performed
the equivalent of walking 3 to 5 hours per week at an average pace, with little
evidence of a correlation between increased benefit and greater energy expenditure.
Women with breast cancer who follow US physical activity recommendations may
improve their survival.
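The exposure categories above are defined in MET-hours per week (intensity in METs times weekly hours of activity). A small sketch of that arithmetic, using the abstract's note that 3 MET-hours equals roughly 1 hour of walking at an average pace (the ~3-MET walking intensity is an assumption consistent with that note):

```python
def met_hours_per_week(met, hours_per_week):
    """Weekly activity dose: intensity (MET) x weekly duration (hours)."""
    return met * hours_per_week

def activity_category(met_h):
    """Map weekly MET-hours onto the study's five exposure categories."""
    for upper, label in [(3, "<3"), (9, "3-8.9"), (15, "9-14.9"), (24, "15-23.9")]:
        if met_h < upper:
            return label
    return ">=24"

# Walking at an average pace (~3 MET, per the abstract's "3 MET-hours
# = 1 hour of walking") for 4 hours a week:
dose = met_hours_per_week(3, 4)      # 12 MET-hours
group = activity_category(dose)      # "9-14.9", the category with adjusted RR 0.50
```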
••
TL;DR: The results suggest that ADHF patients at low, intermediate, and high risk for in-hospital mortality can be easily identified using vital sign and laboratory data obtained on hospital admission and provides clinicians with a validated, practical bedside tool for mortality risk stratification.
Abstract: Context: Estimation of mortality risk in patients hospitalized with acute decompensated
heart failure (ADHF) may help clinicians guide care. Objective: To develop a practical, user-friendly bedside tool for risk stratification
for patients hospitalized with ADHF. Design, Setting, and Patients: The Acute Decompensated Heart Failure National Registry (ADHERE) of
patients hospitalized with a primary diagnosis of ADHF in 263 hospitals in
the United States was queried with analysis of patient data to develop a risk
stratification model. The first 33 046 hospitalizations (derivation cohort;
October 2001-February 2003) were analyzed to develop the model and then the
validity of the model was prospectively tested using data from 32 229 subsequent
hospitalizations (validation cohort; March-July 2003). Patients had a mean
age of 72.5 years and 52% were female. Main Outcome Measure: Variables predicting mortality in ADHF. Results: When the derivation and validation cohorts are combined, 37 772 (58%)
of 65 275 patient-records had coronary artery disease. Of a combined cohort
consisting of 52 164 patient-records, 23 910 (46%) had preserved left ventricular
systolic function. In-hospital mortality was similar in the derivation (4.2%)
and validation (4.0%) cohorts. Recursive partitioning of the derivation cohort
for 39 variables indicated that the best single predictor for mortality was
high admission levels of blood urea nitrogen (≥43 mg/dL [15.35 mmol/L])
followed by low admission systolic blood pressure (<115 mm Hg) and then
by high levels of serum creatinine (≥2.75 mg/dL [243.1 μmol/L]). A simple
risk tree identified patient groups with mortality ranging from 2.1% to 21.9%.
The odds ratio for mortality between patients identified as high and low risk
was 12.9 (95% confidence interval, 10.4-15.9) and similar results were seen
when this risk stratification was applied prospectively to the validation
cohort. Conclusions: These results suggest that ADHF patients at low, intermediate, and high
risk for in-hospital mortality can be easily identified using vital sign and
laboratory data obtained on hospital admission. The ADHERE risk tree provides
clinicians with a validated, practical bedside tool for mortality risk stratification.
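The recursive-partitioning result lends itself to a simple decision function. Below is a sketch of the tree's top splits using only the thresholds stated above (BUN ≥43 mg/dL, SBP <115 mm Hg, creatinine ≥2.75 mg/dL); collapsing the intermediate branches into a single "intermediate" label is a simplifying assumption, since the published tree has more terminal nodes (mortality 2.1% to 21.9%):

```python
def adhere_risk_group(bun_mg_dl, sbp_mm_hg, creatinine_mg_dl):
    """Sketch of the ADHERE admission risk tree's top splits.
    Thresholds come from the abstract; the grouping of intermediate
    branches is a simplification, not the full published tree."""
    if bun_mg_dl < 43:                      # best single split: admission BUN
        return "low" if sbp_mm_hg >= 115 else "intermediate"
    if sbp_mm_hg >= 115:                    # high BUN but preserved SBP
        return "intermediate"
    # High BUN plus low SBP; creatinine >=2.75 mg/dL marks the highest risk
    return "high" if creatinine_mg_dl >= 2.75 else "intermediate"
```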
••
TL;DR: In this article, the authors showed that intensive lowering of LDL-C did not result in a significant reduction in the primary outcome of major coronary events, but did reduce the risk of other composite secondary end points and nonfatal acute MI.
Abstract: cardiovascular death occurred in 156 (3.5%) and 143 (3.2%) in the 2 groups (HR, 0.92; 95% CI, 0.73-1.15; P=.47). Death from any cause occurred in 374 (8.4%) in the simvastatin group and 366 (8.2%) in the atorvastatin group (HR, 0.98; 95% CI, 0.85-1.13; P=.81). Patients in the atorvastatin group had higher rates of drug discontinuation due to nonserious adverse events; transaminase elevation resulted in 43 (1.0%) vs 5 (0.1%) withdrawals (P<.001). Serious myopathy and rhabdomyolysis were rare in both groups. Conclusions: In this study of patients with previous MI, intensive lowering of LDL-C did not result in a significant reduction in the primary outcome of major coronary events, but did reduce the risk of other composite secondary end points and nonfatal acute MI. There were no differences in cardiovascular or all-cause mortality. Patients with MI may benefit from intensive lowering of LDL-C without an increase in noncardiovascular mortality or other serious adverse reactions. Trial Registration: ClinicalTrials.gov Identifier: NCT00159835.
••
TL;DR: In this study of CPR during out-of-hospital cardiac arrest, chest compressions were not delivered half of the time, and most compressions were too shallow.
Abstract: Context: Cardiopulmonary resuscitation (CPR) guidelines recommend target values
for compressions, ventilations, and CPR-free intervals allowed for rhythm
analysis and defibrillation. There is little information on adherence to these
guidelines during advanced cardiac life support in the field. Objective: To measure the quality of out-of-hospital CPR performed by ambulance
personnel, as measured by adherence to CPR guidelines. Design and Setting: Case series of 176 adult patients with out-of-hospital cardiac arrest
treated by paramedics and nurse anesthetists in Stockholm, Sweden, London,
England, and Akershus, Norway, between March 2002 and October 2003. The defibrillators
recorded chest compressions via a sternal pad fitted with an accelerometer
and ventilations by changes in thoracic impedance between the defibrillator
pads, in addition to standard event and electrocardiographic recordings. Main Outcome Measure: Adherence to international guidelines for CPR. Results: Chest compressions were not given 48% (95% CI, 45%-51%) of the time
without spontaneous circulation; this percentage was 38% (95% CI, 36%-41%)
when subtracting the time necessary for electrocardiographic analysis and
defibrillation. Combining these data with a mean compression rate of 121/min
(95% CI, 118-124/min) when compressions were given resulted in a mean compression
rate of 64/min (95% CI, 61-67/min). Mean compression depth was 34 mm (95%
CI, 33-35 mm), 28% (95% CI, 24%-32%) of the compressions had a depth of 38
mm to 51 mm (guidelines recommendation), and the compression part of the duty
cycle was 42% (95% CI, 41%-42%). A mean of 11 (95% CI, 11-12) ventilations
were given per minute. Sixty-one patients (35%) had return of spontaneous
circulation, and 5 of 6 patients discharged alive from the hospital had normal
neurological outcomes. Conclusions: In this study of CPR during out-of-hospital cardiac arrest, chest compressions
were not delivered half of the time, and most compressions were too shallow.
Electrocardiographic analysis and defibrillation accounted for only small
parts of intervals without chest compressions.
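The reported "effective" compression rate combines the rate while compressing with the fraction of arrest time compressions were actually given. A rough sketch of that combination (the published 64/min comes from per-patient averaging, so a simple product is only an approximation):

```python
def effective_compression_rate(rate_while_compressing, fraction_without_compressions):
    """Approximate overall compressions per minute of arrest time:
    the rate while compressing, scaled by the fraction of time
    compressions were actually being given."""
    return rate_while_compressing * (1 - fraction_without_compressions)

# 121/min while compressing, with no compressions 48% of the time:
rate = effective_compression_rate(121, 0.48)  # ~63/min, near the reported mean of 64/min
```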
••
TL;DR: A growing body of evidence supports the existence of a novel mechanism of human disease, namely, hemolysis-associated smooth muscle dystonia, vasculopathy, and endothelial dysfunction.
Abstract: Context: The efficient sequestration of hemoglobin by the red blood cell membrane
and the presence of multiple hemoglobin clearance mechanisms suggest a critical
need to prevent the buildup of this molecule in the plasma. A growing list
of clinical manifestations attributed to hemoglobin release in a variety of
acquired and iatrogenic hemolytic disorders suggests that hemolysis and hemoglobinemia
should be considered as a novel mechanism of human disease. Evidence Acquisition: Pertinent scientific literature databases and references were searched
through October 2004 using terms that encompassed various aspects of hemolysis,
hemoglobin preparations, clinical symptoms associated with plasma hemoglobin,
nitric oxide in hemolysis, anemia, pulmonary hypertension, paroxysmal nocturnal
hemoglobinuria, and sickle-cell disease. Evidence Synthesis: Hemoglobin is released into the plasma from the erythrocyte during intravascular
hemolysis in hereditary, acquired, and iatrogenic hemolytic conditions. When
the capacity of protective hemoglobin-scavenging mechanisms has been saturated,
levels of cell-free hemoglobin increase in the plasma, resulting in the consumption
of nitric oxide and clinical sequelae. Nitric oxide plays a major role in
vascular homeostasis and has been shown to be a critical regulator of basal
and stress-mediated smooth muscle relaxation and vasomotor tone, endothelial
adhesion molecule expression, and platelet activation and aggregation. Thus,
clinical consequences of excessive cell-free plasma hemoglobin levels during
intravascular hemolysis or the administration of hemoglobin preparations include
dystonias involving the gastrointestinal, cardiovascular, pulmonary, and urogenital
systems, as well as clotting disorders. Many of the clinical sequelae of intravascular
hemolysis in a prototypic hemolytic disease, paroxysmal nocturnal hemoglobinuria,
are readily explained by hemoglobin-mediated nitric oxide scavenging. Conclusion: A growing body of evidence supports the existence of a novel mechanism
of human disease, namely, hemolysis-associated smooth muscle dystonia, vasculopathy,
and endothelial dysfunction.
••
TL;DR: Improvement of the magnitude envisioned by the IOM requires a national commitment to strict, ambitious, quantitative, and well-tracked national goals.
Abstract: Five years ago, the Institute of Medicine (IOM) called for a national
effort to make health care safe. Although progress since then has been slow,
the IOM report truly “changed the conversation” to a focus on
changing systems, stimulated a broad array of stakeholders to engage in patient
safety, and motivated hospitals to adopt new safe practices. The pace of change
is likely to accelerate, particularly in implementation of electronic health
records, diffusion of safe practices, team training, and full disclosure to
patients following injury. If directed toward hospitals that actually achieve
high levels of safety, pay for performance could provide additional incentives.
But improvement of the magnitude envisioned by the IOM requires a national
commitment to strict, ambitious, quantitative, and well-tracked national goals.
The Agency for Healthcare Research and Quality should bring together all stakeholders,
including payers, to agree on a set of explicit and ambitious goals for patient
safety to be reached by 2010.
••
TL;DR: Contradiction and initially stronger effects are not unusual in highly cited research of clinical interventions and their outcomes, but the extent to which high citations may provoke contradictions and vice versa needs more study.
Abstract: Context: Controversy and uncertainty ensue when the results of clinical research on the effectiveness of interventions are subsequently contradicted. Controversies are most prominent when high-impact research is involved. Objectives: To understand how frequently highly cited studies are contradicted or find effects that are stronger than in other similar studies, and to discern whether specific characteristics are associated with such refutation over time. Design: All original clinical research studies published in 3 major general clinical journals or high-impact-factor specialty journals in 1990-2003 and cited more than 1000 times in the literature were examined. Main Outcome Measure: The results of highly cited articles were compared against subsequent studies of comparable or larger sample size and similar or better controlled designs. The same analysis was also performed comparatively for matched studies that were not so highly cited. Results: Of 49 highly cited original clinical research studies, 45 claimed that the intervention was effective. Of these, 7 (16%) were contradicted by subsequent studies, 7 others (16%) had found effects that were stronger than those of subsequent studies, 20 (44%) were replicated, and 11 (24%) remained largely unchallenged. Five of 6 highly cited nonrandomized studies had been contradicted or had found stronger effects vs 9 of 39 randomized controlled trials (P=.008). Among randomized trials, studies with contradicted or stronger effects were smaller (P=.009) than replicated or unchallenged studies, although there was no statistically significant difference in their early or overall citation impact. Matched control studies did not have a significantly different share of refuted results than highly cited studies, but they included more studies with "negative" results. Conclusions: Contradiction and initially stronger effects are not unusual in highly cited research of clinical interventions and their outcomes.
The extent to which high citations may provoke contradictions and vice versa needs more study. Controversies are most common with highly cited nonrandomized studies, but even the most highly cited randomized trials may be challenged and refuted over time, especially small ones.
••
University of Florida, Duke University, Harvard University, University of Cincinnati, University of Michigan, Johns Hopkins University, University of North Carolina at Chapel Hill, University of California, Los Angeles, University of Texas Southwestern Medical Center, University of Minnesota, Mayo Clinic, Ohio State University, Northwestern University, University of Calgary, University of Kentucky, Cleveland Clinic, Vanderbilt University, University of Alabama, Case Western Reserve University, University of Washington, University of California, San Francisco, University of Wisconsin-Madison
TL;DR: Therapy to reduce volume overload during hospitalization for heart failure led to marked improvement in signs and symptoms of elevated filling pressures with or without the PAC; the trend toward greater quality-of-life improvement with the PAC reached significance for the time trade-off at all time points after randomization.
Abstract: Context Pulmonary artery catheters (PACs) have been used to guide therapy in multiple settings, but recent studies have raised concerns that PACs may lead to increased mortality in hospitalized patients. Objective To determine whether PAC use is safe and improves clinical outcomes in patients hospitalized with severe symptomatic and recurrent heart failure. Design, setting, and participants The Evaluation Study of Congestive Heart Failure and Pulmonary Artery Catheterization Effectiveness (ESCAPE) was a randomized controlled trial of 433 patients at 26 sites conducted from January 18, 2000, to November 17, 2003. Patients were assigned to receive therapy guided by clinical assessment and a PAC or clinical assessment alone. The target in both groups was resolution of clinical congestion, with additional PAC targets of a pulmonary capillary wedge pressure of 15 mm Hg and a right atrial pressure of 8 mm Hg. Medications were not specified, but inotrope use was explicitly discouraged. Main outcome measures The primary end point was days alive out of the hospital during the first 6 months, with secondary end points of exercise, quality of life, biochemical, and echocardiographic changes. Results Severity of illness was reflected by the following values: average left ventricular ejection fraction, 19%; systolic blood pressure, 106 mm Hg; sodium level, 137 mEq/L; urea nitrogen, 35 mg/dL (12.40 mmol/L); and creatinine, 1.5 mg/dL (132.6 micromol/L). Therapy in both groups led to substantial reduction in symptoms, jugular venous pressure, and edema. Use of the PAC did not significantly affect the primary end point of days alive and out of the hospital during the first 6 months (133 days vs 135 days; hazard ratio [HR], 1.00 [95% confidence interval {CI}, 0.82-1.21]; P = .99), mortality (43 patients [10%] vs 38 patients [9%]; odds ratio [OR], 1.26 [95% CI, 0.78-2.03]; P = .35), or the number of days hospitalized (8.7 vs 8.3; HR, 1.04 [95% CI, 0.86-1.27]; P = .67). 
In-hospital adverse events were more common among patients in the PAC group (47 [21.9%] vs 25 [11.5%]; P = .04). There were no deaths related to PAC use, and no difference for in-hospital plus 30-day mortality (10 [4.7%] vs 11 [5.0%]; OR, 0.97 [95% CI, 0.38-2.22]; P = .97). Exercise and quality of life end points improved in both groups with a trend toward greater improvement with the PAC, which reached significance for the time trade-off at all time points after randomization. Conclusions Therapy to reduce volume overload during hospitalization for heart failure led to marked improvement in signs and symptoms of elevated filling pressures with or without the PAC. Addition of the PAC to careful clinical assessment increased anticipated adverse events, but did not affect overall mortality and hospitalization. Future trials should test noninvasive assessments with specific treatment strategies that could be used to better tailor therapy for both survival time and survival quality as valued by patients.
••
TL;DR: Men with clinically localized prostate cancer have a lower risk of biochemical failure if they receive high-dose rather than conventional-dose conformal radiation, and this advantage was achieved without any associated increase in RTOG grade 3 acute or late urinary or rectal morbidity.
Abstract: Context: Clinically localized prostate cancer is very prevalent among US men,
but recurrence after treatment with conventional radiation therapy is common. Objective: To evaluate the hypothesis that increasing the radiation dose delivered
to men with clinically localized prostate cancer improves disease outcome. Design, Setting, and Patients: Randomized controlled trial of 393 patients with stage T1b through T2b
prostate cancer and prostate-specific antigen (PSA) levels less than 15 ng/mL
randomized between January 1996 and December 1999 and treated at 2 US academic
institutions. Median age was 67 years and median PSA level was 6.3 ng/mL.
Median follow-up was 5.5 (range, 1.2-8.2) years. Intervention: Patients were randomized to receive external beam radiation to a total
dose of either 70.2 Gy (conventional dose) or 79.2 Gy (high dose). This was
delivered using a combination of conformal photon and proton beams. Main Outcome Measure: Increasing PSA level (ie, biochemical failure) 5 years after treatment. Results: The proportions of men free from biochemical failure at 5 years were
61.4% (95% confidence interval, 54.6%-68.3%) for conventional-dose and 80.4%
(95% confidence interval, 74.7%-86.1%) for high-dose therapy (P<.001), a 49% reduction in the risk of failure. The advantage to
high-dose therapy was observed in both the low-risk and the higher-risk subgroups
(risk reduction, 51% [P<.001] and 44% [P = .03], respectively). There has been no significant difference
in overall survival rates between the treatment groups. Only 1% of patients
receiving conventional-dose and 2% receiving high-dose radiation experienced
acute urinary or rectal morbidity of Radiation Therapy Oncology Group (RTOG)
grade 3 or greater. So far, only 2% and 1%, respectively, have experienced
late morbidity of RTOG grade 3 or greater. Conclusions: Men with clinically localized prostate cancer have a lower risk of biochemical
failure if they receive high-dose rather than conventional-dose conformal
radiation. This advantage was achieved without any associated increase in
RTOG grade 3 acute or late urinary or rectal morbidity.
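The "49% reduction in the risk of failure" follows directly from the two event-free proportions. A minimal check of that arithmetic:

```python
def relative_risk_reduction(control_event_free_pct, treated_event_free_pct):
    """Relative reduction (%) in failure risk, computed from the
    complements of the two event-free percentages."""
    control_fail = 100 - control_event_free_pct
    treated_fail = 100 - treated_event_free_pct
    return 100 * (1 - treated_fail / control_fail)

# 61.4% vs 80.4% free from biochemical failure at 5 years:
# failure drops from 38.6% to 19.6%, about a 49% relative reduction
rrr = relative_risk_reduction(61.4, 80.4)
```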
••
TL;DR: In this article, a large-scale meta-analysis was conducted to assess the relationships of fibrinogen levels with risk of major vascular and nonvascular outcomes based on individual participant data.
Abstract: CONTEXT: Plasma fibrinogen levels may be associated with the risk of coronary heart disease (CHD) and stroke. OBJECTIVE: To assess the relationships of fibrinogen levels with risk of major vascular and with risk of nonvascular outcomes based on individual participant data. DATA SOURCES: Relevant studies were identified by computer-assisted searches, hand searches of reference lists, and personal communication with relevant investigators. STUDY SELECTION: All identified prospective studies were included with information available on baseline fibrinogen levels and details of subsequent major vascular morbidity and/or cause-specific mortality during at least 1 year of follow-up. Studies were excluded if they recruited participants on the basis of having had a previous history of cardiovascular disease; participants with known preexisting CHD or stroke were excluded. DATA EXTRACTION: Individual records were provided on each of 154,211 participants in 31 prospective studies. During 1.38 million person-years of follow-up, there were 6944 first nonfatal myocardial infarctions or stroke events and 13,210 deaths. Cause-specific mortality was generally available. Analyses involved proportional hazards modeling with adjustment for confounding by known cardiovascular risk factors and for regression dilution bias. DATA SYNTHESIS: Within each age group considered (40-59, 60-69, and > or =70 years), there was an approximately log-linear association with usual fibrinogen level for the risk of any CHD, any stroke, other vascular (eg, non-CHD, nonstroke) mortality, and nonvascular mortality. There was no evidence of a threshold within the range of usual fibrinogen level studied at any age. The age- and sex- adjusted hazard ratio per 1-g/L increase in usual fibrinogen level for CHD was 2.42 (95% confidence interval [CI], 2.24-2.60); stroke, 2.06 (95% CI, 1.83-2.33); other vascular mortality, 2.76 (95% CI, 2.28-3.35); and nonvascular mortality, 2.03 (95% CI, 1.90-2.18). 
The hazard ratios for CHD and stroke were reduced to about 1.8 after further adjustment for measured values of several established vascular risk factors. In a subset of 7011 participants with available C-reactive protein values, the findings for CHD were essentially unchanged following additional adjustment for C-reactive protein. The associations of fibrinogen level with CHD or stroke did not differ substantially according to sex, smoking, blood pressure, blood lipid levels, or several features of study design. CONCLUSIONS: In this large individual participant meta-analysis, moderately strong associations were found between usual plasma fibrinogen level and the risks of CHD, stroke, other vascular mortality, and nonvascular mortality in a wide range of circumstances in healthy middle-aged adults. Assessment of any causal relevance of elevated fibrinogen levels to disease requires additional research.
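Because the association is approximately log-linear, the hazard ratio per 1-g/L increase can be rescaled to any fibrinogen difference by exponentiation. A sketch of that rescaling (the 0.5-g/L delta is illustrative, not a value from the study):

```python
def hazard_ratio_for_delta(hr_per_1g_per_l, delta_g_per_l):
    """Under a log-linear association, the HR for a difference of
    delta g/L in usual fibrinogen level is HR_per_1g/L ** delta."""
    return hr_per_1g_per_l ** delta_g_per_l

# CHD HR 2.42 per 1 g/L; for a 0.5-g/L higher usual level (illustrative):
hr_half = hazard_ratio_for_delta(2.42, 0.5)  # about 1.56
```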
••
TL;DR: The annual mortality rate from prostate cancer appears to remain stable after 15 years from diagnosis, which does not support aggressive treatment for localized low-grade prostate cancer.
Abstract: Context: The appropriate therapy for men with clinically localized prostate cancer is uncertain. A recent study suggested an increasing prostate cancer mortality rate for men who are alive more than 15 years following diagnosis. Objective: To estimate 20-year survival based on a competing risk analysis of men who were diagnosed with clinically localized prostate cancer and treated with observation or androgen withdrawal therapy alone, stratified by age at diagnosis and histological findings. Design, Setting, and Patients: A retrospective population-based cohort study using Connecticut Tumor Registry data supplemented by hospital record and histology review of 767 men aged 55 to 74 years with clinically localized prostate cancer diagnosed between January 1, 1971, and December 31, 1984. Patients were treated with either observation or immediate or delayed androgen withdrawal therapy, with a median observation of 24 years. Main Outcome Measures: Probability of mortality from prostate cancer or other competing medical conditions, given a patient’s age at diagnosis and tumor grade. Results: The prostate cancer mortality rate was 33 per 1000 person-years during the first 15 years of follow-up (95% confidence interval [CI], 28-38) and 18 per 1000 person-years after 15 years of follow-up (95% CI, 10-29). The mortality rates for these 2 follow-up periods were not statistically different, after adjusting for differences in tumor histology (rate ratio, 1.1; 95% CI, 0.6-1.9). Men with low-grade prostate cancers have a minimal risk of dying from prostate cancer during 20 years of follow-up (Gleason score of 2-4, 6 deaths per 1000 person-years; 95% CI, 2-11). Men with high-grade prostate cancers have a high probability of dying from prostate cancer within 10 years of diagnosis (Gleason score of 8-10, 121 deaths per 1000 person-years; 95% CI, 90-156). Men with Gleason score of 5 or 6 tumors have an intermediate risk of prostate cancer death.
Conclusion: The annual mortality rate from prostate cancer appears to remain stable after 15 years from diagnosis, which does not support aggressive treatment for localized low-grade prostate cancer.
••
TL;DR: In the setting of a healthful diet, partial substitution of carbohydrate with either protein or monounsaturated fat can further lower blood pressure, improve lipid levels, and reduce estimated cardiovascular risk.
Abstract: Context Reduced intake of saturated fat is widely recommended for prevention of cardiovascular disease. The type of macronutrient that should replace saturated fat remains uncertain. Objective To compare the effects of 3 healthful diets, each with reduced saturated fat intake, on blood pressure and serum lipids. Design, Setting, and Participants Randomized, 3-period, crossover feeding study (April 2003 to June 2005) conducted in Baltimore, Md, and Boston, Mass. Participants were 164 adults with prehypertension or stage 1 hypertension. Each feeding period lasted 6 weeks, and body weight was kept constant. Interventions A diet rich in carbohydrates; a diet rich in protein, about half from plant sources; and a diet rich in unsaturated fat, predominantly monounsaturated fat. Main Outcome Measures Systolic blood pressure and low-density lipoprotein cholesterol. Results Blood pressure, low-density lipoprotein cholesterol, and estimated coronary heart disease risk were lower on each diet compared with baseline. Compared with the carbohydrate diet, the protein diet further decreased mean systolic blood pressure by 1.4 mm Hg (P = .002), and by 3.5 mm Hg (P = .006) among those with hypertension, and decreased low-density lipoprotein cholesterol by 3.3 mg/dL (0.09 mmol/L; P = .01), high-density lipoprotein cholesterol by 1.3 mg/dL (0.03 mmol/L; P = .02), and triglycerides by 15.7 mg/dL (0.18 mmol/L; P<.001). Compared with the carbohydrate diet, the unsaturated fat diet decreased systolic blood pressure by 1.3 mm Hg (P = .005), and by 2.9 mm Hg among those with hypertension (P = .02), had no significant effect on low-density lipoprotein cholesterol, increased high-density lipoprotein cholesterol by 1.1 mg/dL (0.03 mmol/L; P = .03), and lowered triglycerides by 9.6 mg/dL (0.11 mmol/L; P = .02). Compared with the carbohydrate diet, estimated 10-year coronary heart disease risk was lower and similar on the protein and unsaturated fat diets. Conclusion In the setting of a healthful diet, partial substitution of carbohydrate with either protein or monounsaturated fat can further lower blood pressure, improve lipid levels, and reduce estimated cardiovascular risk. Clinical Trials Registration ClinicalTrials.gov Identifier: NCT00051350.
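The abstract reports each lipid change in both mg/dL and mmol/L. As a quick check of those paired figures, a minimal sketch using the standard conversion factors (cholesterol: × 0.02586; triglycerides: × 0.01129):

```python
# Convert the reported lipid changes from mg/dL to mmol/L.
# Standard conversion factors: cholesterol 0.02586, triglycerides 0.01129.
CHOL = 0.02586   # mmol/L per mg/dL, for LDL and HDL cholesterol
TRIG = 0.01129   # mmol/L per mg/dL, for triglycerides

changes_mg_dl = {"LDL": 3.3, "HDL": 1.3, "triglycerides": 15.7}

for lipid, mg in changes_mg_dl.items():
    factor = TRIG if lipid == "triglycerides" else CHOL
    print(f"{lipid}: {mg} mg/dL = {round(mg * factor, 2)} mmol/L")
# LDL 3.3 mg/dL -> 0.09 mmol/L, HDL 1.3 -> 0.03, triglycerides 15.7 -> 0.18,
# matching the SI values quoted in the abstract.
```

Rounded to two decimals, these reproduce the parenthetical SI values in the abstract.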
••
TL;DR: In patients with vascular disease or diabetes mellitus, long-term vitamin E supplementation does not prevent cancer or major cardiovascular events and may increase the risk for heart failure.
Abstract: Context Experimental and epidemiological data suggest that vitamin E supplementation may prevent cancer and cardiovascular events. Clinical trials have generally failed to confirm benefits, possibly due to their relatively short duration. Objective To evaluate whether long-term supplementation with vitamin E decreases the risk of cancer, cancer death, and major cardiovascular events. Design, setting, and patients A randomized, double-blind, placebo-controlled international trial (the initial Heart Outcomes Prevention Evaluation [HOPE] trial conducted between December 21, 1993, and April 15, 1999) of patients at least 55 years old with vascular disease or diabetes mellitus was extended (HOPE-The Ongoing Outcomes [HOPE-TOO]) between April 16, 1999, and May 26, 2003. Of the initial 267 HOPE centers that had enrolled 9541 patients, 174 centers participated in the HOPE-TOO trial. Of 7030 patients enrolled at these centers, 916 were deceased at the beginning of the extension, 1382 refused participation, 3994 continued to take the study intervention, and 738 agreed to passive follow-up. Median duration of follow-up was 7.0 years. Intervention Daily dose of natural source vitamin E (400 IU) or matching placebo. Main outcome measures Primary outcomes included cancer incidence, cancer deaths, and major cardiovascular events (myocardial infarction, stroke, and cardiovascular death). Secondary outcomes included heart failure, unstable angina, and revascularizations. Results Among all HOPE patients, there were no significant differences in the primary analysis: for cancer incidence, there were 552 patients (11.6%) in the vitamin E group vs 586 (12.3%) in the placebo group (relative risk [RR], 0.94; 95% confidence interval [CI], 0.84-1.06; P = .30); for cancer deaths, 156 (3.3%) vs 178 (3.7%), respectively (RR, 0.88; 95% CI, 0.71-1.09; P = .24); and for major cardiovascular events, 1022 (21.5%) vs 985 (20.6%), respectively (RR, 1.04; 95% CI, 0.96-1.14; P = .34). 
Patients in the vitamin E group had a higher risk of heart failure (RR, 1.13; 95% CI, 1.01-1.26; P = .03) and hospitalization for heart failure (RR, 1.21; 95% CI, 1.00-1.47; P = .045). Similarly, among patients enrolled at the centers participating in the HOPE-TOO trial, there were no differences in cancer incidence, cancer deaths, and major cardiovascular events, but higher rates of heart failure and hospitalizations for heart failure. Conclusion In patients with vascular disease or diabetes mellitus, long-term vitamin E supplementation does not prevent cancer or major cardiovascular events and may increase the risk for heart failure.
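The effect estimates above are relative risks with Wald-style 95% confidence intervals. As a sketch of how a crude relative risk and its CI come out of raw event counts (the counts below are hypothetical for illustration; the abstract gives only event counts and percentages, and the trial's published figures are model-based):

```python
import math

def relative_risk(a, n1, b, n2):
    """Crude relative risk with a Wald 95% CI on the log scale.
    a events among n1 in the intervention group; b among n2 in the control group."""
    rr = (a / n1) / (b / n2)
    # Standard error of log(RR) for independent binomial counts.
    se = math.sqrt(1/a - 1/n1 + 1/b - 1/n2)
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, lo, hi

# Hypothetical counts, for illustration only (not from the trial).
rr, lo, hi = relative_risk(90, 1000, 100, 1000)
print(f"RR {rr:.2f} (95% CI, {lo:.2f}-{hi:.2f})")
# -> RR 0.90 (95% CI, 0.69-1.18): a CI spanning 1.0, i.e., nonsignificant,
# like most of the comparisons reported above.
```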
••
Duke University, University of Barcelona, Drexel University, Tel Aviv University, Alfred Hospital, University of Michigan, University of Zagreb, Medical University of South Carolina, Federal University of Rio de Janeiro, Autonomous University of Barcelona, Geelong Hospital, University of Amsterdam, Wayne State University, University of California, Los Angeles
TL;DR: Characteristics of patients with S aureus IE vary significantly by region, and further studies are required to determine the causes of regional variation.
Abstract: Context The global significance of infective endocarditis (IE) caused by Staphylococcus aureus is unknown. Objectives To document the international emergence of health care–associated S aureus IE and methicillin-resistant S aureus (MRSA) IE and to evaluate regional variation in patients with S aureus IE. Design, Setting, and Participants Prospective observational cohort study set in 39 medical centers in 16 countries. Participants were a population of 1779 patients with definite IE as defined by Duke criteria who were enrolled in the International Collaboration on Endocarditis-Prospective Cohort Study from June 2000 to December 2003. Main Outcome Measure In-hospital mortality. Results S aureus was the most common pathogen among the 1779 cases of definite IE in the International Collaboration on Endocarditis-Prospective Cohort Study (558 patients, 31.4%). Health care–associated infection was the most common form of S aureus IE (218 patients, 39.1%), accounting for 25.9% (Australia/New Zealand) to 54.2% (Brazil) of cases. Most patients with health care–associated S aureus IE (131 patients, 60.1%) acquired the infection outside of the hospital. MRSA IE was more common in the United States (37.2%) and Brazil (37.5%) than in Europe/Middle East (23.7%) and Australia/New Zealand (15.5%; P<.001). Persistent bacteremia was independently associated with MRSA IE (odds ratio, 6.2; 95% confidence interval, 2.9-13.2). Patients in the United States were most likely to be hemodialysis dependent, to have diabetes, to have a presumed intravascular device source, to receive vancomycin, to be infected with MRSA, and to have persistent bacteremia (P<.001 for all comparisons). Conclusions S aureus is the leading cause of IE in many regions of the world. Characteristics of patients with S aureus IE vary significantly by region. Further studies are required to determine the causes of regional variation.
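The persistent-bacteremia finding above is reported as an odds ratio with a 95% confidence interval. A minimal sketch of how a crude odds ratio and Wald interval are computed from a 2×2 table (the counts below are hypothetical, not the study's; the study's estimate is adjusted):

```python
import math

def odds_ratio(a, b, c, d):
    """Crude odds ratio for a 2x2 table with a Wald 95% CI on the log scale.
    a/b: outcome present/absent in the exposed group; c/d: in the unexposed group."""
    or_ = (a * d) / (b * c)
    # Standard error of log(OR): sqrt of summed reciprocal cell counts.
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Hypothetical cell counts, for illustration only (not from the study).
or_, lo, hi = odds_ratio(30, 70, 10, 90)
print(f"OR {or_:.1f} (95% CI, {lo:.1f}-{hi:.1f})")
```

A CI that excludes 1.0, as in the study's 2.9-13.2, is what makes the association statistically significant.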
••
TL;DR: The data from this large trial indicated that 600 IU of natural-source vitamin E taken every other day provided no overall benefit for major cardiovascular events or cancer, did not affect total mortality, and decreased cardiovascular mortality in healthy women.
Abstract: Results During follow-up, there were 482 major cardiovascular events in the vitamin E group and 517 in the placebo group, a nonsignificant 7% risk reduction (relative risk [RR], 0.93; 95% confidence interval [CI], 0.82-1.05; P=.26). There were no significant effects on the incidences of myocardial infarction (RR, 1.01; 95% CI, 0.82-1.23; P=.96) or stroke (RR, 0.98; 95% CI, 0.82-1.17; P=.82), nor on ischemic or hemorrhagic stroke considered separately. For cardiovascular death, there was a significant 24% reduction (RR, 0.76; 95% CI, 0.59-0.98; P=.03). There was no significant effect on the incidence of total cancer (1437 cases in the vitamin E group and 1428 in the placebo group; RR, 1.01; 95% CI, 0.94-1.08; P=.87) or of breast (RR, 1.00; 95% CI, 0.90-1.12; P=.95), lung (RR, 1.09; 95% CI, 0.83-1.44; P=.52), or colon cancer (RR, 1.00; 95% CI, 0.77-1.31; P=.99). Cancer deaths also did not differ significantly between groups. There was no significant effect of vitamin E on total mortality (636 deaths in the vitamin E group and 615 in the placebo group; RR, 1.04; 95% CI, 0.93-1.16; P=.53). Conclusions The data from this large trial indicated that 600 IU of natural-source vitamin E taken every other day provided no overall benefit for major cardiovascular events or cancer, did not affect total mortality, and decreased cardiovascular mortality in healthy women. These data do not support recommending vitamin E supplementation for cardiovascular disease or cancer prevention among healthy women.
••
TL;DR: Defensive medicine is highly prevalent among physicians in Pennsylvania who pay the most for liability insurance, with potentially serious implications for cost, access, and both technical and interpersonal quality of care.
Abstract: Context How often physicians alter their clinical behavior because of the threat of malpractice liability, termed “defensive medicine,” and the consequences of those changes are central questions in the ongoing medical malpractice reform debate. Objective To study the prevalence and characteristics of defensive medicine among physicians practicing in high-liability specialties during a period of substantial instability in the malpractice environment. Design, Setting, and Participants Mail survey of physicians in 6 specialties at high risk of litigation (emergency medicine, general surgery, orthopedic surgery, neurosurgery, obstetrics/gynecology, and radiology) in Pennsylvania in May 2003. Main Outcome Measures Number of physicians in each specialty reporting defensive medicine or changes in scope of practice, and characteristics of defensive medicine (assurance and avoidance behavior). Results A total of 824 physicians (65%) completed the survey. Nearly all (93%) reported practicing defensive medicine. “Assurance behavior,” such as ordering tests, performing diagnostic procedures, and referring patients for consultation, was very common (92%). Among practitioners of defensive medicine who detailed their most recent defensive act, 43% reported using imaging technology in clinically unnecessary circumstances. Avoidance of procedures and patients that were perceived to elevate the probability of litigation was also widespread. Forty-two percent of respondents reported that they had taken steps to restrict their practice in the previous 3 years, including eliminating procedures prone to complications, such as trauma surgery, and avoiding patients who had complex medical problems or were perceived as litigious. Defensive practice correlated strongly with respondents’ lack of confidence in their liability insurance and perceived burden of insurance premiums. Conclusion Defensive medicine is highly prevalent among physicians in Pennsylvania who pay the most for liability insurance, with potentially serious implications for cost, access, and both technical and interpersonal quality of care.