
Showing papers in "PLOS Medicine in 2019"


Journal ArticleDOI
TL;DR: It is shown that a CNN can assess the human tumor microenvironment and predict prognosis directly from histopathological images; the CNN-derived "deep stroma score" was an independent prognostic factor for overall survival in a multivariable Cox proportional hazard model.
Abstract: BACKGROUND: For virtually every patient with colorectal cancer (CRC), hematoxylin-eosin (HE)-stained tissue slides are available. These images contain quantitative information, which is not routinely used to objectively extract prognostic biomarkers. In the present study, we investigated whether deep convolutional neural networks (CNNs) can extract prognosticators directly from these widely available images. METHODS AND FINDINGS: We hand-delineated single-tissue regions in 86 CRC tissue slides, yielding more than 100,000 HE image patches, and used these to train a CNN by transfer learning, reaching a nine-class accuracy of >94% in an independent data set of 7,180 images from 25 CRC patients. With this tool, we performed automated tissue decomposition of representative multitissue HE images from 862 HE slides in 500 stage I-IV CRC patients in The Cancer Genome Atlas (TCGA) cohort, a large international multicenter collection of CRC tissue. Based on the output neuron activations in the CNN, we calculated a "deep stroma score," which was an independent prognostic factor for overall survival (OS) in a multivariable Cox proportional hazard model (hazard ratio [HR] with 95% confidence interval [CI]: 1.99 [1.27-3.12], p = 0.0028), while in the same cohort, manual quantification of stromal areas and a gene expression signature of cancer-associated fibroblasts (CAFs) were only prognostic in specific tumor stages. We validated these findings in an independent cohort of 409 stage I-IV CRC patients from the "Darmkrebs: Chancen der Verhütung durch Screening" (DACHS) study who were recruited between 2003 and 2007 in multiple institutions in Germany. Again, the score was an independent prognostic factor for OS (HR 1.63 [1.14-2.33], p = 0.008), CRC-specific OS (HR 2.29 [1.5-3.48], p = 0.0004), and relapse-free survival (RFS; HR 1.92 [1.34-2.76], p = 0.0004). A prospective validation is required before this biomarker can be implemented in clinical workflows. CONCLUSIONS: In our retrospective study, we show that a CNN can assess the human tumor microenvironment and predict prognosis directly from histopathological images.
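
The tissue classifier described above is built by transfer learning on HE image patches. As a rough, hypothetical sketch of that step only (not the authors' published pipeline), the code below fine-tunes an ImageNet-pretrained ResNet-18 for a nine-class patch problem; the backbone choice, directory layout, and hyperparameters are assumptions.

```python
# Hypothetical sketch: transfer learning for 9-class HE tissue-patch
# classification. Backbone, paths, and hyperparameters are assumptions,
# not the setup used in the study.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

NUM_CLASSES = 9  # nine tissue classes, as in the study

# Standard ImageNet preprocessing for the pretrained backbone.
preprocess = transforms.Compose([
    transforms.Resize(224),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# Assumes patches are stored as patches/train/<class_name>/*.png (hypothetical).
train_set = datasets.ImageFolder("patches/train", transform=preprocess)
train_loader = DataLoader(train_set, batch_size=64, shuffle=True)

# Load a pretrained backbone and replace the final classification layer.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):  # a few epochs, for illustration only
    for images, labels in train_loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```

The class-wise output activations of such a model could then be aggregated per slide into a stromal score and entered into a multivariable Cox model alongside clinical covariates, which is the spirit of the "deep stroma score" analysis.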

557 citations


Journal ArticleDOI
Ellis Voerman1, Susana Santos1, Bernadeta Patro Golab1, Bernadeta Patro Golab2, Pilar Amiano, Ferran Ballester3, Henrique Barros4, Anna Bergström5, Marie-Aline Charles6, Marie-Aline Charles7, Leda Chatzi8, Leda Chatzi9, Leda Chatzi10, Cécile Chevrier11, George P. Chrousos12, Eva Corpeleijn13, Nathalie Costet11, Sarah Crozier14, Graham Devereux15, Merete Eggesbø16, Sandra Ekström17, Maria Pia Fantini18, Sara Farchi, Francesco Forastiere, Vagelis Georgiu10, Keith M. Godfrey14, Keith M. Godfrey19, Davide Gori18, Veit Grote20, Wojciech Hanke21, Irva Hertz-Picciotto22, Barbara Heude7, Barbara Heude6, Daniel O. Hryhorczuk23, Rae-Chi Huang24, Hazel Inskip14, Hazel Inskip19, Nina Iszatt16, Anne M. Karvonen25, Louise C. Kenny26, Berthold Koletzko20, Leanne K. Küpers27, Hanna Lagström28, Irina Lehmann29, Per Magnus16, Renata Majewska30, Johanna Mäkelä31, Yannis Manios32, Fionnuala M. McAuliffe33, Sheila McDonald34, John Mehegan33, Monique Mommers35, Camilla Schmidt Morgen36, Camilla Schmidt Morgen37, Trevor A. Mori24, George Moschonis38, Deirdre M. Murray26, Carol Ní Chaoimh26, Ellen A. Nohr37, Anne-Marie Nybo Andersen36, Emily Oken39, Adriette J. J. M. Oostvogels35, Agnieszka Pac30, Eleni Papadopoulou16, Juha Pekkanen40, Costanza Pizzi41, Kinga Polańska21, Daniela Porta, Lorenzo Richiardi41, Sheryl L. Rifas-Shiman39, Luca Ronfani42, Ana Cristina Santos4, Marie Standl, Camilla Stoltenberg43, Elisabeth Thiering20, Carel Thijs35, Maties Torrent, Suzanne Tough34, Tomas Trnovec44, Steve Turner45, Lenie van Rossem46, Andrea von Berg, Martine Vrijheid47, Tanja G. M. Vrijkotte35, Jane West48, Alet H. Wijga, John Wright48, Oleksandr Zvinchuk, Thorkild I. A. Sørensen36, Debbie A Lawlor27, Romy Gaillard1, Vincent W. V. Jaddoe1 
TL;DR: In this article, the authors conducted an individual participant data meta-analysis of data from 162,129 mothers and children from 37 pregnancy and birth cohort studies from Europe, North-America and Australia, using multilevel binary logistic regression models with a random intercept at cohort level adjusted for maternal socio-demographic and life style related characteristics.
Abstract: Background: Maternal obesity and excessive gestational weight gain may have persistent effects on offspring fat development. However, it remains unclear whether these risks differ by severity of obesity, and whether these effects are restricted to the extremes of maternal body mass index (BMI) and gestational weight gain. We aimed to assess the separate and combined associations of maternal BMI and gestational weight gain with the risk of overweight/obesity throughout childhood, and their population impact. Methods and Findings: We conducted an individual participant data meta-analysis of data from 162,129 mothers and children from 37 pregnancy and birth cohort studies from Europe, North America and Australia. We assessed the individual and combined associations of maternal pre-pregnancy BMI and gestational weight gain, both in clinical categories and across their full ranges, with the risks of overweight/obesity in early- (2.0-5.0 years), mid- (5.0-10.0 years) and late childhood (10.0-18.0 years), using multilevel binary logistic regression models with a random intercept at cohort level adjusted for maternal socio-demographic and lifestyle-related characteristics. We observed that a higher maternal pre-pregnancy BMI and gestational weight gain both in clinical categories and across their full ranges were associated with higher risks of childhood overweight/obesity, with the strongest effects in late childhood (Odds Ratios (OR) for overweight/obesity in early-, mid- and late childhood, respectively: 1.66 (95% Confidence Interval (CI): 1.56, 1.78), OR 1.91 (95% CI: 1.85, 1.98), and OR 2.28 (95% CI: 2.08, 2.50) for maternal overweight, OR 2.43 (95% CI: 2.24, 2.64), OR 3.12 (95% CI: 2.98, 3.27), and OR 4.47 (95% CI: 3.99, 5.23) for maternal obesity, and OR 1.39 (95% CI: 1.30, 1.49), OR 1.55 (95% CI: 1.49, 1.60), and 1.72 (95% CI: 1.56, 1.91) for excessive gestational weight gain). The proportions of childhood overweight/obesity prevalence attributable to maternal overweight, maternal obesity and excessive gestational weight gain ranged from 10.2 to 21.6%. Relative to the effect of maternal BMI, excessive gestational weight gain only slightly increased the risk of childhood overweight/obesity within each clinical BMI category (P-values for interactions of maternal BMI with gestational weight gain: p=0.038, p<0.001 and p=0.637, in early-, mid- and late childhood, respectively). Limitations of this study include the self-report of maternal BMI and gestational weight gain for some of the cohorts, and the potential of residual confounding. Also, as this study only included participants from Europe, North America and Australia, results need to be interpreted with caution with respect to other populations. Conclusions: In this study, higher maternal pre-pregnancy BMI and gestational weight gain were associated with an increased risk of childhood overweight/obesity, with the strongest effects at later ages. The additional effect of gestational weight gain in women who are overweight or obese before pregnancy is small. Given the large population impact, future intervention trials aiming to reduce the prevalence of childhood overweight and obesity should focus on maternal weight status before pregnancy, in addition to weight gain during pregnancy.
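
The core model here is a binary logistic regression with a random intercept per cohort. A minimal sketch of one way to fit such a model in Python is shown below, using the Bayesian mixed GLM from statsmodels; the pooled data file, column names, and covariate set are hypothetical, and the original analysis may have used different software and specifications.

```python
# Hypothetical sketch: childhood overweight/obesity (0/1) regressed on maternal
# pre-pregnancy BMI category and gestational weight gain category, adjusted for
# socio-demographic/lifestyle covariates, with a random intercept for cohort.
# File and column names are assumptions for illustration.
import pandas as pd
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

df = pd.read_csv("ipd_cohorts.csv")  # hypothetical pooled IPD file

model = BinomialBayesMixedGLM.from_formula(
    "child_overweight ~ C(maternal_bmi_cat) + C(gwg_cat) "
    "+ maternal_age + C(education) + C(smoking)",
    vc_formulas={"cohort": "0 + C(cohort)"},  # random intercept by cohort
    data=df,
)
result = model.fit_vb()   # variational Bayes fit
print(result.summary())   # fixed-effect log-odds and variance components
```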

248 citations


Journal ArticleDOI
TL;DR: As many reviews of observational studies on etiology are being performed, this document may provide researchers with guidance on how to conduct and analyse such reviews.
Abstract: BACKGROUND To our knowledge, no publication providing overarching guidance on the conduct of systematic reviews of observational studies of etiology exists. METHODS AND FINDINGS Conducting Systematic Reviews and Meta-Analyses of Observational Studies of Etiology (COSMOS-E) provides guidance on all steps in systematic reviews of observational studies of etiology, from shaping the research question, defining exposure and outcomes, to assessing the risk of bias and statistical analysis. The writing group included researchers experienced in meta-analyses and observational studies of etiology. Standard peer-review was performed. While the structure of systematic reviews of observational studies on etiology may be similar to that for systematic reviews of randomised controlled trials, there are specific tasks within each component that differ. Examples include assessment for confounding, selection bias, and information bias. In systematic reviews of observational studies of etiology, combining studies in meta-analysis may lead to more precise estimates, but such greater precision does not automatically remedy potential bias. Thorough exploration of sources of heterogeneity is key when assessing the validity of estimates and causality. CONCLUSION As many reviews of observational studies on etiology are being performed, this document may provide researchers with guidance on how to conduct and analyse such reviews.

223 citations


Journal ArticleDOI
TL;DR: This research has identified a 264% increase in the odds of child obesity when mothers have obesity before conception, providing substantial evidence for the need to develop interventions that commence prior to conception to support women of childbearing age with weight management in order to halt intergenerational obesity.
Abstract: Background There is a global obesity crisis, particularly among women and disadvantaged populations. Early-life intervention to prevent childhood obesity is a priority for public health, global health, and clinical practice. Understanding the association between childhood obesity and maternal pre-pregnancy weight status would inform policy and practice by allowing one to estimate the potential for offspring health gain through channelling resources into intervention. This systematic review and meta-analysis aimed to examine the dose–response association between maternal body mass index (BMI) and childhood obesity in the offspring. Methods and findings Searches in MEDLINE, Child Development & Adolescent Studies, CINAHL, Embase, and PsycInfo were carried out in August 2017 and updated in March 2019. Supplementary searches included hand-searching reference lists, performing citation searching, and contacting authors. Two researchers carried out independent screening, data extraction, and quality assessment. Observational studies published in English and reporting associations between continuous and/or categorical maternal and child BMI or z-score were included. Categorical outcomes were child obesity (≥95th percentile, primary outcome), overweight/obesity (≥85th percentile), and overweight (85th to 95th percentile). Linear and nonlinear dose–response meta-analyses were conducted using random effects models. Studies that could not be included in meta-analyses were summarised narratively. Seventy-nine of 41,301 studies identified met the inclusion criteria (n = 59 cohorts). Meta-analyses of child obesity included 20 studies (n = 88,872); child overweight/obesity, 22 studies (n = 181,800); and overweight, 10 studies (n = 53,238). Associations were nonlinear and there were significantly increased odds of child obesity with maternal obesity (odds ratio [OR] 3.64, 95% CI 2.68–4.95) and maternal overweight (OR 1.89, 95% CI 1.62–2.19). Significantly increased odds were observed for child overweight/obesity (OR 2.69, 95% CI 2.10–3.46) and for child overweight (OR 1.80, 95% CI 1.25, 2.59) with maternal obesity. A limitation of this research is that the included studies did not always report the data in a format that enabled inclusion in this complex meta-analysis. Conclusions This research has identified a 264% increase in the odds of child obesity when mothers have obesity before conception. This study provides substantial evidence for the need to develop interventions that commence prior to conception, to support women of childbearing age with weight management in order to halt intergenerational obesity.
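
The pooled odds ratios above come from random-effects (dose-response) meta-analysis. As a simplified worked example of the random-effects pooling step only, the snippet below applies a DerSimonian-Laird estimator to made-up study-level odds ratios; it does not reproduce the nonlinear dose-response modelling used in the paper.

```python
# Minimal DerSimonian-Laird random-effects pooling of odds ratios.
# The study-level ORs and CIs below are made up for illustration.
import numpy as np

or_est  = np.array([3.2, 4.1, 2.9, 3.8])    # hypothetical per-study ORs
ci_low  = np.array([2.1, 2.8, 1.9, 2.5])
ci_high = np.array([4.9, 6.0, 4.4, 5.8])

y = np.log(or_est)                                    # log odds ratios
se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)  # SE recovered from the 95% CI
w = 1.0 / se**2                                       # fixed-effect weights

# Cochran's Q, between-study variance tau^2, and I^2 heterogeneity.
y_fixed = np.sum(w * y) / np.sum(w)
Q = np.sum(w * (y - y_fixed) ** 2)
dof = len(y) - 1
tau2 = max(0.0, (Q - dof) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
i2 = max(0.0, (Q - dof) / Q) * 100

w_re = 1.0 / (se**2 + tau2)                           # random-effects weights
pooled = np.sum(w_re * y) / np.sum(w_re)
pooled_se = np.sqrt(1.0 / np.sum(w_re))

print(f"Pooled OR {np.exp(pooled):.2f} "
      f"(95% CI {np.exp(pooled - 1.96 * pooled_se):.2f}-"
      f"{np.exp(pooled + 1.96 * pooled_se):.2f}), I2 = {i2:.0f}%")
```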

205 citations


Journal ArticleDOI
TL;DR: The role of stigma in responses to the US opioid crisis is discussed, along with the importance of awareness and education in addressing it.
Abstract: Alexander Tsai and co-authors discuss the role of stigma in responses to the US opioid crisis.

188 citations


Journal ArticleDOI
TL;DR: Through synthesis of extant qualitative studies of menstrual experience, this model hypothesises directional pathways that could be tested by future studies and may serve as a framework for program and policy development by highlighting critical antecedents and pathways through which interventions could improve women’s and girls’ health and well-being.
Abstract: Background Attention to women's and girls' menstrual needs is critical for global health and gender equality. The importance of this neglected experience has been elucidated by a growing body of qualitative research, which we systematically reviewed and synthesised. Methods and findings We undertook systematic searching to identify qualitative studies of women's and girls' experiences of menstruation in low- and middle-income countries (LMICs). Of 6,892 citations screened, 76 studies reported in 87 citations were included. Studies captured the experiences of over 6,000 participants from 35 countries. This included 45 studies from sub-Saharan Africa (with the greatest number of studies from Kenya [n = 7], Uganda [n = 6], and Ethiopia [n = 5]), 21 from South Asia (including India [n = 12] and Nepal [n = 5]), 8 from East Asia and the Pacific, 5 from Latin America and the Caribbean, 5 from the Middle East and North Africa, and 1 study from Europe and Central Asia. Through synthesis, we identified overarching themes and their relationships to develop a directional model of menstrual experience. This model maps distal and proximal antecedents of menstrual experience through to the impacts of this experience on health and well-being. The sociocultural context, including menstrual stigma and gender norms, influenced experiences by limiting knowledge about menstruation, limiting social support, and shaping internalised and externally enforced behavioural expectations. Resource limitations underlay inadequate physical infrastructure to support menstruation, as well as an economic environment restricting access to affordable menstrual materials. Menstrual experience included multiple themes: menstrual practices, perceptions of practices and environments, confidence, shame and distress, and containment of bleeding and odour. These components of experience were interlinked and contributed to negative impacts on women's and girls' lives. Impacts included harms to physical and psychological health as well as education and social engagement. Our review is limited by the available studies. Study quality was varied, with 18 studies rated as high, 35 medium, and 23 low trustworthiness. Sampling and analysis tended to be untrustworthy in lower-quality studies. Studies focused on the experiences of adolescent girls were most strongly represented, and we achieved early saturation for this group. Reflecting the focus of menstrual health research globally, there was an absence of studies focused on adult women and those from certain geographical areas. Conclusions Through synthesis of extant qualitative studies of menstrual experience, we highlight consistent challenges and developed an integrated model of menstrual experience. This model hypothesises directional pathways that could be tested by future studies and may serve as a framework for program and policy development by highlighting critical antecedents and pathways through which interventions could improve women's and girls' health and well-being. Review protocol registration The review protocol registration is PROSPERO: CRD42018089581.

172 citations


Journal ArticleDOI
TL;DR: Health system performance for management of diabetes showed large losses to care at the stage of being tested, and low rates of diabetes control along the care cascade, indicating large unmet need for diabetes care across 28 LMICs.
Abstract: CITATION: Manne-Goehler, J., et al. 2019. Health system performance for people with diabetes in 28 low- and middle-income countries : a cross-sectional study of nationally representative surveys. PLoS Medicine, 16(3):e1002751, doi:10.1371/journal.pmed.1002751.

155 citations


Journal ArticleDOI
TL;DR: It is suggested that social network interventions can be effective in the short term (≤6 months) and longer term (>6–12 months) for sexual health outcomes, as well as for some other outcomes including alcohol misuse, well-being, and smoking cessation.
Abstract: Background There has been a growing interest in understanding the effects of social networks on health-related behaviour, with a particular backdrop being the emerging prominence of complexity or systems science in public health. Social network interventions specifically use or alter the characteristics of social networks to generate, accelerate, or maintain health behaviours. We conducted a systematic review and meta-analysis to investigate health behaviour outcomes of social network interventions. Methods and findings We searched eight databases and two trial registries from 1990 to May 28, 2019, for English-language reports of randomised controlled trials (RCTs) and before-and-after studies investigating social network interventions for health behaviours and outcomes. Trials that did not specifically use social networks or that did not include a comparator group were excluded. We screened studies and extracted data from published reports independently. The primary outcome of health behaviours or outcomes at ≤6 months was assessed by random-effects meta-analysis. Secondary outcomes included those measures at >6–12 months and >12 months. This study is registered with the International Prospective Register of Systematic Reviews, PROSPERO: CRD42015023541. We identified 26,503 reports; after exclusion, 37 studies, conducted between 1996 and 2018 from 11 countries, were eligible for analysis, with a total of 53,891 participants (mean age 32.4 years [SD 12.7]; 45.5% females). A range of study designs were included: 27 used RCT/cluster RCT designs, and 10 used other study designs. Eligible studies addressed a variety of health outcomes, in particular sexual health and substance use. Social network interventions showed a significant intervention effect compared with comparator groups for sexual health outcomes. The pooled odds ratio (OR) was 1.46 (95% confidence interval [CI] 1.01–2.11; I2 = 76%) for sexual health outcomes at ≤6 months and OR 1.51 (95% CI 1.27–1.81; I2 = 40%) for sexual health outcomes at >6–12 months. Intervention effects for drug risk outcomes at each time point were not significant. There were also significant intervention effects for some other health outcomes including alcohol misuse, well-being, change in haemoglobin A1c (HbA1c), and smoking cessation. Because of clinical and measurement heterogeneity, it was not appropriate to pool data on these other behaviours in a meta-analysis. For sexual health outcomes, prespecified subgroup analyses were significant for intervention approach (p < 0.001), mean age of participants (p = 0.002), and intervention length (p = 0.05). Overall, 22 of the 37 studies demonstrated a high risk of bias, as measured by the Cochrane Risk of Bias tool. The main study limitations identified were the inclusion of studies of variable quality; difficulty in isolating the effects of specific social network intervention components on health outcomes, as interventions included other active components; and reliance on self-reported outcomes, which have inherent recall and desirability biases. Conclusions Our findings suggest that social network interventions can be effective in the short term (≤6 months) and longer term (>6–12 months) for sexual health outcomes. Intervention effects for drug risk outcomes at each time point were not significant. There were also significant intervention effects for some other health outcomes including alcohol misuse, well-being, change in HbA1c, and smoking cessation.

143 citations


Journal ArticleDOI
TL;DR: The prevalence of prescription opioid use among adolescents and young adults in the US is high despite known risks for future opioid and other drug use disorders, underscoring the importance of drug and alcohol screening programs in this population.
Abstract: Background Prescription opioid misuse has become a leading cause of unintentional injury and death among adolescents and young adults in the United States. However, there is limited information on how adolescents and young adults obtain prescription opioids. There are also inadequate recent data on the prevalence of additional drug abuse among those misusing prescription opioids. In this study, we evaluated past-year prevalence of prescription opioid use and misuse, sources of prescription opioids, and additional substance use among adolescents and young adults. Methods and findings This was a retrospective analysis of the National Survey on Drug Use and Health (NSDUH) for the years 2015 and 2016. Prevalence of opioid use, misuse, use disorder, and additional substance use were calculated with 95% confidence intervals (CIs), stratified by age group and other demographic variables. Sources of prescription opioids were determined for respondents reporting opioid misuse. We calculated past-year prevalence of opioid use and misuse with or without use disorder, sources of prescription opioids, and prevalence of additional substance use. We included 27,857 adolescents (12–17 years of age) and 28,213 young adults (18–25 years of age) in our analyses, corresponding to 119.3 million individuals in the extrapolated national population. There were 15,143 respondents (27.5% [95% CI 27.0–28.0], corresponding to 32.8 million individuals) who used prescription opioids in the previous year, including 21.0% (95% CI 20.4–21.6) of adolescents and 32.2% (95% CI 31.4–33.0) of young adults. Significantly more females than males reported using any prescription opioid (30.3% versus 24.8%, P < 0.001), and non-Hispanic whites and blacks were more likely to have had any opioid use compared to Hispanics (28.9%, 28.1%, and 25.8%, respectively; P < 0.001). Opioid misuse was reported by 1,050 adolescents (3.8%; 95% CI 3.5–4.0) and 2,207 young adults (7.8%; 95% CI 7.3–8.2; P < 0.001). Male respondents using opioids were more likely to have opioid misuse without use disorder compared with females (23.2% versus 15.8%, respectively; P < 0.001), with similar prevalence by race/ethnicity. Among those misusing opioids, 55.7% obtained them from friends or relatives, 25.4% from the healthcare system, and 18.9% through other means. Obtaining opioids free from friends or relatives was the most common source for both adolescents (33.5%) and young adults (41.4%). Those with opioid misuse reported high prevalence of prior cocaine (35.5%), hallucinogen (49.4%), heroin (8.7%), and inhalant (30.4%) use. In addition, at least half had used tobacco (55.5%), alcohol (66.9%), or cannabis (49.9%) in the past month. Potential limitations of the study are that we cannot exclude selection bias in the study design or socially desirable reporting among participants, and that longitudinal data are not available for long-term follow-up of individuals. Conclusions Results from this study suggest that the prevalence of prescription opioid use among adolescents and young adults in the US is high despite known risks for future opioid and other drug use disorders. Reported prescription opioid misuse is common among adolescents and young adults and often associated with additional substance abuse, underscoring the importance of drug and alcohol screening programs in this population. 
Prevention and treatment efforts should take into account that greater than half of youths misusing prescription opioids obtain these medications through friends and relatives.

128 citations


Journal ArticleDOI
TL;DR: Findings support that prior DENV infection may protect individuals from symptomatic Zika; more research is needed to address the possible immunological mechanism(s) of cross-protection between ZIKV and DENV and whether DENV immunity also modulates other ZIKV infection outcomes such as neurological or congenital syndromes.
Abstract: Background Zika virus (ZIKV) emerged in northeast Brazil in 2015 and spread rapidly across the Americas, in populations that have been largely exposed to dengue virus (DENV). The impact of prior DENV infection on ZIKV infection outcome remains unclear. To study this potential impact, we analyzed the large 2016 Zika epidemic in Managua, Nicaragua, in a pediatric cohort with well-characterized DENV infection histories. Methods and findings Symptomatic ZIKV infections (Zika cases) were identified by real-time reverse transcription PCR and serology in a community-based cohort study that follows approximately 3,700 children aged 2–14 years old. Annual blood samples were used to identify clinically inapparent ZIKV infections using a novel, well-characterized serological assay. Multivariable Poisson regression was used to examine the relation between prior DENV infection and incidence of symptomatic and inapparent ZIKV infection. The generalized-growth method was used to estimate the effective reproduction number. From January 1, 2016, to February 28, 2017, 560 symptomatic ZIKV infections and 1,356 total ZIKV infections (symptomatic and inapparent) were identified, for an overall incidence of 14.0 symptomatic infections (95% CI: 12.9, 15.2) and 36.5 total infections (95% CI: 34.7, 38.6) per 100 person-years. Effective reproduction number estimates ranged from 3.3 to 3.4, depending on the ascending wave period. Incidence of symptomatic and total ZIKV infections was higher in females and older children. Analysis of the effect of prior DENV infection was performed on 3,027 participants with documented DENV infection histories, of which 743 (24.5%) had experienced at least 1 prior DENV infection during cohort follow-up. Prior DENV infection was inversely associated with risk of symptomatic ZIKV infection in the total cohort population (incidence rate ratio [IRR]: 0.63; 95% CI: 0.48, 0.81; p < 0.005) and with risk of symptomatic presentation given ZIKV infection (IRR: 0.62; 95% CI: 0.44, 0.86) when adjusted for age, sex, and recent DENV infection (1–2 years before ZIKV infection). Recent DENV infection was significantly associated with decreased risk of symptomatic ZIKV infection when adjusted for age and sex, but not when adjusted for prior DENV infection. Prior or recent DENV infection did not affect the rate of total ZIKV infections. Our findings are limited to a pediatric population and constrained by the epidemiology of the site. Conclusions These findings support that prior DENV infection may protect individuals from symptomatic Zika. More research is needed to address the possible immunological mechanism(s) of cross-protection between ZIKV and DENV and whether DENV immunity also modulates other ZIKV infection outcomes such as neurological or congenital syndromes.
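
The incidence rate ratios above come from multivariable Poisson regression. A minimal sketch of that kind of model, with a log person-time offset and hypothetical file and variable names, might look like this:

```python
# Hypothetical sketch: incidence rate ratio (IRR) of symptomatic ZIKV infection
# by prior DENV infection, adjusted for age group, sex, and recent DENV
# infection, using a Poisson GLM with a log person-time offset.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("zika_cohort.csv")   # hypothetical person-level follow-up data

model = smf.glm(
    "zika_case ~ prior_denv + recent_denv + C(age_group) + sex",
    data=df,
    family=sm.families.Poisson(),
    offset=np.log(df["person_years"]),   # follow-up time per participant
)
result = model.fit()
print(np.exp(result.params))      # incidence rate ratios
print(np.exp(result.conf_int()))  # 95% CIs on the IRR scale
```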

123 citations


Journal ArticleDOI
TL;DR: It is found that host-response-based diagnostics could accurately identify patients with ATB and predict individuals with high risk of progression from LTBI to ATB prior to sputum conversion.
Abstract: Background The World Health Organization (WHO) and Foundation for Innovative New Diagnostics (FIND) have published target product profiles (TPPs) calling for non-sputum-based diagnostic tests for the diagnosis of active tuberculosis (ATB) disease and for predicting the progression from latent tuberculosis infection (LTBI) to ATB. A large number of host-derived blood-based gene-expression biomarkers for diagnosis of patients with ATB have been proposed to date, but none have been implemented in clinical settings. The focus of this study is to directly compare published gene signatures for diagnosis of patients with ATB across a large, diverse list of publicly available gene expression datasets, and evaluate their performance against the WHO/FIND TPPs. Methods and findings We searched PubMed, Gene Expression Omnibus (GEO), and ArrayExpress in June 2018. We included all studies irrespective of study design and enrollment criteria. We found 16 gene signatures for the diagnosis of ATB compared to other clinical conditions in PubMed. For each signature, we implemented a classification model as described in the corresponding original publication of the signature. We identified 24 datasets containing 3,083 transcriptome profiles from whole blood or peripheral blood mononuclear cell samples of healthy controls or patients with ATB, LTBI, or other diseases from 14 countries in GEO. Using these datasets, we calculated weighted mean area under the receiver operating characteristic curve (AUROC), specificity at 90% sensitivity, and negative predictive value (NPV) for each gene signature across all datasets. We also compared the diagnostic odds ratio (DOR), heterogeneity in DOR, and false positive rate (FPR) for each signature using bivariate meta-analysis. Across 9 datasets of patients with culture-confirmed diagnosis of ATB, 11 signatures had weighted mean AUROC > 0.8, and 2 signatures had weighted mean AUROC ≤ 0.6. All but 2 signatures had high NPV (>98% at 2% prevalence). Two gene signatures achieved the minimal WHO TPP for a non-sputum-based triage test. When including datasets with clinical diagnosis of ATB, there was minimal reduction in the weighted mean AUROC and specificity of all but 3 signatures compared to when using only culture-confirmed ATB data. Only 4 signatures had homogeneous DOR and lower FPR when datasets with clinical diagnosis of ATB were included; other signatures either had heterogeneous DOR or higher FPR or both. Finally, 7 of 16 gene signatures predicted progression from LTBI to ATB 6 months prior to sputum conversion with positive predictive value > 6% at 2% prevalence. Our analyses may have under- or overestimated the performance of certain ATB diagnostic signatures because our implementation may be different from the published models for those signatures. We re-implemented published models because the exact models were not publicly available. Conclusions We found that host-response-based diagnostics could accurately identify patients with ATB and predict individuals with high risk of progression from LTBI to ATB prior to sputum conversion. We found that a higher number of genes in a signature did not increase the accuracy of the signature. Overall, the Sweeney3 signature performed robustly across all comparisons. Our results provide strong evidence for the potential of host-response-based diagnostics in achieving the WHO goal of ending tuberculosis by 2035, and host-response-based diagnostics should be pursued for clinical implementation.
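
The headline metrics in this comparison (AUROC and specificity at 90% sensitivity) can be computed from any signature's continuous score with standard ROC machinery. A small sketch on made-up data, assuming scikit-learn:

```python
# Hypothetical sketch: AUROC and specificity at 90% sensitivity for one
# gene-signature score against ATB labels, on made-up data.
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(0)
labels = rng.integers(0, 2, size=200)           # 1 = active TB, 0 = other
scores = labels * 1.2 + rng.normal(size=200)    # toy signature score

auroc = roc_auc_score(labels, scores)

fpr, tpr, _ = roc_curve(labels, scores)
idx = np.argmax(tpr >= 0.90)                    # first threshold reaching >=90% sensitivity
spec_at_90_sens = 1.0 - fpr[idx]

print(f"AUROC = {auroc:.2f}, specificity at 90% sensitivity = {spec_at_90_sens:.2f}")
```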

Journal ArticleDOI
TL;DR: The results suggest that the ability to reach and maintain therapeutic concentrations is both lesion and drug specific, indicating that stratifying patients based on disease extent, lesion types, and individual drug-susceptibility profiles may eventually be useful for guiding the selection of patient-tailored drug regimens and may lead to improved TB treatment outcomes.
Abstract: Background The sites of mycobacterial infection in the lungs of tuberculosis (TB) patients have complex structures and poor vascularization, which obstructs drug distribution to these hard-to-reach and hard-to-treat disease sites, further leading to suboptimal drug concentrations, resulting in compromised TB treatment response and resistance development. Quantifying lesion-specific drug uptake and pharmacokinetics (PKs) in TB patients is necessary to optimize treatment regimens at all infection sites, to identify patients at risk, to improve existing regimens, and to advance development of novel regimens. Using drug-level data in plasma and from 9 distinct pulmonary lesion types (vascular, avascular, and mixed) obtained from 15 hard-to-treat TB patients who failed TB treatments and therefore underwent lung resection surgery, we quantified the distribution and the penetration of 7 major TB drugs at these sites, and we provide novel tools for treatment optimization. Methods and findings A total of 329 plasma- and 1,362 tissue-specific drug concentrations from 9 distinct lung lesion types were obtained according to optimal PK sampling schema from 15 patients (10 men, 5 women, aged 23 to 58) undergoing lung resection surgery (clinical study NCT00816426 performed in South Korea between 9 June 2010 and 24 June 2014). Seven major TB drugs (rifampin [RIF], isoniazid [INH], linezolid [LZD], moxifloxacin [MFX], clofazimine [CFZ], pyrazinamide [PZA], and kanamycin [KAN]) were quantified. We developed and evaluated a site-of-action mechanistic PK model using nonlinear mixed effects methodology. We quantified population- and patient-specific lesion/plasma ratios (RPLs), dynamics, and variability of drug uptake into each lesion for each drug. CFZ and MFX had higher drug exposures in lesions compared to plasma (median RPL 2.37, range across lesions 1.26–22.03); RIF, PZA, and LZD showed moderate yet suboptimal lesion penetration (median RPL 0.61, range 0.21–2.4), while INH and KAN showed poor tissue penetration (median RPL 0.4, range 0.03–0.73). Stochastic PK/pharmacodynamic (PD) simulations were carried out to evaluate current regimen combinations and dosing guidelines in distinct patient strata. Patients receiving standard doses of RIF and INH, who are of the lower range of exposure distribution, spent substantial periods (>12 h/d) below effective concentrations in hard-to-treat lesions, such as caseous lesions and cavities. Standard doses of INH (300 mg) and KAN (1,000 mg) did not reach therapeutic thresholds in most lesions for a majority of the population. Drugs and doses that did reach target exposure in most subjects include 400 mg MFX and 100 mg CFZ. Patients with cavitary lesions, irrespective of drug choice, have an increased likelihood of subtherapeutic concentrations, leading to a higher risk of resistance acquisition while on treatment. A limitation of this study was the small sample size of 15 patients, performed in a unique study population of TB patients who failed treatment and underwent lung resection surgery. These results still need further exploration and validation in larger and more diverse cohorts. 
Conclusions Our results suggest that the ability to reach and maintain therapeutic concentrations is both lesion and drug specific, indicating that stratifying patients based on disease extent, lesion types, and individual drug-susceptibility profiles may eventually be useful for guiding the selection of patient-tailored drug regimens and may lead to improved TB treatment outcomes. We provide a web-based tool to further explore this model and results at http://saviclab.org/tb-lesion/.
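
As a deliberately oversimplified illustration of the lesion/plasma ratio (RPL) idea behind the site-of-action model, the toy simulation below couples a one-compartment plasma profile to a lesion compartment that equilibrates towards RPL times the plasma concentration; the structure and all parameter values are assumptions, far simpler than the published nonlinear mixed-effects model.

```python
# Hypothetical toy model: one-compartment plasma kinetics with first-order
# absorption and elimination, plus a lesion compartment that equilibrates
# towards rpl * plasma. All parameter values are illustrative only.
import numpy as np
from scipy.integrate import odeint

ka, ke = 1.5, 0.25     # absorption / elimination rate constants (1/h)
V = 50.0               # apparent volume of distribution (L)
dose = 600.0           # single oral dose (mg)
rpl = 0.6              # assumed lesion/plasma penetration coefficient
keq = 0.3              # lesion equilibration rate (1/h)

def pk(y, t):
    gut, plasma, lesion = y
    return [
        -ka * gut,                          # drug leaving the gut
        ka * gut / V - ke * plasma,         # plasma concentration (mg/L)
        keq * (rpl * plasma - lesion),      # lesion tends towards rpl * plasma
    ]

t = np.linspace(0, 24, 241)                 # one dosing day, 0.1-h grid
conc = odeint(pk, [dose, 0.0, 0.0], t)

target = 1.0                                # hypothetical target concentration (mg/L)
hours_below = (conc[:, 2] < target).mean() * 24
print(f"Lesion concentration below target for ~{hours_below:.1f} h per day")
```

In the published model, the penetration coefficient and equilibration dynamics are estimated per lesion type and per drug, with between-patient variability, which is what enables the stochastic PK/PD simulations described above.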

Journal ArticleDOI
TL;DR: Partner-delivered HIV self-testing combined with conditional fixed financial incentives substantially increased the proportion of male partners reported to have tested for HIV and linked into care or prevention within 28 days, with referral for antiretroviral therapy (ART) or circumcision accordingly.
Abstract: BACKGROUND: Conventional HIV testing services have been less comprehensive in reaching men than in reaching women globally, but HIV self-testing (HIVST) appears to be an acceptable alternative. Measurement of linkage to post-test services following HIVST remains the biggest challenge, yet is the biggest driver of cost-effectiveness. We investigated the impact of HIVST alone or with additional interventions on the uptake of testing and linkage to care or prevention among male partners of antenatal care clinic attendees in a novel adaptive trial. METHODS AND FINDINGS: An adaptive multi-arm, 2-stage cluster randomised trial was conducted between 8 August 2016 and 30 June 2017, with antenatal care clinic (ANC) days (i.e., clusters of women attending on a single day) as the unit of randomisation. Recruitment was from Ndirande, Bangwe, and Zingwangwa primary health clinics in urban Blantyre, Malawi. Women attending an ANC for the first time for their current pregnancy (regardless of trimester), 18 years and older, with a primary male partner not known to be on ART were enrolled in the trial after giving consent. Randomisation was to either the standard of care (SOC; with a clinic invitation letter to the male partner) or 1 of 5 intervention arms: the first arm provided women with 2 HIVST kits for their partners; the second and third arms provided 2 HIVST kits along with a conditional fixed financial incentive of $3 or $10; the fourth arm provided 2 HIVST kits and a 10% chance of receiving $30 in a lottery; and the fifth arm provided 2 HIVST kits and a phone call reminder for the women's partners. The primary outcome was the proportion of male partners who were reported to have tested for HIV and linked into care or prevention within 28 days, with referral for antiretroviral therapy (ART) or circumcision accordingly. Women were interviewed at 28 days about partner testing and adverse events. Cluster-level summaries compared each intervention versus SOC using eligible women as the denominator (intention-to-treat). Risk ratios were adjusted for male partner testing history and recruitment clinic. A total of 2,349/3,137 (74.9%) women participated (71 ANC days), with a mean age of 24.8 years (SD: 5.4). The majority (2,201/2,233; 98.6%) of women were married, 254/2,107 (12.3%) were unable to read and write, and 1,505/2,247 (67.0%) were not employed. The mean age for male partners was 29.6 years (SD: 7.5), only 88/2,200 (4.0%) were unemployed, and 966/2,210 (43.7%) had never tested for HIV before. Women in the SOC arm reported that 17.4% (71/408) of their partners tested for HIV, whereas a much higher proportion of partners were reported to have tested for HIV in all intervention arms (87.0%-95.4%, p < 0.001 in all 5 intervention arms). As compared with those who tested in the SOC arm (geometric mean 13.0%), higher proportions of partners met the primary endpoint in the HIVST + $3 (geometric mean 40.9%, adjusted risk ratio [aRR] 3.01 [95% CI 1.63-5.57], p < 0.001), HIVST + $10 (51.7%, aRR 3.72 [95% CI 1.85-7.48], p < 0.001), and phone reminder (22.3%, aRR 1.58 [95% CI 1.07-2.33], p = 0.021) arms. In contrast, there was no significant increase in partners meeting the primary endpoint in the HIVST alone (geometric mean 17.5%, aRR 1.45 [95% CI 0.99-2.13], p = 0.130) or lottery (18.6%, aRR 1.43 [95% CI 0.96-2.13], p = 0.211) arms. The lottery arm was dropped at interim analysis. 
Overall, 46 male partners were confirmed to be HIV positive, 42 (91.3%) of whom initiated ART within 28 days; 222 tested HIV negative and were not already circumcised, of whom 135 (60.8%) were circumcised as part of the trial. No serious adverse events were reported. Costs per male partner who attended the clinic with a confirmed HIV test result were $23.73 and $28.08 for the HIVST + $3 and HIVST + $10 arms, respectively. Notable limitations of the trial included the relatively small number of clusters randomised to each arm, proxy reporting of the male partner testing outcome, and being unable to evaluate retention in care. CONCLUSIONS: In this study, the odds of men's linkage to care or prevention increased substantially using conditional fixed financial incentives plus partner-delivered HIVST; combinations were potentially affordable. TRIAL REGISTRATION: ISRCTN 18421340.
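
The primary analysis compares cluster-level (clinic-day) summaries between each arm and standard of care. A hypothetical, unadjusted sketch of that comparison using geometric means of cluster proportions is shown below; the trial additionally adjusted risk ratios for partner testing history and recruitment clinic.

```python
# Hypothetical, unadjusted sketch: cluster-level (clinic-day) proportions of
# male partners reaching the primary endpoint, intervention vs. standard of
# care. All counts are made up for illustration.
import numpy as np
import pandas as pd

clusters = pd.DataFrame({
    "arm":      ["soc"] * 4 + ["hivst_3usd"] * 4,
    "eligible": [35, 40, 28, 33, 36, 31, 38, 30],   # eligible women per clinic day
    "endpoint": [5, 6, 3, 4, 15, 13, 17, 12],       # partners tested and linked
})
clusters["prop"] = clusters["endpoint"] / clusters["eligible"]

# Geometric mean of cluster-level proportions per arm, then their ratio.
geo = clusters.groupby("arm")["prop"].apply(lambda p: np.exp(np.log(p).mean()))
print(geo.round(3).to_dict())
print(f"Unadjusted risk ratio = {geo['hivst_3usd'] / geo['soc']:.2f}")
```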

Journal ArticleDOI
TL;DR: The proportion of adults with hypertension who have been screened, are aware of their diagnosis, take antihypertensive treatment, and have achieved control in India is low, and there is large variation among states in health system performance in the management of hypertension.
Abstract: Background Evidence on where in the hypertension care process individuals are lost to care, and how this varies among states and population groups in a country as large as India, is essential for the design of targeted interventions and to monitor progress. Yet, to our knowledge, there has not yet been a nationally representative analysis of the proportion of adults who reach each step of the hypertension care process in India. This study aimed to determine (i) the proportion of adults with hypertension who have been screened, are aware of their diagnosis, take antihypertensive treatment, and have achieved control and (ii) the variation of these care indicators among states and sociodemographic groups. Methods and findings We used data from a nationally representative household survey carried out from 20 January 2015 to 4 December 2016 among individuals aged 15-49 years in all states and union territories (hereafter "states") of the country. The stages of the care process-computed among those with hypertension at the time of the survey-were (i) having ever had one's blood pressure (BP) measured before the survey ("screened"), (ii) having been diagnosed ("aware"), (iii) currently taking BP-lowering medication ("treated"), and (iv) reporting being treated and not having a raised BP ("controlled"). We disaggregated these stages by state, rural-urban residence, sex, age group, body mass index, tobacco consumption, household wealth quintile, education, and marital status. In total, 731,864 participants were included in the analysis. Hypertension prevalence was 18.1% (95% CI 17.8%-18.4%). Among those with hypertension, 76.1% (95% CI 75.3%-76.8%) had ever received a BP measurement, 44.7% (95% CI 43.6%-45.8%) were aware of their diagnosis, 13.3% (95% CI 12.9%-13.8%) were treated, and 7.9% (95% CI 7.6%-8.3%) had achieved control. Male sex, rural location, lower household wealth, and not being married were associated with greater losses at each step of the care process. Between states, control among individuals with hypertension varied from 2.4% (95% CI 1.7%-3.3%) in Nagaland to 21.0% (95% CI 9.8%-39.6%) in Daman and Diu. At 38.0% (95% CI 36.3%-39.0%), 28.8% (95% CI 28.5%-29.2%), 28.4% (95% CI 27.7%-29.0%), and 28.4% (95% CI 27.8%-29.0%), respectively, Puducherry, Tamil Nadu, Sikkim, and Haryana had the highest proportion of all adults (irrespective of hypertension status) in the sampled age range who had hypertension but did not achieve control. The main limitation of this study is that its results cannot be generalized to adults aged 50 years and older-the population group in which hypertension is most common. Conclusions Hypertension prevalence in India is high, but the proportion of adults with hypertension who are aware of their diagnosis, are treated, and achieve control is low. Even after adjusting for states' economic development, there is large variation among states in health system performance in the management of hypertension. Improvements in access to hypertension diagnosis and treatment are especially important among men, in rural areas, and in populations with lower household wealth.
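
The care indicators above are a sequence of proportions computed among respondents with hypertension. A bare-bones pandas sketch, with hypothetical column names and ignoring survey weights and state-level disaggregation:

```python
# Hypothetical sketch of the cascade indicators among respondents with
# hypertension. Column names are assumptions; the published analysis also
# applied survey weights and disaggregated by state and sociodemographics.
import pandas as pd

df = pd.read_csv("survey_adults.csv")       # hypothetical individual-level data
htn = df[df["has_hypertension"] == 1]       # restrict to those with hypertension

cascade = {
    "screened":   (htn["ever_bp_measured"] == 1).mean(),
    "aware":      (htn["diagnosed"] == 1).mean(),
    "treated":    (htn["on_bp_medication"] == 1).mean(),
    "controlled": ((htn["on_bp_medication"] == 1) & (htn["bp_raised"] == 0)).mean(),
}
for step, prop in cascade.items():
    print(f"{step:>10}: {100 * prop:.1f}%")
```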

Journal ArticleDOI
TL;DR: As countries develop economically, overweight prevalence increased substantially among the poorest and stayed mostly unchanged among the wealthiest, indicating the relative poor in upper- and lower-middle income countries may have the greatest burden.
Abstract: Background In high-income countries, obesity prevalence (body mass index greater than or equal to 30 kg/m2) is highest among the poor, while overweight (body mass index greater than or equal to 25 kg/m2) is prevalent across all wealth groups. In contrast, in low-income countries, the prevalence of overweight and obesity is higher among wealthier individuals than among poorer individuals. We characterize the transition of overweight and obesity from wealthier to poorer populations as countries develop, and project the burden of overweight and obesity among the poor for 103 countries. Methods and findings Our sample used 182 Demographic and Health Surveys and World Health Surveys (n = 2.24 million respondents) from 1995 to 2016. We created a standard wealth index using household assets common among all surveys and linked national wealth by country and year identifiers. We then estimated the changing probability of overweight and obesity across every wealth decile as countries’ per capita gross domestic product (GDP) rises using logistic and linear fixed-effect regression models. We found that obesity rates among the wealthiest decile were relatively stable with increasing national wealth, and the changing gradient was largely due to increasing obesity prevalence among poorer populations (3.5% [95% uncertainty interval: 0.0%–8.3%] to 14.3% [9.7%–19.0%]). Overweight prevalence among the richest (45.0% [35.6%–54.4%]) and the poorest (45.5% [35.9%–55.0%]) were roughly equal in high-income settings. At $8,000 GDP per capita, the adjusted probability of being obese was no longer highest in the richest decile, and the same was true of overweight at $10,000. Above $25,000, individuals in the richest decile were less likely than those in the poorest decile to be obese, and the same was true of overweight at $50,000. We then projected overweight and obesity rates by wealth decile to 2040 for all countries to quantify the expected rise in prevalence in the relatively poor. Our projections indicated that, if past trends continued, the number of people who are poor and overweight will increase in our study countries by a median 84.4% (range 3.54%–383.4%), most prominently in low-income countries. The main limitations of this study included the inclusion of cross-sectional, self-reported data, possible reverse causality of overweight and obesity on wealth, and the lack of physical activity and food price data. Conclusions Our findings indicate that as countries develop economically, overweight prevalence increased substantially among the poorest and stayed mostly unchanged among the wealthiest. The relative poor in upper- and lower-middle income countries may have the greatest burden, indicating important planning and targeting needs for national health programs.
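
The core estimation step described, the changing probability of overweight across wealth deciles as national GDP per capita rises, can be sketched as a logistic regression with a wealth-decile by GDP interaction and country fixed effects. The file, variable names, and omission of survey weights below are assumptions, not the authors' exact specification.

```python
# Hypothetical sketch: probability of overweight (0/1) by wealth decile as
# national wealth rises, with country fixed effects. No survey weights applied.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("dhs_whs_pooled.csv")        # hypothetical pooled survey file
df["log_gdp"] = np.log(df["gdp_per_capita"])  # national GDP per capita, by year

model = smf.glm(
    "overweight ~ C(wealth_decile) * log_gdp + C(country) + C(survey_year)",
    data=df,
    family=sm.families.Binomial(),
)
result = model.fit()

# Predicted probabilities for the poorest vs. richest decile at chosen GDP
# levels can then be generated with result.predict() on a constructed grid.
print(result.params.filter(like="wealth_decile").head())
```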

Journal ArticleDOI
TL;DR: Evidence that higher BMI leads to a higher risk of psoriasis is provided, which supports the prioritization of therapies and lifestyle interventions aimed at controlling weight for the prevention or treatment of this common skin disease.
Abstract: In a mendelian randomization study, Ashley Budu-Aggrey and co-workers study the influence of body mass index on psoriasis.

Journal ArticleDOI
TL;DR: Following intrauterine exposure to metformin for treatment of maternal GDM, neonates are significantly smaller than neonates whose mothers were treated with insulin during pregnancy, and metformin-exposed children appear to experience accelerated postnatal growth, resulting in heavier infants and higher BMI by mid-childhood compared to children whose mothers were treated with insulin.
Abstract: Background Metformin is increasingly offered as an acceptable and economic alternative to insulin for treatment of gestational diabetes mellitus (GDM) in many countries. However, the impact of maternal metformin treatment on the trajectory of fetal, infant, and childhood growth is unknown. Methods and findings PubMed, Ovid Embase, Medline, Web of Science, ClinicalTrials.gov, and the Cochrane database were systematically searched (from database inception to 26 February 2019). Outcomes of GDM-affected pregnancies randomised to treatment with metformin versus insulin were included (randomised controlled trials and prospective randomised controlled studies) from cohorts including European, American, Asian, Australian, and African women. Studies including pregnant women with pre-existing diabetes or non-diabetic women were excluded, as were trials comparing metformin treatment with oral glucose-lowering agents other than insulin. Two reviewers independently assessed articles for eligibility and risk of bias, and conflicts were resolved by a third reviewer. Outcome measures were parameters of fetal, infant, and childhood growth, including weight, height, BMI, and body composition. In total, 28 studies (n = 3,976 participants) met eligibility criteria and were included in the meta-analysis. No studies reported fetal growth parameters; 19 studies (n = 3,723 neonates) reported measures of neonatal growth. Neonates born to metformin-treated mothers had lower birth weights (mean difference −107.7 g, 95% CI −182.3 to −32.7, I2 = 83%, p = 0.005) and lower ponderal indices (mean difference −0.13 kg/m3, 95% CI −0.26 to 0.00, I2 = 0%, p = 0.04) than neonates of insulin-treated mothers. The odds of macrosomia (odds ratio [OR] 0.59, 95% CI 0.46 to 0.77, p < 0.001) and large for gestational age (OR 0.78, 95% CI 0.62 to 0.99, p = 0.04) were lower following maternal treatment with metformin compared to insulin. There was no difference in neonatal height or incidence of small for gestational age between groups. Two studies (n = 411 infants) reported measures of infant growth (18–24 months of age). In contrast to the neonatal phase, metformin-exposed infants were significantly heavier than those in the insulin-exposed group (mean difference 440 g, 95% CI 50 to 830, I2 = 4%, p = 0.03). Three studies (n = 520 children) reported mid-childhood growth parameters (5–9 years). In mid-childhood, BMI was significantly higher (mean difference 0.78 kg/m2, 95% CI 0.23 to 1.33, I2 = 7%, p = 0.005) following metformin exposure than following insulin exposure, although the difference in absolute weights between the groups was not significantly different (p = 0.09). Limited evidence (1 study with data treated as 2 cohorts) suggested that adiposity indices (abdominal [p = 0.02] and visceral [p = 0.03] fat volumes) may be higher in children born to metformin-treated compared to insulin-treated mothers. Study limitations include heterogeneity in metformin dosing, heterogeneity in diagnostic criteria for GDM, and the scarcity of reporting of childhood outcomes. Conclusions Following intrauterine exposure to metformin for treatment of maternal GDM, neonates are significantly smaller than neonates whose mothers were treated with insulin during pregnancy. Despite lower average birth weight, metformin-exposed children appear to experience accelerated postnatal growth, resulting in heavier infants and higher BMI by mid-childhood compared to children whose mothers were treated with insulin. 
Such patterns of low birth weight and postnatal catch-up growth have been reported to be associated with adverse long-term cardio-metabolic outcomes. This suggests a need for further studies examining longitudinal perinatal and childhood outcomes following intrauterine metformin exposure. This review protocol was registered with PROSPERO under registration number CRD42018117503.

Journal ArticleDOI
TL;DR: It is suggested that short-term temperature variability exposure could increase the risk of cardiovascular disease, which may provide new insights into the health effects of climate change.
Abstract: Background Epidemiological studies have provided compelling evidence of associations between ambient temperature and cardiovascular disease. However, evidence of effects of daily temperature variability on cardiovascular disease is scarce and mixed. We aimed to examine short-term associations between temperature variability and hospital admissions for cause-specific cardiovascular disease in urban China. Methods and findings We conducted a national time-series analysis in 184 cities in China between 2014 and 2017. Data on daily hospital admissions for ischemic heart disease, heart failure, heart rhythm disturbances, and ischemic stroke were obtained from the database of Urban Employee Basic Medical Insurance (UEBMI) including 0.28 billion enrollees. Temperature data were acquired from the China Meteorological Data Sharing Service Center. Temperature variability was calculated from the standard deviation (SD) of daily minimum and maximum temperatures over exposure days. City-specific associations between temperature variability and cardiovascular disease were examined with overdispersed Poisson models controlling for calendar time, day of the week, public holiday, and daily mean temperature and relative humidity. Random-effects meta-analyses were performed to obtain national and regional average associations. We also plotted exposure-response relationship curve using a natural cubic spline of temperature variability. There were 8.0 million hospital admissions for cardiovascular disease during the study period. At the national-average level, a 1-°C increase in temperature variability at 0–1 days (TV0–1) was associated with a 0.44% (0.32%–0.55%), 0.31% (0.20%–0.43%), 0.48% (0.01%–0.96%), 0.34% (0.01%–0.67%), and 0.82% (0.59%–1.05%) increase in hospital admissions for cardiovascular disease, ischemic heart disease, heart failure, heart rhythm disturbances, and ischemic stroke, respectively. The estimates decreased but remained significant when controlling for ambient fine particulate matter (PM2.5), NO2, and SO2 pollution. The main limitation of the present study was the unavailability of data on individual exposure to temperature variability. Conclusions Our findings suggested that short-term temperature variability exposure could increase the risk of cardiovascular disease, which may provide new insights into the health effects of climate change.
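
A schematic, single-city version of the time-series model described (over-dispersed Poisson regression of daily admissions on temperature variability, controlling for time trend, day of week, holidays, mean temperature, and humidity) might look as follows; column names, spline degrees of freedom, and the quasi-Poisson implementation are assumptions, and the study itself fits city-specific models and combines them by random-effects meta-analysis.

```python
# Hypothetical single-city sketch of an over-dispersed (quasi-)Poisson
# time-series model; the full study fits this per city and then pools the
# city-specific coefficients. Column names are assumptions.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("city_daily.csv")  # hypothetical daily admissions + weather

# 'tv01' = SD of daily minimum and maximum temperatures over lags 0-1 days,
# assumed precomputed; cr() gives natural cubic regression splines (patsy).
model = smf.glm(
    "admissions ~ tv01 + cr(time, df=28) + C(dow) + holiday "
    "+ cr(mean_temp, df=6) + cr(rel_humidity, df=3)",
    data=df,
    family=sm.families.Poisson(),
)
result = model.fit(scale="X2")  # Pearson chi-square scale, i.e. quasi-Poisson

beta = result.params["tv01"]
print(f"{100 * (np.exp(beta) - 1):.2f}% change in admissions per 1 degC TV0-1")
```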

Journal ArticleDOI
TL;DR: In high-income countries, low SEP is a risk factor for hospital death as well as other indicators of potentially poor-quality end-of-life care, with evidence of a dose response indicating that inequality persists across the social stratum.
Abstract: Background Low socioeconomic position (SEP) is recognized as a risk factor for worse health outcomes. How socioeconomic factors influence end-of-life care, and the magnitude of their effect, is not understood. This review aimed to synthesise and quantify the associations between measures of SEP and use of healthcare in the last year of life. Methods and findings MEDLINE, EMBASE, PsycINFO, CINAHL, and ASSIA databases were searched without language restrictions from inception to 1 February 2019. We included empirical observational studies from high-income countries reporting an association between SEP (e.g., income, education, occupation, private medical insurance status, housing tenure, housing quality, or area-based deprivation) and place of death, plus use of acute care, specialist and nonspecialist end-of-life care, advance care planning, and quality of care in the last year of life. Methodological quality was evaluated using the Newcastle-Ottawa Quality Assessment Scale (NOS). The overall strength and direction of associations was summarised, and where sufficient comparable data were available, adjusted odds ratios (ORs) were pooled and dose-response meta-regression performed. A total of 209 studies were included (mean NOS quality score of 4.8); 112 high- to medium-quality observational studies were used in the meta-synthesis and meta-analysis (53.5% from North America, 31.0% from Europe, 8.5% from Australia, and 7.0% from Asia). Compared to people living in the least deprived neighbourhoods, people living in the most deprived neighbourhoods were more likely to die in hospital versus home (OR 1.30, 95% CI 1.23–1.38, p < 0.001), to receive acute hospital-based care in the last 3 months of life (OR 1.16, 95% CI 1.08–1.25, p < 0.001), and to not receive specialist palliative care (OR 1.13, 95% CI 1.07–1.19, p < 0.001). For every quintile increase in area deprivation, hospital versus home death was more likely (OR 1.07, 95% CI 1.05–1.08, p < 0.001), and not receiving specialist palliative care was more likely (OR 1.03, 95% CI 1.02–1.05, p < 0.001). Compared to the most educated (qualifications or years of education completed), the least educated people were more likely to not receive specialist palliative care (OR 1.26, 95% CI 1.07–1.49, p = 0.005). The observational nature of the studies included and the focus on high-income countries limit the conclusions of this review. Conclusions In high-income countries, low SEP is a risk factor for hospital death as well as other indicators of potentially poor-quality end-of-life care, with evidence of a dose response indicating that inequality persists across the social stratum. These findings should stimulate widespread efforts to reduce socioeconomic inequality towards the end of life.

Journal ArticleDOI
TL;DR: Methods for estimating gaps and steps in the care cascade for active TB disease and potential uses of this model for evaluating the impact of interventions to improve case finding, diagnosis, linkage to care, retention in care, and post-treatment monitoring of TB patients are described.
Abstract: The cascade of care is a model for evaluating patient retention across sequential stages of care required to achieve a successful treatment outcome. This approach was first used to evaluate HIV care and has since been applied to other diseases. The tuberculosis (TB) community has only recently started using care cascade analyses to quantify gaps in quality of care. In this article, we describe methods for estimating gaps (patient losses) and steps (patients retained) in the care cascade for active TB disease. We highlight approaches for overcoming challenges in constructing the TB care cascade, which include difficulties in estimating the population-level burden of disease and the diagnostic gap due to the limited sensitivity of TB diagnostic tests. We also describe potential uses of this model for evaluating the impact of interventions to improve case finding, diagnosis, linkage to care, retention in care, and post-treatment monitoring of TB patients.
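
As a concrete illustration of the gap/step arithmetic described above (not a method taken from the article itself), the following sketch tabulates patients retained at each stage and the losses between consecutive stages, using hypothetical stage names and counts.

```python
# Illustrative cascade arithmetic with hypothetical counts: "retained" is the step,
# "gap_from_previous" is the loss between consecutive stages.
def care_cascade(stage_counts):
    """stage_counts: ordered mapping of {stage_name: number of patients}."""
    stages = list(stage_counts.items())
    rows = []
    for i, (name, count) in enumerate(stages):
        gap = stages[i - 1][1] - count if i > 0 else 0
        pct = 100 * count / stages[0][1]
        rows.append({"stage": name, "retained": count,
                     "gap_from_previous": gap,
                     "percent_of_estimated_burden": round(pct, 1)})
    return rows

example = {
    "Estimated TB burden": 1000,        # population-level estimate, e.g., from surveys
    "Accessed a diagnostic test": 850,
    "Diagnosed with TB": 700,           # limited test sensitivity widens this gap
    "Started treatment": 650,
    "Completed treatment": 550,
    "Recurrence-free at 1 year": 500,   # post-treatment monitoring
}
for row in care_cascade(example):
    print(row)
```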

Journal ArticleDOI
TL;DR: It is found that bariatric surgery, especially gastric bypass, prior to pregnancy was associated with an increased risk of some adverse perinatal outcomes, which suggests that women who have undergone bariatric surgery may benefit from specific preconception and pregnancy nutritional support and increased monitoring of fetal growth and development.

Journal ArticleDOI
TL;DR: The findings suggest a protective effect of social contact against dementia and that more frequent contact confers higher cognitive reserve, although it is possible that the ability to maintain more social contact is itself a marker of cognitive reserve.
Abstract: Background There is a need to identify targets for preventing or delaying dementia. Social contact is a potential target for clinical and public health studies, but previous observational studies had short follow-up, making findings susceptible to reverse causation bias. We therefore examined the association of social contact with subsequent incident dementia and cognition with 28 years’ follow-up. Methods and findings We conducted a retrospective analysis of the Whitehall II longitudinal prospective cohort study of employees of London civil service departments, aged 35–55 at baseline assessment in 1985–1988 and followed to 2017. Social contact was measured six times through a self-report questionnaire about frequency of contact with non-cohabiting relatives and friends. Dementia status was ascertained from three linked clinical and mortality databases, and cognition was assessed five times using tests of verbal memory, verbal fluency, and reasoning. Cox regression models with inverse probability weighting to account for attrition and missingness examined the association between social contact at age 50, 60, and 70 years and subsequent incident dementia. Mixed linear models examined the association of midlife social contact between 45 and 55 years with cognitive trajectory during the subsequent 14 years. Analyses were adjusted for age, sex, ethnicity, socioeconomic status, education, health behaviours, employment status, and marital status. Of 10,308 Whitehall II study participants, 10,228 provided social contact data (mean age 44.9 years [standard deviation (SD) 6.1 years] at baseline; 33.1% female; 89.1% white ethnicity). More frequent social contact at age 60 years was associated with lower dementia risk (hazard ratio [HR] for each SD higher social contact frequency = 0.88 [95% CI 0.79, 0.98], p = 0.02); the effect size of the association of social contact at 50 or 70 years with dementia was similar (0.92 [95% CI 0.83, 1.02], p = 0.13 and 0.91 [95% CI 0.78, 1.06], p = 0.23, respectively) but not statistically significant. The association between social contact and incident dementia was driven by contact with friends (HR = 0.90 [95% CI 0.81, 1.00], p = 0.05), but no association was found for contact with relatives. More frequent social contact during midlife was associated with better subsequent cognitive trajectory: global cognitive function was 0.07 SDs higher (95% CI 0.03, 0.11; p = 0.002) for those in the highest versus lowest tertile of social contact frequency, and this difference was maintained over 14 years of follow-up. Results were consistent in a series of post hoc analyses designed to assess potential biases. A limitation of our study is ascertainment of dementia status from electronic health records rather than in-person assessment of diagnostic status, with the possibility that milder dementia cases were more likely to be missed. Conclusions Findings from this study suggest a protective effect of social contact against dementia and that more frequent contact confers higher cognitive reserve, although it is possible that the ability to maintain more social contact may be a marker of cognitive reserve. Future intervention studies should seek to examine whether improving social contact frequency is feasible, acceptable, and efficacious in changing cognitive outcomes.
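
A minimal sketch of the general analysis pattern named above (inverse probability weighting for attrition combined with a weighted Cox model) is shown below. It is not the study's code; the column names (observed, followup_years, dementia, social_contact_sd) and the use of lifelines and scikit-learn are assumptions made for illustration.

```python
# Illustrative IPW + Cox sketch with hypothetical column names; covariates must be numeric.
import pandas as pd
from lifelines import CoxPHFitter
from sklearn.linear_model import LogisticRegression

def fit_ipw_cox(df: pd.DataFrame, covariates: list) -> float:
    # 1. Model the probability of remaining under observation (not lost to attrition/missing data).
    obs_model = LogisticRegression(max_iter=1000).fit(df[covariates], df["observed"])
    p_observed = obs_model.predict_proba(df[covariates])[:, 1]

    # 2. Keep observed participants and weight them by the inverse of that probability.
    mask = (df["observed"] == 1).to_numpy()
    analytic = df.loc[mask].copy()
    analytic["ipw"] = 1.0 / p_observed[mask]

    # 3. Weighted Cox model of time to incident dementia on social contact (per SD), adjusted for covariates.
    cph = CoxPHFitter()
    cph.fit(
        analytic[["followup_years", "dementia", "social_contact_sd", "ipw"] + covariates],
        duration_col="followup_years",
        event_col="dementia",
        weights_col="ipw",
        robust=True,  # robust variance is advisable with non-integer weights
    )
    return float(cph.hazard_ratios_["social_contact_sd"])  # HR per SD higher contact frequency
```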

Journal ArticleDOI
Lorenz von Seidlein1, Lorenz von Seidlein2, Thomas J. Peto1, Thomas J. Peto2, Jordi Landier3, Jordi Landier1, Thuy-Nhien Nguyen2, Rupam Tripura1, Rupam Tripura2, Rupam Tripura4, Koukeo Phommasone4, Koukeo Phommasone5, Tiengkham Pongvongsa1, Khin Maung Lwin1, Lilly Keereecharoen1, Ladda Kajeechiwa1, May Myo Thwin1, Daniel M. Parker6, Daniel M. Parker1, Jacher Wiladphaingern1, Suphak Nosten1, Stephane Proux1, Vincent Corbel3, Nguyen Tuong-Vy2, Truong Le Phuc-Nhi2, Do Hung Son2, Pham Nguyen Huong-Thu2, Nguyen Thi Kim Tuyen2, Nguyen Thanh Tien2, Le Thanh Dong, Dao Van Hue, Huynh Hong Quang, Chea Nguon, Chan Davoeung, Huy Rekol, Bipin Adhikari2, Bipin Adhikari1, Gisela Henriques7, Gisela Henriques1, Panom Phongmany, Preyanan Suangkanarat5, Atthanee Jeeyapant1, Benchawan Vihokhern1, Rob W. van der Pluijm2, Rob W. van der Pluijm1, Yoel Lubell1, Yoel Lubell2, Lisa J. White2, Lisa J. White1, Ricardo Aguas1, Ricardo Aguas2, Cholrawee Promnarate1, Pasathorn Sirithiranont1, Benoit Malleret8, Benoit Malleret9, Laurent Rénia8, Carl Onsjö10, Carl Onsjö1, Xin Hui S Chan1, Xin Hui S Chan2, Jeremy Chalk1, Olivo Miotto11, Olivo Miotto1, Krittaya Patumrat1, Kesinee Chotivanich1, Borimas Hanboonkunupakarn1, Podjanee Jittmala1, Nils Kaehler1, Phaik Yeong Cheah1, Phaik Yeong Cheah2, Christopher Pell4, Mehul Dhorda1, Mallika Imwong1, Georges Snounou12, Mavuto Mukaka2, Mavuto Mukaka1, Pimnara Peerawaranun1, Sue J. Lee2, Sue J. Lee1, Julie A. Simpson13, Sasithon Pukrittayakamee1, Sasithon Pukrittayakamee14, Pratap Singhasivanon1, Martin P. Grobusch4, Frank Cobelens4, Frank Smithuis, Paul N. Newton5, Paul N. Newton2, Guy E. Thwaites2, Nicholas P. J. Day2, Nicholas P. J. Day1, Mayfong Mayxay15, Mayfong Mayxay5, Tran Tinh Hien2, Tran Tinh Hien3, François Nosten2, François Nosten1, Arjen M. Dondorp1, Arjen M. Dondorp2, Nicholas J. White2, Nicholas J. White1 
TL;DR: The results suggest that, if used as part of a comprehensive, well-organised, and well-resourced elimination programme, dihydroartemisinin-piperaquine MDA can be a useful additional tool to accelerate malaria elimination.
Abstract: BACKGROUND: The emergence and spread of multidrug-resistant Plasmodium falciparum in the Greater Mekong Subregion (GMS) threatens global malaria elimination efforts. Mass drug administration (MDA), the presumptive antimalarial treatment of an entire population to clear the subclinical parasite reservoir, is a strategy to accelerate malaria elimination. We report a cluster randomised trial to assess the effectiveness of dihydroartemisinin-piperaquine (DP) MDA in reducing falciparum malaria incidence and prevalence in 16 remote village populations in Myanmar, Vietnam, Cambodia, and the Lao People's Democratic Republic, where artemisinin resistance is prevalent. METHODS AND FINDINGS: After establishing vector control and community-based case management and following intensive community engagement, we used restricted randomisation within village pairs to select 8 villages to receive early DP MDA and 8 villages as controls for 12 months, after which the control villages received deferred DP MDA. The MDA comprised 3 monthly rounds of 3 daily doses of DP and, except in Cambodia, a single low dose of primaquine. We conducted exhaustive cross-sectional surveys of the entire population of each village at quarterly intervals using ultrasensitive quantitative PCR to detect Plasmodium infections. The study was conducted between May 2013 and July 2017. The investigators randomised 16 villages that had a total of 8,445 residents at the start of the study. Of these 8,445 residents, 4,135 (49%) residents living in 8 villages, plus an additional 288 newcomers to the villages, were randomised to receive early MDA; 3,790 out of the 4,423 (86%) participated in at least 1 MDA round, and 2,520 out of the 4,423 (57%) participated in all 3 rounds. The primary outcome, P. falciparum prevalence by month 3 (M3), fell by 92% (from 5.1% [171/3,340] to 0.4% [12/2,828]) in early MDA villages and by 29% (from 7.2% [246/3,405] to 5.1% [155/3,057]) in control villages. Over the following 9 months, the P. falciparum prevalence increased to 3.3% (96/2,881) in early MDA villages and to 6.1% (128/2,101) in control villages (adjusted incidence rate ratio 0.41 [95% CI 0.20 to 0.84]; p = 0.015). Individual protection was proportional to the number of completed MDA rounds. Of 221 participants with subclinical P. falciparum infections who participated in MDA and could be followed up, 207 (94%) cleared their infections, including 9 of 10 with artemisinin- and piperaquine-resistant infections. The DP MDAs were well tolerated; 6 severe adverse events were detected during the follow-up period, but none was attributable to the intervention. CONCLUSIONS: Added to community-based basic malaria control measures, 3 monthly rounds of DP MDA reduced the incidence and prevalence of falciparum malaria over a 1-year period in areas affected by artemisinin resistance. P. falciparum infections returned during the follow-up period as the remaining infections spread and malaria was reintroduced from surrounding areas. Limitations of this study include a relatively small sample of villages, heterogeneity between villages, and mobility of villagers that may have limited the impact of the intervention. These results suggest that, if used as part of a comprehensive, well-organised, and well-resourced elimination programme, DP MDA can be a useful additional tool to accelerate malaria elimination. TRIAL REGISTRATION: ClinicalTrials.gov NCT01872702.
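
The headline prevalence changes quoted above can be checked directly from the reported counts; the short calculation below does so (small differences from the percentages in the abstract reflect rounding). It is a reader's check, not the trial's analysis code, and it does not reproduce the adjusted incidence rate ratio.

```python
# Reader's check of the prevalence figures quoted in the abstract.
def prevalence(positives, tested):
    return positives / tested

def percent_fall(p_before, p_after):
    return 100 * (1 - p_after / p_before)

# Early-MDA villages: baseline vs. month 3
p0, p3 = prevalence(171, 3340), prevalence(12, 2828)
print(f"Early MDA: {100*p0:.1f}% -> {100*p3:.1f}%, fall = {percent_fall(p0, p3):.1f}%")   # ~92%

# Control villages over the same period
c0, c3 = prevalence(246, 3405), prevalence(155, 3057)
print(f"Control:   {100*c0:.1f}% -> {100*c3:.1f}%, fall = {percent_fall(c0, c3):.1f}%")   # ~30% (29% when computed from rounded prevalences)
```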

Journal ArticleDOI
TL;DR: People with screen-detected atrial fibrillation are at elevated calculated stroke risk: above age 65, the majority have a Class-1 OAC recommendation for stroke prevention, and >70% have ≥1 additional stroke risk factor other than age/sex.

Journal ArticleDOI
TL;DR: The review concludes that correctional facilities should scale up OAT among incarcerated persons with OUD; participants who received MMT or BPN/NLX while incarcerated had fewer nonfatal overdoses and lower mortality.
Abstract: Background Worldwide, opioid-related overdose has become a major public health crisis. People with opioid use disorder (OUD) are overrepresented in the criminal justice system and at higher risk for opioid-related mortality. However, correctional facilities frequently adopt an abstinence-only approach, seldom offering the gold-standard opioid agonist treatment (OAT) to incarcerated persons with OUD. In an attempt to inform adequate management of OUD among incarcerated persons, we conducted a systematic review of opioid-related interventions delivered before, during, and after incarceration. Methods and findings We systematically reviewed 8 electronic databases for original, peer-reviewed literature published between January 2008 and October 2019. Our review included studies conducted among adult participants with OUD who were incarcerated or recently released into the community (≤90 days post-incarceration). The search identified 2,356 articles, 46 of which met the inclusion criteria based on assessments by 2 independent reviewers. Thirty studies were conducted in North America, 9 in Europe, and 7 in Asia/Oceania. The systematic review included 22 randomized controlled trials (RCTs), 3 non-randomized clinical trials, and 21 observational studies. Eight observational studies utilized administrative data and included large sample sizes (median of 10,419 [range 2,273–131,472] participants), and 13 observational studies utilized primary data, with a median of 140 (range 27–960) participants. RCTs and non-randomized clinical trials included a median of 198 (range 15–1,557) and 44 (range 27–382) participants, respectively. Twelve studies included only men, 1 study included only women, and in the remaining 33 studies, the percentage of women was below 30%. The majority of study participants were middle-aged adults (36–55 years). Participants treated at a correctional facility with methadone maintenance treatment (MMT) or buprenorphine (BPN)/naloxone (NLX) had lower rates of illicit opioid use, had higher adherence to OUD treatment, were less likely to be re-incarcerated, and were more likely to be working 1 year post-incarceration. Participants who received MMT or BPN/NLX while incarcerated had fewer nonfatal overdoses and lower mortality. The main limitation of our systematic review is the high heterogeneity of studies (different designs, settings, populations, treatments, and outcomes), precluding a meta-analysis. Other study limitations include insufficient data about incarcerated women with OUD and the lack of information about incarcerated populations with OUD who are not included in published research. Conclusions In this carefully conducted systematic review, we conclude that correctional facilities should scale up OAT among incarcerated persons with OUD. The strategy is likely to decrease opioid-related overdose and mortality, reduce opioid use and other risky behaviors during and after incarceration, and improve retention in addiction treatment after prison release. Immediate OAT after prison release and additional preventive strategies such as the distribution of NLX kits to at-risk individuals upon release greatly decrease the occurrence of opioid-related overdose and mortality. In an effort to mitigate the impact of the opioid-related overdose crisis, it is crucial to scale up OAT and opioid-related overdose prevention strategies (e.g., NLX) within a continuum of treatment before, during, and after incarceration.

Journal ArticleDOI
TL;DR: Evidence from a series of causal inference approaches using genetics does not support a causal effect of SU level on eGFR level or CKD risk, and reducing SU levels is unlikely to reduce the risk of CKD development.
Abstract: Background Studies have shown strong positive associations between serum urate (SU) levels and chronic kidney disease (CKD) risk; however, whether the relation is causal remains uncertain. We evaluate whether genetic data are consistent with a causal impact of SU level on the risk of CKD and estimated glomerular filtration rate (eGFR). Methods and findings We used Mendelian randomization (MR) methods to evaluate the presence of a causal effect. We used aggregated genome-wide association data (N = 110,347 for SU, N = 69,374 for gout, N = 133,413 for eGFR, N = 117,165 for CKD), electronic-medical-record-linked UK Biobank data (N = 335,212), and population-based cohorts (N = 13,425), all in individuals of European ancestry, for SU levels and CKD. Our MR analysis showed no causal effect of SU on either eGFR level or CKD risk across all MR analyses (all P > 0.05). These null associations contrasted with our epidemiological association findings from the 4 population-based cohorts (change in eGFR level per 1-mg/dl [59.48 μmol/l] increase in SU: −1.99 ml/min/1.73 m2; 95% CI −2.86 to −1.11; P = 8.08 × 10−6; odds ratio [OR] for CKD: 1.48; 95% CI 1.32 to 1.65; P = 1.52 × 10−11). In contrast, the same MR approaches showed that SU has a causal effect on the risk of gout (OR estimates ranging from 3.41 to 6.04 per 1-mg/dl increase in SU, all statistically significant). Our MR analysis had >99% power to detect a causal effect of SU level on the risk of CKD of the same magnitude as the observed epidemiological association between SU and CKD. Limitations of this study include the lifelong effect of a genetic perturbation not being the same as an acute perturbation, the inability to study non-European populations, and some sample overlap between the datasets used in the study. Conclusions Evidence from our series of causal inference approaches using genetics does not support a causal effect of SU level on eGFR level or CKD risk. Reducing SU levels is unlikely to reduce the risk of CKD development.
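
For readers unfamiliar with MR estimators, the sketch below implements the standard inverse-variance weighted (IVW) estimator from per-SNP summary statistics. It is a generic illustration with hypothetical inputs, one common approach of the kind used in such analyses, not the authors' pipeline; in practice, sensitivity estimators (e.g., MR-Egger, weighted median) are usually reported alongside IVW.

```python
# Generic IVW Mendelian randomization sketch; inputs are hypothetical per-SNP summary statistics.
import numpy as np

def ivw_mr(beta_exposure, beta_outcome, se_outcome):
    """
    beta_exposure: per-SNP effects on the exposure (e.g., serum urate)
    beta_outcome:  per-SNP effects on the outcome (e.g., eGFR or CKD)
    se_outcome:    standard errors of the outcome effects
    Returns the fixed-effect IVW causal estimate and its standard error.
    """
    beta_exposure = np.asarray(beta_exposure, dtype=float)
    beta_outcome = np.asarray(beta_outcome, dtype=float)
    se_outcome = np.asarray(se_outcome, dtype=float)

    wald_ratios = beta_outcome / beta_exposure       # per-SNP causal estimates
    se_ratios = se_outcome / np.abs(beta_exposure)   # first-order standard errors
    weights = 1.0 / se_ratios**2

    estimate = np.sum(weights * wald_ratios) / np.sum(weights)
    se = np.sqrt(1.0 / np.sum(weights))
    return estimate, se
```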

Journal ArticleDOI
TL;DR: The PrEP Implementation in Young Women and Adolescents (PrIYA) program is a real-world implementation program to demonstrate integration of PrEP delivery for at-risk AGYW in FP clinics in Kisumu, Kenya to demonstrate the feasibility of integrating PrEP Delivery within routine family planning clinics.
Abstract: Background Young women account for a disproportionate fraction of new HIV infections in Africa and are a priority population for HIV prevention, including implementation of preexposure prophylaxis (PrEP). The overarching goal of this project was to demonstrate the feasibility of integrating PrEP delivery within routine family planning (FP) clinics to serve as a platform to efficiently reach at-risk adolescent girls and young women (AGYW) for PrEP in HIV high-burden settings. Methods and findings The PrEP Implementation in Young Women and Adolescents (PrIYA) program is a real-world implementation program to demonstrate integration of PrEP delivery for at-risk AGYW in FP clinics in Kisumu, Kenya. Between November 2017 and June 2018, women aged 15 to 45 from the general population seeking FP services at 8 public health clinics were universally screened for HIV behavioral risk factors and offered PrEP following national PrEP guidelines. We evaluated PrEP uptake and continuation, and robust Poisson regression methods were used to identify correlates of uptake and early continuation of PrEP, with age included as a one-knot linear spline. Overall, 1,271 HIV-uninfected women accessing routine FP clinics were screened for PrEP; the median age was 25 years (interquartile range [IQR]: 22–29), 627 (49%) were <24 years old, 1,026 (82%) were married, more than one-third (34%) had partners of unknown HIV status, and the vast majority (n = 1,200 [94%]) reported recent condom-less sex. Of 1,271 women screened, 278 (22%) initiated PrEP, and 114 (41%) returned for at least one refill visit after initiation. PrEP uptake was independently associated with reported male-partner HIV status (HIV-positive 94%, unknown 35%, HIV-negative 8%; p < 0.001) and marital status (28% unmarried versus married 21%; p = 0.04), and a higher proportion of women ≥24 years (26%; 191/740) initiated PrEP compared to 16% (87/531) of young women <24 years (p < 0.001). There was a moderate and statistically non-significant unadjusted increase in PrEP uptake among women using oral contraception pills (OCPs) compared to women using injectable or long-acting reversible contraception methods (OCP 28% versus injectable/implants/intrauterine devices [IUDs] 18%; p = 0.06). Among women with at least one post-PrEP initiation follow-up visit (n = 278), no HIV infection was documented during the project period. Overall, continuation of PrEP use at 1, 3, and 6 months post initiation was 41%, 24%, and 15%, respectively. The likelihood for early continuation of PrEP use (i.e., return for at least one PrEP refill within 45 days post initiation) was strongly associated with reported male-partner HIV status (HIV-positive 67%, -negative 39%, unknown 31%; overall effect p = 0.001), and a higher proportion of women ≥24 years old continued PrEP at 1 month compared with young women <24 years old (47% versus 29%; p = 0.002). For women ≥24 years old, the likelihood to continue PrEP use at 1 month post initiation increased by 3% for each additional year of a woman’s age (adjusted prevalence ratio [PR]: 1.03; 95% confidence interval [CI]: 1.01–1.05; p = 0.01). In contrast, for women <24 years old, the likelihood of continuing PrEP for each additional year of a woman’s age was high in magnitude (approximately 6%) but statistically non-significant (adjusted PR: 1.06; 95% CI: 0.97–1.16; p = 0.18). 
Frequently reported reasons for discontinuing PrEP were low perceived risk of HIV (25%), knowledge that partner was HIV negative (24%), experiencing side effects (20%), and pill burden (17%). Study limitations include lack of qualitative work to provide insights into women’s decision-making on PrEP uptake and continuation, the small number of measured covariates imposed by the program data, and a nonrandomized design limiting definitive ascertainment of the robustness of a PrEP-dedicated nurse-led implementation strategy. Conclusions In this real-world PrEP implementation program in Kenya, integration of universal screening and counseling for PrEP in FP clinics was feasible, making this platform a potential “one-stop” location for FP and PrEP. There was a high drop-off in PrEP continuation, but a subset of women continued PrEP use at least through 1 month, possibly indicating further reflection or decision-making on PrEP use. Greater efforts to support PrEP normalization and persistence for African women are needed to help women navigate their decisions about HIV prevention preferences as their reproductive goals and HIV vulnerability evolve.
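
A minimal sketch of the modelling approach named above (robust "modified" Poisson regression for prevalence ratios, with age entered as a one-knot linear spline at 24 years) follows. Column names and covariates are hypothetical, and this is not the program's analysis code.

```python
# Illustrative modified Poisson regression with robust standard errors and a one-knot
# linear spline for age; column names (prep_initiated, age, partner_hiv_status, married)
# are assumptions for the sketch.
import numpy as np
import statsmodels.api as sm
import statsmodels.formula.api as smf

KNOT = 24  # years

def fit_uptake_model(df):
    df = df.copy()
    df["age_below_knot"] = np.minimum(df["age"], KNOT)        # slope for women < 24
    df["age_above_knot"] = np.maximum(df["age"] - KNOT, 0)    # slope for women >= 24
    model = smf.glm(
        "prep_initiated ~ age_below_knot + age_above_knot + C(partner_hiv_status) + C(married)",
        data=df,
        family=sm.families.Poisson(),
    )
    res = model.fit(cov_type="HC0")   # robust (sandwich) variance for prevalence ratios
    return np.exp(res.params)         # adjusted prevalence ratios
```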

Journal ArticleDOI
TL;DR: It is estimated that current concentrations of PM2.5 are associated with mortality impacts and loss of life expectancy, with larger impacts in counties with lower income and higher poverty rate than in wealthier counties.
Abstract: Background Exposure to fine particulate matter pollution (PM2.5) is hazardous to health. Our aim was to directly estimate the health and longevity impacts of current PM2.5 concentrations and the benefits of reductions from 1999 to 2015, nationally and at the county level, for the entire contemporary population of the contiguous United States. Methods and findings We used vital registration and population data with information on sex, age, cause of death, and county of residence. We used four Bayesian spatiotemporal models, with different adjustments for other determinants of mortality, to directly estimate mortality and life expectancy loss due to current PM2.5 pollution and the benefits of reductions since 1999, nationally and by county. The covariates included in the adjusted models were per capita income; percentage of population whose family income is below the poverty threshold, who are of Black or African American race, who have graduated from high school, who live in urban areas, and who are unemployed; cumulative smoking; and mean temperature and relative humidity. In the main model, which adjusted for these covariates and for unobserved county characteristics through the use of county-specific random intercepts, PM2.5 pollution in excess of the lowest observed concentration (2.8 μg/m3) was responsible for an estimated 15,612 deaths (95% credible interval 13,248–17,945) in females and 14,757 deaths (12,617–16,919) in males. These deaths would lower national life expectancy by an estimated 0.15 years (0.13–0.17) for women and 0.13 years (0.11–0.15) for men. The life expectancy loss due to PM2.5 was largest around Los Angeles and in some southern states such as Arkansas, Oklahoma, and Alabama. At any PM2.5 concentration, life expectancy loss was, on average, larger in counties with lower income and higher poverty rates than in wealthier counties. Reductions in PM2.5 since 1999 have lowered mortality in all but 14 counties, where PM2.5 increased slightly. The main limitation of our study, as with other observational studies, is that the observed associations cannot be guaranteed to be causal. We did not have annual county-level data on other important determinants of mortality, such as healthcare access and quality, and diet, but these factors were adjusted for through the use of county-specific random intercepts. Conclusions According to our estimates, recent reductions in particulate matter pollution in the USA have resulted in public health benefits. Nonetheless, we estimate that current concentrations are associated with mortality impacts and loss of life expectancy, with larger impacts in counties with lower income and higher poverty rates.
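
The study's estimates come from Bayesian spatiotemporal models; as background only, the sketch below shows the standard log-linear attributable-mortality arithmetic that such estimates build on. All inputs, including the concentration-response coefficient, are hypothetical placeholders rather than values from the study.

```python
# Illustrative attributable-mortality arithmetic under a log-linear concentration-response
# function; the beta and death counts are made-up placeholders, not study results.
import numpy as np

def attributable_deaths(observed_deaths, pm25, counterfactual_pm25, beta):
    """
    observed_deaths:      deaths in a stratum (e.g., county-age group)
    pm25:                 observed PM2.5 concentration (ug/m3)
    counterfactual_pm25:  reference concentration (e.g., the lowest observed, 2.8 ug/m3)
    beta:                 log mortality rate ratio per 1 ug/m3 of PM2.5
    """
    excess = np.maximum(pm25 - counterfactual_pm25, 0.0)
    attributable_fraction = 1.0 - np.exp(-beta * excess)
    return observed_deaths * attributable_fraction

# Example with made-up numbers: 1,000 deaths, 10 ug/m3 vs. a 2.8 ug/m3 floor,
# and a hypothetical rate ratio of ~1.06 per 10 ug/m3 (beta = ln(1.06)/10).
print(attributable_deaths(1000, 10.0, 2.8, np.log(1.06) / 10))
```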

Journal ArticleDOI
TL;DR: The finding of no alternative causes for geographic differences in microcephaly rate leads us to hypothesize that the Northeast region was disproportionately affected by this Zika outbreak, with 94% of an estimated 8.5 million total cases occurring in this region, suggesting a need for seroprevalence surveys to determine the underlying reason.
Abstract: Background In 2015, high rates of microcephaly were reported in Northeast Brazil following the first South American Zika virus (ZIKV) outbreak. Reported microcephaly rates in other Zika-affected areas were significantly lower, suggesting alternate causes or the involvement of arboviral cofactors in exacerbating microcephaly rates. Methods and findings We merged data from multiple national reporting databases in Brazil to estimate exposure to 9 known or hypothesized causes of microcephaly for every pregnancy nationwide since the beginning of the ZIKV outbreak; this generated between 3.6 and 5.4 million cases (depending on analysis) over the time period 1 January 2015–23 May 2017. The association between ZIKV and microcephaly was statistically tested against models with alternative causes or with effect modifiers. We found no evidence for alternative non-ZIKV causes of the 2015–2017 microcephaly outbreak, nor that concurrent exposure to arbovirus infection or vaccination modified risk. We estimate an absolute risk of microcephaly of 40.8 (95% CI 34.2–49.3) per 10,000 births and a relative risk of 16.8 (95% CI 3.2–369.1) given ZIKV infection in the first or second trimester of pregnancy; however, because ZIKV infection rates were highly variable, most pregnant women in Brazil during the ZIKV outbreak will have been subject to lower risk levels. Statistically significant associations of ZIKV with other birth defects were also detected, but at lower relative risks than that of microcephaly (relative risk < 1.5). Our analysis was limited by missing data prior to the establishment of nationwide ZIKV surveillance, and its findings may be affected by unmeasured confounding causes of microcephaly not available in routinely collected surveillance data. Conclusions This study strengthens the evidence that congenital ZIKV infection, particularly in the first 2 trimesters of pregnancy, is associated with microcephaly and less frequently with other birth defects. The finding of no alternative causes for geographic differences in microcephaly rate leads us to hypothesize that the Northeast region was disproportionately affected by this Zika outbreak, with 94% of an estimated 8.5 million total cases occurring in this region, suggesting a need for seroprevalence surveys to determine the underlying reason.
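
The absolute and relative risks quoted above are model-based estimates. Purely as an illustration of the underlying arithmetic, the sketch below computes an absolute risk per 10,000 births and a relative risk with a standard log-RR confidence interval from hypothetical counts; it does not use the study's data.

```python
# Illustrative absolute-risk and relative-risk calculations from hypothetical 2x2 counts.
import numpy as np

def absolute_risk_per_10000(cases, births):
    return 1e4 * cases / births

def relative_risk(cases_exposed, n_exposed, cases_unexposed, n_unexposed, z=1.96):
    r1 = cases_exposed / n_exposed
    r0 = cases_unexposed / n_unexposed
    rr = r1 / r0
    # Standard large-sample confidence interval on the log relative risk
    se_log_rr = np.sqrt(1/cases_exposed - 1/n_exposed + 1/cases_unexposed - 1/n_unexposed)
    ci = np.exp(np.log(rr) + np.array([-z, z]) * se_log_rr)
    return rr, ci

rr, (lo, hi) = relative_risk(cases_exposed=40, n_exposed=10_000,
                             cases_unexposed=5, n_unexposed=20_000)
print(f"RR = {rr:.1f} (95% CI {lo:.1f}-{hi:.1f})")
print(f"Absolute risk = {absolute_risk_per_10000(40, 10_000):.1f} per 10,000 births")
```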

Journal ArticleDOI
TL;DR: SGLT2is reduced the odds of suffering AKI with and without hospitalization in randomized trials and the real-world setting, despite the fact that more AEs related to hypovolemia are reported.
Abstract: Background Sodium-glucose cotransporter-2 inhibitors (SGLT2is) represent a new class of oral hypoglycemic agents used in the treatment of type 2 diabetes mellitus. They have a positive effect on the progression of chronic kidney disease, but there is a concern that they might cause acute kidney injury (AKI). Methods and findings We conducted a systematic review and meta-analysis of the effect of SGLT2is on renal adverse events (AEs) in randomized controlled trials and controlled observational studies. PubMed, EMBASE, Cochrane library, and ClinicalTrials.gov were searched without date restriction until 27 September 2019. Data extraction was performed using a standardized data form, and any discrepancies were resolved by consensus. One hundred and twelve randomized trials (n = 96,722) and 4 observational studies with 5 cohorts (n = 83,934) with a minimum follow-up of 12 weeks that provided information on at least 1 adverse renal outcome (AKI, combined renal AE, or hypovolemia-related events) were included. In 30 trials, 410 serious AEs due to AKI were reported. SGLT2is reduced the odds of suffering AKI by 36% (odds ratio [OR] 0.64 [95% confidence interval (CI) 0.53–0.78], p < 0.001). A total of 1,089 AKI events of any severity (AEs and serious AEs [SAEs]) were published in 41 trials (OR 0.75 [95% CI 0.66–0.84], p < 0.001). Empagliflozin, dapagliflozin, and canagliflozin had a comparable benefit on the SAE and AE rate. AEs related to hypovolemia were more commonly reported in SGLT2i-treated patients (OR 1.20 [95% CI 1.10–1.31], p < 0.001). In the observational studies, 777 AKI events were reported. The odds of suffering AKI were reduced in patients receiving SGLT2is (OR 0.40 [95% CI 0.33–0.48], p < 0.001). Limitations of this study are the reliance on nonadjudicated safety endpoints, discrepant inclusion criteria and baseline hypoglycemic therapy between studies, inconsistent definitions of renal AEs and hypovolemia, varying follow-up times in different studies, and a lack of information on the severity of AKI (stages I–III). Conclusions SGLT2is reduced the odds of suffering AKI with and without hospitalization in randomized trials and the real-world setting, despite the fact that more AEs related to hypovolemia are reported.
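
Pooled odds ratios such as those reported above are typically obtained with random-effects meta-analysis. The sketch below implements the generic DerSimonian-Laird estimator from study-level ORs and 95% CIs; the inputs are hypothetical, the paper's exact pooling method may differ, and this is not the authors' code.

```python
# Generic DerSimonian-Laird random-effects pooling of odds ratios from study-level summaries.
import numpy as np

def pooled_or_random_effects(ors, ci_lows, ci_highs, z=1.96):
    y = np.log(np.asarray(ors, dtype=float))                       # per-study log-ORs
    se = (np.log(np.asarray(ci_highs, dtype=float))
          - np.log(np.asarray(ci_lows, dtype=float))) / (2 * z)    # SEs recovered from the CIs
    w_fixed = 1.0 / se**2

    # DerSimonian-Laird estimate of the between-study variance (tau^2)
    y_fixed = np.sum(w_fixed * y) / np.sum(w_fixed)
    q = np.sum(w_fixed * (y - y_fixed) ** 2)
    dof = len(y) - 1
    c = np.sum(w_fixed) - np.sum(w_fixed**2) / np.sum(w_fixed)
    tau2 = max(0.0, (q - dof) / c)

    # Random-effects pooling
    w_random = 1.0 / (se**2 + tau2)
    y_pooled = np.sum(w_random * y) / np.sum(w_random)
    se_pooled = np.sqrt(1.0 / np.sum(w_random))
    ci = np.exp(y_pooled + np.array([-z, z]) * se_pooled)
    return np.exp(y_pooled), ci
```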