
Showing papers from Boston University published in 2016


Journal ArticleDOI
Daniel J. Klionsky, Kotb Abdelmohsen, Akihisa Abe, Joynal Abedin, +2,519 more authors (695 institutions)
TL;DR: In this paper, the authors present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes.
Abstract: In 2008 we published the first set of guidelines for standardizing research in autophagy. Since then, research on this topic has continued to accelerate, and many new scientists have entered the field. Our knowledge base and relevant new technologies have also been expanding. Accordingly, it is important to update these guidelines for monitoring autophagy in different organisms. Various reviews have described the range of assays that have been used for this purpose. Nevertheless, there continues to be confusion regarding acceptable methods to measure autophagy, especially in multicellular eukaryotes. For example, a key point that needs to be emphasized is that there is a difference between measurements that monitor the numbers or volume of autophagic elements (e.g., autophagosomes or autolysosomes) at any stage of the autophagic process versus those that measure flux through the autophagy pathway (i.e., the complete process including the amount and rate of cargo sequestered and degraded). In particular, a block in macroautophagy that results in autophagosome accumulation must be differentiated from stimuli that increase autophagic activity, defined as increased autophagy induction coupled with increased delivery to, and degradation within, lysosomes (in most higher eukaryotes and some protists such as Dictyostelium) or the vacuole (in plants and fungi). In other words, it is especially important that investigators new to the field understand that the appearance of more autophagosomes does not necessarily equate with more autophagy. In fact, in many cases, autophagosomes accumulate because of a block in trafficking to lysosomes without a concomitant change in autophagosome biogenesis, whereas an increase in autolysosomes may reflect a reduction in degradative activity. It is worth emphasizing here that lysosomal digestion is a stage of autophagy and evaluating its competence is a crucial part of the evaluation of autophagic flux, or complete autophagy. 
Here, we present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes. These guidelines are not meant to be a formulaic set of rules, because the appropriate assays depend in part on the question being asked and the system being used. In addition, we emphasize that no individual assay is guaranteed to be the most appropriate one in every situation, and we strongly recommend the use of multiple assays to monitor autophagy. Along these lines, because of the potential for pleiotropic effects due to blocking autophagy through genetic manipulation, it is imperative to target by gene knockout or RNA interference more than one autophagy-related protein. In addition, some individual Atg proteins, or groups of proteins, are involved in other cellular pathways implying that not all Atg proteins can be used as a specific marker for an autophagic process. In these guidelines, we consider these various methods of assessing autophagy and what information can, or cannot, be obtained from them. Finally, by discussing the merits and limits of particular assays, we hope to encourage technical innovation in the field.

5,187 citations


Journal ArticleDOI
Theo Vos, Christine Allen, Megha Arora, Ryan M Barber, +696 more authors (260 institutions)
TL;DR: The Global Burden of Diseases, Injuries, and Risk Factors Study 2015 (GBD 2015) as discussed by the authors was used to estimate the incidence, prevalence, and years lived with disability for diseases and injuries at the global, regional, and national scale over the period of 1990 to 2015.

5,050 citations


Journal ArticleDOI
Haidong Wang, Mohsen Naghavi, Christine Allen, Ryan M Barber, +841 more authors (293 institutions)
TL;DR: The Global Burden of Disease 2015 Study provides a comprehensive assessment of all-cause and cause-specific mortality for 249 causes in 195 countries and territories from 1980 to 2015, finding several countries in sub-Saharan Africa had very large gains in life expectancy, rebounding from an era of exceedingly high loss of life due to HIV/AIDS.

4,804 citations


Journal ArticleDOI
TL;DR: These recommendations address the best approaches for antibiotic stewardship programs to influence the optimal use of antibiotics.
Abstract: Evidence-based guidelines for implementation and measurement of antibiotic stewardship interventions in inpatient populations including long-term care were prepared by a multidisciplinary expert panel of the Infectious Diseases Society of America and the Society for Healthcare Epidemiology of America. The panel included clinicians and investigators representing internal medicine, emergency medicine, microbiology, critical care, surgery, epidemiology, pharmacy, and adult and pediatric infectious diseases specialties. These recommendations address the best approaches for antibiotic stewardship programs to influence the optimal use of antibiotics.

1,969 citations


Journal ArticleDOI
TL;DR: The eigenstate thermalization hypothesis (ETH) as discussed by the authors is a natural extension of quantum chaos and random matrix theory (RMT) that allows one to describe thermalization in isolated chaotic systems without invoking the notion of an external bath.
Abstract: This review gives a pedagogical introduction to the eigenstate thermalization hypothesis (ETH), its basis, and its implications to statistical mechanics and thermodynamics. In the first part, ETH is introduced as a natural extension of ideas from quantum chaos and random matrix theory (RMT). To this end, we present a brief overview of classical and quantum chaos, as well as RMT and some of its most important predictions. The latter include the statistics of energy levels, eigenstate components, and matrix elements of observables. Building on these, we introduce the ETH and show that it allows one to describe thermalization in isolated chaotic systems without invoking the notion of an external bath. We examine numerical evidence of eigenstate thermalization from studies of many-body lattice systems. We also introduce the concept of a quench as a means of taking isolated systems out of equilibrium, and discuss results of numerical experiments on quantum quenches. The second part of the review explores the i...
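For readers new to the topic, the central formula this review builds on, the ETH ansatz for matrix elements of an observable $\hat{O}$ in the energy eigenbasis, is conventionally written (in the standard form used in this literature) as:

```latex
O_{mn} \;=\; O(\bar{E})\,\delta_{mn}
\;+\; e^{-S(\bar{E})/2}\, f_O(\bar{E},\omega)\, R_{mn},
\qquad
\bar{E} \equiv \frac{E_m + E_n}{2},
\quad
\omega \equiv E_n - E_m ,
```

where $O(\bar{E})$ is a smooth function equal to the microcanonical expectation value at energy $\bar{E}$, $S(\bar{E})$ is the thermodynamic entropy, $f_O$ is a smooth function of its arguments, and $R_{mn}$ is a random variable with zero mean and unit variance. The exponentially small off-diagonal term is what lets isolated eigenstates reproduce thermal expectation values without an external bath.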

1,536 citations


Journal ArticleDOI
TL;DR: In this paper, the authors used three long-term satellite leaf area index (LAI) records and ten global ecosystem models to investigate four key drivers of LAI trends during 1982-2009.
Abstract: Global environmental change is rapidly altering the dynamics of terrestrial vegetation, with consequences for the functioning of the Earth system and provision of ecosystem services(1,2). Yet how global vegetation is responding to the changing environment is not well established. Here we use three long-term satellite leaf area index (LAI) records and ten global ecosystem models to investigate four key drivers of LAI trends during 1982-2009. We show a persistent and widespread increase of growing season integrated LAI (greening) over 25% to 50% of the global vegetated area, whereas less than 4% of the globe shows decreasing LAI (browning). Factorial simulations with multiple global ecosystem models suggest that CO2 fertilization effects explain 70% of the observed greening trend, followed by nitrogen deposition (9%), climate change (8%) and land cover change (LCC) (4%). CO2 fertilization effects explain most of the greening trends in the tropics, whereas climate change resulted in greening of the high latitudes and the Tibetan Plateau. LCC contributed most to the regional greening observed in southeast China and the eastern United States. The regional effects of unexplained factors suggest that the next generation of ecosystem models will need to explore the impacts of forest demography, differences in regional management intensities for cropland and pastures, and other emerging productivity constraints such as phosphorus availability.

1,534 citations


Journal ArticleDOI
Nicholas J Kassebaum, Megha Arora, Ryan M Barber, Zulfiqar A Bhutta, +679 more authors (268 institutions)
TL;DR: In this paper, the authors used the Global Burden of Diseases, Injuries, and Risk Factors Study 2015 (GBD 2015) for all-cause mortality, cause-specific mortality, and non-fatal disease burden to derive HALE and DALYs by sex for 195 countries and territories from 1990 to 2015.

1,533 citations


Journal ArticleDOI
TL;DR: In this article, the authors study the impact of the short-term accommodation market on the hotel industry and find that the impact is non-uniformly distributed, with lower-priced hotels and those hotels not catering to business travelers being the most affected.
Abstract: Peer-to-peer markets, collectively known as the sharing economy, have emerged as alternative suppliers of goods and services traditionally provided by long-established industries. A central question regards the impact of these sharing economy platforms on incumbent firms. We study the case of Airbnb, specifically analyzing Airbnb’s entry into the short-term accommodation market in Texas and its impact on the incumbent hotel industry. We first explore Airbnb’s impact on hotel revenue, by using a difference-in-differences empirical strategy that exploits the significant spatiotemporal variation in the patterns of Airbnb adoption across city-level markets. We estimate that in Austin, where Airbnb supply is highest, the causal impact on hotel revenue is in the 8-10% range; moreover, the impact is non-uniformly distributed, with lower-priced hotels and those hotels not catering to business travelers being the most affected. We find that this impact materializes through less aggressive hotel room pricing, an impact that benefits all consumers, not just participants in the sharing economy. The impact on hotel prices is especially pronounced during periods of peak demand, such as SXSW. We find that by enabling supply to scale – a differentiating feature of peer-to-peer platforms – Airbnb has significantly crimped hotels’ ability to raise prices during periods of peak demand. Our work provides empirical evidence that the sharing economy is making inroads by successfully competing with, differentiating from, and acquiring market share from incumbent firms.
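The difference-in-differences strategy the abstract describes compares the pre/post change in outcomes for markets with high Airbnb adoption against the change in markets with little adoption. As a toy illustration (hypothetical data, not the paper's regression, which also includes controls and fixed effects), the canonical 2x2 estimator can be sketched as:

```python
import numpy as np

def diff_in_diff(treated_pre, treated_post, control_pre, control_post):
    """Canonical 2x2 difference-in-differences estimate:
    (change in treated group) minus (change in control group).
    Each argument is an iterable of outcome observations."""
    treated_change = np.mean(treated_post) - np.mean(treated_pre)
    control_change = np.mean(control_post) - np.mean(control_pre)
    return treated_change - control_change

# Hypothetical revenues: treated market falls while control rises,
# so the estimated treatment effect is negative.
effect = diff_in_diff([10, 12], [9, 11], [10, 10], [11, 11])
```

Here `effect` is -2.0: the treated group's change (-1) minus the control group's change (+1), which is the sense in which the design nets out market-wide trends.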

1,519 citations


Posted Content
TL;DR: This paper extends CORAL to learn a nonlinear transformation that aligns correlations of layer activations in deep neural networks (Deep CORAL), and shows state-of-the-art performance on standard benchmark datasets.
Abstract: Deep neural networks are able to learn powerful representations from large quantities of labeled input data, however they cannot always generalize well across changes in input distributions. Domain adaptation algorithms have been proposed to compensate for the degradation in performance due to domain shift. In this paper, we address the case when the target domain is unlabeled, requiring unsupervised adaptation. CORAL is a "frustratingly easy" unsupervised domain adaptation method that aligns the second-order statistics of the source and target distributions with a linear transformation. Here, we extend CORAL to learn a nonlinear transformation that aligns correlations of layer activations in deep neural networks (Deep CORAL). Experiments on standard benchmark datasets show state-of-the-art performance.
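The alignment of second-order statistics described above can be made concrete. A minimal numpy sketch of the (Deep) CORAL loss, assuming the 1/(4d²)-scaled squared Frobenius distance between feature covariances used in the paper, might look like:

```python
import numpy as np

def coral_loss(source, target):
    """CORAL loss: squared Frobenius distance between the d x d
    covariance matrices of source and target activations,
    scaled by 1 / (4 d^2)."""
    d = source.shape[1]
    # Covariances of the (n_samples, d) activation matrices.
    c_s = np.cov(source, rowvar=False)
    c_t = np.cov(target, rowvar=False)
    return np.sum((c_s - c_t) ** 2) / (4 * d ** 2)
```

In Deep CORAL this quantity is computed on layer activations with a differentiable covariance and added to the classification loss, so the network learns the nonlinear alignment end to end; the numpy version here only illustrates the statistic being matched.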

1,501 citations


Journal ArticleDOI
TL;DR: A massive quantitative analysis of Facebook shows that information related to distinct narratives––conspiracy theories and scientific news––generates homogeneous and polarized communities having similar information consumption patterns, and derives a data-driven percolation model of rumor spreading that demonstrates that homogeneity and polarization are the main determinants for predicting cascades’ size.
Abstract: The wide availability of user-provided content in online social media facilitates the aggregation of people around common interests, worldviews, and narratives. However, the World Wide Web (WWW) also allows for the rapid dissemination of unsubstantiated rumors and conspiracy theories that often elicit rapid, large, but naive social responses such as the recent case of Jade Helm 15––where a simple military exercise turned out to be perceived as the beginning of a new civil war in the United States. In this work, we address the determinants governing misinformation spreading through a thorough quantitative analysis. In particular, we focus on how Facebook users consume information related to two distinct narratives: scientific and conspiracy news. We find that, although consumers of scientific and conspiracy stories present similar consumption patterns with respect to content, cascade dynamics differ. Selective exposure to content is the primary driver of content diffusion and generates the formation of homogeneous clusters, i.e., “echo chambers.” Indeed, homogeneity appears to be the primary driver for the diffusion of contents and each echo chamber has its own cascade dynamics. Finally, we introduce a data-driven percolation model mimicking rumor spreading and we show that homogeneity and polarization are the main determinants for predicting cascades’ size.

1,457 citations


Proceedings Article
05 Dec 2016
TL;DR: The authors showed that even word embeddings trained on Google News articles exhibit female/male gender stereotypes to a disturbing extent, which raises concerns because their widespread use often tends to amplify these biases.
Abstract: The blind application of machine learning runs the risk of amplifying biases present in data. Such a danger is facing us with word embedding, a popular framework to represent text data as vectors which has been used in many machine learning and natural language processing tasks. We show that even word embeddings trained on Google News articles exhibit female/male gender stereotypes to a disturbing extent. This raises concerns because their widespread use, as we describe, often tends to amplify these biases. Geometrically, gender bias is first shown to be captured by a direction in the word embedding. Second, gender neutral words are shown to be linearly separable from gender definition words in the word embedding. Using these properties, we provide a methodology for modifying an embedding to remove gender stereotypes, such as the association between the words receptionist and female, while maintaining desired associations such as between the words queen and female. Using crowd-worker evaluation as well as standard benchmarks, we empirically demonstrate that our algorithms significantly reduce gender bias in embeddings while preserving its useful properties such as the ability to cluster related concepts and to solve analogy tasks. The resulting embeddings can be used in applications without amplifying gender bias.
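The core geometric step described above, removing the component of each gender-neutral word vector along the identified gender direction, can be sketched as follows (hypothetical `debias` helper; the paper's full method also equalizes gender-definitional word pairs, which this sketch omits):

```python
import numpy as np

def debias(word_vecs, gender_direction, neutral_words):
    """For each gender-neutral word, project out the component along
    the gender direction, then renormalize (embeddings are treated
    as unit vectors). Other words are returned unchanged."""
    g = gender_direction / np.linalg.norm(gender_direction)
    out = dict(word_vecs)
    for w in neutral_words:
        v = word_vecs[w]
        v = v - np.dot(v, g) * g        # remove the bias component
        out[w] = v / np.linalg.norm(v)  # back to unit length
    return out
```

After this step, a neutral word like "receptionist" has zero projection onto the gender direction, while words deliberately left out of `neutral_words` (e.g. "queen") keep their gender association.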

Book ChapterDOI
08 Oct 2016
TL;DR: Deep CORAL, as discussed in this paper, extends CORAL to learn a nonlinear transformation that aligns correlations of layer activations between the source and target distributions in deep neural networks.
Abstract: Deep neural networks are able to learn powerful representations from large quantities of labeled input data, however they cannot always generalize well across changes in input distributions. Domain adaptation algorithms have been proposed to compensate for the degradation in performance due to domain shift. In this paper, we address the case when the target domain is unlabeled, requiring unsupervised adaptation. CORAL [18] is a simple unsupervised domain adaptation method that aligns the second-order statistics of the source and target distributions with a linear transformation. Here, we extend CORAL to learn a nonlinear transformation that aligns correlations of layer activations in deep neural networks (Deep CORAL). Experiments on standard benchmark datasets show state-of-the-art performance. Our code is available at: https://github.com/VisionLearningGroup/CORAL.

Journal ArticleDOI
17 Mar 2016-Nature
TL;DR: These results show the first substantive post-exposure protection by a small-molecule antiviral compound against EBOV in nonhuman primates, and the broad-spectrum antiviral activity of GS-5734 in vitro against other pathogenic RNA viruses, including filoviruses, arenaviruses, and coronaviruses, suggests the potential for wider medical use.
Abstract: The most recent Ebola virus outbreak in West Africa, which was unprecedented in the number of cases and fatalities, geographic distribution, and number of nations affected, highlights the need for safe, effective, and readily available antiviral agents for treatment and prevention of acute Ebola virus (EBOV) disease (EVD) or sequelae. No antiviral therapeutics have yet received regulatory approval or demonstrated clinical efficacy. Here we report the discovery of a novel small molecule GS-5734, a monophosphoramidate prodrug of an adenosine analogue, with antiviral activity against EBOV. GS-5734 exhibits antiviral activity against multiple variants of EBOV and other filoviruses in cell-based assays. The pharmacologically active nucleoside triphosphate (NTP) is efficiently formed in multiple human cell types incubated with GS-5734 in vitro, and the NTP acts as an alternative substrate and RNA-chain terminator in primer-extension assays using a surrogate respiratory syncytial virus RNA polymerase. Intravenous administration of GS-5734 to nonhuman primates resulted in persistent NTP levels in peripheral blood mononuclear cells (half-life, 14 h) and distribution to sanctuary sites for viral replication including testes, eyes, and brain. In a rhesus monkey model of EVD, once-daily intravenous administration of 10 mg kg(-1) GS-5734 for 12 days resulted in profound suppression of EBOV replication and protected 100% of EBOV-infected animals against lethal disease, ameliorating clinical disease signs and pathophysiological markers, even when treatments were initiated three days after virus exposure when systemic viral RNA was detected in two out of six treated animals. These results show the first substantive post-exposure protection by a small-molecule antiviral compound against EBOV in nonhuman primates. 
The broad-spectrum antiviral activity of GS-5734 in vitro against other pathogenic RNA viruses, including filoviruses, arenaviruses, and coronaviruses, suggests the potential for wider medical use. GS-5734 is amenable to large-scale manufacturing, and clinical studies investigating the drug safety and pharmacokinetics are ongoing.

Journal ArticleDOI
TL;DR: Bone scintigraphy enables the diagnosis of cardiac ATTR amyloidosis to be made reliably without the need for histology in patients who do not have a monoclonal gammopathy, and proposes noninvasive diagnostic criteria that are applicable to the majority of patients with this disease.
Abstract: Background—Cardiac transthyretin (ATTR) amyloidosis is a progressive and fatal cardiomyopathy for which several promising therapies are in development. The diagnosis is frequently delayed or missed...

Journal ArticleDOI
TL;DR: The results support the hypothesis that rare coding variants can pinpoint causal genes within known genetic loci and illustrate that applying the approach systematically to detect new loci requires extremely large sample sizes.
Abstract: Advanced age-related macular degeneration (AMD) is the leading cause of blindness in the elderly, with limited therapeutic options. Here we report on a study of >12 million variants, including 163,714 directly genotyped, mostly rare, protein-altering variants. Analyzing 16,144 patients and 17,832 controls, we identify 52 independently associated common and rare variants (P < 5 × 10(-8)) distributed across 34 loci. Although wet and dry AMD subtypes exhibit predominantly shared genetics, we identify the first genetic association signal specific to wet AMD, near MMP9 (difference P value = 4.1 × 10(-10)). Very rare coding variants (frequency <0.1%) in CFH, CFI and TIMP3 suggest causal roles for these genes, as does a splice variant in SLC16A8. Our results support the hypothesis that rare coding variants can pinpoint causal genes within known genetic loci and illustrate that applying the approach systematically to detect new loci requires extremely large sample sizes.

Posted Content
TL;DR: This work empirically demonstrates that its algorithms significantly reduce gender bias in embeddings while preserving the its useful properties such as the ability to cluster related concepts and to solve analogy tasks.
Abstract: The blind application of machine learning runs the risk of amplifying biases present in data. Such a danger is facing us with word embedding, a popular framework to represent text data as vectors which has been used in many machine learning and natural language processing tasks. We show that even word embeddings trained on Google News articles exhibit female/male gender stereotypes to a disturbing extent. This raises concerns because their widespread use, as we describe, often tends to amplify these biases. Geometrically, gender bias is first shown to be captured by a direction in the word embedding. Second, gender neutral words are shown to be linearly separable from gender definition words in the word embedding. Using these properties, we provide a methodology for modifying an embedding to remove gender stereotypes, such as the association between the words receptionist and female, while maintaining desired associations such as between the words queen and female. We define metrics to quantify both direct and indirect gender biases in embeddings, and develop algorithms to "debias" the embedding. Using crowd-worker evaluation as well as standard benchmarks, we empirically demonstrate that our algorithms significantly reduce gender bias in embeddings while preserving its useful properties such as the ability to cluster related concepts and to solve analogy tasks. The resulting embeddings can be used in applications without amplifying gender bias.

Journal ArticleDOI
TL;DR: This review highlights the recent advances of smart MNPs categorized according to their activation stimulus (physical, chemical, or biological) and looks forward to future pharmaceutical applications.
Abstract: New achievements in the realm of nanoscience and innovative techniques of nanomedicine have moved micro/nanoparticles (MNPs) to the point of becoming actually useful for practical applications in the near future. Various differences between the extracellular and intracellular environments of cancerous and normal cells and the particular characteristics of tumors such as physicochemical properties, neovasculature, elasticity, surface electrical charge, and pH have motivated the design and fabrication of inventive “smart” MNPs for stimulus-responsive controlled drug release. These novel MNPs can be tailored to be responsive to pH variations, redox potential, enzymatic activation, thermal gradients, magnetic fields, light, and ultrasound (US), or can even be responsive to dual or multi-combinations of different stimuli. This unparalleled capability has increased their importance as site-specific controlled drug delivery systems (DDSs) and has encouraged their rapid development in recent years. An in-depth understanding of the underlying mechanisms of these DDS approaches is expected to further contribute to this groundbreaking field of nanomedicine. Smart nanocarriers in the form of MNPs that can be triggered by internal or external stimulus are summarized and discussed in the present review, including pH-sensitive peptides and polymers, redox-responsive micelles and nanogels, thermo- or magnetic-responsive nanoparticles (NPs), mechanical- or electrical-responsive MNPs, light or ultrasound-sensitive particles, and multi-responsive MNPs including dual stimuli-sensitive nanosheets of graphene. This review highlights the recent advances of smart MNPs categorized according to their activation stimulus (physical, chemical, or biological) and looks forward to future pharmaceutical applications.

Journal ArticleDOI
19 May 2016-Cell
TL;DR: A pipeline for the rapid design, assembly, and validation of cell-free, paper-based sensors for detection of the Zika virus RNA genome is reported; the sensors detect clinically relevant concentrations of Zika virus sequences and demonstrate specificity against closely related Dengue virus sequences.

Journal ArticleDOI
28 Jun 2016-JAMA
TL;DR: Among ambulatory adults aged 75 years or older, treating to an SBP target of less than 120 mm Hg compared with an SBP target of less than 140 mm Hg resulted in significantly lower rates of fatal and nonfatal major cardiovascular events and death from any cause.
Abstract: Importance The appropriate treatment target for systolic blood pressure (SBP) in older patients with hypertension remains uncertain. Objective To evaluate the effects of intensive (less than 120 mm Hg) compared with standard (less than 140 mm Hg) SBP targets in adults aged 75 years or older with hypertension. Design, Setting, and Participants A multicenter, randomized clinical trial of patients aged 75 years or older who participated in the Systolic Blood Pressure Intervention Trial (SPRINT). Recruitment began on October 20, 2010, and follow-up ended on August 20, 2015. Interventions Participants were randomized to an SBP target of less than 120 mm Hg (intensive treatment group, n = 1317) or an SBP target of less than 140 mm Hg (standard treatment group, n = 1319). Main Outcomes and Measures The primary cardiovascular disease outcome was a composite of nonfatal myocardial infarction, acute coronary syndrome not resulting in a myocardial infarction, nonfatal stroke, nonfatal acute decompensated heart failure, and death from cardiovascular causes. All-cause mortality was a secondary outcome. Results Among 2636 participants (mean age, 79.9 years; 37.9% women), 2510 (95.2%) provided complete follow-up data. At a median follow-up of 3.14 years, there was a significantly lower rate of the primary composite outcome (102 events in the intensive treatment group vs 148 events in the standard treatment group; hazard ratio [HR], 0.66 [95% CI, 0.51-0.85]) and all-cause mortality (73 deaths vs 107 deaths, respectively; HR, 0.67 [95% CI, 0.49-0.91]). The overall rate of serious adverse events was not different between treatment groups (48.4% in the intensive treatment group vs 48.3% in the standard treatment group; HR, 0.99 [95% CI, 0.89-1.11]).
Absolute rates of hypotension were 2.4% in the intensive treatment group vs 1.4% in the standard treatment group (HR, 1.71 [95% CI, 0.97-3.09]), 3.0% vs 2.4%, respectively, for syncope (HR, 1.23 [95% CI, 0.76-2.00]), 4.0% vs 2.7% for electrolyte abnormalities (HR, 1.51 [95% CI, 0.99-2.33]), 5.5% vs 4.0% for acute kidney injury (HR, 1.41 [95% CI, 0.98-2.04]), and 4.9% vs 5.5% for injurious falls (HR, 0.91 [95% CI, 0.65-1.29]). Conclusions and Relevance Among ambulatory adults aged 75 years or older, treating to an SBP target of less than 120 mm Hg compared with an SBP target of less than 140 mm Hg resulted in significantly lower rates of fatal and nonfatal major cardiovascular events and death from any cause. Trial Registration clinicaltrials.gov Identifier:NCT01206062

Journal ArticleDOI
TL;DR: Issues from grading of acne to the topical and systemic management of the disease are reviewed and suggestions on use are provided based on available evidence.
Abstract: Acne is one of the most common disorders treated by dermatologists and other health care providers. While it most often affects adolescents, it is not uncommon in adults and can also be seen in children. This evidence-based guideline addresses important clinical questions that arise in its management. Issues from grading of acne to the topical and systemic management of the disease are reviewed. Suggestions on use are provided based on available evidence.

Journal ArticleDOI
TL;DR: This guideline provides recommendations on the clinical and public health management of tuberculosis in children and adults in settings in which mycobacterial cultures, molecular and phenotypic drug susceptibility tests, and radiographic studies, among other diagnostic tools, are available on a routine basis.
Abstract: The American Thoracic Society, Centers for Disease Control and Prevention, and Infectious Diseases Society of America jointly sponsored the development of this guideline for the treatment of drug-susceptible tuberculosis, which is also endorsed by the European Respiratory Society and the US National Tuberculosis Controllers Association. Representatives from the American Academy of Pediatrics, the Canadian Thoracic Society, the International Union Against Tuberculosis and Lung Disease, and the World Health Organization also participated in the development of the guideline. This guideline provides recommendations on the clinical and public health management of tuberculosis in children and adults in settings in which mycobacterial cultures, molecular and phenotypic drug susceptibility tests, and radiographic studies, among other diagnostic tools, are available on a routine basis. For all recommendations, literature reviews were performed, followed by discussion by an expert committee according to the Grading of Recommendations, Assessment, Development and Evaluation methodology. Given the public health implications of prompt diagnosis and effective management of tuberculosis, empiric multidrug treatment is initiated in almost all situations in which active tuberculosis is suspected. Additional characteristics such as presence of comorbidities, severity of disease, and response to treatment influence management decisions. Specific recommendations on the use of case management strategies (including directly observed therapy), regimen and dosing selection in adults and children (daily vs intermittent), treatment of tuberculosis in the presence of HIV infection (duration of tuberculosis treatment and timing of initiation of antiretroviral therapy), as well as treatment of extrapulmonary disease (central nervous system, pericardial among other sites) are provided. 
The development of more potent and better-tolerated drug regimens, optimization of drug exposure for the component drugs, optimal management of tuberculosis in special populations, identification of accurate biomarkers of treatment effect, and the assessment of new strategies for implementing regimens in the field remain key priority areas for research. See the full-text online version of the document for detailed discussion of the management of tuberculosis and recommendations for practice.

Journal ArticleDOI
11 Jul 2016-Nature
TL;DR: In this paper, the authors performed whole-genome sequencing in 2,657 European individuals with and without diabetes, and exome sequencing for 12,940 individuals from five ancestry groups.
Abstract: The genetic architecture of common traits, including the number, frequency, and effect sizes of inherited variants that contribute to individual risk, has been long debated. Genome-wide association studies have identified scores of common variants associated with type 2 diabetes, but in aggregate, these explain only a fraction of the heritability of this disease. Here, to test the hypothesis that lower-frequency variants explain much of the remainder, the GoT2D and T2D-GENES consortia performed whole-genome sequencing in 2,657 European individuals with and without diabetes, and exome sequencing in 12,940 individuals from five ancestry groups. To increase statistical power, we expanded the sample size via genotyping and imputation in a further 111,548 subjects. Variants associated with type 2 diabetes after sequencing were overwhelmingly common and most fell within regions previously identified by genome-wide association studies. Comprehensive enumeration of sequence variation is necessary to identify functional alleles that provide important clues to disease pathophysiology, but large-scale sequencing does not support the idea that lower-frequency variants have a major role in predisposition to type 2 diabetes.

Journal ArticleDOI
01 Apr 2016-Science
TL;DR: Principles from electronic design automation (EDA) are applied to enable increased circuit complexity and to simplify the incorporation of synthetic gene regulation into genetic engineering projects, and it is demonstrated that engineering principles can be used to identify and suppress errors that complicate the composition of larger systems.
Abstract: INTRODUCTION Cells respond to their environment, make decisions, build structures, and coordinate tasks. Underlying these processes are computational operations performed by networks of regulatory proteins that integrate signals and control the timing of gene expression. Harnessing this capability is critical for biotechnology projects that require decision-making, control, sensing, or spatial organization. It has been shown that cells can be programmed using synthetic genetic circuits composed of regulators organized to generate a desired operation. However, the construction of even simple circuits is time-intensive and unreliable. RATIONALE Electronic design automation (EDA) was developed to aid engineers in the design of semiconductor-based electronics. In an effort to accelerate genetic circuit design, we applied principles from EDA to enable increased circuit complexity and to simplify the incorporation of synthetic gene regulation into genetic engineering projects. We used the hardware description language Verilog to enable a user to describe a circuit function. The user also specifies the sensors, actuators, and “user constraints file” (UCF), which defines the organism, gate technology, and valid operating conditions. Cello (www.cellocad.org) uses this information to automatically design a DNA sequence encoding the desired circuit. This is done via a set of algorithms that parse the Verilog text, create the circuit diagram, assign gates, balance constraints to build the DNA, and simulate performance. RESULTS Cello designs circuits by drawing upon a library of Boolean logic gates. Here, the gate technology consists of NOT/NOR logic based on repressors. Gate connection is simplified by defining the input and output signals as RNA polymerase (RNAP) fluxes. We found that the gates need to be insulated from their genetic context to function reliably in the context of different circuits. 
Each gate is isolated using strong terminators to block RNAP leakage, and input interchangeability is improved using ribozymes and promoter spacers. These parts are varied for each gate to avoid breakage due to recombination. Measuring the load of each gate and incorporating this into the optimization algorithms further reduces evolutionary pressure. Cello was applied to the design of 60 circuits for Escherichia coli , where the circuit function was specified using Verilog code and transformed to a DNA sequence. The DNA sequences were built as specified with no additional tuning, requiring 880,000 base pairs of DNA assembly. Of these, 45 circuits performed correctly in every output state (up to 10 regulators and 55 parts). Across all circuits, 92% of the 412 output states functioned as predicted. CONCLUSION Our work constitutes a hardware description language for programming living cells. This required the co-development of design algorithms with gates that are sufficiently simple and robust to be connected by automated algorithms. We demonstrate that engineering principles can be applied to identify and suppress errors that complicate the compositions of larger systems. This approach leads to highly repetitive and modular genetics, in stark contrast to the encoding of natural regulatory networks. The use of a hardware-independent language and the creation of additional UCFs will allow a single design to be transformed into DNA for different organisms, genetic endpoints, operating conditions, and gate technologies.
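The gate library described above is built entirely from NOT/NOR logic, which works because NOR is functionally complete: NOT, OR, and AND can all be composed from it. The sketch below illustrates that composition in plain Python; it is an illustration of the logic only, not Cello's actual Verilog-to-DNA pipeline, and the function names are my own.

```python
# NOR is functionally complete, which is why a repressor-based
# NOT/NOR gate library suffices for arbitrary combinational logic.
# Illustrative sketch only -- Cello maps Verilog to repressor gates,
# not to Python functions.

def NOR(a: bool, b: bool) -> bool:
    return not (a or b)

def NOT(a: bool) -> bool:
    return NOR(a, a)            # NOT is NOR with both inputs tied together

def OR(a: bool, b: bool) -> bool:
    return NOT(NOR(a, b))       # OR is an inverted NOR

def AND(a: bool, b: bool) -> bool:
    return NOR(NOT(a), NOT(b))  # De Morgan: AND from inverted inputs

# Exhaustive truth-table check of the derived gates
for a in (False, True):
    for b in (False, True):
        assert OR(a, b) == (a or b)
        assert AND(a, b) == (a and b)
        assert NOT(a) == (not a)
```

In the biological implementation, each such gate is a repressor whose input and output are RNA polymerase fluxes rather than Boolean variables, which is what makes the gates composable by automated algorithms.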

Journal ArticleDOI
01 Dec 2016-Nature
TL;DR: In this article, the authors present a comprehensive analysis of warming-induced changes in soil carbon stocks by assembling data from 49 field experiments located across North America, Europe and Asia, and provide estimates of soil carbon sensitivity to warming that may help to constrain Earth system model projections.
Abstract: The majority of the Earth's terrestrial carbon is stored in the soil. If anthropogenic warming stimulates the loss of this carbon to the atmosphere, it could drive further planetary warming. Despite evidence that warming enhances carbon fluxes to and from the soil, the net global balance between these responses remains uncertain. Here we present a comprehensive analysis of warming-induced changes in soil carbon stocks by assembling data from 49 field experiments located across North America, Europe and Asia. We find that the effects of warming are contingent on the size of the initial soil carbon stock, with considerable losses occurring in high-latitude areas. By extrapolating this empirical relationship to the global scale, we provide estimates of soil carbon sensitivity to warming that may help to constrain Earth system model projections. Our empirical relationship suggests that global soil carbon stocks in the upper soil horizons will fall by 30 ± 30 petagrams of carbon to 203 ± 161 petagrams of carbon under one degree of warming, depending on the rate at which the effects of warming are realized. Under the conservative assumption that the response of soil carbon to warming occurs within a year, a business-as-usual climate scenario would drive the loss of 55 ± 50 petagrams of carbon from the upper soil horizons by 2050. This value is around 12-17 per cent of the expected anthropogenic emissions over this period. Despite the considerable uncertainty in our estimates, the direction of the global soil carbon response is consistent across all scenarios. This provides strong empirical support for the idea that rising temperatures will stimulate the net loss of soil carbon to the atmosphere, driving a positive land carbon-climate feedback that could accelerate climate change.
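As a back-of-envelope check of the figures quoted above: if a 55 Pg C soil loss by 2050 represents 12-17 per cent of expected anthropogenic emissions over the period, the implied cumulative emissions can be recovered by simple division. The calculation below is my own arithmetic on the abstract's numbers, not a figure from the paper.

```python
# Back-of-envelope check of the abstract's numbers (my arithmetic,
# not a result from the paper). If a 55 Pg C soil loss by 2050 is
# 12-17% of expected anthropogenic emissions, the implied cumulative
# emissions over the same period are:
loss_pg_c = 55.0
implied_emissions_high = loss_pg_c / 0.12  # ~458 Pg C (if loss is 12%)
implied_emissions_low = loss_pg_c / 0.17   # ~324 Pg C (if loss is 17%)
print(f"Implied cumulative emissions: "
      f"{implied_emissions_low:.0f}-{implied_emissions_high:.0f} Pg C")
```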

Journal ArticleDOI
TL;DR: Imaging results suggest that intra-brain vascular dysregulation is an early pathological event during disease development, and that cognitive decline is noticeable from initial LOAD stages, pointing to an early memory deficit associated with the primary disease factors.
Abstract: Multifactorial mechanisms underlying late-onset Alzheimer's disease (LOAD) are poorly characterized from an integrative perspective. Here spatiotemporal alterations in brain amyloid-β deposition, metabolism, vascular, functional activity at rest, structural properties, cognitive integrity and peripheral protein levels are characterized in relation to LOAD progression. We analyse over 7,700 brain images and tens of plasma and cerebrospinal fluid biomarkers from the Alzheimer's Disease Neuroimaging Initiative (ADNI). Through a multifactorial data-driven analysis, we obtain dynamic LOAD-abnormality indices for all biomarkers, and a tentative temporal ordering of disease progression. Imaging results suggest that intra-brain vascular dysregulation is an early pathological event during disease development. Cognitive decline is noticeable from initial LOAD stages, suggesting early memory deficit associated with the primary disease factors. High abnormality levels are also observed for specific proteins associated with the vascular system's integrity. Although still subject to the sensitivity of the algorithms and biomarkers employed, our results might contribute to the development of preventive therapeutic interventions.

Journal ArticleDOI
28 Sep 2016
TL;DR: Strengthens the evidence that epigenetic age predicts all-cause mortality above and beyond chronological age and traditional risk factors, and demonstrates that epigenetic age estimates incorporating information on blood cell counts lead to highly significant associations with all-cause mortality.
Abstract: Estimates of biological age based on DNA methylation patterns, often referred to as "epigenetic age" or "DNAm age", have been shown to be robust biomarkers of age in humans. We previously demonstrated that independent of chronological age, epigenetic age assessed in blood predicted all-cause mortality in four human cohorts. Here, we expanded our original observation to 13 different cohorts for a total sample size of 13,089 individuals, including three racial/ethnic groups. In addition, we examined whether incorporating information on blood cell composition into the epigenetic age metrics improves their predictive power for mortality. All considered measures of epigenetic age acceleration were predictive of mortality (p ≤ 8.2×10^-9), independent of chronological age, even after adjusting for additional risk factors (p < 5.4×10^-4), and within the racial/ethnic groups that we examined (non-Hispanic whites, Hispanics, African Americans). Epigenetic age estimates that incorporated information on blood cell composition led to the smallest p-values for time to death (p = 7.5×10^-43). Overall, this study a) strengthens the evidence that epigenetic age predicts all-cause mortality above and beyond chronological age and traditional risk factors, and b) demonstrates that epigenetic age estimates that incorporate information on blood cell counts lead to highly significant associations with all-cause mortality.
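The "epigenetic age acceleration" used as the predictor above is conventionally computed as the residual from regressing DNAm age on chronological age, so that a positive value means a subject is epigenetically older than expected for their age. The sketch below uses synthetic data for illustration; it is not the cohort analysis from the study.

```python
import numpy as np

# Epigenetic age acceleration as a regression residual: synthetic
# data for illustration only, not the 13-cohort analysis above.
rng = np.random.default_rng(0)
chron_age = rng.uniform(40, 90, size=200)                     # chronological ages
dnam_age = 0.95 * chron_age + 3.0 + rng.normal(0, 4, size=200)  # simulated DNAm ages

# Regress DNAm age on chronological age; the residual is the
# "acceleration" (positive = epigenetically older than expected).
slope, intercept = np.polyfit(chron_age, dnam_age, 1)
acceleration = dnam_age - (slope * chron_age + intercept)

# Least-squares residuals are mean-zero by construction and
# uncorrelated with the regressor (chronological age).
assert abs(acceleration.mean()) < 1e-6
assert abs(np.corrcoef(chron_age, acceleration)[0, 1]) < 1e-6
```

This construction is what makes the measure "independent of chronological age" in the mortality models: by definition the residual carries no linear age signal.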

Journal ArticleDOI
TL;DR: A multidisciplinary panel developed evidence-based guidelines to enhance the appropriate, safe, and effective practice of parathyroidectomy, with recommendations to assist clinicians in the optimal treatment of patients with pHPT.
Abstract: Importance Primary hyperparathyroidism (pHPT) is a common clinical problem for which the only definitive management is surgery. Surgical management has evolved considerably during the last several decades. Objective To develop evidence-based guidelines to enhance the appropriate, safe, and effective practice of parathyroidectomy. Evidence Review A multidisciplinary panel used PubMed to review the medical literature from January 1, 1985, to July 1, 2015. Levels of evidence were determined using the American College of Physicians grading system, and recommendations were discussed until consensus. Findings Initial evaluation should include 25-hydroxyvitamin D measurement, 24-hour urine calcium measurement, dual-energy x-ray absorptiometry, and supplementation for vitamin D deficiency. Parathyroidectomy is indicated for all symptomatic patients, should be considered for most asymptomatic patients, and is more cost-effective than observation or pharmacologic therapy. Cervical ultrasonography or other high-resolution imaging is recommended for operative planning. Patients with nonlocalizing imaging remain surgical candidates. Preoperative parathyroid biopsy should be avoided. Surgeons who perform a high volume of operations have better outcomes. The possibility of multigland disease should be routinely considered. Both focused, image-guided surgery (minimally invasive parathyroidectomy) and bilateral exploration are appropriate operations that achieve high cure rates. For minimally invasive parathyroidectomy, intraoperative parathyroid hormone monitoring via a reliable protocol is recommended. Minimally invasive parathyroidectomy is not routinely recommended for known or suspected multigland disease. Ex vivo aspiration of resected parathyroid tissue may be used to confirm parathyroid tissue intraoperatively. Clinically relevant thyroid disease should be assessed preoperatively and managed during parathyroidectomy. 
Devascularized normal parathyroid tissue should be autotransplanted. Patients should be observed postoperatively for hematoma, evaluated for hypocalcemia and symptoms of hypocalcemia, and followed up to assess for cure defined as eucalcemia at more than 6 months. Calcium supplementation may be indicated postoperatively. Familial pHPT, reoperative parathyroidectomy, and parathyroid carcinoma are challenging entities that require special consideration and expertise. Conclusions and Relevance Evidence-based recommendations were created to assist clinicians in the optimal treatment of patients with pHPT.

Journal ArticleDOI
TL;DR: Presents in vivo data showing that Aβ expression protects against fungal and bacterial infections in mouse, nematode, and cell culture models of AD, and shows that Aβ oligomerization, a behavior traditionally viewed as intrinsically pathological, may be necessary for the antimicrobial activities of the peptide.
Abstract: The amyloid-β peptide (Aβ) is a key protein in Alzheimer’s disease (AD) pathology. We previously reported in vitro evidence suggesting that Aβ is an antimicrobial peptide. We present in vivo data showing that Aβ expression protects against fungal and bacterial infections in mouse, nematode, and cell culture models of AD. We show that Aβ oligomerization, a behavior traditionally viewed as intrinsically pathological, may be necessary for the antimicrobial activities of the peptide. Collectively, our data are consistent with a model in which soluble Aβ oligomers first bind to microbial cell wall carbohydrates via a heparin-binding domain. Developing protofibrils inhibited pathogen adhesion to host cells. Propagating β-amyloid fibrils mediate agglutination and eventual entrapment of unattached microbes. Consistent with our model, Salmonella Typhimurium bacterial infection of the brains of transgenic 5XFAD mice resulted in rapid seeding and accelerated β-amyloid deposition, which closely colocalized with the invading bacteria. Our findings raise the intriguing possibility that β-amyloid may play a protective role in innate immunity and infectious or sterile inflammatory stimuli may drive amyloidosis. These data suggest a dual protective/damaging role for Aβ, as has been described for other antimicrobial peptides.

Journal ArticleDOI
TL;DR: Prior SITBs confer risk for later suicidal thoughts and behaviors; however, they provide only a marginal improvement in diagnostic accuracy above chance. Addressing gaps in study design, assessment, and underlying mechanisms may prove useful in improving prediction and prevention of suicidal thoughts and behaviors.
Abstract: Background A history of self-injurious thoughts and behaviors (SITBs) is consistently cited as one of the strongest predictors of future suicidal behavior. However, stark discrepancies in the literature raise questions about the true magnitude of these associations. The objective of this study is to examine the magnitude and clinical utility of the associations between SITBs and subsequent suicide ideation, attempts, and death. Method We searched PubMed, PsycInfo, and Google Scholar for papers published through December 2014. Inclusion required that studies include at least one longitudinal analysis predicting suicide ideation, attempts, or death using any SITB variable. We identified 2179 longitudinal studies; 172 met inclusion criteria. Results The most common outcome was suicide attempt (47.80%), followed by death (40.50%) and ideation (11.60%). Median follow-up was 52 months (mean = 82.52, s.d. = 102.29). Overall prediction was weak, with weighted mean odds ratios (ORs) of 2.07 [95% confidence interval (CI) 1.76–2.43] for ideation, 2.14 (95% CI 2.00–2.30) for attempts, and 1.54 (95% CI 1.39–1.71) for death. Adjusting for publication bias further reduced estimates. Diagnostic accuracy analyses indicated acceptable specificity (86–87%) and poor sensitivity (10–26%), with areas under the curve marginally above chance (0.60–0.62). Most risk factors generated OR estimates of <2.0 and no risk factor exceeded 4.5. Effects were consistent regardless of sample severity, sample age groups, or follow-up length. Conclusions Prior SITBs confer risk for later suicidal thoughts and behaviors. However, they only provide a marginal improvement in diagnostic accuracy above chance. Addressing gaps in study design, assessment, and underlying mechanisms may prove useful in improving prediction and prevention of suicidal thoughts and behaviors.
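The "weighted mean odds ratios" reported above are typically obtained by inverse-variance pooling on the log-odds scale, with each study weighted by the precision of its estimate. The sketch below shows that standard fixed-effect calculation; the study values passed in are hypothetical, not the 172 studies in the meta-analysis.

```python
import math

# Fixed-effect inverse-variance pooling of odds ratios on the log
# scale -- the standard way a "weighted mean OR" is computed.
# Input values are hypothetical examples, not the studies above.
def pooled_or(studies):
    """studies: list of (OR, lower 95% CI, upper 95% CI) tuples."""
    num = den = 0.0
    for or_, lo, hi in studies:
        log_or = math.log(or_)
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # recover SE from CI width
        w = 1.0 / se ** 2                                # inverse-variance weight
        num += w * log_or
        den += w
    return math.exp(num / den)

# Three hypothetical studies with ORs near 2, as in the attempts analysis
print(pooled_or([(2.0, 1.5, 2.7), (1.6, 1.2, 2.1), (2.5, 1.8, 3.5)]))
```

Precision-weighting on the log scale explains why a handful of very large cohorts can dominate the pooled estimate even when smaller studies report much larger effects.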

Journal ArticleDOI
26 Apr 2016-JAMA
TL;DR: A clinical decision tool was developed to identify patients expected to derive benefit vs harm from continuing thienopyridine beyond 1 year after percutaneous coronary intervention, in order to inform dual antiplatelet therapy duration.
Abstract: Importance Dual antiplatelet therapy after percutaneous coronary intervention (PCI) reduces ischemia but increases bleeding. Objective To develop a clinical decision tool to identify patients expected to derive benefit vs harm from continuing thienopyridine beyond 1 year after PCI. Design, Setting, and Participants Among 11 648 randomized DAPT Study patients from 11 countries (August 2009-May 2014), a prediction rule was derived stratifying patients into groups to distinguish ischemic and bleeding risk 12 to 30 months after PCI. Validation was internal via bootstrap resampling and external among 8136 patients from 36 countries randomized in the PROTECT trial (June 2007-July 2014). Exposures Twelve months of open-label thienopyridine plus aspirin, then randomized to 18 months of continued thienopyridine plus aspirin vs placebo plus aspirin. Main Outcomes and Measures Ischemia (myocardial infarction or stent thrombosis) and bleeding (moderate or severe) 12 to 30 months after PCI. Results Among DAPT Study patients (derivation cohort; mean age, 61.3 years; women, 25.1%), ischemia occurred in 348 patients (3.0%) and bleeding in 215 (1.8%). Derivation cohort models predicting ischemia and bleeding had c statistics of 0.70 and 0.68, respectively. The prediction rule assigned 1 point each for myocardial infarction at presentation, prior myocardial infarction or PCI, diabetes, stent diameter less than 3 mm, smoking, and paclitaxel-eluting stent; 2 points each for history of congestive heart failure/low ejection fraction and vein graft intervention; −1 point for age 65 to younger than 75 years; and −2 points for age 75 years or older. 
Among the high score group (score ≥2, n = 5917), continued thienopyridine vs placebo was associated with reduced ischemic events (2.7% vs 5.7%; risk difference [RD], −3.0% [95% CI, −4.1% to −2.0%],P Conclusion and Relevance Among patients not sustaining major bleeding or ischemic events 1 year after PCI, a prediction rule assessing late ischemic and bleeding risks to inform dual antiplatelet therapy duration showed modest accuracy in derivation and validation cohorts. This rule requires further prospective evaluation to assess potential effects on patient care, as well as validation in other cohorts. Trial Registration clinicaltrials.gov Identifier:NCT00977938.
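The point assignments stated in the abstract can be collected into a simple scoring function, shown below. The field names are my own and this is an illustration of the published point scheme, not a validated clinical calculator; per the abstract, a score of 2 or more defined the "high score" group.

```python
# DAPT score sketch from the point assignments in the abstract
# (field names are mine; illustration only, not a validated
# clinical calculator). Score >= 2 defined the "high score" group.
def dapt_score(age, mi_at_presentation=False, prior_mi_or_pci=False,
               diabetes=False, stent_lt_3mm=False, smoker=False,
               paclitaxel_stent=False, chf_or_low_ef=False,
               vein_graft_pci=False):
    score = 0
    # 1 point each
    score += sum([mi_at_presentation, prior_mi_or_pci, diabetes,
                  stent_lt_3mm, smoker, paclitaxel_stent])
    # 2 points each
    score += 2 * (chf_or_low_ef + vein_graft_pci)
    # Age penalties: -1 for 65 to <75, -2 for >= 75
    if age >= 75:
        score -= 2
    elif age >= 65:
        score -= 1
    return score

# A 70-year-old smoker with diabetes: 1 + 1 - 1 = 1 (low score group)
assert dapt_score(70, smoker=True, diabetes=True) == 1
# A 60-year-old with MI at presentation, diabetes, and CHF/low EF:
# 1 + 1 + 2 = 4 (high score group)
assert dapt_score(60, mi_at_presentation=True, diabetes=True,
                  chf_or_low_ef=True) == 4
```

The age terms carry negative weight because older patients in the derivation cohort gained less ischemic benefit and bled more on continued thienopyridine, which is what lets one score stratify benefit vs harm.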