Showing papers by Harvard University, published in 2016
••
Harvard University1, University of California, San Francisco2, University of Düsseldorf3, Heidelberg University4, Aix-Marseille University5, Ludwig Institute for Cancer Research6, International Agency for Research on Cancer7, German Cancer Research Center8, University of Zurich9, St. Jude Children's Research Hospital10
TL;DR: The 2016 World Health Organization Classification of Tumors of the Central Nervous System is both a conceptual and practical advance over its 2007 predecessor; it is hoped that it will facilitate clinical, experimental, and epidemiological studies leading to improvements in the lives of patients with brain tumors.
Abstract: The 2016 World Health Organization Classification of Tumors of the Central Nervous System is both a conceptual and practical advance over its 2007 predecessor. For the first time, the WHO classification of CNS tumors uses molecular parameters in addition to histology to define many tumor entities, thus formulating a concept for how CNS tumor diagnoses should be structured in the molecular era. As such, the 2016 CNS WHO presents major restructuring of the diffuse gliomas, medulloblastomas and other embryonal tumors, and incorporates new entities that are defined by both histology and molecular features, including glioblastoma, IDH-wildtype and glioblastoma, IDH-mutant; diffuse midline glioma, H3 K27M-mutant; RELA fusion-positive ependymoma; medulloblastoma, WNT-activated and medulloblastoma, SHH-activated; and embryonal tumour with multilayered rosettes, C19MC-altered. The 2016 edition has added newly recognized neoplasms, and has deleted some entities, variants and patterns that no longer have diagnostic and/or biological relevance. Other notable changes include the addition of brain invasion as a criterion for atypical meningioma and the introduction of a soft tissue-type grading system for the now combined entity of solitary fibrous tumor / hemangiopericytoma, a departure from the manner by which other CNS tumors are graded. Overall, it is hoped that the 2016 CNS WHO will facilitate clinical, experimental and epidemiological studies that will lead to improvements in the lives of patients with brain tumors.
11,197 citations
••
Broad Institute1, Harvard University2, Boston Children's Hospital3, University of Washington4, University of Arizona5, Cardiff University6, Google7, Icahn School of Medicine at Mount Sinai8, Samsung Medical Center9, Vertex Pharmaceuticals10, University of Michigan11, University of Cambridge12, State University of New York Upstate Medical University13, Karolinska Institutet14, University of Eastern Finland15, Wellcome Trust Centre for Human Genetics16, University of Oxford17, Cedars-Sinai Medical Center18, University of Ottawa19, University of Pennsylvania20, University of North Carolina at Chapel Hill21, University of Helsinki22, University of California, San Diego23, University of Mississippi Medical Center24
TL;DR: The aggregation and analysis of high-quality exome (protein-coding region) DNA sequence data for 60,706 individuals of diverse ancestries generated as part of the Exome Aggregation Consortium (ExAC) provides direct evidence for the presence of widespread mutational recurrence.
Abstract: Large-scale reference data sets of human genetic variation are critical for the medical and functional interpretation of DNA sequence changes. Here we describe the aggregation and analysis of high-quality exome (protein-coding region) DNA sequence data for 60,706 individuals of diverse ancestries generated as part of the Exome Aggregation Consortium (ExAC). This catalogue of human genetic diversity contains an average of one variant every eight bases of the exome, and provides direct evidence for the presence of widespread mutational recurrence. We have used this catalogue to calculate objective metrics of pathogenicity for sequence variants, and to identify genes subject to strong selection against various classes of mutation; identifying 3,230 genes with near-complete depletion of predicted protein-truncating variants, with 72% of these genes having no currently established human disease phenotype. Finally, we demonstrate that these data can be used for the efficient filtering of candidate disease-causing variants, and for the discovery of human 'knockout' variants in protein-coding genes.
8,758 citations
••
University of Bristol1, Harvard University2, University Hospitals Bristol NHS Foundation Trust3, Research Triangle Park4, University of Toronto5, University of Oxford6, University of Ottawa7, Paris Descartes University8, University of London9, University of York10, University of Birmingham11, University of Southern Denmark12, University of Liverpool13, University of East Anglia14, Loyola University Chicago15, University of Aberdeen16, Kaiser Permanente17, Baruch College18, McMaster University19, Cochrane Collaboration20, McGill University21, Ottawa Hospital Research Institute22, University of Louisville23, University of Melbourne24
TL;DR: The authors developed ROBINS-I (Risk Of Bias In Non-randomised Studies - of Interventions), a new tool for evaluating risk of bias in estimates of the comparative effectiveness of interventions from studies that did not use randomisation to allocate units or clusters of individuals to comparison groups.
Abstract: Non-randomised studies of the effects of interventions are critical to many areas of healthcare evaluation, but their results may be biased. It is therefore important to understand and appraise their strengths and weaknesses. We developed ROBINS-I (“Risk Of Bias In Non-randomised Studies - of Interventions”), a new tool for evaluating risk of bias in estimates of the comparative effectiveness (harm or benefit) of interventions from studies that did not use randomisation to allocate units (individuals or clusters of individuals) to comparison groups. The tool will be particularly useful to those undertaking systematic reviews that include non-randomised studies.
8,028 citations
••
Technical University of Madrid1, Stanford University2, Elsevier3, VU University Amsterdam4, National Institutes of Health5, University of Leicester6, Harvard University7, Beijing Genomics Institute8, Maastricht University9, Wageningen University and Research Centre10, University of Oxford11, Heriot-Watt University12, University of Manchester13, University of California, San Diego14, Leiden University Medical Center15, Leiden University16, Federal University of São Paulo17, Science for Life Laboratory18, Bayer19, Swiss Institute of Bioinformatics20, Cray21, University Medical Center Groningen22, Erasmus University Rotterdam23
TL;DR: The FAIR Data Principles are a set of data-reuse principles that place specific emphasis on enhancing the ability of machines to automatically find and use data, in addition to supporting its reuse by individuals.
Abstract: There is an urgent need to improve the infrastructure supporting the reuse of scholarly data. A diverse set of stakeholders—representing academia, industry, funding agencies, and scholarly publishers—have come together to design and jointly endorse a concise and measurable set of principles that we refer to as the FAIR Data Principles. The intent is that these may act as a guideline for those wishing to enhance the reusability of their data holdings. Distinct from peer initiatives that focus on the human scholar, the FAIR Principles put specific emphasis on enhancing the ability of machines to automatically find and use the data, in addition to supporting its reuse by individuals. This Comment is the first formal publication of the FAIR Principles, and includes the rationale behind them, and some exemplar implementations in the community.
7,602 citations
••
TL;DR: The 2016 edition of the World Health Organization classification of tumors of the hematopoietic and lymphoid tissues represents a revision of the prior classification rather than an entirely new classification and attempts to incorporate new clinical, prognostic, morphologic, immunophenotypic, and genetic data that have emerged since the last edition.
7,147 citations
••
TL;DR: The revision clarifies the diagnosis and management of lesions at the very early stages of lymphomagenesis, refines the diagnostic criteria for some entities, details the expanding genetic/molecular landscape of numerous lymphoid neoplasms and their clinical correlates, and refers to investigations leading to more targeted therapeutic strategies.
5,321 citations
••
TL;DR: In this paper, the authors present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes.
Abstract: In 2008 we published the first set of guidelines for standardizing research in autophagy. Since then, research on this topic has continued to accelerate, and many new scientists have entered the field. Our knowledge base and relevant new technologies have also been expanding. Accordingly, it is important to update these guidelines for monitoring autophagy in different organisms. Various reviews have described the range of assays that have been used for this purpose. Nevertheless, there continues to be confusion regarding acceptable methods to measure autophagy, especially in multicellular eukaryotes.
For example, a key point that needs to be emphasized is that there is a difference between measurements that monitor the numbers or volume of autophagic elements (e.g., autophagosomes or autolysosomes) at any stage of the autophagic process versus those that measure flux through the autophagy pathway (i.e., the complete process including the amount and rate of cargo sequestered and degraded). In particular, a block in macroautophagy that results in autophagosome accumulation must be differentiated from stimuli that increase autophagic activity, defined as increased autophagy induction coupled with increased delivery to, and degradation within, lysosomes (in most higher eukaryotes and some protists such as Dictyostelium) or the vacuole (in plants and fungi). In other words, it is especially important that investigators new to the field understand that the appearance of more autophagosomes does not necessarily equate with more autophagy. In fact, in many cases, autophagosomes accumulate because of a block in trafficking to lysosomes without a concomitant change in autophagosome biogenesis, whereas an increase in autolysosomes may reflect a reduction in degradative activity. It is worth emphasizing here that lysosomal digestion is a stage of autophagy and evaluating its competence is a crucial part of the evaluation of autophagic flux, or complete autophagy.
Here, we present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes. These guidelines are not meant to be a formulaic set of rules, because the appropriate assays depend in part on the question being asked and the system being used. In addition, we emphasize that no individual assay is guaranteed to be the most appropriate one in every situation, and we strongly recommend the use of multiple assays to monitor autophagy. Along these lines, because of the potential for pleiotropic effects due to blocking autophagy through genetic manipulation, it is imperative to target by gene knockout or RNA interference more than one autophagy-related protein. In addition, some individual Atg proteins, or groups of proteins, are involved in other cellular pathways implying that not all Atg proteins can be used as a specific marker for an autophagic process. In these guidelines, we consider these various methods of assessing autophagy and what information can, or cannot, be obtained from them. Finally, by discussing the merits and limits of particular assays, we hope to encourage technical innovation in the field.
5,187 citations
••
T. Prusti1, J. H. J. de Bruijne1, Anthony G. A. Brown2, Antonella Vallenari3 +621 more • Institutions (93)
TL;DR: Gaia is a cornerstone mission in the science programme of the European Space Agency (ESA). The spacecraft construction was approved in 2006, following a study in which the original interferometric concept was changed to a direct-imaging approach.
Abstract: Gaia is a cornerstone mission in the science programme of the European Space Agency (ESA). The spacecraft construction was approved in 2006, following a study in which the original interferometric concept was changed to a direct-imaging approach. Both the spacecraft and the payload were built by European industry. The involvement of the scientific community focusses on data processing for which the international Gaia Data Processing and Analysis Consortium (DPAC) was selected in 2007. Gaia was launched on 19 December 2013 and arrived at its operating point, the second Lagrange point of the Sun-Earth-Moon system, a few weeks later. The commissioning of the spacecraft and payload was completed on 19 July 2014. The nominal five-year mission started with four weeks of special, ecliptic-pole scanning and subsequently transferred into full-sky scanning mode. We recall the scientific goals of Gaia and give a description of the as-built spacecraft that is currently (mid-2016) being operated to achieve these goals. We pay special attention to the payload module, the performance of which is closely related to the scientific performance of the mission. We provide a summary of the commissioning activities and findings, followed by a description of the routine operational mode. We summarise scientific performance estimates on the basis of in-orbit operations. Several intermediate Gaia data releases are planned and the data can be retrieved from the Gaia Archive, which is available through the Gaia home page.
5,164 citations
••
TL;DR: The Global Burden of Diseases, Injuries, and Risk Factors Study 2015 (GBD 2015) estimated the incidence, prevalence, and years lived with disability for diseases and injuries at the global, regional, and national scale over the period 1990 to 2015.
5,050 citations
••
TL;DR: Pfam is now primarily based on the UniProtKB reference proteomes, with the counts of matched sequences and species reported on the website restricted to this smaller set, and the facility to view the relationship between families within a clan has been improved by the introduction of a new tool.
Abstract: In the last two years the Pfam database (http://pfam.xfam.org) has undergone a substantial reorganisation to reduce the effort involved in making a release, thereby permitting more frequent releases. Arguably the most significant of these changes is that Pfam is now primarily based on the UniProtKB reference proteomes, with the counts of matched sequences and species reported on the website restricted to this smaller set. Building families on reference proteomes sequences brings greater stability, which decreases the amount of manual curation required to maintain them. It also reduces the number of sequences displayed on the website, whilst still providing access to many important model organisms. Matches to the full UniProtKB database are, however, still available and Pfam annotations for individual UniProtKB sequences can still be retrieved. Some Pfam entries (1.6%) which have no matches to reference proteomes remain; we are working with UniProt to see if sequences from them can be incorporated into reference proteomes. Pfam-B, the automatically-generated supplement to Pfam, has been removed. The current release (Pfam 29.0) includes 16 295 entries and 559 clans. The facility to view the relationship between families within a clan has been improved by the introduction of a new tool.
4,906 citations
••
TL;DR: The Global Burden of Disease 2015 Study provides a comprehensive assessment of all-cause and cause-specific mortality for 249 causes in 195 countries and territories from 1980 to 2015, finding that several countries in sub-Saharan Africa had very large gains in life expectancy, rebounding from an era of exceedingly high loss of life due to HIV/AIDS.
••
University of Texas Southwestern Medical Center1, Harvard University2, Novo Nordisk3, University of Erlangen-Nuremberg4, Ruhr University Bochum5, Cleveland Clinic6, University of London7, Imperial College London8, George Washington University9, University of Toronto10, University of North Carolina at Chapel Hill11
TL;DR: In the time-to-event analysis, the rate of the first occurrence of death from cardiovascular causes, nonfatal myocardial infarction, or nonfatal stroke among patients with type 2 diabetes mellitus was lower with liraglutide than with placebo.
Abstract: Background: The cardiovascular effect of liraglutide, a glucagon-like peptide 1 analogue, when added to standard care in patients with type 2 diabetes, remains unknown. Methods: In this double-blind trial, we randomly assigned patients with type 2 diabetes and high cardiovascular risk to receive liraglutide or placebo. The primary composite outcome in the time-to-event analysis was the first occurrence of death from cardiovascular causes, nonfatal myocardial infarction, or nonfatal stroke. The primary hypothesis was that liraglutide would be noninferior to placebo with regard to the primary outcome, with a margin of 1.30 for the upper boundary of the 95% confidence interval of the hazard ratio. No adjustments for multiplicity were performed for the prespecified exploratory outcomes. Results: A total of 9340 patients underwent randomization. The median follow-up was 3.8 years. The primary outcome occurred in significantly fewer patients in the liraglutide group (608 of 4668 patients [13.0%]) than in the placebo ...
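The trial's primary hypothesis hinges on a single prespecified bound: noninferiority requires the upper limit of the hazard ratio's 95% confidence interval to stay below the margin of 1.30, and superiority additionally requires it to fall below 1.0. A minimal sketch of that decision rule (the function name and the input values are ours, purely illustrative, not the published estimates):

```python
def noninferiority_conclusion(ci_upper, margin=1.30):
    """Sketch of the trial's prespecified decision rule.

    Noninferiority holds when the upper bound of the hazard ratio's
    95% CI is below the margin; superiority additionally requires
    the upper bound to fall below 1.0.
    """
    return {
        "noninferior": ci_upper < margin,
        "superior": ci_upper < 1.0,
    }

# Illustrative CI upper bounds only:
print(noninferiority_conclusion(0.97))  # both criteria met
print(noninferiority_conclusion(1.25))  # noninferior but not superior
```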
••
Mariachiara Di Cesare1, Mariachiara Di Cesare2, James Bentham1, Gretchen A Stevens3 +738 more • Institutions (60)
TL;DR: The posterior probability of meeting the target of halting by 2025 the rise in obesity at its 2010 levels, if post-2000 trends continue, is calculated.
••
01 Jan 2016
TL;DR: This review paper introduces Bayesian optimization, highlights some of its methodological aspects, and showcases a wide range of applications.
Abstract: Big Data applications are typically associated with systems involving large numbers of users, massive complex software systems, and large-scale heterogeneous computing and storage architectures. The construction of such systems involves many distributed design choices. The end products (e.g., recommendation systems, medical analysis tools, real-time game engines, speech recognizers) thus involve many tunable configuration parameters. These parameters are often specified and hard-coded into the software by various developers or teams. If optimized jointly, these parameters can result in significant improvements. Bayesian optimization is a powerful tool for the joint optimization of design choices that is gaining great popularity in recent years. It promises greater automation so as to increase both product quality and human productivity. This review paper introduces Bayesian optimization, highlights some of its methodological aspects, and showcases a wide range of applications.
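The loop the abstract alludes to (fit a surrogate model to past evaluations, maximize an acquisition function to pick the next configuration, evaluate it, repeat) can be sketched in toy form. The snippet below substitutes a nearest-neighbour surrogate with a distance-based exploration bonus for the Gaussian process and acquisition functions used in practice; all names and numbers are ours and purely illustrative:

```python
import random

def toy_bayesian_optimization(objective, bounds, n_init=5, n_iter=20,
                              kappa=1.0, seed=0):
    """Toy illustration of the Bayesian-optimization loop (maximization).

    A distance-weighted nearest-neighbour surrogate stands in for the
    Gaussian process: the predicted mean at a candidate is the value of
    the nearest past evaluation, and the exploration bonus grows with
    distance from all past evaluations (an upper-confidence-bound style
    acquisition).
    """
    rng = random.Random(seed)
    lo, hi = bounds
    xs = [rng.uniform(lo, hi) for _ in range(n_init)]  # initial design
    ys = [objective(x) for x in xs]
    for _ in range(n_iter):
        candidates = [rng.uniform(lo, hi) for _ in range(200)]

        def ucb(c):
            nearest = min(range(len(xs)), key=lambda j: abs(c - xs[j]))
            # surrogate mean + exploration bonus
            return ys[nearest] + kappa * abs(c - xs[nearest])

        x_next = max(candidates, key=ucb)  # maximize the acquisition
        xs.append(x_next)
        ys.append(objective(x_next))       # evaluate the true objective
    best = max(range(len(xs)), key=lambda j: ys[j])
    return xs[best], ys[best]

# Maximize a smooth 1-D function on [0, 10]; the optimum is at x = 2.
x_best, y_best = toy_bayesian_optimization(lambda x: -(x - 2.0) ** 2,
                                           (0.0, 10.0))
```

In real use the surrogate is a Gaussian process (or tree ensemble), which supplies a principled predictive variance in place of the crude distance bonus used here.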
••
TL;DR: Engineered fusions of CRISPR/Cas9 and a cytidine deaminase enzyme retain the ability to be programmed with a guide RNA, do not induce dsDNA breaks, and mediate the direct conversion of cytidine to uridine, thereby effecting a C→T (or G→A) substitution.
Abstract: Current genome-editing technologies introduce double-stranded (ds) DNA breaks at a target locus as the first step to gene correction. Although most genetic diseases arise from point mutations, current approaches to point mutation correction are inefficient and typically induce an abundance of random insertions and deletions (indels) at the target locus resulting from the cellular response to dsDNA breaks. Here we report the development of 'base editing', a new approach to genome editing that enables the direct, irreversible conversion of one target DNA base into another in a programmable manner, without requiring dsDNA backbone cleavage or a donor template. We engineered fusions of CRISPR/Cas9 and a cytidine deaminase enzyme that retain the ability to be programmed with a guide RNA, do not induce dsDNA breaks, and mediate the direct conversion of cytidine to uridine, thereby effecting a C→T (or G→A) substitution. The resulting 'base editors' convert cytidines within a window of approximately five nucleotides, and can efficiently correct a variety of point mutations relevant to human disease. In four transformed human and murine cell lines, second- and third-generation base editors that fuse uracil glycosylase inhibitor, and that use a Cas9 nickase targeting the non-edited strand, manipulate the cellular DNA repair response to favour desired base-editing outcomes, resulting in permanent correction of ~15-75% of total cellular DNA with minimal (typically ≤1%) indel formation. Base editing expands the scope and efficiency of genome editing of point mutations.
••
University of Milan1, St. Michael's GAA, Sligo2, University of Toronto3, Paris Diderot University4, University of Paris5, University Health Network6, St. Michael's Hospital7, Australian National University8, Uppsala University9, Queen's University Belfast10, Sapienza University of Rome11, Sunnybrook Health Sciences Centre12, Harvard University13, Leipzig University14
TL;DR: Clinician recognition of ARDS was associated with higher PEEP, greater use of neuromuscular blockade, and prone positioning, which indicates the potential for improvement in the management of patients with ARDS.
Abstract: IMPORTANCE Limited information exists about the epidemiology, recognition, management, and outcomes of patients with the acute respiratory distress syndrome (ARDS). OBJECTIVES To evaluate intensive ...
••
University of Pittsburgh1, University of Texas MD Anderson Cancer Center2, Stanford University3, University of Duisburg-Essen4, University of Chicago5, Institut Gustave Roussy6, University of Michigan7, Emory University8, Complutense University of Madrid9, Harvard University10, University of Zurich11, Kobe University12, Bristol-Myers Squibb13, Ohio State University14
TL;DR: Among patients with platinum-refractory, recurrent squamous-cell carcinoma of the head and neck, treatment with nivolumab resulted in longer overall survival than treatment with standard, single-agent therapy.
Abstract: Background: Patients with recurrent or metastatic squamous-cell carcinoma of the head and neck after platinum chemotherapy have a very poor prognosis and limited therapeutic options. Nivolumab, an anti–programmed death 1 (PD-1) monoclonal antibody, was assessed as treatment for this condition. Methods: In this randomized, open-label, phase 3 trial, we assigned, in a 2:1 ratio, 361 patients with recurrent squamous-cell carcinoma of the head and neck whose disease had progressed within 6 months after platinum-based chemotherapy to receive nivolumab (at a dose of 3 mg per kilogram of body weight) every 2 weeks or standard, single-agent systemic therapy (methotrexate, docetaxel, or cetuximab). The primary end point was overall survival. Additional end points included progression-free survival, rate of objective response, safety, and patient-reported quality of life. Results: The median overall survival was 7.5 months (95% confidence interval [CI], 5.5 to 9.1) in the nivolumab group versus 5.1 months (95% CI, 4.0 to...
••
TL;DR: The authors begin to unravel the cellular ecosystem of tumors and show how single-cell genomics offers insights with implications for both targeted and immune therapies.
Abstract: To explore the distinct genotypic and phenotypic states of melanoma tumors, we applied single-cell RNA sequencing (RNA-seq) to 4645 single cells isolated from 19 patients, profiling malignant, immune, stromal, and endothelial cells. Malignant cells within the same tumor displayed transcriptional heterogeneity associated with the cell cycle, spatial context, and a drug-resistance program. In particular, all tumors harbored malignant cells from two distinct transcriptional cell states, such that tumors characterized by high levels of the MITF transcription factor also contained cells with low MITF and elevated levels of the AXL kinase. Single-cell analyses suggested distinct tumor microenvironmental patterns, including cell-to-cell interactions. Analysis of tumor-infiltrating T cells revealed exhaustion programs, their connection to T cell activation and clonal expansion, and their variability across patients. Overall, we begin to unravel the cellular ecosystem of tumors and how single-cell genomics offers insights with implications for both targeted and immune therapies.
••
Memorial Sloan Kettering Cancer Center1, Thomas Jefferson University2, Queen Mary University of London3, Netherlands Cancer Institute4, New York University5, University of Milan6, MedStar Georgetown University Hospital7, University of Chicago8, University of Paris-Sud9, Stanford University10, Technische Universität München11, Cleveland Clinic12, Mayo Clinic13, Icahn School of Medicine at Mount Sinai14, Yale University15, University of Navarra16, Sarah Cannon Research Institute17, Ottawa Hospital Research Institute18, Harvard University19, Genentech20, Foundation Medicine21, University of Virginia22
TL;DR: Treatment with atezolizumab resulted in a significantly improved RECIST v1.1 response rate, compared with a historical control overall response rate of 10%, and exploratory analyses showed The Cancer Genome Atlas (TCGA) subtypes and mutation load to be independently predictive of response to atezolizumab.
••
TL;DR: Recently devised sgRNA design rules are used to create human and mouse genome-wide libraries, perform positive and negative selection screens and observe that the use of these rules produced improved results, and a metric to predict off-target sites is developed.
Abstract: CRISPR-Cas9-based genetic screens are a powerful new tool in biology. By simply altering the sequence of the single-guide RNA (sgRNA), one can reprogram Cas9 to target different sites in the genome with relative ease, but the on-target activity and off-target effects of individual sgRNAs can vary widely. Here, we use recently devised sgRNA design rules to create human and mouse genome-wide libraries, perform positive and negative selection screens and observe that the use of these rules produced improved results. Additionally, we profile the off-target activity of thousands of sgRNAs and develop a metric to predict off-target sites. We incorporate these findings from large-scale, empirical data to improve our computational design rules and create optimized sgRNA libraries that maximize on-target activity and minimize off-target effects to enable more effective and efficient genetic screens and genome engineering.
••
TL;DR: In this article, the authors used a Bayesian hierarchical model to estimate trends in diabetes prevalence, defined as fasting plasma glucose of 7.0 mmol/L or higher, or history of diagnosis with diabetes, or use of insulin or oral hypoglycaemic drugs in 200 countries and territories in 21 regions, by sex and from 1980 to 2014.
••
TL;DR: Improvements to imputation machinery are described that reduce computational requirements by more than an order of magnitude with no loss of accuracy in comparison to standard imputation tools.
Abstract: Christian Fuchsberger, Goncalo Abecasis and colleagues describe a new web-based imputation service that enables rapid imputation of large numbers of samples and allows convenient access to large reference panels of sequenced individuals. Their state space reduction provides a computationally efficient solution for genotype imputation with no loss in imputation accuracy.
••
University of Queensland1, University of Glasgow2, QIMR Berghofer Medical Research Institute3, Garvan Institute of Medical Research4, Baylor College of Medicine5, University of Utah6, South Australia Pathology7, University of Adelaide8, Harvard University9, Campbelltown Hospital10, St. Vincent's Health System11, University of New South Wales12, University of Newcastle13, Royal North Shore Hospital14, Royal Prince Alfred Hospital15, University of Sydney16, Fiona Stanley Hospital17, Royal Adelaide Hospital18, Princess Alexandra Hospital19, University of Western Australia20, Beatson West of Scotland Cancer Centre21, Southern General Hospital22, Dresden University of Technology23, University of Texas MD Anderson Cancer Center24, Memorial Sloan Kettering Cancer Center25, Johns Hopkins University School of Medicine26, University of Verona27, Mayo Clinic28, University of Melbourne29
TL;DR: Detailed genomic analysis of 456 pancreatic ductal adenocarcinomas identified 32 recurrently mutated genes that aggregate into 10 pathways: KRAS, TGF-β, WNT, NOTCH, ROBO/SLIT signalling, G1/S transition, SWI-SNF, chromatin modification, DNA repair and RNA processing.
Abstract: Integrated genomic analysis of 456 pancreatic ductal adenocarcinomas identified 32 recurrently mutated genes that aggregate into 10 pathways: KRAS, TGF-β, WNT, NOTCH, ROBO/SLIT signalling, G1/S transition, SWI-SNF, chromatin modification, DNA repair and RNA processing. Expression analysis defined 4 subtypes: (1) squamous; (2) pancreatic progenitor; (3) immunogenic; and (4) aberrantly differentiated endocrine exocrine (ADEX) that correlate with histopathological characteristics. Squamous tumours are enriched for TP53 and KDM6A mutations, upregulation of the TP63∆N transcriptional network, hypermethylation of pancreatic endodermal cell-fate determining genes and have a poor prognosis. Pancreatic progenitor tumours preferentially express genes involved in early pancreatic development (FOXA2/3, PDX1 and MNX1). ADEX tumours displayed upregulation of genes that regulate networks involved in KRAS activation, exocrine (NR5A2 and RBPJL), and endocrine differentiation (NEUROD1 and NKX2-2). Immunogenic tumours contained upregulated immune networks including pathways involved in acquired immune suppression. These data infer differences in the molecular evolution of pancreatic cancer subtypes and identify opportunities for therapeutic development.
••
TL;DR: The results firmly establish that metalenses can have widespread applications in laser-based microscopy, imaging, and spectroscopy, with image qualities comparable to a state-of-the-art commercial objective.
Abstract: Subwavelength resolution imaging requires high numerical aperture (NA) lenses, which are bulky and expensive. Metasurfaces allow the miniaturization of conventional refractive optics into planar structures. We show that high-aspect-ratio titanium dioxide metasurfaces can be fabricated and designed as metalenses with NA = 0.8. Diffraction-limited focusing is demonstrated at wavelengths of 405, 532, and 660 nm with corresponding efficiencies of 86, 73, and 66%. The metalenses can resolve nanoscale features separated by subwavelength distances and provide magnification as high as 170×, with image qualities comparable to a state-of-the-art commercial objective. Our results firmly establish that metalenses can have widespread applications in laser-based microscopy, imaging, and spectroscopy.
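As a quick sanity check on the quoted figures (our own arithmetic, not from the paper): for a lens of numerical aperture NA, the diffraction-limited focal spot size is approximately

\[
d \approx \frac{\lambda}{2\,\mathrm{NA}}
\]

so at \(\lambda = 405\) nm with NA = 0.8, \(d \approx 405/1.6 \approx 253\) nm, smaller than the wavelength, consistent with the subwavelength resolution the authors report.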
••
University of California, San Diego1, University of Montana2, Stanford University3, Scripps Institution of Oceanography4, National Autonomous University of Mexico5, Salk Institute for Biological Studies6, San Diego State University7, Strathclyde Institute of Pharmacy and Biomedical Sciences8, Lawrence Berkeley National Laboratory9, Harvard University10, University of Rennes11, University of Minnesota12, University of Lorraine13, Technical University of Denmark14, J. Craig Venter Institute15, University of California, Los Angeles16, University of Washington17, ETH Zurich18, University of Illinois at Chicago19, National Sun Yat-sen University20, Academia Sinica21, University of Münster22, Victoria University of Wellington23, University of North Carolina at Chapel Hill24, Indiana University25, Smithsonian Tropical Research Institute26, University of São Paulo27, Federal University of Mato Grosso do Sul28, University of Notre Dame29, University of California, Santa Cruz30, Oregon State University31, University of California, Berkeley32, Florida International University33, University of Hawaii at Manoa34, University of Geneva35, Institut de Chimie des Substances Naturelles36, Pacific Northwest National Laboratory37, National Institutes of Health38, Chinese Academy of Sciences39
TL;DR: In GNPS, crowdsourced curation of freely available community-wide reference MS libraries will underpin improved annotations, and data-driven social networking should facilitate identification of spectra and foster collaborations.
Abstract: The potential of the diverse chemistries present in natural products (NP) for biotechnology and medicine remains untapped because NP databases are not searchable with raw data and the NP community has no way to share data other than in published papers. Although mass spectrometry (MS) techniques are well-suited to high-throughput characterization of NP, there is a pressing need for an infrastructure to enable sharing and curation of data. We present Global Natural Products Social Molecular Networking (GNPS; http://gnps.ucsd.edu), an open-access knowledge base for community-wide organization and sharing of raw, processed, or identified tandem mass spectrometry (MS/MS) data. In GNPS, crowdsourced curation of freely available community-wide reference MS libraries will underpin improved annotations. Data-driven social networking should facilitate identification of spectra and foster collaborations. We also introduce the concept of 'living data' through continuous reanalysis of deposited data.
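Molecular networking of the kind GNPS builds on links MS/MS spectra by spectral similarity. A minimal sketch of the idea is a cosine score over peaks matched within an m/z tolerance; this is an illustrative simplification, not GNPS's actual (modified-cosine) implementation:

```python
# Minimal sketch of MS/MS spectral similarity scoring: greedily match
# peaks between two spectra within an m/z tolerance, then compute a
# cosine score. Illustrative only; GNPS uses a modified cosine that also
# allows peak shifts by the precursor mass difference.
import math

def cosine_score(spec_a, spec_b, tol=0.02):
    """spec_a, spec_b: lists of (mz, intensity) pairs, sorted by m/z."""
    matched = []
    used_b = set()
    for mz_a, int_a in spec_a:
        for j, (mz_b, int_b) in enumerate(spec_b):
            if j in used_b:
                continue
            if abs(mz_a - mz_b) <= tol:
                matched.append((int_a, int_b))
                used_b.add(j)
                break
    norm_a = math.sqrt(sum(i * i for _, i in spec_a))
    norm_b = math.sqrt(sum(i * i for _, i in spec_b))
    if norm_a == 0 or norm_b == 0:
        return 0.0
    return sum(a * b for a, b in matched) / (norm_a * norm_b)

# Identical spectra score ~1.0; spectra with no shared peaks score 0.
s1 = [(100.05, 1.0), (150.10, 0.5)]
print(cosine_score(s1, s1))
```

Spectra whose score exceeds a threshold are connected into a network, which is what lets unidentified spectra inherit candidate annotations from library matches to their neighbors.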
••
University of Nebraska Medical Center1, University of Connecticut2, Harvard University3, Queen's University4, University of California, San Diego5, Stony Brook University6, University of Michigan7, National Institutes of Health8, Johns Hopkins University9, University of Barcelona10, University at Buffalo11, Summa Health System12, University of Texas Health Science Center at San Antonio13, University of Queensland14, Royal Brisbane and Women's Hospital15, University of Western Australia16, University of Colorado Denver17, McMaster University18
TL;DR: These guidelines are intended for use by healthcare professionals who care for patients at risk for hospital-acquired pneumonia (HAP) and ventilator-associated pneumonia (VAP), including specialists in infectious diseases, pulmonary diseases, and critical care, as well as surgeons, anesthesiologists, hospitalists, and any other clinicians and healthcare providers caring for hospitalized patients with nosocomial pneumonia.
Abstract: It is important to realize that guidelines cannot always account for individual variation among patients. They are not intended to supplant physician judgment with respect to particular patients or special clinical situations. IDSA considers adherence to these guidelines to be voluntary, with the ultimate determination regarding their application to be made by the physician in the light of each patient's individual circumstances. These guidelines are intended for use by healthcare professionals who care for patients at risk for hospital-acquired pneumonia (HAP) and ventilator-associated pneumonia (VAP), including specialists in infectious diseases, pulmonary diseases, and critical care, as well as surgeons, anesthesiologists, hospitalists, and any other clinicians and healthcare providers caring for hospitalized patients with nosocomial pneumonia. The panel's recommendations for the diagnosis and treatment of HAP and VAP are based upon evidence derived from topic-specific systematic literature reviews.
••
TL;DR: A relationship between clonal neoantigen burden and overall survival in primary lung adenocarcinomas is demonstrated, along with the impact of neoantigen intratumor heterogeneity (ITH) on antitumor immunity.
Abstract: As tumors grow, they acquire mutations, some of which create neoantigens that influence the response of patients to immune checkpoint inhibitors. We explored the impact of neoantigen intratumor heterogeneity (ITH) on antitumor immunity. Through integrated analysis of ITH and neoantigen burden, we demonstrate a relationship between clonal neoantigen burden and overall survival in primary lung adenocarcinomas. CD8+ tumor-infiltrating lymphocytes reactive to clonal neoantigens were identified in early-stage non–small cell lung cancer and expressed high levels of PD-1. Sensitivity to PD-1 and CTLA-4 blockade in patients with advanced NSCLC and melanoma was enhanced in tumors enriched for clonal neoantigens. T cells recognizing clonal neoantigens were detectable in patients with durable clinical benefit. Cytotoxic chemotherapy–induced subclonal neoantigens, contributing to an increased mutational load, were enriched in certain poor responders. These data suggest that neoantigen heterogeneity may influence immune surveillance and support therapeutic developments targeting clonal neoantigens.
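The clonal/subclonal dichotomy central to this analysis can be sketched simply: a mutation is "clonal" if its estimated cancer cell fraction (CCF) is near 1 in every tumor region sampled, and "subclonal" otherwise. The 0.95 threshold and the mutation labels below are illustrative assumptions, not the paper's exact criteria or data:

```python
# Illustrative classification of neoantigen-bearing mutations as clonal
# vs. subclonal from per-region cancer cell fraction (CCF) estimates.
# The 0.95 threshold and the example mutations are made up for
# illustration; they are not the paper's criteria or data.

def classify_neoantigens(ccf_by_region, clonal_threshold=0.95):
    """ccf_by_region: dict mapping mutation id -> list of CCFs, one per region."""
    clonal, subclonal = [], []
    for mut, ccfs in ccf_by_region.items():
        # Clonal only if present at high CCF in *every* sampled region
        (clonal if min(ccfs) >= clonal_threshold else subclonal).append(mut)
    return clonal, subclonal

ccfs = {
    "mut_A": [1.0, 0.98, 0.99],   # high CCF in all regions -> clonal
    "mut_B": [0.96, 1.0, 0.97],   # also clonal
    "mut_C": [0.9, 0.1, 0.0],     # region-restricted -> subclonal
}
clonal, subclonal = classify_neoantigens(ccfs)
print(clonal, subclonal)
```

Under this framing, "clonal neoantigen burden" is simply the count of neoantigens in the clonal bucket, the quantity the abstract links to overall survival and checkpoint-blockade sensitivity.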
••
TL;DR: In this paper, the authors used the Wide Field Camera 3 (WFC3) on the Hubble Space Telescope (HST) to reduce the uncertainty in the local value of the Hubble constant from 3.3% to 2.4%.
Abstract: We use the Wide Field Camera 3 (WFC3) on the Hubble Space Telescope (HST) to reduce the uncertainty in the local value of the Hubble constant from 3.3% to 2.4%. The bulk of this improvement comes from new near-infrared (NIR) observations of Cepheid variables in 11 host galaxies of recent type Ia supernovae (SNe Ia), more than doubling the sample of reliable SNe Ia having a Cepheid-calibrated distance to a total of 19; these in turn leverage the magnitude-redshift relation based on ∼300 SNe Ia at z < 0.15. All 19 hosts as well as the megamaser system NGC 4258 have been observed with WFC3 in the optical and NIR, thus nullifying cross-instrument zeropoint errors in the relative distance estimates from Cepheids. Other noteworthy improvements include a 33% reduction in the systematic uncertainty in the maser distance to NGC 4258, a larger sample of Cepheids in the Large Magellanic Cloud (LMC), a more robust distance to the LMC based on late-type detached eclipsing binaries (DEBs), HST observations of Cepheids in M31, and new HST-based trigonometric parallaxes for Milky Way (MW) Cepheids. We consider four geometric distance calibrations of Cepheids: (i) megamasers in NGC 4258, (ii) 8 DEBs in the LMC, (iii) 15 MW Cepheids with parallaxes measured with HST/FGS, HST/WFC3 spatial scanning and/or Hipparcos, and (iv) 2 DEBs in M31. The Hubble constant from each is 72.25 ± 2.51, 72.04 ± 2.67, 76.18 ± 2.37, and 74.50 ± 3.27 km s⁻¹ Mpc⁻¹, respectively. Our best estimate of H₀ = 73.24 ± 1.74 km s⁻¹ Mpc⁻¹ combines the anchors NGC 4258, MW, and LMC, yielding a 2.4% determination (all quoted uncertainties include fully propagated statistical and systematic components).
This value is 3.4σ higher than the 66.93 ± 0.62 km s⁻¹ Mpc⁻¹ predicted by ΛCDM with 3 neutrino flavors having a mass of 0.06 eV and the new Planck data, but the discrepancy reduces to 2.1σ relative to the prediction of 69.3 ± 0.7 km s⁻¹ Mpc⁻¹ based on the comparably precise combination of WMAP+ACT+SPT+BAO observations, suggesting that systematic uncertainties in CMB radiation measurements may play a role in the tension. If we take the conflict between the Planck high-redshift measurements and our local determination of H₀ at face value, one plausible explanation could involve an additional source of dark radiation in the early universe in the range of ΔN_eff ≈ 0.4–1. We anticipate further significant improvements in H₀ from upcoming parallax measurements of long-period MW Cepheids.
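The quoted tensions are simple arithmetic on the stated values: for two independent measurements x₁ ± s₁ and x₂ ± s₂, the discrepancy in sigma is |x₁ − x₂| / √(s₁² + s₂²). A quick check reproduces both figures from the abstract:

```python
# Back-of-envelope check of the quoted H0 tensions: for two independent
# measurements x1 +/- s1 and x2 +/- s2, the discrepancy in units of the
# combined uncertainty is |x1 - x2| / sqrt(s1^2 + s2^2).
import math

def tension_sigma(x1, s1, x2, s2):
    """Discrepancy between two independent measurements, in sigma."""
    return abs(x1 - x2) / math.hypot(s1, s2)

# Local H0 vs. the Planck-based LCDM prediction (km/s/Mpc)
print(round(tension_sigma(73.24, 1.74, 66.93, 0.62), 1))  # -> 3.4
# Local H0 vs. the WMAP+ACT+SPT+BAO prediction
print(round(tension_sigma(73.24, 1.74, 69.3, 0.7), 1))    # -> 2.1
```

Both results match the abstract's 3.4σ and 2.1σ figures, which is why the choice of CMB data set matters so much to the size of the reported tension.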
••
TL;DR: The Prostate Imaging - Reporting and Data System Version 2 (PI-RADS™ v2) simplifies and standardizes terminology and content of reports, and provides assessment categories that summarize levels of suspicion or risk of clinically significant prostate cancer that can be used to assist selection of patients for biopsies and management.
••
TL;DR: The optimal simulation protocol for each program has been implemented in CHARMM-GUI and is expected to be applicable to the remainder of the additive C36 FF including the proteins, nucleic acids, carbohydrates, and small molecules.
Abstract: Proper treatment of nonbonded interactions is essential for the accuracy of molecular dynamics (MD) simulations, especially in studies of lipid bilayers. The use of the CHARMM36 force field (C36 FF) in different MD simulation programs can result in disagreements with published simulations performed with CHARMM due to differences in the protocols used to treat the long-range and 1-4 nonbonded interactions. In this study, we systematically test the use of the C36 lipid FF in NAMD, GROMACS, AMBER, OpenMM, and CHARMM/OpenMM. A wide range of Lennard-Jones (LJ) cutoff schemes and integrator algorithms were tested to find the optimal simulation protocol to best match bilayer properties of six lipids with varying acyl chain saturation and head groups. MD simulations of a 1,2-dipalmitoyl-sn-phosphatidylcholine (DPPC) bilayer were used to obtain the optimal protocol for each program. MD simulations with all programs were found to reasonably match the DPPC bilayer properties (surface area per lipid, chain order parameters, ...).
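One of the bilayer properties used in such cross-program comparisons, the surface area per lipid, is computed directly from the lateral box dimensions of the simulated bilayer. The box size and lipid count below are made-up numbers for illustration, not values from the study:

```python
# Sketch of computing surface area per lipid for a simulated bilayer:
# the lateral box area divided by the number of lipids per leaflet.
# Box dimensions and lipid count here are illustrative, not the study's.

def area_per_lipid(box_x_nm: float, box_y_nm: float, n_lipids: int) -> float:
    """Area per lipid in nm^2 for a symmetric bilayer
    (n_lipids total, half in each leaflet)."""
    return (box_x_nm * box_y_nm) / (n_lipids / 2)

# e.g. a hypothetical 72-lipid DPPC bilayer in a 4.8 x 4.8 nm lateral box
print(area_per_lipid(4.8, 4.8, 72))  # ~0.64 nm^2 per lipid
```

Because this quantity depends sensitively on how the LJ cutoff and long-range corrections are handled, it is a natural first check when porting the C36 FF between simulation programs.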