Showing papers by "University of Washington published in 2019"
••
University of Jyväskylä1, University of California, Los Angeles2, California Polytechnic State University3, Los Alamos National Laboratory4, National Research University – Higher School of Economics5, University of California, Berkeley6, University of Birmingham7, Australian Nuclear Science and Technology Organisation8, University of Washington9, University of Massachusetts Amherst10, University of West Bohemia11, University of Texas at Austin12, Brigham Young University13, Universidade Federal de Minas Gerais14, Google15
TL;DR: SciPy as discussed by the authors is an open source scientific computing library for the Python programming language, which includes functionality spanning clustering, Fourier transforms, integration, interpolation, file I/O, linear algebra, image processing, orthogonal distance regression, minimization algorithms, signal processing, sparse matrix handling, computational geometry, and statistics.
Abstract: SciPy is an open source scientific computing library for the Python programming language. SciPy 1.0 was released in late 2017, about 16 years after the original version 0.1 release. SciPy has become a de facto standard for leveraging scientific algorithms in the Python programming language, with more than 600 unique code contributors, thousands of dependent packages, over 100,000 dependent repositories, and millions of downloads per year. This includes usage of SciPy in almost half of all machine learning projects on GitHub, and usage by high profile projects including LIGO gravitational wave analysis and creation of the first-ever image of a black hole (M87). The library includes functionality spanning clustering, Fourier transforms, integration, interpolation, file I/O, linear algebra, image processing, orthogonal distance regression, minimization algorithms, signal processing, sparse matrix handling, computational geometry, and statistics. In this work, we provide an overview of the capabilities and development practices of the SciPy library and highlight some recent technical developments.
12,774 citations
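A couple of the capabilities listed in the SciPy abstract above (minimization, interpolation) can be exercised in a few lines; a minimal sketch, assuming SciPy and NumPy are installed:

```python
import numpy as np
from scipy import interpolate, optimize

# Minimization: find the minimum of a quadratic bowl centered at (2, -1).
res = optimize.minimize(lambda x: (x[0] - 2.0) ** 2 + (x[1] + 1.0) ** 2,
                        x0=[0.0, 0.0])

# Interpolation: fit a cubic spline through samples of sin(x) and
# evaluate it between the sample points.
xs = np.linspace(0.0, np.pi, 20)
spline = interpolate.interp1d(xs, np.sin(xs), kind="cubic")
mid = float(spline(np.pi / 2))  # close to sin(pi/2) = 1
```

The same import pattern extends to the other subpackages named in the abstract (`scipy.fft`, `scipy.sparse`, `scipy.stats`, and so on).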
•
TL;DR: PyTorch as discussed by the authors is a machine learning library that provides an imperative and Pythonic programming style that makes debugging easy and is consistent with other popular scientific computing libraries, while remaining efficient and supporting hardware accelerators such as GPUs.
Abstract: Deep learning frameworks have often focused on either usability or speed, but not both. PyTorch is a machine learning library that shows that these two goals are in fact compatible: it provides an imperative and Pythonic programming style that supports code as a model, makes debugging easy and is consistent with other popular scientific computing libraries, while remaining efficient and supporting hardware accelerators such as GPUs.
In this paper, we detail the principles that drove the implementation of PyTorch and how they are reflected in its architecture. We emphasize that every aspect of PyTorch is a regular Python program under the full control of its user. We also explain how the careful and pragmatic implementation of the key components of its runtime enables them to work together to achieve compelling performance.
We demonstrate the efficiency of individual subsystems, as well as the overall speed of PyTorch on several common benchmarks.
12,767 citations
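The imperative, define-by-run style the PyTorch abstract describes can be seen in a few lines; a minimal sketch, assuming PyTorch is installed:

```python
import torch

# Eager execution: each line runs immediately, so ordinary Python control
# flow (here a plain `if`) can branch on concrete tensor values, and
# autograd records the operations as they execute.
x = torch.tensor(3.0, requires_grad=True)
y = x ** 2
if y > 4:        # evaluated on a real value, not a symbolic graph node
    y = y + 1.0
y.backward()     # d(x^2 + 1)/dx = 2x
```

Because the program is just Python, it can be stepped through with `pdb` or inspected with `print`, which is the debugging convenience the abstract highlights.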
•
01 Jan 2019
TL;DR: This paper details the principles that drove the implementation of PyTorch and how they are reflected in its architecture, and explains how the careful and pragmatic implementation of the key components of its runtime enables them to work together to achieve compelling performance.
Abstract: Deep learning frameworks have often focused on either usability or speed, but not both. PyTorch is a machine learning library that shows that these two goals are in fact compatible: it was designed from first principles to support an imperative and Pythonic programming style that supports code as a model, makes debugging easy and is consistent with other popular scientific computing libraries, while remaining efficient and supporting hardware accelerators such as GPUs. In this paper, we detail the principles that drove the implementation of PyTorch and how they are reflected in its architecture. We emphasize that every aspect of PyTorch is a regular Python program under the full control of its user. We also explain how the careful and pragmatic implementation of the key components of its runtime enables them to work together to achieve compelling performance. We demonstrate the efficiency of individual subsystems, as well as the overall speed of PyTorch on several commonly used benchmarks.
10,045 citations
••
Northern Arizona University1, National Institutes of Health2, University of Minnesota3, University of California, Davis4, Woods Hole Oceanographic Institution5, Massachusetts Institute of Technology6, University of Copenhagen7, University of Trento8, Chinese Academy of Sciences9, University of California, San Francisco10, University of Pennsylvania11, Pacific Northwest National Laboratory12, North Carolina State University13, University of California, San Diego14, Institute for Systems Biology15, Dalhousie University16, University of British Columbia17, Statens Serum Institut18, Anschutz Medical Campus19, University of Washington20, Michigan State University21, Stanford University22, Harvard University23, Broad Institute24, Australian National University25, University of Düsseldorf26, University of New South Wales27, Sookmyung Women's University28, San Diego State University29, Howard Hughes Medical Institute30, Cornell University31, Max Planck Society32, Colorado State University33, Google34, Syracuse University35, Webster University36, United States Department of Agriculture37, University of Arkansas for Medical Sciences38, Colorado School of Mines39, University of Southern Mississippi40, National Oceanic and Atmospheric Administration41, University of California, Merced42, Wageningen University and Research Centre43, University of Arizona44, Environment Agency45, University of Florida46, Merck & Co.47
TL;DR: QIIME 2 development was primarily funded by NSF Awards 1565100 to J.G.C. and 1565057 to R.K.; partial support was also provided by grants NIH U54CA143925 and U54MD012388.
Abstract: QIIME 2 development was primarily funded by NSF Awards 1565100 to J.G.C. and 1565057 to R.K. Partial support was also provided by the following: grants NIH U54CA143925 (J.G.C. and T.P.) and U54MD012388 (J.G.C. and T.P.); grants from the Alfred P. Sloan Foundation (J.G.C. and R.K.); ERC-STG project MetaPG (N.S.); the Strategic Priority Research Program of the Chinese Academy of Sciences QYZDB-SSW-SMC021 (Y.B.); the Australian National Health and Medical Research Council APP1085372 (G.A.H., J.G.C., Von Bing Yap and R.K.); the Natural Sciences and Engineering Research Council (NSERC) to D.L.G.; and the State of Arizona Technology and Research Initiative Fund (TRIF), administered by the Arizona Board of Regents, through Northern Arizona University. All NCI coauthors were supported by the Intramural Research Program of the National Cancer Institute. S.M.G. and C. Diener were supported by the Washington Research Foundation Distinguished Investigator Award.
8,821 citations
••
Scott & White Hospital1, Columbia University2, Georgetown University3, Rutgers University4, Cleveland Clinic5, Laval University6, University of British Columbia7, New York University8, University of Washington9, Emory University10, Lankenau Institute for Medical Research11, University of Pennsylvania12, Morristown Medical Center13, University of London14
TL;DR: Among patients with severe aortic stenosis who were at low surgical risk, the rate of the composite of death, stroke, or rehospitalization at 1 year was significantly lower with TAVR than with surgery.
Abstract: Background Among patients with aortic stenosis who are at intermediate or high risk for death with surgery, major outcomes are similar with transcatheter aortic-valve replacement (TAVR) an...
2,917 citations
••
15 Jun 2019
TL;DR: DeepSDF as mentioned in this paper represents a shape's surface by a continuous volumetric field: the magnitude of a point in the field represents the distance to the surface boundary and the sign indicates whether the region is inside (-) or outside (+) of the shape.
Abstract: Computer graphics, 3D computer vision and robotics communities have produced multiple approaches to representing 3D geometry for rendering and reconstruction. These provide trade-offs across fidelity, efficiency and compression capabilities. In this work, we introduce DeepSDF, a learned continuous Signed Distance Function (SDF) representation of a class of shapes that enables high quality shape representation, interpolation and completion from partial and noisy 3D input data. DeepSDF, like its classical counterpart, represents a shape's surface by a continuous volumetric field: the magnitude of a point in the field represents the distance to the surface boundary and the sign indicates whether the region is inside (-) or outside (+) of the shape. Our representation therefore implicitly encodes a shape's boundary as the zero-level-set of the learned function while explicitly representing the classification of space as being part of the shape's interior or not. While classical SDFs, in either analytical or discretized voxel form, typically represent the surface of a single shape, DeepSDF can represent an entire class of shapes. Furthermore, we show state-of-the-art performance for learned 3D shape representation and completion while reducing the model size by an order of magnitude compared with previous work.
2,247 citations
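The signed-distance convention the DeepSDF abstract describes (magnitude = distance to the surface, sign = inside/outside, surface = zero-level-set) is easy to see for a classical analytic shape; a toy NumPy sketch of an analytic SDF, not the learned DeepSDF model:

```python
import numpy as np

def sphere_sdf(points, center, radius):
    """Signed distance to a sphere: negative inside, positive outside,
    zero exactly on the surface (the zero-level-set)."""
    return np.linalg.norm(points - center, axis=-1) - radius

pts = np.array([[0.0, 0.0, 0.0],    # center of the sphere -> inside
                [2.0, 0.0, 0.0],    # outside
                [1.0, 0.0, 0.0]])   # on the surface
sd = sphere_sdf(pts, center=np.zeros(3), radius=1.0)
```

DeepSDF replaces such analytic functions with a network that maps (latent code, 3D point) to a signed distance, with one latent code per shape, so a single model covers a whole shape class.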
••
TL;DR: The latest updates to CADD are reviewed, including the most recent version, 1.4, which supports the human genome build GRCh38, and also present updates to the website that include simplified variant lookup, extended documentation, an Application Program Interface and improved mechanisms for integrating CADD scores into other tools or applications.
Abstract: Combined Annotation-Dependent Depletion (CADD) is a widely used measure of variant deleteriousness that can effectively prioritize causal variants in genetic analyses, particularly highly penetrant contributors to severe Mendelian disorders. CADD is an integrative annotation built from more than 60 genomic features, and can score human single nucleotide variants and short insertion and deletions anywhere in the reference assembly. CADD uses a machine learning model trained on a binary distinction between simulated de novo variants and variants that have arisen and become fixed in human populations since the split between humans and chimpanzees; the former are free of selective pressure and may thus include both neutral and deleterious alleles, while the latter are overwhelmingly neutral (or, at most, weakly deleterious) by virtue of having survived millions of years of purifying selection. Here we review the latest updates to CADD, including the most recent version, 1.4, which supports the human genome build GRCh38. We also present updates to our website that include simplified variant lookup, extended documentation, an Application Program Interface and improved mechanisms for integrating CADD scores into other tools or applications. CADD scores, software and documentation are available at https://cadd.gs.washington.edu.
2,091 citations
••
TL;DR: A cell atlas of mouse organogenesis provides a global view of developmental processes occurring during this critical period, including focused analyses of the apical ectodermal ridge, limb mesenchyme and skeletal muscle.
Abstract: Mammalian organogenesis is a remarkable process. Within a short timeframe, the cells of the three germ layers transform into an embryo that includes most of the major internal and external organs. Here we investigate the transcriptional dynamics of mouse organogenesis at single-cell resolution. Using single-cell combinatorial indexing, we profiled the transcriptomes of around 2 million cells derived from 61 embryos staged between 9.5 and 13.5 days of gestation, in a single experiment. The resulting ‘mouse organogenesis cell atlas’ (MOCA) provides a global view of developmental processes during this critical window. We use Monocle 3 to identify hundreds of cell types and 56 trajectories, many of which are detected only because of the depth of cellular coverage, and collectively define thousands of corresponding marker genes. We explore the dynamics of gene expression within cell types and trajectories over time, including focused analyses of the apical ectodermal ridge, limb mesenchyme and skeletal muscle. Data from single-cell combinatorial-indexing RNA-sequencing analysis of 2 million cells from mouse embryos between embryonic days 9.5 and 13.5 are compiled in a cell atlas of mouse organogenesis, which provides a global view of developmental processes occurring during this critical period.
1,865 citations
••
01 Nov 2019
TL;DR: SciBERT leverages unsupervised pretraining on a large multi-domain corpus of scientific publications to improve performance on downstream scientific NLP tasks and demonstrates statistically significant improvements over BERT.
Abstract: Obtaining large-scale annotated data for NLP tasks in the scientific domain is challenging and expensive. We release SciBERT, a pretrained language model based on BERT (Devlin et. al., 2018) to address the lack of high-quality, large-scale labeled scientific data. SciBERT leverages unsupervised pretraining on a large multi-domain corpus of scientific publications to improve performance on downstream scientific NLP tasks. We evaluate on a suite of tasks including sequence tagging, sentence classification and dependency parsing, with datasets from a variety of scientific domains. We demonstrate statistically significant improvements over BERT and achieve new state-of-the-art results on several of these tasks. The code and pretrained models are available at https://github.com/allenai/scibert/.
1,864 citations
••
Ljubljana University Medical Centre1, King's College London2, Vita-Salute San Raffaele University3, Stanford University4, American Diabetes Association5, University of Padua6, Harvard University7, University of Amsterdam8, University of Sydney9, University of Colorado Denver10, University of Sheffield11, University of Washington12, University of Cambridge13, Shanghai Jiao Tong University14, University of Virginia15, JDRF16, Katholieke Universiteit Leuven17, University of East Anglia18, San Antonio River Authority19, Steno Diabetes Center20, University of Montpellier21, University of Florida22, Nihon University23, Yale University24, Tel Aviv University25
TL;DR: This article summarizes the ATTD consensus recommendations for relevant aspects of CGM data utilization and reporting among the various diabetes populations.
Abstract: Improvements in sensor accuracy, greater convenience and ease of use, and expanding reimbursement have led to growing adoption of continuous glucose monitoring (CGM). However, successful utilization of CGM technology in routine clinical practice remains relatively low. This may be due in part to the lack of clear and agreed-upon glycemic targets that both diabetes teams and people with diabetes can work toward. Although unified recommendations for use of key CGM metrics have been established in three separate peer-reviewed articles, formal adoption by diabetes professional organizations and guidance in the practical application of these metrics in clinical practice have been lacking. In February 2019, the Advanced Technologies & Treatments for Diabetes (ATTD) Congress convened an international panel of physicians, researchers, and individuals with diabetes who are expert in CGM technologies to address this issue. This article summarizes the ATTD consensus recommendations for relevant aspects of CGM data utilization and reporting among the various diabetes populations.
1,776 citations
••
TL;DR: Liu et al. as mentioned in this paper discuss crucial conditions needed to achieve a specific energy higher than 350 Wh kg−1, up to 500 Wh kg −1, for rechargeable Li metal batteries using high-nickel-content lithium nickel manganese cobalt oxides as cathode materials.
Abstract: State-of-the-art lithium (Li)-ion batteries are approaching their specific energy limits yet are challenged by the ever-increasing demand of today’s energy storage and power applications, especially for electric vehicles. Li metal is considered an ultimate anode material for future high-energy rechargeable batteries when combined with existing or emerging high-capacity cathode materials. However, much current research focuses on the battery materials level, and there have been very few accounts of cell design principles. Here we discuss crucial conditions needed to achieve a specific energy higher than 350 Wh kg−1, up to 500 Wh kg−1, for rechargeable Li metal batteries using high-nickel-content lithium nickel manganese cobalt oxides as cathode materials. We also provide an analysis of key factors such as cathode loading, electrolyte amount and Li foil thickness that impact the cell-level cycle life. Furthermore, we identify several important strategies to reduce electrolyte-Li reaction, protect Li surfaces and stabilize anode architectures for long-cycling high-specific-energy cells. Jun Liu and Battery500 Consortium colleagues contemplate the way forward towards high-energy and long-cycling practical batteries.
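The cell-level arithmetic behind targets like 350-500 Wh kg−1 is simply total energy over total cell mass; a toy calculation with illustrative, hypothetical numbers (not figures from the paper):

```python
# Hypothetical pouch-cell numbers, for illustration only.
cathode_capacity_Ah = 4.0      # total cathode capacity
avg_cell_voltage_V = 3.8       # average discharge voltage
cell_mass_kg = 0.038           # cathode + Li foil + electrolyte + packaging

energy_Wh = cathode_capacity_Ah * avg_cell_voltage_V   # 15.2 Wh
specific_energy = energy_Wh / cell_mass_kg             # Wh per kg

# The design levers the abstract lists all act on this ratio: higher
# cathode loading raises the numerator; leaner electrolyte and thinner
# Li foil shrink the denominator.
```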
••
Harvard University1, University of Western Australia2, University of Texas Health Science Center at San Antonio3, McMaster University4, University of Washington5, Centers for Disease Control and Prevention6, Primary Children's Hospital7, University of Utah8, University of Pittsburgh9, University of Michigan10, Vanderbilt University11, University of Connecticut12, United States Department of Veterans Affairs13, Baylor College of Medicine14
TL;DR: Although some recommendations remain unchanged from the 2007 guideline, the availability of results from new therapeutic trials and epidemiological investigations led to revised recommendations for empiric treatment strategies and additional management decisions.
Abstract: Background: This document provides evidence-based clinical practice guidelines on the management of adult patients with community-acquired pneumonia.Methods: A multidisciplinary panel conducted pra...
•
TL;DR: This paper showed that decoding strategies alone can dramatically affect the quality of machine text, even when generated from exactly the same neural language model, and proposed Nucleus Sampling, a simple but effective method to draw the best out of neural generation.
Abstract: Despite considerable advancements with deep neural language models, the enigma of neural text degeneration persists when these models are tested as text generators. The counter-intuitive empirical observation is that even though the use of likelihood as training objective leads to high quality models for a broad range of language understanding tasks, using likelihood as a decoding objective leads to text that is bland and strangely repetitive.
In this paper, we reveal surprising distributional differences between human text and machine text. In addition, we find that decoding strategies alone can dramatically affect the quality of machine text, even when generated from exactly the same neural language model. Our findings motivate Nucleus Sampling, a simple but effective method to draw the best out of neural generation. By sampling text from the dynamic nucleus of the probability distribution, which allows for diversity while effectively truncating the less reliable tail of the distribution, the resulting text better demonstrates the quality of human text, yielding enhanced diversity without sacrificing fluency and coherence.
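The decoding rule the abstract describes — sample only from the smallest set of tokens whose cumulative probability exceeds a threshold p, truncating the unreliable tail — can be sketched directly; a minimal NumPy version, not the authors' implementation:

```python
import numpy as np

def nucleus_sample(probs, p, rng):
    """Top-p (nucleus) sampling: keep the smallest set of highest-probability
    tokens whose cumulative mass exceeds p, renormalize, and sample from it."""
    order = np.argsort(probs)[::-1]             # token ids, most probable first
    cum = np.cumsum(probs[order])
    cutoff = int(np.searchsorted(cum, p)) + 1   # size of the nucleus
    nucleus = order[:cutoff]
    renorm = probs[nucleus] / probs[nucleus].sum()
    return int(rng.choice(nucleus, p=renorm))

rng = np.random.default_rng(0)
probs = np.array([0.5, 0.3, 0.1, 0.05, 0.05])
samples = {nucleus_sample(probs, p=0.9, rng=rng) for _ in range(200)}
# Tail tokens 3 and 4 fall outside the 0.9 nucleus and are never drawn.
```

Because the nucleus is recomputed from each step's distribution, the number of candidate tokens adapts to the model's confidence, which is the "dynamic" aspect the abstract emphasizes.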
••
Carleton University1, Michigan State University2, University of Saskatchewan3, University of California, Santa Barbara4, Federation University Australia5, University of Colorado Boulder6, McMaster University7, Mount Allison University8, University of Washington9, Cardiff University10, Queen's University11, Leibniz Association12, University of Hong Kong13
TL;DR: Efforts to reverse global trends in freshwater degradation now depend on bridging an immense gap between the aspirations of conservation biologists and the accelerating rate of species endangerment.
Abstract: In the 12 years since Dudgeon et al. (2006) reviewed major pressures on freshwater ecosystems, the biodiversity crisis in the world’s lakes, reservoirs, rivers, streams and wetlands has deepened. While lakes, reservoirs and rivers cover only 2.3% of the Earth’s surface, these ecosystems host at least 9.5% of the Earth’s described animal species. Furthermore, using the World Wide Fund for Nature’s Living Planet Index, freshwater population declines (83% between 1970 and 2014) continue to outpace contemporaneous declines in marine or terrestrial systems. The Anthropocene has brought multiple new and varied threats that disproportionately impact freshwater systems. We document 12 emerging threats to freshwater biodiversity that are either entirely new since 2006 or have since intensified: (i) changing climates; (ii) e-commerce and invasions; (iii) infectious diseases; (iv) harmful algal blooms; (v) expanding hydropower; (vi) emerging contaminants; (vii) engineered nanomaterials; (viii) microplastic pollution; (ix) light and noise; (x) freshwater salinisation; (xi) declining calcium; and (xii) cumulative stressors. Effects are evidenced for amphibians, fishes, invertebrates, microbes, plants, turtles and waterbirds, with potential for ecosystem-level changes through bottom-up and top-down processes. In our highly uncertain future, the net effects of these threats raise serious concerns for freshwater ecosystems. However, we also highlight opportunities for conservation gains as a result of novel management tools (e.g. environmental flows, environmental DNA) and specific conservation-oriented actions (e.g. dam removal, habitat protection policies, managed relocation of species) that have been met with varying levels of success. Moving forward, we advocate hybrid approaches that manage fresh waters as crucial ecosystems for human life support as well as essential hotspots of biodiversity and ecological function. Efforts to reverse global trends in freshwater degradation now depend on bridging an immense gap between the aspirations of conservation biologists and the accelerating rate of species endangerment.
••
TL;DR: It is found that malignant cells in glioblastoma exist in four main cellular states that recapitulate distinct neural cell types, are influenced by the tumor microenvironment, and exhibit plasticity.
••
University of Texas MD Anderson Cancer Center1, Johns Hopkins University2, City of Hope National Medical Center3, University of California, Davis4, Emory University5, Fred Hutchinson Cancer Research Center6, University of Washington7, Northwestern University8, Monash University9, AbbVie10, Genentech11, University of Colorado Denver12, Harvard University13
TL;DR: The novel combination of venetoclax with decitabine or azacitidine was effective and well tolerated in elderly patients with AML and achieved complete remission (CR) + CR with incomplete count recovery (CRi).
••
TL;DR: Using an improved human mutation rate model, human protein-coding genes are classified along a spectrum representing intolerance to inactivation; this classification is validated using data from model organisms and engineered human cells and shown to improve gene discovery power for both common and rare diseases.
Abstract: Summary Genetic variants that inactivate protein-coding genes are a powerful source of information about the phenotypic consequences of gene disruption: genes critical for an organism’s function will be depleted for such variants in natural populations, while non-essential genes will tolerate their accumulation. However, predicted loss-of-function (pLoF) variants are enriched for annotation errors, and tend to be found at extremely low frequencies, so their analysis requires careful variant annotation and very large sample sizes. Here, we describe the aggregation of 125,748 exomes and 15,708 genomes from human sequencing studies into the Genome Aggregation Database (gnomAD). We identify 443,769 high-confidence pLoF variants in this cohort after filtering for sequencing and annotation artifacts. Using an improved model of human mutation, we classify human protein-coding genes along a spectrum representing intolerance to inactivation, validate this classification using data from model organisms and engineered human cells, and show that it can be used to improve gene discovery power for both common and rare diseases.
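The constraint logic the abstract describes — genes critical for function are depleted of pLoF variants relative to a neutral mutational expectation — reduces to an observed/expected ratio; a toy illustration with hypothetical gene names and counts, not gnomAD's actual model:

```python
# Hypothetical counts for illustration: (observed pLoF variants,
# expected pLoF variants under a neutral mutation model).
genes = {
    "GENE_A": (2, 40.0),   # strongly depleted -> likely intolerant to inactivation
    "GENE_B": (35, 38.0),  # near expectation -> tolerant of inactivation
}

oe_ratio = {gene: obs / exp for gene, (obs, exp) in genes.items()}
# Rank genes from most to least constrained and flag the depleted ones
# (0.35 is an arbitrary cutoff for this sketch).
intolerant = [g for g, r in sorted(oe_ratio.items(), key=lambda kv: kv[1])
              if r < 0.35]
```

As the abstract notes, the real analysis only works at this scale after careful filtering of annotation artifacts, since raw pLoF calls are error-enriched.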
••
TL;DR: An overview of machine learning for fluid mechanics can be found in this article, where the strengths and limitations of these methods are addressed from the perspective of scientific inquiry that considers data as an inherent part of modeling, experimentation, and simulation.
Abstract: The field of fluid mechanics is rapidly advancing, driven by unprecedented volumes of data from field measurements, experiments and large-scale simulations at multiple spatiotemporal scales. Machine learning offers a wealth of techniques to extract information from data that could be translated into knowledge about the underlying fluid mechanics. Moreover, machine learning algorithms can augment domain knowledge and automate tasks related to flow control and optimization. This article presents an overview of past history, current developments, and emerging opportunities of machine learning for fluid mechanics. It outlines fundamental machine learning methodologies and discusses their uses for understanding, modeling, optimizing, and controlling fluid flows. The strengths and limitations of these methods are addressed from the perspective of scientific inquiry that considers data as an inherent part of modeling, experimentation, and simulation. Machine learning provides a powerful information processing framework that can enrich, and possibly even transform, current lines of fluid mechanics research and industrial applications.
••
McGill University1, WWF-India2, University of Basel3, WWF-Canada4, The Nature Conservancy5, University of Nevada, Reno6, Delft University of Technology7, Konstanz University of Applied Sciences8, King's College London9, Umeå University10, Swedish University of Agricultural Sciences11, University of Washington12, Harvard University13, University of Wisconsin-Madison14, Conservation International15, Michigan Technological University16, Stanford University17, Free University of Berlin18, Leibniz Association19, University of Tübingen20
TL;DR: A comprehensive assessment of the world’s rivers and their connectivity shows that only 37 per cent of rivers longer than 1,000 kilometres remain free-flowing over their entire length.
Abstract: Free-flowing rivers (FFRs) support diverse, complex and dynamic ecosystems globally, providing important societal and economic services. Infrastructure development threatens the ecosystem processes, biodiversity and services that these rivers support. Here we assess the connectivity status of 12 million kilometres of rivers globally and identify those that remain free-flowing in their entire length. Only 37 per cent of rivers longer than 1,000 kilometres remain free-flowing over their entire length and 23 per cent flow uninterrupted to the ocean. Very long FFRs are largely restricted to remote regions of the Arctic and of the Amazon and Congo basins. In densely populated areas only few very long rivers remain free-flowing, such as the Irrawaddy and Salween. Dams and reservoirs and their up- and downstream propagation of fragmentation and flow regulation are the leading contributors to the loss of river connectivity. By applying a new method to quantify riverine connectivity and map FFRs, we provide a foundation for concerted global and national strategies to maintain or restore them. A comprehensive assessment of the world’s rivers and their connectivity shows that only 37 per cent of rivers longer than 1,000 kilometres remain free-flowing over their entire length.
••
TL;DR: RNA-sequencing analysis of cells in the human cortex enabled identification of diverse cell types, revealing well-conserved architecture and homologous cell types as well as extensive differences when compared with datasets covering the analogous region of the mouse brain.
Abstract: Elucidating the cellular architecture of the human cerebral cortex is central to understanding our cognitive abilities and susceptibility to disease. Here we used single-nucleus RNA-sequencing analysis to perform a comprehensive study of cell types in the middle temporal gyrus of human cortex. We identified a highly diverse set of excitatory and inhibitory neuron types that are mostly sparse, with excitatory types being less layer-restricted than expected. Comparison to similar mouse cortex single-cell RNA-sequencing datasets revealed a surprisingly well-conserved cellular architecture that enables matching of homologous types and predictions of properties of human cell types. Despite this general conservation, we also found extensive differences between homologous human and mouse cell types, including marked alterations in proportions, laminar distributions, gene expression and morphology. These species-specific features emphasize the importance of directly studying human brain.
••
University of Washington1, California Institute of Technology2, Stockholm University3, University of Maryland, College Park4, Humboldt University of Berlin5, Goddard Space Flight Center6, National Central University7, Weizmann Institute of Science8, Macau University of Science and Technology9, Tel Aviv University10, University of California, Santa Barbara11, University of Michigan12, Northwestern University13, Adler Planetarium14, University of California, Berkeley15, Lawrence Berkeley National Laboratory16, Soka University of America17, Centre national de la recherche scientifique18, Radboud University Nijmegen19, University of Wisconsin–Milwaukee20, Los Alamos National Laboratory21
TL;DR: The Zwicky Transient Facility (ZTF) as mentioned in this paper is a new optical time-domain survey that uses the Palomar 48 inch Schmidt telescope, which provides a 47 deg^2 field of view and 8 s readout time, yielding more than an order of magnitude improvement in survey speed relative to its predecessor survey.
Abstract: The Zwicky Transient Facility (ZTF) is a new optical time-domain survey that uses the Palomar 48 inch Schmidt telescope. A custom-built wide-field camera provides a 47 deg^2 field of view and 8 s readout time, yielding more than an order of magnitude improvement in survey speed relative to its predecessor survey, the Palomar Transient Factory. We describe the design and implementation of the camera and observing system. The ZTF data system at the Infrared Processing and Analysis Center provides near-real-time reduction to identify moving and varying objects. We outline the analysis pipelines, data products, and associated archive. Finally, we present on-sky performance analysis and first scientific results from commissioning and the early survey. ZTF's public alert stream will serve as a useful precursor for that of the Large Synoptic Survey Telescope.
•
TL;DR: BART as mentioned in this paper is a denoising autoencoder for pretraining sequence-to-sequence models, which is trained by corrupting text with an arbitrary noising function, and then learning a model to reconstruct the original text.
Abstract: We present BART, a denoising autoencoder for pretraining sequence-to-sequence models. BART is trained by (1) corrupting text with an arbitrary noising function, and (2) learning a model to reconstruct the original text. It uses a standard Transformer-based neural machine translation architecture which, despite its simplicity, can be seen as generalizing BERT (due to the bidirectional encoder), GPT (with the left-to-right decoder), and many other more recent pretraining schemes. We evaluate a number of noising approaches, finding the best performance by both randomly shuffling the order of the original sentences and using a novel in-filling scheme, where spans of text are replaced with a single mask token. BART is particularly effective when fine-tuned for text generation but also works well for comprehension tasks. It matches the performance of RoBERTa with comparable training resources on GLUE and SQuAD, achieves new state-of-the-art results on a range of abstractive dialogue, question answering, and summarization tasks, with gains of up to 6 ROUGE. BART also provides a 1.1 BLEU increase over a back-translation system for machine translation, with only target language pretraining. We also report ablation experiments that replicate other pretraining schemes within the BART framework, to better measure which factors most influence end-task performance.
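The in-filling noising scheme the abstract credits with the best performance — replace a contiguous span of tokens with a single mask token and train the model to reconstruct the original — can be sketched in plain Python; a toy corruption function, not BART's implementation:

```python
import random

def text_infill(tokens, span_len, mask_token="<mask>", seed=0):
    """Replace one contiguous span of `span_len` tokens with a single mask
    token. Reconstructing the original forces the model to predict both
    the span's content and its length."""
    rng = random.Random(seed)
    start = rng.randrange(len(tokens) - span_len + 1)
    return tokens[:start] + [mask_token] + tokens[start + span_len:]

original = "the quick brown fox jumps over the lazy dog".split()
corrupted = text_infill(original, span_len=3)  # 9 tokens -> 7, one <mask>
```

BART combines this with sentence-order shuffling; the sketch shows only the in-filling half.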
••
01 Aug 2019
TL;DR: The results of new trials continue to help us understand the role of these novel agents and which patients are more likely to benefit; ICIs are now part of the first-line NSCLC treatment armamentarium as monotherapy, combined with chemotherapy, or after definitive chemoradiotherapy in patients with stage III unresectable NSCLC.
Abstract: Lung cancer remains the leading cause of cancer deaths in the United States. In the past decade, significant advances have been made in the science of non-small cell lung cancer (NSCLC). Screening has been introduced with the goal of early detection. The National Lung Screening Trial found a lung cancer mortality benefit of 20% and a 6.7% decrease in all-cause mortality with the use of low-dose chest computed tomography in high-risk individuals. The treatment of lung cancer has also evolved with the introduction of several lines of tyrosine kinase inhibitors in patients with EGFR, ALK, ROS1, and NTRK mutations. Similarly, immune checkpoint inhibitors (ICIs) have dramatically changed the landscape of NSCLC treatment. Furthermore, the results of new trials continue to help us understand the role of these novel agents and which patients are more likely to benefit; ICIs are now part of the first-line NSCLC treatment armamentarium as monotherapy, combined with chemotherapy, or after definitive chemoradiotherapy in patients with stage III unresectable NSCLC. Expression of programmed cell death protein-ligand 1 in malignant cells has been studied as a potential biomarker for response to ICIs. However, important drawbacks exist that limit its discriminatory potential. Identification of accurate predictive biomarkers beyond programmed cell death protein-ligand 1 expression remains essential to select the most appropriate candidates for ICI therapy. Many questions remain unanswered regarding the proper sequence and combinations of these new agents; however, the field is moving rapidly, and the overall direction is optimistic.
••
TL;DR: The Large Synoptic Survey Telescope (LSST) as discussed by the authors is a large, wide-field ground-based system designed to obtain repeated images covering the sky visible from Cerro Pachón in northern Chile.
Abstract: We describe here the most ambitious survey currently planned in the optical, the Large Synoptic Survey Telescope (LSST). The LSST design is driven by four main science themes: probing dark energy and dark matter, taking an inventory of the solar system, exploring the transient optical sky, and mapping the Milky Way. LSST will be a large, wide-field ground-based system designed to obtain repeated images covering the sky visible from Cerro Pachón in northern Chile. The telescope will have an 8.4 m (6.5 m effective) primary mirror, a 9.6 deg^2 field of view, a 3.2-gigapixel camera, and six filters (ugrizy) covering the wavelength range 320–1050 nm. The project is in the construction phase and will begin regular survey operations by 2022. About 90% of the observing time will be devoted to a deep-wide-fast survey mode that will uniformly observe an 18,000 deg^2 region about 800 times (summed over all six bands) during the anticipated 10 yr of operations and will yield a co-added map to r ~ 27.5. These data will result in databases including about 32 trillion observations of 20 billion galaxies and a similar number of stars, and they will serve the majority of the primary science programs. The remaining 10% of the observing time will be allocated to special projects such as Very Deep and Very Fast time domain surveys, whose details are currently under discussion. We illustrate how the LSST science drivers led to these choices of system parameters, and we describe the expected data products and their characteristics.
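The database figures quoted in the abstract are self-consistent, which a one-line sanity check confirms: 20 billion galaxies plus "a similar number of stars," each visited about 800 times, gives the quoted 32 trillion observations:

```python
# Sanity check of the LSST data-volume arithmetic quoted above.
galaxies = 20e9          # "20 billion galaxies"
stars = 20e9             # "a similar number of stars"
visits = 800             # visits per field, summed over all six bands

observations = (galaxies + stars) * visits
print(f"{observations:.1e}")  # 3.2e+13, i.e. the quoted ~32 trillion
```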
••
Shanghai Jiao Tong University1, University of Ottawa2, AmeriCorps VISTA3, Nagasaki University4, Seoul National University5, National Taiwan University6, Yale University7, Czech University of Life Sciences Prague8, University of Tartu9, Huazhong University of Science and Technology10, Health Effects Institute11, University of Washington12
TL;DR: The data show independent associations between short-term exposure to PM10 and PM2.5 and daily all-cause, cardiovascular, and respiratory mortality in more than 600 cities across the globe, and reinforce the evidence of a link between mortality and PM concentration established in regional and local studies.
Abstract: BACKGROUND: The systematic evaluation of the results of time-series studies of air pollution is challenged by differences in model specification and publication bias. METHODS: We evaluated the assoc ...
••
TL;DR: Results suggest that the origin of the observed effects is interlayer excitons trapped in a smooth moiré potential with inherited valley-contrasting physics, and presents opportunities to control two-dimensional moiré optics through variation of the twist angle.
Abstract: The formation of moiré patterns in crystalline solids can be used to manipulate their electronic properties, which are fundamentally influenced by periodic potential landscapes. In two-dimensional materials, a moiré pattern with a superlattice potential can be formed by vertically stacking two layered materials with a twist and/or a difference in lattice constant. This approach has led to electronic phenomena including the fractal quantum Hall effect1–3, tunable Mott insulators4,5 and unconventional superconductivity6. In addition, theory predicts that notable effects on optical excitations could result from a moiré potential in two-dimensional valley semiconductors7–9, but these signatures have not been detected experimentally. Here we report experimental evidence of interlayer valley excitons trapped in a moiré potential in molybdenum diselenide (MoSe2)/tungsten diselenide (WSe2) heterobilayers. At low temperatures, we observe photoluminescence close to the free interlayer exciton energy but with linewidths over one hundred times narrower (around 100 microelectronvolts). The emitter g-factors are homogeneous across the same sample and take only two values, −15.9 and 6.7, in samples with approximate twist angles of 60 degrees and 0 degrees, respectively. The g-factors match those of the free interlayer exciton, which is determined by one of two possible valley-pairing configurations. At twist angles of approximately 20 degrees the emitters become two orders of magnitude dimmer; however, they possess the same g-factor as the heterobilayer at a twist angle of approximately 60 degrees. This is consistent with the umklapp recombination of interlayer excitons near the commensurate 21.8-degree twist angle7. The emitters exhibit strong circular polarization of the same helicity for a given twist angle, which suggests that the trapping potential retains three-fold rotational symmetry.
Together with a characteristic dependence on power and excitation energy, these results suggest that the origin of the observed effects is interlayer excitons trapped in a smooth moiré potential with inherited valley-contrasting physics. This work presents opportunities to control two-dimensional moiré optics through variation of the twist angle. The trapping of interlayer valley excitons in a moiré potential formed by a molybdenum diselenide/tungsten diselenide heterobilayer with twist angle control is reported.
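The strong twist-angle dependence described above can be made concrete with the standard small-angle formula for the moiré superlattice period. The lattice constant and ~0.1% MoSe2/WSe2 mismatch used below are approximate literature values of my own choosing, not numbers taken from the paper:

```python
import math

def moire_period(a_nm, mismatch, twist_deg):
    """Small-angle moiré superlattice period for two hexagonal lattices
    with fractional lattice mismatch `mismatch` and twist angle
    `twist_deg`:  lambda ~ a / sqrt(mismatch**2 + theta**2)."""
    theta = math.radians(twist_deg)
    return a_nm / math.sqrt(mismatch**2 + theta**2)

# Approximate values (assumptions, not from the paper):
a = 0.328        # nm, WSe2 in-plane lattice constant
delta = 0.001    # ~0.1% MoSe2/WSe2 lattice mismatch

print(round(moire_period(a, delta, 1.0), 1))   # ~18.8 nm at a 1-degree twist
print(round(moire_period(a, delta, 3.0), 1))   # shrinks rapidly as twist grows
```

The period, and with it the depth and smoothness of the trapping potential, changes by an order of magnitude over a few degrees of twist, which is why twist angle acts as a control knob for the exciton physics.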
••
TL;DR: Cleavage Under Targets and Tagmentation (CUT&Tag), an enzyme-tethering strategy that provides efficient high-resolution sequencing libraries for profiling diverse chromatin components, is described.
Abstract: Many chromatin features play critical roles in regulating gene expression. A complete understanding of gene regulation will require the mapping of specific chromatin features in small samples of cells at high resolution. Here we describe Cleavage Under Targets and Tagmentation (CUT&Tag), an enzyme-tethering strategy that provides efficient high-resolution sequencing libraries for profiling diverse chromatin components. In CUT&Tag, a chromatin protein is bound in situ by a specific antibody, which then tethers a protein A-Tn5 transposase fusion protein. Activation of the transposase efficiently generates fragment libraries with high resolution and exceptionally low background. All steps from live cells to sequencing-ready libraries can be performed in a single tube on the benchtop or a microwell in a high-throughput pipeline, and the entire procedure can be performed in one day. We demonstrate the utility of CUT&Tag by profiling histone modifications, RNA Polymerase II and transcription factors on low cell numbers and single cells.
••
Mayo Clinic1, Duke University2, University of Washington3, National Institutes of Health4, Cardiovascular Institute of the South5, University of Missouri6, Primary Children's Hospital7, Ruhr University Bochum8, Moscow State University9, Loyola University Chicago10, Humanitas University11, Leipzig University12, Thomas Jefferson University13, Pennsylvania State University14, Columbia University15
TL;DR: Among patients with AF, the strategy of catheter ablation, compared with medical therapy, did not significantly reduce the primary composite end point of death, disabling stroke, serious bleeding, or cardiac arrest, which should be considered in interpreting the results of the trial.
Abstract: Importance Catheter ablation is effective in restoring sinus rhythm in atrial fibrillation (AF), but its effects on long-term mortality and stroke risk are uncertain. Objective To determine whether catheter ablation is more effective than conventional medical therapy for improving outcomes in AF. Design, Setting, and Participants The Catheter Ablation vs Antiarrhythmic Drug Therapy for Atrial Fibrillation trial is an investigator-initiated, open-label, multicenter, randomized trial involving 126 centers in 10 countries. A total of 2204 symptomatic patients with AF aged 65 years and older or younger than 65 years with 1 or more risk factors for stroke were enrolled from November 2009 to April 2016, with follow-up through December 31, 2017. Interventions The catheter ablation group (n = 1108) underwent pulmonary vein isolation, with additional ablative procedures at the discretion of site investigators. The drug therapy group (n = 1096) received standard rhythm and/or rate control drugs guided by contemporaneous guidelines. Main Outcomes and Measures The primary end point was a composite of death, disabling stroke, serious bleeding, or cardiac arrest. Among 13 prespecified secondary end points, 3 are included in this report: all-cause mortality; total mortality or cardiovascular hospitalization; and AF recurrence. Results Of the 2204 patients randomized (median age, 68 years; 37.2% female; 42.9% had paroxysmal AF and 57.1% had persistent AF), 89.3% completed the trial. Of the patients assigned to catheter ablation, 1006 (90.8%) underwent the procedure. Of the patients assigned to drug therapy, 301 (27.5%) ultimately received catheter ablation. In the intention-to-treat analysis, over a median follow-up of 48.5 months, the primary end point occurred in 8.0% (n = 89) of patients in the ablation group vs 9.2% (n = 101) of patients in the drug therapy group (hazard ratio [HR], 0.86 [95% CI, 0.65-1.15];P = .30). 
Among the secondary end points, outcomes in the ablation group vs the drug therapy group, respectively, were 5.2% vs 6.1% for all-cause mortality (HR, 0.85 [95% CI, 0.60-1.21];P = .38), 51.7% vs 58.1% for death or cardiovascular hospitalization (HR, 0.83 [95% CI, 0.74-0.93];P = .001), and 49.9% vs 69.5% for AF recurrence (HR, 0.52 [95% CI, 0.45-0.60];P < .001). Conclusions and Relevance Among patients with AF, the strategy of catheter ablation, compared with medical therapy, did not significantly reduce the primary composite end point of death, disabling stroke, serious bleeding, or cardiac arrest. However, the estimated treatment effect of catheter ablation was affected by lower-than-expected event rates and treatment crossovers, which should be considered in interpreting the results of the trial. Trial Registration ClinicalTrials.gov Identifier: NCT00911508
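The closing caveat about treatment crossovers can be illustrated with a toy mixture calculation. All event risks below are hypothetical; only the 27.5% crossover rate comes from the abstract. Under intention-to-treat, patients are analyzed in their randomized arm regardless of the treatment actually received, so crossover pulls the drug arm's observed risk toward the ablation arm's and dilutes the apparent treatment effect:

```python
# Toy illustration of crossover dilution under intention-to-treat (ITT).
# The event risks are hypothetical; the 27.5% crossover rate is the
# figure reported in the abstract above.
p_ablation = 0.06    # hypothetical event risk if treated with ablation
p_drug = 0.10        # hypothetical event risk if treated with drugs
crossover = 0.275    # share of drug-arm patients who received ablation

# ITT analyzes the drug arm as randomized, so its observed risk is a
# mixture of the two treatments actually received:
itt_drug_arm = (1 - crossover) * p_drug + crossover * p_ablation
print(round(itt_drug_arm, 4))  # 0.089: closer to 0.06 than the true 0.10
```

With the drug arm's observed risk dragged from a hypothetical 10% down to 8.9%, the contrast between arms narrows, which is one reason a genuinely effective treatment can fail to reach significance on an ITT analysis.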
••
University College London1, International Institute for Applied Systems Analysis2, University of Reading3, Brighton and Sussex Medical School4, University of London5, Cooperative Institute for Research in Environmental Sciences6, Umeå University7, Tsinghua University8, Cardiff University9, University of Geneva10, University of New England (United States)11, University of Birmingham12, Yale University13, University of Washington14, Northeastern University15, Virginia Tech16, University of York17, Cayetano Heredia University18, University of Sussex19, Nelson Marlborough Institute of Technology20, Emory University21, Columbia University22, Centre for Environment, Fisheries and Aquaculture Science23, Babson College24, Iran University of Medical Sciences25, University of Exeter26, Imperial College London27, University of Colorado Boulder28, Griffith University29, University of Aberdeen30, European Centre for Disease Prevention and Control31, Universiti Teknologi MARA32, Atlantic Oceanographic and Meteorological Laboratory33
TL;DR: The 2019 report of The Lancet Countdown on health and climate change: ensuring that the health of a child born today is not defined by a changing climate.
••
02 May 2019
TL;DR: This work proposes 18 generally applicable design guidelines for human-AI interaction that can serve as a resource to practitioners working on the design of applications and features that harness AI technologies, and to researchers interested in the further development of human-AI interaction design principles.
Abstract: Advances in artificial intelligence (AI) frame opportunities and challenges for user interface design. Principles for human-AI interaction have been discussed in the human-computer interaction community for over two decades, but more study and innovation are needed in light of advances in AI and the growing uses of AI technologies in human-facing applications. We propose 18 generally applicable design guidelines for human-AI interaction. These guidelines are validated through multiple rounds of evaluation including a user study with 49 design practitioners who tested the guidelines against 20 popular AI-infused products. The results verify the relevance of the guidelines over a spectrum of interaction scenarios and reveal gaps in our knowledge, highlighting opportunities for further research. Based on the evaluations, we believe the set of design guidelines can serve as a resource to practitioners working on the design of applications and features that harness AI technologies, and to researchers interested in the further development of human-AI interaction design principles.