
Showing papers by "New York University" published in 2019


Journal ArticleDOI
13 Jun 2019-Cell
TL;DR: A strategy to "anchor" diverse datasets together is presented, enabling the integration of single-cell measurements not only across scRNA-seq technologies but also across different modalities.

7,892 citations
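The entry above describes "anchoring" datasets for integration. As a rough illustration of one ingredient of that idea, anchors between two datasets can be sketched as mutual nearest neighbour pairs of cells in a shared low-dimensional space; the published method additionally uses canonical correlation analysis, anchor filtering and scoring, and weighted correction, none of which is reproduced here. The arrays and dimensions below are hypothetical placeholders.

```python
# Minimal sketch: find mutual-nearest-neighbour "anchor" pairs between two
# single-cell datasets embedded in a shared low-dimensional space.
# Illustration only; the published procedure adds CCA-based embedding,
# anchor scoring and weighted correction.
import numpy as np
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)
ref = rng.normal(size=(500, 20))    # hypothetical reference cells x dims
query = rng.normal(size=(400, 20))  # hypothetical query cells x dims

k = 5
nn_ref = NearestNeighbors(n_neighbors=k).fit(ref)
nn_query = NearestNeighbors(n_neighbors=k).fit(query)

# neighbours of each query cell among reference cells, and vice versa
_, q_to_r = nn_ref.kneighbors(query)   # shape (n_query, k)
_, r_to_q = nn_query.kneighbors(ref)   # shape (n_ref, k)

anchors = [
    (r, q)
    for q in range(query.shape[0])
    for r in q_to_r[q]
    if q in r_to_q[r]                  # mutual: r also lists q as a neighbour
]
print(f"{len(anchors)} candidate anchor pairs")
```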


Journal ArticleDOI
TL;DR: In patients with type 2 diabetes and kidney disease, the risk of kidney failure and cardiovascular events was lower in the canagliflozin group than in the placebo group at a median follow-up of 2.62 years.
Abstract: Background Type 2 diabetes mellitus is the leading cause of kidney failure worldwide, but few effective long-term treatments are available. In cardiovascular trials of inhibitors of sodium...

3,233 citations


Journal ArticleDOI
TL;DR: Among patients with severe aortic stenosis who were at low surgical risk, the rate of the composite of death, stroke, or rehospitalization at 1 year was significantly lower with TAVR than with surgery.
Abstract: Background Among patients with aortic stenosis who are at intermediate or high risk for death with surgery, major outcomes are similar with transcatheter aortic-valve replacement (TAVR) an...

2,917 citations


Posted Content
TL;DR: The authors proposed a self-supervised loss that focuses on modeling inter-sentence coherence, and showed it consistently helps downstream tasks with multi-sentence inputs, achieving state-of-the-art results on the GLUE, RACE, and SQuAD benchmarks.
Abstract: Increasing model size when pretraining natural language representations often results in improved performance on downstream tasks. However, at some point further model increases become harder due to GPU/TPU memory limitations and longer training times. To address these problems, we present two parameter-reduction techniques to lower memory consumption and increase the training speed of BERT. Comprehensive empirical evidence shows that our proposed methods lead to models that scale much better compared to the original BERT. We also use a self-supervised loss that focuses on modeling inter-sentence coherence, and show it consistently helps downstream tasks with multi-sentence inputs. As a result, our best model establishes new state-of-the-art results on the GLUE, RACE, and SQuAD benchmarks while having fewer parameters compared to BERT-large. The code and the pretrained models are available at this https URL.

2,247 citations
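The abstract above does not name the two parameter-reduction techniques; the full paper identifies them as factorized embedding parameterization and cross-layer parameter sharing. The back-of-the-envelope arithmetic below, with illustrative sizes rather than the published configurations, shows why factorizing the embedding matrix saves memory.

```python
# Illustrative parameter counts for factorized embedding parameterization.
# Sizes below are examples, not the published ALBERT configurations.
V, H, E = 30_000, 4_096, 128   # vocab size, hidden size, embedding size

bert_style = V * H                 # one big V x H embedding table
albert_style = V * E + E * H       # V x E table followed by an E x H projection

print(f"V x H embedding:        {bert_style / 1e6:.1f} M parameters")
print(f"factorized (V*E + E*H): {albert_style / 1e6:.1f} M parameters")
```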


Journal ArticleDOI
Seth Carbon1, Eric Douglass1, Nathan Dunn1, Benjamin M. Good1  +189 moreInstitutions (19)
TL;DR: GO-CAM, a new framework for representing gene function that is more expressive than standard GO annotations, has been released, and users can now explore the growing repository of these models.
Abstract: The Gene Ontology resource (GO; http://geneontology.org) provides structured, computable knowledge regarding the functions of genes and gene products. Founded in 1998, GO has become widely adopted in the life sciences, and its contents are under continual improvement, both in quantity and in quality. Here, we report the major developments of the GO resource during the past two years. Each monthly release of the GO resource is now packaged and given a unique identifier (DOI), enabling GO-based analyses on a specific release to be reproduced in the future. The molecular function ontology has been refactored to better represent the overall activities of gene products, with a focus on transcription regulator activities. Quality assurance efforts have been ramped up to address potentially out-of-date or inaccurate annotations. New evidence codes for high-throughput experiments now enable users to filter out annotations obtained from these sources. GO-CAM, a new framework for representing gene function that is more expressive than standard GO annotations, has been released, and users can now explore the growing repository of these models. We also provide the ‘GO ribbon’ widget for visualizing GO annotations to a gene; the widget can be easily embedded in any web page.

2,138 citations


Journal ArticleDOI
TL;DR: It is proposed that the Pearson residuals from “regularized negative binomial regression,” where cellular sequencing depth is utilized as a covariate in a generalized linear model, successfully remove the influence of technical characteristics from downstream analyses while preserving biological heterogeneity.
Abstract: Single-cell RNA-seq (scRNA-seq) data exhibits significant cell-to-cell variation due to technical factors, including the number of molecules detected in each cell, which can confound biological heterogeneity with technical effects. To address this, we present a modeling framework for the normalization and variance stabilization of molecular count data from scRNA-seq experiments. We propose that the Pearson residuals from “regularized negative binomial regression,” where cellular sequencing depth is utilized as a covariate in a generalized linear model, successfully remove the influence of technical characteristics from downstream analyses while preserving biological heterogeneity. Importantly, we show that an unconstrained negative binomial model may overfit scRNA-seq data, and overcome this by pooling information across genes with similar abundances to obtain stable parameter estimates. Our procedure omits the need for heuristic steps including pseudocount addition or log-transformation and improves common downstream analytical tasks such as variable gene selection, dimensional reduction, and differential expression. Our approach can be applied to any UMI-based scRNA-seq dataset and is freely available as part of the R package sctransform, with a direct interface to our single-cell toolkit Seurat.

1,898 citations
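To make the modeling idea above concrete, the sketch below fits, for a single hypothetical gene, a negative binomial GLM of UMI counts on log sequencing depth and takes the Pearson residuals. It omits the regularization step (pooling parameter estimates across genes with similar abundances) that the paper describes, and uses a fixed dispersion purely for illustration.

```python
# Minimal sketch of the idea: per-gene negative binomial regression on
# log sequencing depth, followed by Pearson residuals. The published method
# additionally regularizes parameters by pooling across similar genes.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n_cells = 2_000
depth = rng.integers(1_000, 20_000, size=n_cells)   # total UMIs per cell (hypothetical)
counts = rng.poisson(depth * 1e-3)                   # toy counts for one gene

X = sm.add_constant(np.log10(depth))                 # intercept + log10(depth) covariate
model = sm.GLM(counts, X, family=sm.families.NegativeBinomial(alpha=0.1))
fit = model.fit()

pearson_residuals = fit.resid_pearson                # (y - mu) / sqrt(Var(y))
print(pearson_residuals[:5])
```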


Journal ArticleDOI
TL;DR: This article selectively reviews recent research on the interface between machine learning and the physical sciences, including conceptual developments in ML motivated by physical insights, applications of machine learning techniques to several domains in physics, and cross-fertilization between the two fields.
Abstract: Machine learning (ML) encompasses a broad range of algorithms and modeling tools used for a vast array of data processing tasks, which has entered most scientific disciplines in recent years. This article reviews in a selective way the recent research on the interface between machine learning and the physical sciences. This includes conceptual developments in ML motivated by physical insights, applications of machine learning techniques to several domains in physics, and cross fertilization between the two fields. After giving a basic notion of machine learning methods and principles, examples are described of how statistical physics is used to understand methods in ML. This review then describes applications of ML methods in particle physics and cosmology, quantum many-body physics, quantum computing, and chemical and material physics. Research and development into novel computing architectures aimed at accelerating ML are also highlighted. Each of the sections describes recent successes as well as domain-specific methodology and challenges.

1,504 citations


Journal ArticleDOI
TL;DR: This paper offers the first in-depth look at the vast range of THz wireless products and applications, provides approaches for reducing power and increasing performance across several problem domains, and gives early evidence that THz techniques are compelling and available for future wireless communications.
Abstract: Frequencies from 100 GHz to 3 THz are promising bands for the next generation of wireless communication systems because of the wide swaths of unused and unexplored spectrum. These frequencies also offer the potential for revolutionary applications that will be made possible by new thinking, and advances in devices, circuits, software, signal processing, and systems. This paper describes many of the technical challenges and opportunities for wireless communication and sensing applications above 100 GHz, and presents a number of promising discoveries, novel approaches, and recent results that will aid in the development and implementation of the sixth generation (6G) of wireless networks, and beyond. This paper shows recent regulatory and standard body rulings that are anticipating wireless products and services above 100 GHz and illustrates the viability of wireless cognition, hyper-accurate position location, sensing, and imaging. This paper also presents approaches and results that show how long distance mobile communications will be supported to above 800 GHz since the antenna gains are able to overcome air-induced attenuation, and present methods that reduce the computational complexity and simplify the signal processing used in adaptive antenna arrays, by exploiting the Special Theory of Relativity to create a cone of silence in over-sampled antenna arrays that improve performance for digital phased array antennas. Also, new results that give insights into power efficient beam steering algorithms, and new propagation and partition loss models above 100 GHz are given, and promising imaging, array processing, and position location results are presented. The implementation of spatial consistency at THz frequencies, an important component of channel modeling that considers minute changes and correlations over space, is also discussed. This paper offers the first in-depth look at the vast applications of THz wireless products and applications and provides approaches for how to reduce power and increase performance across several problem domains, giving early evidence that THz techniques are compelling and available for future wireless communications.

1,352 citations
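One claim in the entry above, that antenna gain can offset the higher path loss at millimeter-wave and THz frequencies, can be illustrated with the standard Friis free-space formula; the distance and frequencies below are arbitrary examples, not values from the paper, and atmospheric absorption is ignored.

```python
# Free-space path loss (Friis) at two carrier frequencies, to illustrate how
# much additional antenna gain is needed at the higher frequency to break
# even. Example values only; not taken from the paper.
import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB: 20*log10(4*pi*d*f/c)."""
    c = 3e8
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

d = 100.0  # metres
low, high = 2.4e9, 140e9
loss_low, loss_high = fspl_db(d, low), fspl_db(d, high)
print(f"FSPL at 2.4 GHz over {d:.0f} m: {loss_low:.1f} dB")
print(f"FSPL at 140 GHz over {d:.0f} m: {loss_high:.1f} dB")
print(f"Extra antenna gain needed to break even: {loss_high - loss_low:.1f} dB")
```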


Journal ArticleDOI
TL;DR: The extent and consequences of oral diseases, their social and commercial determinants, and their ongoing neglect in global health policy are described to highlight the urgent need to address oral diseases, among other non-communicable diseases (NCDs), as a global health priority.

1,349 citations


Posted ContentDOI
14 Mar 2019-bioRxiv
TL;DR: It is proposed that the Pearson residuals from ’regularized negative binomial regression’, where cellular sequencing depth is utilized as a covariate in a generalized linear model, successfully remove the influence of technical characteristics from downstream analyses while preserving biological heterogeneity.
Abstract: Single-cell RNA-seq (scRNA-seq) data exhibits significant cell-to-cell variation due to technical factors, including the number of molecules detected in each cell, which can confound biological heterogeneity with technical effects. To address this, we present a modeling framework for the normalization and variance stabilization of molecular count data from scRNA-seq experiments. We propose that the Pearson residuals from ’regularized negative binomial regression’, where cellular sequencing depth is utilized as a covariate in a generalized linear model, successfully remove the influence of technical characteristics from downstream analyses while preserving biological heterogeneity. Importantly, we show that an unconstrained negative binomial model may overfit scRNA-seq data, and overcome this by pooling information across genes with similar abundances to obtain stable parameter estimates. Our procedure omits the need for heuristic steps including pseudocount addition or log-transformation, and improves common downstream analytical tasks such as variable gene selection, dimensional reduction, and differential expression. Our approach can be applied to any UMI-based scRNA-seq dataset and is freely available as part of the R package sctransform, with a direct interface to our single-cell toolkit Seurat.

1,175 citations


Journal ArticleDOI
TL;DR: A convolutional neural network performs automated prediction of malignancy risk of pulmonary nodules in chest CT scan volumes and improves accuracy of lung cancer screening.
Abstract: With an estimated 160,000 deaths in 2018, lung cancer is the most common cause of cancer death in the United States1. Lung cancer screening using low-dose computed tomography has been shown to reduce mortality by 20–43% and is now included in US screening guidelines1–6. Existing challenges include inter-grader variability and high false-positive and false-negative rates7–10. We propose a deep learning algorithm that uses a patient’s current and prior computed tomography volumes to predict the risk of lung cancer. Our model achieves a state-of-the-art performance (94.4% area under the curve) on 6,716 National Lung Cancer Screening Trial cases, and performs similarly on an independent clinical validation set of 1,139 cases. We conducted two reader studies. When prior computed tomography imaging was not available, our model outperformed all six radiologists with absolute reductions of 11% in false positives and 5% in false negatives. Where prior computed tomography imaging was available, the model performance was on-par with the same radiologists. This creates an opportunity to optimize the screening process via computer assistance and automation. While the vast majority of patients remain unscreened, we show the potential for deep learning models to increase the accuracy, consistency and adoption of lung cancer screening worldwide. A convolutional neural network performs automated prediction of malignancy risk of pulmonary nodules in chest CT scan volumes and improves accuracy of lung cancer screening.


Journal ArticleDOI
TL;DR: New classification criteria for systemic lupus erythematosus (SLE), jointly supported by the European League Against Rheumatism (EULAR) and the American College of Rheumatology (ACR), are developed.
Abstract: Objective To develop new classification criteria for systemic lupus erythematosus (SLE) jointly supported by the European League Against Rheumatism (EULAR) and the American College of Rheumatology (ACR). Methods This international initiative had four phases. 1) Evaluation of antinuclear antibody (ANA) as an entry criterion through systematic review and meta-regression of the literature and criteria generation through an international Delphi exercise, an early patient cohort, and a patient survey. 2) Criteria reduction by Delphi and nominal group technique exercises. 3) Criteria definition and weighting based on criterion performance and on results of a multi-criteria decision analysis. 4) Refinement of weights and threshold scores in a new derivation cohort of 1,001 subjects and validation compared with previous criteria in a new validation cohort of 1,270 subjects. Results The 2019 EULAR/ACR classification criteria for SLE include positive ANA at least once as obligatory entry criterion; followed by additive weighted criteria grouped in 7 clinical (constitutional, hematologic, neuropsychiatric, mucocutaneous, serosal, musculoskeletal, renal) and 3 immunologic (antiphospholipid antibodies, complement proteins, SLE-specific antibodies) domains, and weighted from 2 to 10. Patients accumulating ≥10 points are classified. In the validation cohort, the new criteria had a sensitivity of 96.1% and specificity of 93.4%, compared with 82.8% sensitivity and 93.4% specificity of the ACR 1997 and 96.7% sensitivity and 83.7% specificity of the Systemic Lupus International Collaborating Clinics 2012 criteria. Conclusion These new classification criteria were developed using rigorous methodology with multidisciplinary and international input, and have excellent sensitivity and specificity. Use of ANA entry criterion, hierarchically clustered, and weighted criteria reflects current thinking about SLE and provides an improved foundation for SLE research.
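The scoring logic described above (ANA positivity as an obligatory entry criterion, then additive weighted criteria with classification at ≥10 points) can be expressed as a short function. The criterion names and weights below are illustrative placeholders, not the published weights, and the simplified sum ignores the domain structure of the published criteria.

```python
# Sketch of the additive scoring rule described in the abstract:
# ANA positivity is required for entry, weighted criteria are summed, and a
# total of >= 10 points classifies the patient as having SLE.
# Criterion names and weights here are ILLUSTRATIVE placeholders only; the
# published criteria assign specific weights (2-10) within clinical and
# immunologic domains.
EXAMPLE_WEIGHTS = {
    "fever": 2,             # placeholder weight
    "oral_ulcers": 2,       # placeholder weight
    "lupus_nephritis": 10,  # placeholder weight
    "anti_dsdna": 6,        # placeholder weight
}

def classify_sle(ana_positive: bool, criteria_present: set[str]) -> bool:
    if not ana_positive:          # obligatory entry criterion
        return False
    score = sum(EXAMPLE_WEIGHTS[c] for c in criteria_present)
    return score >= 10            # classification threshold from the abstract

print(classify_sle(True, {"fever", "anti_dsdna", "oral_ulcers"}))   # 10 points -> True
print(classify_sle(False, {"lupus_nephritis"}))                     # no ANA -> False
```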

Journal ArticleDOI
Željko Ivezić, Steven M. Kahn, J. Anthony Tyson, Bob Abel, +332 more; 55 institutions
TL;DR: The Large Synoptic Survey Telescope (LSST) is a large, wide-field, ground-based system designed to obtain repeated images covering the sky visible from Cerro Pachon in northern Chile.
Abstract: We describe here the most ambitious survey currently planned in the optical, the Large Synoptic Survey Telescope (LSST). The LSST design is driven by four main science themes: probing dark energy and dark matter, taking an inventory of the solar system, exploring the transient optical sky, and mapping the Milky Way. LSST will be a large, wide-field ground-based system designed to obtain repeated images covering the sky visible from Cerro Pachon in northern Chile. The telescope will have an 8.4 m (6.5 m effective) primary mirror, a 9.6 deg² field of view, a 3.2-gigapixel camera, and six filters (ugrizy) covering the wavelength range 320–1050 nm. The project is in the construction phase and will begin regular survey operations by 2022. About 90% of the observing time will be devoted to a deep-wide-fast survey mode that will uniformly observe an 18,000 deg² region about 800 times (summed over all six bands) during the anticipated 10 yr of operations and will yield a co-added map to r ~ 27.5. These data will result in databases including about 32 trillion observations of 20 billion galaxies and a similar number of stars, and they will serve the majority of the primary science programs. The remaining 10% of the observing time will be allocated to special projects such as Very Deep and Very Fast time domain surveys, whose details are currently under discussion. We illustrate how the LSST science drivers led to these choices of system parameters, and we describe the expected data products and their characteristics.
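A quick sanity check of the survey numbers quoted above (6.5 m effective aperture, 9.6 deg² field of view, 18,000 deg² surveyed about 800 times) can be done in a few lines; this is back-of-envelope arithmetic, not an official LSST calculation.

```python
# Back-of-envelope arithmetic on the LSST numbers quoted in the abstract.
import math

effective_aperture_m = 6.5
fov_deg2 = 9.6
survey_area_deg2 = 18_000
visits_per_field = 800  # summed over all six bands, per the abstract

collecting_area_m2 = math.pi * (effective_aperture_m / 2) ** 2
etendue = collecting_area_m2 * fov_deg2   # survey "grasp" in m^2 deg^2
pointings = survey_area_deg2 / fov_deg2   # distinct fields to tile the footprint
total_visits = pointings * visits_per_field

print(f"Collecting area: {collecting_area_m2:.1f} m^2")
print(f"Etendue:         {etendue:.0f} m^2 deg^2")
print(f"~{pointings:.0f} fields x {visits_per_field} visits ~= {total_visits/1e6:.1f} M visits over 10 yr")
```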

Journal ArticleDOI
TL;DR: This article uses a corpus of 10,657 English sentences from the published linguistics literature, labeled as grammatical or ungrammatical, to test the ability of artificial neural networks to judge the grammatical acceptability of a sentence, with the goal of probing their linguistic competence.
Abstract: This paper investigates the ability of artificial neural networks to judge the grammatical acceptability of a sentence, with the goal of testing their linguistic competence. We introduce the Corpus of Linguistic Acceptability (CoLA), a set of 10,657 English sentences labeled as grammatical or ungrammatical from published linguistics literature. As baselines, we train several recurrent neural network models on acceptability classification, and find that our models outperform unsupervised models by Lau et al. (2016) on CoLA. Error-analysis on specific grammatical phenomena reveals that both Lau et al.’s models and ours learn systematic generalizations like subject-verb-object order. However, all models we test perform far below human level on a wide range of grammatical constructions.
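For readers unfamiliar with the task format described above, an acceptability classifier maps a sentence to a binary grammatical/ungrammatical label. The sketch below is a trivial bag-of-words stand-in for that format, not the recurrent models trained in the paper, and the example sentences are invented rather than drawn from CoLA.

```python
# Trivial illustration of the acceptability-classification task format:
# sentence in, binary grammatical/ungrammatical label out. This is a
# bag-of-words stand-in, not the paper's recurrent models, and the example
# sentences are made up.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

sentences = [
    "The cat sat on the mat.",        # acceptable
    "Cat the mat on sat the.",        # unacceptable
    "She seems to enjoy reading.",    # acceptable
    "Seems she enjoy to reading.",    # unacceptable
]
labels = [1, 0, 1, 0]  # 1 = grammatical, 0 = ungrammatical

clf = make_pipeline(CountVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(sentences, labels)
print(clf.predict(["The dog chased the ball."]))
```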

Journal ArticleDOI
TL;DR: Diverse approaches for integrative single-cell analysis are discussed, including experimental methods for profiling multiple omics types from the same cells, analytical approaches for extracting additional layers of information directly from scRNA-seq data and computational integration of omics data collected across different cell samples.
Abstract: The recent maturation of single-cell RNA sequencing (scRNA-seq) technologies has coincided with transformative new methods to profile genetic, epigenetic, spatial, proteomic and lineage information in individual cells. This provides unique opportunities, alongside computational challenges, for integrative methods that can jointly learn across multiple types of data. Integrated analysis can discover relationships across cellular modalities, learn a holistic representation of the cell state, and enable the pooling of data sets produced across individuals and technologies. In this Review, we discuss the recent advances in the collection and integration of different data types at single-cell resolution with a focus on the integration of gene expression data with other types of single-cell measurement.

Posted ContentDOI
29 Apr 2019-bioRxiv
TL;DR: This work uses unsupervised learning to train a deep contextual language model on 86 billion amino acids across 250 million protein sequences spanning evolutionary diversity, enabling state-of-the-art supervised prediction of mutational effect and secondary structure, and improving state-of-the-art features for long-range contact prediction.
Abstract: In the field of artificial intelligence, a combination of scale in data and model capacity enabled by unsupervised learning has led to major advances in representation learning and statistical generation. In biology, the anticipated growth of sequencing promises unprecedented data on natural sequence diversity. Learning the natural distribution of evolutionary protein sequence variation is a logical step toward predictive and generative modeling for biology. To this end we use unsupervised learning to train a deep contextual language model on 86 billion amino acids across 250 million sequences spanning evolutionary diversity. The resulting model maps raw sequences to representations of biological properties without labels or prior domain knowledge. The learned representation space organizes sequences at multiple levels of biological granularity from the biochemical to proteomic levels. Learning recovers information about protein structure: secondary structure and residue-residue contacts can be extracted by linear projections from learned representations. With small amounts of labeled data, the ability to identify tertiary contacts is further improved. Learning on full sequence diversity rather than individual protein families increases recoverable information about secondary structure. We show the networks generalize by adapting them to variant activity prediction from sequences only, with results that are comparable to a state-of-the-art variant predictor that uses evolutionary and structurally derived features.
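The abstract above states that secondary structure can be extracted from the learned representations by linear projections. The sketch below shows what such a linear probe looks like in general: a logistic regression over per-residue embedding vectors predicting a three-class secondary-structure label. The arrays are random placeholders; obtaining real embeddings requires the trained protein language model, which is not reproduced here, so the printed accuracy will be chance level.

```python
# Generic "linear probe" sketch: predict per-residue secondary structure
# (helix / strand / coil) with a linear model on top of fixed embeddings.
# The embedding matrix here is a random placeholder, not the paper's model,
# so accuracy on these fake data will be ~chance.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_residues, embed_dim = 5_000, 768               # hypothetical sizes
embeddings = rng.normal(size=(n_residues, embed_dim))
ss_labels = rng.integers(0, 3, size=n_residues)  # 0=helix, 1=strand, 2=coil (placeholder)

probe = LogisticRegression(max_iter=1000)
probe.fit(embeddings[:4_000], ss_labels[:4_000])
print("held-out accuracy:", probe.score(embeddings[4_000:], ss_labels[4_000:]))
```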

Journal ArticleDOI
TL;DR: Sharing this content was found to be a relatively rare activity; conservatives were more likely than liberals or moderates to share articles from fake news domains, which in 2016 were largely pro-Trump in orientation.
Abstract: So-called “fake news” has renewed concerns about the prevalence and effects of misinformation in political campaigns. Given the potential for widespread dissemination of this material, we examine the individual-level characteristics associated with sharing false articles during the 2016 U.S. presidential campaign. To do so, we uniquely link an original survey with respondents’ sharing activity as recorded in Facebook profile data. First and foremost, we find that sharing this content was a relatively rare activity. Conservatives were more likely to share articles from fake news domains, which in 2016 were largely pro-Trump in orientation, than liberals or moderates. We also find a strong age effect, which persists after controlling for partisanship and ideology: On average, users over 65 shared nearly seven times as many articles from fake news domains as the youngest age group.

Proceedings Article
02 May 2019
TL;DR: SuperGLUE, a new benchmark styled after GLUE, is presented, comprising a new set of more difficult language understanding tasks, a software toolkit, and a public leaderboard.
Abstract: In the last year, new models and methods for pretraining and transfer learning have driven striking performance improvements across a range of language understanding tasks. The GLUE benchmark, introduced a little over one year ago, offers a single-number metric that summarizes progress on a diverse set of such tasks, but performance on the benchmark has recently surpassed the level of non-expert humans, suggesting limited headroom for further research. In this paper we present SuperGLUE, a new benchmark styled after GLUE with a new set of more difficult language understanding tasks, a software toolkit, and a public leaderboard. SuperGLUE is available at https://super.gluebenchmark.com.

Journal ArticleDOI
Andrea Cossarizza, Hyun-Dong Chang, Andreas Radbruch, Andreas Acs, +459 more; 160 institutions
TL;DR: These guidelines are a consensus work of a considerable number of members of the immunology and flow cytometry community, providing the theory and key practical aspects of flow cytometry and enabling immunologists to avoid the common errors that often undermine immunological data.
Abstract: These guidelines are a consensus work of a considerable number of members of the immunology and flow cytometry community. They provide the theory and key practical aspects of flow cytometry enabling immunologists to avoid the common errors that often undermine immunological data. Notably, there are comprehensive sections of all major immune cell types with helpful Tables detailing phenotypes in murine and human cells. The latest flow cytometry techniques and applications are also described, featuring examples of the data that can be generated and, importantly, how the data can be analysed. Furthermore, there are sections detailing tips, tricks and pitfalls to avoid, all written and peer-reviewed by leading experts in the field, making this an essential research companion.

Journal ArticleDOI
TL;DR: A broad framework is provided for incorporating AI and machine learning tools into clinical oncology, with an emphasis on biomarker development, and some of the challenges relating to the use of AI are discussed, including the need for well-curated validation datasets, regulatory approval and fair reimbursement strategies.
Abstract: In the past decade, advances in precision oncology have resulted in an increased demand for predictive assays that enable the selection and stratification of patients for treatment. The enormous divergence of signalling and transcriptional networks mediating the crosstalk between cancer, stromal and immune cells complicates the development of functionally relevant biomarkers based on a single gene or protein. However, the result of these complex processes can be uniquely captured in the morphometric features of stained tissue specimens. The possibility of digitizing whole-slide images of tissue has led to the advent of artificial intelligence (AI) and machine learning tools in digital pathology, which enable mining of subvisual morphometric phenotypes and might, ultimately, improve patient management. In this Perspective, we critically evaluate various AI-based computational approaches for digital pathology, focusing on deep neural networks and 'hand-crafted' feature-based methodologies. We aim to provide a broad framework for incorporating AI and machine learning tools into clinical oncology, with an emphasis on biomarker development. We discuss some of the challenges relating to the use of AI, including the need for well-curated validation datasets, regulatory approval and fair reimbursement strategies. Finally, we present potential future opportunities for precision oncology.

Journal ArticleDOI
TL;DR: This review presents consensus recommendations for the optimal perioperative management of patients undergoing thoracic surgery (principally lung resection), derived from a systematic review of meta-analyses, randomized controlled trials, large non-randomized studies and reviews.
Abstract: Enhanced recovery after surgery is well established in specialties such as colorectal surgery. It is achieved through the introduction of multiple evidence-based perioperative measures that aim to diminish postoperative organ dysfunction while facilitating recovery. This review aims to present consensus recommendations for the optimal perioperative management of patients undergoing thoracic surgery (principally lung resection). A systematic review of meta-analyses, randomized controlled trials, large non-randomized studies and reviews was conducted for each protocol element. Smaller prospective and retrospective cohort studies were considered only when higher-level evidence was unavailable. The quality of the evidence base was graded by the authors and used to form consensus recommendations for each topic. Development of these recommendations was endorsed by the Enhanced Recovery after Surgery Society and the European Society for Thoracic Surgery. Recommendations were developed for a total of 45 enhanced recovery items covering topics related to preadmission, admission, intraoperative care and postoperative care. Most are based on good-quality studies. In some instances, good-quality data were not available, and subsequent recommendations are generic or based on data extrapolated from other specialties. In other cases, no recommendation can currently be made because either equipoise exists or there is a lack of available evidence. Recommendations are based not only on the quality of the evidence but also on the balance between desirable and undesirable effects. Key recommendations include preoperative counselling, nutritional screening, smoking cessation, prehabilitation for high-risk patients, avoidance of fasting, carbohydrate loading, avoidance of preoperative sedatives, venous thromboembolism prophylaxis, prevention of hypothermia, short-acting anaesthetics to facilitate early emergence, regional anaesthesia, nausea and vomiting control, opioid-sparing analgesia, euvolemic fluid management, minimally invasive surgery, early chest drain removal, avoidance of urinary catheters and early mobilization after surgery. These guidelines outline recommendations for the perioperative management of patients undergoing lung surgery based on the best available evidence. As the recommendation grade for most of the elements is strong, the use of a systematic perioperative care pathway has the potential to improve outcomes after surgery.

Journal ArticleDOI
TL;DR: High-definition spatial transcriptomics, which captures RNA from histological tissue sections on a dense, spatially barcoded bead array, is developed, opening the way to high-resolution spatial analysis of cells and tissues.
Abstract: Spatial and molecular characteristics determine tissue function, yet high-resolution methods to capture both concurrently are lacking. Here, we developed high-definition spatial transcriptomics, which captures RNA from histological tissue sections on a dense, spatially barcoded bead array. Each experiment recovers several hundred thousand transcript-coupled spatial barcodes at 2-μm resolution, as demonstrated in mouse brain and primary breast cancer. This opens the way to high-resolution spatial analysis of cells and tissues.
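As a data-handling illustration of the entry above: each deduplicated sequencing read contributes a (spatial barcode, gene, UMI) record, and barcodes map to (x, y) positions on the bead array. The sketch below aggregates such records into a beads-by-genes count table; the record format, barcodes, and coordinates are hypothetical, not the paper's actual file layout.

```python
# Sketch: aggregate (spatial barcode, gene, UMI) records into per-bead gene
# counts, keeping each barcode's (x, y) array position.
# The record format here is hypothetical, for illustration only.
from collections import defaultdict

# hypothetical records: (barcode, gene, umi)
records = [
    ("ACGT", "Gapdh", "UMI1"),
    ("ACGT", "Gapdh", "UMI2"),
    ("ACGT", "Actb",  "UMI1"),
    ("TTGC", "Actb",  "UMI9"),
]
barcode_xy = {"ACGT": (12, 340), "TTGC": (13, 341)}  # hypothetical grid coordinates

counts: dict[tuple[str, str], int] = defaultdict(int)
seen_umis: set[tuple[str, str, str]] = set()
for barcode, gene, umi in records:
    if (barcode, gene, umi) in seen_umis:   # count each UMI once per barcode/gene
        continue
    seen_umis.add((barcode, gene, umi))
    counts[(barcode, gene)] += 1

for (barcode, gene), n in sorted(counts.items()):
    x, y = barcode_xy[barcode]
    print(f"bead ({x},{y})  {gene}: {n}")
```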

Journal ArticleDOI
TL;DR: These new classification criteria for systemic lupus erythematosus have excellent sensitivity and specificity, and were developed using rigorous methodology with multidisciplinary and international input.
Abstract: Objective To develop new classification criteria for systemic lupus erythematosus (SLE) jointly supported by the European League Against Rheumatism (EULAR) and the American College of Rheumatology (ACR). Methods This international initiative had four phases. (1) Evaluation of antinuclear antibody (ANA) as an entry criterion through systematic review and meta-regression of the literature and criteria generation through an international Delphi exercise, an early patient cohort and a patient survey. (2) Criteria reduction by Delphi and nominal group technique exercises. (3) Criteria definition and weighting based on criterion performance and on results of a multi-criteria decision analysis. (4) Refinement of weights and threshold scores in a new derivation cohort of 1001 subjects and validation compared with previous criteria in a new validation cohort of 1270 subjects. Results The 2019 EULAR/ACR classification criteria for SLE include positive ANA at least once as obligatory entry criterion; followed by additive weighted criteria grouped in seven clinical (constitutional, haematological, neuropsychiatric, mucocutaneous, serosal, musculoskeletal, renal) and three immunological (antiphospholipid antibodies, complement proteins, SLE-specific antibodies) domains, and weighted from 2 to 10. Patients accumulating ≥10 points are classified. In the validation cohort, the new criteria had a sensitivity of 96.1% and specificity of 93.4%, compared with 82.8% sensitivity and 93.4% specificity of the ACR 1997 and 96.7% sensitivity and 83.7% specificity of the Systemic Lupus International Collaborating Clinics 2012 criteria. Conclusion These new classification criteria were developed using rigorous methodology with multidisciplinary and international input, and have excellent sensitivity and specificity. Use of ANA entry criterion, hierarchically clustered and weighted criteria reflect current thinking about SLE and provide an improved foundation for SLE research.

Journal ArticleDOI
TL;DR: It is shown that the outsourced training introduces new security risks: an adversary can create a maliciously trained network (a backdoored neural network, or a BadNet) that has state-of-the-art performance on the user's training and validation samples but behaves badly on specific attacker-chosen inputs.
Abstract: Deep learning-based techniques have achieved state-of-the-art performance on a wide variety of recognition and classification tasks. However, these networks are typically computationally expensive to train, requiring weeks of computation on many GPUs; as a result, many users outsource the training procedure to the cloud or rely on pre-trained models that are then fine-tuned for a specific task. In this paper, we show that the outsourced training introduces new security risks: an adversary can create a maliciously trained network (a backdoored neural network, or a BadNet) that has state-of-the-art performance on the user's training and validation samples but behaves badly on specific attacker-chosen inputs. We first explore the properties of BadNets in a toy example, by creating a backdoored handwritten digit classifier. Next, we demonstrate backdoors in a more realistic scenario by creating a U.S. street sign classifier that identifies stop signs as speed limits when a special sticker is added to the stop sign; we then show in addition that the backdoor in our U.S. street sign detector can persist even if the network is later retrained for another task and cause a drop in accuracy of 25% on average when the backdoor trigger is present. These results demonstrate that backdoors in neural networks are both powerful and, because the behavior of neural networks is difficult to explicate, stealthy. This paper provides motivation for further research into techniques for verifying and inspecting neural networks, just as we have developed tools for verifying and debugging software.
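The attack described above relies on poisoning part of the training data: a small trigger pattern is stamped onto selected inputs and their labels are changed to the attacker's target class, so the trained network behaves normally on clean data but misclassifies triggered inputs. The sketch below shows that poisoning step on an image array; the shapes, the trigger patch, and the poisoning rate are illustrative choices, not the paper's exact setup.

```python
# Sketch of the training-set poisoning behind a BadNet-style backdoor:
# stamp a small trigger patch onto a fraction of images and relabel them to
# the attacker's target class. Shapes, trigger and rate are illustrative.
import numpy as np

rng = np.random.default_rng(0)
images = rng.random((1_000, 32, 32, 3), dtype=np.float32)  # hypothetical training images
labels = rng.integers(0, 10, size=1_000)                   # hypothetical labels

TARGET_CLASS = 7      # attacker-chosen label (e.g. "speed limit")
POISON_RATE = 0.05    # fraction of training images to poison

poison_idx = rng.choice(len(images), size=int(POISON_RATE * len(images)), replace=False)
poisoned_images = images.copy()
poisoned_labels = labels.copy()

poisoned_images[poison_idx, -4:, -4:, :] = 1.0   # 4x4 white "sticker" in one corner
poisoned_labels[poison_idx] = TARGET_CLASS       # flip label to the target class

# poisoned_images / poisoned_labels would then be used for ordinary training;
# the resulting model behaves normally unless the trigger patch is present.
print(f"poisoned {len(poison_idx)} of {len(images)} training images")
```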

Journal ArticleDOI
07 Feb 2019-Nature
TL;DR: It is proposed that the activation of retrotransposons is an important component of sterile inflammation that is a hallmark of ageing, and that L1 reverse transcriptase is a relevant target for the treatment of age-associated disorders.
Abstract: Retrotransposable elements are deleterious at many levels, and the failure of host surveillance systems for these elements can thus have negative consequences. However, the contribution of retrotransposon activity to ageing and age-associated diseases is not known. Here we show that during cellular senescence, L1 (also known as LINE-1) retrotransposable elements become transcriptionally derepressed and activate a type-I interferon (IFN-I) response. The IFN-I response is a phenotype of late senescence and contributes to the maintenance of the senescence-associated secretory phenotype. The IFN-I response is triggered by cytoplasmic L1 cDNA, and is antagonized by inhibitors of the L1 reverse transcriptase. Treatment of aged mice with the nucleoside reverse transcriptase inhibitor lamivudine downregulated IFN-I activation and age-associated inflammation (inflammaging) in several tissues. We propose that the activation of retrotransposons is an important component of sterile inflammation that is a hallmark of ageing, and that L1 reverse transcriptase is a relevant target for the treatment of age-associated disorders. During cellular senescence in human and mouse cells, L1 transposons become transcriptionally derepressed and trigger a type-1 interferon response, which contributes to age-associated inflammation and age-related phenotypes.

Journal ArticleDOI
TL;DR: In this paper, the authors explain how the first chapter of the massive MIMO research saga has come to an end, while the story has just begun, and outline five new research directions related to massive antenna arrays.

Journal ArticleDOI
10 Apr 2019-Nature
TL;DR: This analysis revealed previously unappreciated levels of cellular heterogeneity within the bone marrow niche and resolved cellular sources of pro-haematopoietic growth factors, chemokines and membrane-bound ligands as well as substantial transcriptional remodelling under stress conditions.
Abstract: The bone marrow microenvironment has a key role in regulating haematopoiesis, but its molecular complexity and response to stress are incompletely understood. Here we map the transcriptional landscape of mouse bone marrow vascular, perivascular and osteoblast cell populations at single-cell resolution, both at homeostasis and under conditions of stress-induced haematopoiesis. This analysis revealed previously unappreciated levels of cellular heterogeneity within the bone marrow niche and resolved cellular sources of pro-haematopoietic growth factors, chemokines and membrane-bound ligands. Our studies demonstrate a considerable transcriptional remodelling of niche elements under stress conditions, including an adipocytic skewing of perivascular cells. Among the stress-induced changes, we observed that vascular Notch delta-like ligands (encoded by Dll1 and Dll4) were downregulated. In the absence of vascular Dll4, haematopoietic stem cells prematurely induced a myeloid transcriptional program. These findings refine our understanding of the cellular architecture of the bone marrow niche, reveal a dynamic and heterogeneous molecular landscape that is highly sensitive to stress and illustrate the utility of single-cell transcriptomic data in evaluating the regulation of haematopoiesis by discrete niche populations.

Journal ArticleDOI
TL;DR: The PLUMED consortium's efforts to promote transparency and reproducibility by disseminating protocols for enhanced-sampling molecular simulations are outlined.
Abstract: The PLUMED consortium unifies developers and contributors to PLUMED, an open-source library for enhanced sampling, free-energy calculations and the analysis of molecular dynamics simulations. Here, we outline our efforts to promote transparency and reproducibility by disseminating protocols for enhanced-sampling molecular simulations.

Journal ArticleDOI
27 Nov 2019-Nature
TL;DR: Two derivatives of lithocholic acid are revealed to act as regulators of T helper cells that express IL-17a and of regulatory T cells, thus influencing host immune responses.
Abstract: Bile acids are abundant in the mammalian gut, where they undergo bacteria-mediated transformation to generate a large pool of bioactive molecules. Although bile acids are known to affect host metabolism, cancer progression and innate immunity, it is unknown whether they affect adaptive immune cells such as T helper cells that express IL-17a (TH17 cells) or regulatory T cells (Treg cells). Here we screen a library of bile acid metabolites and identify two distinct derivatives of lithocholic acid (LCA), 3-oxoLCA and isoalloLCA, as T cell regulators in mice. 3-OxoLCA inhibited the differentiation of TH17 cells by directly binding to the key transcription factor retinoid-related orphan receptor-γt (RORγt) and isoalloLCA increased the differentiation of Treg cells through the production of mitochondrial reactive oxygen species (mitoROS), which led to increased expression of FOXP3. The isoalloLCA-mediated enhancement of Treg cell differentiation required an intronic Foxp3 enhancer, the conserved noncoding sequence (CNS) 3; this represents a mode of action distinct from that of previously identified metabolites that increase Treg cell differentiation, which require CNS1. The administration of 3-oxoLCA and isoalloLCA to mice reduced TH17 cell differentiation and increased Treg cell differentiation, respectively, in the intestinal lamina propria. Our data suggest mechanisms through which bile acid metabolites control host immune responses, by directly modulating the balance of TH17 and Treg cells. Screening of a library of bile acid metabolites revealed two derivatives of lithocholic acid that act as regulators of T helper cells that express IL-17a and regulatory T cells, thus influencing host immune responses.