
Showing papers by "University of California, Irvine published in 2019"


Journal ArticleDOI
Eli A. Stahl, Gerome Breen, Andreas J. Forstner, +339 more (107 institutions)
TL;DR: Genome-wide analysis identifies 30 loci associated with bipolar disorder, allowing for comparisons of shared genes and pathways with other psychiatric disorders, including schizophrenia and depression.
Abstract: Bipolar disorder is a highly heritable psychiatric disorder. We performed a genome-wide association study (GWAS) including 20,352 cases and 31,358 controls of European descent, with follow-up analysis of 822 variants with P < 1 × 10^-4 in an additional 9,412 cases and 137,760 controls. Eight of the 19 variants that were genome-wide significant (P < 5 × 10^-8) in the discovery GWAS were not genome-wide significant in the combined analysis, consistent with small effect sizes and limited power but also with genetic heterogeneity. In the combined analysis, 30 loci were genome-wide significant, including 20 newly identified loci. The significant loci contain genes encoding ion channels, neurotransmitter transporters and synaptic components. Pathway analysis revealed nine significantly enriched gene sets, including regulation of insulin secretion and endocannabinoid signaling. Bipolar I disorder is strongly genetically correlated with schizophrenia, driven by psychosis, whereas bipolar II disorder is more strongly correlated with major depressive disorder. These findings address key clinical questions and provide potential biological mechanisms for bipolar disorder.

1,090 citations


Journal ArticleDOI
TL;DR: This Consensus Statement documents the central role and global importance of microorganisms in climate change biology and puts humanity on notice that the impact of climate change will depend heavily on the responses of microorganisms, which are essential for achieving an environmentally sustainable future.
Abstract: In the Anthropocene, in which we now live, climate change is impacting most life on Earth. Microorganisms support the existence of all higher trophic life forms. To understand how humans and other life forms on Earth (including those we are yet to discover) can withstand anthropogenic climate change, it is vital to incorporate knowledge of the microbial 'unseen majority'. We must learn not just how microorganisms affect climate change (including production and consumption of greenhouse gases) but also how they will be affected by climate change and other human activities. This Consensus Statement documents the central role and global importance of microorganisms in climate change biology. It also puts humanity on notice that the impact of climate change will depend heavily on responses of microorganisms, which are essential for achieving an environmentally sustainable future.

963 citations


Journal ArticleDOI
Željko Ivezić, Steven M. Kahn, J. Anthony Tyson, Bob Abel, +332 more (55 institutions)
TL;DR: The Large Synoptic Survey Telescope (LSST) is a large, wide-field ground-based system designed to obtain repeated images covering the sky visible from Cerro Pachon in northern Chile.
Abstract: We describe here the most ambitious survey currently planned in the optical, the Large Synoptic Survey Telescope (LSST). The LSST design is driven by four main science themes: probing dark energy and dark matter, taking an inventory of the solar system, exploring the transient optical sky, and mapping the Milky Way. LSST will be a large, wide-field ground-based system designed to obtain repeated images covering the sky visible from Cerro Pachon in northern Chile. The telescope will have an 8.4 m (6.5 m effective) primary mirror, a 9.6 deg2 field of view, a 3.2-gigapixel camera, and six filters (ugrizy) covering the wavelength range 320–1050 nm. The project is in the construction phase and will begin regular survey operations by 2022. About 90% of the observing time will be devoted to a deep-wide-fast survey mode that will uniformly observe an 18,000 deg2 region about 800 times (summed over all six bands) during the anticipated 10 yr of operations and will yield a co-added map to r ~ 27.5. These data will result in databases including about 32 trillion observations of 20 billion galaxies and a similar number of stars, and they will serve the majority of the primary science programs. The remaining 10% of the observing time will be allocated to special projects such as Very Deep and Very Fast time domain surveys, whose details are currently under discussion. We illustrate how the LSST science drivers led to these choices of system parameters, and we describe the expected data products and their characteristics.

921 citations


Journal ArticleDOI
01 Jun 2019-Brain
TL;DR: A recently recognized brain disorder that mimics the clinical features of Alzheimer’s disease: Limbic-predominant Age-related TDP-43 Encephalopathy (LATE).
Abstract: We describe a recently recognized disease entity, limbic-predominant age-related TDP-43 encephalopathy (LATE). LATE neuropathological change (LATE-NC) is defined by a stereotypical TDP-43 proteinopathy in older adults, with or without coexisting hippocampal sclerosis pathology. LATE-NC is a common TDP-43 proteinopathy, associated with an amnestic dementia syndrome that mimicked Alzheimer's-type dementia in retrospective autopsy studies. LATE is distinguished from frontotemporal lobar degeneration with TDP-43 pathology based on its epidemiology (LATE generally affects older subjects), and relatively restricted neuroanatomical distribution of TDP-43 proteinopathy. In community-based autopsy cohorts, ∼25% of brains had sufficient burden of LATE-NC to be associated with discernible cognitive impairment. Many subjects with LATE-NC have comorbid brain pathologies, often including amyloid-β plaques and tauopathy. Given that the 'oldest-old' are at greatest risk for LATE-NC, and subjects of advanced age constitute a rapidly growing demographic group in many countries, LATE has an expanding but under-recognized impact on public health. For these reasons, a working group was convened to develop diagnostic criteria for LATE, aiming both to stimulate research and to promote awareness of this pathway to dementia. We report consensus-based recommendations including guidelines for diagnosis and staging of LATE-NC. For routine autopsy workup of LATE-NC, an anatomically-based preliminary staging scheme is proposed with TDP-43 immunohistochemistry on tissue from three brain areas, reflecting a hierarchical pattern of brain involvement: amygdala, hippocampus, and middle frontal gyrus. LATE-NC appears to affect the medial temporal lobe structures preferentially, but other areas also are impacted. Neuroimaging studies demonstrated that subjects with LATE-NC also had atrophy in the medial temporal lobes, frontal cortex, and other brain regions. 
Genetic studies have thus far indicated five genes with risk alleles for LATE-NC: GRN, TMEM106B, ABCC9, KCNMB2, and APOE. The discovery of these genetic risk variants indicates that LATE shares pathogenetic mechanisms with both frontotemporal lobar degeneration and Alzheimer's disease, but it also suggests disease-specific underlying mechanisms. Large gaps remain in our understanding of LATE. For advances in prevention, diagnosis, and treatment, there is an urgent need for research focused on LATE, including in vitro and animal models. An obstacle to clinical progress is the lack of diagnostic tools, such as biofluid or neuroimaging biomarkers, for ante-mortem detection of LATE. Development of a disease biomarker would augment observational studies seeking to further define the risk factors, natural history, and clinical features of LATE, as well as eventual subject recruitment for targeted therapies in clinical trials.

753 citations


Journal ArticleDOI
TL;DR: A novel deep learning framework achieves highly accurate machine fault diagnosis, using transfer learning to enable and accelerate the training of deep neural networks.
Abstract: We develop a novel deep learning framework to achieve highly accurate machine fault diagnosis, using transfer learning to enable and accelerate the training of deep neural networks. Compared with existing methods, the proposed method is faster to train and more accurate. First, original sensor data are converted to images by conducting a wavelet transformation to obtain time-frequency distributions. Next, a pretrained network is used to extract lower-level features. The labeled time-frequency images are then used to fine-tune the higher levels of the neural network architecture. This paper creates a machine fault diagnosis pipeline, and experiments are carried out to verify the effectiveness and generalization of the pipeline on three main mechanical datasets: induction motors, gearboxes, and bearings, with 6,000, 9,000, and 5,000 time-series samples, respectively. We achieve state-of-the-art results on each dataset, with test accuracy near 100% on most datasets, and on the gearbox dataset we achieve a significant improvement from 94.8% to 99.64%. We created a repository including these datasets, located at mlmechanics.ics.uci.edu.
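A rough sketch of the first pipeline stage described above, turning a 1-D sensor signal into a time-frequency image via a Morlet continuous wavelet transform. The signal, scales, and wavelet parameters below are invented for illustration; the paper's exact transform settings are not given in the abstract:

```python
import numpy as np

def morlet_cwt(signal, widths, w0=5.0):
    """Minimal continuous wavelet transform with a Morlet wavelet.

    Returns a (len(widths), len(signal)) time-frequency magnitude map,
    the kind of image that would be fed to a pretrained CNN.
    """
    n = len(signal)
    out = np.empty((len(widths), n))
    for i, width in enumerate(widths):
        # Sample the complex Morlet wavelet at this scale.
        M = min(10 * int(width), n)
        t = np.arange(-M // 2, M // 2)
        wavelet = np.exp(1j * w0 * t / width) * np.exp(-0.5 * (t / width) ** 2)
        wavelet /= np.sqrt(width)
        out[i] = np.abs(np.convolve(signal, np.conj(wavelet), mode="same"))
    return out

# Toy "bearing" signal: a carrier tone plus an intermittent fault impulse train.
fs = 1024
t = np.arange(fs) / fs
sig = np.sin(2 * np.pi * 60 * t) + 0.5 * np.sin(2 * np.pi * 200 * t) * (t % 0.25 < 0.02)
tf_image = morlet_cwt(sig, widths=np.arange(1, 33))
print(tf_image.shape)  # (32, 1024)
```

Each row of `tf_image` corresponds to one scale; stacking them gives the 2-D image on which lower-level features are extracted by the pretrained network.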

721 citations


Journal ArticleDOI
TL;DR: In certain subgroups, PFS was positively associated with PD-L1 expression (KRAS, EGFR) and with smoking status (BRAF, HER2) and the lack of response in the ALK group was notable.

719 citations


Journal ArticleDOI
TL;DR: The Community Land Model (CLM) is the land component of the Community Earth System Model (CESM) and is used in several global and regional modeling systems.
Abstract: The Community Land Model (CLM) is the land component of the Community Earth System Model (CESM) and is used in several global and regional modeling systems. In this paper, we introduce model developments included in CLM version 5 (CLM5), which is the default land component for CESM2. We assess an ensemble of simulations, including prescribed and prognostic vegetation state, multiple forcing data sets, and CLM4, CLM4.5, and CLM5, against a range of metrics including from the International Land Model Benchmarking (ILAMBv2) package. CLM5 includes new and updated processes and parameterizations: (1) dynamic land units, (2) updated parameterizations and structure for hydrology and snow (spatially explicit soil depth, dry surface layer, revised groundwater scheme, revised canopy interception and canopy snow processes, updated fresh snow density, simple firn model, and Model for Scale Adaptive River Transport), (3) plant hydraulics and hydraulic redistribution, (4) revised nitrogen cycling (flexible leaf stoichiometry, leaf N optimization for photosynthesis, and carbon costs for plant nitrogen uptake), (5) global crop model with six crop types and time‐evolving irrigated areas and fertilization rates, (6) updated urban building energy, (7) carbon isotopes, and (8) updated stomatal physiology. New optional features include demographically structured dynamic vegetation model (Functionally Assembled Terrestrial Ecosystem Simulator), ozone damage to plants, and fire trace gas emissions coupling to the atmosphere. Conclusive establishment of improvement or degradation of individual variables or metrics is challenged by forcing uncertainty, parametric uncertainty, and model structural complexity, but the multivariate metrics presented here suggest a general broad improvement from CLM4 to CLM5.

661 citations


Journal ArticleDOI
TL;DR: During the entire period, the mass loss concentrated in areas closest to warm, salty, subsurface, circumpolar deep water (CDW), consistent with enhanced polar westerlies pushing CDW toward Antarctica to melt its floating ice shelves, destabilize the glaciers, and raise sea level.
Abstract: We use updated drainage inventory, ice thickness, and ice velocity data to calculate the grounding line ice discharge of 176 basins draining the Antarctic Ice Sheet from 1979 to 2017. We compare the results with a surface mass balance model to deduce the ice sheet mass balance. The total mass loss increased from 40 ± 9 Gt/y in 1979–1990 to 50 ± 14 Gt/y in 1989–2000, 166 ± 18 Gt/y in 1999–2009, and 252 ± 26 Gt/y in 2009–2017. In 2009–2017, the mass loss was dominated by the Amundsen/Bellingshausen Sea sectors, in West Antarctica (159 ± 8 Gt/y), Wilkes Land, in East Antarctica (51 ± 13 Gt/y), and West and Northeast Peninsula (42 ± 5 Gt/y). The contribution to sea-level rise from Antarctica averaged 3.6 ± 0.5 mm per decade with a cumulative 14.0 ± 2.0 mm since 1979, including 6.9 ± 0.6 mm from West Antarctica, 4.4 ± 0.9 mm from East Antarctica, and 2.5 ± 0.4 mm from the Peninsula (i.e., East Antarctica is a major participant in the mass loss). During the entire period, the mass loss concentrated in areas closest to warm, salty, subsurface, circumpolar deep water (CDW), that is, consistent with enhanced polar westerlies pushing CDW toward Antarctica to melt its floating ice shelves, destabilize the glaciers, and raise sea level.
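The Gt/y loss rates above translate into sea-level terms via the standard conversion that raising global mean sea level by 1 mm takes roughly 361.8 Gt of water (ocean surface area of about 3.618 × 10^14 m^2). A minimal sketch, with the conversion factor assumed rather than taken from the paper:

```python
# Convert ice-sheet mass loss (Gt/yr) to global mean sea-level rise.
# Assumption: ocean area ~3.618e14 m^2, so ~361.8 Gt of water per mm of GMSL.
OCEAN_AREA_M2 = 3.618e14
GT_PER_MM = OCEAN_AREA_M2 * 1e-3 * 1000 / 1e12  # m^2 * m * (kg/m^3) / (kg/Gt)

def gt_per_year_to_mm_per_decade(mass_loss_gt_yr):
    return 10 * mass_loss_gt_yr / GT_PER_MM

for rate in (40, 50, 166, 252):  # decadal loss rates quoted in the abstract, Gt/y
    print(rate, round(gt_per_year_to_mm_per_decade(rate), 2))
```

The 2009–2017 rate of 252 Gt/y corresponds to about 7 mm per decade, consistent with the whole-period average of 3.6 mm per decade reported in the abstract (loss rates were much lower early in the record).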

654 citations


Journal ArticleDOI
Nasim Mavaddat, Kyriaki Michailidou, Joe Dennis, +307 more (105 institutions)
TL;DR: This PRS, optimized for prediction of estrogen receptor (ER)-specific disease, from the largest available genome-wide association dataset is developed and empirically validated and is a powerful and reliable predictor of breast cancer risk that may improve breast cancer prevention programs.
Abstract: Stratification of women according to their risk of breast cancer based on polygenic risk scores (PRSs) could improve screening and prevention strategies. Our aim was to develop PRSs, optimized for prediction of estrogen receptor (ER)-specific disease, from the largest available genome-wide association dataset and to empirically validate the PRSs in prospective studies. The development dataset comprised 94,075 case subjects and 75,017 control subjects of European ancestry from 69 studies, divided into training and validation sets. Samples were genotyped using genome-wide arrays, and single-nucleotide polymorphisms (SNPs) were selected by stepwise regression or lasso penalized regression. The best performing PRSs were validated in an independent test set comprising 11,428 case subjects and 18,323 control subjects from 10 prospective studies and 190,040 women from UK Biobank (3,215 incident breast cancers). For the best PRSs (313 SNPs), the odds ratio for overall disease per 1 standard deviation in ten prospective studies was 1.61 (95%CI: 1.57-1.65) with area under receiver-operator curve (AUC) = 0.630 (95%CI: 0.628-0.651). The lifetime risk of overall breast cancer in the top centile of the PRSs was 32.6%. Compared with women in the middle quintile, those in the highest 1% of risk had 4.37- and 2.78-fold risks, and those in the lowest 1% of risk had 0.16- and 0.27-fold risks, of developing ER-positive and ER-negative disease, respectively. Goodness-of-fit tests indicated that this PRS was well calibrated and predicts disease risk accurately in the tails of the distribution. This PRS is a powerful and reliable predictor of breast cancer risk that may improve breast cancer prevention programs.
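A PRS of the kind described above is a weighted sum of allele counts, standardized across the cohort. A toy sketch with simulated genotypes; only the 313-SNP count comes from the abstract, and the effect sizes and frequencies are made up:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical miniature PRS: 313 SNPs as in the paper, with made-up
# per-allele effect sizes (betas) and simulated 0/1/2 allele counts.
n_snps, n_women = 313, 10_000
betas = rng.normal(0, 0.05, n_snps)       # assumed per-allele log-odds weights
freqs = rng.uniform(0.05, 0.95, n_snps)   # assumed risk-allele frequencies
genotypes = rng.binomial(2, freqs, size=(n_women, n_snps))

prs = genotypes @ betas                   # weighted allele-count sum
prs_z = (prs - prs.mean()) / prs.std()    # per-SD scale, as in the reported OR

# Women in the top centile of the score distribution, the group for which
# the paper reports a 32.6% lifetime risk of overall breast cancer.
top_1pct = prs_z >= np.quantile(prs_z, 0.99)
print(int(top_1pct.sum()))  # roughly 100 of 10,000
```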

653 citations


Journal ArticleDOI
19 Sep 2019-Nature
TL;DR: A US national experiment showed that a short, online, self-administered growth mindset intervention can increase adolescents’ grades and advanced course-taking, and identified the types of school that were poised to benefit the most.
Abstract: A global priority for the behavioural sciences is to develop cost-effective, scalable interventions that could improve the academic outcomes of adolescents at a population level, but no such interventions have so far been evaluated in a population-generalizable sample. Here we show that a short (less than one hour), online growth mindset intervention-which teaches that intellectual abilities can be developed-improved grades among lower-achieving students and increased overall enrolment to advanced mathematics courses in a nationally representative sample of students in secondary education in the United States. Notably, the study identified school contexts that sustained the effects of the growth mindset intervention: the intervention changed grades when peer norms aligned with the messages of the intervention. Confidence in the conclusions of this study comes from independent data collection and processing, pre-registration of analyses, and corroboration of results by a blinded Bayesian analysis.

583 citations


Journal ArticleDOI
TL;DR: This article elucidates step-by-step the problems typically encountered when training SNNs and guides the reader through the key concepts of synaptic plasticity and data-driven learning in the spiking setting as well as introducing surrogate gradient methods, specifically, as a particularly flexible and efficient method to overcome the aforementioned challenges.
Abstract: Spiking neural networks (SNNs) are nature's versatile solution to fault-tolerant, energy-efficient signal processing. To translate these benefits into hardware, a growing number of neuromorphic spiking NN processors have attempted to emulate biological NNs. These developments have created an imminent need for methods and tools that enable such systems to solve real-world signal processing problems. Like conventional NNs, SNNs can be trained on real, domain-specific data; however, their training requires the overcoming of a number of challenges linked to their binary and dynamical nature. This article elucidates step-by-step the problems typically encountered when training SNNs and guides the reader through the key concepts of synaptic plasticity and data-driven learning in the spiking setting. Accordingly, it gives an overview of existing approaches and provides an introduction to surrogate gradient (SG) methods, specifically, as a particularly flexible and efficient method to overcome the aforementioned challenges.
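A minimal illustration of the surrogate-gradient idea discussed in the article: the forward pass uses the non-differentiable spike threshold, while the backward pass would substitute a smooth surrogate derivative (a fast-sigmoid here, in the spirit of SuperSpike; all constants are illustrative, not from the article):

```python
import numpy as np

def heaviside(x):
    # Non-differentiable spike nonlinearity used in the forward pass.
    return float(x > 0)

def surrogate_grad(x, beta=10.0):
    # Fast-sigmoid surrogate: a smooth stand-in for the Heaviside
    # derivative, evaluated during backpropagation instead.
    return 1.0 / (1.0 + beta * abs(x)) ** 2

def lif_forward(inputs, tau=0.9, v_th=1.0):
    """Leaky integrate-and-fire neuron simulated over T time steps."""
    v = 0.0
    spikes, sgrads = [], []
    for x in inputs:
        v = tau * v + x                          # leaky integration
        s = heaviside(v - v_th)                  # binary spike
        sgrads.append(surrogate_grad(v - v_th))  # what backprop would use
        v -= s * v_th                            # soft reset after a spike
        spikes.append(s)
    return np.array(spikes), np.array(sgrads)

spikes, sgrads = lif_forward(np.full(20, 0.3))
print(int(spikes.sum()))  # 5 spikes over 20 steps for this constant input
```

The binary spike train is what the network emits; the surrogate values are nonzero everywhere, which is exactly what lets gradients flow through the otherwise flat threshold function.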

Journal ArticleDOI
TL;DR: Temporal Segment Networks (TSN) are proposed to model long-range temporal structure with a new segment-based sampling and aggregation scheme, enabling the framework to efficiently learn action models by using the whole video.
Abstract: We present a general and flexible video-level framework for learning action models in videos. This method, called temporal segment network (TSN), aims to model long-range temporal structure with a new segment-based sampling and aggregation scheme. This unique design enables the TSN framework to efficiently learn action models by using the whole video. The learned models could be easily deployed for action recognition in both trimmed and untrimmed videos with simple average pooling and multi-scale temporal window integration, respectively. We also study a series of good practices for the implementation of the TSN framework given limited training samples. Our approach obtains the state-of-the-art performance on five challenging action recognition benchmarks: HMDB51 (71.0 percent), UCF101 (94.9 percent), THUMOS14 (80.1 percent), ActivityNet v1.2 (89.6 percent), and Kinetics400 (75.7 percent). In addition, using the proposed RGB difference as a simple motion representation, our method can still achieve competitive accuracy on UCF101 (91.0 percent) while running at 340 FPS. Furthermore, based on the proposed TSN framework, we won the video classification track at the ActivityNet challenge 2016 among 24 teams.
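The segment-based sampling and average-pooling consensus described above can be sketched in a few lines; the frame count and class count below are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_segments(n_frames, k=3):
    """TSN-style sparse sampling: split the video into k equal segments
    and draw one snippet index from each (a sketch of the scheme)."""
    edges = np.linspace(0, n_frames, k + 1).astype(int)
    return [int(rng.integers(lo, hi)) for lo, hi in zip(edges[:-1], edges[1:])]

def segmental_consensus(snippet_scores):
    # Average-pooling consensus over per-snippet class scores, yielding
    # a single video-level prediction.
    return np.mean(snippet_scores, axis=0)

idx = sample_segments(n_frames=300, k=3)  # one snippet index per third of the video
scores = rng.random((3, 101))             # fake per-snippet scores, 101 classes
video_pred = int(np.argmax(segmental_consensus(scores)))
print(idx, video_pred)
```

Because only k snippets are processed regardless of video length, the whole video informs training at a fixed compute cost.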

Journal ArticleDOI
A. Abada, Marcello Abbrescia, Shehu S. AbdusSalam, +1491 more (239 institutions)
TL;DR: In this article, the authors present the second volume of the Future Circular Collider Conceptual Design Report, devoted to the electron-positron collider FCC-ee, and present the accelerator design, performance reach, a staged operation scenario, the underlying technologies, civil engineering, technical infrastructure, and an implementation plan.
Abstract: In response to the 2013 Update of the European Strategy for Particle Physics, the Future Circular Collider (FCC) study was launched, as an international collaboration hosted by CERN. This study covers a highest-luminosity high-energy lepton collider (FCC-ee) and an energy-frontier hadron collider (FCC-hh), which could, successively, be installed in the same 100 km tunnel. The scientific capabilities of the integrated FCC programme would serve the worldwide community throughout the 21st century. The FCC study also investigates an LHC energy upgrade, using FCC-hh technology. This document constitutes the second volume of the FCC Conceptual Design Report, devoted to the electron-positron collider FCC-ee. After summarizing the physics discovery opportunities, it presents the accelerator design, performance reach, a staged operation scenario, the underlying technologies, civil engineering, technical infrastructure, and an implementation plan. FCC-ee can be built with today’s technology. Most of the FCC-ee infrastructure could be reused for FCC-hh. Combining concepts from past and present lepton colliders and adding a few novel elements, the FCC-ee design promises outstandingly high luminosity. This will make the FCC-ee a unique precision instrument to study the heaviest known particles (Z, W and H bosons and the top quark), offering great direct and indirect sensitivity to new physics.

Journal ArticleDOI
Arjun Dey, David J. Schlegel, Dustin Lang, +162 more (52 institutions)
TL;DR: The DESI Legacy Imaging Surveys (http://legacysurvey.org/) are a combination of three public projects (the Dark Energy Camera Legacy Survey, the Beijing-Arizona Sky Survey, and the Mayall z-band Legacy Survey) that will jointly image ≈14,000 deg2 of the extragalactic sky visible from the northern hemisphere in three optical bands (g, r, and z) using telescopes at the Kitt Peak National Observatory and the Cerro Tololo Inter-American Observatory.
Abstract: The DESI Legacy Imaging Surveys (http://legacysurvey.org/) are a combination of three public projects (the Dark Energy Camera Legacy Survey, the Beijing–Arizona Sky Survey, and the Mayall z-band Legacy Survey) that will jointly image ≈14,000 deg2 of the extragalactic sky visible from the northern hemisphere in three optical bands (g, r, and z) using telescopes at the Kitt Peak National Observatory and the Cerro Tololo Inter-American Observatory. The combined survey footprint is split into two contiguous areas by the Galactic plane. The optical imaging is conducted using a unique strategy of dynamically adjusting the exposure times and pointing selection during observing that results in a survey of nearly uniform depth. In addition to calibrated images, the project is delivering a catalog, constructed by using a probabilistic inference-based approach to estimate source shapes and brightnesses. The catalog includes photometry from the grz optical bands and from four mid-infrared bands (at 3.4, 4.6, 12, and 22 μm) observed by the Wide-field Infrared Survey Explorer satellite during its full operational lifetime. The project plans two public data releases each year. All the software used to generate the catalogs is also released with the data. This paper provides an overview of the Legacy Surveys project.

Journal ArticleDOI
TL;DR: Single Mo atoms anchored to nitrogen-doped porous carbon as a cost-effective catalyst for the NRR achieves a high NH3 yield rate and a high Faradaic efficiency, considerably higher compared to previously reported non-precious-metal electrocatalysts.
Abstract: NH3 synthesis by the electrocatalytic N2 reduction reaction (NRR) under ambient conditions is an appealing alternative to the currently employed industrial method, the Haber-Bosch process, which requires high temperature and pressure. We report single Mo atoms anchored to nitrogen-doped porous carbon as a cost-effective catalyst for the NRR. Benefiting from the optimally high density of active sites and hierarchically porous carbon frameworks, this catalyst achieves a high NH3 yield rate (34.0 ± 3.6 μg NH3 h^-1 mg_cat^-1) and a high Faradaic efficiency (14.6 ± 1.6%) in 0.1 M KOH at room temperature. These values are considerably higher than those of previously reported non-precious-metal electrocatalysts. Moreover, this catalyst displays no obvious current drop during a 50,000 s NRR test, and high activity and durability are also achieved in 0.1 M HCl. The findings provide a promising lead for the design of efficient and robust single-atom non-precious-metal catalysts for the electrocatalytic NRR.
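The quoted Faradaic efficiency follows from charge bookkeeping: the NRR consumes three electrons per NH3 molecule (N2 + 6H+ + 6e- -> 2NH3), so FE = 3·F·n(NH3)/Q. A sketch with illustrative numbers, not the paper's raw data:

```python
# Faradaic efficiency of electrocatalytic NH3 synthesis: three electrons
# are consumed per NH3, so FE = 3 * F * n(NH3) / Q.
F = 96485.0  # C/mol, Faraday constant

def faradaic_efficiency(n_nh3_mol, charge_C):
    return 3 * F * n_nh3_mol / charge_C

# Illustrative (assumed) numbers: 2e-7 mol NH3 produced while passing 0.4 C.
print(round(faradaic_efficiency(2e-7, 0.4), 3))  # 0.145, i.e. 14.5 %
```

Efficiencies of this order are what the abstract reports; the remaining charge mostly goes to the competing hydrogen evolution reaction.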

Journal ArticleDOI
16 May 2019-Cell
TL;DR: Systemic administration of a brain-penetrant selenopeptide activates homeostatic transcription to inhibit cell death and improves function when delivered after hemorrhagic or ischemic stroke.

Journal ArticleDOI
TL;DR: The ibrutinib-rituximab regimen resulted in progression-free survival and overall survival that were superior to those with a standard chemoimmunotherapy regimen among patients 70 years of age or younger with previously untreated CLL.
Abstract: Background Data regarding the efficacy of treatment with ibrutinib–rituximab, as compared with standard chemoimmunotherapy with fludarabine, cyclophosphamide, and rituximab, in patients with previously untreated chronic lymphocytic leukemia (CLL) have been limited.

Journal ArticleDOI
TL;DR: Even in years of high SMB, enhanced glacier discharge has remained sufficiently high above equilibrium to maintain an annual mass loss every year since 1998, and the acceleration in mass loss switched from positive in 2000–2010 to negative in 2010–2018, which illustrates the difficulty of extrapolating short records into longer-term trends.
Abstract: We reconstruct the mass balance of the Greenland Ice Sheet using a comprehensive survey of thickness, surface elevation, velocity, and surface mass balance (SMB) of 260 glaciers from 1972 to 2018. We calculate mass discharge, D, into the ocean directly for 107 glaciers (85% of D) and indirectly for 110 glaciers (15%) using velocity-scaled reference fluxes. The decadal mass balance switched from a mass gain of +47 ± 21 Gt/y in 1972-1980 to a loss of 51 ± 17 Gt/y in 1980-1990. The mass loss increased from 41 ± 17 Gt/y in 1990-2000, to 187 ± 17 Gt/y in 2000-2010, to 286 ± 20 Gt/y in 2010-2018, or sixfold since the 1980s, or 80 ± 6 Gt/y per decade, on average. The acceleration in mass loss switched from positive in 2000-2010 to negative in 2010-2018 due to a series of cold summers, which illustrates the difficulty of extrapolating short records into longer-term trends. Cumulated since 1972, the largest contributions to global sea level rise are from northwest (4.4 ± 0.2 mm), southeast (3.0 ± 0.3 mm), and central west (2.0 ± 0.2 mm) Greenland, with a total 13.7 ± 1.1 mm for the ice sheet. The mass loss is controlled at 66 ± 8% by glacier dynamics (9.1 mm) and 34 ± 8% by SMB (4.6 mm). Even in years of high SMB, enhanced glacier discharge has remained sufficiently high above equilibrium to maintain an annual mass loss every year since 1998.

Journal ArticleDOI
F. Kyle Satterstrom, Jack A. Kosmicki, Jiebiao Wang, Michael S. Breen, +150 more (45 institutions)
TL;DR: Using an enhanced Bayesian framework to integrate de novo and case-control rare variation, 102 risk genes are identified at a false discovery rate of ≤ 0.1, consistent with multiple paths to an excitatory/inhibitory imbalance underlying ASD.
Abstract: We present the largest exome sequencing study of autism spectrum disorder (ASD) to date (n=35,584 total samples, 11,986 with ASD). Using an enhanced Bayesian framework to integrate de novo and case-control rare variation, we identify 102 risk genes at a false discovery rate ≤ 0.1. Of these genes, 49 show higher frequencies of disruptive de novo variants in individuals ascertained for severe neurodevelopmental delay, while 53 show higher frequencies in individuals ascertained for ASD; comparing ASD cases with mutations in these groups reveals phenotypic differences. Expressed early in brain development, most of the risk genes have roles in regulation of gene expression or neuronal communication (i.e., mutations effect neurodevelopmental and neurophysiological changes), and 13 fall within loci recurrently hit by copy number variants. In human cortex single-cell gene expression data, expression of risk genes is enriched in both excitatory and inhibitory neuronal lineages, consistent with multiple paths to an excitatory/inhibitory imbalance underlying ASD.

Journal ArticleDOI
Jean-Christophe Golaz1, Peter M. Caldwell1, Luke Van Roekel2, Mark R. Petersen2, Qi Tang1, Jonathan Wolfe2, G. W. Abeshu3, Valentine G. Anantharaj4, Xylar Asay-Davis2, David C. Bader1, Sterling Baldwin1, Gautam Bisht5, Peter A. Bogenschutz1, Marcia L. Branstetter4, Michael A. Brunke6, Steven R. Brus2, Susannah M. Burrows7, Philip Cameron-Smith1, Aaron S. Donahue1, Michael Deakin8, Michael Deakin9, Richard C. Easter7, Katherine J. Evans4, Yan Feng10, Mark Flanner11, James G. Foucar8, Jeremy Fyke2, Brian M. Griffin12, Cecile Hannay13, Bryce E. Harrop7, Mattthew J. Hoffman2, Elizabeth Hunke2, Robert Jacob10, Douglas W. Jacobsen2, Nicole Jeffery2, Philip W. Jones2, Noel Keen5, Stephen A. Klein1, Vincent E. Larson12, L. Ruby Leung7, Hongyi Li3, Wuyin Lin14, William H. Lipscomb13, William H. Lipscomb2, Po-Lun Ma7, Salil Mahajan4, Mathew Maltrud2, Azamat Mametjanov10, Julie L. McClean15, Renata B. McCoy1, Richard Neale13, Stephen Price2, Yun Qian7, Philip J. Rasch7, J. E. Jack Reeves Eyre6, William J. Riley5, Todd D. Ringler16, Todd D. Ringler2, Andrew Roberts2, Erika Louise Roesler8, Andrew G. Salinger8, Zeshawn Shaheen1, Xiaoying Shi4, Balwinder Singh7, Jinyun Tang5, Mark A. Taylor8, Peter E. Thornton4, Adrian K. Turner2, Milena Veneziani2, Hui Wan7, Hailong Wang7, Shanlin Wang2, Dean N. Williams1, Phillip J. Wolfram2, Patrick H. Worley4, Shaocheng Xie1, Yang Yang7, Jin-Ho Yoon17, Mark D. Zelinka1, Charles S. Zender18, Xubin Zeng6, Chengzhu Zhang1, Kai Zhang7, Yuying Zhang1, X. Zheng1, Tian Zhou7, Qing Zhu5 
TL;DR: The Energy Exascale Earth System Model (E3SM) is a U.S. Department of Energy project that aims to develop and validate the E3SM Earth system model.
Funding: Energy Exascale Earth System Model (E3SM) project - U.S. Department of Energy, Office of Science, Office of Biological and Environmental Research; Climate Model Development and Validation activity - Office of Biological and Environmental Research in the US Department of Energy Office of Science; Regional and Global Modeling and Analysis Program of the U.S. Department of Energy, Office of Science, Office of Biological and Environmental Research; National Research Foundation [NRF_2017R1A2b4007480]; Office of Science of the U.S. Department of Energy [DE-AC02-05CH11231]; DOE Office of Science User Facility [DE-AC05-00OR22725]; U.S. Department of Energy by Lawrence Livermore National Laboratory [DE-AC52-07NA27344]; DOE [DE-AC05-76RLO1830]; National Center for Atmospheric Research - National Science Foundation [1852977]; [DE-SC0012778]

Journal ArticleDOI
Donald J. Hagler1, Sean N. Hatton1, M. Daniela Cornejo1, Carolina Makowski2, Damien A. Fair3, Anthony Steven Dick4, Matthew T. Sutherland4, B. J. Casey5, M Deanna6, Michael P. Harms6, Richard Watts5, James M. Bjork7, Hugh Garavan8, Laura Hilmer1, Christopher J. Pung1, Chelsea S. Sicat1, Joshua M. Kuperman1, Hauke Bartsch1, Feng Xue1, Mary M. Heitzeg9, Angela R. Laird4, Thanh T. Trinh1, Raul Gonzalez4, Susan F. Tapert1, Michael C. Riedel4, Lindsay M. Squeglia10, Luke W. Hyde9, Monica D. Rosenberg5, Eric Earl3, Katia D. Howlett11, Fiona C. Baker12, Mary E. Soules9, Jazmin Diaz1, Octavio Ruiz de Leon1, Wesley K. Thompson1, Michael C. Neale7, Megan M. Herting13, Elizabeth R. Sowell13, Ruben P. Alvarez11, Samuel W. Hawes4, Mariana Sanchez4, Jerzy Bodurka14, Florence J. Breslin14, Amanda Sheffield Morris14, Martin P. Paulus14, W. Kyle Simmons14, Jonathan R. Polimeni15, Andre van der Kouwe15, Andrew S. Nencka16, Kevin M. Gray10, Carlo Pierpaoli11, John A. Matochik11, Antonio Noronha11, Will M. Aklin11, Kevin P. Conway11, Meyer D. Glantz11, Elizabeth Hoffman11, Roger Little11, Marsha F. Lopez11, Vani Pariyadath11, Susan R.B. Weiss11, Dana L. Wolff-Hughes, Rebecca DelCarmen-Wiggins, Sarah W. Feldstein Ewing3, Oscar Miranda-Dominguez3, Bonnie J. Nagel3, Anders Perrone3, Darrick Sturgeon3, Aimee Goldstone12, Adolf Pfefferbaum12, Kilian M. Pohl12, Devin Prouty12, Kristina A. Uban17, Susan Y. Bookheimer18, Mirella Dapretto18, Adriana Galván18, Kara Bagot1, Jay N. Giedd1, M. Alejandra Infante1, Joanna Jacobus1, Kevin Patrick1, Paul D. Shilling1, Rahul S. Desikan19, Yi Li19, Leo P. Sugrue19, Marie T. Banich20, Naomi P. Friedman20, John K. Hewitt20, Christian J. Hopfer20, Joseph T. Sakai20, Jody Tanabe20, Linda B. Cottler21, Sara Jo Nixon21, Linda Chang22, Christine C. Cloak22, Thomas Ernst22, Gloria Reeves22, David N. Kennedy23, Steve Heeringa9, Scott Peltier9, John E. Schulenberg9, Chandra Sripada9, Robert A. Zucker9, William G. Iacono24, Monica Luciana24, Finnegan J. 
Calabro25, Duncan B. Clark25, David A. Lewis25, Beatriz Luna25, Claudiu Schirda25, Tufikameni Brima26, John J. Foxe26, Edward G. Freedman26, Daniel W. Mruzek26, Michael J. Mason27, Rebekah S. Huber28, Erin McGlade28, Andrew P. Prescot28, Perry F. Renshaw28, Deborah A. Yurgelun-Todd28, Nicholas Allgaier8, Julie A. Dumas8, Masha Y. Ivanova8, Alexandra Potter8, Paul Florsheim29, Christine L. Larson29, Krista M. Lisdahl29, Michael E. Charness30, Michael E. Charness15, Michael E. Charness31, Bernard F. Fuemmeler7, John M. Hettema7, Hermine H. Maes7, Joel L. Steinberg7, Andrey P. Anokhin6, Paul E.A. Glaser6, Andrew C. Heath6, Pamela A. F. Madden6, Arielle R. Baskin-Sommers5, R. Todd Constable5, Steven Grant11, Gayathri J. Dowling11, Sandra A. Brown1, Terry L. Jernigan1, Anders M. Dale1 
TL;DR: Describes the baseline neuroimaging processing and subject-level analysis methods used by the Adolescent Brain Cognitive Development (ABCD) Study, a resource of unprecedented scale and depth for studying typical and atypical development.

Journal ArticleDOI
A. Abada1, Marcello Abbrescia2, Marcello Abbrescia3, Shehu S. AbdusSalam4  +1496 moreInstitutions (238)
TL;DR: In this paper, the authors describe the detailed design and preparation of a construction project for a post-LHC circular energy frontier collider in collaboration with national institutes, laboratories and universities worldwide, and enhanced by a strong participation of industrial partners.
Abstract: Particle physics has arrived at an important moment of its history. The discovery of the Higgs boson, with a mass of 125 GeV, completes the matrix of particles and interactions that has constituted the “Standard Model” for several decades. This model is a consistent and predictive theory, which has so far proven successful at describing all phenomena accessible to collider experiments. However, several experimental facts require the extension of the Standard Model, and explanations are needed for observations such as the abundance of matter over antimatter, the striking evidence for dark matter and the non-zero neutrino masses. Theoretical issues such as the hierarchy problem and, more generally, the dynamical origin of the Higgs mechanism likewise point to the existence of physics beyond the Standard Model. This report contains the description of a novel research infrastructure based on a highest-energy hadron collider with a centre-of-mass collision energy of 100 TeV and an integrated luminosity at least a factor of 5 larger than that of the HL-LHC. It will extend the current energy frontier by almost an order of magnitude. The mass reach for direct discovery will extend to several tens of TeV and will allow, for example, the production of new particles whose existence could be indirectly exposed by precision measurements during the preceding e+e– collider phase. This collider will also precisely measure the Higgs self-coupling and thoroughly explore the dynamics of electroweak symmetry breaking at the TeV scale, to elucidate the nature of the electroweak phase transition. WIMPs as thermal dark matter candidates will be discovered, or ruled out. As a single project, this particle collider infrastructure will serve the world-wide physics community for about 25 years and, in combination with a lepton collider (see FCC conceptual design report volume 2), will provide a research tool until the end of the 21st century.
Collision energies beyond 100 TeV can be considered when using high-temperature superconductors. The European Strategy for Particle Physics (ESPP) update 2013 stated “To stay at the forefront of particle physics, Europe needs to be in a position to propose an ambitious post-LHC accelerator project at CERN by the time of the next Strategy update”. The FCC study has implemented the ESPP recommendation by developing a long-term vision for an “accelerator project in a global context”. This document describes the detailed design and preparation of a construction project for a post-LHC circular energy frontier collider “in collaboration with national institutes, laboratories and universities worldwide”, and enhanced by a strong participation of industrial partners. Now, a coordinated preparation effort can be based on a core of an ever-growing consortium of already more than 135 institutes worldwide. The technology for constructing a high-energy circular hadron collider can be brought to the technology readiness level required for construction within the coming ten years through a focused R&D programme. The FCC-hh concept comprises in the baseline scenario a power-saving, low-temperature superconducting magnet system based on an evolution of the Nb3Sn technology pioneered at the HL-LHC, an energy-efficient cryogenic refrigeration infrastructure based on a neon-helium (Nelium) light gas mixture, a high-reliability and low-loss cryogen distribution infrastructure based on Invar, high-power distributed beam transfer using superconducting elements, and local magnet energy recovery and re-use technologies that are already gradually being introduced at other CERN accelerators.
On a longer timescale, high-temperature superconductors can be developed together with industrial partners to achieve an even more energy-efficient particle collider or to reach even higher collision energies. The re-use of the LHC and its injector chain, which also serve a concurrently running physics programme, is an essential lever towards an overall sustainable research infrastructure at the energy frontier. Strategic R&D for FCC-hh aims at minimising construction cost and energy consumption, while maximising the socio-economic impact. It will mitigate technology-related risks and ensure that industry can benefit from an acceptable utility. Concerning the implementation, a preparatory phase of about eight years is both necessary and adequate to establish the project governance and organisation structures, to build the international machine and experiment consortia, to develop a territorial implantation plan in agreement with the host-states’ requirements, to optimise the disposal of land and underground volumes, and to prepare the civil engineering project. Such a large-scale, international fundamental research infrastructure, tightly involving industrial partners and providing training at all education levels, will be a strong motor of economic and societal development in all participating nations. The FCC study has implemented a set of actions towards a coherent vision for the world-wide high-energy and particle physics community, providing a collaborative framework for topically complementary and geographically well-balanced contributions. This conceptual design report lays the foundation for a subsequent infrastructure preparatory and technical design phase.

Journal ArticleDOI
01 Mar 2019-Science
TL;DR: Advances in the understanding of pantropical interbasin climate interactions are reviewed and their implications for both climate prediction and future climate projections are reviewed.
Abstract: The El Niño-Southern Oscillation (ENSO), which originates in the Pacific, is the strongest and best-known mode of tropical climate variability. Its reach is global, and it can force climate variations of the tropical Atlantic and Indian Oceans by perturbing the global atmospheric circulation. Less appreciated is how the tropical Atlantic and Indian Oceans affect the Pacific. Especially noteworthy is the multidecadal Atlantic warming that began in the late 1990s, because recent research suggests that it has influenced Indo-Pacific climate, the character of the ENSO cycle, and the hiatus in global surface warming. Discovery of these pantropical interactions provides a pathway forward for improving predictions of climate variability in the current climate and for refining projections of future climate under different anthropogenic forcing scenarios.

Journal ArticleDOI
TL;DR: A comprehensive pipeline called Extensive de-novo TE Annotator (EDTA) is created that produces a filtered non-redundant TE library for annotation of structurally intact and fragmented elements and will greatly facilitate TE annotation in eukaryotic genomes.
Abstract: Sequencing technology and assembly algorithms have matured to the point that high-quality de novo assembly is possible for large, repetitive genomes. Current assemblies traverse transposable elements (TEs) and provide an opportunity for comprehensive annotation of TEs. Numerous methods exist for annotation of each class of TEs, but their relative performances have not been systematically compared. Moreover, a comprehensive pipeline is needed to produce a non-redundant library of TEs for species lacking this resource to generate whole-genome TE annotations. We benchmark existing programs based on a carefully curated library of rice TEs. We evaluate the performance of methods annotating long terminal repeat (LTR) retrotransposons, terminal inverted repeat (TIR) transposons, short TIR transposons known as miniature inverted-repeat transposable elements (MITEs), and Helitrons. Performance metrics include sensitivity, specificity, accuracy, precision, false discovery rate (FDR), and F1. Using the most robust programs, we create a comprehensive pipeline called Extensive de-novo TE Annotator (EDTA) that produces a filtered non-redundant TE library for annotation of structurally intact and fragmented elements. EDTA also deconvolutes nested TE insertions frequently found in highly repetitive genomic regions. Using other model species with curated TE libraries (maize and Drosophila), EDTA is shown to be robust across both plant and animal species. The benchmarking results and pipeline developed here will greatly facilitate TE annotation in eukaryotic genomes. These annotations will promote a much more in-depth understanding of the diversity and evolution of TEs at both intra- and inter-species levels. EDTA is open-source and freely available: https://github.com/oushujun/EDTA.
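The performance metrics named in the abstract follow the standard binary-classification definitions. As an illustration (this is not part of EDTA itself, and the interpretation of the counts as annotated genomic bases is an assumption for the example), they can be computed from confusion-matrix counts like so:

```python
def classification_metrics(tp, fp, tn, fn):
    """Standard binary-classification metrics from confusion-matrix counts.

    In a TE-annotation benchmark, tp/fp/tn/fn could count genomic bases
    called TE vs. non-TE against a curated reference library (an assumed
    setup for illustration; the paper defines its own evaluation units).
    """
    sensitivity = tp / (tp + fn)                # recall: fraction of real TEs found
    specificity = tn / (tn + fp)                # fraction of non-TE sequence kept clean
    precision = tp / (tp + fp)                  # fraction of TE calls that are real
    accuracy = (tp + tn) / (tp + fp + tn + fn)  # overall agreement
    fdr = fp / (tp + fp)                        # false discovery rate = 1 - precision
    f1 = 2 * precision * sensitivity / (precision + sensitivity)
    return {"sensitivity": sensitivity, "specificity": specificity,
            "precision": precision, "accuracy": accuracy,
            "fdr": fdr, "f1": f1}
```

For example, a tool that recovers 90 of 110 true TE bases while making 10 false calls (with 80 true negatives) scores precision 0.9, FDR 0.1, and F1 of about 0.857.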

Journal ArticleDOI
A. Abada1, Marcello Abbrescia2, Marcello Abbrescia3, Shehu S. AbdusSalam4  +1501 moreInstitutions (239)
TL;DR: In this article, the physics opportunities of the Future Circular Collider (FCC) are reviewed, covering its e+e-, pp, ep and heavy ion programmes, and the measurement capabilities of each FCC component, addressing the study of electroweak, Higgs and strong interactions.
Abstract: We review the physics opportunities of the Future Circular Collider, covering its e+e-, pp, ep and heavy ion programmes. We describe the measurement capabilities of each FCC component, addressing the study of electroweak, Higgs and strong interactions, the top quark and flavour, as well as phenomena beyond the Standard Model. We highlight the synergy and complementarity of the different colliders, which will contribute to a uniquely coherent and ambitious research programme, providing an unmatchable combination of precision and sensitivity to new physics.

Journal ArticleDOI
TL;DR: In this article, a comprehensive overview of ecosystem services provided by natural and semi-natural grasslands, using southern Africa (SA) and northwest Europe as case studies, respectively, is presented.
Abstract: Extensively managed grasslands are recognized globally for their high biodiversity and their social and cultural values. However, their capacity to deliver multiple ecosystem services (ES) as parts of agricultural systems is surprisingly understudied compared to other production systems. We undertook a comprehensive overview of ES provided by natural and semi-natural grasslands, using southern Africa (SA) and northwest Europe as case studies, respectively. We show that these grasslands can supply additional non-agricultural services, such as water supply and flow regulation, carbon storage, erosion control, climate mitigation, pollination, and cultural ES. While demand for ecosystem services seems to balance supply in natural grasslands of SA, the smaller areas of semi-natural grasslands in Europe appear not to meet the demand for many services. We identified three bundles of related ES from grasslands: water ES including fodder production, cultural ES connected to livestock production, and population-based regulating services (e.g., pollination and biological control), which were also linked to biodiversity. Greenhouse gas emission mitigation seemed unrelated to the three bundles. The similarities among the bundles in SA and northwestern Europe suggest that there are generalities in ES relations among natural and semi-natural grassland areas. We assessed trade-offs and synergies among services in relation to management practices and found that although some trade-offs are inevitable, appropriate management may create synergies and avoid trade-offs among many services. We argue that ecosystem service and food security research and policy should give higher priority to how grasslands can be managed for fodder and meat production alongside other ES.
By integrating grasslands into agricultural production systems and land-use decisions locally and regionally, their potential to contribute to functional landscapes and to food security and sustainable livelihoods can be greatly enhanced.

Proceedings ArticleDOI
09 Sep 2019
TL;DR: KnowBert is a general method to embed multiple knowledge bases (KBs) into large-scale models, enhancing their representations with structured, human-curated knowledge: an integrated entity linker retrieves relevant entity embeddings, and contextual word representations are then updated via a form of word-to-entity attention.
Abstract: Contextual word representations, typically trained on unstructured, unlabeled text, do not contain any explicit grounding to real world entities and are often unable to remember facts about those entities. We propose a general method to embed multiple knowledge bases (KBs) into large scale models, and thereby enhance their representations with structured, human-curated knowledge. For each KB, we first use an integrated entity linker to retrieve relevant entity embeddings, then update contextual word representations via a form of word-to-entity attention. In contrast to previous approaches, the entity linkers and self-supervised language modeling objective are jointly trained end-to-end in a multitask setting that combines a small amount of entity linking supervision with a large amount of raw text. After integrating WordNet and a subset of Wikipedia into BERT, the knowledge enhanced BERT (KnowBert) demonstrates improved perplexity, ability to recall facts as measured in a probing task and downstream performance on relationship extraction, entity typing, and word sense disambiguation. KnowBert’s runtime is comparable to BERT’s and it scales to large KBs.
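The word-to-entity attention step described above can be sketched minimally. The following is a hypothetical simplification (a single projection `W_e` and plain scaled dot-product attention over candidate entities), not the actual KnowBert architecture, which additionally uses entity-span pooling, linker scores, and a recontextualization layer:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def word_to_entity_attention(H, E, W_e):
    """One simplified word-to-entity attention update.

    H:   (T, d)   contextual word representations for T tokens
    E:   (M, d_e) candidate entity embeddings retrieved by an entity linker
    W_e: (d_e, d) assumed projection of entities into the word space
    """
    Ep = E @ W_e                              # (M, d) projected entity embeddings
    scores = H @ Ep.T / np.sqrt(H.shape[1])   # (T, M) scaled dot-product scores
    A = softmax(scores, axis=-1)              # each token attends over entities
    return H + A @ Ep, A                      # residual update, attention weights

# Toy dimensions for illustration only.
rng = np.random.default_rng(0)
H = rng.normal(size=(5, 16))     # 5 tokens, 16-dim word space
E = rng.normal(size=(7, 8))      # 7 candidate entities, 8-dim entity space
W_e = rng.normal(size=(8, 16))
H_new, A = word_to_entity_attention(H, E, W_e)
```

The residual form (`H + ...`) mirrors how knowledge-injection layers are typically inserted without disturbing the pretrained representations too much; the exact parameterization here is illustrative.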

Journal ArticleDOI
14 Jun 2019-Science
TL;DR: The major proposed pathway involves an initial decarboxylation of l-dopa to dopamine, followed by conversion of dopamine to m-tyramine by means of a distinctly microbial dehydroxylation reaction.
Abstract: The human gut microbiota metabolizes the Parkinson's disease medication levodopa (l-dopa), potentially reducing drug availability and causing side effects. However, the organisms, genes, and enzymes responsible for this activity in patients and their susceptibility to inhibition by host-targeted drugs are unknown. Here, we describe an interspecies pathway for gut bacterial l-dopa metabolism. Conversion of l-dopa to dopamine by a pyridoxal phosphate-dependent tyrosine decarboxylase from Enterococcus faecalis is followed by transformation of dopamine to m-tyramine by a molybdenum-dependent dehydroxylase from Eggerthella lenta. These enzymes predict drug metabolism in complex human gut microbiotas. Although a drug that targets host aromatic amino acid decarboxylase does not prevent gut microbial l-dopa decarboxylation, we identified a compound that inhibits this activity in Parkinson's patient microbiotas and increases l-dopa bioavailability in mice.

Journal ArticleDOI
01 Jul 2019-Nature
TL;DR: A comprehensive assessment of ‘committed’ carbon dioxide emissions—from existing and proposed fossil-fuel-based infrastructure—finds that these emissions may exceed the level required to keep global warming within 1.5 degrees Celsius.
Abstract: Net anthropogenic emissions of carbon dioxide (CO2) must approach zero by mid-century (2050) in order to stabilize the global mean temperature at the level targeted by international efforts1–5. Yet continued expansion of fossil-fuel-burning energy infrastructure implies already ‘committed’ future CO2 emissions6–13. Here we use detailed datasets of existing fossil-fuel energy infrastructure in 2018 to estimate regional and sectoral patterns of committed CO2 emissions, the sensitivity of such emissions to assumed operating lifetimes and schedules, and the economic value of the associated infrastructure. We estimate that, if operated as historically, existing infrastructure will cumulatively emit about 658 gigatonnes of CO2 (with a range of 226 to 1,479 gigatonnes CO2, depending on the lifetimes and utilization rates assumed). More than half of these emissions are predicted to come from the electricity sector; infrastructure in China, the USA and the 28 member states of the European Union represents approximately 41 per cent, 9 per cent and 7 per cent of the total, respectively. If built, proposed power plants (planned, permitted or under construction) would emit roughly an extra 188 (range 37–427) gigatonnes CO2. Committed emissions from existing and proposed energy infrastructure (about 846 gigatonnes CO2) thus represent more than the entire carbon budget that remains if mean warming is to be limited to 1.5 degrees Celsius (°C) with a probability of 66 to 50 per cent (420–580 gigatonnes CO2)5, and perhaps two-thirds of the remaining carbon budget if mean warming is to be limited to less than 2 °C (1,170–1,500 gigatonnes CO2)5. The remaining carbon budget estimates are varied and nuanced14,15, and depend on the climate target and the availability of large-scale negative emissions16. 
Nevertheless, our estimates suggest that little or no new CO2-emitting infrastructure can be commissioned, and that existing infrastructure may need to be retired early (or be retrofitted with carbon capture and storage technology) in order to meet the Paris Agreement climate goals17. Given the asset value per tonne of committed emissions, we suggest that the most cost-effective premature infrastructure retirements will be in the electricity and industry sectors, if non-emitting alternatives are available and affordable4,18. A comprehensive assessment of ‘committed’ carbon dioxide emissions—from existing and proposed fossil-fuel-based infrastructure—finds that these emissions may exceed the level required to keep global warming within 1.5 degrees Celsius.
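The accounting in this abstract can be checked with simple arithmetic using the central estimates it quotes (ranges omitted for brevity):

```python
# Central estimates quoted in the abstract, in gigatonnes of CO2 (GtCO2).
existing_gt = 658        # committed emissions from infrastructure existing in 2018
proposed_gt = 188        # extra emissions if proposed power plants are built
total_gt = existing_gt + proposed_gt   # combined committed emissions

# Remaining carbon budgets quoted from ref. 5 (GtCO2).
budget_1p5C = (420, 580)    # 1.5 degrees C, 66-50% probability
budget_2C = (1170, 1500)    # below 2 degrees C

print(total_gt)                            # 846
print(total_gt > budget_1p5C[1])           # True: exceeds the entire 1.5 C budget
print(round(total_gt / budget_2C[0], 2))   # 0.72: consistent with "perhaps two-thirds"
```

This confirms the abstract's headline comparison: committed emissions from existing plus proposed infrastructure alone exhaust the 1.5 °C budget even at its upper bound.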

Journal ArticleDOI
TL;DR: It is found that microglial depletion in a mouse model of Alzheimer’s disease impairs plaque formation and that Aβ-induced changes in neuronal gene expression are microglia-mediated.
Abstract: Many risk genes for the development of Alzheimer's disease (AD) are exclusively or highly expressed in myeloid cells. Microglia are dependent on colony-stimulating factor 1 receptor (CSF1R) signaling for their survival. We designed and synthesized a highly selective brain-penetrant CSF1R inhibitor (PLX5622) allowing for extended and specific microglial elimination, preceding and during pathology development. We find that in the 5xFAD mouse model of AD, plaques fail to form in the parenchymal space following microglial depletion, except in areas containing surviving microglia. Instead, Aβ deposits in cortical blood vessels reminiscent of cerebral amyloid angiopathy. Altered gene expression in the 5xFAD hippocampus is also reversed by the absence of microglia. Transcriptional analyses of the residual plaque-forming microglia show they exhibit a disease-associated microglia profile. Collectively, we describe the structure, formulation, and efficacy of PLX5622, which allows for sustained microglial depletion and identify roles of microglia in initiating plaque pathogenesis.