
Showing papers by "University of California, Irvine" published in 2020


Journal ArticleDOI
16 Sep 2020-Nature
TL;DR: In this paper, the authors review how a few fundamental array concepts lead to a simple and powerful programming paradigm for organizing, exploring and analysing scientific data, and discuss NumPy's evolution into a flexible interoperability layer between increasingly specialized computational libraries.
Abstract: Array programming provides a powerful, compact and expressive syntax for accessing, manipulating and operating on data in vectors, matrices and higher-dimensional arrays. NumPy is the primary array programming library for the Python language. It has an essential role in research analysis pipelines in fields as diverse as physics, chemistry, astronomy, geoscience, biology, psychology, materials science, engineering, finance and economics. For example, in astronomy, NumPy was an important part of the software stack used in the discovery of gravitational waves [1] and in the first imaging of a black hole [2]. Here we review how a few fundamental array concepts lead to a simple and powerful programming paradigm for organizing, exploring and analysing scientific data. NumPy is the foundation upon which the scientific Python ecosystem is constructed. It is so pervasive that several projects, targeting audiences with specialized needs, have developed their own NumPy-like interfaces and array objects. Owing to its central position in the ecosystem, NumPy increasingly acts as an interoperability layer between such array computation libraries and, together with its application programming interface (API), provides a flexible framework to support the next decade of scientific and industrial analysis. NumPy is the primary array programming library for Python; here its fundamental concepts are reviewed and its evolution into a flexible interoperability layer between increasingly specialized computational libraries is discussed.
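
To make the fundamental array concepts the review describes concrete (vectorization, broadcasting, and boolean indexing), here is a minimal NumPy sketch; the data values are illustrative only and are not drawn from the paper:

```python
import numpy as np

# Vectorization: operate on whole arrays without explicit Python loops.
temps_c = np.array([[21.0, 23.5, 19.2],
                    [25.1, 24.3, 22.8]])        # 2 stations x 3 days
temps_f = temps_c * 9.0 / 5.0 + 32.0            # elementwise, no loop

# Broadcasting: a (2, 3) array combines with a (3,) array; the smaller
# operand is virtually stretched along the missing axis.
daily_offset = np.array([0.5, -0.25, 0.0])
adjusted = temps_c + daily_offset

# Boolean indexing and reduction: masks select data, reductions summarize it.
warm_days = temps_c[temps_c > 22.0]             # 1-D array of matching values
station_means = temps_c.mean(axis=1)            # reduce over the day axis

print(temps_f.shape, adjusted.shape, warm_days, station_means)
```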

7,624 citations


Journal ArticleDOI
TL;DR: How a few fundamental array concepts lead to a simple and powerful programming paradigm for organizing, exploring and analysing scientific data is reviewed.
Abstract: Array programming provides a powerful, compact, expressive syntax for accessing, manipulating, and operating on data in vectors, matrices, and higher-dimensional arrays. NumPy is the primary array programming library for the Python language. It plays an essential role in research analysis pipelines in fields as diverse as physics, chemistry, astronomy, geoscience, biology, psychology, material science, engineering, finance, and economics. For example, in astronomy, NumPy was an important part of the software stack used in the discovery of gravitational waves and the first imaging of a black hole. Here we show how a few fundamental array concepts lead to a simple and powerful programming paradigm for organizing, exploring, and analyzing scientific data. NumPy is the foundation upon which the entire scientific Python universe is constructed. It is so pervasive that several projects, targeting audiences with specialized needs, have developed their own NumPy-like interfaces and array objects. Because of its central position in the ecosystem, NumPy increasingly plays the role of an interoperability layer between these new array computation libraries.

4,342 citations



Book
Georges Aad1, E. Abat2, Jalal Abdallah3, Jalal Abdallah4 +3029 more · Institutions (164)
23 Feb 2020
TL;DR: The ATLAS detector as installed in its experimental cavern at point 1 at CERN is described in this paper; a brief overview of the expected performance of the detector when the Large Hadron Collider begins operation is also presented.
Abstract: The ATLAS detector as installed in its experimental cavern at point 1 at CERN is described in this paper. A brief overview of the expected performance of the detector when the Large Hadron Collider begins operation is also presented.

3,111 citations


Journal ArticleDOI
TL;DR: In this article, the international 14C calibration curves for both the Northern and Southern Hemispheres, as well as for the ocean surface layer, have been updated to include a wealth of new data and extended to 55,000 cal BP.
Abstract: Radiocarbon (14C) ages cannot provide absolutely dated chronologies for archaeological or paleoenvironmental studies directly but must be converted to calendar age equivalents using a calibration curve compensating for fluctuations in atmospheric 14C concentration. Although calibration curves are constructed from independently dated archives, they invariably require revision as new data become available and our understanding of the Earth system improves. In this volume the international 14C calibration curves for both the Northern and Southern Hemispheres, as well as for the ocean surface layer, have been updated to include a wealth of new data and extended to 55,000 cal BP. Based on tree rings, IntCal20 now extends as a fully atmospheric record to ca. 13,900 cal BP. For the older part of the timescale, IntCal20 comprises statistically integrated evidence from floating tree-ring chronologies, lacustrine and marine sediments, speleothems, and corals. We utilized improved evaluation of the timescales and location variable 14C offsets from the atmosphere (reservoir age, dead carbon fraction) for each dataset. New statistical methods have refined the structure of the calibration curves while maintaining a robust treatment of uncertainties in the 14C ages, the calendar ages and other corrections. The inclusion of modeled marine reservoir ages derived from a three-dimensional ocean circulation model has allowed us to apply more appropriate reservoir corrections to the marine 14C data rather than the previous use of constant regional offsets from the atmosphere. Here we provide an overview of the new and revised datasets and the associated methods used for the construction of the IntCal20 curve and explore potential regional offsets for tree-ring data. We discuss the main differences with respect to the previous calibration curve, IntCal13, and some of the implications for archaeology and geosciences ranging from the recent past to the time of the extinction of the Neanderthals.
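
To make the calibration step concrete: a raw 14C age is converted to a calendar age by inverting the calibration curve. Below is a minimal sketch of that lookup, assuming the curve is available as paired columns of calendar age and 14C age; the excerpt values are hypothetical, and the real IntCal20 release also carries uncertainty columns and requires a proper probabilistic treatment:

```python
import numpy as np

# Hypothetical excerpt of a calibration curve: calendar age (cal BP)
# versus conventional 14C age (14C yr BP). Real curves are much denser
# and include uncertainties.
cal_bp = np.array([3000.0, 3010.0, 3020.0, 3030.0, 3040.0])
c14_bp = np.array([2850.0, 2862.0, 2871.0, 2883.0, 2890.0])

def calibrate(c14_age):
    """Map a measured 14C age onto the calendar scale by linear
    interpolation of the curve. This ignores measurement error and
    curve wiggles, where the mapping can be multi-valued."""
    # np.interp requires increasing x-values; this excerpt is monotonic.
    return np.interp(c14_age, c14_bp, cal_bp)

print(calibrate(2875.0))  # approximate calendar age in cal BP
```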

2,800 citations


Journal ArticleDOI
TL;DR: A serological enzyme-linked immunosorbent assay for the screening and identification of human SARS-CoV-2 seroconverters, which does not require handling of infectious virus and can be adjusted to detect different antibody types in serum and plasma.
Abstract: Here, we describe a serological enzyme-linked immunosorbent assay for the screening and identification of human SARS-CoV-2 seroconverters. This assay does not require the handling of infectious virus, can be adjusted to detect different antibody types in serum and plasma and is amenable to scaling. Serological assays are of critical importance to help define previous exposure to SARS-CoV-2 in populations, identify highly reactive human donors for convalescent plasma therapy and investigate correlates of protection.

1,629 citations


Journal ArticleDOI
06 Feb 2020-Cell
TL;DR: The largest exome sequencing study of autism spectrum disorder (ASD) to date, using an enhanced analytical framework to integrate de novo and case-control rare variation, identifies 102 risk genes at a false discovery rate of 0.1 or less, consistent with multiple paths to an excitatory-inhibitory imbalance underlying ASD.
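
The TL;DR reports 102 risk genes at a false discovery rate (FDR) of 0.1 or less. The study uses its own enhanced Bayesian analytical framework, but as a reminder of what controlling FDR at a threshold means, here is a minimal sketch of the classical Benjamini-Hochberg procedure on hypothetical p-values:

```python
import numpy as np

def benjamini_hochberg(pvals, q=0.1):
    """Return a boolean mask of discoveries at FDR level q
    (classical Benjamini-Hochberg step-up procedure)."""
    p = np.asarray(pvals)
    order = np.argsort(p)
    m = len(p)
    ranked = p[order]
    # Find the largest k with p_(k) <= (k/m) * q; reject hypotheses 1..k.
    below = ranked <= (np.arange(1, m + 1) / m) * q
    k = np.max(np.nonzero(below)[0]) + 1 if below.any() else 0
    mask = np.zeros(m, dtype=bool)
    mask[order[:k]] = True
    return mask

pvals = [0.0002, 0.009, 0.04, 0.051, 0.20, 0.76]  # hypothetical gene p-values
print(benjamini_hochberg(pvals, q=0.1))
```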

1,169 citations


Journal ArticleDOI
Marielle Saunois1, Ann R. Stavert2, Ben Poulter3, Philippe Bousquet1, Josep G. Canadell2, Robert B. Jackson4, Peter A. Raymond5, Edward J. Dlugokencky6, Sander Houweling7, Sander Houweling8, Prabir K. Patra9, Prabir K. Patra10, Philippe Ciais1, Vivek K. Arora, David Bastviken11, Peter Bergamaschi, Donald R. Blake12, Gordon Brailsford13, Lori Bruhwiler6, Kimberly M. Carlson14, Mark Carrol3, Simona Castaldi15, Naveen Chandra9, Cyril Crevoisier16, Patrick M. Crill17, Kristofer R. Covey18, Charles L. Curry19, Giuseppe Etiope20, Giuseppe Etiope21, Christian Frankenberg22, Nicola Gedney23, Michaela I. Hegglin24, Lena Höglund-Isaksson25, Gustaf Hugelius17, Misa Ishizawa26, Akihiko Ito26, Greet Janssens-Maenhout, Katherine M. Jensen27, Fortunat Joos28, Thomas Kleinen29, Paul B. Krummel2, Ray L. Langenfelds2, Goulven Gildas Laruelle, Licheng Liu30, Toshinobu Machida26, Shamil Maksyutov26, Kyle C. McDonald27, Joe McNorton31, Paul A. Miller32, Joe R. Melton, Isamu Morino26, Jurek Müller28, Fabiola Murguia-Flores33, Vaishali Naik34, Yosuke Niwa26, Sergio Noce, Simon O'Doherty33, Robert J. Parker35, Changhui Peng36, Shushi Peng37, Glen P. Peters, Catherine Prigent, Ronald G. Prinn38, Michel Ramonet1, Pierre Regnier, William J. Riley39, Judith A. Rosentreter40, Arjo Segers, Isobel J. Simpson12, Hao Shi41, Steven J. Smith42, L. Paul Steele2, Brett F. Thornton17, Hanqin Tian41, Yasunori Tohjima26, Francesco N. Tubiello43, Aki Tsuruta44, Nicolas Viovy1, Apostolos Voulgarakis45, Apostolos Voulgarakis46, Thomas Weber47, Michiel van Weele48, Guido R. van der Werf7, Ray F. Weiss49, Doug Worthy, Debra Wunch50, Yi Yin22, Yi Yin1, Yukio Yoshida26, Weiya Zhang32, Zhen Zhang51, Yuanhong Zhao1, Bo Zheng1, Qing Zhu39, Qiuan Zhu52, Qianlai Zhuang30 
Université Paris-Saclay1, Commonwealth Scientific and Industrial Research Organisation2, Goddard Space Flight Center3, Stanford University4, Yale University5, National Oceanic and Atmospheric Administration6, VU University Amsterdam7, Netherlands Institute for Space Research8, Japan Agency for Marine-Earth Science and Technology9, Chiba University10, Linköping University11, University of California, Irvine12, National Institute of Water and Atmospheric Research13, New York University14, Seconda Università degli Studi di Napoli15, École Polytechnique16, Stockholm University17, Skidmore College18, University of Victoria19, Babeș-Bolyai University20, National Institute of Geophysics and Volcanology21, California Institute of Technology22, Met Office23, University of Reading24, International Institute for Applied Systems Analysis25, National Institute for Environmental Studies26, City University of New York27, University of Bern28, Max Planck Society29, Purdue University30, European Centre for Medium-Range Weather Forecasts31, Lund University32, University of Bristol33, Geophysical Fluid Dynamics Laboratory34, University of Leicester35, Université du Québec à Montréal36, Peking University37, Massachusetts Institute of Technology38, Lawrence Berkeley National Laboratory39, Southern Cross University40, Auburn University41, Joint Global Change Research Institute42, Food and Agriculture Organization43, Finnish Meteorological Institute44, Technical University of Crete45, Imperial College London46, University of Rochester47, Royal Netherlands Meteorological Institute48, Scripps Institution of Oceanography49, University of Toronto50, University of Maryland, College Park51, Hohai University52
TL;DR: The authors present the second version of the living review paper dedicated to the decadal methane budget, integrating results of top-down studies (atmospheric observations within an atmospheric inverse-modeling framework) and bottom-up estimates (including process-based models for estimating land surface emissions and atmospheric chemistry, inventories of anthropogenic emissions, and data-driven extrapolations).
Abstract: Understanding and quantifying the global methane (CH4) budget is important for assessing realistic pathways to mitigate climate change. Atmospheric emissions and concentrations of CH4 continue to increase, making CH4 the second most important human-influenced greenhouse gas in terms of climate forcing, after carbon dioxide (CO2). The relative importance of CH4 compared to CO2 depends on its shorter atmospheric lifetime, stronger warming potential, and variations in atmospheric growth rate over the past decade, the causes of which are still debated. Two major challenges in reducing uncertainties in the atmospheric growth rate arise from the variety of geographically overlapping CH4 sources and from the destruction of CH4 by short-lived hydroxyl radicals (OH). To address these challenges, we have established a consortium of multidisciplinary scientists under the umbrella of the Global Carbon Project to synthesize and stimulate new research aimed at improving and regularly updating the global methane budget. Following Saunois et al. (2016), we present here the second version of the living review paper dedicated to the decadal methane budget, integrating results of top-down studies (atmospheric observations within an atmospheric inverse-modelling framework) and bottom-up estimates (including process-based models for estimating land surface emissions and atmospheric chemistry, inventories of anthropogenic emissions, and data-driven extrapolations). For the 2008–2017 decade, global methane emissions are estimated by atmospheric inversions (a top-down approach) to be 576 Tg CH4 yr−1 (range 550–594, corresponding to the minimum and maximum estimates of the model ensemble). Of this total, 359 Tg CH4 yr−1 or ∼ 60 % is attributed to anthropogenic sources, that is emissions caused by direct human activity (i.e. anthropogenic emissions; range 336–376 Tg CH4 yr−1 or 50 %–65 %). The mean annual total emission for the new decade (2008–2017) is 29 Tg CH4 yr−1 larger than our estimate for the previous decade (2000–2009), and 24 Tg CH4 yr−1 larger than the one reported in the previous budget for 2003–2012 (Saunois et al., 2016). Since 2012, global CH4 emissions have been tracking the warmest scenarios assessed by the Intergovernmental Panel on Climate Change. Bottom-up methods suggest almost 30 % larger global emissions (737 Tg CH4 yr−1, range 594–881) than top-down inversion methods. Indeed, bottom-up estimates for natural sources such as natural wetlands, other inland water systems, and geological sources are higher than top-down estimates. The atmospheric constraints on the top-down budget suggest that at least some of these bottom-up emissions are overestimated. The latitudinal distribution of atmospheric observation-based emissions indicates a predominance of tropical emissions (∼ 65 % of the global budget, < 30∘ N) compared to mid-latitudes (∼ 30 %, 30–60∘ N) and high northern latitudes (∼ 4 %, 60–90∘ N). The most important source of uncertainty in the methane budget is attributable to natural emissions, especially those from wetlands and other inland waters. Some of our global source estimates are smaller than those in previously published budgets (Saunois et al., 2016; Kirschke et al., 2013). In particular, wetland emissions are about 35 Tg CH4 yr−1 lower due to improved partitioning of wetlands and other inland waters. Emissions from geological sources and wild animals are also found to be smaller by 7 Tg CH4 yr−1 and 8 Tg CH4 yr−1, respectively.
However, the overall discrepancy between bottom-up and top-down estimates has been reduced by only 5 % compared to Saunois et al. (2016), due to a higher estimate of emissions from inland waters, highlighting the need for more detailed research on emissions factors. Priorities for improving the methane budget include (i) a global, high-resolution map of water-saturated soils and inundated areas emitting methane based on a robust classification of different types of emitting habitats; (ii) further development of process-based models for inland-water emissions; (iii) intensification of methane observations at local scales (e.g., FLUXNET-CH4 measurements) and urban-scale monitoring to constrain bottom-up land surface models, and at regional scales (surface networks and satellites) to constrain atmospheric inversions; (iv) improvements of transport models and the representation of photochemical sinks in top-down inversions; and (v) development of a 3D variational inversion system using isotopic and/or co-emitted species such as ethane to improve source partitioning.
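
The headline shares quoted in the abstract can be sanity-checked directly from the reported totals; a small arithmetic sketch:

```python
# Figures quoted in the abstract (Tg CH4 per year, 2008-2017 decade).
top_down_total = 576.0      # atmospheric-inversion (top-down) estimate
anthropogenic = 359.0       # emissions caused by direct human activity
bottom_up_total = 737.0     # sum of bottom-up estimates

print(f"anthropogenic share: {anthropogenic / top_down_total:.0%}")        # ~62%, i.e. the quoted ~60%
print(f"bottom-up excess:    {bottom_up_total / top_down_total - 1:.0%}")  # ~28%, 'almost 30% larger'
```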

1,047 citations


Journal ArticleDOI
29 Jul 2020-Nature
TL;DR: The authors summarize the data produced by phase III of the Encyclopedia of DNA Elements (ENCODE) Project, which comprises 5,992 new experimental datasets, including systematic determinations across mouse fetal development, providing a resource for better understanding the human and mouse genomes.
Abstract: The human and mouse genomes contain instructions that specify RNAs and proteins and govern the timing, magnitude, and cellular context of their production. To better delineate these elements, phase III of the Encyclopedia of DNA Elements (ENCODE) Project has expanded analysis of the cell and tissue repertoires of RNA transcription, chromatin structure and modification, DNA methylation, chromatin looping, and occupancy by transcription factors and RNA-binding proteins. Here we summarize these efforts, which have produced 5,992 new experimental datasets, including systematic determinations across mouse fetal development. All data are available through the ENCODE data portal (https://www.encodeproject.org), including phase II ENCODE [1] and Roadmap Epigenomics [2] data. We have developed a registry of 926,535 human and 339,815 mouse candidate cis-regulatory elements, covering 7.9 and 3.4% of their respective genomes, by integrating selected datatypes associated with gene regulation, and constructed a web-based server (SCREEN; http://screen.encodeproject.org) to provide flexible, user-defined access to this resource. Collectively, the ENCODE data and registry provide an expansive resource for the scientific community to build a better understanding of the organization and function of the human and mouse genomes.
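
The abstract notes that all data are available through the ENCODE data portal, which also returns machine-readable JSON for its search pages. A hedged sketch of a programmatic query follows; the query parameters and the JSON-LD "@graph" hit list reflect the portal's documented behavior, but exact fields should be verified against the current API documentation:

```python
import requests

# Search the ENCODE portal for experiments; format=json asks the search
# page to return JSON instead of HTML.
url = "https://www.encodeproject.org/search/"
params = {
    "type": "Experiment",
    "status": "released",
    "format": "json",
    "limit": 5,
}
resp = requests.get(url, params=params, headers={"Accept": "application/json"})
resp.raise_for_status()

# Results are JSON-LD; the list of hits lives under "@graph".
for hit in resp.json().get("@graph", []):
    print(hit.get("accession"), hit.get("assay_title"))
```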

999 citations


Journal ArticleDOI
TL;DR: A significantly decreased growth rate and increased doubling time of cases were observed, most likely due to the Chinese lockdown measures, which appear to have the potential to slow the spread of COVID-19.
Abstract: BACKGROUND With its epicenter in Wuhan, China, the COVID-19 outbreak was declared a Public Health Emergency of International Concern by the World Health Organization (WHO). Consequently, many countries have implemented flight restrictions to China. China itself has imposed a lockdown of the population of Wuhan as well as the entire Hubei province. However, whether these two enormous measures have led to significant changes in the spread of COVID-19 cases remains unclear. METHODS We analyzed the available data on the development of confirmed domestic and international COVID-19 cases before and after lockdown measures. We evaluated the correlation of domestic air traffic to the number of confirmed COVID-19 cases and determined the growth curves of COVID-19 cases within China before and after lockdown as well as after changes in COVID-19 diagnostic criteria. RESULTS Our findings indicate a significant increase in doubling time from 2 days (95% CI: 1.9-2.6) to 4 days (95% CI: 3.5-4.3) after imposing the lockdown. A further increase in doubling time, to 19.3 days (95% CI: 15.1-26.3), is detected after the change in diagnostic and testing methodology. Moreover, the correlation between domestic air traffic and COVID-19 spread became weaker following lockdown (before lockdown: r = 0.98, P < 0.05 vs after lockdown: r = 0.91, P = NS). CONCLUSIONS A significantly decreased growth rate and increased doubling time of cases were observed, which is most likely due to the Chinese lockdown measures. A more stringent confinement of people in high-risk areas seems to have the potential to slow down the spread of COVID-19.
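
Doubling time is the standard inverse-log measure of exponential growth: fitting log(cases) against time gives a growth rate r per day, and the doubling time is ln(2)/r. A minimal sketch with made-up case counts (not the paper's data):

```python
import numpy as np

def doubling_time(days, cases):
    """Fit log(cases) ~ a + r * t and return ln(2)/r in days."""
    r, _ = np.polyfit(days, np.log(cases), 1)   # slope = growth rate per day
    return np.log(2) / r

days = np.arange(8)
cases = np.array([40, 58, 81, 113, 160, 225, 318, 450])  # hypothetical counts
print(f"doubling time ~ {doubling_time(days, cases):.1f} days")  # ~2 days here
```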

982 citations


Journal ArticleDOI
TL;DR: An overview of the Community Earth System Model Version 2 (CESM2), the most recent major release of the model, including an evaluation of long preindustrial control and historical simulations that represent CESM2's contributions to the Coupled Model Intercomparison Project Phase 6 (CMIP6).
Abstract: An overview of the Community Earth System Model Version 2 (CESM2) is provided, including a discussion of the challenges encountered during its development and how they were addressed. In addition, an evaluation of a pair of CESM2 long preindustrial control and historical ensemble simulations is presented. These simulations were performed using the nominal 1° horizontal resolution configuration of the coupled model with both the “low-top” (40 km, with limited chemistry) and “high-top” (130 km, with comprehensive chemistry) versions of the atmospheric component. CESM2 contains many substantial science and infrastructure improvements and new capabilities since its previous major release, CESM1, resulting in improved historical simulations in comparison to CESM1 and available observations. These include major reductions in low-latitude precipitation and shortwave cloud forcing biases; better representation of the Madden-Julian Oscillation; better El Niño-Southern Oscillation-related teleconnections; and a global land carbon accumulation trend that agrees well with observationally based estimates. Most tropospheric and surface features of the low- and high-top simulations are very similar to each other, so these improvements are present in both configurations. CESM2 has an equilibrium climate sensitivity of 5.1–5.3 °C, larger than in CESM1, primarily due to a combination of relatively small changes to cloud microphysics and boundary layer parameters. In contrast, CESM2's transient climate response of 1.9–2.0 °C is comparable to that of CESM1. The model outputs from these and many other simulations are available to the research community, and they represent CESM2's contributions to the Coupled Model Intercomparison Project Phase 6.

Journal ArticleDOI
Jens Kattge1, Gerhard Bönisch2, Sandra Díaz3, Sandra Lavorel +751 more · Institutions (314)
TL;DR: The extent of the trait data compiled in TRY is evaluated and emerging patterns of data coverage and representativeness are analyzed to conclude that reducing data gaps and biases in the TRY database remains a key challenge and requires a coordinated approach to data mobilization and trait measurements.
Abstract: Plant traits (the morphological, anatomical, physiological, biochemical and phenological characteristics of plants) determine how plants respond to environmental factors, affect other trophic levels, and influence ecosystem properties and their benefits and detriments to people. Plant trait data thus represent the basis for a vast area of research spanning from evolutionary biology, community and functional ecology, to biodiversity conservation, ecosystem and landscape management, restoration, biogeography and earth system modelling. Since its foundation in 2007, the TRY database of plant traits has grown continuously. It now provides unprecedented data coverage under an open access data policy and is the main plant trait database used by the research community worldwide. Increasingly, the TRY database also supports new frontiers of trait-based plant research, including the identification of data gaps and the subsequent mobilization or measurement of new data. To support this development, in this article we evaluate the extent of the trait data compiled in TRY and analyse emerging patterns of data coverage and representativeness. Best species coverage is achieved for categorical traits, with almost complete coverage for 'plant growth form'. However, most traits relevant for ecology and vegetation modelling are characterized by continuous intraspecific variation and trait-environmental relationships. These traits have to be measured on individual plants in their respective environment. Despite unprecedented data coverage, we observe a humbling lack of completeness and representativeness of these continuous traits in many aspects. We, therefore, conclude that reducing data gaps and biases in the TRY database remains a key challenge and requires a coordinated approach to data mobilization and trait measurements. This can only be achieved in collaboration with other initiatives.

Journal ArticleDOI
TL;DR: Research suggesting that repeated media exposure to community crisis can lead to increased anxiety, heightened stress responses, and misplaced health-protective and help-seeking behaviors that can overburden health care facilities and tax available resources is reviewed.
Abstract: The 2019 novel coronavirus disease (COVID-19) has led to a serious outbreak of often severe respiratory disease, which originated in China and has quickly become a global pandemic, with far-reaching consequences that are unprecedented in the modern era. As public health officials seek to contain the virus and mitigate the deleterious effects on worldwide population health, a related threat has emerged: global media exposure to the crisis. We review research suggesting that repeated media exposure to community crisis can lead to increased anxiety, heightened stress responses that can lead to downstream effects on health, and misplaced health-protective and help-seeking behaviors that can overburden health care facilities and tax available resources. We draw from work on previous public health crises (i.e., Ebola and H1N1 outbreaks) and other collective trauma (e.g., terrorist attacks) where media coverage of events had unintended consequences for those at relatively low risk for direct exposure, leading to potentially severe public health repercussions. We conclude with recommendations for individuals, researchers, and public health officials with respect to receiving and providing effective communications during a public health crisis.

Journal ArticleDOI
TL;DR: Comprehensive pharmacodynamic and pharmacogenomic profiling in sensitive and partially resistant non-clinical models identified mechanisms implicated in limiting anti-tumor activity including KRAS nucleotide cycling and pathways that induce feedback reactivation and/or bypass KRAS dependence.
Abstract: Despite decades of research, efforts to directly target KRAS have been challenging. MRTX849 was identified as a potent, selective, and covalent KRASG12C inhibitor that exhibits favorable drug-like properties, selectively modifies mutant cysteine 12 in GDP-bound KRASG12C and inhibits KRAS-dependent signaling. MRTX849 demonstrated pronounced tumor regression in 17 of 26 (65%) of KRASG12C-positive cell line- and patient-derived xenograft models from multiple tumor types and objective responses have been observed in KRASG12C-positive lung and colon adenocarcinoma patients. Comprehensive pharmacodynamic and pharmacogenomic profiling in sensitive and partially resistant non-clinical models identified mechanisms implicated in limiting anti-tumor activity including KRAS nucleotide cycling and pathways that induce feedback reactivation and/or bypass KRAS dependence. These factors included activation of RTKs, bypass of KRAS dependence, and genetic dysregulation of cell cycle. Combinations of MRTX849 with agents that target RTKs, mTOR, or cell cycle demonstrated enhanced response and marked tumor regression in several tumor models, including MRTX849-refractory models.

Proceedings ArticleDOI
01 Jun 2020
TL;DR: CheckList, introduced in this paper, is a task-agnostic methodology for testing NLP models; it includes a matrix of general linguistic capabilities and test types that facilitate comprehensive test ideation, as well as a software tool to generate a large and diverse number of test cases quickly.
Abstract: Although measuring held-out accuracy has been the primary approach to evaluate generalization, it often overestimates the performance of NLP models, while alternative approaches for evaluating models either focus on individual tasks or on specific behaviors. Inspired by principles of behavioral testing in software engineering, we introduce CheckList, a task-agnostic methodology for testing NLP models. CheckList includes a matrix of general linguistic capabilities and test types that facilitate comprehensive test ideation, as well as a software tool to generate a large and diverse number of test cases quickly. We illustrate the utility of CheckList with tests for three tasks, identifying critical failures in both commercial and state-of-the-art models. In a user study, a team responsible for a commercial sentiment analysis model found new and actionable bugs in an extensively tested model. In another user study, NLP practitioners with CheckList created twice as many tests, and found almost three times as many bugs as users without it.
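
The paper's CheckList library is open source; without relying on its exact API, the core idea of a Minimum Functionality Test (MFT) can be sketched in plain Python: generate templated cases with known expected labels and count failures of a model under test. The model function below is a toy stand-in, not the paper's code:

```python
from itertools import product

# Templated test cases for a sentiment model: every filled template has a
# known expected label, independent of held-out accuracy.
template = "The {thing} was {adj}."
cases = [
    (template.format(thing=t, adj=a), label)
    for t, (a, label) in product(
        ["food", "service", "flight"],
        [("terrible", "negative"), ("fantastic", "positive")],
    )
]

def run_mft(predict, cases):
    """Run a Minimum Functionality Test: report the failure rate of
    `predict` (any text classifier) on the templated cases."""
    failures = [(text, gold) for text, gold in cases if predict(text) != gold]
    return len(failures) / len(cases), failures

toy_model = lambda text: "positive" if "fantastic" in text else "negative"
rate, failed = run_mft(toy_model, cases)
print(f"failure rate: {rate:.0%}")
```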

Journal ArticleDOI
Gilberto Pastorello1, Carlo Trotta2, E. Canfora2, Housen Chu1 +300 more · Institutions (119)
TL;DR: The FLUXNET2015 dataset provides ecosystem-scale data on CO2, water, and energy exchange between the biosphere and the atmosphere, and other meteorological and biological measurements, from 212 sites around the globe, and is detailed in this paper.
Abstract: The FLUXNET2015 dataset provides ecosystem-scale data on CO2, water, and energy exchange between the biosphere and the atmosphere, and other meteorological and biological measurements, from 212 sites around the globe (over 1500 site-years, up to and including year 2014). These sites, independently managed and operated, voluntarily contributed their data to create global datasets. Data were quality controlled and processed using uniform methods, to improve consistency and intercomparability across sites. The dataset is already being used in a number of applications, including ecophysiology studies, remote sensing studies, and development of ecosystem and Earth system models. FLUXNET2015 includes derived-data products, such as gap-filled time series, ecosystem respiration and photosynthetic uptake estimates, estimation of uncertainties, and metadata about the measurements, presented for the first time in this paper. In addition, 206 of these sites are for the first time distributed under a Creative Commons (CC-BY 4.0) license. This paper details this enhanced dataset and the processing methods, now made available as open-source codes, making the dataset more accessible, transparent, and reproducible.
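
A hedged sketch of working with a FLUXNET2015 site file in pandas. The variable names follow the dataset's documented conventions (TIMESTAMP_START in YYYYMMDDHHMM form, NEE_VUT_REF for the reference gap-filled net ecosystem exchange, -9999 as the missing-value flag), but exact columns vary by product and should be checked against the release notes; the filename is a placeholder:

```python
import pandas as pd

# Half-hourly FLUXNET2015 site file (placeholder path); -9999 flags gaps.
df = pd.read_csv(
    "FLX_XX-Xxx_FLUXNET2015_FULLSET_HH.csv",
    na_values=[-9999],
)
df["time"] = pd.to_datetime(df["TIMESTAMP_START"].astype(str),
                            format="%Y%m%d%H%M")

# Daily mean net ecosystem exchange from the reference gap-filled product.
daily_nee = df.set_index("time")["NEE_VUT_REF"].resample("D").mean()
print(daily_nee.head())
```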

Journal ArticleDOI
TL;DR: This review systematically analyzes the available human studies to identify harmful stressors, vulnerable periods during pregnancy, specificities in the outcome and biological correlates of the relation between maternal stress and offspring outcome.

Journal ArticleDOI
08 Oct 2020-Nature
TL;DR: A global N2O inventory is presented that incorporates both natural and anthropogenic sources and accounts for the interaction between nitrogen additions and the biochemical processes that control N2O emissions, using bottom-up, top-down and process-based model approaches.
Abstract: Nitrous oxide (N2O), like carbon dioxide, is a long-lived greenhouse gas that accumulates in the atmosphere. Over the past 150 years, increasing atmospheric N2O concentrations have contributed to stratospheric ozone depletion [1] and climate change [2], with the current rate of increase estimated at 2 per cent per decade. Existing national inventories do not provide a full picture of N2O emissions, owing to their omission of natural sources and limitations in methodology for attributing anthropogenic sources. Here we present a global N2O inventory that incorporates both natural and anthropogenic sources and accounts for the interaction between nitrogen additions and the biochemical processes that control N2O emissions. We use bottom-up (inventory, statistical extrapolation of flux measurements, process-based land and ocean modelling) and top-down (atmospheric inversion) approaches to provide a comprehensive quantification of global N2O sources and sinks resulting from 21 natural and human sectors between 1980 and 2016. Global N2O emissions were 17.0 (minimum-maximum estimates: 12.2-23.5) teragrams of nitrogen per year (bottom-up) and 16.9 (15.9-17.7) teragrams of nitrogen per year (top-down) between 2007 and 2016. Global human-induced emissions, which are dominated by nitrogen additions to croplands, increased by 30% over the past four decades to 7.3 (4.2-11.4) teragrams of nitrogen per year. This increase was mainly responsible for the growth in the atmospheric burden. Our findings point to growing N2O emissions in emerging economies, particularly Brazil, China and India. Analysis of process-based model estimates reveals an emerging N2O-climate feedback resulting from interactions between nitrogen additions and climate change. The recent growth in N2O emissions exceeds some of the highest projected emission scenarios [3,4], underscoring the urgency to mitigate N2O emissions.

Journal ArticleDOI
TL;DR: In this article, the authors present the latest status of PEM fuel cell technology development and applications in portable and transportation power through an overview of the state of the art and the most recent technological advances, and describe materials and water/thermal transport management for fuel cell design and operational control.

Journal ArticleDOI
TL;DR: In this article, the authors present SHCal20, a revised radiocarbon calibration curve for the Southern Hemisphere, and estimate the mean Southern Hemisphere offset to be 36 ± 27 14C yrs older than the Northern Hemisphere, based upon a comparison of Southern Hemisphere tree-ring data with contemporaneous Northern Hemisphere data.
Abstract: Early researchers of radiocarbon levels in Southern Hemisphere tree rings identified a variable North-South hemispheric offset, necessitating construction of a separate radiocarbon calibration curve for the South. We present here SHCal20, a revised calibration curve from 0–55,000 cal BP, based upon SHCal13 and fortified by the addition of 14 new tree-ring data sets in the 2140–0, 3520–3453, 3608–3590 and 13,140–11,375 cal BP time intervals. We detail the statistical approaches used for curve construction, present recommendations for the use of the Northern Hemisphere curve (IntCal20) and the Southern Hemisphere curve (SHCal20), and suggest where application of an equal mixture of the curves might be more appropriate. Using our Bayesian spline with errors-in-variables methodology, and based upon a comparison of Southern Hemisphere tree-ring data with contemporaneous Northern Hemisphere data, we estimate the mean Southern Hemisphere offset to be 36 ± 27 14C yrs older.

Journal ArticleDOI
TL;DR: Situated Expectancy-Value Theory (SEVT), presented in this paper, is the latest formulation of expectancy-value theory of achievement choice, renamed to emphasize the situated, context-dependent nature of individuals' expectancies for success and subjective task values.

Posted ContentDOI
16 Apr 2020-medRxiv
TL;DR: Serological assays are of critical importance to determine seroprevalence in a given population, define previous exposure and identify highly reactive human donors for the generation of convalescent serum as therapeutic.
Abstract: SARS-CoV-2 (severe acute respiratory syndrome coronavirus 2), which causes Coronavirus Disease 2019 (COVID-19), was first detected in China in late 2019 and has since then caused a global pandemic. While molecular assays to directly detect the viral genetic material are available for the diagnosis of acute infection, we currently lack serological assays suitable to specifically detect SARS-CoV-2 antibodies. Here we describe serological enzyme-linked immunosorbent assays (ELISA) that we developed using recombinant antigens derived from the spike protein of SARS-CoV-2. Using negative control samples representing pre-COVID-19 background immunity in the general adult population as well as samples from COVID-19 patients, we demonstrate that these assays are sensitive and specific, allowing for screening and identification of COVID-19 seroconverters using human plasma/serum as early as two days post COVID-19 symptom onset. Importantly, these assays do not require handling of infectious virus, can be adjusted to detect different antibody types and are amenable to scaling. Such serological assays are of critical importance to determine seroprevalence in a given population, define previous exposure and identify highly reactive human donors for the generation of convalescent serum as a therapeutic. Sensitive and specific identification of SARS-CoV-2 antibody titers may, in the future, also support screening of health care workers to identify those who are already immune and can be deployed to care for infected patients, minimizing the risk of viral spread to colleagues and other patients.

Journal ArticleDOI
TL;DR: This review focuses on recent additions to TURBOMOLE’s functionality, including excited-state methods, RPA and Green's function methods, relativistic approaches, high-order molecular properties, solvation effects, and periodic systems.
Abstract: TURBOMOLE is a collaborative, multi-national software development project aiming to provide highly efficient and stable computational tools for quantum chemical simulations of molecules, clusters, periodic systems, and solutions. The TURBOMOLE software suite is optimized for widely available, inexpensive, and resource-efficient hardware such as multi-core workstations and small computer clusters. TURBOMOLE specializes in electronic structure methods with outstanding accuracy-cost ratio, such as density functional theory including local hybrids and the random phase approximation (RPA), GW-Bethe-Salpeter methods, second-order Møller-Plesset theory, and explicitly correlated coupled-cluster methods. TURBOMOLE is based on Gaussian basis sets and has been pivotal for the development of many fast and low-scaling algorithms in the past three decades, such as integral-direct methods, fast multipole methods, the resolution-of-the-identity approximation, imaginary frequency integration, Laplace transform, and pair natural orbital methods. This review focuses on recent additions to TURBOMOLE's functionality, including excited-state methods, RPA and Green's function methods, relativistic approaches, high-order molecular properties, solvation effects, and periodic systems. A variety of illustrative applications along with accuracy and timing data are discussed. Moreover, available interfaces to users as well as other software are summarized. TURBOMOLE's current licensing, distribution, and support model are discussed, and an overview of TURBOMOLE's development workflow is provided. Challenges such as communication and outreach, software infrastructure, and funding are highlighted.
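
As a flavor of the package's plain-text input conventions, here is a minimal `$coord` geometry file (Cartesian coordinates in Bohr, with the element symbol last), sketched from TURBOMOLE's documented format; the water geometry below is illustrative, and an actual calculation would additionally be configured via the `define` tool and the resulting `control` file:

```
$coord
    0.00000000    0.00000000    0.00000000    o
    0.00000000    1.43000000    1.10000000    h
    0.00000000   -1.43000000    1.10000000    h
$end
```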

Journal ArticleDOI
TL;DR: It is found that supply-chain losses that are related to initial COVID-19 lockdowns are largely dependent on the number of countries imposing restrictions and that losses are more sensitive to the duration of a lockdown than its strictness.
Abstract: Countries have sought to stop the spread of coronavirus disease 2019 (COVID-19) by severely restricting travel and in-person commercial activities. Here, we analyse the supply-chain effects of a set of idealized lockdown scenarios, using the latest global trade modelling framework. We find that supply-chain losses that are related to initial COVID-19 lockdowns are largely dependent on the number of countries imposing restrictions and that losses are more sensitive to the duration of a lockdown than its strictness. However, a longer containment that can eradicate the disease imposes a smaller loss than shorter ones. Earlier, stricter and shorter lockdowns can minimize overall losses. A ‘go-slow’ approach to lifting restrictions may reduce overall damages if it avoids the need for further lockdowns. Regardless of the strategy, the complexity of global supply chains will magnify losses beyond the direct effects of COVID-19. Thus, pandemic control is a public good that requires collective efforts and support to lower-capacity countries.

Journal ArticleDOI
09 Mar 2020
TL;DR: A typology of compound events is proposed, distinguishing events that are preconditioned, multivariate, temporally compounding and spatially compounding, and suggests analytical and modelling approaches to aid in their investigation.
Abstract: Compound weather and climate events describe combinations of multiple climate drivers and/or hazards that contribute to societal or environmental risk. Although many climate-related disasters are caused by compound events, the understanding, analysis, quantification and prediction of such events is still in its infancy. In this Review, we propose a typology of compound events and suggest analytical and modelling approaches to aid in their investigation. We organize the highly diverse compound event types according to four themes: preconditioned, where a weather-driven or climate-driven precondition aggravates the impacts of a hazard; multivariate, where multiple drivers and/or hazards lead to an impact; temporally compounding, where a succession of hazards leads to an impact; and spatially compounding, where hazards in multiple connected locations cause an aggregated impact. Through structuring compound events and their respective analysis tools, the typology offers an opportunity for deeper insight into their mechanisms and impacts, benefiting the development of effective adaptation strategies. However, the complex nature of compound events results in some cases inevitably fitting into more than one class, necessitating soft boundaries within the typology. Future work must homogenize the available analytical approaches into a robust toolset for compound-event analysis under present and future climate conditions. Research on compound events has increased vastly in the last several years, yet a typology was absent. This Review proposes a comprehensive classification scheme, incorporating compound events that are preconditioned, multivariate, temporally compounding and spatially compounding events.

Proceedings ArticleDOI
07 Feb 2020
TL;DR: It is demonstrated how extremely biased (racist) classifiers crafted by the proposed framework can easily fool popular explanation techniques such as LIME and SHAP into generating innocuous explanations which do not reflect the underlying biases.
Abstract: As machine learning black boxes are increasingly being deployed in domains such as healthcare and criminal justice, there is growing emphasis on building tools and techniques for explaining these black boxes in an interpretable manner. Such explanations are being leveraged by domain experts to diagnose systematic errors and underlying biases of black boxes. In this paper, we demonstrate that post hoc explanation techniques that rely on input perturbations, such as LIME and SHAP, are not reliable. Specifically, we propose a novel scaffolding technique that effectively hides the biases of any given classifier by allowing an adversarial entity to craft an arbitrary desired explanation. Our approach can be used to scaffold any biased classifier in such a way that its predictions on the input data distribution still remain biased, but the post hoc explanations of the scaffolded classifier look innocuous. Using extensive evaluation with multiple real world datasets (including COMPAS), we demonstrate how extremely biased (racist) classifiers crafted by our framework can easily fool popular explanation techniques such as LIME and SHAP into generating innocuous explanations which do not reflect the underlying biases.
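
The scaffolding attack hinges on one observation: perturbation-based explainers such as LIME and SHAP query the model on synthetic points that often lie off the data manifold. A sketch of the idea (synthetic data and a simplified detector, not the paper's code): train an out-of-distribution classifier, then serve the biased model on in-distribution queries and an innocuous model on everything else:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X_real = rng.normal(0, 1, size=(500, 4))                      # stand-in for real data
X_perturbed = X_real + rng.normal(0, 1.5, size=X_real.shape)  # explainer-style noise

# OOD detector: distinguish real inputs from perturbation-style inputs.
ood = RandomForestClassifier(n_estimators=100, random_state=0)
ood.fit(np.vstack([X_real, X_perturbed]),
        np.hstack([np.zeros(len(X_real)), np.ones(len(X_perturbed))]))

biased_model = lambda X: (X[:, 0] > 0).astype(int)     # relies on a "sensitive" feature
innocuous_model = lambda X: (X[:, 1] > 0).astype(int)  # relies on a benign feature

def scaffolded_predict(X):
    """Route in-distribution points to the biased model and suspected
    explainer perturbations to the innocuous one."""
    is_ood = ood.predict(X).astype(bool)
    return np.where(is_ood, innocuous_model(X), biased_model(X))

print(scaffolded_predict(X_real[:5]), scaffolded_predict(X_perturbed[:5]))
```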

Journal ArticleDOI
20 Mar 2020-Science
TL;DR: The results support the radial unit hypothesis that distinct developmental mechanisms promote surface-area expansion and increases in thickness, and provide evidence that brain structure is a key phenotype along the causal pathway leading from genetic variation to differences in general cognitive function.
Abstract: The cerebral cortex underlies our complex cognitive capabilities, yet little is known about the specific genetic loci that influence human cortical structure. To identify genetic variants that affect cortical structure, we conducted a genome-wide association meta-analysis of brain magnetic resonance imaging data from 51,665 individuals. We analyzed the surface area and average thickness of the whole cortex and 34 regions with known functional specializations. We identified 199 significant loci and found significant enrichment for loci influencing total surface area within regulatory elements that are active during prenatal cortical development, supporting the radial unit hypothesis. Loci that affect regional surface area cluster near genes in Wnt signaling pathways, which influence progenitor expansion and areal identity. Variation in cortical structure is genetically correlated with cognitive function, Parkinson's disease, insomnia, depression, neuroticism, and attention deficit hyperactivity disorder.

Journal ArticleDOI
TL;DR: In this paper, a high-resolution and physically based description of Antarctic bed topography derived using mass conservation is presented, revealing previously unknown basal features with major implications for glacier response to climate change.
Abstract: The Antarctic ice sheet has been losing mass over past decades through the accelerated flow of its glaciers, conditioned by ocean temperature and bed topography. Glaciers retreating along retrograde slopes (that is, the bed elevation drops in the inland direction) are potentially unstable, while subglacial ridges slow down the glacial retreat. Despite major advances in the mapping of subglacial bed topography, significant sectors of Antarctica remain poorly resolved and critical spatial details are missing. Here we present a novel, high-resolution and physically based description of Antarctic bed topography using mass conservation. Our results reveal previously unknown basal features with major implications for glacier response to climate change. For example, glaciers flowing across the Transantarctic Mountains are protected by broad, stabilizing ridges. Conversely, in the marine basin of Wilkes Land, East Antarctica, we find retrograde slopes along Ninnis and Denman glaciers, with stabilizing slopes beneath Moscow University, Totten and Lambert glacier system, despite corrections in bed elevation of up to 1 km for the latter. This transformative description of bed topography redefines the high- and lower-risk sectors for rapid sea level rise from Antarctica; it will also significantly impact model projections of sea level rise from Antarctica in the coming centuries.
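
The mass-conservation constraint behind this mapping can be stated compactly: inferred ice thickness must satisfy the continuity equation, and the bed elevation then follows by subtraction from the surface. In the standard notation below (a generic statement of the method, not necessarily the paper's exact formulation), H is ice thickness, v the depth-averaged horizontal velocity, a-dot the apparent mass balance, S the surface elevation and B the bed elevation:

```latex
\frac{\partial H}{\partial t} + \nabla \cdot \left( H \, \bar{\mathbf{v}} \right) = \dot{a},
\qquad
B = S - H .
```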

Journal ArticleDOI
TL;DR: In this article, a collection of initial-condition large ensembles (LEs) generated with seven Earth system models under historical and future radiative forcing scenarios provides new insights into uncertainties due to internal variability versus model differences.
Abstract: Internal variability in the climate system confounds assessment of human-induced climate change and imposes irreducible limits on the accuracy of climate change projections, especially at regional and decadal scales. A new collection of initial-condition large ensembles (LEs) generated with seven Earth system models under historical and future radiative forcing scenarios provides new insights into uncertainties due to internal variability versus model differences. These data enhance the assessment of climate change risks, including extreme events, and offer a powerful testbed for new methodologies aimed at separating forced signals from internal variability in the observational record. Opportunities and challenges confronting the design and dissemination of future LEs, including increased spatial resolution and model complexity alongside emerging Earth system applications, are discussed. Climate change detection is confounded by internal variability, but recent initial-condition large ensembles (LEs) have begun addressing this issue. This Perspective discusses the value of multi-model LEs, the challenges of providing them and their role in future climate change research.
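
The core use of a large ensemble is the decomposition it enables: the ensemble mean estimates the forced response, and each member's deviation from it estimates internal variability. A minimal sketch on a synthetic ensemble (made-up trend and noise, not model output):

```python
import numpy as np

rng = np.random.default_rng(1)
n_members, n_years = 40, 100
forced = 0.02 * np.arange(n_years)                     # imposed warming trend
noise = rng.normal(0, 0.3, size=(n_members, n_years))  # internal variability
ensemble = forced + noise                              # members share only the forcing

forced_estimate = ensemble.mean(axis=0)   # ensemble mean ~ forced signal
internal = ensemble - forced_estimate     # member deviations ~ internal variability

print(np.corrcoef(forced_estimate, forced)[0, 1])  # close to 1
print(internal.std())                              # close to the imposed 0.3
```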

Journal ArticleDOI
TL;DR: The Arctic has warmed more than twice as fast as the global average since the late twentieth century, a phenomenon known as Arctic amplification (AA), and progress has been made in understanding the mechanisms that link it to midlatitude weather variability as discussed by the authors.
Abstract: The Arctic has warmed more than twice as fast as the global average since the late twentieth century, a phenomenon known as Arctic amplification (AA). Recently, there have been considerable advances in understanding the physical contributions to AA, and progress has been made in understanding the mechanisms that link it to midlatitude weather variability. Observational studies overwhelmingly support that AA is contributing to winter continental cooling. Although some model experiments support the observational evidence, most modelling results show little connection between AA and severe midlatitude weather or suggest the export of excess heating from the Arctic to lower latitudes. Divergent conclusions between model and observational studies, and even intramodel studies, continue to obfuscate a clear understanding of how AA is influencing midlatitude weather.