
Showing papers from "University of Bern" published in 2020


Journal ArticleDOI
François Mach, Colin Baigent, Alberico L. Catapano, Konstantinos C. Koskinas1, Manuela Casula, Lina Badimon1, M. John Chapman, Guy De Backer, Victoria Delgado, Brian A. Ference, Ian D. Graham, Alison Halliday, Ulf Landmesser, Borislava Mihaylova, Terje R. Pedersen, Gabriele Riccardi, Dimitrios J. Richter, Marc S. Sabatine, Marja-Riitta Taskinen, Lale Tokgozoglu, Olov Wiklund, Christian Mueller, Heinz Drexel, Victor Aboyans, Alberto Corsini, Wolfram Doehner, Michel Farnier, Bruna Gigante, Meral Kayıkçıoğlu, Goran Krstacic, Ekaterini Lambrinou, Basil S. Lewis, Josep Masip, Philippe Moulin, Steffen E. Petersen, Anna Sonia Petronio, Massimo F Piepoli, Xavier Pintó, Lorenz Räber, Kausik K. Ray, Željko Reiner, Walter F Riesen, Marco Roffi, Jean-Paul Schmid, Evgeny Shlyakhto, Iain A. Simpson, Erik S.G. Stroes, Isabella Sudano, Alexandros D Tselepis, Margus Viigimaa, Cecile Vindis, Alexander Vonbank, Michal Vrablik, Mislav Vrsalovic, José Luis Zamorano, Jean-Philippe Collet, Stephan Windecker, Veronica Dean, Donna Fitzsimons, Chris P Gale, Diederick E. Grobbee, Sigrun Halvorsen, Gerhard Hindricks, Bernard Iung, Peter Jüni, Hugo A. Katus, Christophe Leclercq, Maddalena Lettino, Béla Merkely, Miguel Sousa-Uva, Rhian M. Touyz, Djamaleddine Nibouche, Parounak H. Zelveian, Peter Siostrzonek, Ruslan Najafov, Philippe van de Borne, Belma Pojskic, Arman Postadzhiyan, Lambros Kypris, Jindřich Špinar, Mogens Lytken Larsen, Hesham Salah Eldin, Timo E. Strandberg, Jean Ferrières, Rusudan Agladze, Ulrich Laufs, Loukianos S. Rallidis, Laszlo Bajnok, Thorbjorn Gudjonsson, Vincent Maher, Yaakov Henkin, Michele Massimo Gulizia, Aisulu Mussagaliyeva, Gani Bajraktari, Alina Kerimkulova, Gustavs Latkovskis, Omar Hamoui, Rimvydas Šlapikas, Laurent Visser, P. Dingli, Victoria Ivanov, Aneta Boskovic, Mbarek Nazzi, Frank L.J. Visseren, Irena Mitevska, Kjetil Retterstøl, Piotr Jankowski, Ricardo Fontes-Carvalho, Dan Gaita, Marat V. 
Ezhov, Marina Foscoli, Vojislav Giga, Daniel Pella, Zlatko Fras, Leopoldo Pérez de Isla, Emil Hagström, Roger Lehmann, Leila Abid, Oner Ozdogan, Olena Mitchenko, Riyaz S. Patel 

4,069 citations


Journal ArticleDOI
TL;DR: In this article, the authors present guidelines for the management of patients with coronary artery disease (CAD), which is a pathological process characterized by atherosclerotic plaque accumulation in the epicardial arteries.
Abstract: Coronary artery disease (CAD) is a pathological process characterized by atherosclerotic plaque accumulation in the epicardial arteries, whether obstructive or non-obstructive. This process can be modified by lifestyle adjustments, pharmacological therapies, and invasive interventions designed to achieve disease stabilization or regression. The disease can have long, stable periods but can also become unstable at any time, typically due to an acute atherothrombotic event caused by plaque rupture or erosion. However, the disease is chronic, most often progressive, and hence serious, even in clinically apparently silent periods. The dynamic nature of the CAD process results in various clinical presentations, which can be conveniently categorized as either acute coronary syndromes (ACS) or chronic coronary syndromes (CCS). The Guidelines presented here refer to the management of patients with CCS. The natural history of CCS is illustrated in Figure 1.

3,448 citations


Book
Georges Aad1, E. Abat2, Jalal Abdallah3, Jalal Abdallah4 +3029 more (164 institutions)
23 Feb 2020
TL;DR: The ATLAS detector as installed in its experimental cavern at point 1 at CERN is described in this paper, where a brief overview of the expected performance of the detector when the Large Hadron Collider begins operation is also presented.
Abstract: The ATLAS detector as installed in its experimental cavern at point 1 at CERN is described in this paper. A brief overview of the expected performance of the detector when the Large Hadron Collider begins operation is also presented.

3,111 citations


Journal ArticleDOI
TL;DR: In this article, the international 14C calibration curves for both the Northern and Southern Hemispheres, as well as for the ocean surface layer, have been updated to include a wealth of new data and extended to 55,000 cal BP.
Abstract: Radiocarbon (14C) ages cannot provide absolutely dated chronologies for archaeological or paleoenvironmental studies directly but must be converted to calendar age equivalents using a calibration curve compensating for fluctuations in atmospheric 14C concentration. Although calibration curves are constructed from independently dated archives, they invariably require revision as new data become available and our understanding of the Earth system improves. In this volume the international 14C calibration curves for both the Northern and Southern Hemispheres, as well as for the ocean surface layer, have been updated to include a wealth of new data and extended to 55,000 cal BP. Based on tree rings, IntCal20 now extends as a fully atmospheric record to ca. 13,900 cal BP. For the older part of the timescale, IntCal20 comprises statistically integrated evidence from floating tree-ring chronologies, lacustrine and marine sediments, speleothems, and corals. We utilized improved evaluation of the timescales and location variable 14C offsets from the atmosphere (reservoir age, dead carbon fraction) for each dataset. New statistical methods have refined the structure of the calibration curves while maintaining a robust treatment of uncertainties in the 14C ages, the calendar ages and other corrections. The inclusion of modeled marine reservoir ages derived from a three-dimensional ocean circulation model has allowed us to apply more appropriate reservoir corrections to the marine 14C data rather than the previous use of constant regional offsets from the atmosphere. Here we provide an overview of the new and revised datasets and the associated methods used for the construction of the IntCal20 curve and explore potential regional offsets for tree-ring data. 
We discuss the main differences with respect to the previous calibration curve, IntCal13, and some of the implications for archaeology and geosciences ranging from the recent past to the time of the extinction of the Neanderthals.
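The calibration step this abstract describes can be sketched numerically. The curve segment below is hypothetical (real analyses interpolate the published IntCal20 tables), but it shows the core operation: turning a measured 14C age and its error into a probability distribution over calendar ages.

```python
import numpy as np

# Toy calibration: map a 14C age (with 1-sigma error) onto calendar ages
# using a hypothetical linear curve segment; real work uses the IntCal20 tables.
cal_bp = np.arange(3000, 3101)                     # calendar ages (cal BP)
curve_c14 = 2850 + 0.8 * (cal_bp - 3000)           # hypothetical 14C ages on the curve
curve_err = np.full(cal_bp.shape, 15.0)            # hypothetical curve uncertainty

def calibrate(c14_age, c14_err):
    # Combine measurement and curve uncertainties, then evaluate a normal
    # likelihood of the measured age at every calendar year on the grid.
    sigma = np.sqrt(c14_err**2 + curve_err**2)
    like = np.exp(-0.5 * ((c14_age - curve_c14) / sigma) ** 2) / sigma
    return like / like.sum()                       # normalised posterior (flat prior)

post = calibrate(2890, 20)                         # measured 14C age 2890 ± 20 BP
mode = cal_bp[np.argmax(post)]                     # most probable calendar age
```

With this toy curve the posterior peaks where the curve value matches the measurement; wiggles in the real curve are what produce the multi-modal calibrated distributions familiar from archaeological dating.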

2,800 citations


Journal ArticleDOI
TL;DR: These remdesivir, hydroxychloroquine, lopinavir, and interferon regimens had little or no effect on hospitalized patients with Covid-19, as indicated by overall mortality, initiation of ventilation, and duration of hospital stay.
Abstract: Background World Health Organization expert groups recommended mortality trials of four repurposed antiviral drugs - remdesivir, hydroxychloroquine, lopinavir, and interferon beta-1a - in patients hospitalized with coronavirus disease 2019 (Covid-19). Methods We randomly assigned inpatients with Covid-19 equally between one of the trial drug regimens that was locally available and open control (up to five options, four active and the local standard of care). The intention-to-treat primary analyses examined in-hospital mortality in the four pairwise comparisons of each trial drug and its control (drug available but patient assigned to the same care without that drug). Rate ratios for death were calculated with stratification according to age and status regarding mechanical ventilation at trial entry. Results At 405 hospitals in 30 countries, 11,330 adults underwent randomization; 2750 were assigned to receive remdesivir, 954 to hydroxychloroquine, 1411 to lopinavir (without interferon), 2063 to interferon (including 651 to interferon plus lopinavir), and 4088 to no trial drug. Adherence was 94 to 96% midway through treatment, with 2 to 6% crossover. In total, 1253 deaths were reported (median day of death, day 8; interquartile range, 4 to 14). The Kaplan-Meier 28-day mortality was 11.8% (39.0% if the patient was already receiving ventilation at randomization and 9.5% otherwise). 
Death occurred in 301 of 2743 patients receiving remdesivir and in 303 of 2708 receiving its control (rate ratio, 0.95; 95% confidence interval [CI], 0.81 to 1.11; P = 0.50), in 104 of 947 patients receiving hydroxychloroquine and in 84 of 906 receiving its control (rate ratio, 1.19; 95% CI, 0.89 to 1.59; P = 0.23), in 148 of 1399 patients receiving lopinavir and in 146 of 1372 receiving its control (rate ratio, 1.00; 95% CI, 0.79 to 1.25; P = 0.97), and in 243 of 2050 patients receiving interferon and in 216 of 2050 receiving its control (rate ratio, 1.16; 95% CI, 0.96 to 1.39; P = 0.11). No drug definitely reduced mortality, overall or in any subgroup, or reduced initiation of ventilation or hospitalization duration. Conclusions These remdesivir, hydroxychloroquine, lopinavir, and interferon regimens had little or no effect on hospitalized patients with Covid-19, as indicated by overall mortality, initiation of ventilation, and duration of hospital stay. (Funded by the World Health Organization; ISRCTN Registry number, ISRCTN83971151; ClinicalTrials.gov number, NCT04315948.).
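As a numerical illustration, the remdesivir comparison above can be recomputed as a crude (unstratified) risk ratio with a log-scale confidence interval; the published value (0.95, stratified by age and ventilation status) differs slightly from this naive calculation.

```python
import math

def risk_ratio(d1, n1, d0, n0):
    """Crude risk ratio with a 95% CI on the log scale (no stratification)."""
    rr = (d1 / n1) / (d0 / n0)
    se = math.sqrt(1/d1 - 1/n1 + 1/d0 - 1/n0)      # SE of log(RR)
    lo, hi = (math.exp(math.log(rr) + z * se) for z in (-1.96, 1.96))
    return rr, lo, hi

# Remdesivir arm vs. its control, death counts from the abstract
rr, lo, hi = risk_ratio(301, 2743, 303, 2708)      # rr ~ 0.98, CI roughly (0.84, 1.14)
```

The crude interval comfortably spans 1.0, consistent with the trial's conclusion of no definite mortality benefit.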

2,001 citations


Journal ArticleDOI
TL;DR: The ARRIVE guidelines (Animal Research: Reporting of In Vivo Experiments) have been updated and information reorganised to facilitate their use in practice to help ensure that researchers, reviewers, and journal editors are better equipped to improve the rigour and transparency of the scientific process and thus reproducibility.
Abstract: Reproducible science requires transparent reporting. The ARRIVE guidelines (Animal Research: Reporting of In Vivo Experiments) were originally developed in 2010 to improve the reporting of animal research. They consist of a checklist of information to include in publications describing in vivo experiments to enable others to scrutinise the work adequately, evaluate its methodological rigour, and reproduce the methods and results. Despite considerable levels of endorsement by funders and journals over the years, adherence to the guidelines has been inconsistent, and the anticipated improvements in the quality of reporting in animal research publications have not been achieved. Here, we introduce ARRIVE 2.0. The guidelines have been updated and information reorganised to facilitate their use in practice. We used a Delphi exercise to prioritise and divide the items of the guidelines into 2 sets, the “ARRIVE Essential 10,” which constitutes the minimum requirement, and the “Recommended Set,” which describes the research context. This division facilitates improved reporting of animal research by supporting a stepwise approach to implementation. This helps journal editors and reviewers verify that the most important items are being reported in manuscripts. We have also developed the accompanying Explanation and Elaboration document, which serves (1) to explain the rationale behind each item in the guidelines, (2) to clarify key concepts, and (3) to provide illustrative examples. We aim, through these changes, to help ensure that researchers, reviewers, and journal editors are better equipped to improve the rigour and transparency of the scientific process and thus reproducibility.

1,796 citations


Journal ArticleDOI
TL;DR: A panel of international experts from 22 countries proposes a new definition of metabolic-dysfunction-associated fatty liver disease (MAFLD) that is comprehensive yet simple for diagnosis and independent of other liver diseases.

1,705 citations


Journal ArticleDOI
Peter J. Campbell1, Gad Getz2, Jan O. Korbel3, Joshua M. Stuart4 +1329 more (238 institutions)
06 Feb 2020 - Nature
TL;DR: The flagship paper of the ICGC/TCGA Pan-Cancer Analysis of Whole Genomes Consortium describes the generation of the integrative analyses of 2,658 whole-cancer genomes and their matching normal tissues across 38 tumour types, the structures for international data sharing and standardized analyses, and the main scientific findings from across the consortium studies.
Abstract: Cancer is driven by genetic change, and the advent of massively parallel sequencing has enabled systematic documentation of this variation at the whole-genome scale1,2,3. Here we report the integrative analysis of 2,658 whole-cancer genomes and their matching normal tissues across 38 tumour types from the Pan-Cancer Analysis of Whole Genomes (PCAWG) Consortium of the International Cancer Genome Consortium (ICGC) and The Cancer Genome Atlas (TCGA). We describe the generation of the PCAWG resource, facilitated by international data sharing using compute clouds. On average, cancer genomes contained 4–5 driver mutations when combining coding and non-coding genomic elements; however, in around 5% of cases no drivers were identified, suggesting that cancer driver discovery is not yet complete. Chromothripsis, in which many clustered structural variants arise in a single catastrophic event, is frequently an early event in tumour evolution; in acral melanoma, for example, these events precede most somatic point mutations and affect several cancer-associated genes simultaneously. Cancers with abnormal telomere maintenance often originate from tissues with low replicative activity and show several mechanisms of preventing telomere attrition to critical levels. Common and rare germline variants affect patterns of somatic mutation, including point mutations, structural variants and somatic retrotransposition. A collection of papers from the PCAWG Consortium describes non-coding mutations that drive cancer beyond those in the TERT promoter4; identifies new signatures of mutational processes that cause base substitutions, small insertions and deletions and structural variation5,6; analyses timings and patterns of tumour evolution7; describes the diverse transcriptional consequences of somatic mutation on splicing, expression levels, fusion genes and promoter activity8,9; and evaluates a range of more-specialized features of cancer genomes8,10,11,12,13,14,15,16,17,18.

1,600 citations


Journal ArticleDOI
TL;DR: Transmission characteristics appear to be of similar magnitude to severe acute respiratory syndrome-related coronavirus (SARS-CoV) and pandemic influenza, indicating a risk of global spread.
Abstract: Since December 2019, China has been experiencing a large outbreak of a novel coronavirus (2019-nCoV) which can cause respiratory disease and severe pneumonia. We estimated the basic reproduction number R0 of 2019-nCoV to be around 2.2 (90% high density interval: 1.4–3.8), indicating the potential for sustained human-to-human transmission. Transmission characteristics appear to be of similar magnitude to severe acute respiratory syndrome-related coronavirus (SARS-CoV) and pandemic influenza, indicating a risk of global spread.
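Under a simple exponential-growth model, an R0 of about 2.2 implies a doubling time on the order of days. The mean generation time of 7 days used below is an illustrative assumption, not a figure from the paper.

```python
import math

R0 = 2.2        # basic reproduction number from the abstract
T_gen = 7.0     # assumed mean generation time in days (illustrative)

# With fixed generation intervals, cases multiply by R0 every T_gen days,
# giving exponential growth rate r and a doubling time of ln(2) / r.
r = math.log(R0) / T_gen
doubling_days = math.log(2) / r    # about 6 days under these assumptions
```

Shorter assumed generation times yield faster doubling; this sensitivity is one reason early R0 estimates came with wide intervals.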

1,187 citations


Journal ArticleDOI
Marielle Saunois1, Ann R. Stavert2, Ben Poulter3, Philippe Bousquet1, Josep G. Canadell2, Robert B. Jackson4, Peter A. Raymond5, Edward J. Dlugokencky6, Sander Houweling7, Sander Houweling8, Prabir K. Patra9, Prabir K. Patra10, Philippe Ciais1, Vivek K. Arora, David Bastviken11, Peter Bergamaschi, Donald R. Blake12, Gordon Brailsford13, Lori Bruhwiler6, Kimberly M. Carlson14, Mark Carrol3, Simona Castaldi15, Naveen Chandra10, Cyril Crevoisier16, Patrick M. Crill17, Kristofer R. Covey18, Charles L. Curry19, Giuseppe Etiope20, Giuseppe Etiope21, Christian Frankenberg22, Nicola Gedney23, Michaela I. Hegglin24, Lena Höglund-Isaksson25, Gustaf Hugelius17, Misa Ishizawa26, Akihiko Ito26, Greet Janssens-Maenhout, Katherine M. Jensen27, Fortunat Joos28, Thomas Kleinen29, Paul B. Krummel2, Ray L. Langenfelds2, Goulven Gildas Laruelle, Licheng Liu30, Toshinobu Machida26, Shamil Maksyutov26, Kyle C. McDonald27, Joe McNorton31, Paul A. Miller32, Joe R. Melton, Isamu Morino26, Jurek Müller28, Fabiola Murguia-Flores33, Vaishali Naik34, Yosuke Niwa26, Sergio Noce, Simon O'Doherty33, Robert J. Parker35, Changhui Peng36, Shushi Peng37, Glen P. Peters, Catherine Prigent, Ronald G. Prinn38, Michel Ramonet1, Pierre Regnier, William J. Riley39, Judith A. Rosentreter40, Arjo Segers, Isobel J. Simpson12, Hao Shi41, Steven J. Smith42, L. Paul Steele2, Brett F. Thornton17, Hanqin Tian41, Yasunori Tohjima26, Francesco N. Tubiello43, Aki Tsuruta44, Nicolas Viovy1, Apostolos Voulgarakis45, Apostolos Voulgarakis46, Thomas Weber47, Michiel van Weele48, Guido R. van der Werf7, Ray F. Weiss49, Doug Worthy, Debra Wunch50, Yi Yin22, Yi Yin1, Yukio Yoshida26, Weiya Zhang32, Zhen Zhang51, Yuanhong Zhao1, Bo Zheng1, Qing Zhu39, Qiuan Zhu52, Qianlai Zhuang30 
Université Paris-Saclay1, Commonwealth Scientific and Industrial Research Organisation2, Goddard Space Flight Center3, Stanford University4, Yale University5, National Oceanic and Atmospheric Administration6, VU University Amsterdam7, Netherlands Institute for Space Research8, Chiba University9, Japan Agency for Marine-Earth Science and Technology10, Linköping University11, University of California, Irvine12, National Institute of Water and Atmospheric Research13, New York University14, Seconda Università degli Studi di Napoli15, École Polytechnique16, Stockholm University17, Skidmore College18, University of Victoria19, Babeș-Bolyai University20, National Institute of Geophysics and Volcanology21, California Institute of Technology22, Met Office23, University of Reading24, International Institute for Applied Systems Analysis25, National Institute for Environmental Studies26, City University of New York27, University of Bern28, Max Planck Society29, Purdue University30, European Centre for Medium-Range Weather Forecasts31, Lund University32, University of Bristol33, Geophysical Fluid Dynamics Laboratory34, University of Leicester35, Université du Québec à Montréal36, Peking University37, Massachusetts Institute of Technology38, Lawrence Berkeley National Laboratory39, Southern Cross University40, Auburn University41, Joint Global Change Research Institute42, Food and Agriculture Organization43, Finnish Meteorological Institute44, Imperial College London45, Technical University of Crete46, University of Rochester47, Royal Netherlands Meteorological Institute48, Scripps Institution of Oceanography49, University of Toronto50, University of Maryland, College Park51, Hohai University52
TL;DR: The second version of the living review paper dedicated to the decadal methane budget, integrating results of top-down studies (atmospheric observations within an atmospheric inverse-modeling framework) and bottom-up estimates (including process-based models for estimating land surface emissions and atmospheric chemistry, inventories of anthropogenic emissions, and data-driven extrapolations) as discussed by the authors.
Abstract: Understanding and quantifying the global methane (CH4) budget is important for assessing realistic pathways to mitigate climate change. Atmospheric emissions and concentrations of CH4 continue to increase, making CH4 the second most important human-influenced greenhouse gas in terms of climate forcing, after carbon dioxide (CO2). The relative importance of CH4 compared to CO2 depends on its shorter atmospheric lifetime, stronger warming potential, and variations in atmospheric growth rate over the past decade, the causes of which are still debated. Two major challenges in reducing uncertainties in the atmospheric growth rate arise from the variety of geographically overlapping CH4 sources and from the destruction of CH4 by short-lived hydroxyl radicals (OH). To address these challenges, we have established a consortium of multidisciplinary scientists under the umbrella of the Global Carbon Project to synthesize and stimulate new research aimed at improving and regularly updating the global methane budget. Following Saunois et al. (2016), we present here the second version of the living review paper dedicated to the decadal methane budget, integrating results of top-down studies (atmospheric observations within an atmospheric inverse-modelling framework) and bottom-up estimates (including process-based models for estimating land surface emissions and atmospheric chemistry, inventories of anthropogenic emissions, and data-driven extrapolations). For the 2008–2017 decade, global methane emissions are estimated by atmospheric inversions (a top-down approach) to be 576 Tg CH4 yr−1 (range 550–594, corresponding to the minimum and maximum estimates of the model ensemble). Of this total, 359 Tg CH4 yr−1 or ∼ 60 % is attributed to anthropogenic sources, that is emissions caused by direct human activity (i.e. anthropogenic emissions; range 336–376 Tg CH4 yr−1 or 50 %–65 %). 
The mean annual total emission for the new decade (2008–2017) is 29 Tg CH4 yr−1 larger than our estimate for the previous decade (2000–2009), and 24 Tg CH4 yr−1 larger than the one reported in the previous budget for 2003–2012 (Saunois et al., 2016). Since 2012, global CH4 emissions have been tracking the warmest scenarios assessed by the Intergovernmental Panel on Climate Change. Bottom-up methods suggest almost 30 % larger global emissions (737 Tg CH4 yr−1, range 594–881) than top-down inversion methods. Indeed, bottom-up estimates for natural sources such as natural wetlands, other inland water systems, and geological sources are higher than top-down estimates. The atmospheric constraints on the top-down budget suggest that at least some of these bottom-up emissions are overestimated. The latitudinal distribution of atmospheric observation-based emissions indicates a predominance of tropical emissions (∼ 65 % of the global budget, < 30° N) compared to mid-latitudes (∼ 30 %, 30–60° N) and high northern latitudes (∼ 4 %, 60–90° N). The most important source of uncertainty in the methane budget is attributable to natural emissions, especially those from wetlands and other inland waters. Some of our global source estimates are smaller than those in previously published budgets (Saunois et al., 2016; Kirschke et al., 2013). In particular, wetland emissions are about 35 Tg CH4 yr−1 lower owing to an improved partitioning between wetlands and other inland waters. Emissions from geological sources and wild animals are also found to be smaller, by 7 Tg CH4 yr−1 and 8 Tg CH4 yr−1, respectively. However, the overall discrepancy between bottom-up and top-down estimates has been reduced by only 5 % compared to Saunois et al. (2016), due to a higher estimate of emissions from inland waters, highlighting the need for more detailed research on emission factors.
Priorities for improving the methane budget include (i) a global, high-resolution map of water-saturated soils and inundated areas emitting methane based on a robust classification of different types of emitting habitats; (ii) further development of process-based models for inland-water emissions; (iii) intensification of methane observations at local scales (e.g., FLUXNET-CH4 measurements) and urban-scale monitoring to constrain bottom-up land surface models, and at regional scales (surface networks and satellites) to constrain atmospheric inversions; (iv) improvements of transport models and the representation of photochemical sinks in top-down inversions; and (v) development of a 3D variational inversion system using isotopic and/or co-emitted species such as ethane to improve source partitioning.
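The headline shares quoted above can be cross-checked from the abstract's own central estimates; this is simple arithmetic on the reported numbers, not a recomputation of the budget.

```python
# Central estimates for 2008-2017, in Tg CH4 per year, from the abstract
top_down = 576      # atmospheric-inversion (top-down) total
anthro = 359        # portion attributed to anthropogenic sources
bottom_up = 737     # bottom-up total

anthro_share = anthro / top_down       # about 0.62, i.e. the "~60 %" quoted
bu_excess = bottom_up / top_down - 1   # about 0.28, the "almost 30 %" gap
```

Both derived figures agree with the fractions stated in the text, which is a quick sanity check when transcribing budget numbers.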

1,047 citations


Journal ArticleDOI
TL;DR: The ARRIVE guidelines are revised to update them and facilitate their use in practice and this explanation and elaboration document was developed as part of the revision.
Abstract: Improving the reproducibility of biomedical research is a major challenge. Transparent and accurate reporting is vital to this process; it allows readers to assess the reliability of the findings and repeat or build upon the work of other researchers. The ARRIVE guidelines (Animal Research: Reporting In Vivo Experiments) were developed in 2010 to help authors and journals identify the minimum information necessary to report in publications describing in vivo experiments. Despite widespread endorsement by the scientific community, the impact of ARRIVE on the transparency of reporting in animal research publications has been limited. We have revised the ARRIVE guidelines to update them and facilitate their use in practice. The revised guidelines are published alongside this paper. This explanation and elaboration document was developed as part of the revision. It provides further information about each of the 21 items in ARRIVE 2.0, including the rationale and supporting evidence for their inclusion in the guidelines, elaboration of details to report, and examples of good reporting from the published literature. This document also covers advice and best practice in the design and conduct of animal studies to support researchers in improving standards from the start of the experimental design process through to publication.

Journal ArticleDOI
Jens Kattge1, Gerhard Bönisch2, Sandra Díaz3, Sandra Lavorel +751 more (314 institutions)
TL;DR: The extent of the trait data compiled in TRY is evaluated and emerging patterns of data coverage and representativeness are analyzed to conclude that reducing data gaps and biases in the TRY database remains a key challenge and requires a coordinated approach to data mobilization and trait measurements.
Abstract: Plant traits-the morphological, anatomical, physiological, biochemical and phenological characteristics of plants-determine how plants respond to environmental factors, affect other trophic levels, and influence ecosystem properties and their benefits and detriments to people. Plant trait data thus represent the basis for a vast area of research spanning from evolutionary biology, community and functional ecology, to biodiversity conservation, ecosystem and landscape management, restoration, biogeography and earth system modelling. Since its foundation in 2007, the TRY database of plant traits has grown continuously. It now provides unprecedented data coverage under an open access data policy and is the main plant trait database used by the research community worldwide. Increasingly, the TRY database also supports new frontiers of trait-based plant research, including the identification of data gaps and the subsequent mobilization or measurement of new data. To support this development, in this article we evaluate the extent of the trait data compiled in TRY and analyse emerging patterns of data coverage and representativeness. Best species coverage is achieved for categorical traits-almost complete coverage for 'plant growth form'. However, most traits relevant for ecology and vegetation modelling are characterized by continuous intraspecific variation and trait-environmental relationships. These traits have to be measured on individual plants in their respective environment. Despite unprecedented data coverage, we observe a humbling lack of completeness and representativeness of these continuous traits in many aspects. We, therefore, conclude that reducing data gaps and biases in the TRY database remains a key challenge and requires a coordinated approach to data mobilization and trait measurements. This can only be achieved in collaboration with other initiatives.

Journal ArticleDOI
TL;DR: It is suggested that most people who become infected with SARS-CoV-2 will not remain asymptomatic throughout the course of the infection, and combination prevention measures will continue to be needed.
Abstract: BACKGROUND There is disagreement about the level of asymptomatic severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infection. We conducted a living systematic review and meta-analysis to address three questions: (1) Amongst people who become infected with SARS-CoV-2, what proportion does not experience symptoms at all during their infection? (2) Amongst people with SARS-CoV-2 infection who are asymptomatic when diagnosed, what proportion will develop symptoms later? (3) What proportion of SARS-CoV-2 transmission is accounted for by people who are either asymptomatic throughout infection or presymptomatic? METHODS AND FINDINGS We searched PubMed, Embase, bioRxiv, and medRxiv using a database of SARS-CoV-2 literature that is updated daily, on 25 March 2020, 20 April 2020, and 10 June 2020. Studies of people with SARS-CoV-2 diagnosed by reverse transcriptase PCR (RT-PCR) that documented follow-up and symptom status at the beginning and end of follow-up or modelling studies were included. One reviewer extracted data and a second verified the extraction, with disagreement resolved by discussion or a third reviewer. Risk of bias in empirical studies was assessed with an adapted checklist for case series, and the relevance and credibility of modelling studies were assessed using a published checklist. We included a total of 94 studies. The overall estimate of the proportion of people who become infected with SARS-CoV-2 and remain asymptomatic throughout infection was 20% (95% confidence interval [CI] 17-25) with a prediction interval of 3%-67% in 79 studies that addressed this review question. There was some evidence that biases in the selection of participants influence the estimate. In seven studies of defined populations screened for SARS-CoV-2 and then followed, 31% (95% CI 26%-37%, prediction interval 24%-38%) remained asymptomatic. The proportion of people that is presymptomatic could not be summarised, owing to heterogeneity. 
The secondary attack rate was lower in contacts of people with asymptomatic infection than in contacts of those with symptomatic infection (relative risk 0.35, 95% CI 0.10-1.27). Modelling studies fit to data found a higher proportion of all SARS-CoV-2 infections resulting from transmission from presymptomatic individuals than from asymptomatic individuals. Limitations of the review include that most included studies were not designed to estimate the proportion of asymptomatic SARS-CoV-2 infections and were at risk of selection biases; we did not consider the possible impact of false negative RT-PCR results, which would underestimate the proportion of asymptomatic infections; and the database does not include all sources. CONCLUSIONS The findings of this living systematic review suggest that most people who become infected with SARS-CoV-2 will not remain asymptomatic throughout the course of the infection. The contribution of presymptomatic and asymptomatic infections to overall SARS-CoV-2 transmission means that combination prevention measures, with enhanced hand hygiene, masks, testing, tracing, and isolation strategies and social distancing, will continue to be needed.

Journal ArticleDOI
TL;DR: These updated recommendations take into account all rTMS publications, including data prior to 2014, as well as currently reviewed literature until the end of 2018, and are based on the difference in therapeutic efficacy achieved by real vs. sham rTMS protocols.

Journal ArticleDOI
T. Aoyama1, Nils Asmussen2, M. Benayoun3, Johan Bijnens4 +146 more (64 institutions)
TL;DR: The current status of the Standard Model calculation of the anomalous magnetic moment of the muon is reviewed in this paper, where the authors present a detailed account of recent efforts to improve the calculation of the two hadronic contributions (hadronic vacuum polarization and hadronic light-by-light scattering) with either a data-driven, dispersive approach or a first-principles lattice approach.

Journal ArticleDOI
TL;DR: The identification of the elements of the gut-liver axis primarily damaged in each chronic liver disease offers possibilities for intervention.

Journal ArticleDOI
TL;DR: Landscape compositions that can mitigate trade-offs under optimal land-use allocation are identified, but intensive monocultures are shown always to lead to higher profits, suggesting that targeted landscape planning is needed to increase land-use efficiency while ensuring socio-ecological sustainability.
Abstract: Land-use transitions can enhance the livelihoods of smallholder farmers but potential economic-ecological trade-offs remain poorly understood. Here, we present an interdisciplinary study of the environmental, social and economic consequences of land-use transitions in a tropical smallholder landscape on Sumatra, Indonesia. We find widespread biodiversity-profit trade-offs resulting from land-use transitions from forest and agroforestry systems to rubber and oil palm monocultures, for 26,894 aboveground and belowground species and whole-ecosystem multidiversity. Despite variation between ecosystem functions, profit gains come at the expense of ecosystem multifunctionality, indicating far-reaching ecosystem deterioration. We identify landscape compositions that can mitigate trade-offs under optimal land-use allocation but also show that intensive monocultures always lead to higher profits. These findings suggest that, to reduce losses in biodiversity and ecosystem functioning, changes in economic incentive structures through well-designed policies are urgently needed.

Journal ArticleDOI
08 Oct 2020-Nature
TL;DR: A global N2O inventory is presented that incorporates both natural and anthropogenic sources and accounts for the interaction between nitrogen additions and the biochemical processes that control N2O emissions, using bottom-up, top-down and process-based model approaches.
Abstract: Nitrous oxide (N2O), like carbon dioxide, is a long-lived greenhouse gas that accumulates in the atmosphere. Over the past 150 years, increasing atmospheric N2O concentrations have contributed to stratospheric ozone depletion and climate change, with the current rate of increase estimated at 2 per cent per decade. Existing national inventories do not provide a full picture of N2O emissions, owing to their omission of natural sources and limitations in methodology for attributing anthropogenic sources. Here we present a global N2O inventory that incorporates both natural and anthropogenic sources and accounts for the interaction between nitrogen additions and the biochemical processes that control N2O emissions. We use bottom-up (inventory, statistical extrapolation of flux measurements, process-based land and ocean modelling) and top-down (atmospheric inversion) approaches to provide a comprehensive quantification of global N2O sources and sinks resulting from 21 natural and human sectors between 1980 and 2016. Global N2O emissions were 17.0 (minimum-maximum estimates: 12.2-23.5) teragrams of nitrogen per year (bottom-up) and 16.9 (15.9-17.7) teragrams of nitrogen per year (top-down) between 2007 and 2016. Global human-induced emissions, which are dominated by nitrogen additions to croplands, increased by 30% over the past four decades to 7.3 (4.2-11.4) teragrams of nitrogen per year. This increase was mainly responsible for the growth in the atmospheric burden. Our findings point to growing N2O emissions in emerging economies, particularly Brazil, China and India. Analysis of process-based model estimates reveals an emerging N2O-climate feedback resulting from interactions between nitrogen additions and climate change. The recent growth in N2O emissions exceeds some of the highest projected emission scenarios, underscoring the urgency to mitigate N2O emissions.
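As a quick consistency check of the figures quoted in this abstract (illustrative arithmetic only, using the abstract's own numbers), the implied anthropogenic baseline four decades earlier and the overlap between the bottom-up and top-down ranges can be computed directly:

```python
# Illustrative arithmetic from the abstract's figures (Tg N per year).
current_anthropogenic = 7.3        # 2007-2016 mean, human-induced emissions
growth = 0.30                      # "increased by 30% over the past four decades"
baseline = current_anthropogenic / (1 + growth)
print(f"implied early-1980s anthropogenic emissions: {baseline:.1f} Tg N/yr")  # ~5.6

# Bottom-up and top-down global totals agree within their reported ranges:
bottom_up = (17.0, 12.2, 23.5)     # (central, min, max) estimates
top_down = (16.9, 15.9, 17.7)
overlap = max(bottom_up[1], top_down[1]) <= min(bottom_up[2], top_down[2])
print("ranges overlap:", overlap)  # True
```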

Journal ArticleDOI
TL;DR: In this paper, a consensus set of six molecular classes (luminal papillary (24%), luminal nonspecified (8%), luminal unstable (15%), stroma-rich (15%), basal/squamous (35%), and neuroendocrine-like (3%)) was identified.

Journal ArticleDOI
20 Jan 2020
TL;DR: In this article, the authors propose a set of four general principles that underlie high-quality knowledge co-production for sustainability research, and offer practical guidance on how to engage in meaningful co-productive practices, and how to evaluate their quality and success.
Abstract: Research practice, funding agencies and global science organizations suggest that research aimed at addressing sustainability challenges is most effective when ‘co-produced’ by academics and non-academics. Co-production promises to address the complex nature of contemporary sustainability challenges better than more traditional scientific approaches. But definitions of knowledge co-production are diverse and often contradictory. We propose a set of four general principles that underlie high-quality knowledge co-production for sustainability research. Using these principles, we offer practical guidance on how to engage in meaningful co-productive practices, and how to evaluate their quality and success.

Journal ArticleDOI
TL;DR: In this article, a review of lattice results related to pion, kaon, D-meson, B-meson, neutral kaon mixing, and nucleon physics is presented, with the aim of making them easily accessible to the nuclear and particle physics communities.
Abstract: We review lattice results related to pion, kaon, D-meson, B-meson, and nucleon physics with the aim of making them easily accessible to the nuclear and particle physics communities. More specifically, we report on the determination of the light-quark masses, the form factor $f_+(0)$ arising in the semileptonic $K \rightarrow \pi $ transition at zero momentum transfer, as well as the decay constant ratio $f_K/f_\pi $ and its consequences for the CKM matrix elements $V_{us}$ and $V_{ud}$. Furthermore, we describe the results obtained on the lattice for some of the low-energy constants of $SU(2)_L\times SU(2)_R$ and $SU(3)_L\times SU(3)_R$ Chiral Perturbation Theory. We review the determination of the $B_K$ parameter of neutral kaon mixing as well as the additional four B parameters that arise in theories of physics beyond the Standard Model. For the heavy-quark sector, we provide results for $m_c$ and $m_b$ as well as those for D- and B-meson decay constants, form factors, and mixing parameters. These are the heavy-quark quantities most relevant for the determination of CKM matrix elements and the global CKM unitarity-triangle fit. We review the status of lattice determinations of the strong coupling constant $\alpha _s$. Finally, in this review we have added a new section reviewing results for nucleon matrix elements of the axial, scalar and tensor bilinears, both isovector and flavor diagonal.


Journal ArticleDOI
10 Dec 2020-PLOS ONE
TL;DR: The findings reinforce the need for repeated testing in patients with suspicion of SARS-CoV-2 infection, given that up to 54% of COVID-19 patients may have an initial false-negative RT-PCR (very low certainty of evidence).
Abstract: Background A false-negative case of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infection is defined as a person with suspected infection and an initial negative result by reverse transcription-polymerase chain reaction (RT-PCR) test, with a positive result on a subsequent test. False-negative cases have important implications for isolation and risk of transmission of infected people and for the management of coronavirus disease 2019 (COVID-19). We aimed to review and critically appraise evidence about the rate of RT-PCR false-negatives at initial testing for COVID-19. Methods We searched MEDLINE, EMBASE, LILACS, as well as COVID-19 repositories, including the EPPI-Centre living systematic map of evidence about COVID-19 and the Coronavirus Open Access Project living evidence database. Two authors independently screened and selected studies according to the eligibility criteria and collected data from the included studies. The risk of bias was assessed using the Quality Assessment of Diagnostic Accuracy Studies (QUADAS-2) tool. We calculated the proportion of false-negative test results using a multilevel mixed-effect logistic regression model. The certainty of the evidence about false-negative cases was rated using the GRADE approach for tests and strategies. All information in this article is current up to July 17, 2020. Results We included 34 studies enrolling 12,057 COVID-19 confirmed cases. All studies were affected by several risks of bias and applicability concerns. The pooled estimate of false-negative proportion was highly affected by unexplained heterogeneity (tau-squared = 1.39; 90% prediction interval from 0.02 to 0.54). The certainty of the evidence was judged as very low due to the risk of bias, indirectness, and inconsistency issues. Conclusions There is substantial and largely unexplained heterogeneity in the proportion of false-negative RT-PCR results. 
The collected evidence has several limitations, including risk of bias issues, high heterogeneity, and concerns about its applicability. Nonetheless, our findings reinforce the need for repeated testing in patients with suspicion of SARS-CoV-2 infection, given that up to 54% of COVID-19 patients may have an initial false-negative RT-PCR (very low certainty of evidence). Systematic review registration Protocol available on the OSF website: https://tinyurl.com/vvbgqya.
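The abstract reports a between-study variance of tau-squared = 1.39 on the logit scale and a 90% prediction interval of 0.02 to 0.54 for the false-negative proportion. A minimal sketch of how such a prediction interval is back-transformed from the logit scale, assuming a hypothetical pooled logit mean of -1.8 (the pooled mean is not quoted in this abstract; this value is chosen for illustration only) and ignoring estimation error in the mean:

```python
import math

def expit(x):
    """Inverse logit: map a logit-scale value back to a proportion."""
    return 1.0 / (1.0 + math.exp(-x))

tau2 = 1.39   # between-study variance on the logit scale (from the abstract)
mu = -1.8     # hypothetical pooled mean on the logit scale (illustration only)
z90 = 1.645   # two-sided 90% normal quantile

# Half-width of the prediction interval; ignores the standard error of mu.
half = z90 * math.sqrt(tau2)
lo, hi = expit(mu - half), expit(mu + half)
print(f"approx. 90% prediction interval: {lo:.2f} to {hi:.2f}")
```

With these assumed inputs the sketch gives an interval of roughly 0.02 to 0.53, close to the interval reported above; a full multilevel model would also propagate uncertainty in the pooled mean.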

Journal ArticleDOI
TL;DR: A methodological framework, Confidence in Network Meta-Analysis (CINeMA), is presented for evaluating confidence in the results of network meta-analyses comparing multiple interventions; it improves transparency and avoids the selective use of evidence when forming judgments, thus limiting subjectivity in the process.
Abstract: BACKGROUND The evaluation of the credibility of results from a meta-analysis has become an important part of the evidence synthesis process. We present a methodological framework to evaluate confidence in the results from network meta-analyses, Confidence in Network Meta-Analysis (CINeMA), when multiple interventions are compared. METHODOLOGY CINeMA considers 6 domains: (i) within-study bias, (ii) reporting bias, (iii) indirectness, (iv) imprecision, (v) heterogeneity, and (vi) incoherence. Key to judgments about within-study bias and indirectness is the percentage contribution matrix, which shows how much information each study contributes to the results from network meta-analysis. The contribution matrix can easily be computed using a freely available web application. In evaluating imprecision, heterogeneity, and incoherence, we consider the impact of these components of variability in forming clinical decisions. CONCLUSIONS Via 3 examples, we show that CINeMA improves transparency and avoids the selective use of evidence when forming judgments, thus limiting subjectivity in the process. CINeMA is easy to apply even in large and complicated networks.
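As a toy illustration of the inverse-variance logic that underlies how much each evidence source contributes to a network estimate (a drastic simplification of CINeMA's percentage contribution matrix, which is computed from the full network; all variances below are hypothetical), consider one comparison informed by a direct estimate and a single indirect path:

```python
# One comparison (A vs C) informed by direct evidence and an indirect
# path A-B-C. All variances are hypothetical, for illustration only.
v_direct = 0.04            # variance of the direct A-C estimate
v_ab, v_bc = 0.05, 0.05    # variances along the indirect path A-B-C
v_indirect = v_ab + v_bc   # variances add along a path

# Inverse-variance weights determine each source's percentage contribution.
w_direct = 1 / v_direct
w_indirect = 1 / v_indirect
total = w_direct + w_indirect
print(f"direct contribution:   {100 * w_direct / total:.0f}%")    # 71%
print(f"indirect contribution: {100 * w_indirect / total:.0f}%")  # 29%
```

In a full network meta-analysis these contributions are derived per study from the model's hat matrix, which is what the freely available CINeMA web application computes.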

Journal ArticleDOI
09 Mar 2020
TL;DR: A typology of compound events is proposed, distinguishing events that are preconditioned, multivariate, temporally compounding and spatially compounding, and suggests analytical and modelling approaches to aid in their investigation.
Abstract: Compound weather and climate events describe combinations of multiple climate drivers and/or hazards that contribute to societal or environmental risk. Although many climate-related disasters are caused by compound events, the understanding, analysis, quantification and prediction of such events is still in its infancy. In this Review, we propose a typology of compound events and suggest analytical and modelling approaches to aid in their investigation. We organize the highly diverse compound event types according to four themes: preconditioned, where a weather-driven or climate-driven precondition aggravates the impacts of a hazard; multivariate, where multiple drivers and/or hazards lead to an impact; temporally compounding, where a succession of hazards leads to an impact; and spatially compounding, where hazards in multiple connected locations cause an aggregated impact. Through structuring compound events and their respective analysis tools, the typology offers an opportunity for deeper insight into their mechanisms and impacts, benefiting the development of effective adaptation strategies. However, the complex nature of compound events means that some cases inevitably fit into more than one class, necessitating soft boundaries within the typology. Future work must homogenize the available analytical approaches into a robust toolset for compound-event analysis under present and future climate conditions. Research on compound events has increased vastly in the last several years, yet a typology was absent. This Review proposes a comprehensive classification scheme, incorporating compound events that are preconditioned, multivariate, temporally compounding and spatially compounding.

Journal ArticleDOI
TL;DR: This S3 guideline informs clinical practice, health systems, policymakers and, indirectly, the public on the available and most effective modalities to treat periodontitis and to maintain a healthy dentition for a lifetime, according to the available evidence at the time of publication.
Abstract: BACKGROUND The recently introduced 2017 World Workshop classification of periodontitis, incorporating stages and grades of disease, aims to link disease classification with approaches to prevention and treatment, as it describes not only disease severity and extent but also the degree of complexity and an individual's risk. There is, therefore, a need for evidence-based clinical guidelines providing recommendations to treat periodontitis. AIM The objective of the current project was to develop an S3 Level Clinical Practice Guideline (CPG) for the treatment of Stage I-III periodontitis. MATERIAL AND METHODS This S3 CPG was developed under the auspices of the European Federation of Periodontology (EFP), following the methodological guidance of the Association of Scientific Medical Societies in Germany and the Grading of Recommendations Assessment, Development and Evaluation (GRADE) approach. The rigorous and transparent process included synthesis of relevant research in 15 specifically commissioned systematic reviews, evaluation of the quality and strength of evidence, the formulation of specific recommendations, and consensus on those recommendations by leading experts and a broad base of stakeholders. RESULTS The S3 CPG approaches the treatment of periodontitis (stages I, II and III) using a pre-established stepwise approach to therapy that, depending on the disease stage, should be incremental, with each step including different interventions. Consensus was achieved on recommendations covering different interventions, aimed at (a) behavioural changes, supragingival biofilm, gingival inflammation and risk factor control; (b) supra- and sub-gingival instrumentation, with and without adjunctive therapies; (c) different types of periodontal surgical interventions; and (d) the necessary supportive periodontal care to extend benefits over time.
CONCLUSION This S3 guideline informs clinical practice, health systems, policymakers and, indirectly, the public on the available and most effective modalities to treat periodontitis and to maintain a healthy dentition for a lifetime, according to the available evidence at the time of publication.

Journal ArticleDOI
T. Aoyama1, Nils Asmussen2, M. Benayoun3, Johan Bijnens4  +146 moreInstitutions (64)
TL;DR: The current status of the Standard Model calculation of the anomalous magnetic moment of the muon has been reviewed in this paper, where the authors present a detailed account of recent efforts to improve the calculation of these two contributions with either a data-driven, dispersive approach, or a first-principle, lattice-QCD approach.
Abstract: We review the present status of the Standard Model calculation of the anomalous magnetic moment of the muon. This is performed in a perturbative expansion in the fine-structure constant $\alpha$ and is broken down into pure QED, electroweak, and hadronic contributions. The pure QED contribution is by far the largest and has been evaluated up to and including $\mathcal{O}(\alpha^5)$ with negligible numerical uncertainty. The electroweak contribution is suppressed by $(m_\mu/M_W)^2$ and only shows up at the level of the seventh significant digit. It has been evaluated up to two loops and is known to better than one percent. Hadronic contributions are the most difficult to calculate and are responsible for almost all of the theoretical uncertainty. The leading hadronic contribution appears at $\mathcal{O}(\alpha^2)$ and is due to hadronic vacuum polarization, whereas at $\mathcal{O}(\alpha^3)$ the hadronic light-by-light scattering contribution appears. Given the low characteristic scale of this observable, these contributions have to be calculated with nonperturbative methods, in particular, dispersion relations and the lattice approach to QCD. The largest part of this review is dedicated to a detailed account of recent efforts to improve the calculation of these two contributions with either a data-driven, dispersive approach, or a first-principle, lattice-QCD approach. The final result reads $a_\mu^\text{SM}=116\,591\,810(43)\times 10^{-11}$ and is smaller than the Brookhaven measurement by 3.7$\sigma$. The experimental uncertainty will soon be reduced by up to a factor of four by the new experiment currently running at Fermilab, and also by the future J-PARC experiment. This, together with the prospects to further reduce the theoretical uncertainty in the near future, which are also discussed here, makes this quantity one of the most promising places to look for evidence of new physics.
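The quoted 3.7$\sigma$ discrepancy follows from combining the two uncertainties in quadrature. A minimal sketch, taking the Brookhaven (BNL E821) measurement as $116\,592\,089(63)\times 10^{-11}$ (this experimental value is not quoted in the abstract above; it is assumed here from the published E821 result):

```python
import math

# Values in units of 1e-11: the SM prediction from the abstract and the
# assumed BNL E821 measurement (not quoted in the abstract itself).
a_sm, err_sm = 116_591_810, 43
a_exp, err_exp = 116_592_089, 63

diff = a_exp - a_sm                 # central-value difference: 279
err = math.hypot(err_sm, err_exp)   # uncertainties combined in quadrature
sigma = diff / err
print(f"discrepancy: {diff} x 1e-11, significance: {sigma:.1f} sigma")  # 3.7 sigma
```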

Journal ArticleDOI
TL;DR: This article explains why the testing strategy in Switzerland should be strengthened urgently as a core component of a combination approach to control COVID-19.
Abstract: Switzerland is among the countries with the highest number of coronavirus disease-2019 (COVID-19) cases per capita in the world. There are likely many people with undetected SARS-CoV-2 infection because testing efforts are currently not detecting all infected people, including some with clinical disease compatible with COVID-19. Testing on its own will not stop the spread of SARS-CoV-2. Testing is part of a strategy. The World Health Organization recommends a combination of measures: rapid diagnosis and immediate isolation of cases, rigorous tracking and precautionary self-isolation of close contacts. In this article, we explain why the testing strategy in Switzerland should be strengthened urgently, as a core component of a combination approach to control COVID-19.