Showing papers by "VU University Amsterdam" published in 2020
••
New York University1, University of Chicago2, Mackenzie Presbyterian University3, Middlesex University4, University of Kent5, Nicolaus Copernicus University in Toruń6, Harvard University7, Yale University8, Stanford University9, Northwestern University10, University of Sussex11, Utrecht University12, University of California, San Diego13, University of Maryland, College Park14, McGovern Institute for Brain Research15, University of Queensland16, University of Michigan17, California Institute of Technology18, Lehigh University19, University of Regina20, University of Oregon21, Ohio State University22, Massachusetts Institute of Technology23, University of St Andrews24, University of Cambridge25, University of British Columbia26, University of Illinois at Chicago27, University of California, Berkeley28, Carleton University29, VU University Amsterdam30, Cornell University31
TL;DR: Evidence from a selection of research topics relevant to pandemics is discussed, including work on navigating threats, social and cultural influences on behaviour, science communication, moral decision-making, leadership, and stress and coping.
Abstract: The COVID-19 pandemic represents a massive global health crisis. Because the crisis requires large-scale behaviour change and places significant psychological burdens on individuals, insights from the social and behavioural sciences can be used to help align human behaviour with the recommendations of epidemiologists and public health experts. Here we discuss evidence from a selection of research topics relevant to pandemics, including work on navigating threats, social and cultural influences on behaviour, science communication, moral decision-making, leadership, and stress and coping. In each section, we note the nature and quality of prior research, including uncertainty and unsettled issues. We identify several insights for effective response to the COVID-19 pandemic and highlight important gaps researchers should move quickly to fill in the coming weeks and months.
3,223 citations
••
École Normale Supérieure1, University of Exeter2, Norwich Research Park3, University of Groningen4, Wageningen University and Research Centre5, Ludwig Maximilian University of Munich6, Max Planck Society7, Commonwealth Scientific and Industrial Research Organisation8, Université Paris-Saclay9, Stanford University10, National Oceanic and Atmospheric Administration11, National Institute for Space Research12, University of Southampton13, Bermuda Institute of Ocean Sciences14, PSL Research University15, National Institute for Environmental Studies16, Japan Agency for Marine-Earth Science and Technology17, University of Maryland, College Park18, University of Leeds19, International Institute of Minnesota20, Flanders Marine Institute21, ETH Zurich22, University of East Anglia23, German Aerospace Center24, Woods Hole Research Center25, University of Illinois at Urbana–Champaign26, University of Toulouse27, Japan Meteorological Agency28, Plymouth Marine Laboratory29, University of Paris30, Hobart Corporation31, Oeschger Centre for Climate Change Research32, Tsinghua University33, National Center for Atmospheric Research34, Appalachian State University35, University of Colorado Boulder36, University of Washington37, Atlantic Oceanographic and Meteorological Laboratory38, Princeton University39, Met Office40, Leibniz Institute of Marine Sciences41, Auburn University42, University of Tasmania43, VU University Amsterdam44, Oak Ridge National Laboratory45, Sun Yat-sen University46, Nanjing University47
TL;DR: In this paper, the authors describe and synthesize the data sets and methodology used to quantify the five major components of the global carbon budget and their uncertainties, with land-use change emissions estimated from land-use data and bookkeeping models.
Abstract: Accurate assessment of anthropogenic carbon dioxide (CO2) emissions and their redistribution among the atmosphere, ocean, and terrestrial biosphere in a changing climate – the “global carbon budget” – is important to better understand the global carbon cycle, support the development of climate policies, and project future climate change. Here we describe and synthesize data sets and methodology to quantify the five major components of the global carbon budget and their uncertainties. Fossil CO2 emissions (EFOS) are based on energy statistics and cement production data, while emissions from land-use change (ELUC), mainly deforestation, are based on land use and land-use change data and bookkeeping models. Atmospheric CO2 concentration is measured directly and its growth rate (GATM) is computed from the annual changes in concentration. The ocean CO2 sink (SOCEAN) and terrestrial CO2 sink (SLAND) are estimated with global process models constrained by observations. The resulting carbon budget imbalance (BIM), the difference between the estimated total emissions and the estimated changes in the atmosphere, ocean, and terrestrial biosphere, is a measure of imperfect data and understanding of the contemporary carbon cycle. All uncertainties are reported as ±1σ. For the last decade available (2010–2019), EFOS was 9.6 ± 0.5 GtC yr−1 excluding the cement carbonation sink (9.4 ± 0.5 GtC yr−1 when the cement carbonation sink is included), and ELUC was 1.6 ± 0.7 GtC yr−1. For the same decade, GATM was 5.1 ± 0.02 GtC yr−1 (2.4 ± 0.01 ppm yr−1), SOCEAN 2.5 ± 0.6 GtC yr−1, and SLAND 3.4 ± 0.9 GtC yr−1, with a budget imbalance BIM of −0.1 GtC yr−1 indicating a near balance between estimated sources and sinks over the last decade. 
For the year 2019 alone, the growth in EFOS was only about 0.1 %, with fossil emissions increasing to 9.9 ± 0.5 GtC yr−1 excluding the cement carbonation sink (9.7 ± 0.5 GtC yr−1 when the cement carbonation sink is included), and ELUC was 1.8 ± 0.7 GtC yr−1, for total anthropogenic CO2 emissions of 11.5 ± 0.9 GtC yr−1 (42.2 ± 3.3 GtCO2). Also for 2019, GATM was 5.4 ± 0.2 GtC yr−1 (2.5 ± 0.1 ppm yr−1), SOCEAN was 2.6 ± 0.6 GtC yr−1, and SLAND was 3.1 ± 1.2 GtC yr−1, with a BIM of 0.3 GtC. The global atmospheric CO2 concentration reached 409.85 ± 0.1 ppm averaged over 2019. Preliminary data for 2020, accounting for the COVID-19-induced changes in emissions, suggest a decrease in EFOS relative to 2019 of about −7 % (median estimate) based on individual estimates from four studies of −6 %, −7 %, −7 % (−3 % to −11 %), and −13 %. Overall, the mean and trend in the components of the global carbon budget are consistently estimated over the period 1959–2019, but discrepancies of up to 1 GtC yr−1 persist in the representation of semi-decadal variability in CO2 fluxes. Comparison of estimates from diverse approaches and observations shows (1) no consensus on the mean and trend in land-use change emissions over the last decade, (2) persistently low agreement between the different methods on the magnitude of the land CO2 flux in the northern extra-tropics, and (3) an apparent discrepancy between the different methods for the ocean sink outside the tropics, particularly in the Southern Ocean. This living data update documents changes in the methods and data sets used in this new global carbon budget and the progress in understanding of the global carbon cycle compared with previous publications of this data set (Friedlingstein et al., 2019; Le Quéré et al., 2018b, a, 2016, 2015b, a, 2014, 2013). The data presented in this work are available at https://doi.org/10.18160/gcp-2020 (Friedlingstein et al., 2020).
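The budget identity described above can be checked directly from the rounded 2010–2019 decadal means quoted in the abstract. This is a back-of-envelope sketch only: each component is independently rounded, so the residual computed here (0.2 GtC yr−1) differs slightly from the paper's full-precision BIM of −0.1 GtC yr−1.

```python
# Global carbon budget identity: the imbalance B_IM is total estimated
# emissions minus the estimated changes in atmosphere, ocean and land.
# Inputs are the rounded 2010-2019 means quoted above, in GtC per year.
E_FOS = 9.6    # fossil CO2 emissions (excluding the cement carbonation sink)
E_LUC = 1.6    # land-use change emissions
G_ATM = 5.1    # atmospheric CO2 growth
S_OCEAN = 2.5  # ocean sink
S_LAND = 3.4   # land sink

B_IM = (E_FOS + E_LUC) - (G_ATM + S_OCEAN + S_LAND)
print(round(B_IM, 1))  # 0.2 from these rounded inputs (the paper reports -0.1)
```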
1,764 citations
••
TL;DR: In 2019, the LIGO Livingston detector observed a compact binary coalescence with signal-to-noise ratio 12.9; the Virgo detector was also taking data, which did not contribute to detection due to a low signal-to-noise ratio but was used for subsequent parameter estimation, as discussed by the authors.
Abstract: On 2019 April 25, the LIGO Livingston detector observed a compact binary coalescence with signal-to-noise ratio 12.9. The Virgo detector was also taking data that did not contribute to detection due to a low signal-to-noise ratio, but were used for subsequent parameter estimation. The 90% credible intervals for the component masses range from 1.12 to 2.52 M⊙ (1.46 to 1.87 M⊙ if we restrict the dimensionless component spin magnitudes to be smaller than 0.05). These mass parameters are consistent with the individual binary components being neutron stars. However, both the source-frame chirp mass and the total mass of this system are significantly larger than those of any other known binary neutron star (BNS) system. The possibility that one or both binary components of the system are black holes cannot be ruled out from gravitational-wave data. We discuss possible origins of the system based on its inconsistency with the known Galactic BNS population. Under the assumption that the signal was produced by a BNS coalescence, the local rate of neutron star mergers is updated to 250–2810 Gpc−3 yr−1.
1,189 citations
••
26 Nov 2020
TL;DR: This Primer provides an overview of the epidemiology, pathogenesis and treatment of HNSCCs of different aetiologies and the effects of the cancer and its treatment on patient quality of life.
Abstract: Most head and neck cancers are derived from the mucosal epithelium in the oral cavity, pharynx and larynx and are known collectively as head and neck squamous cell carcinoma (HNSCC). Oral cavity and larynx cancers are generally associated with tobacco consumption, alcohol abuse or both, whereas pharynx cancers are increasingly attributed to infection with human papillomavirus (HPV), primarily HPV-16. Thus, HNSCC can be separated into HPV-negative and HPV-positive HNSCC. Despite evidence of histological progression from cellular atypia through various degrees of dysplasia, ultimately leading to invasive HNSCC, most patients are diagnosed with late-stage HNSCC without a clinically evident antecedent pre-malignant lesion. Traditional staging of HNSCC using the tumour–node–metastasis system has been supplemented by the 2017 AJCC/UICC staging system, which incorporates additional information relevant to HPV-positive disease. Treatment is generally multimodal, consisting of surgery followed by chemoradiotherapy (CRT) for oral cavity cancers and primary CRT for pharynx and larynx cancers. The EGFR monoclonal antibody cetuximab is generally used in combination with radiation in HPV-negative HNSCC where comorbidities prevent the use of cytotoxic chemotherapy. The FDA approved the immune checkpoint inhibitors pembrolizumab and nivolumab for treatment of recurrent or metastatic HNSCC and pembrolizumab as primary treatment for unresectable disease. Elucidation of the molecular genetic landscape of HNSCC over the past decade has revealed new opportunities for therapeutic intervention. Ongoing efforts aim to integrate our understanding of HNSCC biology and immunobiology to identify predictive biomarkers that will enable delivery of the most effective, least-toxic therapies. Head and neck squamous cell carcinomas (HNSCCs) originate from the mucosal epithelium in the oral cavity, pharynx and larynx and are commonly associated with viral infection and tobacco use. 
1,152 citations
••
Université Paris-Saclay1, Commonwealth Scientific and Industrial Research Organisation2, Goddard Space Flight Center3, Stanford University4, Yale University5, National Oceanic and Atmospheric Administration6, VU University Amsterdam7, Netherlands Institute for Space Research8, Japan Agency for Marine-Earth Science and Technology9, Chiba University10, Linköping University11, University of California, Irvine12, National Institute of Water and Atmospheric Research13, New York University14, Seconda Università degli Studi di Napoli15, École Polytechnique16, Stockholm University17, Skidmore College18, University of Victoria19, National Institute of Geophysics and Volcanology20, Babeș-Bolyai University21, California Institute of Technology22, Met Office23, University of Reading24, International Institute for Applied Systems Analysis25, National Institute for Environmental Studies26, City University of New York27, University of Bern28, Max Planck Society29, Purdue University30, European Centre for Medium-Range Weather Forecasts31, Lund University32, University of Bristol33, Geophysical Fluid Dynamics Laboratory34, University of Leicester35, Université du Québec à Montréal36, Peking University37, Massachusetts Institute of Technology38, Lawrence Berkeley National Laboratory39, Southern Cross University40, Auburn University41, Joint Global Change Research Institute42, Food and Agriculture Organization43, Finnish Meteorological Institute44, Technical University of Crete45, Imperial College London46, University of Rochester47, Royal Netherlands Meteorological Institute48, Scripps Institution of Oceanography49, University of Toronto50, University of Maryland, College Park51, Hohai University52
TL;DR: The second version of the living review paper dedicated to the decadal methane budget, integrating results of top-down studies (atmospheric observations within an atmospheric inverse-modeling framework) and bottom-up estimates (including process-based models for estimating land surface emissions and atmospheric chemistry, inventories of anthropogenic emissions, and data-driven extrapolations) as discussed by the authors.
Abstract: Understanding and quantifying the global methane (CH4) budget is important for assessing realistic pathways to mitigate climate change. Atmospheric emissions and concentrations of CH4 continue to increase, making CH4 the second most important human-influenced greenhouse gas in terms of climate forcing, after carbon dioxide (CO2). The relative importance of CH4 compared to CO2 depends on its shorter atmospheric lifetime, stronger warming potential, and variations in atmospheric growth rate over the past decade, the causes of which are still debated. Two major challenges in reducing uncertainties in the atmospheric growth rate arise from the variety of geographically overlapping CH4 sources and from the destruction of CH4 by short-lived hydroxyl radicals (OH). To address these challenges, we have established a consortium of multidisciplinary scientists under the umbrella of the Global Carbon Project to synthesize and stimulate new research aimed at improving and regularly updating the global methane budget. Following Saunois et al. (2016), we present here the second version of the living review paper dedicated to the decadal methane budget, integrating results of top-down studies (atmospheric observations within an atmospheric inverse-modelling framework) and bottom-up estimates (including process-based models for estimating land surface emissions and atmospheric chemistry, inventories of anthropogenic emissions, and data-driven extrapolations).
For the 2008–2017 decade, global methane emissions are estimated by atmospheric inversions (a top-down approach) to be 576 Tg CH4 yr−1 (range 550–594, corresponding to the minimum and maximum estimates of the model ensemble). Of this total, 359 Tg CH4 yr−1 or ∼60 % is attributed to anthropogenic sources, that is, emissions caused by direct human activity (range 336–376 Tg CH4 yr−1 or 50 %–65 %). The mean annual total emission for the new decade (2008–2017) is 29 Tg CH4 yr−1 larger than our estimate for the previous decade (2000–2009) and 24 Tg CH4 yr−1 larger than the value reported in the previous budget for 2003–2012 (Saunois et al., 2016). Since 2012, global CH4 emissions have been tracking the warmest scenarios assessed by the Intergovernmental Panel on Climate Change. Bottom-up methods suggest almost 30 % larger global emissions (737 Tg CH4 yr−1, range 594–881) than top-down inversion methods. Indeed, bottom-up estimates for natural sources such as natural wetlands, other inland water systems, and geological sources are higher than top-down estimates. The atmospheric constraints on the top-down budget suggest that at least some of these bottom-up emissions are overestimated. The latitudinal distribution of atmospheric-observation-based emissions indicates a predominance of tropical emissions (∼65 % of the global budget, < 30° N) compared to mid-latitudes (∼30 %, 30–60° N) and high northern latitudes (∼4 %, 60–90° N). The most important source of uncertainty in the methane budget is attributable to natural emissions, especially those from wetlands and other inland waters.
Some of our global source estimates are smaller than those in previously published budgets (Saunois et al., 2016; Kirschke et al., 2013). In particular, wetland emissions are about 35 Tg CH4 yr−1 lower owing to an improved partitioning between wetlands and other inland waters. Emissions from geological sources and wild animals are also found to be smaller, by 7 and 8 Tg CH4 yr−1, respectively. However, the overall discrepancy between bottom-up and top-down estimates has been reduced by only 5 % compared to Saunois et al. (2016), due to a higher estimate of emissions from inland waters, highlighting the need for more detailed research on emission factors. Priorities for improving the methane budget include (i) a global, high-resolution map of water-saturated soils and inundated areas emitting methane based on a robust classification of different types of emitting habitats; (ii) further development of process-based models for inland-water emissions; (iii) intensification of methane observations at local scales (e.g., FLUXNET-CH4 measurements) and urban-scale monitoring to constrain bottom-up land surface models, and at regional scales (surface networks and satellites) to constrain atmospheric inversions; (iv) improvements of transport models and the representation of photochemical sinks in top-down inversions; and (v) development of a 3D variational inversion system using isotopic and/or co-emitted species such as ethane to improve source partitioning.
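The headline top-down versus bottom-up comparison can be reproduced from the rounded decadal totals quoted above. This is a rough consistency check only; the abstract rounds the anthropogenic share of the same ratio to ∼60 %.

```python
# Rough consistency check on the 2008-2017 decadal totals quoted above
# (all values in Tg CH4 per year, rounded as reported).
top_down = 576       # atmospheric-inversion (top-down) estimate
bottom_up = 737      # bottom-up estimate
anthropogenic = 359  # anthropogenic portion of the top-down total

gap_pct = (bottom_up / top_down - 1) * 100  # bottom-up excess over top-down
anth_pct = anthropogenic / top_down * 100   # anthropogenic fraction

print(round(gap_pct))   # 28, i.e. "almost 30 %" larger, as stated
print(round(anth_pct))  # 62, which the abstract quotes as ~60 %
```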
1,047 citations
••
TL;DR: In this paper, the authors reported the observation of a compact binary coalescence involving a 22.2–24.3 M⊙ black hole and a compact object with a mass of 2.50–2.67 M⊙ (all measurements quoted at the 90% credible level). The gravitational-wave signal, GW190814, was observed during LIGO's and Virgo's third observing run on 2019 August 14 at 21:10:39 UTC and has a signal-to-noise ratio of 25 in the three-detector network.
Abstract: We report the observation of a compact binary coalescence involving a 22.2–24.3 M⊙ black hole and a compact object with a mass of 2.50–2.67 M⊙ (all measurements quoted at the 90% credible level). The gravitational-wave signal, GW190814, was observed during LIGO's and Virgo's third observing run on 2019 August 14 at 21:10:39 UTC and has a signal-to-noise ratio of 25 in the three-detector network. The source was localized to 18.5 deg2 at a distance of ${241}_{-45}^{+41}$ Mpc; no electromagnetic counterpart has been confirmed to date. The source has the most unequal mass ratio yet measured with gravitational waves, ${0.112}_{-0.009}^{+0.008}$, and its secondary component is either the lightest black hole or the heaviest neutron star ever discovered in a double compact-object system. The dimensionless spin of the primary black hole is tightly constrained to ≤0.07. Tests of general relativity reveal no measurable deviations from the theory, and its prediction of higher-multipole emission is confirmed at high confidence. We estimate a merger rate density of 1–23 Gpc−3 yr−1 for the new class of binary coalescence sources that GW190814 represents. Astrophysical models predict that binaries with mass ratios similar to this event can form through several channels, but are unlikely to have formed in globular clusters. However, the combination of mass ratio, component masses, and the inferred merger rate for this event challenges all current models of the formation and mass distribution of compact-object binaries.
913 citations
••
TL;DR: The extent of the trait data compiled in TRY is evaluated and emerging patterns of data coverage and representativeness are analyzed to conclude that reducing data gaps and biases in the TRY database remains a key challenge and requires a coordinated approach to data mobilization and trait measurements.
Abstract: Plant traits (the morphological, anatomical, physiological, biochemical and phenological characteristics of plants) determine how plants respond to environmental factors, affect other trophic levels, and influence ecosystem properties and their benefits and detriments to people. Plant trait data thus represent the basis for a vast area of research spanning from evolutionary biology, community and functional ecology, to biodiversity conservation, ecosystem and landscape management, restoration, biogeography and earth system modelling. Since its foundation in 2007, the TRY database of plant traits has grown continuously. It now provides unprecedented data coverage under an open access data policy and is the main plant trait database used by the research community worldwide. Increasingly, the TRY database also supports new frontiers of trait-based plant research, including the identification of data gaps and the subsequent mobilization or measurement of new data. To support this development, in this article we evaluate the extent of the trait data compiled in TRY and analyse emerging patterns of data coverage and representativeness. The best species coverage is achieved for categorical traits, with almost complete coverage for 'plant growth form'. However, most traits relevant for ecology and vegetation modelling are characterized by continuous intraspecific variation and trait-environmental relationships. These traits have to be measured on individual plants in their respective environment. Despite unprecedented data coverage, we observe a humbling lack of completeness and representativeness of these continuous traits in many aspects. We therefore conclude that reducing data gaps and biases in the TRY database remains a key challenge and requires a coordinated approach to data mobilization and trait measurements. This can only be achieved in collaboration with other initiatives.
882 citations
••
TL;DR: It is inferred that the primary black hole mass lies within the gap produced by (pulsational) pair-instability supernova processes, with only a 0.32% probability of being below 65 M⊙; the 142 M⊙ remnant can be considered an intermediate-mass black hole (IMBH).
Abstract: On May 21, 2019 at 03:02:29 UTC Advanced LIGO and Advanced Virgo observed a short duration gravitational-wave signal, GW190521, with a three-detector network signal-to-noise ratio of 14.7, and an estimated false-alarm rate of 1 in 4900 yr using a search sensitive to generic transients. If GW190521 is from a quasicircular binary inspiral, then the detected signal is consistent with the merger of two black holes with masses of 85_{-14}^{+21} M_{⊙} and 66_{-18}^{+17} M_{⊙} (90% credible intervals). We infer that the primary black hole mass lies within the gap produced by (pulsational) pair-instability supernova processes, with only a 0.32% probability of being below 65 M_{⊙}. We calculate the mass of the remnant to be 142_{-16}^{+28} M_{⊙}, which can be considered an intermediate mass black hole (IMBH). The luminosity distance of the source is 5.3_{-2.6}^{+2.4} Gpc, corresponding to a redshift of 0.82_{-0.34}^{+0.28}. The inferred rate of mergers similar to GW190521 is 0.13_{-0.11}^{+0.30} Gpc^{-3} yr^{-1}.
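A quick check on the quoted masses: the difference between the median component masses and the median remnant mass gives a rough estimate of the energy radiated in gravitational waves. This is median-value arithmetic only; the paper's correlated posteriors yield a somewhat different central estimate.

```python
# Median masses quoted above, in solar masses.
m1, m2 = 85, 66   # component black holes
m_remnant = 142   # remnant intermediate-mass black hole

e_radiated = (m1 + m2) - m_remnant  # radiated energy, in units of M_sun * c^2
print(e_radiated)  # 9
```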
876 citations
••
TL;DR: The FLUXNET2015 dataset provides ecosystem-scale data on CO2, water, and energy exchange between the biosphere and the atmosphere, along with other meteorological and biological measurements, from 212 sites around the globe, as detailed in this paper.
Abstract: The FLUXNET2015 dataset provides ecosystem-scale data on CO2, water, and energy exchange between the biosphere and the atmosphere, and other meteorological and biological measurements, from 212 sites around the globe (over 1500 site-years, up to and including year 2014). These sites, independently managed and operated, voluntarily contributed their data to create global datasets. Data were quality controlled and processed using uniform methods, to improve consistency and intercomparability across sites. The dataset is already being used in a number of applications, including ecophysiology studies, remote sensing studies, and development of ecosystem and Earth system models. FLUXNET2015 includes derived-data products, such as gap-filled time series, ecosystem respiration and photosynthetic uptake estimates, estimation of uncertainties, and metadata about the measurements, presented for the first time in this paper. In addition, 206 of these sites are for the first time distributed under a Creative Commons (CC-BY 4.0) license. This paper details this enhanced dataset and the processing methods, now made available as open-source codes, making the dataset more accessible, transparent, and reproducible.
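Gap-filled time series are one of the derived products mentioned above. As a purely illustrative sketch (this toy linear scheme is not the FLUXNET2015 processing method, which uses more sophisticated gap-filling), interior gaps in a flux record could be filled like this:

```python
# Toy linear gap-filler for a flux time series with missing values (None).
# Illustrative only: not the FLUXNET2015 pipeline.
def linear_gap_fill(series):
    filled = list(series)
    i = 0
    while i < len(filled):
        if filled[i] is None:
            j = i
            while j < len(filled) and filled[j] is None:
                j += 1                      # find the end of the gap
            if 0 < i and j < len(filled):   # interior gap: interpolate
                left, right = filled[i - 1], filled[j]
                for k in range(i, j):
                    filled[k] = left + (right - left) * (k - i + 1) / (j - i + 1)
            i = j                           # gaps at the edges are left as-is
        else:
            i += 1
    return filled

print(linear_gap_fill([1.0, None, None, 4.0]))  # [1.0, 2.0, 3.0, 4.0]
```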
681 citations
••
Auburn University1, Commonwealth Scientific and Industrial Research Organisation2, Norwegian Institute for Air Research3, International Institute for Applied Systems Analysis4, University of Zielona Góra5, University of East Anglia6, University of Maryland Center for Environmental Science7, Centre national de la recherche scientifique8, Stanford University9, Ghent University10, University of California, Irvine11, Université libre de Bruxelles12, Food and Agriculture Organization13, Max Planck Society14, Peking University15, Karlsruhe Institute of Technology16, University of Bern17, University of Toulouse18, École Normale Supérieure19, Utrecht University20, Ocean University of China21, Netherlands Environmental Assessment Agency22, Zhejiang University23, University of Leeds24, Woods Hole Research Center25, National Oceanic and Atmospheric Administration26, Southern Cross University27, Beijing Normal University28, Chinese Academy of Sciences29, National Institute for Environmental Studies30, Leibniz Institute of Marine Sciences31, Université Paris-Saclay32, Tsinghua University33, Oeschger Centre for Climate Change Research34, Yale University35, Scotland's Rural College36, University of Minnesota37, Lund University38, Chiba University39, Japan Agency for Marine-Earth Science and Technology40, Massachusetts Institute of Technology41, VU University Amsterdam42, University of California, San Diego43, Mississippi State University44
TL;DR: A global N2O inventory is presented that incorporates both natural and anthropogenic sources and accounts for the interaction between nitrogen additions and the biochemical processes that control N2O emissions, using bottom-up, top-down and process-based model approaches.
Abstract: Nitrous oxide (N2O), like carbon dioxide, is a long-lived greenhouse gas that accumulates in the atmosphere. Over the past 150 years, increasing atmospheric N2O concentrations have contributed to stratospheric ozone depletion1 and climate change2, with the current rate of increase estimated at 2 per cent per decade. Existing national inventories do not provide a full picture of N2O emissions, owing to their omission of natural sources and limitations in methodology for attributing anthropogenic sources. Here we present a global N2O inventory that incorporates both natural and anthropogenic sources and accounts for the interaction between nitrogen additions and the biochemical processes that control N2O emissions. We use bottom-up (inventory, statistical extrapolation of flux measurements, process-based land and ocean modelling) and top-down (atmospheric inversion) approaches to provide a comprehensive quantification of global N2O sources and sinks resulting from 21 natural and human sectors between 1980 and 2016. Global N2O emissions were 17.0 (minimum–maximum estimates: 12.2–23.5) teragrams of nitrogen per year (bottom-up) and 16.9 (15.9–17.7) teragrams of nitrogen per year (top-down) between 2007 and 2016. Global human-induced emissions, which are dominated by nitrogen additions to croplands, increased by 30% over the past four decades to 7.3 (4.2–11.4) teragrams of nitrogen per year. This increase was mainly responsible for the growth in the atmospheric burden. Our findings point to growing N2O emissions in emerging economies, particularly Brazil, China and India. Analysis of process-based model estimates reveals an emerging N2O–climate feedback resulting from interactions between nitrogen additions and climate change. The recent growth in N2O emissions exceeds some of the highest projected emission scenarios3,4, underscoring the urgency to mitigate N2O emissions.
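The 30% growth figure quoted above implies a level of human-induced emissions four decades earlier, which can be backed out directly. This is a rough back-calculation from the rounded numbers in the abstract, not a value reported by the paper.

```python
# Back-calculate the circa-1980 level of human-induced N2O emissions implied
# by the abstract's figures (rough arithmetic, not a number from the paper).
current_tg_n = 7.3  # human-induced emissions, recent mean (Tg N per year)
growth = 0.30       # quoted growth over the past four decades

earlier_tg_n = current_tg_n / (1 + growth)
print(round(earlier_tg_n, 1))  # 5.6 Tg N per year implied around 1980
```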
650 citations
••
Ben-Gurion University of the Negev1, Helmholtz-Zentrum Berlin2, National Renewable Energy Laboratory3, University of Erlangen-Nuremberg4, Forschungszentrum Jülich5, University of Rome Tor Vergata6, Massachusetts Institute of Technology7, Princeton University8, Chulalongkorn University9, Wuhan University of Technology10, Karlsruhe Institute of Technology11, University of Grenoble12, Commonwealth Scientific and Industrial Research Organisation13, University of Michigan14, Sapienza University of Rome15, École Polytechnique Fédérale de Lausanne16, VU University Amsterdam17, University of Jena18, Bangor University19, University of Maryland, College Park20, University of California, Davis21, Dalian Institute of Chemical Physics22, Shaanxi Normal University23, Chinese Academy of Sciences24, University of Southern Denmark25, University of Colorado Boulder26, State University of Campinas27, Boğaziçi University28, Sungkyunkwan University29, Swansea University30, Technische Universität Darmstadt31, University of Oxford32, University of Cambridge33, Skolkovo Institute of Science and Technology34, Yonsei University35, Imperial College London36
TL;DR: A consensus between researchers in the field is reported on procedures for testing perovskite solar cell stability, which are based on the International Summit on Organic Photovoltaic Stability (ISOS) protocols, and additional procedures to account for properties specific to PSCs are proposed.
Abstract: Improving the long-term stability of perovskite solar cells (PSCs) is critical to the deployment of this technology. Despite the great emphasis laid on stability-related investigations, publications lack consistency in the experimental procedures and parameters reported. It is therefore challenging to reproduce and compare results and thereby develop a deep understanding of degradation mechanisms. Here, we report a consensus between researchers in the field on procedures for testing perovskite solar cell stability, which are based on the International Summit on Organic Photovoltaic Stability (ISOS) protocols. We propose additional procedures to account for properties specific to PSCs, such as ion redistribution under electric fields and reversible degradation, and to distinguish ambient-induced degradation from other stress factors. These protocols are not intended as a replacement for the existing qualification standards; rather, they aim to unify stability assessment and to understand failure modes. Finally, we identify key procedural information which we suggest reporting in publications to improve reproducibility and enable large-data-set analysis. Reliability of stability data for perovskite solar cells is undermined by a lack of consistency in test conditions and reporting. This Consensus Statement outlines practices for testing and reporting stability, tailoring the ISOS protocols to perovskite devices.
••
Queen's University1, University of Texas Southwestern Medical Center2, Osaka University3, Ben-Gurion University of the Negev4, VU University Amsterdam5, University of Milan6, University of São Paulo7, Laval University8, Harvard University9, University of Surrey10, University of Padua11, University of New South Wales12, University of Colorado Denver13
TL;DR: The evidence is summarized that waist circumference and BMI together can provide improved assessments of cardiometabolic risk compared with either measurement alone, and it is recommended that health professionals are trained to properly perform this simple measurement in clinical practice.
Abstract: Despite decades of unequivocal evidence that waist circumference provides both independent and additive information to BMI for predicting morbidity and risk of death, this measurement is not routinely obtained in clinical practice. This Consensus Statement proposes that measurements of waist circumference afford practitioners with an important opportunity to improve the management and health of patients. We argue that BMI alone is not sufficient to properly assess or manage the cardiometabolic risk associated with increased adiposity in adults and provide a thorough review of the evidence that will empower health practitioners and professional societies to routinely include waist circumference in the evaluation and management of patients with overweight or obesity. We recommend that decreases in waist circumference are a critically important treatment target for reducing adverse health risks for both men and women. Moreover, we describe evidence that clinically relevant reductions in waist circumference can be achieved by routine, moderate-intensity exercise and/or dietary interventions. We identify gaps in the knowledge, including the refinement of waist circumference threshold values for a given BMI category, to optimize obesity risk stratification across age, sex and ethnicity. We recommend that health professionals are trained to properly perform this simple measurement and consider it as an important 'vital sign' in clinical practice.
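The Statement's core argument, that waist circumference adds risk information BMI alone misses, can be sketched as a minimal screening check. The 102 cm / 88 cm cut-offs below are the widely used WHO-style thresholds, and the BMI ≥ 30 obesity threshold is the standard one; the Statement argues such cut-offs should be refined by BMI category, age, sex and ethnicity, so treat these values and function names as illustrative placeholders, not the authors' algorithm:

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight (kg) divided by height (m) squared."""
    return weight_kg / height_m ** 2

def elevated_waist(waist_cm: float, sex: str) -> bool:
    """Sex-specific waist-circumference cut-offs (cm), WHO-style.

    102 cm (men) / 88 cm (women) are common thresholds; the Consensus
    Statement recommends refining them per BMI category, age, sex and
    ethnicity.
    """
    return waist_cm >= (102 if sex == "male" else 88)

def flag_cardiometabolic_risk(weight_kg: float, height_m: float,
                              waist_cm: float, sex: str) -> bool:
    """Combine both measurements rather than relying on BMI alone."""
    high_bmi = bmi(weight_kg, height_m) >= 30
    return high_bmi or elevated_waist(waist_cm, sex)
```

A patient with a normal BMI but an elevated waist circumference would be flagged here, which BMI-only screening would miss, illustrating the "additive information" the abstract describes.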
••
TL;DR: With extended follow-up, the impact of SABR on OS was larger in magnitude than in the initial analysis and durable over time; there were no new safety signals, and SABR had no detrimental impact on QOL.
Abstract: PURPOSE The oligometastatic paradigm hypothesizes that patients with a limited number of metastases may achieve long-term disease control, or even cure, if all sites of disease can be ablated. Howe...
••
University of Cologne1, German Center for Neurodegenerative Diseases2, Brigham and Women's Hospital3, University of Melbourne4, Harvard University5, VU University Amsterdam6, University of Barcelona7, Yeshiva University8, City University of New York9, Indiana University10, University of Victoria11, University Hospital Bonn12
TL;DR: The majority of individuals with SCD will not show progressive cognitive decline, and an individually tailored diagnostic process might be reasonable to identify or exclude underlying medical conditions in an individual with SCD who actively seeks medical help.
Abstract: A growing awareness about brain health and Alzheimer's disease in the general population is leading to an increasing number of cognitively unimpaired individuals, who are concerned that they have reduced cognitive function, to approach the medical system for help. The term subjective cognitive decline (SCD) was conceived in 2014 to describe this condition. Epidemiological data provide evidence that the risk for mild cognitive impairment and dementia is increased in individuals with SCD. However, the majority of individuals with SCD will not show progressive cognitive decline. An individually tailored diagnostic process might be reasonable to identify or exclude underlying medical conditions in an individual with SCD who actively seeks medical help. An increasing number of studies are investigating the link between SCD and the very early stages of Alzheimer's disease and other neurodegenerative diseases.
••
TL;DR: The COVID-19 pandemic: The 'black swan' for mental health care and a turning point for e-health.
••
TL;DR: In this article, the authors reported the observation of gravitational waves from a binary-black-hole coalescence during the first two weeks of LIGO and Virgo's third observing run.
Abstract: We report the observation of gravitational waves from a binary-black-hole coalescence during the first two weeks of LIGO’s and Virgo’s third observing run. The signal was recorded on April 12, 2019 at 05:30:44 UTC with a network signal-to-noise ratio of 19. The binary is different from observations during the first two observing runs most notably due to its asymmetric masses: a ∼30 M⊙ black hole merged with a ∼8 M⊙ black hole companion. The more massive black hole rotated with a dimensionless spin magnitude between 0.22 and 0.60 (90% probability). Asymmetric systems are predicted to emit gravitational waves with stronger contributions from higher multipoles, and indeed we find strong evidence for gravitational radiation beyond the leading quadrupolar order in the observed signal. A suite of tests performed on GW190412 indicates consistency with Einstein’s general theory of relativity. While the mass ratio of this system differs from all previous detections, we show that it is consistent with the population model of stellar binary black holes inferred from the first two observing runs.
••
TL;DR: The authors presented the global climate model IPSL-CM6A-LR developed at the Institut Pierre-Simon Laplace (IPSL) to study natural climate variability and climate response to natural and anthropogenic forcings as part of the sixth phase of the Coupled Model Intercomparison Project (CMIP6).
Abstract: This study presents the global climate model IPSL-CM6A-LR developed at Institut Pierre-Simon Laplace (IPSL) to study natural climate variability and climate response to natural and anthropogenic forcings as part of the sixth phase of the Coupled Model Intercomparison Project (CMIP6). This article describes the different model components, their coupling, and the simulated climate in comparison to previous model versions. We focus here on the representation of the physical climate along with the main characteristics of the global carbon cycle. The model's climatology, as assessed from a range of metrics (related in particular to radiation, temperature, precipitation, and wind), is strongly improved in comparison to previous model versions. Although they are reduced, a number of known biases and shortcomings (e.g., double Intertropical Convergence Zone [ITCZ], frequency of midlatitude wintertime blockings, and El Niño–Southern Oscillation [ENSO] dynamics) persist. The equilibrium climate sensitivity and transient climate response have both increased from the previous climate model IPSL-CM5A-LR used in CMIP5. A large ensemble of more than 30 members for the historical period (1850–2018) and a smaller ensemble for a range of emissions scenarios (until 2100 and 2300) are also presented and discussed.
••
TL;DR: In this paper, a framework based on 10 common circular economy strategies (i.e. recover, recycling, repurpose, remanufacture, refurbish, repair, reuse, reduce, rethink, refuse) is applied to scrutinise the selected targets.
Abstract: The transition to a circular economy requires actions and policies. In the praxis of governance, a common way to steer the transition to a different state proceeds through the setting of targets. Thus far, no study has investigated circular economy targets in a systematic way. To bridge this gap, this study examines which targets can facilitate the transition towards a circular economy. The analysis focuses both on existing and new targets; the latter complement existing targets which are limited to a few discrete cases addressing only partially the goal of a more circular economy. A framework based on 10 common circular economy strategies (i.e. recover, recycling, repurpose, remanufacture, refurbish, repair, re-use, reduce, rethink, refuse) is applied to scrutinise the selected targets. The study clarifies that existing targets for recovery and recycling do not necessarily promote a circular economy, though they are the most commonly applied targets so far. Because of lack of efficacy of recovery and recycling, targets should instead favour other more powerful circular economy strategies. In relation to these, the study looks into new and existing targets showing how they can reduce waste, increase efficiency, close production loops, and maximise retention of the economic value of materials and products. In particular, the study proposes an expanded set of brand new targets for the transition to a circular economy together with a fresh view on targets aimed at scholars and decision-makers alike.
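The argument that recovery and recycling targets are the weakest of the ten strategies rests on the strategies forming an ordered hierarchy. As a rough illustration, they can be encoded as a ranked list; the ordering below follows the common 10R hierarchy (refuse most circular, recover least), and the function names are ours, not the paper's:

```python
# The ten R-strategies, ordered from least to most circular.
# Ordering follows the common 10R hierarchy; a sketch, not
# necessarily the paper's exact scheme.
R_STRATEGIES = [
    "recover", "recycling", "repurpose", "remanufacture", "refurbish",
    "repair", "reuse", "reduce", "rethink", "refuse",
]

def circularity_rank(strategy: str) -> int:
    """Higher rank = more powerful circular-economy strategy."""
    return R_STRATEGIES.index(strategy)

def prefer(target_a: str, target_b: str) -> str:
    """Pick the target that addresses the more powerful strategy."""
    return max(target_a, target_b, key=circularity_rank)
```

Under this ranking, a "reduce" target would always be preferred over a "recycling" target, mirroring the study's conclusion that targets should favour the more powerful strategies.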
••
Oeschger Centre for Climate Change Research1, University of Bern2, University of Adelaide3, University of Reading4, Columbia University5, California Institute of Technology6, Lamont–Doherty Earth Observatory7, VU University Amsterdam8, University of California, Irvine9, École Normale Supérieure10, École des ponts ParisTech11, Leipzig University12, Max Planck Society13, University of Graz14, University of Lisbon15, University of New South Wales16, Vrije Universiteit Brussel17, University of Geneva18
TL;DR: A typology of compound events is proposed, distinguishing events that are preconditioned, multivariate, temporally compounding and spatially compounding, and suggests analytical and modelling approaches to aid in their investigation.
Abstract: Compound weather and climate events describe combinations of multiple climate drivers and/or hazards that contribute to societal or environmental risk. Although many climate-related disasters are caused by compound events, the understanding, analysis, quantification and prediction of such events is still in its infancy. In this Review, we propose a typology of compound events and suggest analytical and modelling approaches to aid in their investigation. We organize the highly diverse compound event types according to four themes: preconditioned, where a weather-driven or climate-driven precondition aggravates the impacts of a hazard; multivariate, where multiple drivers and/or hazards lead to an impact; temporally compounding, where a succession of hazards leads to an impact; and spatially compounding, where hazards in multiple connected locations cause an aggregated impact. Through structuring compound events and their respective analysis tools, the typology offers an opportunity for deeper insight into their mechanisms and impacts, benefiting the development of effective adaptation strategies. However, the complex nature of compound events results in some cases inevitably fitting into more than one class, necessitating soft boundaries within the typology. Future work must homogenize the available analytical approaches into a robust toolset for compound-event analysis under present and future climate conditions. Research on compound events has increased vastly in the last several years, yet, a typology was absent. This Review proposes a comprehensive classification scheme, incorporating compound events that are preconditioned, multivariate, temporally compounding and spatially compounding events.
••
Katrina L. Grasby1, Neda Jahanshad2, Jodie N. Painter1, Lucía Colodro-Conde3 +356 more•Institutions (115)
TL;DR: Results support the radial unit hypothesis that different developmental mechanisms promote surface area expansion and increases in thickness and find evidence that brain structure is a key phenotype along the causal pathway that leads from genetic variation to differences in general cognitive function.
Abstract: The cerebral cortex underlies our complex cognitive capabilities, yet little is known about the specific genetic loci that influence human cortical structure. To identify genetic variants that affect cortical structure, we conducted a genome-wide association meta-analysis of brain magnetic resonance imaging data from 51,665 individuals. We analyzed the surface area and average thickness of the whole cortex and 34 regions with known functional specializations. We identified 199 significant loci and found significant enrichment for loci influencing total surface area within regulatory elements that are active during prenatal cortical development, supporting the radial unit hypothesis. Loci that affect regional surface area cluster near genes in Wnt signaling pathways, which influence progenitor expansion and areal identity. Variation in cortical structure is genetically correlated with cognitive function, Parkinson's disease, insomnia, depression, neuroticism, and attention deficit hyperactivity disorder.
••
TL;DR: The GNW hypothesis proposes that, in the conscious state, a non-linear network ignition associated with recurrent processing amplifies and sustains a neural representation, allowing the corresponding information to be globally accessed by local processors.
••
TL;DR: Plasma pTau181 concentrations are elevated specifically in patients diagnosed with Alzheimer’s disease compared to those diagnosed with frontotemporal lobar degeneration or elderly controls, supporting its further development as a blood-based biomarker for AD.
Abstract: With the potential development of new disease-modifying Alzheimer's disease (AD) therapies, simple, widely available screening tests are needed to identify which individuals, who are experiencing symptoms of cognitive or behavioral decline, should be further evaluated for initiation of treatment. A blood-based test for AD would be a less invasive and less expensive screening tool than the currently approved cerebrospinal fluid or amyloid β positron emission tomography (PET) diagnostic tests. We examined whether plasma tau phosphorylated at residue 181 (pTau181) could differentiate between clinically diagnosed or autopsy-confirmed AD and frontotemporal lobar degeneration. Plasma pTau181 concentrations were increased by 3.5-fold in AD compared to controls and differentiated AD from both clinically diagnosed (receiver operating characteristic area under the curve of 0.894) and autopsy-confirmed frontotemporal lobar degeneration (area under the curve of 0.878). Plasma pTau181 identified individuals who were amyloid β-PET-positive regardless of clinical diagnosis and correlated with cortical tau protein deposition measured by 18F-flortaucipir PET. Plasma pTau181 may be useful to screen for tau pathology associated with AD.
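The reported AUCs (0.894 and 0.878) summarize how well plasma pTau181 separates the diagnostic groups: ROC AUC equals the probability that a randomly chosen positive case (here, AD) has a higher biomarker value than a randomly chosen comparison case. A minimal sketch of that equivalence, using invented concentrations rather than the study's data:

```python
def auc(pos_scores, neg_scores):
    """ROC AUC via the Mann-Whitney U statistic: the fraction of
    positive/negative pairs where the positive scores higher
    (ties count half)."""
    wins = sum((p > n) + 0.5 * (p == n)
               for p in pos_scores for n in neg_scores)
    return wins / (len(pos_scores) * len(neg_scores))

# toy pTau181-like concentrations (arbitrary units, not real data)
ad = [3.5, 4.0, 2.8, 3.9]
ftld = [1.0, 1.2, 2.9, 0.8]
print(auc(ad, ftld))
```

An AUC near 0.9, as reported, means roughly nine out of ten such random pairs are ordered correctly by the biomarker.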
••
Massachusetts Institute of Technology1, University of Alaska Fairbanks2, Woods Hole Oceanographic Institution3, Alfred Wegener Institute for Polar and Marine Research4, University of Bremen5, California Institute of Technology6, National Oceanic and Atmospheric Administration7, Texas State University8, Lund University9, Pennsylvania State University10, VU University Amsterdam11, Potsdam Institute for Climate Impact Research12, Utah State University13, United States Naval Academy14, Environment Canada15, University of Gothenburg16, University of Cambridge17, Naval Postgraduate School18, University of California, Irvine19, University of Washington20, University College London21, Langley Research Center22, University of Wisconsin-Madison23, Finnish Meteorological Institute24, Leipzig University25, Columbia University26, Gwangju Institute of Science and Technology27
TL;DR: The Arctic has warmed more than twice as fast as the global average since the late twentieth century, a phenomenon known as Arctic amplification (AA), and progress has been made in understanding the mechanisms that link it to midlatitude weather variability as discussed by the authors.
Abstract: The Arctic has warmed more than twice as fast as the global average since the late twentieth century, a phenomenon known as Arctic amplification (AA). Recently, there have been considerable advances in understanding the physical contributions to AA, and progress has been made in understanding the mechanisms that link it to midlatitude weather variability. Observational studies overwhelmingly support that AA is contributing to winter continental cooling. Although some model experiments support the observational evidence, most modelling results show little connection between AA and severe midlatitude weather or suggest the export of excess heating from the Arctic to lower latitudes. Divergent conclusions between model and observational studies, and even intramodel studies, continue to obfuscate a clear understanding of how AA is influencing midlatitude weather.
••
University of Edinburgh1, University of California2, University of Sheffield3, University of Virginia4, Aarhus University5, University of California, Davis6, University of Barcelona7, Northern Arizona University8, University of Alaska Fairbanks9, Netherlands Organisation for Scientific Research10, University of Oslo11, University of Bergen12, VU University Amsterdam13, University of Exeter14, Institute of Arctic and Alpine Research15, University of Lapland16, Grand Valley State University17, University of Zurich18, Colgate University19, University of Oxford20, Open University21, Umeå University22, University of Stirling23, University of Tromsø24, Lund University25, University of Alaska Anchorage26, University of Texas at El Paso27, University of Greifswald28, University of Aberdeen29, Swiss Federal Institute for Forest, Snow and Landscape Research30
TL;DR: In this paper, a consensus is emerging that the underlying causes and future dynamics of so-called Arctic greening and browning trends are more complex, variable and inherently scale-dependent than previously thought.
Abstract: As the Arctic warms, vegetation is responding, and satellite measures indicate widespread greening at high latitudes. This ‘greening of the Arctic’ is among the world’s most important large-scale ecological responses to global climate change. However, a consensus is emerging that the underlying causes and future dynamics of so-called Arctic greening and browning trends are more complex, variable and inherently scale-dependent than previously thought. Here we summarize the complexities of observing and interpreting high-latitude greening to identify priorities for future research. Incorporating satellite and proximal remote sensing with in-situ data, while accounting for uncertainties and scale issues, will advance the study of past, present and future Arctic vegetation change.
••
25 Sep 2020
TL;DR: In patients with lethal COVID-19, an extensive systemic inflammatory response was present, with a continued presence of neutrophils and NETs, which suggests a maladaptive immune response and substantiates the evidence for immunomodulation as a target in the treatment of severe COVID-19.
Abstract: Background: Severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) targets multiple organs and causes severe coagulopathy. Histopathological organ changes might not only be attributable to a direct virus-induced effect, but also the immune response. The aims of this study were to assess the duration of viral presence, identify the extent of inflammatory response, and investigate the underlying cause of coagulopathy. Methods: This prospective autopsy cohort study was done at Amsterdam University Medical Centers (UMC), the Netherlands. With informed consent from relatives, full body autopsy was done on 21 patients with COVID-19 for whom autopsy was requested between March 9 and May 18, 2020. In addition to histopathological evaluation of organ damage, the presence of SARS-CoV-2 nucleocapsid protein and the composition of the immune infiltrate and thrombi were assessed, and all were linked to disease course. Findings: Our cohort (n=21) included 16 (76%) men, and median age was 68 years (range 41-78). Median disease course (time from onset of symptoms to death) was 22 days (range 5-44 days). In 11 patients tested for SARS-CoV-2 tropism, SARS-CoV-2 infected cells were present in multiple organs, most abundantly in the lungs, but presence in the lungs became sporadic with increased disease course. Other SARS-CoV-2-positive organs included the upper respiratory tract, heart, kidneys, and gastrointestinal tract. In histological analyses of organs (sampled from nine to 21 patients per organ), an extensive inflammatory response was present in the lungs, heart, liver, kidneys, and brain. In the brain, extensive inflammation was seen in the olfactory bulbs and medulla oblongata. Thrombi and neutrophilic plugs were present in the lungs, heart, kidneys, liver, spleen, and brain and were most frequently observed late in the disease course (15 patients with thrombi, median disease course 22 days [5-44]; ten patients with neutrophilic plugs, 21 days [5-44]). 
Neutrophilic plugs were observed in two forms: solely composed of neutrophils with neutrophil extracellular traps (NETs), or as aggregates of NETs and platelets. Interpretation: In patients with lethal COVID-19, an extensive systemic inflammatory response was present, with a continued presence of neutrophils and NETs. However, SARS-CoV-2-infected cells were only sporadically present at late stages of COVID-19. This suggests a maladaptive immune response and substantiates the evidence for immunomodulation as a target in the treatment of severe COVID-19. Funding: Amsterdam UMC Corona Research Fund.
••
TL;DR: The PCL-5 provided more detailed information about the nature and severity of symptomatology in an individual patient and was slightly better able to demonstrate clinically significant change than the OQ-SD, making it a suitable addition to routine outcome monitoring for patients with PTSD.
Abstract: BACKGROUND: The PTSD Checklist for the DSM-5 (PCL-5) may be a suitable addition to routine outcome monitoring (ROM) for patients with PTSD. AIM: To determine whether the PCL-5 is worth the extra effort its administration requires from the patient. METHOD: Pretest and retest measurements of the PCL-5 and the OQ-45 were compared head-to-head in 464 patients from the Sinai Center of Arkin. RESULTS: The correlations between scores on the two instruments were high, and analysis of variance for repeated measurements revealed no difference in responsiveness. Comparison of Cohen's d (0.49 vs 0.43) and Delta T (5.0 vs 4.4) indicated slightly better responsiveness of the PCL-5, and the proportion of recovered patients was also greater according to the PCL-5 than the OQ-SD. CONCLUSION: At first glance, the PCL-5 and the OQ-SD were equally sensitive in detecting change during treatment. However, the PCL-5 provided more detailed information about the nature and severity of symptomatology in an individual patient, and with the PCL-5 we were slightly better able to demonstrate clinically significant change than with the OQ-SD. We recommend adding the PCL-5 to ROM for patients with PTSD.
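The responsiveness comparison rests on the within-group effect size Cohen's d: mean pre-to-post change divided by a baseline standard deviation. Conventions for the denominator vary (some use the SD of the change scores instead); this sketch uses the SD of the pretest scores and invented numbers, not the study's data:

```python
from statistics import mean, stdev

def cohens_d(pre, post):
    """Within-group effect size: mean symptom reduction divided by
    the sample SD of pretest scores (one common convention; other
    conventions divide by the SD of the change scores)."""
    change = [a - b for a, b in zip(pre, post)]
    return mean(change) / stdev(pre)

# invented PCL-5-like pre/post totals, not the study's data
pre = [60, 55, 70, 65]
post = [50, 48, 60, 58]
print(round(cohens_d(pre, post), 2))
```

Values around 0.5, as reported for both instruments, correspond to a moderate average improvement relative to baseline variability.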
••
Paul M. Thompson1, Neda Jahanshad1, Christopher R.K. Ching1, Lauren E. Salminen1 +210 more•Institutions (99)
TL;DR: This review summarizes the last decade of work by the ENIGMA Consortium, a global alliance of over 1400 scientists across 43 countries, studying the human brain in health and disease, and highlights the advantages of collaborative large-scale coordinated data analyses for testing reproducibility and robustness of findings.
Abstract: This review summarizes the last decade of work by the ENIGMA (Enhancing NeuroImaging Genetics through Meta Analysis) Consortium, a global alliance of over 1400 scientists across 43 countries, studying the human brain in health and disease. Building on large-scale genetic studies that discovered the first robustly replicated genetic loci associated with brain metrics, ENIGMA has diversified into over 50 working groups (WGs), pooling worldwide data and expertise to answer fundamental questions in neuroscience, psychiatry, neurology, and genetics. Most ENIGMA WGs focus on specific psychiatric and neurological conditions, other WGs study normal variation due to sex and gender differences, or development and aging; still other WGs develop methodological pipelines and tools to facilitate harmonized analyses of "big data" (i.e., genetic and epigenetic data, multimodal MRI, and electroencephalography data). These international efforts have yielded the largest neuroimaging studies to date in schizophrenia, bipolar disorder, major depressive disorder, post-traumatic stress disorder, substance use disorders, obsessive-compulsive disorder, attention-deficit/hyperactivity disorder, autism spectrum disorders, epilepsy, and 22q11.2 deletion syndrome. More recent ENIGMA WGs have formed to study anxiety disorders, suicidal thoughts and behavior, sleep and insomnia, eating disorders, irritability, brain injury, antisocial personality and conduct disorder, and dissociative identity disorder. Here, we summarize the first decade of ENIGMA's activities and ongoing projects, and describe the successes and challenges encountered along the way. We highlight the advantages of collaborative large-scale coordinated data analyses for testing reproducibility and robustness of findings, offering the opportunity to identify brain systems involved in clinical syndromes across diverse samples and associated genetic, environmental, demographic, cognitive, and psychosocial factors.
••
TL;DR: The GW190521 signal is consistent with a binary black hole (BBH) merger source at redshift 0.8 with unusually high component masses, and the merger rate of similar systems is estimated at 0.13 (+0.30/−0.11) Gpc⁻³ yr⁻¹, as discussed by the authors.
Abstract: The gravitational-wave signal GW190521 is consistent with a binary black hole (BBH) merger source at redshift 0.8 with unusually high component masses, 85 (+21/−14) M⊙ and 66 (+17/−18) M⊙, compared to previously reported events, and shows mild evidence for spin-induced orbital precession. The primary falls in the mass gap predicted by (pulsational) pair-instability supernova theory, in the approximate range 65–120 M⊙. The probability that at least one of the black holes in GW190521 is in that range is 99.0%. The final mass of the merger (142 (+28/−16) M⊙) classifies it as an intermediate-mass black hole. Under the assumption of a quasi-circular BBH coalescence, we detail the physical properties of GW190521's source binary and its post-merger remnant, including component masses and spin vectors. Three different waveform models, as well as direct comparison to numerical solutions of general relativity, yield consistent estimates of these properties. Tests of strong-field general relativity targeting the merger-ringdown stages of the coalescence indicate consistency of the observed signal with theoretical predictions. We estimate the merger rate of similar systems to be 0.13 (+0.30/−0.11) Gpc⁻³ yr⁻¹. We discuss the astrophysical implications of GW190521 for stellar collapse and for the possible formation of black holes in the pair-instability mass gap through various channels: via (multiple) stellar coalescences, or via hierarchical mergers of lower-mass black holes in star clusters or in active galactic nuclei. We find it to be unlikely that GW190521 is a strongly lensed signal of a lower-mass black hole binary merger. We also discuss more exotic possible sources for GW190521, including a highly eccentric black hole binary, or a primordial black hole binary.
••
18 Aug 2020
TL;DR: In this paper, the authors describe recent global and regional trends in fire activity and examine projections for fire regimes in the near future, concluding that the economic and environmental impacts of vegetation fires will worsen as a result of anthropogenic climate change.
Abstract: Vegetation fires are an essential component of the Earth system but can also cause substantial economic losses, severe air pollution, human mortality and environmental damage. Contemporary fire regimes are increasingly impacted by human activities and climate change, but, owing to the complex fire–human–climate interactions and incomplete historical or long-term datasets, it is difficult to detect and project fire-regime trajectories. In this Review, we describe recent global and regional trends in fire activity and examine projections for fire regimes in the near future. Although there are large uncertainties, it is likely that the economic and environmental impacts of vegetation fires will worsen as a result of anthropogenic climate change. These effects will be particularly prominent in flammable forests in populated temperate zones, the sparsely inhabited flammable boreal zone and fire-sensitive tropical rainforests, and will contribute to greenhouse gas emissions. The impacts of increased fire activity can be mitigated through effective stewardship of fire regimes, which should be achieved through evidence-based fire management that incorporates indigenous and local knowledge, combined with planning and design of natural and urban landscapes. Increasing transdisciplinary research is needed to fully understand how Anthropocene fire regimes are changing and how humans must adapt.
••
Green Templeton College1, University of Washington2, Muhimbili University of Health and Allied Sciences3, University of Alberta Hospital4, University of Manitoba5, University of Texas Southwestern Medical Center6, University of Sydney7, Leiden University8, University of Ljubljana9, VU University Amsterdam10
TL;DR: This guideline is on the diagnosis and treatment of foot infection in persons with diabetes and updates the 2015 IWGDF infection guideline, offering 27 recommendations on various aspects of diagnosing soft tissue and bone infection.
Abstract: The International Working Group on the Diabetic Foot (IWGDF) has published evidence-based guidelines on the prevention and management of diabetic foot disease since 1999. This guideline is on the diagnosis and treatment of foot infection in persons with diabetes and updates the 2015 IWGDF infection guideline. On the basis of patient, intervention, comparison, outcomes (PICOs) developed by the infection committee, in conjunction with internal and external reviewers and consultants, and on systematic reviews the committee conducted on the diagnosis of infection (new) and treatment of infection (updated from 2015), we offer 27 recommendations. These cover various aspects of diagnosing soft tissue and bone infection, including the classification scheme for diagnosing infection and its severity. Of note, we have updated this scheme for the first time since we developed it 15 years ago. We also review the microbiology of diabetic foot infections, including how to collect samples and to process them to identify causative pathogens. Finally, we discuss the approach to treating diabetic foot infections, including selecting appropriate empiric and definitive antimicrobial therapy for soft tissue and for bone infections, when and how to approach surgical treatment, and which adjunctive treatments we think are or are not useful for the infectious aspects of diabetic foot problems. For this version of the guideline, we also updated four tables and one figure from the 2016 guideline. We think that following the principles of diagnosing and treating diabetic foot infections outlined in this guideline can help clinicians to provide better care for these patients.