Showing papers by "University of Vienna" published in 2020
••
8,699 citations
••
TL;DR: Some notable features of IQ-TREE version 2 are described and the key advantages over other software are highlighted.
Abstract: IQ-TREE (http://www.iqtree.org, last accessed February 6, 2020) is a user-friendly and widely used software package for phylogenetic inference using maximum likelihood. Since the release of version 1 in 2014, we have continuously expanded IQ-TREE to integrate a plethora of new models of sequence evolution and efficient computational approaches for phylogenetic inference to deal with genomic data. Here, we describe notable features of IQ-TREE version 2 and highlight the key advantages over other software.
4,337 citations
••
TL;DR: The extent of the trait data compiled in TRY is evaluated and emerging patterns of data coverage and representativeness are analyzed to conclude that reducing data gaps and biases in the TRY database remains a key challenge and requires a coordinated approach to data mobilization and trait measurements.
Abstract: Plant traits-the morphological, anatomical, physiological, biochemical and phenological characteristics of plants-determine how plants respond to environmental factors, affect other trophic levels, and influence ecosystem properties and their benefits and detriments to people. Plant trait data thus represent the basis for a vast area of research spanning from evolutionary biology, community and functional ecology, to biodiversity conservation, ecosystem and landscape management, restoration, biogeography and earth system modelling. Since its foundation in 2007, the TRY database of plant traits has grown continuously. It now provides unprecedented data coverage under an open access data policy and is the main plant trait database used by the research community worldwide. Increasingly, the TRY database also supports new frontiers of trait-based plant research, including the identification of data gaps and the subsequent mobilization or measurement of new data. To support this development, in this article we evaluate the extent of the trait data compiled in TRY and analyse emerging patterns of data coverage and representativeness. Best species coverage is achieved for categorical traits-almost complete coverage for 'plant growth form'. However, most traits relevant for ecology and vegetation modelling are characterized by continuous intraspecific variation and trait-environmental relationships. These traits have to be measured on individual plants in their respective environment. Despite unprecedented data coverage, we observe a humbling lack of completeness and representativeness of these continuous traits in many aspects. We, therefore, conclude that reducing data gaps and biases in the TRY database remains a key challenge and requires a coordinated approach to data mobilization and trait measurements. This can only be achieved in collaboration with other initiatives.
882 citations
••
TL;DR: The current status of the Standard Model calculation of the anomalous magnetic moment of the muon is reviewed in this paper, where the authors present a detailed account of recent efforts to improve the calculation of the two hadronic contributions with either a data-driven, dispersive approach or a first-principles, lattice approach.
801 citations
••
Graz University of Technology1, Université Paris-Saclay2, University of Waterloo3, Guizhou University4, European Food Information Council5, Institut national de la recherche agronomique6, Agricultural University of Athens7, University of Minnesota8, University of Minho9, University of Vienna10, Agriculture and Agri-Food Canada11, Rothamsted Research12, Pacific Northwest National Laboratory13, Austrian Institute of Technology14, CABI15, Tallinn University of Technology16, Wageningen University and Research Centre17, Pondicherry University18, State University of Campinas19, University of Sydney20, Teagasc21
TL;DR: A definition of microbiome is proposed based on the compact, clear, and comprehensive description of the term provided by Whipps et al. in 1988, amended with a set of novel recommendations considering the latest technological developments and research findings.
Abstract: The field of microbiome research has evolved rapidly over the past few decades and has become a topic of great scientific and public interest. As a result of this rapid growth in interest covering different fields, we are lacking a clear commonly agreed definition of the term “microbiome.” Moreover, a consensus on best practices in microbiome research is missing. Recently, a panel of international experts discussed the current gaps in the frame of the European-funded MicrobiomeSupport project. The meeting brought together about 40 leaders from diverse microbiome areas, while more than a hundred experts from all over the world took part in an online survey accompanying the workshop. This article excerpts the outcomes of the workshop and the corresponding online survey embedded in a short historical introduction and future outlook. We propose a definition of microbiome based on the compact, clear, and comprehensive description of the term provided by Whipps et al. in 1988, amended with a set of novel recommendations considering the latest technological developments and research findings. We clearly separate the terms microbiome and microbiota and provide a comprehensive discussion considering the composition of microbiota, the heterogeneity and dynamics of microbiomes in time and space, the stability and resilience of microbial networks, the definition of core microbiomes, and functionally relevant keystone species as well as co-evolutionary principles of microbe-host and inter-species interactions within the microbiome. These broad definitions together with the suggested unifying concepts will help to improve standardization of microbiome studies in the future, and could be the starting point for an integrated assessment of data resulting in a more rapid transfer of knowledge from basic science into practice. 
Furthermore, microbiome standards are important for solving new challenges associated with anthropogenic-driven changes in the field of planetary health, for which the understanding of microbiomes might play a key role.
733 citations
••
Academy of Sciences of the Czech Republic1, Stellenbosch University2, Charles University in Prague3, University of Canterbury (New Zealand)4, University of Tennessee5, University of Fribourg6, Zoological Society of London7, University College London8, Williams College9, Durham University10, University of Vienna11, South African National Parks12, International Union for Conservation of Nature and Natural Resources13, Leibniz Association14, Free University of Berlin15, Helmholtz Centre for Environmental Research - UFZ16, Martin Luther University of Halle-Wittenberg17, Czech University of Life Sciences Prague18, United States Forest Service19, University of Toronto20, University of Rhode Island21, University of Concepción22, Taizhou University23, University of Konstanz24, Spanish National Research Council25, University of Seville26, University of Pretoria27
TL;DR: Improved international cooperation is crucial to reduce the impacts of invasive alien species on biodiversity, ecosystem services, and human livelihoods, as synergies with other global changes are exacerbating current invasions and facilitating new ones, thereby escalating the extent and impacts of invaders.
Abstract: Biological invasions are a global consequence of an increasingly connected world and the rise in human population size. The numbers of invasive alien species – the subset of alien species that spread widely in areas where they are not native, affecting the environment or human livelihoods – are increasing. Synergies with other global changes are exacerbating current invasions and facilitating new ones, thereby escalating the extent and impacts of invaders. Invasions have complex and often immense long-term direct and indirect impacts. In many cases, such impacts become apparent or problematic only when invaders are well established and have large ranges. Invasive alien species break down biogeographic realms, affect native species richness and abundance, increase the risk of native species extinction, affect the genetic composition of native populations, change native animal behaviour, alter phylogenetic diversity across communities, and modify trophic networks. Many invasive alien species also change ecosystem functioning and the delivery of ecosystem services by altering nutrient and contaminant cycling, hydrology, habitat structure, and disturbance regimes. These biodiversity and ecosystem impacts are accelerating and will increase further in the future. Scientific evidence has identified policy strategies to reduce future invasions, but these strategies are often insufficiently implemented. For some nations, notably Australia and New Zealand, biosecurity has become a national priority. There have been long-term successes, such as eradication of rats and cats on increasingly large islands and biological control of weeds across continental areas. However, in many countries, invasions receive little attention. Improved international cooperation is crucial to reduce the impacts of invasive alien species on biodiversity, ecosystem services, and human livelihoods. Countries can strengthen their biosecurity regulations to implement and enforce more effective management strategies that should also address other global changes that interact with invasions.
677 citations
••
Rotem Botvinik-Nezer1, Rotem Botvinik-Nezer2, Felix Holzmeister3, Colin F. Camerer4, +217 more • Institutions (78)
TL;DR: The results obtained by seventy different teams analysing the same functional magnetic resonance imaging dataset show substantial variation, highlighting the influence of analytical choices and the importance of sharing workflows publicly and performing multiple analyses.
Abstract: Data analysis workflows in many scientific domains have become increasingly complex and flexible. Here we assess the effect of this flexibility on the results of functional magnetic resonance imaging by asking 70 independent teams to analyse the same dataset, testing the same 9 ex-ante hypotheses [1]. The flexibility of analytical approaches is exemplified by the fact that no two teams chose identical workflows to analyse the data. This flexibility resulted in sizeable variation in the results of hypothesis tests, even for teams whose statistical maps were highly correlated at intermediate stages of the analysis pipeline. Variation in reported results was related to several aspects of analysis methodology. Notably, a meta-analytical approach that aggregated information across teams yielded a significant consensus in activated regions. Furthermore, prediction markets of researchers in the field revealed an overestimation of the likelihood of significant findings, even by researchers with direct knowledge of the dataset [2–5]. Our findings show that analytical flexibility can have substantial effects on scientific conclusions, and identify factors that may be related to variability in the analysis of functional magnetic resonance imaging. The results emphasize the importance of validating and sharing complex analysis workflows, and demonstrate the need for performing and reporting multiple analyses of the same data. Potential approaches that could be used to mitigate issues related to analytical variability are discussed.
551 citations
••
Fondazione Bruno Kessler1, University of Vienna2, The Turing Institute3, University of Oxford4, Utrecht University5, Telefónica6, University of Turin7, Institute for Scientific Interchange8, University of Paris9, Free University of Berlin10, Technical University of Denmark11, Banco Bilbao Vizcaya Argentaria12, Massachusetts Institute of Technology13, Harvard University14, Aalto University15, Northeastern University16, New York University17
TL;DR: It is argued that mobile phone data, when used properly and carefully, represents a critical arsenal of tools for supporting public health actions across early-, middle-, and late-stage phases of the COVID-19 pandemic.
Abstract: The coronavirus 2019–2020 pandemic (COVID-19) poses unprecedented challenges for governments and societies around the world (1). Nonpharmaceutical interventions have proven to be critical for delaying and containing the COVID-19 pandemic (2–6). These include testing and tracing, bans on large gatherings, nonessential business and school and university closures, international and domestic mobility restrictions and physical isolation, and total lockdowns of regions and countries. Decision-making and evaluation of such interventions during all stages of the pandemic life cycle require specific, reliable, and timely data not only about infections but also about human behavior, especially mobility and physical copresence. We argue that mobile phone data, when used properly and carefully, represents a critical arsenal of tools for supporting public health actions across early-, middle-, and late-stage phases of the COVID-19 pandemic.
Seminal work on human mobility has shown that aggregate and (pseudo-)anonymized mobile phone data can assist the modeling of the geographical spread of epidemics (7–11). Thus, researchers and governments have started to collaborate with private companies, most notably mobile network operators and location intelligence companies, to estimate the effectiveness of control measures in a number of countries, including Austria, Belgium, Chile, China, Germany, France, Italy, Spain, the United Kingdom, and the United States (12–21).
There is, however, little coordination or information exchange between these national or even regional initiatives (22). Although ad hoc mechanisms leveraging mobile phone data can be effectively (but not easily) developed at the local or national level, regional or even global collaborations seem to be much more difficult given the number of actors, the range of interests and priorities, the variety of legislations concerned, and the need to protect civil liberties. The global scale and spread of the COVID-19 pandemic highlight the need for a more harmonized or coordinated approach.
In the …
487 citations
••
TL;DR: A quantum interface that combines optical trapping of solids with cavity-mediated light-matter interaction is demonstrated, laser-cooling an optically trapped nanoparticle into its quantum ground state of motion from room temperature.
Abstract: Quantum control of complex objects in the regime of large size and mass provides opportunities for sensing applications and tests of fundamental physics. The realization of such extreme quantum states of matter remains a major challenge. We demonstrate a quantum interface that combines optical trapping of solids with cavity-mediated light-matter interaction. Precise control over the frequency and position of the trap laser with respect to the optical cavity allowed us to laser-cool an optically trapped nanoparticle into its quantum ground state of motion from room temperature. The particle comprises 10^8 atoms, similar to current Bose-Einstein condensates, with the density of a solid object. Our cooling technique, in combination with optical trap manipulation, may enable otherwise unachievable superposition states involving large masses.
436 citations
••
TL;DR: The current status of the Standard Model calculation of the anomalous magnetic moment of the muon has been reviewed in this paper, where the authors present a detailed account of recent efforts to improve the calculation of the two hadronic contributions with either a data-driven, dispersive approach or a first-principles, lattice-QCD approach.
Abstract: We review the present status of the Standard Model calculation of the anomalous magnetic moment of the muon. This is performed in a perturbative expansion in the fine-structure constant $\alpha$ and is broken down into pure QED, electroweak, and hadronic contributions. The pure QED contribution is by far the largest and has been evaluated up to and including $\mathcal{O}(\alpha^5)$ with negligible numerical uncertainty. The electroweak contribution is suppressed by $(m_\mu/M_W)^2$ and only shows up at the level of the seventh significant digit. It has been evaluated up to two loops and is known to better than one percent. Hadronic contributions are the most difficult to calculate and are responsible for almost all of the theoretical uncertainty. The leading hadronic contribution appears at $\mathcal{O}(\alpha^2)$ and is due to hadronic vacuum polarization, whereas at $\mathcal{O}(\alpha^3)$ the hadronic light-by-light scattering contribution appears. Given the low characteristic scale of this observable, these contributions have to be calculated with nonperturbative methods, in particular, dispersion relations and the lattice approach to QCD. The largest part of this review is dedicated to a detailed account of recent efforts to improve the calculation of these two contributions with either a data-driven, dispersive approach, or a first-principles, lattice-QCD approach. The final result reads $a_\mu^\text{SM}=116\,591\,810(43)\times 10^{-11}$ and is smaller than the Brookhaven measurement by 3.7$\sigma$. The experimental uncertainty will soon be reduced by up to a factor of four by the new experiment currently running at Fermilab, and also by the future J-PARC experiment. This, and the prospects to further reduce the theoretical uncertainty in the near future, which are also discussed here, make this quantity one of the most promising places to look for evidence of new physics.
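The decomposition described in the abstract can be written compactly. The grouping below only restates the named contributions (pure QED, electroweak, hadronic vacuum polarization, hadronic light-by-light) together with the quoted total; the individual per-term values are not given in the abstract and are therefore not shown:

```latex
a_\mu^\text{SM}
  = a_\mu^\text{QED} + a_\mu^\text{EW}
  + a_\mu^\text{HVP} + a_\mu^\text{HLbL}
  = 116\,591\,810(43)\times 10^{-11},
```

with the HVP term entering at $\mathcal{O}(\alpha^2)$ and the HLbL term at $\mathcal{O}(\alpha^3)$, as stated above.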
420 citations
••
University of Grenoble1, Katholieke Universiteit Leuven2, ETH Zurich3, Infineon Technologies4, University of Münster5, University of Gothenburg6, Royal Institute of Technology7, Helmholtz-Zentrum Dresden-Rossendorf8, Kaiserslautern University of Technology9, Université Paris-Saclay10, University of Vienna11, University of York12, University of Lorraine13, Spanish National Research Council14, Catalan Institution for Research and Advanced Studies15, Koç University16, University of Naples Federico II17, University of Messina18, University of Salamanca19
TL;DR: In this article, the potential of spintronics in four key areas of application (memory, sensors, microwave devices, and logic devices) is examined, and the challenges that need to be addressed in order to integrate spintronic materials and functionalities into mainstream microelectronic platforms are discussed.
Abstract: Spintronic devices exploit the spin, as well as the charge, of electrons and could bring new capabilities to the microelectronics industry. However, in order for spintronic devices to meet the ever-increasing demands of the industry, innovation in terms of materials, processes and circuits is required. Here, we review recent developments in spintronics that could soon have an impact on the microelectronics and information technology industry. We highlight and explore four key areas: magnetic memories, magnetic sensors, radio-frequency and microwave devices, and logic and non-Boolean devices. We also discuss the challenges, at both the device and the system level, that need to be addressed in order to integrate spintronic materials and functionalities into mainstream microelectronic platforms.
••
TL;DR: It is shown that atmospheric transport is a major pathway for road plastic pollution over remote regions, and it is suggested that the Arctic may be a particularly sensitive receptor region, where the light-absorbing properties of TWPs and BWPs may also cause accelerated warming and melting of the cryosphere.
Abstract: In recent years, marine, freshwater and terrestrial pollution with microplastics has been discussed extensively, whereas atmospheric microplastic transport has been largely overlooked. Here, we present global simulations of atmospheric transport of microplastic particles produced by road traffic (TWPs – tire wear particles and BWPs – brake wear particles), a major source that can be quantified relatively well. We find high transport efficiencies of these particles to remote regions. About 34% of the emitted coarse TWPs and 30% of the emitted coarse BWPs (100 kt yr⁻¹ and 40 kt yr⁻¹, respectively) were deposited in the World Ocean. These amounts are of similar magnitude to the total estimated direct and riverine transport of TWPs and fibres to the ocean (64 kt yr⁻¹). We suggest that the Arctic may be a particularly sensitive receptor region, where the light-absorbing properties of TWPs and BWPs may also cause accelerated warming and melting of the cryosphere. Plastic pollution is a critical concern across diverse ecosystems, yet most research has focused on terrestrial and aquatic transport, neglecting other mechanisms. Here the authors show that atmospheric transport is a major pathway for road plastic pollution over remote regions.
••
Hull York Medical School1, Sahlgrenska University Hospital2, Lithuanian University of Health Sciences3, University of Cambridge4, King's College London5, Albert Einstein College of Medicine6, Autonomous University of Barcelona7, University of Glasgow8, Guangzhou Medical University9, Queen's University Belfast10, Cochrane Collaboration11, McMaster University12, University of Manchester13, University of Ulsan14, University of Bern15, Erasmus University Medical Center16, University of Vienna17
TL;DR: In adults, cough hypersensitivity has become the overarching diagnosis, and in children, persistent bacterial bronchitis explains most wet cough, changing treatment advice.
Abstract: These guidelines incorporate the recent advances in chronic cough pathophysiology, diagnosis and treatment. The concept of cough hypersensitivity has allowed an umbrella term that explains the exquisite sensitivity of patients to external stimuli such as cold air, perfumes, smoke and bleach. Thus, adults with chronic cough now have a firm physical explanation for their symptoms based on vagal afferent hypersensitivity. Different treatable traits exist, with cough variant asthma (CVA)/eosinophilic bronchitis responding to anti-inflammatory treatment and non-acid reflux being treated with promotility agents rather than anti-acid drugs. An alternative antitussive strategy is to reduce hypersensitivity by neuromodulation. Low-dose morphine is highly effective in a subset of patients with cough resistant to other treatments. Gabapentin and pregabalin are also advocated, but in clinical experience they are limited by adverse events. Perhaps the most promising future developments in pharmacotherapy are drugs which tackle neuronal hypersensitivity by blocking excitability of afferent nerves by inhibiting targets such as the ATP receptor (P2X3). Finally, cough suppression therapy when performed by competent practitioners can be highly effective. Children are not small adults, and a pursuit of an underlying cause for cough is advocated. Thus, in toddlers, inhalation of a foreign body is common. Persistent bacterial bronchitis is a common and previously unrecognised cause of wet cough in children. Antibiotics (drug, dose and duration need to be determined) can be curative. A paediatric-specific algorithm should be used.
••
The Catholic University of America1, International Institute of Minnesota2, University of Queensland3, Federal University of Rio de Janeiro4, Wageningen University and Research Centre5, Autonomous University of Barcelona6, Federal Fluminense University7, University of Cambridge8, University of the Philippines Los Baños9, University of Tasmania10, International Union for Conservation of Nature and Natural Resources11, BirdLife International12, University of Natural Resources and Life Sciences, Vienna13, University of São Paulo14, Royal Society for the Protection of Birds15, National University of Cordoba16, World Conservation Monitoring Centre17, International Institute for Applied Systems Analysis18, Environmental Change Institute19, University of Vienna20
TL;DR: It is found that restoring 15% of converted lands in priority areas could avoid 60% of expected extinctions while sequestering 299 gigatonnes of CO2—30% of the total CO2 increase in the atmosphere since the Industrial Revolution.
Abstract: Extensive ecosystem restoration is increasingly seen as being central to conserving biodiversity and stabilizing the climate of the Earth. Although ambitious national and global targets have been set, global priority areas that account for spatial variation in benefits and costs have yet to be identified. Here we develop and apply a multicriteria optimization approach that identifies priority areas for restoration across all terrestrial biomes, and estimates their benefits and costs. We find that restoring 15% of converted lands in priority areas could avoid 60% of expected extinctions while sequestering 299 gigatonnes of CO2—30% of the total CO2 increase in the atmosphere since the Industrial Revolution. The inclusion of several biomes is key to achieving multiple benefits. Cost effectiveness can increase up to 13-fold when spatial allocation is optimized using our multicriteria approach, which highlights the importance of spatial planning. Our results confirm the vast potential contributions of restoration to addressing global challenges, while underscoring the necessity of pursuing these goals synergistically.
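The headline figures in this abstract are internally consistent and can be checked with one line of arithmetic. The sketch below (plain Python, values taken directly from the abstract) derives the total atmospheric CO2 increase since the Industrial Revolution that the stated 30% share implies:

```python
# Figures quoted in the abstract: restoring 15% of converted lands in
# priority areas would sequester 299 Gt of CO2, stated to be 30% of the
# total atmospheric CO2 increase since the Industrial Revolution.
sequestered_gt = 299.0
share_of_total_increase = 0.30

# Implied total atmospheric increase: 299 / 0.30, roughly 1,000 Gt CO2.
implied_total_increase_gt = sequestered_gt / share_of_total_increase
print(f"Implied total CO2 increase: {implied_total_increase_gt:.0f} Gt")
```

This is only a consistency check on the numbers as quoted, not an independent estimate of atmospheric carbon stocks.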
••
TL;DR: Clinical performance of four PARP inhibitors used in cancer therapy are highlighted and the predictive biomarkers of inhibitor sensitivity, mechanisms of resistance as well as the means of overcoming them through combination therapy are discussed.
Abstract: Oxidative and replication stress underlie genomic instability of cancer cells. Amplifying genomic instability through radiotherapy and chemotherapy has been a powerful but nonselective means of killing cancer cells. Precision medicine has revolutionized cancer therapy by putting forth the concept of selective targeting of cancer cells. Poly(ADP-ribose) polymerase (PARP) inhibitors represent a successful example of precision medicine as the first drugs targeting DNA damage response to have entered the clinic. PARP inhibitors act through synthetic lethality with mutations in DNA repair genes and were approved for the treatment of BRCA mutated ovarian and breast cancer. PARP inhibitors destabilize replication forks through PARP DNA entrapment and induce cell death through replication stress-induced mitotic catastrophe. Inhibitors of poly(ADP-ribose) glycohydrolase (PARG) exploit and exacerbate replication deficiencies of cancer cells and may complement PARP inhibitors in targeting a broad range of cancer types with different sources of genomic instability. Here I provide an overview of the molecular mechanisms and cellular consequences of PARP and PARG inhibition. I highlight clinical performance of four PARP inhibitors used in cancer therapy (olaparib, rucaparib, niraparib, and talazoparib) and discuss the predictive biomarkers of inhibitor sensitivity, mechanisms of resistance as well as the means of overcoming them through combination therapy.
••
Cornell University1, Technische Universität München2, Woods Hole Oceanographic Institution3, University of Vienna4, Oregon State University5, Stanford University6, Stockholm University7, University of Paris8, Max Planck Society9, University of California, Santa Barbara10, Lawrence Berkeley National Laboratory11, National Center for Atmospheric Research12, Institute of Arctic and Alpine Research13
TL;DR: In this article, the authors propose that soil carbon persistence can be understood through the lens of decomposers, as a result of functional complexity arising from the interplay between spatial and temporal variation in molecular diversity and composition; this suggests that soil management should be based on constant care rather than one-time action to lock away carbon in soils.
Abstract: Soil organic carbon management has the potential to aid climate change mitigation through drawdown of atmospheric carbon dioxide. To be effective, such management must account for processes influencing carbon storage and re-emission at different space and time scales. Achieving this requires a conceptual advance in our understanding to link carbon dynamics from the scales at which processes occur to the scales at which decisions are made. Here, we propose that soil carbon persistence can be understood through the lens of decomposers as a result of functional complexity derived from the interplay between spatial and temporal variation of molecular diversity and composition. For example, co-location alone can determine whether a molecule is decomposed, with rapid changes in moisture leading to transport of organic matter and constraining the fitness of the microbial community, while greater molecular diversity may increase the metabolic demand of, and thus potentially limit, decomposition. This conceptual shift accounts for emergent behaviour of the microbial community and would enable soil carbon changes to be predicted without invoking recalcitrant carbon forms that have not been observed experimentally. Functional complexity as a driver of soil carbon persistence suggests soil management should be based on constant care rather than one-time action to lock away carbon in soils. Dynamic interactions between chemical and biological controls govern the stability of soil organic carbon and drive complex, emergent patterns in soil carbon persistence.
••
Xishuangbanna Tropical Botanical Garden1, Boston University2, University of Montpellier3, University of Vienna4, Memorial University of Newfoundland5, National University of Singapore6, Arthur Rylah Institute for Environmental Research7, James Hutton Institute8, James Cook University9, University of Wisconsin-Madison10, Portland State University11, University of Guelph12
TL;DR: This editorial can only be a snapshot of a quickly evolving situation, but the authors hope it offers some encouragement and insights for colleagues in lockdown, and outlines how the conservation community must be ready to respond.
••
Max Planck Society1, University of Innsbruck2, Institute for Quantum Optics and Quantum Information3, University of Florence4, Istituto Nazionale di Fisica Nucleare5, Autonomous University of Barcelona6, International School for Advanced Studies7, International Centre for Theoretical Physics8, ICFO – The Institute of Photonic Sciences9, University of Padua10, Tel Aviv University11, Ikerbasque12, University of the Basque Country13, University of Barcelona14, Ghent University15, University of Vienna16, University of Bern17, University of Cambridge18, Jagiellonian University19
TL;DR: In this article, tensor network methods are applied to the study of lattice gauge theories, together with some results on Abelian and non-Abelian lattice gauge theories.
Abstract: Lattice gauge theories, which originated from particle physics in the context of Quantum Chromodynamics (QCD), provide an important intellectual stimulus to further develop quantum information technologies. While one long-term goal is the reliable quantum simulation of currently intractable aspects of QCD itself, lattice gauge theories also play an important role in condensed matter physics and in quantum information science. In this way, lattice gauge theories provide both motivation and a framework for interdisciplinary research towards the development of special purpose digital and analog quantum simulators, and ultimately of scalable universal quantum computers. In this manuscript, recent results and new tools from a quantum science approach to study lattice gauge theories are reviewed. Two new complementary approaches are discussed: first, tensor network methods are presented – a classical simulation approach – applied to the study of lattice gauge theories together with some results on Abelian and non-Abelian lattice gauge theories. Then, recent proposals for the implementation of lattice gauge theory quantum simulators in different quantum hardware are reported, e.g., trapped ions, Rydberg atoms, and superconducting circuits. Finally, the first proof-of-principle trapped ions experimental quantum simulations of the Schwinger model are reviewed.
••
Harvard University1, University of Washington2, Humboldt University of Berlin3, Imperial College London4, University of Belgrade5, Istituto Nazionale di Fisica Nucleare6, Technical University of Berlin7, University of Bordeaux8, University of Oxford9, University of Valencia10, Rutherford Appleton Laboratory11, University of Strathclyde12, King's College London13, Foundation for Research & Technology – Hellas14, University of Birmingham15, University College London16, University of Liverpool17, National Physical Laboratory18, University of Nottingham19, University of Sussex20, Fermilab21, Northern Illinois University22, Peking University23, University of Pisa24, University of California, Riverside25, University of Nevada, Reno26, CERN27, University of Niš28, National Institute of Chemical Physics and Biophysics29, British University in Egypt30, Beni-Suef University31, Leibniz University of Hanover32, Paul Sabatier University33, University of Paris34, University of Cambridge35, Wayne State University36, Stanford University37, University of Bergen38, University of Amsterdam39, Northwestern University40, University of Bristol41, University of Warsaw42, University of Illinois at Urbana–Champaign43, Fayoum University44, University of Crete45, Queen's University Belfast46, Brandeis University47, University of Bologna48, Cochin University of Science and Technology49, German Aerospace Center50, University of Manchester51, University of Copenhagen52, University of Düsseldorf53, University of Vienna54, Florida State University55, University of Florence56, University of Illinois at Chicago57, University of Bremen58, University of Mainz59, Chinese Academy of Sciences60, University of Cincinnati61
TL;DR: The Atomic Experiment for Dark Matter and Gravity Exploration (AEDGE) is a proposed space experiment that uses cold atoms to search for ultra-light dark matter and to detect gravitational waves in the frequency range between the most sensitive ranges of LISA and the terrestrial LIGO/Virgo/KAGRA/INDIGO experiments.
Abstract: We propose in this White Paper a concept for a space experiment using cold atoms to search for ultra-light dark matter, and to detect gravitational waves in the frequency range between the most sensitive ranges of LISA and the terrestrial LIGO/Virgo/KAGRA/INDIGO experiments. This interdisciplinary experiment, called Atomic Experiment for Dark Matter and Gravity Exploration (AEDGE), will also complement other planned searches for dark matter, and exploit synergies with other gravitational wave detectors. We give examples of the extended range of sensitivity to ultra-light dark matter offered by AEDGE, and how its gravitational-wave measurements could explore the assembly of super-massive black holes, first-order phase transitions in the early universe and cosmic strings. AEDGE will be based upon technologies now being developed for terrestrial experiments using cold atoms, and will benefit from the space experience obtained with, e.g., LISA and cold atom experiments in microgravity.
••
TL;DR: This work systematically explores the phylogeny of taxa currently assigned to these classes using 120 conserved single-copy marker genes as well as rRNA genes, and indicates the independent acquisition of predatory behaviour in the phyla Myxococcota and Bdellovibrionota, which is consistent with their distinct modes of action.
Abstract: The class Deltaproteobacteria comprises an ecologically and metabolically diverse group of bacteria best known for dissimilatory sulphate reduction and predatory behaviour. Although this lineage is the fourth described class of the phylum Proteobacteria, it rarely affiliates with other proteobacterial classes and is frequently not recovered as a monophyletic unit in phylogenetic analyses. Indeed, one branch of the class Deltaproteobacteria encompassing Bdellovibrio-like predators was recently reclassified into a separate proteobacterial class, the Oligoflexia. Here we systematically explore the phylogeny of taxa currently assigned to these classes using 120 conserved single-copy marker genes as well as rRNA genes. The overwhelming majority of markers reject the inclusion of the classes Deltaproteobacteria and Oligoflexia in the phylum Proteobacteria. Instead, the great majority of currently recognized members of the class Deltaproteobacteria are better classified into four novel phylum-level lineages. We propose the names Desulfobacterota phyl. nov. and Myxococcota phyl. nov. for two of these phyla, based on the oldest validly published names in each lineage, and retain the placeholder name SAR324 for the third phylum pending formal description of type material. Members of the class Oligoflexia represent a separate phylum for which we propose the name Bdellovibrionota phyl. nov. based on priority in the literature and general recognition of the genus Bdellovibrio. Desulfobacterota phyl. nov. includes the taxa previously classified in the phylum Thermodesulfobacteria, and these reclassifications imply that the capacity for sulphate reduction was vertically inherited in the Thermodesulfobacteria rather than laterally acquired as previously inferred. Our analysis also indicates the independent acquisition of predatory behaviour in the phyla Myxococcota and Bdellovibrionota, which is consistent with their distinct modes of action.
This work represents a stable reclassification of one of the most taxonomically challenging areas of the bacterial tree and provides a robust framework for future ecological and systematic studies.
••
Stanford University1, University of California, Los Angeles2, École Polytechnique Fédérale de Lausanne3, Niels Bohr Institute4, University of Cambridge5, University of California, Davis6, Institute of Cosmology and Gravitation, University of Portsmouth7, Carnegie Learning8, Kapteyn Astronomical Institute9, University of Oxford10, INAF11, Max Planck Society12, Academia Sinica Institute of Astronomy and Astrophysics13, Technische Universität München14, University of Tokyo15, University of Vienna16, University of Chicago17, Fermilab18
TL;DR: A hierarchical Bayesian approach is proposed in which the mass-sheet transform (MST) is constrained only by stellar kinematics, using a flexible family of mass models that directly encode it.
Abstract: The H0LiCOW collaboration inferred, via strong gravitational lensing time delays, a Hubble constant value of km s−1 Mpc−1, describing deflector mass density profiles by either a power law or stars (constant mass-to-light ratio) plus standard dark matter halos. The mass-sheet transform (MST), which leaves the lensing observables unchanged, is considered the dominant source of residual uncertainty in H0. We quantify any potential effect of the MST with a flexible family of mass models that directly encode it and are hence maximally degenerate with H0. Our calculation is based on a new hierarchical Bayesian approach in which the MST is constrained only by stellar kinematics. The approach is validated on mock lenses generated from hydrodynamic simulations. We first applied the inference to the TDCOSMO sample of seven lenses, six of which are from H0LiCOW, and measured km s−1 Mpc−1. Secondly, to further constrain the deflector mass density profiles, we added imaging and spectroscopy for a set of 33 strong gravitational lenses from the Sloan Lens ACS (SLACS) sample. For nine of the 33 SLACS lenses, we used resolved kinematics to constrain the stellar anisotropy. From the joint hierarchical analysis of the TDCOSMO+SLACS sample, we measured km s−1 Mpc−1. This measurement assumes that the TDCOSMO and SLACS galaxies are drawn from the same parent population. The blind H0LiCOW, TDCOSMO-only and TDCOSMO+SLACS analyses are in mutual statistical agreement. The TDCOSMO+SLACS analysis prefers marginally shallower mass profiles than H0LiCOW or TDCOSMO-only. Without relying on the form of the mass density profile used by H0LiCOW, we achieve a ∼5% measurement of H0. While our new hierarchical analysis does not statistically invalidate the mass profile assumptions made by H0LiCOW, and thus the H0 measurement relying on them, it demonstrates the importance of understanding the mass density profile of elliptical galaxies. The uncertainties on H0 derived in this paper can be reduced by physical or observational priors on the form of the mass profile, or by additional data.
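For readers unfamiliar with the mass-sheet transform, its standard textbook form (not restated in the abstract above) rescales the dimensionless surface mass density κ by a parameter λ while adding a uniform sheet:

```latex
\kappa_{\lambda}(\boldsymbol{\theta}) \;=\; \lambda\,\kappa(\boldsymbol{\theta}) + (1-\lambda),
\qquad
\Delta t \;\propto\; \lambda \;\Rightarrow\; H_0 \;\propto\; \lambda,
\qquad
\sigma_{\mathrm{los}}^{2} \;\propto\; \lambda .
```

Imaging observables are invariant under this transform, but the predicted time delays, and hence the inferred H0, scale linearly with λ; because the predicted line-of-sight velocity dispersion also scales with λ, stellar kinematics are the observable that breaks the degeneracy, which is the role they play in the hierarchical analysis above.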
••
14 Jul 2020
TL;DR: An overview of the latest technological developments in the generation and manipulation of high-dimensionally entangled photonic systems encoded in various discrete degrees of freedom such as path, transverse spatial modes or time–frequency bins is provided.
Abstract: Since its discovery, quantum entanglement has challenged some of the best established views of the world: locality and reality. Quantum technologies promise to revolutionize computation, communication, metrology and imaging. Here we review conceptual and experimental advances in complex entangled systems involving many multilevel quantum particles. We provide an overview of the latest technological developments in the generation and manipulation of high-dimensionally entangled photonic systems encoded in various discrete degrees of freedom such as path, transverse spatial modes or time–frequency bins. This overview should help to transfer various physical principles for the generation and manipulation from one degree of freedom to another and thus inspire new technical developments. We also show how purely academic questions and curiosity led to new technological applications. Fundamental research provides the necessary knowledge for upcoming technologies, such as a prospective quantum internet or the quantum teleportation of all information stored in a quantum system. Finally, we discuss some important problems in the area of high-dimensional entanglement and give a brief outlook on possible future developments. The study of higher-dimensional quantum states has seen numerous conceptual and technological developments. This review discusses various techniques for the generation and processing of qudits, which are stored in the momentum, path, time–frequency bins, or the orbital angular momentum of photons.
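As a concrete illustration of the entangled qudit states this review concerns, here is a minimal NumPy sketch (not taken from the paper; the dimension and encoding are arbitrary) that builds a maximally entangled two-ququart state and verifies its Schmidt rank and entanglement entropy:

```python
import numpy as np

d = 4  # local dimension (a "ququart"); a photonic qudit might encode this in path or OAM
# maximally entangled two-qudit state |psi> = (1/sqrt(d)) * sum_i |i>|i>
psi = np.zeros((d, d))
for i in range(d):
    psi[i, i] = 1.0 / np.sqrt(d)

# Schmidt decomposition is the SVD of the coefficient matrix
schmidt = np.linalg.svd(psi, compute_uv=False)
rank = int(np.sum(schmidt > 1e-12))                    # Schmidt rank: d for maximal entanglement
entropy = -np.sum(schmidt**2 * np.log2(schmidt**2))    # entanglement entropy in bits

print(rank, entropy)  # 4 2.0
```

For local dimension d the entropy is log2(d) bits, exceeding the 1 bit of a qubit pair, which is part of what makes high-dimensional carriers attractive for quantum communication.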
••
Leipzig University1, Helmholtz Centre for Environmental Research - UFZ2, University of Jena3, Martin Luther University of Halle-Wittenberg4, Humboldt University of Berlin5, University of Helsinki6, University of Porto7, University of Lisbon8, University of Innsbruck9, Czech University of Life Sciences Prague10, University of Vienna11, University of Göttingen12
TL;DR: Concerned about current attempts to dilute the environmental ambition of the future CAP, and the lack of concrete proposals for improving the CAP in the draft of the European Green Deal, it is called on the European Parliament, Council and Commission to adopt 10 urgent action points for delivering sustainable food production, biodiversity conservation and climate mitigation.
Abstract: Making agriculture sustainable is a global challenge. In the European Union (EU), the Common Agricultural Policy (CAP) is failing with respect to biodiversity, climate, soil and land degradation, as well as socio-economic challenges. The European Commission's proposal for a CAP post-2020 provides scope for enhanced sustainability. However, it also allows Member States to choose low-ambition implementation pathways. It therefore remains essential to address citizens' demands for sustainable agriculture and rectify systemic weaknesses in the CAP, using the full breadth of available scientific evidence and knowledge. Concerned about current attempts to dilute the environmental ambition of the future CAP, and the lack of concrete proposals for improving the CAP in the draft of the European Green Deal, we call on the European Parliament, Council and Commission to adopt 10 urgent action points for delivering sustainable food production, biodiversity conservation and climate mitigation. Knowledge is available to help move towards evidence-based, sustainable European agriculture that can benefit people, nature and their joint futures. The statements made in this article have the broad support of the scientific community, as expressed by more than 3,600 signatories to the preprint version of this manuscript. The list can be found here (https://doi.org/10.5281/zenodo.3685632).
••
University of Rome Tor Vergata1, University of Cyprus2, University College London3, University College Hospital4, Sapienza University of Rome5, University of Insubria6, Adria Airways7, University of Seville8, Maastricht University Medical Centre9, Rabin Medical Center10, University of Vienna11, Yeovil District Hospital NHS Foundation Trust12, Tel Aviv University13, Sheba Medical Center14, University of Liverpool15, VU University Amsterdam16, Autonomous University of Barcelona17
TL;DR: This document summarizes the latest evidence on bariatric surgery through state-of-the-art guideline development, aiming to facilitate evidence-based clinical decisions.
Abstract: Surgery for obesity and metabolic diseases has evolved in the light of new scientific evidence, long-term outcomes and accumulated experience. EAES has sponsored an update of previous guidelines on bariatric surgery. A multidisciplinary group of bariatric surgeons, obesity physicians, nutritional experts, psychologists, anesthetists and a patient representative comprised the guideline development panel. Development and reporting conformed to GRADE guidelines and AGREE II standards. Systematic review of databases, record selection, data extraction and synthesis, evidence appraisal and evidence-to-decision frameworks were developed for 42 key questions in the domains Indication; Preoperative work-up; Perioperative management; Non-bypass, bypass and one-anastomosis procedures; Revisional surgery; Postoperative care; and Investigational procedures. A total of 36 recommendations and position statements were formed through a modified Delphi procedure. This document summarizes the latest evidence on bariatric surgery through state-of-the-art guideline development, aiming to facilitate evidence-based clinical decisions.
••
TL;DR: This study provides a national-scale analysis of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) superspreading during the first wave of infections in Austria, a country that played a major role in initial virus transmissions in Europe.
Abstract: Superspreading events shaped the coronavirus disease 2019 (COVID-19) pandemic, and their rapid identification and containment are essential for disease control. Here, we provide a national-scale analysis of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) superspreading during the first wave of infections in Austria, a country that played a major role in initial virus transmissions in Europe. Capitalizing on Austria's well-developed epidemiological surveillance system, we identified major SARS-CoV-2 clusters during the first wave of infections and performed deep whole-genome sequencing of more than 500 virus samples. Phylogenetic-epidemiological analysis enabled the reconstruction of superspreading events and charted a map of tourism-related viral spread originating from Austria in spring 2020. Moreover, we exploited epidemiologically well-defined clusters to quantify SARS-CoV-2 mutational dynamics, including the observation of low-frequency mutations that progressed to fixation within the infection chain. Time-resolved virus sequencing unveiled viral mutation dynamics within individuals with COVID-19, and epidemiologically validated infector-infectee pairs enabled us to determine an average transmission bottleneck size of 10³ SARS-CoV-2 particles. In conclusion, this study illustrates the power of combining epidemiological analysis with deep viral genome sequencing to unravel the spread of SARS-CoV-2 and to gain fundamental insights into mutational dynamics and transmission properties.
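Bottleneck estimates of this kind compare variant frequencies in the infector with variant presence in the infectee. As a deliberately simplified sketch (the study used a more sophisticated model, and every number below is invented for illustration), a plain binomial sampling model can be maximized over the bottleneck size:

```python
import numpy as np

# Toy donor minor-variant frequencies and whether each variant was seen in the recipient.
# Illustrative values only, not data from the study.
donor_freq = np.array([0.02, 0.05, 0.10, 0.30, 0.40])
in_recipient = np.array([0, 0, 1, 1, 1])

def log_likelihood(n):
    """Log-likelihood of a bottleneck of n virions under binomial sampling:
    a donor variant at frequency f is transmitted iff at least one of the
    n founding virions carries it, i.e. with probability 1 - (1-f)^n."""
    p = 1.0 - (1.0 - donor_freq) ** n
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return np.sum(in_recipient * np.log(p) + (1 - in_recipient) * np.log(1 - p))

grid = np.arange(1, 201)
best_n = int(grid[np.argmax([log_likelihood(n) for n in grid])])
print(best_n)  # maximum-likelihood bottleneck size for these toy data
```

Larger bottlenecks make rare variants more likely to transmit, so observing which donor variants do and do not appear in recipients constrains the number of founding particles.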
••
TL;DR: This Perspective describes how polygenic adaptation can be studied using a framework of ‘adaptive architecture’ that unifies principles from the traditionally disparate fields of quantitative genetics and molecular population genetics.
Abstract: Most adaptation processes have a polygenic genetic basis, but even with the recent explosive growth of genomic data we are still lacking a unified framework describing the dynamics of selected alleles. Building on recent theoretical and empirical work, we introduce the concept of adaptive architecture, which extends the genetic architecture of an adaptive trait by factors influencing its adaptive potential and by population genetic principles. Because adaptation can typically be achieved by many different combinations of adaptive alleles (redundancy), we describe how two characteristics, heterogeneity among loci and non-parallelism between replicated populations, are hallmarks for the characterization of polygenic adaptation in evolving populations. We discuss how this unified framework can be applied to natural and experimental populations. Increased capacities for sequencing and genotyping are enabling a more comprehensive understanding of the genetics of adaptation for diverse species. In this Perspective, Barghi, Hermisson and Schlötterer describe how polygenic adaptation can be studied using a framework of 'adaptive architecture' that unifies principles from the traditionally disparate fields of quantitative genetics and molecular population genetics.
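The redundancy and non-parallelism described above can be caricatured in a few lines. This is a deliberately crude sketch, not the authors' model: if any small subset of many interchangeable loci suffices to reach a new trait optimum, chance decides which loci respond in each replicate population.

```python
import random

random.seed(1)
n_loci, needed, replicates = 10, 3, 5

def evolve():
    # Redundancy: any `needed` of the n_loci interchangeable loci suffice,
    # so stochasticity (here, a shuffle standing in for drift and sampling)
    # decides *which* loci respond in a given replicate.
    loci = list(range(n_loci))
    random.shuffle(loci)
    return frozenset(loci[:needed])

outcomes = [evolve() for _ in range(replicates)]
# Non-parallelism: replicates typically adapt via different sets of loci,
# and any single locus is heterogeneous across replicates.
print(len(set(outcomes)), "distinct adaptive routes across", replicates, "replicates")
```

The point of the caricature is that identical selection pressures need not produce identical genomic responses once the trait is redundant, which is why replicate comparisons are diagnostic of polygenic adaptation.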
••
TL;DR: The densely sampled alignment provides a single-base-pair map of selection, has more than doubled the fraction of bases that are confidently predicted to be under conservation and reveals extensive patterns of weak selection in predominantly non-coding DNA.
Abstract: Whole-genome sequencing projects are increasingly populating the tree of life and characterizing biodiversity [1-4]. Sparse taxon sampling has previously been proposed to confound phylogenetic inference [5], and captures only a fraction of the genomic diversity. Here we report a substantial step towards the dense representation of avian phylogenetic and molecular diversity, by analysing 363 genomes from 92.4% of bird families-including 267 newly sequenced genomes produced for phase II of the Bird 10,000 Genomes (B10K) Project. We use this comparative genome dataset in combination with a pipeline that leverages a reference-free whole-genome alignment to identify orthologous regions in greater numbers than has previously been possible and to recognize genomic novelties in particular bird lineages. The densely sampled alignment provides a single-base-pair map of selection, has more than doubled the fraction of bases that are confidently predicted to be under conservation and reveals extensive patterns of weak selection in predominantly non-coding DNA. Our results demonstrate that increasing the diversity of genomes used in comparative studies can reveal more shared and lineage-specific variation, and improve the investigation of genomic characteristics. We anticipate that this genomic resource will offer new perspectives on evolutionary processes in cross-species comparative analyses and assist in efforts to conserve species.
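To make "per-base map of selection" concrete, here is a toy illustration (unrelated to the actual B10K pipeline, with an invented four-species alignment) that scores each alignment column by the fraction of species sharing the majority base; denser taxon sampling makes such scores finer-grained:

```python
from collections import Counter

# Toy multiple alignment: one string per species, columns are alignment positions.
alignment = [
    "ACGTACGT",
    "ACGTACGA",
    "ACGAACGT",
    "ACGTTCGT",
]

def conservation(column):
    """Fraction of sequences sharing the most common base in this column."""
    counts = Counter(column)
    return max(counts.values()) / len(column)

# zip(*alignment) iterates over columns of the alignment
scores = [conservation(col) for col in zip(*alignment)]
print(scores)  # [1.0, 1.0, 1.0, 0.75, 0.75, 1.0, 1.0, 0.75]
```

With only four species the score takes coarse steps of 0.25; with hundreds of genomes, as in the study, per-base scores become fine enough to separate weak selection from neutrality.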
••
TL;DR: It is shown how the karyotypes of amphioxus and diverse vertebrates are derived from 17 ancestral chordate linkage groups (and 19 ancestral bilaterian groups) by fusion, rearrangement and duplication.
Abstract: Although it is widely believed that early vertebrate evolution was shaped by ancient whole-genome duplications, the number, timing and mechanism of these events remain elusive. Here, we infer the history of vertebrates through genomic comparisons with a new chromosome-scale sequence of the invertebrate chordate amphioxus. We show how the karyotypes of amphioxus and diverse vertebrates are derived from 17 ancestral chordate linkage groups (and 19 ancestral bilaterian groups) by fusion, rearrangement and duplication. We resolve two distinct ancient duplications based on patterns of chromosomal conserved synteny. All extant vertebrates share the first duplication, which occurred in the mid/late Cambrian by autotetraploidization (that is, direct genome doubling). In contrast, the second duplication is found only in jawed vertebrates and occurred in the mid–late Ordovician by allotetraploidization (that is, genome duplication following interspecific hybridization) from two now-extinct progenitors. This complex genomic history parallels the diversification of vertebrate lineages in the fossil record. Genomic comparisons with a new amphioxus chromosome-scale genome assembly reveal details of the early evolution of vertebrate genomes.
••
Massachusetts Institute of Technology1, Texas Tech University2, University of Geneva3, CERN4, University of Strasbourg5, École normale supérieure de Lyon6, Université de Montréal7, Université catholique de Louvain8, University of Wisconsin-Madison9, Ohio State University10, University of Southampton11, University of Melbourne12, Imperial College London13, Rutgers University14, Claude Bernard University Lyon 115, University of California, Santa Cruz16, University of Porto17, University of Chicago18, University of California, Berkeley19, University of Zurich20, University of California, Irvine21, Lund University22, University of California, Davis23, University of Bristol24, Fermilab25, University of Grenoble26, Austrian Academy of Sciences27, Northwestern University28, University of Oxford29, RWTH Aachen University30, University of Malaya31, University of Washington32, Harvard University33, Durham University34, National Central University35, Chung-Ang University36, University of Tokyo37, Vrije Universiteit Brussel38, University of Auvergne39, University of Amsterdam40, Sao Paulo State University41, Stockholm University42, Lawrence Berkeley National Laboratory43, Carnegie Mellon University44, University of Padua45, SLAC National Accelerator Laboratory46, University of Vienna47, Quaid-i-Azam University48, Max Planck Society49, University of Naples Federico II50, University of Copenhagen51, University of Hamburg52, Chulalongkorn University53, Cornell University54, Northeastern University55, Rutherford Appleton Laboratory56, Université libre de Bruxelles57, Centre national de la recherche scientifique58
TL;DR: This paper presents the final report of the ATLAS-CMS Dark Matter Forum, organized by the ATLAS and CMS collaborations with the participation of experts on theories of dark matter, which selected a minimal basis set of simplified models to support the design of the early LHC Run-2 searches.
••
TL;DR: How models aimed at predicting non-steady state ecosystem responses over time can benefit from dissecting ecosystems into the organismal components and their inherent limitations to better represent plant-microbe interactions in coupled carbon and nutrient models is outlined.
Abstract: Numerous studies have demonstrated that fertilization with nutrients such as nitrogen, phosphorus, and potassium increases plant productivity in both natural and managed ecosystems, demonstrating that primary productivity is nutrient limited in most terrestrial ecosystems. In contrast, it has been demonstrated that heterotrophic microbial communities in soil are primarily limited by organic carbon or energy. While this concept of contrasting limitations, that is, microbial carbon and plant nutrient limitation, is based on strong evidence that we review in this paper, it is often ignored in discussions of ecosystem response to global environment changes. The plant-centric perspective has equated plant nutrient limitations with those of whole ecosystems, thereby ignoring the important role of the heterotrophs responsible for soil decomposition in driving ecosystem carbon storage. To truly integrate carbon and nutrient cycles in ecosystem science, we must account for the fact that while plant productivity may be nutrient limited, the secondary productivity by heterotrophic communities is inherently carbon limited. Ecosystem carbon cycling integrates the independent physiological responses of its individual components, as well as tightly coupled exchanges between autotrophs and heterotrophs. To the extent that the interacting autotrophic and heterotrophic processes are controlled by organisms that are limited by nutrient versus carbon accessibility, respectively, we propose that ecosystems by definition cannot be 'limited' by nutrients or carbon alone. Here, we outline how models aimed at predicting non-steady state ecosystem responses over time can benefit from dissecting ecosystems into the organismal components and their inherent limitations to better represent plant-microbe interactions in coupled carbon and nutrient models.
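The contrast drawn above can be sketched as a toy model (all parameters and functional forms below are invented for illustration, not from any published ecosystem model): plant growth saturates with mineral nitrogen while microbial growth saturates with organic carbon, so adding N boosts plants, adding C boosts microbes, and no single currency limits the whole system.

```python
# Minimal two-component caricature: plants are N-limited, microbes are C-limited.

def step(plant_c, microbe_c, soil_n, soil_c, dt=0.1):
    # Monod-style saturation terms; half-saturation constants are arbitrary
    plant_growth = 1.0 * plant_c * soil_n / (soil_n + 0.5)      # N-limited autotrophs
    microbe_growth = 0.8 * microbe_c * soil_c / (soil_c + 2.0)  # C-limited heterotrophs
    plant_c += dt * plant_growth
    microbe_c += dt * microbe_growth
    soil_n -= dt * 0.1 * plant_growth    # plants draw down mineral N
    soil_c -= dt * 0.5 * microbe_growth  # microbes draw down organic C
    return plant_c, microbe_c, max(soil_n, 0.0), max(soil_c, 0.0)

def run(n0, c0, steps=50):
    state = (1.0, 1.0, n0, c0)  # (plant biomass, microbial biomass, soil N, soil C)
    for _ in range(steps):
        state = step(*state)
    return state

base = run(n0=1.0, c0=1.0)
n_fert = run(n0=2.0, c0=1.0)  # "fertilization": extra mineral N
c_add = run(n0=1.0, c0=2.0)   # extra organic C input

# Extra N boosts plant biomass; extra C boosts microbial biomass.
print(n_fert[0] > base[0], c_add[1] > base[1])  # True True
```

Even in this caricature, asking whether "the ecosystem" is N- or C-limited has no single answer: each organismal component has its own limitation, which is the paper's argument for dissecting ecosystems into components in coupled carbon-nutrient models.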