
Showing papers by "ETH Zurich" published in 2019


Book ChapterDOI
01 Jan 2019
TL;DR: Sediment is transported either as bed-load, with particles sliding, saltating, and rolling over the river bed, or as suspended load, where particles move with the turbulent water flow away from the bed.
Abstract: Sediment transport is an important and frequent phenomenon in rivers. Sediment is mobilized as bed-load, with particles sliding, saltating, and rolling over the river bed, or as suspended load, where particles move with the turbulent water flow away from the bed.

13,877 citations


Posted Content
TL;DR: PyTorch as discussed by the authors is a machine learning library that provides an imperative and Pythonic programming style that makes debugging easy and is consistent with other popular scientific computing libraries, while remaining efficient and supporting hardware accelerators such as GPUs.
Abstract: Deep learning frameworks have often focused on either usability or speed, but not both. PyTorch is a machine learning library that shows that these two goals are in fact compatible: it provides an imperative and Pythonic programming style that supports code as a model, makes debugging easy and is consistent with other popular scientific computing libraries, while remaining efficient and supporting hardware accelerators such as GPUs. In this paper, we detail the principles that drove the implementation of PyTorch and how they are reflected in its architecture. We emphasize that every aspect of PyTorch is a regular Python program under the full control of its user. We also explain how the careful and pragmatic implementation of the key components of its runtime enables them to work together to achieve compelling performance. We demonstrate the efficiency of individual subsystems, as well as the overall speed of PyTorch on several common benchmarks.

12,767 citations
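The imperative, define-by-run style described in the abstract above can be illustrated with a minimal scalar reverse-mode autograd sketch in plain Python. This is a toy stand-in, not PyTorch code: the `Value` class and its methods are invented for illustration. PyTorch's autograd records the computation graph the same way, as a side effect of ordinary Python execution, which is why debugging works with normal Python tools.

```python
# Toy reverse-mode autograd: the graph is built while ordinary Python code
# runs, which is the "define-by-run" idea behind PyTorch's imperative style.
class Value:
    def __init__(self, data, parents=()):
        self.data = data          # scalar payload
        self.grad = 0.0           # accumulated gradient
        self._parents = parents   # nodes this value was computed from
        self._backward_fn = lambda: None

    def __add__(self, other):
        out = Value(self.data + other.data, (self, other))
        def _bw():
            self.grad += out.grad
            other.grad += out.grad
        out._backward_fn = _bw
        return out

    def __mul__(self, other):
        out = Value(self.data * other.data, (self, other))
        def _bw():
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward_fn = _bw
        return out

    def backward(self):
        # Topologically order the recorded graph, then propagate in reverse.
        order, seen = [], set()
        def visit(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    visit(p)
                order.append(v)
        visit(self)
        self.grad = 1.0
        for v in reversed(order):
            v._backward_fn()

x, y = Value(3.0), Value(4.0)
f = x * y + x          # the graph is recorded as normal Python executes
f.backward()
print(x.grad, y.grad)  # df/dx = y + 1 = 5.0, df/dy = x = 3.0
```

In PyTorch the equivalent would use `torch.tensor(..., requires_grad=True)` and `f.backward()`; the point of the sketch is only that no separate graph-definition step exists.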


Proceedings Article
01 Jan 2019
TL;DR: This paper details the principles that drove the implementation of PyTorch and how they are reflected in its architecture, and explains how the careful and pragmatic implementation of the key components of its runtime enables them to work together to achieve compelling performance.
Abstract: Deep learning frameworks have often focused on either usability or speed, but not both. PyTorch is a machine learning library that shows that these two goals are in fact compatible: it was designed from first principles to support an imperative and Pythonic programming style that supports code as a model, makes debugging easy and is consistent with other popular scientific computing libraries, while remaining efficient and supporting hardware accelerators such as GPUs. In this paper, we detail the principles that drove the implementation of PyTorch and how they are reflected in its architecture. We emphasize that every aspect of PyTorch is a regular Python program under the full control of its user. We also explain how the careful and pragmatic implementation of the key components of its runtime enables them to work together to achieve compelling performance. We demonstrate the efficiency of individual subsystems, as well as the overall speed of PyTorch on several commonly used benchmarks.

10,045 citations


Journal ArticleDOI
TL;DR: Food in the Anthropocene: the EAT-Lancet Commission on healthy diets from sustainable food systems, with a focus on meat, fish, vegetables, and fruit as sources of protein.

4,710 citations


Journal ArticleDOI
TL;DR: A series of major new developments in the BEAST 2 core platform and model hierarchy that have occurred since the first release of the software, culminating in the recent 2.5 release are described.
Abstract: Elaboration of Bayesian phylogenetic inference methods has continued at pace in recent years with major new advances in nearly all aspects of the joint modelling of evolutionary data. It is increasingly appreciated that some evolutionary questions can only be adequately answered by combining evidence from multiple independent sources of data, including genome sequences, sampling dates, phenotypic data, radiocarbon dates, fossil occurrences, and biogeographic range information among others. Including all relevant data into a single joint model is very challenging both conceptually and computationally. Advanced computational software packages that allow robust development of compatible (sub-)models which can be composed into a full model hierarchy have played a key role in these developments. Developing such software frameworks is increasingly a major scientific activity in its own right, and comes with specific challenges, from practical software design, development and engineering challenges to statistical and conceptual modelling challenges. BEAST 2 is one such computational software platform, and was first announced over 4 years ago. Here we describe a series of major new developments in the BEAST 2 core platform and model hierarchy that have occurred since the first release of the software, culminating in the recent 2.5 release.

2,045 citations


Journal ArticleDOI
TL;DR: In this article, the authors find a global convergence emerging around five ethical principles (transparency, justice and fairness, non-maleficence, responsibility and privacy), with substantive divergence in how these principles are interpreted, why they are deemed important, what issue, domain or actors they pertain to, and how they should be implemented.
Abstract: In the past five years, private companies, research institutions and public sector organizations have issued principles and guidelines for ethical artificial intelligence (AI). However, despite an apparent agreement that AI should be ‘ethical’, there is debate about both what constitutes ‘ethical AI’ and which ethical requirements, technical standards and best practices are needed for its realization. To investigate whether a global agreement on these questions is emerging, we mapped and analysed the current corpus of principles and guidelines on ethical AI. Our results reveal a global convergence emerging around five ethical principles (transparency, justice and fairness, non-maleficence, responsibility and privacy), with substantive divergence in relation to how these principles are interpreted, why they are deemed important, what issue, domain or actors they pertain to, and how they should be implemented. Our findings highlight the importance of integrating guideline-development efforts with substantive ethical analysis and adequate implementation strategies.

1,419 citations


Journal ArticleDOI
TL;DR: In this 8th release of JASPAR, the CORE collection has been expanded with 245 new PFMs and 156 updated PFMs, and the genomic tracks, inference tool, and TF-binding profile similarity clusters have been updated.
Abstract: JASPAR (http://jaspar.genereg.net) is an open-access database of curated, non-redundant transcription factor (TF)-binding profiles stored as position frequency matrices (PFMs) for TFs across multiple species in six taxonomic groups. In this 8th release of JASPAR, the CORE collection has been expanded with 245 new PFMs (169 for vertebrates, 42 for plants, 17 for nematodes, 10 for insects, and 7 for fungi), and 156 PFMs were updated (125 for vertebrates, 28 for plants and 3 for insects). These new profiles represent an 18% expansion compared to the previous release. JASPAR 2020 comes with a novel collection of unvalidated TF-binding profiles for which our curators did not find orthogonal supporting evidence in the literature. This collection has a dedicated web form to engage the community in the curation of unvalidated TF-binding profiles. Moreover, we created a Q&A forum to ease the communication between the user community and JASPAR curators. Finally, we updated the genomic tracks, inference tool, and TF-binding profile similarity clusters. All the data is available through the JASPAR website, its associated RESTful API, and through the JASPAR2020 R/Bioconductor package.

1,219 citations
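A position frequency matrix (PFM) of the kind JASPAR curates is a per-position count of observed bases; a routine first step in using one is column-wise normalisation to a position probability matrix. A minimal sketch, where the counts are invented for illustration and not a real JASPAR profile:

```python
# Hypothetical PFM: keys are bases, lists are counts per motif position.
pfm = {
    "A": [20,  3,  0, 10],
    "C": [ 5,  2, 38,  8],
    "G": [10, 40,  1, 12],
    "T": [ 5, 15,  1, 10],
}

def pfm_to_ppm(pfm, pseudocount=0.0):
    """Normalise each column of counts to probabilities.

    A small pseudocount is commonly added so that zero counts do not
    yield zero probabilities in downstream log-odds scoring.
    """
    bases = list(pfm)
    n_pos = len(next(iter(pfm.values())))
    ppm = {b: [] for b in bases}
    for j in range(n_pos):
        col_total = sum(pfm[b][j] + pseudocount for b in bases)
        for b in bases:
            ppm[b].append((pfm[b][j] + pseudocount) / col_total)
    return ppm

ppm = pfm_to_ppm(pfm, pseudocount=0.5)
# Each column of the PPM sums to 1 (up to floating-point error).
```

Real JASPAR profiles can be fetched from the website, the RESTful API, or the JASPAR2020 R/Bioconductor package mentioned in the abstract; the normalisation step is the same regardless of source.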


Journal ArticleDOI
TL;DR: This article presents an overview of machine learning for fluid mechanics, addressing the strengths and limitations of these methods from the perspective of scientific inquiry that treats data as an inherent part of modeling, experimentation, and simulation.
Abstract: The field of fluid mechanics is rapidly advancing, driven by unprecedented volumes of data from field measurements, experiments and large-scale simulations at multiple spatiotemporal scales. Machine learning offers a wealth of techniques to extract information from data that could be translated into knowledge about the underlying fluid mechanics. Moreover, machine learning algorithms can augment domain knowledge and automate tasks related to flow control and optimization. This article presents an overview of past history, current developments, and emerging opportunities of machine learning for fluid mechanics. It outlines fundamental machine learning methodologies and discusses their uses for understanding, modeling, optimizing, and controlling fluid flows. The strengths and limitations of these methods are addressed from the perspective of scientific inquiry that considers data as an inherent part of modeling, experimentation, and simulation. Machine learning provides a powerful information processing framework that can enrich, and possibly even transform, current lines of fluid mechanics research and industrial applications.

1,119 citations


Journal ArticleDOI
TL;DR: A global convergence emerging around five ethical principles (transparency, justice and fairness, non-maleficence, responsibility and privacy), with substantive divergence in relation to how these principles are interpreted; why they are deemed important; what issue, domain or actors they pertain to; and how they should be implemented.
Abstract: In the last five years, private companies, research institutions as well as public sector organisations have issued principles and guidelines for ethical AI, yet there is debate about both what constitutes "ethical AI" and which ethical requirements, technical standards and best practices are needed for its realization. To investigate whether a global agreement on these questions is emerging, we mapped and analyzed the current corpus of principles and guidelines on ethical AI. Our results reveal a global convergence emerging around five ethical principles (transparency, justice and fairness, non-maleficence, responsibility and privacy), with substantive divergence in relation to how these principles are interpreted; why they are deemed important; what issue, domain or actors they pertain to; and how they should be implemented. Our findings highlight the importance of integrating guideline-development efforts with substantive ethical analysis and adequate implementation strategies.

1,105 citations


Journal ArticleDOI
TL;DR: In this article, the authors provide an extensive review and an updated research agenda for the field, classified into nine main themes: understanding transitions; power, agency and politics; governing transitions; civil society, culture and social movements; businesses and industries; transitions in practice and everyday life; geography of transitions; ethical aspects; and methodologies.
Abstract: Research on sustainability transitions has expanded rapidly in the last ten years, diversified in terms of topics and geographical applications, and deepened with respect to theories and methods. This article provides an extensive review and an updated research agenda for the field, classified into nine main themes: understanding transitions; power, agency and politics; governing transitions; civil society, culture and social movements; businesses and industries; transitions in practice and everyday life; geography of transitions; ethical aspects; and methodologies. The review shows that the scope of sustainability transitions research has broadened and connections to established disciplines have grown stronger. At the same time, we see that the grand challenges related to sustainability remain unsolved, calling for continued efforts and an acceleration of ongoing transitions. Transition studies can play a key role in this regard by creating new perspectives, approaches and understanding and helping to move society in the direction of sustainability.

1,099 citations


Journal ArticleDOI
05 Jul 2019-Science
TL;DR: Excluding existing trees and agricultural and urban areas, there is room for an extra 0.9 billion hectares of canopy cover, which could store 205 gigatonnes of carbon in areas that would naturally support woodlands and forests, highlighting global tree restoration as one of the most effective carbon drawdown solutions to date.
Abstract: The restoration of trees remains among the most effective strategies for climate change mitigation. We mapped the global potential tree coverage to show that 4.4 billion hectares of canopy cover could exist under the current climate. Excluding existing trees and agricultural and urban areas, we found that there is room for an extra 0.9 billion hectares of canopy cover, which could store 205 gigatonnes of carbon in areas that would naturally support woodlands and forests. This highlights global tree restoration as our most effective climate change solution to date. However, climate change will alter this potential tree coverage. We estimate that if we cannot deviate from the current trajectory, the global potential canopy cover may shrink by ~223 million hectares by 2050, with the vast majority of losses occurring in the tropics. Our results highlight the opportunity of climate change mitigation through global tree restoration but also the urgent need for action.
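As a quick sanity check of the headline figures, the reported 205 gigatonnes over 0.9 billion hectares implies an average carbon density, assuming, purely for illustration, that the carbon were spread evenly over the restorable area:

```python
# Figures from the abstract: 0.9 billion ha of additional canopy cover,
# storing 205 gigatonnes of carbon in mature woodlands and forests.
restorable_ha = 0.9e9   # hectares
carbon_gt = 205.0       # gigatonnes of carbon

tonnes_per_ha = carbon_gt * 1e9 / restorable_ha
print(f"{tonnes_per_ha:.0f} t C per hectare on average")  # ≈ 228 t C/ha
```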

Journal ArticleDOI
TL;DR: The KOF Globalisation Index as discussed by the authors is a composite index measuring globalization for every country in the world along the economic, social and political dimensions, which is based on 43 instead of 23 variables in the previous version.
Abstract: We introduce the revised version of the KOF Globalisation Index, a composite index measuring globalization for every country in the world along the economic, social and political dimensions. The original index was introduced by Dreher (Applied Economics, 38(10):1091–1110, 2006) and updated in Dreher et al. (2008). This second revision of the index distinguishes between de facto and de jure measures along the different dimensions of globalization. We also disentangle trade and financial globalization within the economic dimension of globalization and use time-varying weighting of the variables. The new index is based on 43 instead of 23 variables in the previous version. Following Dreher (Applied Economics, 38(10):1091–1110, 2006), we use the new index to examine the effect of globalization on economic growth. The results suggest that de facto and de jure globalization influence economic growth differently. Future research should use the new KOF Globalisation Index to re-examine other important consequences of globalization and why globalization has proceeded rapidly in some countries, such as South Korea, but less so in others. The KOF Globalisation Index can be downloaded from http://www.kof.ethz.ch/globalisation/.
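The aggregation scheme the abstract describes, raw variables normalised to a common scale and combined with weights into dimension scores, can be sketched as follows. The variable names, values, bounds, and weights here are all invented for illustration; the actual KOF methodology uses percentile-based normalisation over the full country panel and statistically determined, time-varying weights:

```python
# Toy composite-index aggregation in the spirit of the KOF Globalisation
# Index: raw variables -> 0..100 scores -> weighted dimension score.
def minmax_to_100(value, lo, hi):
    """Rescale a raw variable onto a 0-100 scale (panel min/max assumed known)."""
    return 100.0 * (value - lo) / (hi - lo)

# Hypothetical country record: variable -> (raw value, panel min, panel max, weight).
economic_vars = {
    "trade_share_gdp": (85.0, 10.0, 200.0, 0.5),
    "fdi_stock_gdp":   (40.0,  0.0, 150.0, 0.5),
}

def dimension_score(variables):
    """Weighted average of normalised variables; weights must sum to 1."""
    return sum(w * minmax_to_100(v, lo, hi)
               for v, lo, hi, w in variables.values())

score = dimension_score(economic_vars)  # a 0-100 "economic globalisation" score
```

The de facto / de jure split in the revised index amounts to running this aggregation separately over flow-based variables and over policy-based variables.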

Journal ArticleDOI
TL;DR: Progress in the fundamental understanding and design of new multiferroic materials, advances in characterization and modelling tools to describe them, and usage in applications are reviewed.
Abstract: The manipulation of magnetic properties by an electric field in magnetoelectric multiferroic materials has driven significant research activity, with the goal of realizing their transformative technological potential. Here, we review progress in the fundamental understanding and design of new multiferroic materials, advances in characterization and modelling tools to describe them, and the exploration of devices and applications. Focusing on the translation of the many scientific breakthroughs into technological innovations, we identify the key open questions in the field where targeted research activities could have maximum impact in transitioning scientific discoveries into real applications.

Journal ArticleDOI
Pierre Friedlingstein1, Pierre Friedlingstein2, Matthew W. Jones3, Michael O'Sullivan2, Robbie M. Andrew, Judith Hauck4, Glen P. Peters, Wouter Peters5, Wouter Peters6, Julia Pongratz7, Julia Pongratz8, Stephen Sitch2, Corinne Le Quéré3, Dorothee C. E. Bakker3, Josep G. Canadell9, Philippe Ciais10, Robert B. Jackson11, Peter Anthoni12, Leticia Barbero13, Leticia Barbero14, Ana Bastos8, Vladislav Bastrikov10, Meike Becker15, Meike Becker16, Laurent Bopp1, Erik T. Buitenhuis3, Naveen Chandra17, Frédéric Chevallier10, Louise Chini18, Kim I. Currie19, Richard A. Feely20, Marion Gehlen10, Dennis Gilfillan21, Thanos Gkritzalis22, Daniel S. Goll23, Nicolas Gruber24, Sören B. Gutekunst25, Ian Harris26, Vanessa Haverd9, Richard A. Houghton27, George C. Hurtt18, Tatiana Ilyina7, Atul K. Jain28, Emilie Joetzjer10, Jed O. Kaplan29, Etsushi Kato, Kees Klein Goldewijk30, Kees Klein Goldewijk31, Jan Ivar Korsbakken, Peter Landschützer7, Siv K. Lauvset16, Nathalie Lefèvre32, Andrew Lenton33, Andrew Lenton34, Sebastian Lienert35, Danica Lombardozzi36, Gregg Marland21, Patrick C. McGuire37, Joe R. Melton, Nicolas Metzl32, David R. Munro38, Julia E. M. S. Nabel7, Shin-Ichiro Nakaoka39, Craig Neill33, Abdirahman M Omar16, Abdirahman M Omar33, Tsuneo Ono, Anna Peregon10, Anna Peregon40, Denis Pierrot14, Denis Pierrot13, Benjamin Poulter41, Gregor Rehder42, Laure Resplandy43, Eddy Robertson44, Christian Rödenbeck7, Roland Séférian10, Jörg Schwinger16, Jörg Schwinger30, Naomi E. Smith45, Naomi E. Smith6, Pieter P. Tans20, Hanqin Tian46, Bronte Tilbrook33, Bronte Tilbrook34, Francesco N. Tubiello47, Guido R. van der Werf48, Andy Wiltshire44, Sönke Zaehle7 
École Normale Supérieure1, University of Exeter2, Norwich Research Park3, Alfred Wegener Institute for Polar and Marine Research4, University of Groningen5, Wageningen University and Research Centre6, Max Planck Society7, Ludwig Maximilian University of Munich8, Commonwealth Scientific and Industrial Research Organisation9, Centre national de la recherche scientifique10, Stanford University11, Karlsruhe Institute of Technology12, Atlantic Oceanographic and Meteorological Laboratory13, Cooperative Institute for Marine and Atmospheric Studies14, Geophysical Institute, University of Bergen15, Bjerknes Centre for Climate Research16, Japan Agency for Marine-Earth Science and Technology17, University of Maryland, College Park18, National Institute of Water and Atmospheric Research19, National Oceanic and Atmospheric Administration20, Appalachian State University21, Flanders Marine Institute22, Augsburg College23, ETH Zurich24, Leibniz Institute of Marine Sciences25, University of East Anglia26, Woods Hole Research Center27, University of Illinois at Urbana–Champaign28, University of Hong Kong29, Netherlands Environmental Assessment Agency30, Utrecht University31, University of Paris32, Hobart Corporation33, University of Tasmania34, University of Bern35, National Center for Atmospheric Research36, University of Reading37, Cooperative Institute for Research in Environmental Sciences38, National Institute for Environmental Studies39, Russian Academy of Sciences40, Goddard Space Flight Center41, Leibniz Institute for Baltic Sea Research42, Princeton University43, Met Office44, Lund University45, Auburn University46, Food and Agriculture Organization47, VU University Amsterdam48
TL;DR: In this article, the authors describe data sets and methodology to quantify the five major components of the global carbon budget and their uncertainties, including emissions from land use and land use change, and show that the difference between the estimated total emissions and the estimated changes in the atmosphere, ocean, and terrestrial biosphere is a measure of imperfect data and understanding of the contemporary carbon cycle.
Abstract: Accurate assessment of anthropogenic carbon dioxide (CO2) emissions and their redistribution among the atmosphere, ocean, and terrestrial biosphere – the "global carbon budget" – is important to better understand the global carbon cycle, support the development of climate policies, and project future climate change. Here we describe data sets and methodology to quantify the five major components of the global carbon budget and their uncertainties. Fossil CO2 emissions (EFF) are based on energy statistics and cement production data, while emissions from land use change (ELUC), mainly deforestation, are based on land use and land use change data and bookkeeping models. Atmospheric CO2 concentration is measured directly and its growth rate (GATM) is computed from the annual changes in concentration. The ocean CO2 sink (SOCEAN) and terrestrial CO2 sink (SLAND) are estimated with global process models constrained by observations. The resulting carbon budget imbalance (BIM), the difference between the estimated total emissions and the estimated changes in the atmosphere, ocean, and terrestrial biosphere, is a measure of imperfect data and understanding of the contemporary carbon cycle. All uncertainties are reported as ±1σ. For the last decade available (2009–2018), EFF was 9.5±0.5 GtC yr⁻¹, ELUC 1.5±0.7 GtC yr⁻¹, GATM 4.9±0.02 GtC yr⁻¹ (2.3±0.01 ppm yr⁻¹), SOCEAN 2.5±0.6 GtC yr⁻¹, and SLAND 3.2±0.6 GtC yr⁻¹, with a budget imbalance BIM of 0.4 GtC yr⁻¹ indicating overestimated emissions and/or underestimated sinks. For the year 2018 alone, the growth in EFF was about 2.1% and fossil emissions increased to 10.0±0.5 GtC yr⁻¹, reaching 10 GtC yr⁻¹ for the first time in history; ELUC was 1.5±0.7 GtC yr⁻¹, for total anthropogenic CO2 emissions of 11.5±0.9 GtC yr⁻¹ (42.5±3.3 GtCO2). Also for 2018, GATM was 5.1±0.2 GtC yr⁻¹ (2.4±0.1 ppm yr⁻¹), SOCEAN was 2.6±0.6 GtC yr⁻¹, and SLAND was 3.5±0.7 GtC yr⁻¹, with a BIM of 0.3 GtC. The global atmospheric CO2 concentration reached 407.38±0.1 ppm averaged over 2018. For 2019, preliminary data for the first 6–10 months indicate a reduced growth in EFF of +0.6% (range of −0.2% to 1.5%) based on national emissions projections for China, the USA, the EU, and India and projections of gross domestic product corrected for recent changes in the carbon intensity of the economy for the rest of the world. Overall, the mean and trend in the five components of the global carbon budget are consistently estimated over the period 1959–2018, but discrepancies of up to 1 GtC yr⁻¹ persist for the representation of semi-decadal variability in CO2 fluxes. A detailed comparison among individual estimates and the introduction of a broad range of observations shows (1) no consensus in the mean and trend in land use change emissions over the last decade, (2) a persistent low agreement between the different methods on the magnitude of the land CO2 flux in the northern extra-tropics, and (3) an apparent underestimation of the CO2 variability by ocean models outside the tropics. This living data update documents changes in the methods and data sets used in this new global carbon budget and the progress in understanding of the global carbon cycle compared with previous publications of this data set (Le Quéré et al., 2018a, b, 2016, 2015a, b, 2014, 2013). The data generated by this work are available at https://doi.org/10.18160/gcp-2019 (Friedlingstein et al., 2019).
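The budget imbalance defined in the abstract is the residual of the identity BIM = (EFF + ELUC) − (GATM + SOCEAN + SLAND). Plugging in the reported values reproduces the stated figures, shown here as plain arithmetic:

```python
# 2009-2018 decadal means from the abstract, in GtC per year.
E_FF, E_LUC = 9.5, 1.5                  # fossil and land-use-change emissions
G_ATM, S_OCEAN, S_LAND = 4.9, 2.5, 3.2  # atmospheric growth, ocean sink, land sink

B_IM = (E_FF + E_LUC) - (G_ATM + S_OCEAN + S_LAND)
print(round(B_IM, 1))  # 0.4 GtC/yr, matching the reported decadal imbalance

# The same identity for 2018 alone:
B_IM_2018 = (10.0 + 1.5) - (5.1 + 2.6 + 3.5)
print(round(B_IM_2018, 1))  # 0.3 GtC, matching the reported 2018 imbalance
```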

Journal ArticleDOI
TL;DR: This Consensus Statement documents the central role and global importance of microorganisms in climate change biology and puts humanity on notice that the impact of climate change will depend heavily on responses of microorganisms, which are essential for achieving an environmentally sustainable future.
Abstract: In the Anthropocene, in which we now live, climate change is impacting most life on Earth. Microorganisms support the existence of all higher trophic life forms. To understand how humans and other life forms on Earth (including those we are yet to discover) can withstand anthropogenic climate change, it is vital to incorporate knowledge of the microbial 'unseen majority'. We must learn not just how microorganisms affect climate change (including production and consumption of greenhouse gases) but also how they will be affected by climate change and other human activities. This Consensus Statement documents the central role and global importance of microorganisms in climate change biology. It also puts humanity on notice that the impact of climate change will depend heavily on responses of microorganisms, which are essential for achieving an environmentally sustainable future.

Journal ArticleDOI
TL;DR: In this paper, the authors acknowledge support from the EU FET Open RIA Grant No 766566, the Ministry of Education of the Czech Republic Grant No LM2015087 and LNSM-LNSpin.
Abstract: A. M. was supported by the King Abdullah University of Science and Technology (KAUST). T. J. acknowledges support from the EU FET Open RIA Grant No. 766566, the Ministry of Education of the Czech Republic Grants No. LM2015087 and LNSM-LNSpin, and the Grant Agency of the Czech Republic Grant No. 19-28375X. J. S. acknowledges the Alexander von Humboldt Foundation, EU FET Open Grant No. 766566, EU ERC Synergy Grant No. 610115, and the Transregional Collaborative Research Center (SFB/TRR) 173 SPIN+X. K. G. and P. G. acknowledge stimulating discussions with C. O. Avci and financial support from the Swiss National Science Foundation (Grants No. 200021-153404 and No. 200020-172775) and the European Commission under the Seventh Framework Programme (spOt project, Grant No. 318144). A. T. acknowledges support from the Agence Nationale de la Recherche, Project No. ANR-17-CE24-0025 (TopSky). J. Ž. acknowledges the Grant Agency of the Czech Republic Grant No. 19-18623Y and support from the Institute of Physics of the Czech Academy of Sciences and the Max Planck Society through the Max Planck Partner Group programme.

Journal ArticleDOI
TL;DR: The current state of the art in available technologies for water purification is reviewed, and their fields of application for heavy metal ion removal are discussed, as heavy metal ions are the most harmful and widespread contaminants.
Abstract: Water pollution is a global problem threatening the entire biosphere and affecting the life of many millions of people around the world. Not only is water pollution one of the foremost global risk factors for illness, diseases and death, but it also contributes to the continuous reduction of the available drinkable water worldwide. Delivering valuable solutions, which are easy to implement and affordable, often remains a challenge. Here we review the current state-of-the-art of available technologies for water purification and discuss their field of application for heavy metal ion removal, as heavy metal ions are the most harmful and widespread contaminants. We consider each technology in the context of sustainability, a largely neglected key factor, which may actually play a pivotal role in the implementation of each technology in real applications, and we introduce a compact index, the Ranking Efficiency Product (REP), to evaluate the efficiency and ease of implementation of the various technologies in this broader perspective. Emerging technologies, for which a detailed quantitative analysis and assessment is not yet possible according to this methodology, either due to scarcity or inhomogeneity of data, are discussed in the final part of the manuscript.

Proceedings ArticleDOI
01 Oct 2019
TL;DR: This paper develops an end-to-end tracking architecture capable of fully exploiting both target and background appearance information for target model prediction; the architecture is derived from a discriminative learning loss by designing a dedicated optimization process that can predict a powerful model in only a few iterations.
Abstract: The current strive towards end-to-end trainable computer vision systems imposes major challenges for the task of visual tracking. In contrast to most other vision problems, tracking requires the learning of a robust target-specific appearance model online, during the inference stage. To be end-to-end trainable, the online learning of the target model thus needs to be embedded in the tracking architecture itself. Due to the imposed challenges, the popular Siamese paradigm simply predicts a target feature template, while ignoring the background appearance information during inference. Consequently, the predicted model possesses limited target-background discriminability. We develop an end-to-end tracking architecture, capable of fully exploiting both target and background appearance information for target model prediction. Our architecture is derived from a discriminative learning loss by designing a dedicated optimization process that is capable of predicting a powerful model in only a few iterations. Furthermore, our approach is able to learn key aspects of the discriminative loss itself. The proposed tracker sets a new state-of-the-art on 6 tracking benchmarks, achieving an EAO score of 0.440 on VOT2018, while running at over 40 FPS. The code and models are available at https://github.com/visionml/pytracking.

Proceedings ArticleDOI
15 Jun 2019
TL;DR: ATOM as discussed by the authors proposes a novel tracking architecture consisting of dedicated target estimation and classification components; the estimation component is trained to predict the overlap between the target object and an estimated bounding box.
Abstract: While recent years have witnessed astonishing improvements in visual tracking robustness, the advancements in tracking accuracy have been limited. As the focus has been directed towards the development of powerful classifiers, the problem of accurate target state estimation has been largely overlooked. In fact, most trackers resort to a simple multi-scale search in order to estimate the target bounding box. We argue that this approach is fundamentally limited since target estimation is a complex task, requiring high-level knowledge about the object. We address this problem by proposing a novel tracking architecture, consisting of dedicated target estimation and classification components. High level knowledge is incorporated into the target estimation through extensive offline learning. Our target estimation component is trained to predict the overlap between the target object and an estimated bounding box. By carefully integrating target-specific information, our approach achieves previously unseen bounding box accuracy. We further introduce a classification component that is trained online to guarantee high discriminative power in the presence of distractors. Our final tracking framework sets a new state-of-the-art on five challenging benchmarks. On the new large-scale TrackingNet dataset, our tracker ATOM achieves a relative gain of 15% over the previous best approach, while running at over 30 FPS. Code and models are available at https://github.com/visionml/pytracking.

Journal ArticleDOI
Andrea Cossarizza, Hyun-Dong Chang, Andreas Radbruch, Andreas Acs, +459 more (160 institutions)
TL;DR: These guidelines are a consensus work of a considerable number of members of the immunology and flow cytometry community, providing the theory and key practical aspects of flow cytometry and enabling immunologists to avoid the common errors that often undermine immunological data.
Abstract: These guidelines are a consensus work of a considerable number of members of the immunology and flow cytometry community. They provide the theory and key practical aspects of flow cytometry enabling immunologists to avoid the common errors that often undermine immunological data. Notably, there are comprehensive sections of all major immune cell types with helpful Tables detailing phenotypes in murine and human cells. The latest flow cytometry techniques and applications are also described, featuring examples of the data that can be generated and, importantly, how the data can be analysed. Furthermore, there are sections detailing tips, tricks and pitfalls to avoid, all written and peer-reviewed by leading experts in the field, making this an essential research companion.

Journal ArticleDOI
16 Jan 2019
TL;DR: In this paper, a method is presented for training a neural network policy in simulation and transferring it to a state-of-the-art legged system. Prior reinforcement learning research for legged robots has been largely limited to simulation, with only a few comparably simple examples deployed on real systems.
Abstract: Legged robots pose one of the greatest challenges in robotics. Dynamic and agile maneuvers of animals cannot be imitated by existing methods that are crafted by humans. A compelling alternative is reinforcement learning, which requires minimal craftsmanship and promotes the natural evolution of a control policy. However, so far, reinforcement learning research for legged robots has mainly been limited to simulation, and only a few comparably simple examples have been deployed on real systems. The primary reason is that training with real robots, particularly with dynamically balancing systems, is complicated and expensive. In the present work, we introduce a method for training a neural network policy in simulation and transferring it to a state-of-the-art legged system, thereby leveraging fast, automated, and cost-effective data generation schemes. The approach is applied to the ANYmal robot, a sophisticated medium-dog-sized quadrupedal system. Using policies trained in simulation, the quadrupedal machine achieves locomotion skills that go beyond what had been achieved with prior methods: ANYmal is capable of precisely and energy-efficiently following high-level body velocity commands, running faster than before, and recovering from falling even in complex configurations.

Journal ArticleDOI
TL;DR: The numerous beneficial effects of GLP-1 render this hormone an interesting candidate for the development of pharmacotherapies to treat obesity, diabetes, and neurodegenerative disorders.
Abstract: Background The glucagon-like peptide-1 (GLP-1) is a multifaceted hormone with broad pharmacological potential. Among the numerous metabolic effects of GLP-1 are the glucose-dependent stimulation of insulin secretion, decrease of gastric emptying, inhibition of food intake, increase of natriuresis and diuresis, and modulation of rodent β-cell proliferation. GLP-1 also has cardio- and neuroprotective effects, decreases inflammation and apoptosis, and has implications for learning and memory, reward behavior, and palatability. Biochemically modified for enhanced potency and sustained action, GLP-1 receptor agonists are successfully in clinical use for the treatment of type-2 diabetes, and several GLP-1-based pharmacotherapies are in clinical evaluation for the treatment of obesity. Scope of review In this review, we provide a detailed overview on the multifaceted nature of GLP-1 and its pharmacology and discuss its therapeutic implications on various diseases. Major conclusions Since its discovery, GLP-1 has emerged as a pleiotropic hormone with a myriad of metabolic functions that go well beyond its classical identification as an incretin hormone. The numerous beneficial effects of GLP-1 render this hormone an interesting candidate for the development of pharmacotherapies to treat obesity, diabetes, and neurodegenerative disorders.

Journal ArticleDOI
06 Feb 2019-Nature
TL;DR: The authors show that circulating tumour cells can be found in association with neutrophils, an interaction which supports their proliferation and their ability to seed metastasis, providing a rationale for targeting this interaction in treatment of breast cancer.
Abstract: A better understanding of the features that define the interaction between cancer cells and immune cells is important for the development of new cancer therapies1. However, focus is often given to interactions that occur within the primary tumour and its microenvironment, whereas the role of immune cells during cancer dissemination in patients remains largely uncharacterized2,3. Circulating tumour cells (CTCs) are precursors of metastasis in several types of cancer4–6, and are occasionally found within the bloodstream in association with non-malignant cells such as white blood cells (WBCs)7,8. The identity and function of these CTC-associated WBCs, as well as the molecular features that define the interaction between WBCs and CTCs, are unknown. Here we isolate and characterize individual CTC-associated WBCs, as well as corresponding cancer cells within each CTC–WBC cluster, from patients with breast cancer and from mouse models. We use single-cell RNA sequencing to show that in the majority of these cases, CTCs were associated with neutrophils. When comparing the transcriptome profiles of CTCs associated with neutrophils against those of CTCs alone, we detect a number of differentially expressed genes that outline cell cycle progression, leading to more efficient metastasis formation. Further, we identify cell–cell junction and cytokine–receptor pairs that define CTC–neutrophil clusters, representing key vulnerabilities of the metastatic process. Thus, the association between neutrophils and CTCs drives cell cycle progression within the bloodstream and expands the metastatic potential of CTCs, providing a rationale for targeting this interaction in treatment of breast cancer.

Journal ArticleDOI
TL;DR: Scientists working on the response of bacteria to antibiotics define antibiotic persistence, distinguish it from resistance and tolerance, and provide practical guidance on how to study and measure bacterial persister cells.
Abstract: Increasing concerns about the rising rates of antibiotic therapy failure and advances in single-cell analyses have inspired a surge of research into antibiotic persistence. Bacterial persister cells represent a subpopulation of cells that can survive intensive antibiotic treatment without being resistant. Several approaches have emerged to define and measure persistence, and it is now time to agree on the basic definition of persistence and its relation to the other mechanisms by which bacteria survive exposure to bactericidal antibiotic treatments, such as antibiotic resistance, heteroresistance or tolerance. In this Consensus Statement, we provide definitions of persistence phenomena, distinguish between triggered and spontaneous persistence and provide a guide to measuring persistence. Antibiotic persistence is not only an interesting example of non-genetic single-cell heterogeneity, it may also have a role in the failure of antibiotic treatments. Therefore, it is our hope that the guidelines outlined in this article will pave the way for better characterization of antibiotic persistence and for understanding its relevance to clinical outcomes.

Journal ArticleDOI
TL;DR: A meta-analysis of eight geographically and technically diverse fecal shotgun metagenomic studies of colorectal cancer identified a core set of 29 species significantly enriched in CRC metagenomes, establishing globally generalizable, predictive taxonomic and functional microbiome CRC signatures as a basis for future diagnostics.
Abstract: Association studies have linked microbiome alterations with many human diseases. However, they have not always reported consistent results, thereby necessitating cross-study comparisons. Here, a meta-analysis of eight geographically and technically diverse fecal shotgun metagenomic studies of colorectal cancer (CRC, n = 768), which was controlled for several confounders, identified a core set of 29 species significantly enriched in CRC metagenomes (false discovery rate (FDR) < 1 × 10−5). CRC signatures derived from single studies maintained their accuracy in other studies. By training on multiple studies, we improved detection accuracy and disease specificity for CRC. Functional analysis of CRC metagenomes revealed enriched protein and mucin catabolism genes and depleted carbohydrate degradation genes. Moreover, we inferred elevated production of secondary bile acids from CRC metagenomes, suggesting a metabolic link between cancer-associated gut microbes and a fat- and meat-rich diet. Through extensive validations, this meta-analysis firmly establishes globally generalizable, predictive taxonomic and functional microbiome CRC signatures as a basis for future diagnostics. Cross-study analysis defines fecal microbial species associated with colorectal cancer.

Posted Content
TL;DR: SuperGlue as discussed by the authors matches two sets of local features by jointly finding correspondences and rejecting non-matchable points by solving a differentiable optimal transport problem, whose costs are predicted by a graph neural network.
Abstract: This paper introduces SuperGlue, a neural network that matches two sets of local features by jointly finding correspondences and rejecting non-matchable points. Assignments are estimated by solving a differentiable optimal transport problem, whose costs are predicted by a graph neural network. We introduce a flexible context aggregation mechanism based on attention, enabling SuperGlue to reason about the underlying 3D scene and feature assignments jointly. Compared to traditional, hand-designed heuristics, our technique learns priors over geometric transformations and regularities of the 3D world through end-to-end training from image pairs. SuperGlue outperforms other learned approaches and achieves state-of-the-art results on the task of pose estimation in challenging real-world indoor and outdoor environments. The proposed method performs matching in real-time on a modern GPU and can be readily integrated into modern SfM or SLAM systems. The code and trained weights are publicly available at this https URL.
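SuperGlue estimates assignments by solving a differentiable optimal transport problem. A minimal sketch of that underlying idea, iteratively normalising a score matrix into a doubly-stochastic soft assignment with Sinkhorn iterations, is shown below; this omits the paper's learned, attention-based costs and the "dustbin" rows and columns that absorb unmatched points, so it is an illustration of the transport layer only, not of SuperGlue itself.

```python
import numpy as np

def sinkhorn(scores, n_iters=50):
    """Turn a pairwise score matrix into a soft assignment matrix by
    alternating row and column normalisation in log space."""
    log_p = scores.astype(float).copy()
    for _ in range(n_iters):
        log_p -= np.log(np.exp(log_p).sum(axis=1, keepdims=True))  # rows
        log_p -= np.log(np.exp(log_p).sum(axis=0, keepdims=True))  # cols
    return np.exp(log_p)

# Toy 3x3 similarity matrix between two sets of local features; the
# strong diagonal should produce a near-identity assignment.
scores = np.array([[5.0, 0.0, 0.0],
                   [0.0, 5.0, 0.0],
                   [0.0, 0.0, 5.0]])
P = sinkhorn(scores)
matches = P.argmax(axis=1)
```

Because every step is differentiable, gradients flow through the assignment back into the network that predicts the costs, which is what makes end-to-end training of the matcher possible.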

Journal ArticleDOI
TL;DR: It is shown that human movement patterns explain the spread of both Aedes aegypti and Aedes albopictus in Europe and the United States following their introduction and predicted the future distributions of both species in response to accelerating urbanization, connectivity and climate change.
Abstract: The global population at risk from mosquito-borne diseases-including dengue, yellow fever, chikungunya and Zika-is expanding in concert with changes in the distribution of two key vectors: Aedes aegypti and Aedes albopictus. The distribution of these species is largely driven by both human movement and the presence of suitable climate. Using statistical mapping techniques, we show that human movement patterns explain the spread of both species in Europe and the United States following their introduction. We find that the spread of Ae. aegypti is characterized by long distance importations, while Ae. albopictus has expanded more along the fringes of its distribution. We describe these processes and predict the future distributions of both species in response to accelerating urbanization, connectivity and climate change. Global surveillance and control efforts that aim to mitigate the spread of chikungunya, dengue, yellow fever and Zika viruses must consider the so far unabated spread of these mosquitoes. Our maps and predictions offer an opportunity to strategically target surveillance and control programmes and thereby augment efforts to reduce arbovirus burden in human populations globally.

Proceedings ArticleDOI
15 Jun 2019
TL;DR: This work proposes an approach where a single convolutional neural network plays a dual role: It is simultaneously a dense feature descriptor and a feature detector, and shows that this model can be trained using pixel correspondences extracted from readily available large-scale SfM reconstructions, without any further annotations.
Abstract: In this work we address the problem of finding reliable pixel-level correspondences under difficult imaging conditions. We propose an approach where a single convolutional neural network plays a dual role: It is simultaneously a dense feature descriptor and a feature detector. By postponing the detection to a later stage, the obtained keypoints are more stable than their traditional counterparts based on early detection of low-level structures. We show that this model can be trained using pixel correspondences extracted from readily available large-scale SfM reconstructions, without any further annotations. The proposed method obtains state-of-the-art performance on both the difficult Aachen Day-Night localization dataset and the InLoc indoor localization benchmark, as well as competitive performance on other benchmarks for image matching and 3D reconstruction.
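The describe-and-detect idea above, deriving keypoints from a dense descriptor map late in the pipeline rather than from early low-level structures, can be caricatured in a few lines. This toy version (channel-max saliency followed by 3x3 non-maximum suppression) is an illustration of postponed detection under our own simplifying assumptions, not the paper's actual soft detection score.

```python
import numpy as np

def detect_from_features(fmap, threshold=0.5):
    """Toy late detection: treat the per-pixel channel maximum of a dense
    feature map (C, H, W) as a saliency score and keep 3x3 local maxima
    above a threshold as keypoint locations."""
    saliency = fmap.max(axis=0)
    h, w = saliency.shape
    keypoints = []
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            patch = saliency[i - 1:i + 2, j - 1:j + 2]
            if saliency[i, j] >= threshold and saliency[i, j] == patch.max():
                keypoints.append((i, j))
    return keypoints

# Hypothetical 4-channel feature map with one strong response at (2, 3);
# the same map would also serve as the per-pixel descriptor.
fmap = np.zeros((4, 5, 6))
fmap[1, 2, 3] = 1.0
kps = detect_from_features(fmap)
```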

Journal ArticleDOI
03 Jun 2019-Nature
TL;DR: High-throughput genetic analyses and mass spectrometry reveal that the gene products of diverse human gut bacteria affect a wide range of oral drugs, as well as drug metabolism in mice, which has implications for medical therapy and drug development across multiple disease indications.
Abstract: Individuals vary widely in their responses to medicinal drugs, which can be dangerous and expensive owing to treatment delays and adverse effects. Although increasing evidence implicates the gut microbiome in this variability, the molecular mechanisms involved remain largely unknown. Here we show, by measuring the ability of 76 human gut bacteria from diverse clades to metabolize 271 orally administered drugs, that many drugs are chemically modified by microorganisms. We combined high-throughput genetic analyses with mass spectrometry to systematically identify microbial gene products that metabolize drugs. These microbiome-encoded enzymes can directly and substantially affect intestinal and systemic drug metabolism in mice, and can explain the drug-metabolizing activities of human gut bacteria and communities on the basis of their genomic contents. These causal links between the gene content and metabolic activities of the microbiota connect interpersonal variability in microbiomes to interpersonal differences in drug metabolism, which has implications for medical therapy and drug development across multiple disease indications.

Journal ArticleDOI
TL;DR: Temporal Segment Networks (TSN) as discussed by the authors is proposed to model long-range temporal structure with a new segment-based sampling and aggregation scheme, which enables the TSN framework to efficiently learn action models by using the whole video.
Abstract: We present a general and flexible video-level framework for learning action models in videos. This method, called temporal segment network (TSN), aims to model long-range temporal structure with a new segment-based sampling and aggregation scheme. This unique design enables the TSN framework to efficiently learn action models by using the whole video. The learned models could be easily deployed for action recognition in both trimmed and untrimmed videos with simple average pooling and multi-scale temporal window integration, respectively. We also study a series of good practices for the implementation of the TSN framework given limited training samples. Our approach obtains state-of-the-art performance on five challenging action recognition benchmarks: HMDB51 (71.0 percent), UCF101 (94.9 percent), THUMOS14 (80.1 percent), ActivityNet v1.2 (89.6 percent), and Kinetics400 (75.7 percent). In addition, using the proposed RGB difference as a simple motion representation, our method can still achieve competitive accuracy on UCF101 (91.0 percent) while running at 340 FPS. Furthermore, based on the proposed TSN framework, we won the video classification track at the ActivityNet challenge 2016 among 24 teams.
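TSN's segment-based sampling and average-pooling consensus can be sketched as follows. This is a simplified illustration: precomputed frame-level logits stand in for the per-snippet outputs of the shared network, whereas the real framework samples short snippets and runs a ConvNet on each before aggregating.

```python
import numpy as np

def tsn_consensus(frame_logits, num_segments=3, seed=0):
    """Split frame-level logits (T, num_classes) into equal temporal
    segments, sample one frame per segment, and average-pool the sampled
    predictions into a video-level score."""
    rng = np.random.default_rng(seed)
    t = len(frame_logits)
    bounds = np.linspace(0, t, num_segments + 1).astype(int)
    sampled = [frame_logits[rng.integers(bounds[k], bounds[k + 1])]
               for k in range(num_segments)]
    return np.mean(sampled, axis=0)

# Toy video: 12 "frames", 2 classes, class 1 consistently stronger, so
# the segmental consensus should also favour class 1.
logits = np.tile(np.array([0.2, 0.8]), (12, 1))
video_scores = tsn_consensus(logits)
pred = int(video_scores.argmax())
```

Because only one snippet per segment is processed, the whole video contributes to the video-level supervision at roughly the cost of a fixed, small number of forward passes.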