
Showing papers by "University of Notre Dame", published in 2020


Journal ArticleDOI
TL;DR: The quantification of SARS-CoV-2 in wastewater affords the ability to monitor the prevalence of infections among the population via wastewater-based epidemiology (WBE) and highlights the viability of WBE for monitoring infectious diseases, such as COVID-19, in communities.

1,325 citations


Journal ArticleDOI
TL;DR: The most recent data release from the Sloan Digital Sky Surveys is DR16, the fourth and penultimate release from the fourth phase of the survey (SDSS-IV).
Abstract: This paper documents the sixteenth data release (DR16) from the Sloan Digital Sky Surveys; the fourth and penultimate from the fourth phase (SDSS-IV). This is the first release of data from the southern hemisphere survey of the Apache Point Observatory Galactic Evolution Experiment 2 (APOGEE-2); new data from APOGEE-2 North are also included. DR16 is also notable as the final data release for the main cosmological program of the Extended Baryon Oscillation Spectroscopic Survey (eBOSS), and all raw and reduced spectra from that project are released here. DR16 also includes all the data from the Time Domain Spectroscopic Survey (TDSS) and new data from the SPectroscopic IDentification of ERosita Survey (SPIDERS) programs, both of which were co-observed on eBOSS plates. DR16 has no new data from the Mapping Nearby Galaxies at Apache Point Observatory (MaNGA) survey (or the MaNGA Stellar Library "MaStar"). We also preview future SDSS-V operations (due to start in 2020), and summarize plans for the final SDSS-IV data release (DR17).

803 citations


Journal ArticleDOI
TL;DR: There is an urgent need for further research to establish methodologies for wastewater surveillance and understand the implications of the presence of SARS-CoV-2 in wastewater.

572 citations


Journal ArticleDOI
24 Aug 2020
TL;DR: In this paper, the authors review the basic physical principles by which these various techniques engineer quasi-particle and optical bandgaps, the resulting bandgap tunability, and the potential and limitations of each technique in practical 2D device technologies.
Abstract: Semiconductors are the basis of many vital technologies such as electronics, computing, communications, optoelectronics, and sensing. Modern semiconductor technology can trace its origins to the invention of the point contact transistor in 1947. This demonstration paved the way for the development of discrete and integrated semiconductor devices and circuits that has helped to build a modern society where semiconductors are ubiquitous components of everyday life. A key property that determines the semiconductor electrical and optical properties is the bandgap. Beyond graphene, recently discovered two-dimensional (2D) materials possess semiconducting bandgaps ranging from the terahertz and mid-infrared in bilayer graphene and black phosphorus, visible in transition metal dichalcogenides, to the ultraviolet in hexagonal boron nitride. In particular, these 2D materials were demonstrated to exhibit highly tunable bandgaps, achieved via the control of layers number, heterostructuring, strain engineering, chemical doping, alloying, intercalation, substrate engineering, as well as an external electric field. We provide a review of the basic physical principles of these various techniques on the engineering of quasi-particle and optical bandgaps, their bandgap tunability, potentials and limitations in practical realization in future 2D device technologies.

434 citations


Journal ArticleDOI
TL;DR: PySCF as mentioned in this paper is a Python-based general-purpose electronic structure platform that supports first-principles simulations of molecules and solids as well as accelerates the development of new methodology and complex computational workflows.
Abstract: PySCF is a Python-based general-purpose electronic structure platform that supports first-principles simulations of molecules and solids as well as accelerates the development of new methodology and complex computational workflows. This paper explains the design and philosophy behind PySCF that enables it to meet these twin objectives. With several case studies, we show how users can easily implement their own methods using PySCF as a development environment. We then summarize the capabilities of PySCF for molecular and solid-state simulations. Finally, we describe the growing ecosystem of projects that use PySCF across the domains of quantum chemistry, materials science, machine learning, and quantum information science.

374 citations



Journal ArticleDOI
TL;DR: Three RT-ddPCR assays were used to detect severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) RNA in weekly samples from nine WWTPs in southeastern Virginia, and fluctuations in population normalized loading rates agreed with known outbreaks during the study.

351 citations



Journal ArticleDOI
TL;DR: In this article, a deep neural network (DNN) is used to enforce the initial and boundary conditions, and the governing partial differential equations (i.e., Navier-Stokes equations) are incorporated into the loss of the DNN to drive the training.

341 citations
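The loss construction described in the TL;DR above can be illustrated with a toy problem. The sketch below fits u'(x) = -u(x) with u(0) = 1 on [0, 1]: the initial condition is hard-enforced by the form of the trial solution, and the governing equation enters only through the squared-residual loss. The one-parameter ansatz, collocation grid, and plain gradient descent are illustrative assumptions; the paper itself uses a deep neural network and the Navier-Stokes equations.

```python
# Toy physics-constrained fit for u'(x) = -u(x), u(0) = 1 on [0, 1].
# Trial solution u_theta(x) = 1 + theta * x satisfies u(0) = 1 by
# construction (hard-enforced condition); the PDE drives the loss.

def residual(theta, x):
    # r(x) = u' + u, with u = 1 + theta*x and u' = theta (exact derivative).
    return theta + 1.0 + theta * x

def pde_loss(theta, xs):
    # Mean squared equation residual over the collocation points.
    return sum(residual(theta, x) ** 2 for x in xs) / len(xs)

def train(xs, lr=0.1, steps=200):
    theta = 0.0
    for _ in range(steps):
        # Analytic gradient of the mean squared residual w.r.t. theta.
        grad = 2.0 * sum(residual(theta, x) * (1.0 + x) for x in xs) / len(xs)
        theta -= lr * grad
    return theta

collocation = [0.0, 0.25, 0.5, 0.75, 1.0]
theta = train(collocation)
u = lambda x: 1.0 + theta * x  # u(0) = 1 holds exactly, never learned
```

Because the condition is built into the trial function, the optimizer only has to drive the equation residual down, which is the division of labor the TL;DR describes.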


Journal ArticleDOI
Abstract: Author(s): Bivins, Aaron; North, Devin; Ahmad, Arslan; Ahmed, Warish; Alm, Eric; Been, Frederic; Bhattacharya, Prosun; Bijlsma, Lubertus; Boehm, Alexandria B; Brown, Joe; Buttiglieri, Gianluigi; Calabro, Vincenza; Carducci, Annalaura; Castiglioni, Sara; Cetecioglu Gurol, Zeynep; Chakraborty, Sudip; Costa, Federico; Curcio, Stefano; de Los Reyes, Francis L; Delgado Vela, Jeseth; Farkas, Kata; Fernandez-Casi, Xavier; Gerba, Charles; Gerrity, Daniel; Girones, Rosina; Gonzalez, Raul; Haramoto, Eiji; Harris, Angela; Holden, Patricia A; Islam, Md Tahmidul; Jones, Davey L; Kasprzyk-Hordern, Barbara; Kitajima, Masaaki; Kotlarz, Nadine; Kumar, Manish; Kuroda, Keisuke; La Rosa, Giuseppina; Malpei, Francesca; Mautus, Mariana; McLellan, Sandra L; Medema, Gertjan; Meschke, John Scott; Mueller, Jochen; Newton, Ryan J; Nilsson, David; Noble, Rachel T; van Nuijs, Alexander; Peccia, Jordan; Perkins, T Alex; Pickering, Amy J; Rose, Joan; Sanchez, Gloria; Smith, Adam; Stadler, Lauren; Stauber, Christine; Thomas, Kevin; van der Voorn, Tom; Wigginton, Krista; Zhu, Kevin; Bibby, Kyle

325 citations


Journal ArticleDOI
19 Oct 2020
TL;DR: In this article, the authors examine the potential of the ferroelectric field-effect transistor technologies in current embedded non-volatile memory applications and future in-memory, biomimetic and alternative computing models.
Abstract: The discovery of ferroelectricity in oxides that are compatible with modern semiconductor manufacturing processes, such as hafnium oxide, has led to a re-emergence of the ferroelectric field-effect transistor in advanced microelectronics. A ferroelectric field-effect transistor combines a ferroelectric material with a semiconductor in a transistor structure. In doing so, it merges logic and memory functionalities at the single-device level, delivering some of the most pressing hardware-level demands for emerging computing paradigms. Here, we examine the potential of the ferroelectric field-effect transistor technologies in current embedded non-volatile memory applications and future in-memory, biomimetic and alternative computing models. We highlight the material- and device-level challenges involved in high-volume manufacturing in advanced technology nodes (≤10 nm), which are reminiscent of those encountered in the early days of high-K-metal-gate transistor development. We argue that the ferroelectric field-effect transistors can be a key hardware component in the future of computing, providing a new approach to electronics that we term ferroelectronics. This Perspective examines the use of ferroelectric field-effect transistor technologies in current embedded non-volatile memory applications and future in-memory, biomimetic and alternative computing models, arguing that the devices will be a key component in the development of data-centric computing.

Journal ArticleDOI
TL;DR: SARS-CoV-2 RNA was found to be significantly more persistent than infectious SARS-CoV-2, indicating that the environmental detection of RNA alone does not substantiate risk of infection.
Abstract: Severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) RNA is frequently detected in the feces of infected individuals. While infectious SARS-CoV-2 has not previously been identified in waste...

Journal ArticleDOI
TL;DR: In this paper, the state of the art, the current and future challenges, and the advances in science and technology needed to meet those challenges are presented.
Abstract: Plasma catalysis is gaining increasing interest for various gas conversion applications, such as CO2 conversion into value-added chemicals and fuels, CH4 activation into hydrogen, higher hydrocarbons or oxygenates, and NH3 synthesis. Other applications are already more established, such as air pollution control, e.g. volatile organic compound remediation, particulate matter and NOx removal. In addition, plasma is also very promising for catalyst synthesis and treatment. Plasma catalysis clearly has benefits over 'conventional' catalysis, as outlined in the Introduction. However, a better insight into the underlying physical and chemical processes is crucial. This can be obtained by experiments applying diagnostics, studying both the chemical processes at the catalyst surface and the physicochemical mechanisms of plasma-catalyst interactions, as well as by computer modeling. The key challenge is to design cost-effective, highly active and stable catalysts tailored to the plasma environment. Therefore, insight from thermal catalysis as well as electro- and photocatalysis is crucial. All these aspects are covered in this Roadmap paper, written by specialists in their field, presenting the state of the art, the current and future challenges, as well as the advances in science and technology needed to meet these challenges.

Proceedings ArticleDOI
01 Jul 2020
TL;DR: RoBERTa reduces an end-to-end LibriSpeech model’s WER by 30% relative and adds up to +1.7 BLEU on state-of-the-art baselines for low-resource translation pairs, with further gains from domain adaptation.
Abstract: Pretrained masked language models (MLMs) require finetuning for most NLP tasks. Instead, we evaluate MLMs out of the box via their pseudo-log-likelihood scores (PLLs), which are computed by masking tokens one by one. We show that PLLs outperform scores from autoregressive language models like GPT-2 in a variety of tasks. By rescoring ASR and NMT hypotheses, RoBERTa reduces an end-to-end LibriSpeech model’s WER by 30% relative and adds up to +1.7 BLEU on state-of-the-art baselines for low-resource translation pairs, with further gains from domain adaptation. We attribute this success to PLL’s unsupervised expression of linguistic acceptability without a left-to-right bias, greatly improving on scores from GPT-2 (+10 points on island effects, NPI licensing in BLiMP). One can finetune MLMs to give scores without masking, enabling computation in a single inference pass. In all, PLLs and their associated pseudo-perplexities (PPPLs) enable plug-and-play use of the growing number of pretrained MLMs; e.g., we use a single cross-lingual model to rescore translations in multiple languages. We release our library for language model scoring at https://github.com/awslabs/mlm-scoring.
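The mask-one-token-at-a-time procedure behind PLLs can be sketched in a few lines. The real method scores each masked position with a pretrained MLM such as RoBERTa; the `toy_mlm_logprob` stand-in below (a uniform distribution over a tiny vocabulary) is an illustrative assumption so the sketch stays self-contained.

```python
import math

MASK = "[MASK]"

def toy_mlm_logprob(masked_tokens, position, target):
    """Stand-in for an MLM: log P(target | sentence with position masked).
    Uniform over a toy 5-word vocabulary, so every in-vocabulary prediction
    contributes log(1/5); a real model would condition on the context."""
    vocab = {"the", "cat", "sat", "on", "mat"}
    return math.log(1.0 / len(vocab)) if target in vocab else math.log(1e-6)

def pseudo_log_likelihood(tokens, mlm_logprob):
    """PLL: mask each token in turn and sum the model's log-probability
    of the held-out token given the rest of the sentence."""
    pll = 0.0
    for i, tok in enumerate(tokens):
        masked = tokens[:i] + [MASK] + tokens[i + 1:]
        pll += mlm_logprob(masked, i, tok)
    return pll

sentence = ["the", "cat", "sat", "on", "the", "mat"]
score = pseudo_log_likelihood(sentence, toy_mlm_logprob)
# Pseudo-perplexity (PPPL) is exp(-PLL / number of tokens).
pppl = math.exp(-score / len(sentence))
```

Note the cost implication visible in the loop: scoring a sentence of N tokens takes N forward passes, which is why the paper's option to finetune MLMs to score without masking (a single inference pass) matters for rescoring.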

Journal ArticleDOI
Albert M. Sirunyan1, Armen Tumasyan1, Wolfgang Adam, Federico Ambrogi  +2248 moreInstitutions (155)
TL;DR: For the first time, predictions from pythia8 obtained with tunes based on NLO or NNLO PDFs are shown to reliably describe minimum-bias and underlying-event data with a similar level of agreement to predictions from tunes using LO PDF sets.
Abstract: New sets of CMS underlying-event parameters (“tunes”) are presented for the pythia8 event generator. These tunes use the NNPDF3.1 parton distribution functions (PDFs) at leading (LO), next-to-leading (NLO), or next-to-next-to-leading (NNLO) orders in perturbative quantum chromodynamics, and the strong coupling evolution at LO or NLO. Measurements of charged-particle multiplicity and transverse momentum densities at various hadron collision energies are fit simultaneously to determine the parameters of the tunes. Comparisons of the predictions of the new tunes are provided for observables sensitive to the event shapes at LEP, global underlying event, soft multiparton interactions, and double-parton scattering contributions. In addition, comparisons are made for observables measured in various specific processes, such as multijet, Drell–Yan, and top quark-antiquark pair production including jet substructure observables. The simulation of the underlying event provided by the new tunes is interfaced to a higher-order matrix-element calculation. For the first time, predictions from pythia8 obtained with tunes based on NLO or NNLO PDFs are shown to reliably describe minimum-bias and underlying-event data with a similar level of agreement to predictions from tunes using LO PDF sets.


Journal ArticleDOI
TL;DR: In this article, a physics-informed neural network (PINN) algorithm for solving brittle fracture problems is presented. However, the proposed approach is demonstrated on only two problems and does not carry over directly to others.

Journal ArticleDOI
TL;DR: This comprehensive review will stimulate the design of new experiments and guide the selection of appropriate constitutive models for specific applications, and propose appropriate mechanical modeling approaches that are as complex as necessary but as simple as possible.
Abstract: Brain tissue is not only one of the most important but also the most complex and compliant tissue in the human body. While long underestimated, increasing evidence confirms that mechanics plays a critical role in modulating brain function and dysfunction. Computational simulations–based on the field equations of nonlinear continuum mechanics–can provide important insights into the underlying mechanisms of brain injury and disease that go beyond the possibilities of traditional diagnostic tools. Realistic numerical predictions, however, require mechanical models that are capable of capturing the complex and unique characteristics of this ultrasoft, heterogeneous, and active tissue. In recent years, contradictory experimental results have caused confusion and hindered rapid progress. In this review, we carefully assess the challenges associated with brain tissue testing and modeling, and work out the most important characteristics of brain tissue behavior on different length and time scales. Depending on the application of interest, we propose appropriate mechanical modeling approaches that are as complex as necessary but as simple as possible. This comprehensive review will, on the one hand, stimulate the design of new experiments and, on the other hand, guide the selection of appropriate constitutive models for specific applications. Mechanical models that capture the complex behavior of nervous tissues and are accurately calibrated with reliable and comprehensive experimental data are key to performing reliable predictive simulations. Ultimately, mathematical modeling and computational simulations of the brain are useful for both biomedical and clinical communities, and cover a wide range of applications ranging from predicting disease progression and estimating injury risk to planning surgical procedures.

Journal ArticleDOI
TL;DR: In this paper, an auto-regressive dense encoder-decoder convolutional neural network is proposed to solve and model non-linear dynamical systems without training data, at a computational cost potentially orders of magnitude lower than that of standard numerical solvers.

Posted Content
TL;DR: AdaBelief is proposed to simultaneously achieve three goals: fast convergence as in adaptive methods, good generalization as in SGD, and training stability; it outperforms other methods with fast convergence and high accuracy on image classification and language modeling.
Abstract: Most popular optimizers for deep learning can be broadly categorized as adaptive methods (e.g. Adam) and accelerated schemes (e.g. stochastic gradient descent (SGD) with momentum). For many models such as convolutional neural networks (CNNs), adaptive methods typically converge faster but generalize worse compared to SGD; for complex settings such as generative adversarial networks (GANs), adaptive methods are typically the default because of their stability. We propose AdaBelief to simultaneously achieve three goals: fast convergence as in adaptive methods, good generalization as in SGD, and training stability. The intuition for AdaBelief is to adapt the stepsize according to the "belief" in the current gradient direction. Viewing the exponential moving average (EMA) of the noisy gradient as the prediction of the gradient at the next time step, if the observed gradient greatly deviates from the prediction, we distrust the current observation and take a small step; if the observed gradient is close to the prediction, we trust it and take a large step. We validate AdaBelief in extensive experiments, showing that it outperforms other methods with fast convergence and high accuracy on image classification and language modeling. Specifically, on ImageNet, AdaBelief achieves comparable accuracy to SGD. Furthermore, in the training of a GAN on Cifar10, AdaBelief demonstrates high stability and improves the quality of generated samples compared to a well-tuned Adam optimizer. Code is available at this https URL
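The "belief" intuition in the abstract maps onto a small change to Adam: track the EMA of the squared deviation between the observed gradient and its EMA prediction, instead of the EMA of the squared gradient. The scalar-parameter sketch below is an illustrative re-derivation from that description (hyperparameter names follow Adam's conventions), not the authors' released code.

```python
import math

def adabelief_minimize(grad, theta, lr=0.1, beta1=0.9, beta2=0.999,
                       eps=1e-8, steps=400):
    """Minimize a 1-D objective given its gradient function."""
    m = 0.0  # EMA of gradients: the "prediction" of the next gradient
    s = 0.0  # EMA of squared deviation (g - m)^2: the "belief" term
    for t in range(1, steps + 1):
        g = grad(theta)
        m = beta1 * m + (1 - beta1) * g
        s = beta2 * s + (1 - beta2) * (g - m) ** 2
        # Bias-corrected estimates, exactly as in Adam.
        m_hat = m / (1 - beta1 ** t)
        s_hat = s / (1 - beta2 ** t)
        # Small deviation (high belief) -> large step; large deviation -> small step.
        theta -= lr * m_hat / (math.sqrt(s_hat) + eps)
    return theta

# Minimize f(x) = (x - 3)^2, whose gradient is 2(x - 3).
x = adabelief_minimize(lambda t: 2 * (t - 3), theta=0.0)
```

Swapping the denominator from the EMA of g² (Adam) to the EMA of (g - m)² is the entire change: where gradients are consistent the denominator shrinks and steps grow, which is the SGD-like behavior the paper credits for better generalization.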

Journal ArticleDOI
TL;DR: A case-based approach is employed—fuzzy-set Qualitative Comparative Analysis (fsQCA)—to identify configurations of antecedent attributes of individuals in groups within samples, thereby revealing asymmetries and multiple entrepreneurial pathways that are otherwise hidden in the data.

Journal ArticleDOI
20 Nov 2020-Science
TL;DR: Cross-species analysis identified evolutionarily conserved and species-specific gene regulatory networks that control the transitions among quiescent, reactive, and proliferative Müller glia after stimulation in mice and zebrafish, and validated the functions of candidate factors controlling Müller glia reprogramming.
Abstract: Injury induces retinal Muller glia of certain cold-blooded vertebrates, but not those of mammals, to regenerate neurons. To identify gene regulatory networks that reprogram Muller glia into progenitor cells, we profiled changes in gene expression and chromatin accessibility in Muller glia from zebrafish, chick, and mice in response to different stimuli. We identified evolutionarily conserved and species-specific gene networks controlling glial quiescence, reactivity, and neurogenesis. In zebrafish and chick, the transition from quiescence to reactivity is essential for retinal regeneration, whereas in mice, a dedicated network suppresses neurogenic competence and restores quiescence. Disruption of nuclear factor I transcription factors, which maintain and restore quiescence, induces Muller glia to proliferate and generate neurons in adult mice after injury. These findings may aid in designing therapies to restore retinal neurons lost to degenerative diseases.

Journal ArticleDOI
01 Jan 2020
TL;DR: In this article, the authors present a risk assessment and management framework tailored to SARS-CoV-2 transmission via wastewater, including new tools for environmental surveillance, ensuring adequate disinfection as a component of overall COVID-19 pandemic containment.
Abstract: The COVID-19 pandemic has severely impacted public health and the worldwide economy. Converging evidence from the current pandemic, previous outbreaks and controlled experiments indicates that SARS-CoVs are present in wastewater for several days, leading to potential health risks via waterborne and aerosolized wastewater pathways. Conventional wastewater treatment provides only partial removal of SARS-CoVs, thus safe disposal or reuse will depend on the efficacy of final disinfection. This underscores the need for a risk assessment and management framework tailored to SARS-CoV-2 transmission via wastewater, including new tools for environmental surveillance, ensuring adequate disinfection as a component of overall COVID-19 pandemic containment. Converging evidence indicates that SARS-CoVs are present in wastewater for several days with potential health risks. This Review analyses knowledge about such risks as well as the potential spread of SARS-CoVs in waterborne, waterborne–aerosolized and waterborne–foodborne pathways during a pandemic.

Posted Content
TL;DR: This work shows that neural edge predictors can effectively encode class-homophilic structure to promote intra-class edges and demote inter-class edges in a given graph structure, and introduces the GAug graph data augmentation framework, which leverages these insights to improve performance in GNN-based node classification via edge prediction.
Abstract: Data augmentation has been widely used to improve generalizability of machine learning models. However, comparatively little work studies data augmentation for graphs. This is largely due to the complex, non-Euclidean structure of graphs, which limits possible manipulation operations. Augmentation operations commonly used in vision and language have no analogs for graphs. Our work studies graph data augmentation for graph neural networks (GNNs) in the context of improving semi-supervised node-classification. We discuss practical and theoretical motivations, considerations and strategies for graph data augmentation. Our work shows that neural edge predictors can effectively encode class-homophilic structure to promote intra-class edges and demote inter-class edges in given graph structure, and our main contribution introduces the GAug graph data augmentation framework, which leverages these insights to improve performance in GNN-based node classification via edge prediction. Extensive experiments on multiple benchmarks show that augmentation via GAug improves performance across GNN architectures and datasets.
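The score-then-modify idea behind edge-prediction augmentation can be sketched on a toy graph. GAug uses a neural edge predictor; the common-neighbor count below is an illustrative stand-in for that predictor, and the graph and threshold are assumptions of this sketch, not the paper's setup.

```python
# Sketch of edge-prediction-based graph augmentation in the spirit of GAug:
# score candidate edges, then add high-scoring missing edges (likely
# intra-class under homophily) before training a GNN on the modified graph.
from itertools import combinations

def common_neighbor_score(adj, u, v):
    # Stand-in edge predictor: number of shared neighbors.
    return len(adj[u] & adj[v])

def augment_edges(adj, threshold=2):
    """Add every missing edge whose predictor score meets the threshold."""
    added = []
    for u, v in combinations(sorted(adj), 2):
        if v not in adj[u] and common_neighbor_score(adj, u, v) >= threshold:
            added.append((u, v))
    for u, v in added:          # mutate only after scoring everything
        adj[u].add(v)
        adj[v].add(u)
    return added

# Two tight clusters {0,1,2,3} and {4,5,6,7}; one edge missing in each.
adj = {
    0: {1, 2, 3}, 1: {0, 2, 3}, 2: {0, 1}, 3: {0, 1},
    4: {5, 6, 7}, 5: {4, 6, 7}, 6: {4, 5}, 7: {4, 5},
}
new_edges = augment_edges(adj)  # only the intra-cluster gaps are filled
```

Because cross-cluster pairs share no neighbors, only the missing intra-cluster edges (2, 3) and (6, 7) clear the threshold, which is the "promote intra-class, demote inter-class" behavior the TL;DR describes.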

Journal ArticleDOI
B. Abi1, R. Acciarri2, M. A. Acero3, George Adamov4  +966 moreInstitutions (155)
TL;DR: The Deep Underground Neutrino Experiment (DUNE) as discussed by the authors is an international world-class experiment dedicated to addressing these questions as it searches for leptonic charge-parity symmetry violation, stands ready to capture supernova neutrino bursts, and seeks to observe nucleon decay as a signature of a grand unified theory underlying the standard model.
Abstract: The preponderance of matter over antimatter in the early universe, the dynamics of the supernovae that produced the heavy elements necessary for life, and whether protons eventually decay—these mysteries at the forefront of particle physics and astrophysics are key to understanding the early evolution of our universe, its current state, and its eventual fate. The Deep Underground Neutrino Experiment (DUNE) is an international world-class experiment dedicated to addressing these questions as it searches for leptonic charge-parity symmetry violation, stands ready to capture supernova neutrino bursts, and seeks to observe nucleon decay as a signature of a grand unified theory underlying the standard model. The DUNE far detector technical design report (TDR) describes the DUNE physics program and the technical designs of the single- and dual-phase DUNE liquid argon TPC far detector modules. This TDR is intended to justify the technical choices for the far detector that flow down from the high-level physics goals through requirements at all levels of the Project. Volume I contains an executive summary that introduces the DUNE science program, the far detector and the strategy for its modular designs, and the organization and management of the Project. The remainder of Volume I provides more detail on the science program that drives the choice of detector technologies and on the technologies themselves. It also introduces the designs for the DUNE near detector and the DUNE computing model, for which DUNE is planning design reports. Volume II of this TDR describes DUNE's physics program in detail. Volume III describes the technical coordination required for the far detector design, construction, installation, and integration, and its organizational structure. Volume IV describes the single-phase far detector technology. A planned Volume V will describe the dual-phase technology.

Journal ArticleDOI
TL;DR: In this article, the authors review solar-thermal water evaporation (SWE), which has received much interest in recent years owing to a few seminal works on materials innovation and thermal management.
Abstract: Solar–thermal water evaporation (SWE) has received much interest in recent years due to a few seminal works on materials innovation and thermal management. With many studies proposing applications ...

Journal ArticleDOI
TL;DR: A survey illuminates the alternatives for sequencing proteins with the brightest prospects for displacing mass spectrometry; these alternatives promise to be scalable and appear adaptable to bioinformatics tools for calling the sequence of amino acids that constitute a protein.
Abstract: Proteins can be the root cause of a disease, and they can be used to cure it. The need to identify these critical actors was recognized early (1951) by Sanger; the first biopolymer sequenced was a peptide, insulin. With the advent of scalable, single-molecule DNA sequencing, genomics and transcriptomics have since propelled medicine through improved sensitivity and lower costs, but proteomics has lagged behind. Currently, proteomics relies mainly on mass spectrometry (MS), but instead of truly sequencing, it classifies a protein and typically requires about a billion copies of a protein to do it. Here, we offer a survey that illuminates a few alternatives with the brightest prospects for identifying whole proteins and displacing MS for sequencing them. These alternatives all boast sensitivity superior to MS and promise to be scalable and seem to be adaptable to bioinformatics tools for calling the sequence of amino acids that constitute a protein.

Journal ArticleDOI
TL;DR: Surveillance of wastewater from large transport vessels with their own sanitation systems has potential as a complementary data source to prioritize clinical testing and contact tracing among disembarking passengers and must be further optimized to maximize detection sensitivity.
Abstract: BACKGROUND: Wastewater-based epidemiology (WBE) for severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) can be an important source of information for coronavirus disease 2019 (COVID-19) management during and after the pandemic. Currently, governments and transportation industries around the world are developing strategies to minimize SARS-CoV-2 transmission associated with resuming activity. This study investigated the possible use of SARS-CoV-2 RNA wastewater surveillance from airline and cruise ship sanitation systems and its potential use as a COVID-19 public health management tool. METHODS: Aircraft and cruise ship wastewater samples (n = 21) were tested for SARS-CoV-2 using two virus concentration methods, adsorption-extraction by electronegative membrane (n = 13) and ultrafiltration by Amicon (n = 8), and five assays using reverse-transcription quantitative polymerase chain reaction (RT-qPCR) and RT-droplet digital PCR (RT-ddPCR). Representative qPCR amplicons from positive samples were sequenced to confirm assay specificity. RESULTS: SARS-CoV-2 RNA was detected in samples from both aircraft and cruise ship wastewater; however concentrations were near the assay limit of detection. The analysis of multiple replicate samples and use of multiple RT-qPCR and/or RT-ddPCR assays increased detection sensitivity and minimized false-negative results. Representative qPCR amplicons were confirmed for the correct PCR product by sequencing. However, differences in sensitivity were observed among molecular assays and concentration methods. CONCLUSIONS: The study indicates that surveillance of wastewater from large transport vessels with their own sanitation systems has potential as a complementary data source to prioritize clinical testing and contact tracing among disembarking passengers. Importantly, sampling methods and molecular assays must be further optimized to maximize detection sensitivity. 
The potential for false negatives by both wastewater testing and clinical swab testing suggests that the two strategies could be employed together to maximize the probability of detecting SARS-CoV-2 infections amongst passengers.
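The sensitivity gain from analyzing multiple replicate samples, noted in the conclusions, can be quantified under a simple independence assumption. The per-replicate detection probability below is a hypothetical value for illustration, not a figure reported in the study.

```python
# Near the assay limit of detection, suppose a single replicate detects
# SARS-CoV-2 RNA with probability p. If replicates are independent, n of
# them yield at least one detection with probability 1 - (1 - p)^n.

def detection_probability(p, n):
    """Probability of at least one positive among n independent replicates."""
    return 1.0 - (1.0 - p) ** n

single = detection_probability(0.3, 1)      # ~0.30 (assumed p)
triplicate = detection_probability(0.3, 3)  # 1 - 0.7**3, ~0.66
```

Even with a modest per-replicate hit rate, triplicate analysis roughly doubles the chance of at least one detection, which is why the study pairs replicate sampling with multiple RT-qPCR/RT-ddPCR assays to minimize false negatives.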

Journal ArticleDOI
TL;DR: It is suggested that vector-borne, generalist wildlife and zoonotic pathogens are the types of parasites most likely to be affected by changes to biodiversity, and biodiversity conservation and management need to be considered alongside other disease management options.
Abstract: The disease ecology community has struggled to come to consensus on whether biodiversity reduces or increases infectious disease risk, a question that directly affects policy decisions for biodiversity conservation and public health. Here, we summarize the primary points of contention regarding biodiversity-disease relationships and suggest that vector-borne, generalist wildlife and zoonotic pathogens are the types of parasites most likely to be affected by changes to biodiversity. One synthesis on this topic revealed a positive correlation between biodiversity and human disease burden across countries, but as biodiversity changed over time within these countries, this correlation became weaker and more variable. Another synthesis-a meta-analysis of generally smaller-scale experimental and field studies-revealed a negative correlation between biodiversity and infectious diseases (a dilution effect) in various host taxa. These results raise the question of whether biodiversity-disease relationships are more negative at smaller spatial scales. If so, biodiversity conservation at the appropriate scales might prevent wildlife and zoonotic diseases from increasing in prevalence or becoming problematic (general proactive approaches). Further, protecting natural areas from human incursion should reduce zoonotic disease spillover. By contrast, for some infectious diseases, managing particular species or habitats and targeted biomedical approaches (targeted reactive approaches) might outperform biodiversity conservation as a tool for disease control. Importantly, biodiversity conservation and management need to be considered alongside other disease management options. These suggested guiding principles should provide common ground that can enhance scientific and policy clarity for those interested in simultaneously improving wildlife and human health.

Journal ArticleDOI
E. Kou, Phillip Urquijo1, Wolfgang Altmannshofer2, F. Beaujean3  +558 moreInstitutions (137)
TL;DR: In the original version of this manuscript, an error was introduced on p. 352: '2.7nb:1.6nb' has been corrected to '2.4nb:1.3nb' in the current online and printed version.
Abstract: In the original version of this manuscript, an error was introduced on pp352. '2.7nb:1.6nb' has been corrected to '2.4nb:1.3nb' in the current online and printed version. doi:10.1093/ptep/ptz106.