
Showing papers by "University of Guelph" published in 2021


Journal ArticleDOI
TL;DR: In this article, the authors present a set of guidelines for investigators to select and interpret methods to examine autophagy and related processes, and for reviewers to provide realistic and reasonable critiques of reports that are focused on these processes.
Abstract: In 2008, we published the first set of guidelines for standardizing research in autophagy. Since then, this topic has received increasing attention, and many scientists have entered the field. Our knowledge base and relevant new technologies have also been expanding. Thus, it is important to formulate on a regular basis updated guidelines for monitoring autophagy in different organisms. Despite numerous reviews, there continues to be confusion regarding acceptable methods to evaluate autophagy, especially in multicellular eukaryotes. Here, we present a set of guidelines for investigators to select and interpret methods to examine autophagy and related processes, and for reviewers to provide realistic and reasonable critiques of reports that are focused on these processes. These guidelines are not meant to be a dogmatic set of rules, because the appropriateness of any assay largely depends on the question being asked and the system being used. Moreover, no individual assay is perfect for every situation, calling for the use of multiple techniques to properly monitor autophagy in each experimental setting. Finally, several core components of the autophagy machinery have been implicated in distinct autophagic processes (canonical and noncanonical autophagy), implying that genetic approaches to block autophagy should rely on targeting two or more autophagy-related genes that ideally participate in distinct steps of the pathway. Along similar lines, because multiple proteins involved in autophagy also regulate other cellular pathways including apoptosis, not all of them can be used as a specific marker for bona fide autophagic responses. Here, we critically discuss current methods of assessing autophagy and the information they can, or cannot, provide. Our ultimate goal is to encourage intellectual and technical innovation in the field.

1,129 citations


Journal ArticleDOI
TL;DR: This paper aims to provide a comprehensive study concerning FL’s security and privacy aspects that can help bridge the gap between the current state of federated AI and a future in which mass adoption is possible.

565 citations


Journal ArticleDOI
TL;DR: In this article, the authors provide an in-depth study on the oxygen/water vapor barrier of representative biodegradable polymers in mainstream research with an emphasis on theoretical models and experimental modifications to improve their barrier properties.

212 citations


Journal ArticleDOI
TL;DR: To facilitate proper disposal of PPE debris by the public, the development of municipal efforts to improve PPE collection methods, informed by the described PPE waste pathways, is recommended.

198 citations


Journal ArticleDOI
TL;DR: The experimental results demonstrate that the federated-learning (FL)-based anomaly detection approach outperforms the classic/centralized machine learning (non-FL) versions in securing the privacy of user data and provides an optimal accuracy rate in attack detection.
Abstract: The Internet of Things (IoT) is made up of billions of physical devices connected to the Internet via networks that perform tasks independently with little human intervention. Such automation of mundane tasks requires a considerable amount of user data in digital format, which in turn makes IoT networks an open source of personally identifiable information for malicious attackers to steal, manipulate, and exploit. Considerable interest has developed over the past few years in applying machine learning (ML)-assisted approaches in the IoT security space. However, many current works assume that big training data are widely available and transferable to a main server, because data are born at the edge and are generated continuously by IoT devices. In other words, classic ML operates on the entire dataset gathered on a central server, which makes it the least preferred option for domains with privacy concerns over user data. To address this issue, we propose a federated learning (FL)-based anomaly detection approach to proactively recognize intrusion in IoT networks using decentralized on-device data. Our approach uses federated training rounds on Gated Recurrent Unit (GRU) models and keeps the data intact on local IoT devices by sharing only the learned weights with the central FL server. The ensembler component of the approach aggregates the updates from multiple sources to optimize the accuracy of the global ML model. Our experimental results demonstrate that our approach outperforms classic/centralized machine learning (non-FL) versions in securing the privacy of user data and provides an optimal accuracy rate in attack detection.
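As a rough illustration of the federated round described above, the sketch below trains a small GRU-based detector locally on each device and averages only the shared weights on the server (FedAvg-style). The model shape, hyperparameters, and equal client weighting are illustrative assumptions, not the authors' exact configuration.

```python
# Hypothetical sketch: local GRU training on device data, weight-only sharing,
# and simple server-side averaging. Not the paper's implementation.
import copy
import torch
import torch.nn as nn

class GRUDetector(nn.Module):
    def __init__(self, n_features=10, hidden=32):
        super().__init__()
        self.gru = nn.GRU(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)            # anomaly score per sequence

    def forward(self, x):                           # x: (batch, time, features)
        _, h = self.gru(x)
        return torch.sigmoid(self.head(h[-1]))

def local_update(model, data, labels, epochs=1, lr=1e-3):
    """Train a copy of the global model on one device's local traffic windows."""
    local = copy.deepcopy(model)
    opt = torch.optim.Adam(local.parameters(), lr=lr)
    loss_fn = nn.BCELoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(local(data).squeeze(-1), labels)
        loss.backward()
        opt.step()
    return local.state_dict()                       # only weights leave the device

def fed_avg(weight_dicts):
    """Server-side aggregation: element-wise mean of the clients' weights."""
    avg = copy.deepcopy(weight_dicts[0])
    for key in avg:
        avg[key] = torch.stack([w[key] for w in weight_dicts]).mean(dim=0)
    return avg

# One federated round with three simulated devices and random traffic windows.
global_model = GRUDetector()
clients = [(torch.randn(16, 20, 10), torch.randint(0, 2, (16,)).float()) for _ in range(3)]
updates = [local_update(global_model, x, y) for x, y in clients]
global_model.load_state_dict(fed_avg(updates))
print(global_model(torch.randn(1, 20, 10)).item())  # anomaly score for one new window
```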

189 citations


Journal ArticleDOI
01 Jun 2021
TL;DR: The paper builds upon a taxonomy that uses the three-layer IoT architecture as a reference to identify security properties and requirements for each layer, classifying potential IoT security threats and challenges from an architectural view.
Abstract: The Internet of Things (IoT) is one of the most promising technologies aimed at enhancing humans’ quality of life (QoL). IoT plays a significant role in several fields such as healthcare, the automotive industry, agriculture, education, and many cross-cutting business applications. Addressing and analyzing IoT security issues is crucial because the working mechanisms of IoT applications vary due to the heterogeneous nature of IoT environments. Therefore, discussing IoT security concerns, together with available and potential solutions, would help developers and enterprises find appropriate and timely solutions to tackle specific threats and provide the best possible IoT-based services. This paper provides a comprehensive study of IoT security issues, limitations, requirements, and current and potential solutions. The paper builds upon a taxonomy that uses the three-layer IoT architecture as a reference to identify security properties and requirements for each layer. The main contribution of this survey is the classification of potential IoT security threats and challenges from an architectural view. From there, IoT security challenges and solutions are further grouped by the layered architecture so that readers gain a better understanding of how to address threats on each layer and adopt best practices to avoid them.

187 citations


Journal ArticleDOI
18 Mar 2021-Viruses
TL;DR: In this paper, the state of the art of phage taxonomy is discussed and a roadmap for the future is provided, including the abolition of the order Caudovirales and the families Myoviridae, Podoviridae, and Siphoviridae.
Abstract: Bacteriophage (phage) taxonomy has been in flux since its inception over four decades ago. Genome sequencing has put pressure on the classification system and recent years have seen significant changes to phage taxonomy. Here, we reflect on the state of phage taxonomy and provide a roadmap for the future, including the abolition of the order Caudovirales and the families Myoviridae, Podoviridae, and Siphoviridae. Furthermore, we specify guidelines for the demarcation of species, genus, subfamily and family-level ranks of tailed phage taxonomy.

187 citations


Journal ArticleDOI
TL;DR: This paper proposes the first certificateless public verification scheme against procrastinating auditors (CPVPA) using blockchain technology, presents rigorous security proofs to demonstrate the security of CPVPA, and conducts a comprehensive performance evaluation to show that CPVPA is efficient.
Abstract: The deployment of cloud storage services has significant benefits in managing data for users. However, it also causes many security concerns, one of which is data integrity. Public verification techniques enable a user to employ a third-party auditor to verify the data integrity on her/his behalf, whereas existing public verification schemes are vulnerable to procrastinating auditors who may not perform verifications on time. Furthermore, most public verification schemes are constructed on the public key infrastructure (PKI) and thereby suffer from the certificate management problem. In this paper, we propose a certificateless public verification scheme against procrastinating auditors (CPVPA) using blockchain technology. The key idea is to require auditors to record each verification result in a transaction on a blockchain. Because transactions on the blockchain are time-sensitive, the verification can be time-stamped after the transaction is recorded on the blockchain, which enables users to check whether auditors performed the verifications at the prescribed time. Moreover, CPVPA is built on certificateless cryptography and is free from the certificate management problem. We present rigorous security proofs to demonstrate the security of CPVPA, and conduct a comprehensive performance evaluation to show that CPVPA is efficient.
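A minimal sketch of the timing check that this design enables: because every audit result lands in a blockchain transaction, a user can compare each transaction's block timestamp against the prescribed audit schedule. The record layout, schedule format, and tolerance below are assumptions for illustration, not the paper's protocol.

```python
# Illustrative timeliness check over on-chain audit records (data layout assumed).
from dataclasses import dataclass

@dataclass
class AuditRecord:
    challenge_id: str      # which scheduled verification this transaction answers
    block_timestamp: int   # Unix time at which the transaction was sealed on-chain
    result_ok: bool        # auditor-reported integrity verdict

def check_auditor_timeliness(schedule, records, tolerance_s=3600):
    """schedule: {challenge_id: prescribed_unix_time}; returns late or missing audits."""
    by_id = {r.challenge_id: r for r in records}
    problems = {}
    for cid, due in schedule.items():
        rec = by_id.get(cid)
        if rec is None:
            problems[cid] = "missing"                        # auditor never verified
        elif rec.block_timestamp > due + tolerance_s:
            problems[cid] = f"late by {rec.block_timestamp - due - tolerance_s} s"
    return problems

schedule = {"epoch-1": 1_700_000_000, "epoch-2": 1_700_086_400}
records = [AuditRecord("epoch-1", 1_700_000_900, True)]       # epoch-2 was skipped
print(check_auditor_timeliness(schedule, records))            # {'epoch-2': 'missing'}
```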

177 citations


Journal ArticleDOI
TL;DR: In this paper, the feasibility of both polylactic acid (PLA) and polyhydroxyalkanoates (PHAs) as alternative materials that can replace petroleum-based polymers in a wide range of industrial applications is discussed.
Abstract: In spite of the fact that petroleum-based plastics are convenient in terms of fulfilling the performance requirements of many applications, they contribute significantly to a number of ecological and environmental problems. Recently, public awareness of the negative effects of petroleum-based plastics on the environment has increased. The present utilization of natural resources cannot be sustained forever. Furthermore, oil is often subject to price fluctuations and will eventually be depleted. The increase in the level of carbon dioxide due to the combustion of fossil fuels is causing global warming. Concerns about the preservation of natural resources and climate change are worldwide motivations for academic and industrial researchers to reduce the consumption of, and dependence on, fossil fuels. Therefore, bio-based polymers are becoming the favorable option for use in polymer manufacturing, food packaging, and medical applications. This paper presents an overview of the feasibility of both polylactic acid (PLA) and polyhydroxyalkanoates (PHAs) as alternative materials that can replace petroleum-based polymers in a wide range of industrial applications. The physical, thermal, rheological, and mechanical properties of both polymers, as well as their permeability and migration properties, are reviewed. Moreover, PLA's recyclability, sustainability, and environmental assessment are also discussed. Finally, applications in which both polymers can replace petroleum-based plastics are explored.

153 citations


Journal ArticleDOI
TL;DR: This review article surveys state-of-the-art nanomaterials-based electrochemical sensors and biosensors for the detection and quantification of six classes of significant pharmaceutical compounds, including anti-inflammatory, anti-depressant, anti-bacterial, anti-viral, anti-fungal, and anti-cancer drugs.

146 citations


Journal ArticleDOI
Lawrence Berkeley National Laboratory1, National University of Singapore2, Stanford University3, University of Wisconsin-Madison4, National Ecological Observatory Network5, Oak Ridge National Laboratory6, McMaster University7, University of Nebraska–Lincoln8, University of California, Berkeley9, Agricultural Research Service10, University of British Columbia11, University of Colorado Boulder12, Ohio State University13, University of Florida14, University of Guelph15, University of Kansas16, Michigan State University17, Pacific Northwest National Laboratory18, United States Department of Agriculture19, University of New Mexico20, National Research Council21, Marine Biological Laboratory22, University of Alberta23, Virginia Commonwealth University24, University of Minnesota25, Dalhousie University26, Université de Montréal27, Carleton University28, Shinshu University29, Japan Agency for Marine-Earth Science and Technology30, Northern Arizona University31, Oregon State University32, Yale University33, Washington State University34, Harvard University35, Texas A&M University36, Indiana University37, Florida International University38, San Diego State University39, California State University, East Bay40, Wayne State University41, University of Sydney42, Wilfrid Laurier University43, University of Alabama44, Environment Canada45, United States Geological Survey46, Argonne National Laboratory47, Osaka Prefecture University48, University of Delaware49, University of Missouri50, University of Sheffield51
TL;DR: In this article, the authors evaluate the representativeness of flux footprints and assess potential biases arising from the footprint-to-target-area mismatch, which can serve as a guide to identify site-periods suitable for specific applications.

Journal ArticleDOI
TL;DR: In this article, the authors present a systematic and comprehensive global stocktake of implemented human adaptation to climate change and identify eight priorities for global adaptation research: assess the effectiveness of adaptation responses, enhance the understanding of limits to adaptation, enable individuals and civil society to adapt, include missing places, scholars and scholarship, understand private sector responses, improve methods for synthesizing different forms of evidence, assess the adaptation at different temperature thresholds, and improve the inclusion of timescale and the dynamics of responses.
Abstract: Assessing global progress on human adaptation to climate change is an urgent priority. Although the literature on adaptation to climate change is rapidly expanding, little is known about the actual extent of implementation. We systematically screened >48,000 articles using machine learning methods and a global network of 126 researchers. Our synthesis of the resulting 1,682 articles presents a systematic and comprehensive global stocktake of implemented human adaptation to climate change. Documented adaptations were largely fragmented, local and incremental, with limited evidence of transformational adaptation and negligible evidence of risk reduction outcomes. We identify eight priorities for global adaptation research: assess the effectiveness of adaptation responses, enhance the understanding of limits to adaptation, enable individuals and civil society to adapt, include missing places, scholars and scholarship, understand private sector responses, improve methods for synthesizing different forms of evidence, assess the adaptation at different temperature thresholds, and improve the inclusion of timescale and the dynamics of responses. Determining progress in adaptation to climate change is challenging, yet critical as climate change impacts increase. A stocktake of the scientific literature on implemented adaptation now shows that adaptation is mostly fragmented and incremental, with evidence lacking for its impact on reducing risk.

Journal ArticleDOI
TL;DR: The authors identified public sentiments and opinions toward the COVID-19 vaccines based on the content of Twitter and found a slight difference in the prevalence of positive and negative sentiments, with positive being the dominant polarity and having higher engagements.

Journal ArticleDOI
TL;DR: This work introduces a secure, low-latency authentication model for drones in smart cities that leverages blockchain technology and uses a customized decentralized consensus, known as drone-based delegated proof of stake (DDPOS), among zones in a smart city so that drones do not require reauthentication when moving between zones.
Abstract: Drones and drone technology are now in widespread use due to their growing range of applications in the military, safety surveillance, agriculture, smart transportation, shipping, and package delivery in our Internet-of-Things global landscape. However, there are security-specific challenges with the authentication of drones while airborne. Current authentication approaches in most drone-based applications suffer from real-time latency issues and security vulnerabilities that leave them open to attack. To address such issues, we introduce a secure, low-latency authentication model for drones in smart cities that leverages blockchain technology. We apply a zone-based architecture in a network of drones and use a customized decentralized consensus, known as drone-based delegated proof of stake (DDPOS), among zones in a smart city, so that drones moving between zones do not require reauthentication. The proposed architecture aims for positive impacts on increased security and reduced latency on the Internet of Drones (IoD). Moreover, we provide an empirical analysis of the proposed architecture compared to other peer models previously proposed for IoD to demonstrate its performance and security authentication capability. The experimental results clearly show that not only does the proposed architecture achieve a low packet loss rate, high throughput, and low end-to-end delay in comparison to peer models, but it can also detect 97.5% of attacks by malicious drones while airborne.

Journal ArticleDOI
TL;DR: In this paper, the authors developed a coupled social-epidemiological model of SARS-CoV-2 transmission in which social and epidemiological dynamics interact with one another and modelled how population adherence to non-pharmaceutical interventions responds to case incidence.
Abstract: Summary. Background: During the COVID-19 pandemic, authorities must decide which groups to prioritise for vaccination in a shifting social-epidemiological landscape in which the success of large-scale non-pharmaceutical interventions requires broad social acceptance. We aimed to compare projected COVID-19 mortality under four different strategies for the prioritisation of SARS-CoV-2 vaccines. Methods: We developed a coupled social-epidemiological model of SARS-CoV-2 transmission in which social and epidemiological dynamics interact with one another. We modelled how population adherence to non-pharmaceutical interventions responds to case incidence. In the model, schools and workplaces are also closed and reopened on the basis of reported cases. The model was parameterised with data on COVID-19 cases and mortality, SARS-CoV-2 seroprevalence, population mobility, and demography from Ontario, Canada (population 14.5 million). Disease progression parameters came from the SARS-CoV-2 epidemiological literature. We assumed a vaccine with 75% efficacy against disease and transmissibility. We compared vaccinating those aged 60 years and older first (oldest-first strategy), vaccinating those younger than 20 years first (youngest-first strategy), vaccinating uniformly by age (uniform strategy), and a novel contact-based strategy. The latter three strategies interrupt transmission, whereas the first targets a vulnerable group to reduce disease. Vaccination rates ranged from 0.5% to 5% of the population per week, beginning on either Jan 1 or Sept 1, 2021. Findings: Case notifications, non-pharmaceutical intervention adherence, and lockdown undergo successive waves that interact with the timing of the vaccine programme to determine the relative effectiveness of the four strategies. Transmission-interrupting strategies become relatively more effective with time as herd immunity builds. The model predicts that, in the absence of vaccination, 72,000 deaths (95% credible interval 40,000-122,000) would occur in Ontario from Jan 1, 2021, to March 14, 2025, and that at a vaccination rate of 1.5% of the population per week the oldest-first strategy would reduce COVID-19 mortality by 90.8% on average (followed by 89.5% in the uniform, 88.9% in the contact-based, and 88.2% in the youngest-first strategies). In the absence of vaccination, 60,000 deaths (31,000-108,000) would occur from Sept 1, 2021, to March 14, 2025, and the contact-based strategy would reduce COVID-19 mortality by 92.6% on average (followed by 92.1% in the uniform, 91.0% in the oldest-first, and 88.3% in the youngest-first strategies) at a vaccination rate of 1.5% of the population per week. Interpretation: The most effective vaccination strategy for reducing mortality due to COVID-19 depends on the time course of the pandemic in the population. For later vaccination start dates, use of SARS-CoV-2 vaccines to interrupt transmission might prevent more deaths than prioritising vulnerable age groups. Funding: Ontario Ministry of Colleges and Universities.
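A toy sketch of the coupled social-epidemiological idea described above: an SIR-type epidemic in which adherence to non-pharmaceutical interventions rises with case incidence and in turn damps transmission. The functional forms and every parameter value are invented for illustration and bear no relation to the paper's calibrated Ontario model.

```python
# Toy coupled dynamics: incidence drives NPI adherence, adherence lowers transmission.
import numpy as np

def simulate(days=365, N=14_500_000, beta0=0.35, gamma=0.1,
             npi_effect=0.6, resp=2e-5, relax=0.01, dt=0.25):
    S, I, R = N - 100.0, 100.0, 0.0
    x = 0.05                                    # fraction of population adhering to NPIs
    history = []
    for step in range(int(days / dt)):
        beta = beta0 * (1.0 - npi_effect * x)   # adherence suppresses transmission
        new_inf = beta * S * I / N              # incidence also drives adherence below
        dS, dI, dR = -new_inf, new_inf - gamma * I, gamma * I
        dx = resp * new_inf * x * (1.0 - x) - relax * x
        S, I, R = S + dt * dS, I + dt * dI, R + dt * dR
        x = float(np.clip(x + dt * dx, 0.0, 1.0))
        history.append((step * dt, I, x))
    return history

traj = simulate()
day, peak_I, adherence = max(traj, key=lambda t: t[1])
print(f"toy epidemic peaks near day {day:.0f} with ~{peak_I:,.0f} infectious (adherence {adherence:.2f})")
```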

Journal ArticleDOI
01 Jun 2021
TL;DR: In this paper, a bibliometric survey of research papers focused on the security aspects of Internet of Things (IoT) aided smart grids is presented, which is the very first survey paper in this specific field.
Abstract: The integration of sensors and communication technology in power systems, known as the smart grid, is an emerging topic in science and technology. One of the critical issues in the smart grid is its increased vulnerability to cyber-threats. As such, various types of threats and defense mechanisms have been proposed in the literature. This paper offers a bibliometric survey of research papers focused on the security aspects of Internet of Things (IoT)-aided smart grids. To the best of the authors’ knowledge, this is the very first bibliometric survey paper in this specific field. A bibliometric analysis of all journal articles is performed and the findings are sorted by date, authorship, and key concepts. Furthermore, this paper also summarizes the types of cyber-threats facing the smart grid, the various security mechanisms proposed in the literature, and the research gaps in the field of smart grid security.
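As a trivial illustration of the kind of tallying such a bibliometric analysis involves, the sketch below counts publications by year, author, and keyword from a list of bibliographic records. The record fields and sample entries are made up for demonstration.

```python
# Hypothetical bibliometric tallies over a small set of invented records.
from collections import Counter

records = [
    {"year": 2019, "authors": ["A. Author", "B. Author"], "keywords": ["smart grid", "IoT", "intrusion detection"]},
    {"year": 2020, "authors": ["B. Author"], "keywords": ["smart grid", "false data injection"]},
    {"year": 2020, "authors": ["C. Author"], "keywords": ["IoT", "authentication"]},
]

per_year = Counter(r["year"] for r in records)
per_author = Counter(a for r in records for a in r["authors"])
per_keyword = Counter(k for r in records for k in r["keywords"])

print(per_year.most_common())       # publication counts by date
print(per_author.most_common(3))    # most prolific authors
print(per_keyword.most_common(3))   # dominant key concepts
```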

Journal ArticleDOI
TL;DR: In this paper, the authors used a combination of phenology, climate, and geography data to predict site-based rice yields using a traditional regression-based method (multiple linear regression, MLR) and three more advanced machine learning (ML) methods: backpropagation neural network (BP), support vector machine (SVM), and random forest (RF).
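In the spirit of the comparison described above, the sketch below cross-validates a multiple linear regression against neural-network, SVM, and random-forest regressors on synthetic stand-ins for phenology, climate, and geography features. The data, model settings, and scoring are assumptions for demonstration, not the study's setup.

```python
# Illustrative MLR vs. BP/SVM/RF comparison on synthetic site-level predictors.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 8))                      # stand-ins for phenology/climate/geography
y = 6.0 + X[:, 0] - 0.5 * X[:, 1] ** 2 + 0.3 * X[:, 2] * X[:, 3] + 0.3 * rng.normal(size=300)

models = {
    "MLR": LinearRegression(),
    "BP":  make_pipeline(StandardScaler(), MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)),
    "SVM": make_pipeline(StandardScaler(), SVR(C=10.0)),
    "RF":  RandomForestRegressor(n_estimators=300, random_state=0),
}
for name, model in models.items():
    r2 = cross_val_score(model, X, y, cv=5, scoring="r2")
    print(f"{name}: mean R^2 = {r2.mean():.2f}")   # higher is better on this toy data
```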

Journal ArticleDOI
TL;DR: In this article, the authors combine climate modelling and data-driven approaches to provide global multi-model projections of urban climates over the twenty-first century, with high inter-model confidence.
Abstract: Effective urban planning for climate-driven risks relies on robust climate projections specific to built landscapes. Such projections are absent because of a near-universal lack of urban representation in global-scale Earth system models. Here, we combine climate modelling and data-driven approaches to provide global multi-model projections of urban climates over the twenty-first century. The results demonstrate the inter-model robustness of specific levels of urban warming over certain regions under climate change. Under a high-emissions scenario, cities in the United States, Middle East, northern Central Asia, northeastern China and inland South America and Africa are estimated to experience substantial warming of more than 4 K—larger than regional warming—by the end of the century, with high inter-model confidence. Our findings highlight the critical need for multi-model global projections of local urban climates for climate-sensitive development and support green infrastructure intervention as an effective means of reducing urban heat stress on large scales. An urban climate model emulator has been used with a multi-model archive to estimate that in a high-emissions scenario, many cities will warm by over 4 K during local summers. Near-global relative humidity decreases highlight the potential for green infrastructure and more efficient urban cooling mechanisms.

Proceedings ArticleDOI
21 Jan 2021
TL;DR: SSTVOS as discussed by the authors extracts per-pixel representations for each object in a video using sparse attention over spatio-temporal features, which allows a model to learn to attend over a history of multiple frames and provides suitable inductive bias for performing correspondence-like computations necessary for solving motion segmentation.
Abstract: In this paper we introduce a Transformer-based approach to video object segmentation (VOS). To address compounding error and scalability issues of prior work, we propose a scalable, end-to-end method for VOS called Sparse Spatiotemporal Transformers (SST). SST extracts per-pixel representations for each object in a video using sparse attention over spatiotemporal features. Our attention-based formulation for VOS allows a model to learn to attend over a history of multiple frames and provides suitable inductive bias for performing correspondence-like computations necessary for solving motion segmentation. We demonstrate the effectiveness of attention-based over recurrent networks in the spatiotemporal domain. Our method achieves competitive results on YouTube-VOS and DAVIS 2017 with improved scalability and robustness to occlusions compared with the state of the art. Code is available at https://github.com/dukebw/SSTVOS.
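As a loose, single-head illustration of attending over a history of frames with a sparse (local) pattern, the sketch below lets each pixel of the current frame attend to a small spatial neighbourhood in every frame of the history. This is a conceptual simplification, not the SSTVOS architecture; the shapes, radius, and use of the features as both keys and values are assumptions.

```python
# Toy local spatiotemporal attention: each query pixel attends to a (2r+1)^2
# neighbourhood in every frame of a short history. Not the paper's model.
import torch
import torch.nn.functional as F

def local_spatiotemporal_attention(feats, radius=1):
    """feats: (T, C, H, W) per-frame features; returns attended features for frame T-1."""
    T, C, H, W = feats.shape
    k = 2 * radius + 1
    q = feats[-1].reshape(C, H * W).T                          # (HW, C) queries, current frame
    patches = F.unfold(feats, kernel_size=k, padding=radius)   # (T, C*k*k, HW) neighbourhoods
    patches = patches.reshape(T, C, k * k, H * W)
    keys = patches.permute(3, 0, 2, 1).reshape(H * W, T * k * k, C)           # (HW, T*k*k, C)
    attn = torch.softmax((keys @ q.unsqueeze(-1)).squeeze(-1) / C ** 0.5, dim=-1)
    out = (attn.unsqueeze(-1) * keys).sum(dim=1)               # values = keys in this toy
    return out.T.reshape(C, H, W)

feats = torch.randn(4, 16, 8, 8)                    # 4-frame history, 16 channels, 8x8 grid
print(local_spatiotemporal_attention(feats).shape)  # torch.Size([16, 8, 8])
```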

Journal ArticleDOI
19 Feb 2021
TL;DR: In this article, the authors address the key factors limiting successful field applications of biofertilizers, suggest potential solutions based on emerging strategies for product development, discuss the importance of biosafety guidelines, and propose new avenues of research for biofertilizer development.
Abstract: Global population growth poses a threat to food security in an era of increased ecosystem degradation, climate change, soil erosion and biodiversity loss. In this context, harnessing naturally occurring processes such as those provided by soil and plant-associated microorganisms presents a promising strategy to reduce dependency on agrochemicals. Biofertilizers are living microbes that enhance plant nutrition by either mobilizing or increasing nutrient availability in soils. Various microbial taxa, including beneficial bacteria and fungi, are currently used as biofertilizers, as they successfully colonize the rhizosphere, rhizoplane or root interior. Despite their great potential to improve soil fertility, biofertilizers have yet to replace conventional chemical fertilizers in commercial agriculture. In the last ten years, multi-omics studies have made a significant step forward in understanding the drivers, roles, processes and mechanisms in the plant microbiome. However, translating this knowledge on microbiome functions in order to capitalize on plant nutrition in agroecosystems remains a challenge. Here, we address the key factors limiting successful field applications of biofertilizers and suggest potential solutions based on emerging strategies for product development. Finally, we discuss the importance of biosafety guidelines and propose new avenues of research for biofertilizer development.

Journal ArticleDOI
01 Jan 2021
TL;DR: In this paper, the authors provide an overview of the application of different machine learning techniques to the analysis of hyperspectral images for determining food quality, noting that the field of deep learning is relatively new and needs further research for its full utilization.
Abstract: Non-destructive testing techniques have gained importance in monitoring food quality over the years. Hyperspectral imaging is one of the important non-destructive quality testing techniques, providing both spatial and spectral information. Advancements in machine learning techniques for rapid analysis with higher classification accuracy have improved the potential of using this technique for food applications. This paper provides an overview of the application of different machine learning techniques to the analysis of hyperspectral images for determining food quality. It covers the principle underlying hyperspectral imaging and the advantages and limitations of each machine learning technique. The machine learning techniques enable rapid analysis of hyperspectral images of food products with high accuracy, thereby supporting robust classification or regression models. The selection of effective wavelengths from the hyperspectral data is of paramount importance, since it greatly reduces the computational load and time and thereby enhances the scope for real-time applications. Due to its feature-learning nature, deep learning is one of the most promising and powerful techniques for real-time applications; however, the field is relatively new and needs further research for its full utilization. Similarly, lifelong machine learning paves the way for real-time HSI applications but needs further research to incorporate seasonal variations in food quality. Finally, the research gaps in machine learning techniques for hyperspectral image analysis and the future prospects are discussed.
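A hypothetical sketch of the effective-wavelength selection step discussed above: the spectral bands of synthetic hyperspectral pixels are ranked by random-forest importance and only the top few are kept before classification. The band count, labels, and ranking method are illustrative assumptions, not a procedure taken from the paper.

```python
# Illustrative "effective wavelength" selection on synthetic spectra.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n_pixels, n_bands = 2000, 200                       # spectra with 200 wavelength bands
X = rng.normal(size=(n_pixels, n_bands))
y = (X[:, 57] + 0.8 * X[:, 134] > 0).astype(int)    # quality class driven by two bands

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
ranker = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
effective = np.argsort(ranker.feature_importances_)[::-1][:10]   # top-10 "effective wavelengths"

compact = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr[:, effective], y_tr)
print("selected bands:", sorted(effective.tolist()))
print("accuracy with 10 of 200 bands:", accuracy_score(y_te, compact.predict(X_te[:, effective])))
```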

Journal ArticleDOI
TL;DR: In this paper, the authors examined the impact of the COVID-19 pandemic on the food supply chain in North America and found that the specialization in most food supply chains designed for "just-in-time" delivery to specific customers with no reserve capacity, which led to the initial disruptions, may have also been responsible for its rapid rebound.

Journal ArticleDOI
TL;DR: This paper proposes an efficient and secure decision tree classification scheme that protects the confidentiality of the decision tree classifier and the user's data, provides formal security proofs, and shows through performance analysis that the scheme achieves faster-than-linear classification speed.
Abstract: Decision tree classification has become a prevailing technique for online diagnosis services. By outsourcing computation-intensive tasks to a cloud server, cloud-assisted online diagnosis services are a better option for cases in which the storage and computation requirements exceed the capability of medical institutions. Given privacy concerns as well as intellectual property protection issues, the valuable diagnosis classifier and the sensitive user data should be protected against the cloud server. In this paper, we identify a work-flow for cloud-assisted online diagnosis services and propose an efficient and secure decision tree classification scheme within this work-flow. Specifically, the medical institution transforms a locally pre-trained decision tree classifier into a decision table, and then uses searchable symmetric encryption to encrypt the decision table. The encrypted table is outsourced to the cloud server, and a user can submit encrypted physiological features to the cloud server and obtain an encrypted diagnosis prediction in return. We provide formal security proofs to demonstrate that our scheme protects the confidentiality of the decision tree classifier and the user's data. The performance analysis shows that our scheme achieves faster-than-linear classification speed, and experimental evaluations show that it requires only several microseconds to process a diagnosis request on the tested datasets.
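A simplified sketch of the workflow described above: a locally trained decision tree is flattened into a decision table keyed by discretised feature values, and lookups use keyed hashes of those values so the server never sees them in the clear. The HMAC-keyed table is only a stand-in for the paper's searchable symmetric encryption, and the thresholds, features, and key are illustrative assumptions.

```python
# Toy decision-table outsourcing with keyed-hash lookups (stand-in for SSE).
import hashlib
import hmac
import itertools

SECRET = b"shared-key-between-institution-and-user"   # hypothetical shared secret

def tag(values):
    """Deterministic keyed tag for a tuple of discretised physiological features."""
    msg = ",".join(map(str, values)).encode()
    return hmac.new(SECRET, msg, hashlib.sha256).hexdigest()

# Toy "decision tree": rules on (fever_level, heart_rate_level), each in {0, 1, 2}.
def tree_predict(fever, hr):
    if fever >= 2:
        return "see doctor"
    return "monitor" if hr >= 1 else "healthy"

# Medical institution: enumerate the tree into a table and outsource only tags -> labels.
# (In the real scheme the predictions would also be encrypted, not stored in the clear.)
decision_table = {tag(v): tree_predict(*v) for v in itertools.product(range(3), repeat=2)}

# User: submits the tag of their own discretised readings; the server answers by lookup.
query = tag((1, 2))
print(decision_table[query])        # 'monitor'
```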

Journal ArticleDOI
TL;DR: In this paper, the authors used a panel threshold model to examine the indirect effect of democratic institutions on renewable energy consumption and found that democratic institutions play a significant role in green energy consumption in less democratic countries.

Journal ArticleDOI
TL;DR: In this paper, the authors systematically review 146 studies from 1987 to 2017 that conduct physically based numerical modelling of urban air temperature reduction resulting from green-blue infrastructure and reflective materials, and conclude that evaluation of the base case simulation is not a sufficient prerequisite for accurate simulation of heat mitigation strategy cooling.
Abstract: Infrastructure-based heat reduction strategies can help cities adapt to high temperatures, but simulations of their cooling potential yield widely varying predictions. We systematically review 146 studies from 1987-2017 that conduct physically based numerical modelling of urban air temperature reduction resulting from green-blue infrastructure and reflective materials. Studies are grouped into two modelling scales: neighbourhood scale, building-resolving (i.e., microscale); and city scale, neighbourhood-resolving (i.e., mesoscale). Street tree cooling has primarily been assessed at the microscale, whereas mesoscale modelling has favoured reflective roof treatments, which are attributed to model physics limitations at each scale. We develop 25 criteria to assess contextualization and reliability of each study based on metadata reporting and methodological quality, respectively. Studies have shortcomings with respect to neighbourhood characterization, reporting areal coverages of heat mitigation implementations, evaluation of base case simulations, and evaluation of modelled physical processes relevant to heat reduction. To aid comparison among studies, we introduce two metrics: the Albedo Cooling Effectiveness (ACE), and the Vegetation Cooling Effectiveness (VCE). A sub-sample of 47 higher quality studies suggests that high reflectivity coatings or materials offer ≈0.2-0.6 °C cooling per 0.10 neighbourhood albedo increase, and that trees yield ≈0.3 °C cooling per 0.10 canopy cover increase, for afternoon clear-sky summer conditions. VCE of low vegetation and green roofs varies more strongly between studies. Both ACE and VCE exhibit a striking dependence on model choice and model scale, particularly for albedo and roof-level implementations, suggesting that much of the variation of cooling magnitudes between studies may be attributed to model physics representation. We conclude that evaluation of the base case simulation is not a sufficient prerequisite for accurate simulation of heat mitigation strategy cooling. We identify a three-phase framework for assessment of the suitability of a numerical model for a heat mitigation experiment, which emphasizes assessment of urban canopy layer mixing and of the physical processes associated with the heat reduction implementation. Based on our findings, we include recommendations for optimal design and communication of urban heat mitigation simulation studies.
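A back-of-envelope sketch of the two effectiveness metrics, assuming they are defined as modelled cooling normalised to a 0.10 increase in neighbourhood albedo (ACE) or vegetated/canopy cover (VCE); the exact definitions are in the paper, and the scenario numbers below are assumptions chosen to fall inside the reported ranges.

```python
# Cooling-effectiveness arithmetic under an assumed normalisation of 0.10 cover/albedo.
def cooling_effectiveness(delta_T_cooling_C, delta_fraction):
    """Cooling (deg C) normalised to a 0.10 increase in albedo or vegetated fraction."""
    return delta_T_cooling_C / (delta_fraction / 0.10)

# Reflective-roof scenario: 1.2 deg C afternoon cooling for a 0.25 albedo increase.
print(f"ACE = {cooling_effectiveness(1.2, 0.25):.2f} deg C per 0.10 albedo")   # 0.48, within 0.2-0.6
# Street-tree scenario: 0.9 deg C cooling for a 0.30 increase in canopy cover.
print(f"VCE = {cooling_effectiveness(0.9, 0.30):.2f} deg C per 0.10 canopy")   # 0.30
```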

Journal ArticleDOI
TL;DR: The authors argue that the brain is the biological substrate from which both addiction and the capacity for behavior change arise, call for an intensified neuroscientific study of recovery, and contend that current disagreements in the field reveal the need for multidisciplinary research that integrates neuroscientific, behavioral, clinical and sociocultural perspectives.

Journal ArticleDOI
15 Jul 2021
TL;DR: In this article, the authors compare 13 nitrogen budget datasets covering 115 countries and regions over 1961-2015 and find that the most uncertain nitrogen budget terms by country showed ranges as large as their medians, revealing areas for improvement.
Abstract: Input–output estimates of nitrogen on cropland are essential for improving nitrogen management and better understanding the global nitrogen cycle. Here, we compare 13 nitrogen budget datasets covering 115 countries and regions over 1961–2015. Although most datasets showed similar spatiotemporal patterns, some annual estimates varied widely among them, resulting in large ranges and uncertainty. In 2010, global medians (in TgN yr−1) and associated minimum–maximum ranges were 73 (64–84) for global harvested crop nitrogen; 161 (139–192) for total nitrogen inputs; 86 (68–97) for nitrogen surplus; and 46% (40–53%) for nitrogen use efficiency. Some of the most uncertain nitrogen budget terms by country showed ranges as large as their medians, revealing areas for improvement. A benchmark nitrogen budget dataset, derived from central tendencies of the original datasets, can be used in model comparisons and inform sustainable nitrogen management in food systems. Existing datasets of nitrogen (N) balance in agriculture are often discrepant. Comparing 13 of them regarding five metrics (fertilizer application, manure application, biological N fixation, atmospheric deposition, and N harvested as crop products) over 1961–2015 reveals why. Recommendations for improving N quantification and an N budget benchmark dataset are also proposed.
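A minimal worked example of how the budget terms above relate, assuming the standard definitions surplus = inputs - harvested N and NUE = harvested N / inputs. Because the reported figures are per-term medians across 13 datasets, they need not close exactly (161 - 73 = 88 versus the reported median surplus of 86 TgN per year, and 73/161 is about 45% versus the reported 46%).

```python
# Nitrogen budget arithmetic under the assumed standard definitions.
def n_budget(inputs_tg, harvested_tg):
    surplus = inputs_tg - harvested_tg          # N not removed in crop products
    nue = harvested_tg / inputs_tg              # nitrogen use efficiency
    return surplus, nue

surplus, nue = n_budget(inputs_tg=161, harvested_tg=73)   # 2010 global medians from the text
print(f"surplus ~ {surplus} TgN/yr, NUE ~ {nue:.0%}")     # surplus ~ 88 TgN/yr, NUE ~ 45%
```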

Journal ArticleDOI
27 Jan 2021-Nature
TL;DR: In this paper, a neuroprosthetic baroreflex was developed to restore haemodynamic stability after spinal cord injury; this instability arises from the interruption of supraspinal efferent commands to sympathetic circuits located in the spinal cord.
Abstract: Spinal cord injury (SCI) induces haemodynamic instability that threatens survival1–3, impairs neurological recovery4,5, increases the risk of cardiovascular disease6,7, and reduces quality of life8,9. Haemodynamic instability in this context is due to the interruption of supraspinal efferent commands to sympathetic circuits located in the spinal cord10, which prevents the natural baroreflex from controlling these circuits to adjust peripheral vascular resistance. Epidural electrical stimulation (EES) of the spinal cord has been shown to compensate for interrupted supraspinal commands to motor circuits below the injury11, and restored walking after paralysis12. Here, we leveraged these concepts to develop EES protocols that restored haemodynamic stability after SCI. We established a preclinical model that enabled us to dissect the topology and dynamics of the sympathetic circuits, and to understand how EES can engage these circuits. We incorporated these spatial and temporal features into stimulation protocols to conceive a clinical-grade biomimetic haemodynamic regulator that operates in a closed loop. This ‘neuroprosthetic baroreflex’ controlled haemodynamics for extended periods of time in rodents, non-human primates and humans, after both acute and chronic SCI. We will now conduct clinical trials to turn the neuroprosthetic baroreflex into a commonly available therapy for people with SCI. An epidural spinal cord stimulation system regulates blood pressure in the acute and chronic phases of spinal cord injury.
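As a purely conceptual illustration of the closed-loop principle (not the clinical regulator or its stimulation protocols), the sketch below uses a simple proportional-integral rule to adjust a bounded stimulation amplitude so that a crude first-order pressure model settles at a target; all gains and dynamics are made up.

```python
# Generic closed-loop toy: PI controller driving a first-order "pressure" model.
def run_closed_loop(target_map=80.0, steps=200, dt=0.5):
    map_mmhg, amplitude, integral = 55.0, 0.0, 0.0      # hypotensive start, no stimulation
    kp, ki, gain, tau = 0.05, 0.01, 30.0, 10.0          # invented controller and plant parameters
    history = []
    for _ in range(steps):
        error = target_map - map_mmhg
        integral += error * dt
        amplitude = max(0.0, min(1.0, kp * error + ki * integral))   # bounded stimulation command
        # first-order pressure response toward baseline plus stimulation effect
        map_mmhg += dt / tau * (55.0 + gain * amplitude - map_mmhg)
        history.append(map_mmhg)
    return history

print(f"final MAP ~ {run_closed_loop()[-1]:.1f} mmHg")   # settles near the target in this toy
```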

Journal ArticleDOI
TL;DR: In this paper, the authors report that North American deer mice (Peromyscus maniculatus) are susceptible to SARS-CoV-2 infection following intranasal exposure to a human isolate, resulting in viral replication in the upper and lower respiratory tract with little or no signs of disease.
Abstract: Widespread circulation of SARS-CoV-2 in humans raises the theoretical risk of reverse zoonosis events with wildlife, reintroductions of SARS-CoV-2 into permissive nondomesticated animals. Here we report that North American deer mice (Peromyscus maniculatus) are susceptible to SARS-CoV-2 infection following intranasal exposure to a human isolate, resulting in viral replication in the upper and lower respiratory tract with little or no signs of disease. Further, shed infectious virus is detectable in nasal washes, oropharyngeal and rectal swabs, and viral RNA is detectable in feces and occasionally urine. We further show that deer mice are capable of transmitting SARS-CoV-2 to naive deer mice through direct contact. The extent to which these observations may translate to wild deer mouse populations remains unclear, and the risk of reverse zoonosis and/or the potential for the establishment of Peromyscus rodents as a North American reservoir for SARS-CoV-2 remains unknown.

Journal ArticleDOI
16 Mar 2021
TL;DR: The meta-analysis demonstrates the suitability of the 5-level validation scale for assessing targeted eDNA assays and provides guidance on validation and reporting standards for newly developed assays.
Abstract: The use of environmental DNA (eDNA) analysis for species monitoring requires rigorous validation, from field sampling to the analysis of PCR-based results, for meaningful application and interpretation. Assays targeting eDNA released by individual species are typically validated with no predefined criteria to answer specific research questions in one ecosystem. Hence, the general applicability of assays, as well as associated uncertainties and limitations, often remain undetermined. The absence of clear guidelines for assay validation prevents targeted eDNA assays from being incorporated into species monitoring and policy; thus, their establishment is essential for realizing the potential of eDNA-based surveys. We describe the measures and tests necessary for successful validation of targeted eDNA assays and the associated pitfalls to form the basis of guidelines. A list of 122 variables was compiled, consolidated into 14 thematic blocks (e.g., “in silico analysis”), and arranged on a 5-level validation scale from “incomplete” to “operational” with defined minimum validation criteria for each level. These variables were evaluated for 546 published single-species assays. The resulting dataset was used to provide an overview of current validation practices and test the applicability of the validation scale for future assay rating. Of the 122 variables, 20% to 76% were reported; the majority (30%) of investigated assays were classified as Level 1 (incomplete), and 15% did not achieve this first level. These assays were characterised by minimal in silico and in vitro testing, but their share in annually published eDNA assays has declined since 2014. The meta-analysis demonstrates the suitability of the 5-level validation scale for assessing targeted eDNA assays. It is a user-friendly tool to evaluate previously published assays for future research and routine monitoring, while also enabling the appropriate interpretation of results. Finally, it provides guidance on validation and reporting standards for newly developed assays.
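A small sketch of how a scale with per-level minimum criteria could be applied to rate an assay: the assay's level is the highest one whose criteria are all reported. The criterion names and the example assay are invented placeholders standing in for the paper's 122 variables and 14 thematic blocks, and the level definitions here are assumptions, not the published scale.

```python
# Hypothetical rating of an assay against per-level minimum criteria.
LEVEL_CRITERIA = {
    1: {"in_silico_specificity"},
    2: {"in_silico_specificity", "in_vitro_specificity"},
    3: {"in_silico_specificity", "in_vitro_specificity", "detection_limit"},
    4: {"in_silico_specificity", "in_vitro_specificity", "detection_limit", "field_testing"},
    5: {"in_silico_specificity", "in_vitro_specificity", "detection_limit", "field_testing",
        "routine_monitoring_evaluation"},
}

def validation_level(reported):
    """Return the highest level whose minimum criteria are all reported (0 = below Level 1)."""
    level = 0
    for lvl, needed in sorted(LEVEL_CRITERIA.items()):
        if needed <= reported:                  # set inclusion: all criteria met
            level = lvl
    return level

assay = {"in_silico_specificity", "in_vitro_specificity"}   # example of a partially validated assay
print(validation_level(assay))                              # 2
```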