
Showing papers by "University of Nevada, Reno" published in 2019


Journal ArticleDOI
01 May 2019-Nature
TL;DR: A comprehensive assessment of the world’s rivers and their connectivity shows that only 37 per cent of rivers longer than 1,000 kilometres remain free-flowing over their entire length.
Abstract: Free-flowing rivers (FFRs) support diverse, complex and dynamic ecosystems globally, providing important societal and economic services. Infrastructure development threatens the ecosystem processes, biodiversity and services that these rivers support. Here we assess the connectivity status of 12 million kilometres of rivers globally and identify those that remain free-flowing in their entire length. Only 37 per cent of rivers longer than 1,000 kilometres remain free-flowing over their entire length and 23 per cent flow uninterrupted to the ocean. Very long FFRs are largely restricted to remote regions of the Arctic and of the Amazon and Congo basins. In densely populated areas only a few very long rivers remain free-flowing, such as the Irrawaddy and Salween. Dams and reservoirs and their up- and downstream propagation of fragmentation and flow regulation are the leading contributors to the loss of river connectivity. By applying a new method to quantify riverine connectivity and map FFRs, we provide a foundation for concerted global and national strategies to maintain or restore them.

1,071 citations


Journal ArticleDOI
29 Mar 2019-Science
TL;DR: A global, quantitative assessment of the amphibian chytridiomycosis panzootic demonstrates its role in the decline of at least 501 amphibian species over the past half-century and represents the greatest recorded loss of biodiversity attributable to a disease.
Abstract: Anthropogenic trade and development have broken down dispersal barriers, facilitating the spread of diseases that threaten Earth's biodiversity. We present a global, quantitative assessment of the amphibian chytridiomycosis panzootic, one of the most impactful examples of disease spread, and demonstrate its role in the decline of at least 501 amphibian species over the past half-century, including 90 presumed extinctions. The effects of chytridiomycosis have been greatest in large-bodied, range-restricted anurans in wet climates in the Americas and Australia. Declines peaked in the 1980s, and only 12% of declined species show signs of recovery, whereas 39% are experiencing ongoing decline. There is risk of further chytridiomycosis outbreaks in new areas. The chytridiomycosis panzootic represents the greatest recorded loss of biodiversity attributable to a disease.

680 citations


Journal ArticleDOI
12 Oct 2019-Polymers
TL;DR: An overview of a diverse range of fibers, their properties, functionality, classification, and various fiber composite manufacturing techniques is presented to discover the optimized fiber-reinforced composite material for significant applications.
Abstract: Composites have been found to be the most promising and discerning material available in this century. Presently, composites reinforced with fibers of synthetic or natural materials are gaining more importance as demands for lightweight materials with high strength for specific applications are growing in the market. Fiber-reinforced polymer composite offers not only a high strength to weight ratio, but also reveals exceptional properties such as high durability; stiffness; damping property; flexural strength; and resistance to corrosion, wear, impact, and fire. These wide ranges of diverse features have led composite materials to find applications in mechanical, construction, aerospace, automobile, biomedical, marine, and many other manufacturing industries. Performance of composite materials predominantly depends on their constituent elements and manufacturing techniques; therefore, the functional properties of various fibers available worldwide, their classifications, and the manufacturing techniques used to fabricate composite materials need to be studied in order to determine the optimal characteristics of the material for the desired application. An overview of a diverse range of fibers, their properties, functionality, classification, and various fiber composite manufacturing techniques is presented to discover the optimized fiber-reinforced composite material for significant applications. Their exceptional performance in numerous fields of application has made fiber-reinforced composite materials a promising alternative to solitary metals or alloys.

619 citations


Journal ArticleDOI
01 Aug 2019-Nature
TL;DR: A deep learning approach that predicts the risk of acute kidney injury and provides confidence assessments and a list of the clinical features that are most salient to each prediction, alongside predicted future trajectories for clinically relevant blood tests are developed.
Abstract: The early prediction of deterioration could have an important role in supporting healthcare professionals, as an estimated 11% of deaths in hospital follow a failure to promptly recognize and treat deteriorating patients1. To achieve this goal requires predictions of patient risk that are continuously updated and accurate, and delivered at an individual level with sufficient context and enough time to act. Here we develop a deep learning approach for the continuous risk prediction of future deterioration in patients, building on recent work that models adverse events from electronic health records2–17 and using acute kidney injury—a common and potentially life-threatening condition18—as an exemplar. Our model was developed on a large, longitudinal dataset of electronic health records that cover diverse clinical environments, comprising 703,782 adult patients across 172 inpatient and 1,062 outpatient sites. Our model predicts 55.8% of all inpatient episodes of acute kidney injury, and 90.2% of all acute kidney injuries that required subsequent administration of dialysis, with a lead time of up to 48 h and a ratio of 2 false alerts for every true alert. In addition to predicting future acute kidney injury, our model provides confidence assessments and a list of the clinical features that are most salient to each prediction, alongside predicted future trajectories for clinically relevant blood tests9. Although the recognition and prompt treatment of acute kidney injury is known to be challenging, our approach may offer opportunities for identifying patients at risk within a time window that enables early treatment. A deep learning approach that predicts the risk of acute kidney injury may help to identify patients at risk of health deterioration within a time window that enables early treatment.

617 citations


Journal ArticleDOI
01 Dec 2019-Nature
TL;DR: The development of titanium–copper alloys that have a high constitutional supercooling capacity as a result of partitioning of the alloying element during solidification, which can override the negative effect of a high thermal gradient in the laser-melted region during additive manufacturing.
Abstract: Additive manufacturing, often known as three-dimensional (3D) printing, is a process in which a part is built layer-by-layer and is a promising approach for creating components close to their final (net) shape. This process is challenging the dominance of conventional manufacturing processes for products with high complexity and low material waste1. Titanium alloys made by additive manufacturing have been used in applications in various industries. However, the intrinsic high cooling rates and high thermal gradient of the fusion-based metal additive manufacturing process often leads to a very fine microstructure and a tendency towards almost exclusively columnar grains, particularly in titanium-based alloys1. (Columnar grains in additively manufactured titanium components can result in anisotropic mechanical properties and are therefore undesirable2.) Attempts to optimize the processing parameters of additive manufacturing have shown that it is difficult to alter the conditions to promote equiaxed growth of titanium grains3. In contrast with other common engineering alloys such as aluminium, there is no commercial grain refiner for titanium that is able to effectively refine the microstructure. To address this challenge, here we report on the development of titanium-copper alloys that have a high constitutional supercooling capacity as a result of partitioning of the alloying element during solidification, which can override the negative effect of a high thermal gradient in the laser-melted region during additive manufacturing. Without any special process control or additional treatment, our as-printed titanium-copper alloy specimens have a fully equiaxed fine-grained microstructure. They also display promising mechanical properties, such as high yield strength and uniform elongation, compared to conventional alloys under similar processing conditions, owing to the formation of an ultrafine eutectoid microstructure that appears as a result of exploiting the high cooling rates and multiple thermal cycles of the manufacturing process. We anticipate that this approach will be applicable to other eutectoid-forming alloy systems, and that it will have applications in the aerospace and biomedical industries.

489 citations


Journal ArticleDOI
TL;DR: This article describes a community initiative to identify major unsolved scientific problems in hydrology, motivated by a need for stronger harmonisation of research efforts. Despite the diversity of the participants (230 scientists in total), the process revealed much about community priorities and the state of the science: a preference for continuity in research questions rather than radical departures or redirections from past and current work.
Abstract: This paper is the outcome of a community initiative to identify major unsolved scientific problems in hydrology motivated by a need for stronger harmonisation of research efforts. The procedure involved a public consultation through online media, followed by two workshops through which a large number of potential science questions were collated, prioritised, and synthesised. In spite of the diversity of the participants (230 scientists in total), the process revealed much about community priorities and the state of our science: a preference for continuity in research questions rather than radical departures or redirections from past and current work. Questions remain focused on the process-based understanding of hydrological variability and causality at all space and time scales. Increased attention to environmental change drives a new emphasis on understanding how change propagates across interfaces within the hydrological system and across disciplinary boundaries. In particular, the expansion of the human footprint raises a new set of questions related to human interactions with nature and water cycle feedbacks in the context of complex water management problems. We hope that this reflection and synthesis of the 23 unsolved problems in hydrology will help guide research efforts for some years to come.

469 citations


Journal ArticleDOI
TL;DR: Clinical science seems to have reached a tipping point, and a new paradigm is beginning to emerge that is questioning the validity and utility of the medical illness model, which assumes that latent disease entities are targeted with specific therapy protocols.
Abstract: Clinical science seems to have reached a tipping point. It appears that a new paradigm is beginning to emerge that is questioning the validity and utility of the medical illness model, which assumes that latent disease entities are targeted with specific therapy protocols. A new generation of evidence-based care has begun to move toward process-based therapies to target core mediators and moderators based on testable theories. This could represent a paradigm shift in clinical science with far-reaching implications. Clinical science might see a decline of named therapies defined by set technologies, a decline of broad schools, a rise of testable models, a rise of mediation and moderation studies, the emergence of new forms of diagnosis based on functional analysis, a move from nomothetic to idiographic approaches, and a move toward processes that specify modifiable elements. These changes could integrate or bridge different treatment orientations, settings, and even cultures.

403 citations


Journal ArticleDOI
TL;DR: In this paper, the authors bring together hydrologists, critical zone scientists, and ESM developers to explore how hillslope structures may modulate ESM grid-level water, energy, and biogeochemical fluxes.
Abstract: Earth System Models (ESMs) are essential tools for understanding and predicting global change, but they cannot explicitly resolve hillslope‐scale terrain structures that fundamentally organize water, energy, and biogeochemical stores and fluxes at subgrid scales. Here we bring together hydrologists, Critical Zone scientists, and ESM developers, to explore how hillslope structures may modulate ESM grid‐level water, energy, and biogeochemical fluxes. In contrast to the one‐dimensional (1‐D), 2‐ to 3‐m deep, and free‐draining soil hydrology in most ESM land models, we hypothesize that 3‐D, lateral ridge‐to‐valley flow through shallow and deep paths and insolation contrasts between sunny and shady slopes are the top two globally quantifiable organizers of water and energy (and vegetation) within an ESM grid cell. We hypothesize that these two processes are likely to impact ESM predictions where (and when) water and/or energy are limiting. We further hypothesize that, if implemented in ESM land models, these processes will increase simulated continental water storage and residence time, buffering terrestrial ecosystems against seasonal and interannual droughts. We explore efficient ways to capture these mechanisms in ESMs and identify critical knowledge gaps preventing us from scaling up hillslope to global processes. One such gap is our extremely limited knowledge of the subsurface, where water is stored (supporting vegetation) and released to stream baseflow (supporting aquatic ecosystems). We conclude with a set of organizing hypotheses and a call for global syntheses activities and model experiments to assess the impact of hillslope hydrology on global change predictions.

274 citations


Journal ArticleDOI
TL;DR: A large drug combination dataset, consisting of 11,576 experiments from 910 combinations across 85 molecularly characterized cancer cell lines, and results of a DREAM Challenge to evaluate computational strategies for predicting synergistic drug pairs and biomarkers are reported.
Abstract: The effectiveness of most cancer targeted therapies is short-lived. Tumors often develop resistance that might be overcome with drug combinations. However, the number of possible combinations is vast, necessitating data-driven approaches to find optimal patient-specific treatments. Here we report AstraZeneca's large drug combination dataset, consisting of 11,576 experiments from 910 combinations across 85 molecularly characterized cancer cell lines, and results of a DREAM Challenge to evaluate computational strategies for predicting synergistic drug pairs and biomarkers. In total, 160 teams participated, providing comprehensive methodological development and benchmarking. Winning methods incorporate prior knowledge of drug-target interactions. Synergy is predicted with an accuracy matching biological replicates for >60% of combinations. However, 20% of drug combinations are poorly predicted by all methods. Genomic rationales for synergy predictions are identified, including antagonism of ADAM17 inhibitors when combined with PIK3CB/D inhibition, in contrast to synergy when combined with other PI3K-pathway inhibitors in PIK3CA-mutant cells.

227 citations


Journal ArticleDOI
05 Jul 2019-Science
TL;DR: It is found that submicrometer-size magnesium samples exhibit high plasticity that is far greater than for their bulk counterparts, which should allow development of high-ductility magnesium and other metal alloys.
Abstract: Lightweight magnesium alloys are attractive as structural materials for improving energy efficiency in applications such as weight reduction of transportation vehicles. One major obstacle for widespread applications is the limited ductility of magnesium, which has been attributed to ⟨c + a⟩ dislocations failing to accommodate plastic strain. We demonstrate, using in situ transmission electron microscope mechanical testing, that ⟨c + a⟩ dislocations of various characters can accommodate considerable plasticity through gliding on pyramidal planes. We found that submicrometer-size magnesium samples exhibit high plasticity that is far greater than for their bulk counterparts. Small crystal size usually brings high stress, which in turn activates more ⟨c + a⟩ dislocations in magnesium to accommodate plasticity, leading to both high strength and good plasticity.

226 citations


Journal ArticleDOI
26 Jun 2019-Nature
TL;DR: The experimental and molecular dynamics results indicate that a theory beyond classical nucleation theory is needed to describe early-stage nucleation at the atomic scale, and it is anticipated that the reported approach will open the door to the study of many fundamental problems in materials science, nanoscience, condensed matter physics and chemistry.
Abstract: Nucleation plays a critical role in many physical and biological phenomena that range from crystallization, melting and evaporation to the formation of clouds and the initiation of neurodegenerative diseases1-3. However, nucleation is a challenging process to study experimentally, especially in its early stages, when several atoms or molecules start to form a new phase from a parent phase. A number of experimental and computational methods have been used to investigate nucleation processes4-17, but experimental determination of the three-dimensional atomic structure and the dynamics of early-stage nuclei has been unachievable. Here we use atomic electron tomography to study early-stage nucleation in four dimensions (that is, including time) at atomic resolution. Using FePt nanoparticles as a model system, we find that early-stage nuclei are irregularly shaped, each has a core of one to a few atoms with the maximum order parameter, and the order parameter gradient points from the core to the boundary of the nucleus. We capture the structure and dynamics of the same nuclei undergoing growth, fluctuation, dissolution, merging and/or division, which are regulated by the order parameter distribution and its gradient. These experimental observations are corroborated by molecular dynamics simulations of heterogeneous and homogeneous nucleation in liquid-solid phase transitions of Pt. Our experimental and molecular dynamics results indicate that a theory beyond classical nucleation theory1,2,18 is needed to describe early-stage nucleation at the atomic scale. We anticipate that the reported approach will open the door to the study of many fundamental problems in materials science, nanoscience, condensed matter physics and chemistry, such as phase transition, atomic diffusion, grain boundary dynamics, interface motion, defect dynamics and surface reconstruction with four-dimensional atomic resolution.

Journal ArticleDOI
TL;DR: A comprehensive survey on the application of blockchain in the smart grid, identifying the significant security challenges of smart grid scenarios that can be addressed by blockchain and reviewing recent blockchain-based research works that address these security issues.
Abstract: The concept of the smart grid has been introduced as a new vision of the conventional power grid to find an efficient way of integrating green and renewable energy technologies. The Internet-connected smart grid, also called the energy Internet, is emerging as an innovative approach to making energy available from anywhere at any time. The ultimate goal of these developments is to build a sustainable society. However, integrating and coordinating a large number of growing connections can be a challenging issue for the traditional centralized grid system. Consequently, the smart grid is undergoing a transformation from its centralized form to a decentralized topology. On the other hand, blockchain has some excellent features which make it a promising application for the smart grid paradigm. In this paper, we aim to provide a comprehensive survey on the application of blockchain in the smart grid. As such, we identify the significant security challenges of smart grid scenarios that can be addressed by blockchain. Then, we present a number of recent blockchain-based research works from the literature that address security issues in the smart grid. We also summarize several related practical projects, trials, and products that have emerged recently. Finally, we discuss essential research challenges and future directions for applying blockchain to smart grid security issues.

Journal ArticleDOI
TL;DR: Important characteristics and considerations in the selection, design, and implementation of various prominent and unique robotic artificial muscles for biomimetic robots are discussed, and perspectives on next-generation muscle-powered robots are provided.
Abstract: Robotic artificial muscles are a subset of artificial muscles that are capable of producing biologically inspired motions useful for robot systems, i.e., large power-to-weight ratios, inherent compliance, and large range of motions. These actuators, ranging from shape memory alloys to dielectric elastomers, are increasingly popular for biomimetic robots as they may operate without using complex linkage designs or other cumbersome mechanisms. Recent achievements in fabrication, modeling, and control methods have significantly contributed to their potential utilization in a wide range of applications. However, no survey paper has gone into depth regarding considerations pertaining to their selection, design, and usage in generating biomimetic motions. In this paper, we discuss important characteristics and considerations in the selection, design, and implementation of various prominent and unique robotic artificial muscles for biomimetic robots, and provide perspectives on next-generation muscle-powered robots.

Journal ArticleDOI
TL;DR: It is shown that the proposed Hybrid SVM can reach a classification accuracy of up to 99.38% for the EEG datasets and is an efficient tool for neuroscientists to detect epileptic seizure in EEG.
Abstract: The aim of this study is to establish a hybrid model for epileptic seizure detection with genetic algorithm (GA) and particle swarm optimization (PSO) to determine the optimum parameters of support vector machines (SVMs) for classification of EEG data. SVMs are one of the most robust machine learning techniques and have been extensively used in many application areas. The setting of the kernel parameters for SVMs during training affects the classification accuracy. We used GA- and PSO-based approaches to optimize the SVM parameters. Compared to the GA-based approach, the PSO-based approach significantly improves the classification accuracy. It is shown that the proposed hybrid SVM can reach a classification accuracy of up to 99.38% for the EEG datasets. Hence, the proposed hybrid SVM is an efficient tool for neuroscientists to detect epileptic seizures in EEG.
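As a rough illustration of the PSO-tuned SVM idea described above (not the authors' implementation; the synthetic data, parameter ranges, and swarm settings below are placeholder assumptions, since the EEG features and settings used in the paper are not given here), a minimal particle swarm search over the RBF-kernel parameters C and gamma might look like this with scikit-learn:

```python
# Minimal sketch: PSO search over SVM hyperparameters (C, gamma).
# Illustrative only -- synthetic data stands in for EEG features,
# and swarm settings are arbitrary placeholders.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=300, n_features=20, random_state=0)

def fitness(log_c, log_gamma):
    """Mean 5-fold CV accuracy for an RBF SVM at the given (log10 C, log10 gamma)."""
    clf = SVC(C=10.0 ** log_c, gamma=10.0 ** log_gamma, kernel="rbf")
    return cross_val_score(clf, X, y, cv=5).mean()

# Particle positions are (log10 C, log10 gamma); velocities start at zero.
n_particles, n_iters = 10, 20
pos = rng.uniform([-2, -4], [3, 1], size=(n_particles, 2))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_score = np.array([fitness(*p) for p in pos])
gbest = pbest[pbest_score.argmax()].copy()

w, c1, c2 = 0.7, 1.5, 1.5  # inertia and acceleration coefficients (assumed)
for _ in range(n_iters):
    r1, r2 = rng.random((2, n_particles, 1))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, [-2, -4], [3, 1])
    scores = np.array([fitness(*p) for p in pos])
    improved = scores > pbest_score
    pbest[improved], pbest_score[improved] = pos[improved], scores[improved]
    gbest = pbest[pbest_score.argmax()].copy()

print("best (C, gamma):", 10.0 ** gbest, "CV accuracy:", pbest_score.max())
```

A GA-based variant would keep the same (C, gamma) encoding and fitness function but replace the velocity update with selection, crossover, and mutation.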

Journal ArticleDOI
TL;DR: Fundamental concepts, solution algorithms, and application guidance associated with using infrastructure-based LiDAR sensors to accurately detect and track pedestrians and vehicles at intersections are explored.
Abstract: Light Detection and Ranging (LiDAR) is a remote sensing technology widely used in many areas, ranging from making precise medical equipment to creating accurate elevation maps of farmlands. In transportation, although it has been used to assist some design and planning work, the application has been predominantly focused on autonomous vehicles, despite its great potential for precise detection and tracking of all road users if implemented in the field. This paper explores fundamental concepts, solution algorithms, and application guidance associated with using infrastructure-based LiDAR sensors to accurately detect and track pedestrians and vehicles at intersections. Based on LiDAR data collected in the field, investigations were conducted in the order of background filtering, object clustering, pedestrian and vehicle classification, and tracking. The results of the analysis include accurate and real-time information on the presence, position, velocity, and direction of pedestrians and vehicles. By studying the data from infrastructure-mounted LiDAR sensors at intersections, this paper offers insights into some critical techniques that are valuable to both researchers and practitioners toward field implementation of LiDAR sensors.
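To make the pipeline order above concrete (background filtering first, then clustering of the remaining foreground points), here is a simplified sketch; it is not the authors' algorithm, and the voxel size, occupancy threshold, and DBSCAN parameters are invented placeholders:

```python
# Rough sketch of an infrastructure-LiDAR processing pipeline:
# (1) learn a static-background occupancy grid from many frames,
# (2) drop background points from a new frame, (3) cluster what remains.
# Parameters are illustrative placeholders, not values from the paper.
import numpy as np
from sklearn.cluster import DBSCAN

VOXEL = 0.5  # voxel edge length in metres (assumed)

def voxel_keys(points):
    """Map Nx3 points to integer voxel indices."""
    return set(map(tuple, np.floor(points / VOXEL).astype(int)))

def learn_background(frames, min_occupancy=0.8):
    """Voxels occupied in most frames are treated as static background."""
    counts = {}
    for pts in frames:
        for key in voxel_keys(pts):
            counts[key] = counts.get(key, 0) + 1
    return {k for k, c in counts.items() if c / len(frames) >= min_occupancy}

def detect_objects(points, background, eps=1.0, min_samples=5):
    """Remove background points, then cluster the remainder with DBSCAN."""
    keys = np.floor(points / VOXEL).astype(int)
    mask = np.array([tuple(k) not in background for k in keys])
    fg = points[mask]
    if len(fg) == 0:
        return []
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(fg)
    # Return one centroid per cluster (label -1 is noise).
    return [fg[labels == lab].mean(axis=0) for lab in set(labels) if lab != -1]

# Example with random stand-in data (real input would be LiDAR frames).
frames = [np.random.uniform(0, 20, size=(2000, 3)) for _ in range(50)]
background = learn_background(frames)
centroids = detect_objects(np.random.uniform(0, 20, size=(2000, 3)), background)
print(f"{len(centroids)} candidate objects")
```

Classification and tracking would then operate on the per-cluster centroids and their frame-to-frame association.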

Journal ArticleDOI
26 Sep 2019-ACS Nano
TL;DR: The graphene monolith has an ultra-high through-plane thermal conductivity of 143 W m-1 K-1 exceeding that of many metals, and a low compressive modulus comparable to that of silicones, demonstrating the superior ability to solve the interfacial heat transfer issues in electronic systems.
Abstract: Along with the technology evolution for dense integration of high-power, high-frequency devices in electronics, the accompanying interfacial heat transfer problem leads to urgent demands for advanced thermal interface materials (TIMs) with both high through-plane thermal conductivity and good compressibility. Most metals have satisfactory thermal conductivity but relatively high compressive modulus, and soft silicones are typically thermal insulators (0.3 W m-1 K-1). Currently, it is a great challenge to develop a soft material with the thermal conductivity up to metal level for TIM application. This study solves this problem by constructing a graphene-based microstructure composed of mainly vertical graphene and a thin cap of horizontal graphene layers on both the top and bottom sides through a mechanical machining process to manipulate the stacked architecture of conventional graphene paper. The resultant graphene monolith has an ultrahigh through-plane thermal conductivity of 143 W m-1 K-1, exceeding that of many metals, and a low compressive modulus of 0.87 MPa, comparable to that of silicones. In the actual TIM performance measurement, the system cooling efficiency with our graphene monolith as TIM is 3 times as high as that of the state-of-the-art commercial TIM, demonstrating the superior ability to solve the interfacial heat transfer issues in electronic systems.

Journal ArticleDOI
TL;DR: Studies suggest an association between exposure to e-cigarette marketing and lower harm perceptions of e-cigarettes, intention to use e-cigarettes, and e-cigarette trial, highlighting the need for advertising regulations that support public health goals.
Abstract: Introduction Given the lack of regulation on marketing of electronic cigarettes (e-cigarettes) in the United States and the increasing exchange of e-cigarette-related information online, it is critical to understand how e-cigarette companies market e-cigarettes and how the public engages with e-cigarette information. Methods Results are from a systematic review of peer-reviewed literature on e-cigarettes via a PubMed search through June 1, 2017. Search terms included: "e-cigarette*" or "electronic cigarette" or "electronic cigarettes" or "electronic nicotine delivery" or "vape" or "vaping." Experimental studies, quasi-experimental studies, observational studies, qualitative studies, and mixed methods studies providing empirical findings on e-cigarette marketing and communication (ie, nonmarketing communication among the public) were included. Results One hundred twenty-four publications on e-cigarette marketing and communication were identified. They covered topics including e-cigarette advertisement claims/promotions and exposure/receptivity, the effect of e-cigarette advertisements on e-cigarette and cigarette use, public engagement with e-cigarette information, and the public's portrayal of e-cigarettes. Studies show increases in e-cigarette marketing expenditures and online engagement through social media over time, that e-cigarettes are often framed as an alternative to combustible cigarettes, and that e-cigarette advertisement exposure may be associated with e-cigarette trial in adolescents and young adults. Discussion Few studies examine the effects of e-cigarette marketing on perceptions and e-cigarette and cigarette use. Evidence suggests that exposure to e-cigarette advertisements affects perceptions and trial of e-cigarettes, but there is no evidence that exposure affects cigarette use. No studies examined how exposure to e-cigarette communication, particularly misleading or inaccurate information, impacts e-cigarette and tobacco use behaviors. Implications The present article provides a comprehensive review of e-cigarette marketing and how the public engages with e-cigarette information. Studies suggest an association between exposure to e-cigarette marketing and lower harm perceptions of e-cigarettes, intention to use e-cigarettes, and e-cigarette trial, highlighting the need for advertising regulations that support public health goals. Findings from this review also present the methodological limitations of the existing research (primarily due to cross-sectional and correlational analyses) and underscore the need for timely, rigorous research to provide an accurate understanding of e-cigarette marketing and communication and its impact on e-cigarette and tobacco product use.

Journal ArticleDOI
01 Jul 2019
TL;DR: A review and update of the growing body of research that shows that sediments in remote mountain lakes archive regional and global environmental changes, including those linked to climate change, altered biogeochemical cycles, and changes in dust composition and deposition, atmospheric fertilization, and biological manipulations can be found in this paper.
Abstract: Mountain lakes are often situated in protected natural areas, a feature that leads to their role as sentinels of global environmental change. Despite variations in latitude, mountain lakes share many features, including their location in catchments with steep topographic gradients, cold temperatures, high incident solar and ultraviolet radiation (UVR), and prolonged ice and snow cover. These characteristics, in turn, affect mountain lake ecosystem structure, diversity, and productivity. The lakes themselves are mostly small, and up until recently, have been characterized as oligotrophic. This paper provides a review and update of the growing body of research that shows that sediments in remote mountain lakes archive regional and global environmental changes, including those linked to climate change, altered biogeochemical cycles, and changes in dust composition and deposition, atmospheric fertilization, and biological manipulations. These archives provide an important record of global environmental change that pre-dates typical monitoring windows. Paleolimnological research at strategically selected lakes has increased our knowledge of interactions among multiple stressors and their synergistic effects on lake systems. Lakes from transects across steep climate (i.e., temperature and effective moisture) gradients in mountain regions show how environmental change alters lakes in close proximity, but at differing climate starting points. Such research in particular highlights the impacts of melting glaciers on mountain lakes. The addition of new proxies, including DNA-based techniques and advanced stable isotopic analyses, provides a gateway to addressing novel research questions about global environmental change. Recent advances in remote sensing and continuous, high-frequency, limnological measurements will improve spatial and temporal resolution and help to add records to spatial gaps including tropical and southern latitudes. Mountain lake records provide a unique opportunity for global scale assessments that provide knowledge necessary to protect the Earth system.

Journal ArticleDOI
TL;DR: In this paper, discontinuous cyclic loading tests were conducted to evaluate the effect of low-stress time intervals (LSIs) on the fatigue performance of salt rock, and the results showed that the fatigue life of the specimen subjected to cyclic compression decreased sharply in comparison with conventional fatigue tests.

Journal ArticleDOI
TL;DR: In this paper, the authors show that a long-lived magnetized neutron star with a poloidal field is fully consistent with the electromagnetic dataset, when spin down losses are dominated by gravitational wave (GW) emission.
Abstract: Multi-messenger observations of GW170817 have not conclusively established whether the merger remnant is a black hole (BH) or a neutron star (NS). We show that a long-lived magnetized NS with a poloidal field $B\approx 10^{12}$G is fully consistent with the electromagnetic dataset, when spin down losses are dominated by gravitational wave (GW) emission. The required ellipticity $\epsilon\gtrsim 10^{-5}$ can result from a toroidal magnetic field component much stronger than the poloidal component, a configuration expected from a NS newly formed from a merger. Abrupt magnetic dissipation of the toroidal component can lead to the appearance of X-ray flares, analogous to the one observed in gamma-ray burst (GRB) afterglows. In the X-ray afterglow of GW170817 we identify a low-significance ($\gtrsim 3\sigma$) temporal feature at 155 d, consistent with a sudden reactivation of the central NS. Energy injection from the NS spin down into the relativistic shock is negligible, and the underlying continuum is fully accounted for by a structured jet seen off-axis. Whereas radio and optical observations probe the interaction of this jet with the surrounding medium, observations at X-ray wavelengths, performed with adequate sampling, open a privileged window on to the merger remnant.
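For context, the spin down implied by the quoted ellipticity follows from the standard quadrupole formula (a textbook relation, not reproduced from the paper): a rigidly rotating NS with moment of inertia $I$, ellipticity $\epsilon$ and angular frequency $\Omega$ radiates $\dot{E}_{\rm GW} = \frac{32}{5}\frac{G}{c^{5}} I^{2}\epsilon^{2}\Omega^{6}$, giving $\dot{\Omega} = -\frac{32}{5}\frac{G}{c^{5}} I\epsilon^{2}\Omega^{5}$. GW losses dominate whenever $\dot{E}_{\rm GW}$ exceeds the magnetic dipole luminosity $\dot{E}_{\rm dip}\simeq B^{2}R^{6}\Omega^{4}\sin^{2}\alpha/(6c^{3})$; an order-of-magnitude check with fiducial values $I\approx 10^{45}\,\mathrm{g\,cm^{2}}$ and $R\approx 10$ km places a millisecond-period remnant with $\epsilon\sim 10^{-5}$ and $B\approx 10^{12}$ G in the GW-dominated regime, consistent with the abstract.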

Journal ArticleDOI
TL;DR: The BOULEVARD trial met its primary end point; faricimab demonstrated statistically superior visual acuity gains versus ranibizumab at week 24 in treatment-naïve patients, suggesting the benefit of simultaneous inhibition of angiopoietin-2 and VEGF-A with faricimab for patients with DME.

Journal ArticleDOI
01 Aug 2019
TL;DR: Forister et al. argue that a societal policy response to insect declines need not wait for full resolution of the many physiological, behavioral, and demographic aspects of declining insect populations, and suggest primary policy goals summarized at scales from nations to farms to homes.
Abstract: Recent regional reports and trends in biomonitoring suggest that insects are experiencing a multicontinental crisis that is apparent as reductions in abundance, diversity, and biomass. Given the centrality of insects to terrestrial ecosystems and the food chain that supports humans, the importance of addressing these declines cannot be overstated. The scientific community has understandably been focused on establishing the breadth and depth of the phenomenon and on documenting factors causing insect declines. In parallel with ongoing research, it is now time for the development of a policy consensus that will allow for a swift societal response. We point out that this response need not wait for full resolution of the many physiological, behavioral, and demographic aspects of declining insect populations. To these ends, we suggest primary policy goals summarized at scales from nations to farms to homes.

Journal ArticleDOI
TL;DR: This work highlights the signaling systems required for pollen tube navigation and the potential roles of Ca2+ signals, and suggests strategies to improve seed crop yields that are under threat from climate change.
Abstract: In flowering plants, pollen tubes undergo tip growth to deliver two nonmotile sperm to the ovule where they fuse with an egg and central cell to achieve double fertilization. This extended journey involves rapid growth and changes in gene activity that manage compatible interactions with at least seven different cell types. Nearly half of the genome is expressed in haploid pollen, which facilitates genetic analysis, even of essential genes. These unique attributes make pollen an ideal system with which to study plant cell-cell interactions, tip growth, cell migration, the modulation of cell wall integrity, and gene expression networks. We highlight the signaling systems required for pollen tube navigation and the potential roles of Ca2+ signals. The dynamics of pollen development make sexual reproduction highly sensitive to heat stress. Understanding this vulnerability may generate strategies to improve seed crop yields that are under threat from climate change.

Journal ArticleDOI
TL;DR: Methods of assessment and analysis that can integrate idiographic and nomothetic approaches in a process-based era are explored.

Journal ArticleDOI
TL;DR: It is found that in more mesic sites with high SOC concentrations, soil priming effects are more likely to be negative, with important implications for the improvement of C cycling models under global change scenarios.
Abstract: Identifying the global drivers of soil priming is essential to understanding C cycling in terrestrial ecosystems. We conducted a survey of soils across 86 globally-distributed locations, spanning a wide range of climates, biotic communities, and soil conditions, and evaluated the apparent soil priming effect using 13C-glucose labeling. Here we show that the magnitude of the positive apparent priming effect (increase in CO2 release through accelerated microbial biomass turnover) was negatively associated with soil organic carbon (SOC) content and microbial respiration rates. Our statistical modeling suggests that apparent priming effects tend to be negative in more mesic sites associated with higher SOC contents. In contrast, a single input of labile C causes positive apparent priming effects in more arid locations with low SOC contents. Our results provide solid evidence that SOC content plays a critical role in regulating apparent priming effects, with important implications for the improvement of C cycling models under global change scenarios.

Journal ArticleDOI
TL;DR: In this paper, the authors quantified changes in population abundance and distribution range of freshwater megafauna species globally and in Europe and the United States from literature and databases of the International Union for Conservation of Nature and NatureServe.
Abstract: Freshwater ecosystems are among the most diverse and dynamic ecosystems on Earth. At the same time, they are among the most threatened ecosystems but remain underrepresented in biodiversity research and conservation efforts. The rate of decline of vertebrate populations is much higher in freshwaters than in terrestrial or marine realms. Freshwater megafauna (i.e., freshwater animals that can reach a body mass ≥30 kg) are intrinsically prone to extinction due to their large body size, complex habitat requirements and slow life-history strategies such as long life span and late maturity. However, population trends and distribution changes of freshwater megafauna, at continental or global scales, remain unclear. In the present study, we compiled population data of 126 freshwater megafauna species globally from the Living Planet Database and available literature, and distribution data of 44 species inhabiting Europe and the United States from literature and databases of the International Union for Conservation of Nature and NatureServe. We quantified changes in population abundance and distribution range of freshwater megafauna species. Globally, freshwater megafauna populations declined by 88% from 1970 to 2012, with the highest declines in the Indomalaya and Palearctic realms (-99% and -97%, respectively). Among taxonomic groups, mega-fishes exhibited the greatest global decline (-94%). In addition, freshwater megafauna experienced major range contractions. For example, distribution ranges of 42% of all freshwater megafauna species in Europe contracted by more than 40% of historical areas. We highlight the various sources of uncertainty in tracking changes in populations and distributions of freshwater megafauna, such as the lack of monitoring data and taxonomic and spatial biases. The detected trends emphasize the critical plight of freshwater megafauna globally and highlight the broader need for concerted, targeted and timely conservation of freshwater biodiversity.

Journal ArticleDOI
TL;DR: This article presents the most comprehensive comparative study on pathway analysis methods available to date and discovers that most, if not all, listed approaches are biased and can produce skewed results under the null.
Abstract: Many high-throughput experiments compare two phenotypes such as disease vs. healthy, with the goal of understanding the underlying biological phenomena characterizing the given phenotype. Because of the importance of this type of analysis, more than 70 pathway analysis methods have been proposed so far. These fall into two main categories: non-topology-based (non-TB) and topology-based (TB). Although some review papers discuss this topic from different aspects, there is no systematic, large-scale assessment of such methods. Furthermore, the majority of the pathway analysis approaches rely on the assumption of uniformity of p values under the null hypothesis, which is often not true. This article presents the most comprehensive comparative study on pathway analysis methods available to date. We compare the actual performance of 13 widely used pathway analysis methods in over 1085 analyses. These comparisons were performed using 2601 samples from 75 human disease data sets and 121 samples from 11 knockout mouse data sets. In addition, we investigate the extent to which each method is biased under the null hypothesis. Together, these data and results constitute a reliable benchmark against which future pathway analysis methods could and should be tested. Overall, the results show that no method is perfect. In general, TB methods appear to perform better than non-TB methods. This is somewhat expected since the TB methods take into consideration the structure of the pathway, which is meant to describe the underlying phenomena. We also discover that most, if not all, listed approaches are biased and can produce skewed results under the null.
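The "bias under the null" check mentioned above can be illustrated with a toy example: permute phenotype labels so that no true signal exists, apply a simple gene-set test repeatedly, and ask whether the resulting p values are uniform. The sketch below is a schematic stand-in using random data and a generic competitive test, not any of the 13 benchmarked methods:

```python
# Toy illustration of a "bias under the null" check for a pathway test:
# with labels permuted (no true signal), a well-calibrated test should
# produce uniformly distributed p values. All data here are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_genes, n_samples, set_size, n_runs = 5000, 40, 50, 500
expr = rng.normal(size=(n_genes, n_samples))          # fake expression matrix
gene_set = rng.choice(n_genes, size=set_size, replace=False)
in_set = np.zeros(n_genes, dtype=bool)
in_set[gene_set] = True

pvals = []
for _ in range(n_runs):
    labels = rng.permutation([0] * 20 + [1] * 20)     # null: random phenotype labels
    a, b = expr[:, labels == 0], expr[:, labels == 1]
    t, _ = stats.ttest_ind(a, b, axis=1)              # per-gene t statistics
    # Simple competitive test: are the gene set's t stats shifted vs. the rest?
    _, p = stats.mannwhitneyu(t[in_set], t[~in_set], alternative="two-sided")
    pvals.append(p)

# Under the null, p values should be ~Uniform(0, 1); a small KS p value flags bias.
ks_stat, ks_p = stats.kstest(pvals, "uniform")
print(f"KS test vs. uniform: stat={ks_stat:.3f}, p={ks_p:.3f}")
```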

Journal ArticleDOI
TL;DR: In this article, the roles of resource availability, nutrient stoichiometry, and soil abiotic factors in driving belowground biodiversity across 16 soil chronosequences (from centuries to millennia) spanning a wide range of globally distributed ecosystem types.
Abstract: Belowground organisms play critical roles in maintaining multiple ecosystem processes, including plant productivity, decomposition, and nutrient cycling. Despite their importance, however, we have a limited understanding of how and why belowground biodiversity (bacteria, fungi, protists, and invertebrates) may change as soils develop over centuries to millennia (pedogenesis). Moreover, it is unclear whether belowground biodiversity changes during pedogenesis are similar to the patterns observed for aboveground plant diversity. Here we evaluated the roles of resource availability, nutrient stoichiometry, and soil abiotic factors in driving belowground biodiversity across 16 soil chronosequences (from centuries to millennia) spanning a wide range of globally distributed ecosystem types. Changes in belowground biodiversity during pedogenesis followed two main patterns. In lower-productivity ecosystems (i.e., drier and colder), increases in belowground biodiversity tracked increases in plant cover. In more productive ecosystems (i.e., wetter and warmer), increased acidification during pedogenesis was associated with declines in belowground biodiversity. Changes in the diversity of bacteria, fungi, protists, and invertebrates with pedogenesis were strongly and positively correlated worldwide, highlighting that belowground biodiversity shares similar ecological drivers as soils and ecosystems develop. In general, temporal changes in aboveground plant diversity and belowground biodiversity were not correlated, challenging the common perception that belowground biodiversity should follow similar patterns to those of plant diversity during ecosystem development. Taken together, our findings provide evidence that ecological patterns in belowground biodiversity are predictable across major globally distributed ecosystem types and suggest that shifts in plant cover and soil acidification during ecosystem development are associated with changes in belowground biodiversity over centuries to millennia.

Proceedings Article
01 May 2019
TL;DR: This paper tackles the dual challenge of SLO compliance and cost effectiveness with MArk (Model Ark), a general-purpose inference serving system built in Amazon Web Services (AWS), and evaluated the performance of MArk using several state-of-the-art ML models trained in popular frameworks including TensorFlow, MXNet, and Keras.
Abstract: The advances of Machine Learning (ML) have sparked a growing demand for ML-as-a-Service: developers train ML models and publish them in the cloud as online services to provide low-latency inference at scale. The key challenge of ML model serving is to meet the response-time Service-Level Objectives (SLOs) of inference workloads while minimizing the serving cost. In this paper, we tackle the dual challenge of SLO compliance and cost effectiveness with MArk (Model Ark), a general-purpose inference serving system built in Amazon Web Services (AWS). MArk employs three design choices tailor-made for inference workloads. First, MArk dynamically batches requests and opportunistically serves them using expensive hardware accelerators (e.g., GPU) for an improved performance-cost ratio. Second, instead of relying on feedback control scaling or over-provisioning to serve dynamic workload, which can be too slow or too expensive for inference serving, MArk employs predictive autoscaling to hide the provisioning latency at low cost. Third, given the stateless nature of inference serving, MArk exploits the flexible, yet costly serverless instances to cover the occasional load spikes that are hard to predict. We evaluated the performance of MArk using several state-of-the-art ML models trained in popular frameworks including TensorFlow, MXNet, and Keras. Compared with the premier industrial ML serving platform SageMaker, MArk reduces the serving cost up to 7.8× while achieving even better latency performance.
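The second and third design choices above (predictive autoscaling ahead of a known provisioning delay, with serverless spillover for unpredicted spikes) can be sketched conceptually as follows; the capacities, delay, workload, and naive forecaster are invented for illustration, and this is not MArk's implementation or any AWS API:

```python
# Conceptual sketch of predictive autoscaling with serverless spillover.
# All numbers (capacity, provisioning delay, workload) are invented for
# illustration; this is not MArk's implementation or an AWS API.
from collections import deque

PER_INSTANCE_RPS = 100      # assumed requests/s one provisioned instance can serve
PROVISION_DELAY = 3         # assumed instance start-up delay, in time steps

def forecast(history):
    """Naive forecast: average of recent load, used PROVISION_DELAY steps ahead."""
    recent = list(history)[-5:]
    return sum(recent) / len(recent) if recent else 0.0

def simulate(load_trace):
    ready, pending = 1, deque()           # running instances, launches in flight
    history = deque(maxlen=50)
    for t, load in enumerate(load_trace):
        # Instances requested PROVISION_DELAY steps ago become ready now.
        while pending and pending[0][0] <= t:
            ready += pending.popleft()[1]
        history.append(load)
        # Provision ahead of time for the predicted load at t + PROVISION_DELAY.
        target = -(-int(forecast(history)) // PER_INSTANCE_RPS)  # ceil division
        in_flight = sum(n for _, n in pending)
        if target > ready + in_flight:
            pending.append((t + PROVISION_DELAY, target - ready - in_flight))
        # Anything beyond provisioned capacity spills over to serverless.
        overflow = max(0.0, load - ready * PER_INSTANCE_RPS)
        print(f"t={t:2d} load={load:6.0f} instances={ready} serverless={overflow:6.0f}")

# Toy workload with a sudden, hard-to-predict spike at t=10.
simulate([120, 130, 140, 150, 160, 170, 180, 200, 220, 250, 900, 300, 280, 260, 240])
```

The spike at t=10 is not captured by the forecaster, so it is absorbed by the serverless tier rather than by over-provisioned instances, mirroring the division of labor the abstract describes.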

Journal ArticleDOI
TL;DR: In this paper, the authors developed a class of dynamic windows that combines reversible metal electrodeposition with ion insertion chemistry, and demonstrated that these hybrid windows cycle at least 4,000 times without degradation and are compatible with flexible substrates.
Abstract: Dynamic windows with electronically controlled transmission reduce glare without obstructing views while increasing the energy efficiency of buildings and automobiles via lighting, heating and cooling savings. Electrochromic materials, which change colour with voltage, are widely explored for use in dynamic windows, but they have not been extensively commercialized due to problems associated with colour, cost, switching speed and durability. Here, we develop a class of dynamic windows that combines reversible metal electrodeposition with ion insertion chemistry. These devices function through the reversible electroplating of Bi and Cu at the working electrode and Li+ insertion in a nickel oxide counter electrode. In one minute, 100 cm2 windows uniformly switch between a clear state with 75% transmission and a colour-neutral black state possessing 10% transmission, which represents a significant improvement over previous metal-based architectures. We demonstrate that these hybrid windows cycle at least 4,000 times without degradation and are compatible with flexible substrates. Lastly, we discuss how this approach can be used to design practical large-scale windows. Metal-based smart windows allow for light and heat transmission control but suffer from poor metal ion diffusion over large areas. Here, the authors demonstrate a 100 cm2 window that is uniformly switchable from clear to black in 60 s by combining reversible metal electrodeposition with ion insertion.