
Showing papers by "Arizona State University" published in 2018


Journal ArticleDOI
17 Apr 2018-Immunity
TL;DR: An extensive immunogenomic analysis of more than 10,000 tumors comprising 33 diverse cancer types, utilizing data compiled by TCGA, identifies six immune subtypes that encompass multiple cancer types and are hypothesized to define immune response patterns impacting prognosis.

3,246 citations


Posted Content
TL;DR: This paper presents UNet++, a new, more powerful architecture for medical image segmentation where the encoder and decoder sub-networks are connected through a series of nested, dense skip pathways, and argues that the optimizer would deal with an easier learning task when the feature maps from the decoder and encoder networks are semantically similar.
Abstract: In this paper, we present UNet++, a new, more powerful architecture for medical image segmentation. Our architecture is essentially a deeply-supervised encoder-decoder network where the encoder and decoder sub-networks are connected through a series of nested, dense skip pathways. The re-designed skip pathways aim at reducing the semantic gap between the feature maps of the encoder and decoder sub-networks. We argue that the optimizer would deal with an easier learning task when the feature maps from the decoder and encoder networks are semantically similar. We have evaluated UNet++ in comparison with U-Net and wide U-Net architectures across multiple medical image segmentation tasks: nodule segmentation in low-dose chest CT scans, nuclei segmentation in microscopy images, liver segmentation in abdominal CT scans, and polyp segmentation in colonoscopy videos. Our experiments demonstrate that UNet++ with deep supervision achieves an average IoU gain of 3.9 and 3.4 points over U-Net and wide U-Net, respectively.

2,254 citations
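The IoU gains quoted in the UNet++ abstract refer to the intersection-over-union (Jaccard) metric used to score segmentation masks. As a minimal sketch (illustrative code, not the authors' evaluation pipeline), the metric for binary masks is:

```python
import numpy as np

def iou(pred: np.ndarray, target: np.ndarray) -> float:
    """Intersection-over-union (Jaccard index) of two binary segmentation masks."""
    pred, target = pred.astype(bool), target.astype(bool)
    union = np.logical_or(pred, target).sum()
    if union == 0:
        return 1.0  # convention: two empty masks match perfectly
    return np.logical_and(pred, target).sum() / union

# Toy 4x4 masks: 1 overlapping pixel, 3 pixels in the union
pred = np.array([[1, 1, 0, 0],
                 [0, 0, 0, 0],
                 [0, 0, 0, 0],
                 [0, 0, 0, 0]])
target = np.array([[1, 0, 0, 0],
                   [1, 0, 0, 0],
                   [0, 0, 0, 0],
                   [0, 0, 0, 0]])
print(round(iou(pred, target), 3))  # → 0.333
```

On this 0-to-1 scale, the reported average gains of 3.9 and 3.4 IoU points correspond to improvements of 0.039 and 0.034.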


Book ChapterDOI
20 Sep 2018
TL;DR: UNet++ as discussed by the authors is a deeply-supervised encoder-decoder network where the encoder and decoder sub-networks are connected through a series of nested, dense skip pathways.
Abstract: In this paper, we present UNet++, a new, more powerful architecture for medical image segmentation. Our architecture is essentially a deeply-supervised encoder-decoder network where the encoder and decoder sub-networks are connected through a series of nested, dense skip pathways. The re-designed skip pathways aim at reducing the semantic gap between the feature maps of the encoder and decoder sub-networks. We argue that the optimizer would deal with an easier learning task when the feature maps from the decoder and encoder networks are semantically similar. We have evaluated UNet++ in comparison with U-Net and wide U-Net architectures across multiple medical image segmentation tasks: nodule segmentation in low-dose chest CT scans, nuclei segmentation in microscopy images, liver segmentation in abdominal CT scans, and polyp segmentation in colonoscopy videos. Our experiments demonstrate that UNet++ with deep supervision achieves an average IoU gain of 3.9 and 3.4 points over U-Net and wide U-Net, respectively.

2,067 citations


Journal ArticleDOI
Authors (trailing numbers index the affiliations listed next): Daniel J. Benjamin1, James O. Berger2, Magnus Johannesson1, Magnus Johannesson3, Brian A. Nosek4, Brian A. Nosek5, Eric-Jan Wagenmakers6, Richard A. Berk7, Kenneth A. Bollen8, Björn Brembs9, Lawrence D. Brown7, Colin F. Camerer10, David Cesarini11, David Cesarini12, Christopher D. Chambers13, Merlise A. Clyde2, Thomas D. Cook14, Thomas D. Cook15, Paul De Boeck16, Zoltan Dienes17, Anna Dreber3, Kenny Easwaran18, Charles Efferson19, Ernst Fehr20, Fiona Fidler21, Andy P. Field17, Malcolm R. Forster22, Edward I. George7, Richard Gonzalez23, Steven N. Goodman24, Edwin J. Green25, Donald P. Green26, Anthony G. Greenwald27, Jarrod D. Hadfield28, Larry V. Hedges15, Leonhard Held20, Teck-Hua Ho29, Herbert Hoijtink30, Daniel J. Hruschka31, Kosuke Imai32, Guido W. Imbens24, John P. A. Ioannidis24, Minjeong Jeon33, James Holland Jones34, Michael Kirchler35, David Laibson36, John A. List37, Roderick J. A. Little23, Arthur Lupia23, Edouard Machery38, Scott E. Maxwell39, Michael A. McCarthy21, Don A. Moore40, Stephen L. Morgan41, Marcus R. Munafò42, Shinichi Nakagawa43, Brendan Nyhan44, Timothy H. Parker45, Luis R. Pericchi46, Marco Perugini47, Jeffrey N. Rouder48, Judith Rousseau49, Victoria Savalei50, Felix D. Schönbrodt51, Thomas Sellke52, Betsy Sinclair53, Dustin Tingley36, Trisha Van Zandt16, Simine Vazire54, Duncan J. Watts55, Christopher Winship36, Robert L. Wolpert2, Yu Xie32, Cristobal Young24, Jonathan Zinman44, Valen E. Johnson1, Valen E. Johnson18
Affiliations: University of Southern California1, Duke University2, Stockholm School of Economics3, Center for Open Science4, University of Virginia5, University of Amsterdam6, University of Pennsylvania7, University of North Carolina at Chapel Hill8, University of Regensburg9, California Institute of Technology10, Research Institute of Industrial Economics11, New York University12, Cardiff University13, Mathematica Policy Research14, Northwestern University15, Ohio State University16, University of Sussex17, Texas A&M University18, Royal Holloway, University of London19, University of Zurich20, University of Melbourne21, University of Wisconsin-Madison22, University of Michigan23, Stanford University24, Rutgers University25, Columbia University26, University of Washington27, University of Edinburgh28, National University of Singapore29, Utrecht University30, Arizona State University31, Princeton University32, University of California, Los Angeles33, Imperial College London34, University of Innsbruck35, Harvard University36, University of Chicago37, University of Pittsburgh38, University of Notre Dame39, University of California, Berkeley40, Johns Hopkins University41, University of Bristol42, University of New South Wales43, Dartmouth College44, Whitman College45, University of Puerto Rico46, University of Milan47, University of California, Irvine48, Paris Dauphine University49, University of British Columbia50, Ludwig Maximilian University of Munich51, Purdue University52, Washington University in St. Louis53, University of California, Davis54, Microsoft55
TL;DR: The default P-value threshold for statistical significance is proposed to be changed from 0.05 to 0.005 for claims of new discoveries, in order to reduce the rate of false positive findings.
Abstract: We propose to change the default P-value threshold for statistical significance from 0.05 to 0.005 for claims of new discoveries.

1,586 citations
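The case for the stricter threshold can be illustrated with a standard false-discovery calculation: among results declared "significant," the expected fraction that are false positives depends on the threshold, the test's power, and the prior odds that a tested effect is real. A minimal sketch (the prior and power values below are illustrative assumptions, not figures from the paper):

```python
def false_discovery_rate(alpha: float, power: float, prior: float) -> float:
    """Expected fraction of 'significant' results that are false positives,
    given significance threshold alpha, test power, and prior P(effect real)."""
    false_pos = alpha * (1 - prior)
    true_pos = power * prior
    return false_pos / (false_pos + true_pos)

# Illustrative assumptions: 1-in-10 tested hypotheses true, 80% power
print(round(false_discovery_rate(0.05, power=0.8, prior=0.1), 3))   # → 0.36
print(round(false_discovery_rate(0.005, power=0.8, prior=0.1), 3))  # → 0.053
```

Under these assumed numbers, tightening alpha from 0.05 to 0.005 cuts the false discovery rate from roughly a third of claimed discoveries to about one in twenty.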


Journal ArticleDOI
25 May 2018-Science
TL;DR: Research prospects for more sustainable routes to nitrogen commodity chemicals are reviewed, considering developments in enzymatic, homogeneous, and heterogeneous catalysis, as well as electrochemical, photochemical, and plasma-based approaches.
Abstract: BACKGROUND The invention of the Haber-Bosch (H-B) process in the early 1900s to produce ammonia industrially from nitrogen and hydrogen revolutionized the manufacture of fertilizer and led to fundamental changes in the way food is produced. Its impact is underscored by the fact that about 50% of the nitrogen atoms in humans today originate from this single industrial process. In the century after the H-B process was invented, the chemistry of carbon moved to center stage, resulting in remarkable discoveries and a vast array of products including plastics and pharmaceuticals. In contrast, little has changed in industrial nitrogen chemistry. This scenario reflects both the inherent efficiency of the H-B process and the particular challenge of breaking the strong dinitrogen bond. Nonetheless, the reliance of the H-B process on fossil fuels and its associated high CO2 emissions have spurred recent interest in finding more sustainable and environmentally benign alternatives. Nitrogen in its more oxidized forms is also industrially, biologically, and environmentally important, and synergies in new combinations of oxidative and reductive transformations across the nitrogen cycle could lead to improved efficiencies. ADVANCES Major effort has been devoted to developing alternative and environmentally friendly processes that would allow NH3 production at distributed sources under more benign conditions, rather than through the large-scale centralized H-B process. Hydrocarbons (particularly methane) and water are the only two sources of hydrogen atoms that can sustain long-term, large-scale NH3 production. The use of water as the hydrogen source for NH3 production requires substantially more energy than using methane, but it is also more environmentally benign, does not contribute to the accumulation of greenhouse gases, and does not compete for valuable and limited hydrocarbon resources.
Microbes living in all major ecosystems are able to reduce N2 to NH3 by using the enzyme nitrogenase. A deeper understanding of this enzyme could lead to more efficient catalysts for nitrogen reduction under ambient conditions. Model molecular catalysts have been designed that mimic some of the functions of the active site of nitrogenase. Some modest success has also been achieved in designing electrocatalysts for dinitrogen reduction. Electrochemistry avoids the expense and environmental damage of steam reforming of methane (which accounts for most of the cost of the H-B process), and it may provide a means for distributed production of ammonia. On the oxidative side, nitric acid is the principal commodity chemical containing oxidized nitrogen. Nearly all nitric acid is manufactured by oxidation of NH3 through the Ostwald process, but a more direct reaction of N2 with O2 might be practically feasible through further development of nonthermal plasma technology. Heterogeneous NH3 oxidation with O2 is at the heart of the Ostwald process and is practiced in a variety of environmental protection applications as well. Precious metals remain the workhorse catalysts, and opportunities therefore exist to develop lower-cost materials with equivalent or better activity and selectivity. Nitrogen oxides are also environmentally hazardous pollutants generated by industrial and transportation activities, and extensive research has gone into developing and applying reduction catalysts. Three-way catalytic converters are operating on hundreds of millions of vehicles worldwide. However, increasingly stringent emissions regulations, coupled with the low exhaust temperatures of high-efficiency engines, present challenges for future combustion emissions control. Bacterial denitrification is the natural analog of this chemistry and another source of study and inspiration for catalyst design.
OUTLOOK Demands for greater energy efficiency, smaller-scale and more flexible processes, and environmental protection provide growing impetus for expanding the scope of nitrogen chemistry. Nitrogenase, as well as nitrifying and denitrifying enzymes, will eventually be understood in sufficient detail that robust molecular catalytic mimics will emerge. Electrochemical and photochemical methods also demand more study. Other intriguing areas of research that have provided tantalizing results include chemical looping and plasma-driven processes. The grand challenge in the field of nitrogen chemistry is the development of catalysts and processes that provide simple, low-energy routes to the manipulation of the redox states of nitrogen.

1,153 citations


Journal ArticleDOI
28 Feb 2018-Nature
TL;DR: A flattened absorption profile is detected in the sky-averaged radio spectrum that is largely consistent with expectations for the 21-centimetre signal induced by early stars; however, the best-fitting amplitude of the profile is more than a factor of two greater than the largest predictions.
Abstract: The 21-cm absorption profile is detected in the sky-averaged radio spectrum, but is much stronger than predicted, suggesting that the primordial gas might have been cooler than predicted. As the first stars heated hydrogen in the early Universe, the 21-cm hyperfine line—an astronomical standard that represents the spin-flip transition in the ground state of atomic hydrogen—was altered, causing the hydrogen gas to absorb photons from the microwave background. This should produce an observable absorption signal at frequencies of less than 200 megahertz (MHz). Judd Bowman and colleagues report the observation of an absorption profile centred at a frequency of 78 MHz that is about 19 MHz wide and 0.5 kelvin deep. The profile is generally in line with expectations, although it is deeper than predicted. An accompanying paper by Rennan Barkana suggests that baryons were interacting with cold dark-matter particles in the early Universe, cooling the gas more than had been expected. After stars formed in the early Universe, their ultraviolet light is expected, eventually, to have penetrated the primordial hydrogen gas and altered the excitation state of its 21-centimetre hyperfine line. This alteration would cause the gas to absorb photons from the cosmic microwave background, producing a spectral distortion that should be observable today at radio frequencies of less than 200 megahertz1. Here we report the detection of a flattened absorption profile in the sky-averaged radio spectrum, which is centred at a frequency of 78 megahertz and has a best-fitting full-width at half-maximum of 19 megahertz and an amplitude of 0.5 kelvin. The profile is largely consistent with expectations for the 21-centimetre signal induced by early stars; however, the best-fitting amplitude of the profile is more than a factor of two greater than the largest predictions2. 
This discrepancy suggests that either the primordial gas was much colder than expected or the background radiation temperature was hotter than expected. Astrophysical phenomena (such as radiation from stars and stellar remnants) are unlikely to account for this discrepancy; of the proposed extensions to the standard model of cosmology and particle physics, only cooling of the gas as a result of interactions between dark matter and baryons seems to explain the observed amplitude3. The low-frequency edge of the observed profile indicates that stars existed and had produced a background of Lyman-α photons by 180 million years after the Big Bang. The high-frequency edge indicates that the gas was heated to above the radiation temperature less than 100 million years later.

992 citations
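The abstract above quotes three numbers for the dip: centre 78 MHz, full-width at half-maximum 19 MHz, and depth 0.5 K. As a hedged illustration only, a plain Gaussian with those parameters can stand in for the profile (the actual best fit is a flattened shape whose functional form is given in the paper):

```python
import numpy as np

def absorption_dip(freq_mhz, center=78.0, fwhm=19.0, depth=0.5):
    """Gaussian stand-in for the reported 21-cm absorption dip (kelvin).
    The actual best fit is a *flattened* profile; center, width, and depth
    here are the values quoted in the abstract."""
    sigma = fwhm / (2.0 * np.sqrt(2.0 * np.log(2.0)))  # FWHM -> std. dev.
    return -depth * np.exp(-0.5 * ((freq_mhz - center) / sigma) ** 2)

freqs = np.linspace(50.0, 110.0, 601)  # 0.1 MHz grid across the EDGES band
dip = absorption_dip(freqs)
print(round(freqs[np.argmin(dip)], 1), round(dip.min(), 2))  # → 78.0 -0.5
```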


Journal ArticleDOI
29 Jun 2018-Science
TL;DR: The authors examine barriers and opportunities associated with these difficult-to-decarbonize services and processes, including possible technological solutions and research and development priorities, and consider how existing technologies could meet future demands for these services without net addition of CO2 to the atmosphere.
Abstract: Some energy services and industrial processes-such as long-distance freight transport, air travel, highly reliable electricity, and steel and cement manufacturing-are particularly difficult to provide without adding carbon dioxide (CO2) to the atmosphere. Rapidly growing demand for these services, combined with long lead times for technology development and long lifetimes of energy infrastructure, make decarbonization of these services both essential and urgent. We examine barriers and opportunities associated with these difficult-to-decarbonize services and processes, including possible technological solutions and research and development priorities. A range of existing technologies could meet future demands for these services and processes without net addition of CO2 to the atmosphere, but their use may depend on a combination of cost reductions via research and innovation, as well as coordinated deployment and integration of operations across currently discrete energy industries.

951 citations


Journal ArticleDOI
TL;DR: It is demonstrated that intravenously injected DNA nanorobots deliver thrombin specifically to tumor-associated blood vessels and induce intravascular thrombosis, resulting in tumor necrosis and inhibition of tumor growth.
Abstract: Nanoscale robots have potential as intelligent drug delivery systems that respond to molecular triggers. Using DNA origami we constructed an autonomous DNA robot programmed to transport payloads and present them specifically in tumors. Our nanorobot is functionalized on the outside with a DNA aptamer that binds nucleolin, a protein specifically expressed on tumor-associated endothelial cells, and the blood coagulation protease thrombin within its inner cavity. The nucleolin-targeting aptamer serves both as a targeting domain and as a molecular trigger for the mechanical opening of the DNA nanorobot. The thrombin inside is thus exposed and activates coagulation at the tumor site. Using tumor-bearing mouse models, we demonstrate that intravenously injected DNA nanorobots deliver thrombin specifically to tumor-associated blood vessels and induce intravascular thrombosis, resulting in tumor necrosis and inhibition of tumor growth. The nanorobot proved safe and immunologically inert in mice and Bama miniature pigs. Our data show that DNA nanorobots represent a promising strategy for precise drug delivery in cancer therapy.

936 citations


Journal ArticleDOI
TL;DR: The Modules for Experiments in Stellar Astrophysics (MESA) software instrument, as discussed by the authors, has been updated with modules for handling floating point exceptions and stellar model optimization, as well as four new software tools.
Abstract: We update the capabilities of the software instrument Modules for Experiments in Stellar Astrophysics (MESA) and enhance its ease of use and availability. Our new approach to locating convective boundaries is consistent with the physics of convection, and yields reliable values of the convective-core mass during both hydrogen- and helium-burning phases. Stars with sufficiently low masses become white dwarfs and cool to the point where the electrons are degenerate and the ions are strongly coupled, a realm now available to study with MESA due to improved treatments of element diffusion, latent heat release, and blending of equations of state. Studies of the final fates of massive stars are extended in MESA by our addition of an approximate Riemann solver that captures shocks and conserves energy to high accuracy during dynamic epochs. We also introduce a 1D capability for modeling the effects of Rayleigh–Taylor instabilities that, in combination with the coupling to a public version of the radiation transfer instrument, creates new avenues for exploring Type II supernova properties. These capabilities are exhibited with exploratory models of pair-instability supernovae, pulsational pair-instability supernovae, and the formation of stellar-mass black holes. The applicability of MESA is now widened by the capability to import multidimensional hydrodynamic models into MESA. We close by introducing software modules for handling floating point exceptions and stellar model optimization, as well as four new software tools (MESA-Web, MESA-Docker, pyMesa, and mesastar.org) to enhance MESA's education and research impact.

808 citations


Journal ArticleDOI
TL;DR: Ultrawide-bandgap (UWBG) semiconductor materials, such as high-Al-content AlGaN, diamond, and Ga2O3, have advanced in maturity to the point where realizing some of their tantalizing advantages is a relatively near-term possibility.
Authors: J. Y. Tsao,* S. Chowdhury, M. A. Hollis,* D. Jena, N. M. Johnson, K. A. Jones, R. J. Kaplar,* S. Rajan, C. G. Van de Walle, E. Bellotti, C. L. Chua, R. Collazo, M. E. Coltrin, J. A. Cooper, K. R. Evans, S. Graham, T. A. Grotjohn, E. R. Heller, M. Higashiwaki, M. S. Islam, P. W. Juodawlkis, M. A. Khan, A. D. Koehler, J. H. Leach, U. K. Mishra, R. J. Nemanich, R. C. N. Pilawa-Podgurski, J. B. Shealy, Z. Sitar, M. J. Tadjer, A. F. Witulski, M. Wraback, and J. A. Simmons

785 citations


Journal ArticleDOI
23 Jan 2018
TL;DR: This comprehensive review summarizes the state of the art, challenges, and prospects of neuro-inspired computing with emerging nonvolatile memory devices, and presents a device-circuit-algorithm codesign methodology to evaluate the impact of nonideal device effects on system-level performance.
Abstract: This comprehensive review summarizes state of the art, challenges, and prospects of the neuro-inspired computing with emerging nonvolatile memory devices. First, we discuss the demand for developing neuro-inspired architecture beyond today’s von-Neumann architecture. Second, we summarize the various approaches to designing the neuromorphic hardware (digital versus analog, spiking versus nonspiking, online training versus offline training) and discuss why emerging nonvolatile memory is attractive for implementing the synapses in the neural network. Then, we discuss the desired device characteristics of the synaptic devices (e.g., multilevel states, weight update nonlinearity/asymmetry, variation/noise), and survey a few representative material systems and device prototypes reported in the literature that show the analog conductance tuning. These candidates include phase change memory, resistive memory, ferroelectric memory, floating-gate transistors, etc. Next, we introduce the crossbar array architecture to accelerate the weighted sum and weight update operations that are commonly used in the neuro-inspired machine learning algorithms, and review the recent progresses of array-level experimental demonstrations for pattern recognition tasks. In addition, we discuss the peripheral neuron circuit design issues and present a device-circuit-algorithm codesign methodology to evaluate the impact of nonideal device effects on the system-level performance (e.g., learning accuracy). Finally, we give an outlook on the customization of the learning algorithms for efficient hardware implementation.
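The weighted-sum operation that the crossbar array accelerates is, electrically, a matrix-vector product: Ohm's law gives the per-cell currents and Kirchhoff's current law sums them along each column. A minimal sketch with hypothetical conductance values (not from any specific device in the review):

```python
import numpy as np

# Hypothetical 3x2 crossbar: G[i, j] is the conductance (siemens) of the
# synaptic device at the crossing of input row i and output column j.
G = np.array([[1e-6, 2e-6],
              [3e-6, 1e-6],
              [2e-6, 4e-6]])
V = np.array([0.1, 0.2, 0.0])  # read voltages applied to the rows

# Ohm's law per cell plus Kirchhoff's current law per column:
# each column current is the weighted sum I_j = sum_i G[i, j] * V[i]
I = G.T @ V
print(I)  # column currents: 7e-7 A and 4e-7 A
```

One analog read thus computes an entire column of multiply-accumulate results in parallel, which is why these arrays are attractive for the weighted-sum step of neural network inference and training.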

Journal ArticleDOI
12 Jul 2018-Nature
TL;DR: The review finds overwhelmingly that the interventions improved the sustainability of China’s rural land systems, but the impacts are nuanced and adverse outcomes have occurred.
Abstract: China has responded to a national land-system sustainability emergency via an integrated portfolio of large-scale programmes. Here we review 16 sustainability programmes, which invested US$378.5 billion (in 2015 US$), covered 623.9 million hectares of land and involved over 500 million people, mostly since 1998. We find overwhelmingly that the interventions improved the sustainability of China’s rural land systems, but the impacts are nuanced and adverse outcomes have occurred. We identify some key characteristics of programme success, potential risks to their durability, and future research needs. We suggest directions for China and other nations as they progress towards the Sustainable Development Goals of the United Nations’ Agenda 2030.

Journal ArticleDOI
TL;DR: An iterative cluster Primal-Dual Splitting algorithm is proposed for solving the large-scale sSVM problem in a decentralized fashion; the important features discovered by the algorithm are predictive of future hospitalizations, providing a way to interpret the classification results and inform prevention efforts.

Journal ArticleDOI
TL;DR: New opportunities and approaches for the application of nanotechnology to enhance the efficiency and affordability of water treatment and wastewater reuse and enhance water security are considered.
Abstract: No other resource is as necessary for life as water, and providing it universally in a safe, reliable and affordable manner is one of the greatest challenges of the twenty-first century. Here, we consider new opportunities and approaches for the application of nanotechnology to enhance the efficiency and affordability of water treatment and wastewater reuse. Potential development and implementation barriers are discussed along with research needs to overcome them and enhance water security. This Perspective provides an overview of the potential aspects of water treatment and cleaning in which nanotechnology could play an important role.

Journal ArticleDOI
TL;DR: In this paper, the fundamental principles necessary to understand electrochemical reduction technologies and how to apply them are described, and the applicability for treating drinking water matrices using electrochemical processes is analyzed, including existing implementation of commercial treatment systems.
Abstract: Nitrate contamination in surface and ground waters is one of this century’s major engineering challenges due to negative environmental impacts and the risk to human health in drinking water. Electrochemical reduction is a promising water treatment technology to manage nitrate in drinking water. This critical review describes the fundamental principles necessary to understand electrochemical reduction technologies and how to apply them. The focus is on electrochemical nitrate reduction mechanisms and pathways that form undesirable products (nitrite, ammonium) or the more desirable product (dinitrogen). Factors influencing the conversion rates and selectivity of electrochemical nitrate reduction, such as electrode material and operating parameters, are also described. Finally, the applicability for treating drinking water matrices using electrochemical processes is analyzed, including existing implementation of commercial treatment systems. Overall, this critical review contributes to the understanding of the potential applications and constraints of electrochemical reduction to manage nitrate in drinking waters and highlights directions for future research required for implementation.

Journal ArticleDOI
TL;DR: This work demonstrates analog resistive switching devices that possess desired characteristics for neuromorphic computing networks with minimal performance variations using a single-crystalline SiGe layer epitaxially grown on Si as a switching medium.
Abstract: Although several types of architecture combining memory cells and transistors have been used to demonstrate artificial synaptic arrays, they usually present limited scalability and high power consumption. Transistor-free analog switching devices may overcome these limitations, yet the typical switching process they rely on—formation of filaments in an amorphous medium—is not easily controlled and hence hampers the spatial and temporal reproducibility of the performance. Here, we demonstrate analog resistive switching devices that possess desired characteristics for neuromorphic computing networks with minimal performance variations using a single-crystalline SiGe layer epitaxially grown on Si as a switching medium. Such epitaxial random access memories utilize threading dislocations in SiGe to confine metal filaments in a defined, one-dimensional channel. This confinement results in drastically enhanced switching uniformity and long retention/high endurance with a high analog on/off ratio. Simulations using the MNIST handwritten recognition data set prove that epitaxial random access memories can operate with an online learning accuracy of 95.1%. Controlled widening of threading dislocations in SiGe layers epitaxially grown on Si allows the realization of resistive switching devices with enhanced uniformity, high on/off ratio and long retention times.

Journal ArticleDOI
TL;DR: Recon3D is presented, a computational resource that includes three-dimensional metabolite and protein structure data and enables integrated analyses of metabolic functions in humans and is used to functionally characterize mutations associated with disease, and identify metabolic response signatures that are caused by exposure to certain drugs.
Abstract: Genome-scale network reconstructions have helped uncover the molecular basis of metabolism. Here we present Recon3D, a computational resource that includes three-dimensional (3D) metabolite and protein structure data and enables integrated analyses of metabolic functions in humans. We use Recon3D to functionally characterize mutations associated with disease, and identify metabolic response signatures that are caused by exposure to certain drugs. Recon3D represents the most comprehensive human metabolic network model to date, accounting for 3,288 open reading frames (representing 17% of functionally annotated human genes), 13,543 metabolic reactions involving 4,140 unique metabolites, and 12,890 protein structures. These data provide a unique resource for investigating molecular mechanisms of human metabolism. Recon3D is available at http://vmh.life.
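Genome-scale reconstructions such as Recon3D are organized around a stoichiometric matrix S, and a candidate steady-state flux distribution v must satisfy S·v = 0 (no net accumulation of any metabolite). A toy sketch of that core data structure (the matrix below is a hypothetical three-reaction network, not Recon3D data):

```python
import numpy as np

# Toy stoichiometric matrix S (rows = metabolites A and B, columns = reactions):
#   R1: -> A      R2: A -> B      R3: B ->
S = np.array([[ 1, -1,  0],   # metabolite A
              [ 0,  1, -1]])  # metabolite B
v = np.array([2.0, 2.0, 2.0])  # candidate flux vector (arbitrary units)

# At steady state there is no net accumulation of any metabolite: S @ v == 0
print(S @ v)
```

Recon3D's matrix has the same structure at vastly larger scale: 13,543 reaction columns over 4,140 unique metabolites.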

Journal ArticleDOI
TL;DR: This Analysis compares and contrasts methods for measuring the mechanical properties of cells by applying the different approaches to the same breast cancer cell line, highlighting how the elastic and viscous moduli of MCF-7 breast cancer cells can vary 1,000-fold and 100-fold, respectively.
Abstract: The mechanical properties of cells influence their cellular and subcellular functions, including cell adhesion, migration, polarization, and differentiation, as well as organelle organization and trafficking inside the cytoplasm. Yet reported values of cell stiffness and viscosity vary substantially, which suggests differences in how the results of different methods are obtained or analyzed by different groups. To address this issue and illustrate the complementarity of certain approaches, here we present, analyze, and critically compare measurements obtained by means of some of the most widely used methods for cell mechanics: atomic force microscopy, magnetic twisting cytometry, particle-tracking microrheology, parallel-plate rheometry, cell monolayer rheology, and optical stretching. These measurements highlight how elastic and viscous moduli of MCF-7 breast cancer cells can vary 1,000-fold and 100-fold, respectively. We discuss the sources of these variations, including the level of applied mechanical stress, the rate of deformation, the geometry of the probe, the location probed in the cell, and the extracellular microenvironment.
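The elastic and viscous moduli compared in this Analysis are the storage and loss moduli of viscoelastic material models. As a hedged illustration, the simplest Kelvin-Voigt model gives a frequency-independent storage modulus and a loss modulus that grows linearly with deformation rate, one reason measurements at different rates can disagree (the parameter values below are illustrative, not measurements from the paper):

```python
import numpy as np

def kelvin_voigt_moduli(omega, stiffness_pa=500.0, viscosity_pa_s=10.0):
    """Storage (elastic) and loss (viscous) moduli of a Kelvin-Voigt solid,
    G*(w) = E + i*w*eta. Parameter values are illustrative, not measurements."""
    omega = np.asarray(omega, dtype=float)
    g_storage = np.full_like(omega, stiffness_pa)  # flat elastic response
    g_loss = omega * viscosity_pa_s                # grows with rate
    return g_storage, g_loss

omega = np.array([0.1, 1.0, 10.0])  # angular frequencies (rad/s)
gp, gpp = kelvin_voigt_moduli(omega)
print(gp)   # storage modulus is flat across frequency
print(gpp)  # loss modulus spans a 100-fold range over these two decades
```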

Journal ArticleDOI
TL;DR: This American Association for the Study of Liver Diseases (AASLD) 2018 Practice Guidance on Primary Biliary Cholangitis is an update of the PBC guidelines published in 2009, and provides a data-supported approach to screening, diagnosis, and clinical management of patients with PBC.

Journal ArticleDOI
17 Apr 2018
TL;DR: This paper presents an overview of recent work in decentralized optimization and surveys the state-of-the-art algorithms and their analyses tailored to these different scenarios, highlighting the role of the network topology.
Abstract: In decentralized optimization, nodes cooperate to minimize an overall objective function that is the sum (or average) of per-node private objective functions. Algorithms interleave local computations with communication among all or a subset of the nodes. Motivated by a variety of applications (decentralized estimation in sensor networks, fitting models to massive data sets, and decentralized control of multirobot systems, to name a few), significant advances have been made toward the development of robust, practical algorithms with theoretical performance guarantees. This paper presents an overview of recent work in this area. In general, rates of convergence depend not only on the number of nodes involved and the desired level of accuracy, but also on the structure and nature of the network over which nodes communicate (e.g., whether links are directed or undirected, static or time varying). We survey the state-of-the-art algorithms and their analyses tailored to these different scenarios, highlighting the role of the network topology.
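A canonical algorithm in this literature is decentralized gradient descent (DGD): each node averages with its neighbors through a doubly stochastic mixing matrix W, then steps along its own local gradient. A minimal sketch for quadratic local objectives on a 4-node ring (with a constant step size, DGD is known to converge only to a neighborhood of the exact optimum, not to it):

```python
import numpy as np

# Each node i holds f_i(x) = 0.5*(x - a_i)^2; the sum is minimized at mean(a).
a = np.array([1.0, 2.0, 3.0, 6.0])
W = np.array([[0.50, 0.25, 0.00, 0.25],   # doubly stochastic mixing matrix
              [0.25, 0.50, 0.25, 0.00],   # for a 4-node ring topology
              [0.00, 0.25, 0.50, 0.25],
              [0.25, 0.00, 0.25, 0.50]])
x = np.zeros(4)  # per-node estimates
step = 0.05
for _ in range(2000):
    grad = x - a               # local gradients, computed privately
    x = W @ x - step * grad    # mix with neighbors, then descend

print(np.round(x, 3))  # all nodes near the minimizer 3.0 (small constant-step bias)
```

The mixing matrix encodes the network topology the survey emphasizes: sparser or more poorly connected graphs mix information more slowly and degrade the convergence rate.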

Journal ArticleDOI
TL;DR: A novel integrated machine learning and coordinated beamforming solution is developed to overcome challenges and enable highly-mobile mmWave applications with reliable coverage, low latency, and negligible training overhead.
Abstract: Supporting high mobility in millimeter wave (mmWave) systems enables a wide range of important applications, such as vehicular communications and wireless virtual/augmented reality. Realizing this in practice, though, requires overcoming several challenges. First, the use of narrow beams and the sensitivity of mmWave signals to blockage greatly impact the coverage and reliability of highly-mobile links. Second, highly-mobile users in dense mmWave deployments need to frequently hand-off between base stations (BSs), which is associated with critical control and latency overhead. Furthermore, identifying the optimal beamforming vectors in large antenna array mmWave systems requires considerable training overhead, which significantly affects the efficiency of these mobile systems. In this paper, a novel integrated machine learning and coordinated beamforming solution is developed to overcome these challenges and enable highly-mobile mmWave applications. In the proposed solution, a number of distributed yet coordinating BSs simultaneously serve a mobile user. This user ideally needs to transmit only one uplink training pilot sequence that will be jointly received at the coordinating BSs using omni or quasi-omni beam patterns. These received signals draw a defining signature not only for the user location, but also for its interaction with the surrounding environment. The developed solution then leverages a deep learning model that learns how to use these signatures to predict the beamforming vectors at the BSs. This renders a comprehensive solution that supports highly mobile mmWave applications with reliable coverage, low latency, and negligible training overhead. Extensive simulation results based on accurate ray-tracing show that the proposed deep-learning coordinated beamforming strategy approaches the achievable rate of the genie-aided solution that knows the optimal beamforming vectors with no training overhead.
Compared with traditional beamforming solutions, the results show that the proposed deep learning-based strategy attains higher rates, especially in high-mobility large-array regimes.
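Both the genie-aided baseline and the learned strategy ultimately select a beam from a fixed codebook. The numpy sketch below (an illustrative toy, not the authors' code) builds an oversampled DFT beamforming codebook, exhaustively finds the beam maximizing the beamforming gain for a random channel standing in for a ray-traced one, and computes the resulting achievable rate; the antenna count, codebook size, and SNR are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)

M = 32   # antennas at the BS
K = 64   # beams in the oversampled DFT codebook

# Codebook: column k steers toward a distinct quantized angle.
n = np.arange(M)[:, None]
k = np.arange(K)[None, :]
F = np.exp(2j * np.pi * n * k / K) / np.sqrt(M)   # (M, K)

# Random multipath channel vector (placeholder for a ray-traced channel).
h = (rng.standard_normal(M) + 1j * rng.standard_normal(M)) / np.sqrt(2)

# Exhaustive "genie-aided" search: pick the beam with the largest gain.
gains = np.abs(F.conj().T @ h) ** 2               # (K,)
best = int(np.argmax(gains))

snr = 10.0                                        # linear transmit SNR
rate = np.log2(1 + snr * gains[best])             # achievable rate, bits/s/Hz
print(best, rate)
```

The training overhead the paper targets comes precisely from this exhaustive sweep; the deep learning model replaces it by predicting `best` directly from the omni-received uplink signatures.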


Journal ArticleDOI
07 Sep 2018-Science
TL;DR: Current knowledge of transient dynamics is reviewed, showing that hitherto idiosyncratic and individual patterns can be classified into a coherent framework, with important general lessons and directions for future study.
Abstract: The importance of transient dynamics in ecological systems and in the models that describe them has become increasingly recognized. However, previous work has typically treated each instance of these dynamics separately. We review both empirical examples and model systems, and outline a classification of transient dynamics based on ideas and concepts from dynamical systems theory. This classification provides ways to understand the likelihood of transients for particular systems, and to guide investigations to determine the timing of sudden switches in dynamics and other characteristics of transients. Implications for both management and underlying ecological theories emerge.
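One canonical mechanism in the dynamical-systems classification the authors draw on is the "ghost" of a saddle-node bifurcation: just past the bifurcation the fixed points have vanished, yet trajectories linger near where they were for a long time. The sketch below (an illustrative toy, not a model from the paper) simulates the normal form x' = r + x^2 and shows the transient lengthening as r shrinks, with the classic 1/sqrt(r) scaling.

```python
def escape_time(r, x0=-1.0, x_escape=5.0, dt=0.01, t_max=1e4):
    """Euler-steps x' = r + x**2 until x crosses x_escape.

    For small r > 0 the fixed points have just disappeared, but the
    trajectory stalls near the 'ghost' at x = 0 for a time scaling
    like 1/sqrt(r) -- a generic source of long transients.
    """
    x, t = x0, 0.0
    while x < x_escape and t < t_max:
        x += (r + x * x) * dt
        t += dt
    return t

t_small = escape_time(r=1e-4)   # long plateau before the sudden switch
t_large = escape_time(r=1e-2)   # much shorter transient
print(t_small, t_large)
```

A 100-fold decrease in r lengthens the transient roughly 10-fold, which is why a system can look stationary for a long time and then switch abruptly.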

Journal ArticleDOI
TL;DR: NeuroSim, a circuit-level macro model that estimates the area, latency, dynamic energy, and leakage power to facilitate the design space exploration of neuro-inspired architectures with mainstream and emerging device technologies is developed.
Abstract: Neuro-inspired architectures based on synaptic memory arrays have been proposed for on-chip acceleration of weighted sum and weight update in machine/deep learning algorithms. In this paper, we developed NeuroSim, a circuit-level macro model that estimates the area, latency, dynamic energy, and leakage power to facilitate the design space exploration of neuro-inspired architectures with mainstream and emerging device technologies. NeuroSim provides a flexible interface and a wide variety of design options at the circuit and device level. Therefore, NeuroSim can be used by neural networks (NNs) as a supporting tool to provide circuit-level performance evaluation. With NeuroSim, an integrated framework can be built with hierarchical organization from the device level (synaptic device properties) to the circuit level (array architectures) and then to the algorithm level (NN topology), enabling instruction-accurate evaluation on the learning accuracy as well as the circuit-level performance metrics at the run-time of online learning. Using multilayer perceptron as a case-study algorithm, we investigated the impact of the “analog” emerging nonvolatile memory (eNVM)’s “nonideal” device properties and benchmarked the tradeoffs between SRAM, digital, and analog eNVM-based architectures for online learning and offline classification.
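The weighted-sum-on-a-crossbar idea can be illustrated with a toy numpy model (this is a sketch of the general technique, not NeuroSim itself; the conductance bounds and level count are invented): weights are snapped to a limited number of programmable conductance states, mimicking one of the "nonideal" analog eNVM properties the paper studies, and the column currents then approximate the ideal dot products.

```python
import numpy as np

rng = np.random.default_rng(1)

def crossbar_weighted_sum(W, x, levels=32, g_min=0.1, g_max=1.0):
    """Emulates a weighted sum on an analog eNVM crossbar.

    Weights (assumed in [0, 1]) map to one of `levels` discrete
    conductance states between g_min and g_max; each cell snaps to its
    nearest programmable state, and the column currents implement the
    dot products (with the g_min offset removed).
    """
    g_states = np.linspace(g_min, g_max, levels)
    g_target = g_min + W * (g_max - g_min)
    idx = np.abs(g_target[..., None] - g_states).argmin(axis=-1)
    G = g_states[idx]                         # quantized conductances
    return x @ (G - g_min) / (g_max - g_min)  # column currents -> sums

W = rng.random((8, 4))   # 8 inputs, 4 output columns
x = rng.random(8)        # input voltages
ideal = x @ W
analog = crossbar_weighted_sum(W, x)
err = np.abs(ideal - analog).max()
print(ideal, analog, err)
```

More conductance levels (higher device precision) shrink `err`, which is the kind of device-to-algorithm tradeoff the framework evaluates at scale.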

Proceedings Article
29 Apr 2018
TL;DR: In this paper, attention models are applied to clinical time-series modeling for the first time: the proposed SAnD architecture employs a masked self-attention mechanism and uses positional encoding and dense interpolation strategies to incorporate temporal order.
Abstract: With widespread adoption of electronic health records, there is an increased emphasis for predictive models that can effectively deal with clinical time-series data. Powered by Recurrent Neural Network (RNN) architectures with Long Short-Term Memory (LSTM) units, deep neural networks have achieved state-of-the-art results in several clinical prediction tasks. Despite the success of RNN, its sequential nature prohibits parallelized computing, thus making it inefficient particularly when processing long sequences. Recently, architectures which are based solely on attention mechanisms have shown remarkable success in transduction tasks in NLP, while being computationally superior. In this paper, for the first time, we utilize attention models for clinical time-series modeling, thereby dispensing recurrence entirely. We develop the SAnD (Simply Attend and Diagnose) architecture, which employs a masked, self-attention mechanism, and uses positional encoding and dense interpolation strategies for incorporating temporal order. Furthermore, we develop a multi-task variant of SAnD to jointly infer models with multiple diagnosis tasks. Using the recent MIMIC-III benchmark datasets, we demonstrate that the proposed approach achieves state-of-the-art performance in all tasks, outperforming LSTM models and classical baselines with hand-engineered features.
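The two mechanisms the abstract highlights, masked self-attention and positional encoding, can be sketched in a few lines of numpy. This is a minimal single-head illustration (not the authors' SAnD implementation); the identity Q/K/V projections and the tiny dimensions are simplifications for brevity.

```python
import numpy as np

def positional_encoding(T, d):
    """Sinusoidal positional encoding as in the Transformer."""
    pos = np.arange(T)[:, None]
    i = np.arange(d // 2)[None, :]
    angles = pos / (10000 ** (2 * i / d))
    pe = np.zeros((T, d))
    pe[:, 0::2] = np.sin(angles)
    pe[:, 1::2] = np.cos(angles)
    return pe

def masked_self_attention(X):
    """Single-head causal self-attention: step t attends only to steps <= t."""
    T, d = X.shape
    Q, K, V = X, X, X                        # identity projections for brevity
    scores = Q @ K.T / np.sqrt(d)            # (T, T) scaled dot products
    mask = np.triu(np.ones((T, T)), k=1).astype(bool)
    scores[mask] = -np.inf                   # hide the future
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)       # row-wise softmax
    return w @ V

T, d = 6, 8
rng = np.random.default_rng(2)
X = rng.standard_normal((T, d)) + positional_encoding(T, d)
Y = masked_self_attention(X)

# Causality check: perturbing a future time step leaves earlier outputs intact.
X2 = X.copy()
X2[-1] += 10.0
Y2 = masked_self_attention(X2)
print(np.allclose(Y[:-1], Y2[:-1]))   # True
```

Because every position is computed in one matrix product rather than a recurrence, the whole sequence is processed in parallel, which is the efficiency argument against LSTMs made above.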

Journal ArticleDOI
01 Apr 2018
TL;DR: In this article, the potential applications of nanomaterials in advancing sustainable water treatment systems and their associated barriers are assessed, and future areas of research necessary to realize safe deployment of promising nanomaterial applications are also identified.
Abstract: Sustainable provision of safe, clean and adequate water supply is a global challenge. Water treatment and desalination technologies remain chemically and energy intensive, ineffective in removing key trace contaminants, and poorly suited to deployment in decentralized (distributed) water treatment systems globally. Several recent efforts have sought to leverage the reactive and tunable properties of nanomaterials to address these technological shortcomings. This Review assesses the potential applications of nanomaterials in advancing sustainable water treatment systems and proposes ways to evaluate the environmental risks and social acceptance of nanotechnology-enabled water treatment processes. Future areas of research necessary to realize safe deployment of promising nanomaterial applications are also identified.

Journal ArticleDOI
TL;DR: Weight stigma is likely to drive weight gain and poor health and thus should be eradicated; this effort can begin by training compassionate and knowledgeable healthcare providers who will deliver better care and ultimately lessen the negative effects of weight stigma.
Abstract: In an era when obesity prevalence is high throughout much of the world, there is a correspondingly pervasive and strong culture of weight stigma. For example, representative studies show that some forms of weight discrimination are more prevalent even than discrimination based on race or ethnicity. In this Opinion article, we review compelling evidence that weight stigma is harmful to health, over and above objective body mass index. Weight stigma is prospectively related to heightened mortality and other chronic diseases and conditions. Most ironically, it actually begets heightened risk of obesity through multiple obesogenic pathways. Weight stigma is particularly prevalent and detrimental in healthcare settings, with documented high levels of ‘anti-fat’ bias in healthcare providers, patients with obesity receiving poorer care and having worse outcomes, and medical students with obesity reporting high levels of alcohol and substance use to cope with internalized weight stigma. In terms of solutions, the most effective and ethical approaches should be aimed at changing the behaviors and attitudes of those who stigmatize, rather than towards the targets of weight stigma. Medical training must address weight bias, training healthcare professionals about how it is perpetuated and on its potentially harmful effects on their patients. Weight stigma is likely to drive weight gain and poor health and thus should be eradicated. This effort can begin by training compassionate and knowledgeable healthcare providers who will deliver better care and ultimately lessen the negative effects of weight stigma.

Journal ArticleDOI
TL;DR: The results presented here show that favorable climate conditions, particularly high precipitation, tend to increase both species richness and belowground biomass, both of which have consistent positive effects on SOC storage in forests, shrublands, and grasslands.
Abstract: Despite evidence from experimental grasslands that plant diversity increases biomass production and soil organic carbon (SOC) storage, it remains unclear whether this is true in natural ecosystems, especially under climatic variations and human disturbances. Based on field observations from 6,098 forest, shrubland, and grassland sites across China and predictions from an integrative model combining multiple theories, we systematically examined the direct effects of climate, soils, and human impacts on SOC storage versus the indirect effects mediated by species richness (SR), aboveground net primary productivity (ANPP), and belowground biomass (BB). We found that favorable climates (high temperature and precipitation) had a consistent negative effect on SOC storage in forests and shrublands, but not in grasslands. Climate favorability, particularly high precipitation, was associated with both higher SR and higher BB, which had consistent positive effects on SOC storage, thus offsetting the direct negative effect of favorable climate on SOC. The indirect effects of climate on SOC storage depended on the relationships of SR with ANPP and BB, which were consistently positive in all biome types. In addition, human disturbance and soil pH had both direct and indirect effects on SOC storage, with the indirect effects mediated by changes in SR, ANPP, and BB. High soil pH had a consistently negative effect on SOC storage. Our findings have important implications for improving global carbon cycling models and ecosystem management: Maintaining high levels of diversity can enhance soil carbon sequestration and help sustain the benefits of plant diversity and productivity.
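The direct-versus-indirect logic described above can be illustrated with a simple linear mediation analysis on synthetic data (a sketch of the general technique, not the authors' integrative model; the variables and coefficients are invented). For OLS fits, the total effect splits exactly into the direct path plus the product-of-coefficients indirect path.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 2000

# Synthetic example: precipitation affects SOC both directly and
# indirectly through belowground biomass (the mediator).
precip = rng.standard_normal(n)
biomass = 0.6 * precip + rng.standard_normal(n) * 0.5                # a-path
soc = -0.3 * precip + 0.8 * biomass + rng.standard_normal(n) * 0.5   # c', b

def ols(X, y):
    """Least-squares coefficients with an intercept column prepended."""
    X = np.column_stack([np.ones(len(y)), X])
    return np.linalg.lstsq(X, y, rcond=None)[0]

a = ols(precip, biomass)[1]                     # precip -> biomass
coef = ols(np.column_stack([precip, biomass]), soc)
c_prime, b = coef[1], coef[2]                   # direct and mediator paths
indirect = a * b                                # product-of-coefficients
total = ols(precip, soc)[1]
print(c_prime, indirect, total)                 # total = direct + indirect
```

This is the pattern the abstract reports: a negative direct climate effect offset by a positive indirect effect routed through biomass.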

Journal ArticleDOI
TL;DR: A comprehensive overview of the current understanding of potential exoplanet biosignatures, including gaseous, surface, and temporal signatures, can be found in this article, with a focus on recent advances in assessing biosignature plausibility.
Abstract: In the coming years and decades, advanced space- and ground-based observatories will allow an unprecedented opportunity to probe the atmospheres and surfaces of potentially habitable exoplanets for signatures of life. Life on Earth, through its gaseous products and reflectance and scattering properties, has left its fingerprint on the spectrum of our planet. Aided by the universality of the laws of physics and chemistry, we turn to Earth's biosphere, both in the present and through geologic time, for analog signatures that will aid in the search for life elsewhere. Considering the insights gained from modern and ancient Earth, and the broader array of hypothetical exoplanet possibilities, we have compiled a comprehensive overview of our current understanding of potential exoplanet biosignatures, including gaseous, surface, and temporal biosignatures. We additionally survey biogenic spectral features that are well known in the specialist literature but have not yet been robustly vetted in the context of exoplanet biosignatures. We briefly review advances in assessing biosignature plausibility, including novel methods for determining chemical disequilibrium from remotely obtainable data and assessment tools for determining the minimum biomass required to maintain short-lived biogenic gases as atmospheric signatures. We focus particularly on advances made since the seminal review by Des Marais et al. The purpose of this work is not to propose new biosignature strategies, a goal left to companion articles in this series, but to review the current literature, draw meaningful connections between seemingly disparate areas, and clear the way for a path forward.
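One remotely assessable disequilibrium metric of the kind mentioned above can be illustrated with the classic Earth case: the coexistence of CH4 and O2. The sketch below (approximate textbook thermochemical values and rough mixing ratios, not a calculation from the review) evaluates ΔG = ΔG° + RT ln Q for methane oxidation; the strongly negative result signals a thermodynamic drive that, on Earth, biology continuously maintains.

```python
import numpy as np

R = 8.314e-3   # gas constant, kJ / (mol K)
T = 298.15     # K

# Methane oxidation: CH4 + 2 O2 -> CO2 + 2 H2O(g)
# Approximate standard Gibbs energies of formation at 298 K (kJ/mol).
dGf = {"CH4": -50.8, "O2": 0.0, "CO2": -394.4, "H2O": -228.6}
dG0 = dGf["CO2"] + 2 * dGf["H2O"] - (dGf["CH4"] + 2 * dGf["O2"])

# Rough present-day tropospheric mixing ratios (mole fractions).
x = {"CH4": 1.8e-6, "O2": 0.21, "CO2": 4.1e-4, "H2O": 1e-2}
Q = (x["CO2"] * x["H2O"] ** 2) / (x["CH4"] * x["O2"] ** 2)

dG = dG0 + R * T * np.log(Q)   # kJ per mol of CH4 oxidized
print(dG0, dG)                 # both strongly negative: far from equilibrium
```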

Journal ArticleDOI
TL;DR: A multi-laboratory study finds that single-molecule FRET is a reproducible and reliable approach for determining accurate distances in dye-labeled DNA duplexes.
Abstract: Single-molecule Förster resonance energy transfer (smFRET) is increasingly being used to determine distances, structures, and dynamics of biomolecules in vitro and in vivo. However, generalized protocols and FRET standards to ensure the reproducibility and accuracy of measurements of FRET efficiencies are currently lacking. Here we report the results of a comparative blind study in which 20 labs determined the FRET efficiencies (E) of several dye-labeled DNA duplexes. Using a unified, straightforward method, we obtained FRET efficiencies with s.d. between ±0.02 and ±0.05. We suggest experimental and computational procedures for converting FRET efficiencies into accurate distances, and discuss potential uncertainties in the experiment and the modeling. Our quantitative assessment of the reproducibility of intensity-based smFRET measurements and a unified correction procedure represents an important step toward the validation of distance networks, with the ultimate aim of achieving reliable structural models of biomolecular systems by smFRET-based hybrid methods.
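The efficiency-to-distance conversion the study standardizes rests on the Förster relation E = 1/(1 + (r/R0)^6). As a minimal sketch (the R0 and E values here are invented for illustration), the code below inverts that relation and propagates a ±0.05 spread in E, the upper end of the study's reported standard deviations, into a distance interval.

```python
def fret_efficiency(r, R0):
    """Förster relation: transfer efficiency at inter-dye distance r."""
    return 1.0 / (1.0 + (r / R0) ** 6)

def fret_distance(E, R0):
    """Inverts E = 1 / (1 + (r/R0)**6) to recover the distance."""
    return R0 * (1.0 / E - 1.0) ** (1.0 / 6.0)

R0 = 6.5   # Förster radius in nm (dye-pair dependent, illustrative value)
E = 0.40   # measured, corrected FRET efficiency (illustrative value)
r = fret_distance(E, R0)
print(r)   # inter-dye distance in nm

# Propagate a +/-0.05 uncertainty in E into the distance estimate.
r_lo, r_hi = fret_distance(E + 0.05, R0), fret_distance(E - 0.05, R0)
print(r_lo, r_hi)
```

Note the sixth-power dependence makes the conversion most precise near r = R0 and increasingly uncertain at the extremes, one reason the paper emphasizes careful correction procedures.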