
Showing papers by "University of Notre Dame" published in 2017


Proceedings ArticleDOI
04 Aug 2017
TL;DR: Two scalable representation learning models, namely metapath2vec and metapath2vec++, are developed that are able to not only outperform state-of-the-art embedding models in various heterogeneous network mining tasks, but also discern the structural and semantic correlations between diverse network objects.
Abstract: We study the problem of representation learning in heterogeneous networks. Its unique challenges come from the existence of multiple types of nodes and links, which limit the feasibility of the conventional network embedding techniques. We develop two scalable representation learning models, namely metapath2vec and metapath2vec++. The metapath2vec model formalizes meta-path-based random walks to construct the heterogeneous neighborhood of a node and then leverages a heterogeneous skip-gram model to perform node embeddings. The metapath2vec++ model further enables the simultaneous modeling of structural and semantic correlations in heterogeneous networks. Extensive experiments show that metapath2vec and metapath2vec++ are able to not only outperform state-of-the-art embedding models in various heterogeneous network mining tasks, such as node classification, clustering, and similarity search, but also discern the structural and semantic correlations between diverse network objects.
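As an illustration of the meta-path-based random walks the abstract describes, here is a minimal, hypothetical Python sketch: a walker on a toy heterogeneous graph only steps to neighbors whose node type matches the next slot of a meta-path (e.g. author-paper-venue-paper-author). The graph, node names, and walk parameters are invented for illustration and are not the authors' implementation; the resulting walks would then feed a skip-gram model to learn node embeddings.

```python
# Minimal sketch of a meta-path-guided random walk (illustrative, not the authors' code).
import random
from collections import defaultdict

neighbors = defaultdict(list)   # node -> list of neighboring nodes
node_type = {}                  # node -> type label ("A" author, "P" paper, "V" venue)

def add_edge(u, u_type, v, v_type):
    node_type[u], node_type[v] = u_type, v_type
    neighbors[u].append(v)
    neighbors[v].append(u)

# toy heterogeneous graph
add_edge("a1", "A", "p1", "P"); add_edge("a2", "A", "p1", "P")
add_edge("p1", "P", "v1", "V"); add_edge("a2", "A", "p2", "P")
add_edge("p2", "P", "v1", "V")

def metapath_walk(start, metapath=("A", "P", "V", "P", "A"), length=9):
    """Walk that only steps to neighbors whose type matches the next meta-path slot."""
    walk = [start]
    for i in range(1, length):
        wanted = metapath[i % (len(metapath) - 1)]   # cycle through the symmetric meta-path
        candidates = [n for n in neighbors[walk[-1]] if node_type[n] == wanted]
        if not candidates:
            break
        walk.append(random.choice(candidates))
    return walk

print(metapath_walk("a1"))
# The walks generated this way would then be used as "sentences" for a
# (heterogeneous) skip-gram model, e.g. a Word2Vec-style trainer.
```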

1,794 citations


Journal ArticleDOI
TL;DR: The HiTOP promises to improve research and clinical practice by addressing the aforementioned shortcomings of traditional nosologies and provides an effective way to summarize and convey information on risk factors, etiology, pathophysiology, phenomenology, illness course, and treatment response.
Abstract: The reliability and validity of traditional taxonomies are limited by arbitrary boundaries between psychopathology and normality, often unclear boundaries between disorders, frequent disorder co-occurrence, heterogeneity within disorders, and diagnostic instability. These taxonomies went beyond evidence available on the structure of psychopathology and were shaped by a variety of other considerations, which may explain the aforementioned shortcomings. The Hierarchical Taxonomy of Psychopathology (HiTOP) model has emerged as a research effort to address these problems. It constructs psychopathological syndromes and their components/subtypes based on the observed covariation of symptoms, grouping related symptoms together and thus reducing heterogeneity. It also combines co-occurring syndromes into spectra, thereby mapping out comorbidity. Moreover, it characterizes these phenomena dimensionally, which addresses boundary problems and diagnostic instability. Here, we review the development of the HiTOP and the relevant evidence. The new classification already covers most forms of psychopathology. Dimensional measures have been developed to assess many of the identified components, syndromes, and spectra. Several domains of this model are ready for clinical and research applications. The HiTOP promises to improve research and clinical practice by addressing the aforementioned shortcomings of traditional nosologies. It also provides an effective way to summarize and convey information on risk factors, etiology, pathophysiology, phenomenology, illness course, and treatment response. This can greatly improve the utility of the diagnosis of mental disorders. The new classification remains a work in progress. However, it is developing rapidly and is poised to advance mental health research and care significantly as the relevant science matures.

1,635 citations


Journal ArticleDOI
TL;DR: SDSS-IV as mentioned in this paper is a project encompassing three major spectroscopic programs: the Apache Point Observatory Galactic Evolution Experiment 2 (APOGEE-2), the Mapping Nearby Galaxies at Apache Point Observatory (MaNGA) survey, and the extended Baryon Oscillation Spectroscopic Survey (eBOSS), the latter including two major subprograms, SPIDERS and the Time Domain Spectroscopic Survey (TDSS).
Abstract: We describe the Sloan Digital Sky Survey IV (SDSS-IV), a project encompassing three major spectroscopic programs. The Apache Point Observatory Galactic Evolution Experiment 2 (APOGEE-2) is observing hundreds of thousands of Milky Way stars at high resolution and high signal-to-noise ratios in the near-infrared. The Mapping Nearby Galaxies at Apache Point Observatory (MaNGA) survey is obtaining spatially resolved spectroscopy for thousands of nearby galaxies (median $z\sim 0.03$). The extended Baryon Oscillation Spectroscopic Survey (eBOSS) is mapping the galaxy, quasar, and neutral gas distributions between $z\sim 0.6$ and 3.5 to constrain cosmology using baryon acoustic oscillations, redshift space distortions, and the shape of the power spectrum. Within eBOSS, we are conducting two major subprograms: the SPectroscopic IDentification of eROSITA Sources (SPIDERS), investigating X-ray AGNs and galaxies in X-ray clusters, and the Time Domain Spectroscopic Survey (TDSS), obtaining spectra of variable sources. All programs use the 2.5 m Sloan Foundation Telescope at the Apache Point Observatory; observations there began in Summer 2014. APOGEE-2 also operates a second near-infrared spectrograph at the 2.5 m du Pont Telescope at Las Campanas Observatory, with observations beginning in early 2017. Observations at both facilities are scheduled to continue through 2020. In keeping with previous SDSS policy, SDSS-IV provides regularly scheduled public data releases; the first one, Data Release 13, was made available in 2016 July.

1,200 citations


Journal ArticleDOI
TL;DR: The intent of this document is to provide an introduction to modal analysis that is accessible to the larger fluid dynamics community and to present a brief overview of several of the well-established techniques.
Abstract: Simple aerodynamic configurations under even modest conditions can exhibit complex flows with a wide range of temporal and spatial features. It has become common practice in the analysis of these flows to look for and extract physically important features, or modes, as a first step in the analysis. This step typically starts with a modal decomposition of an experimental or numerical dataset of the flowfield, or of an operator relevant to the system. We describe herein some of the dominant techniques for accomplishing these modal decompositions and analyses that have seen a surge of activity in recent decades [1–8]. For a nonexpert, keeping track of recent developments can be daunting, and the intent of this document is to provide an introduction to modal analysis that is accessible to the larger fluid dynamics community. In particular, we present a brief overview of several of the well-established techniques and clearly lay the framework of these methods using familiar linear algebra. The modal analysis techniques covered in this paper include the proper orthogonal decomposition (POD), balanced proper orthogonal decomposition (balanced POD), dynamic mode decomposition (DMD), Koopman analysis, global linear stability analysis, and resolvent analysis.
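As a concrete example of the most basic of these techniques, the sketch below computes a proper orthogonal decomposition (POD) of a snapshot matrix via the singular value decomposition. The array sizes and the random "snapshots" are placeholders rather than a real flowfield dataset.

```python
# Minimal POD-via-SVD sketch on synthetic snapshot data (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
n_points, n_snapshots = 500, 64                      # spatial points x time snapshots (toy sizes)
X = rng.standard_normal((n_points, n_snapshots))     # stand-in for flowfield snapshots

X_mean = X.mean(axis=1, keepdims=True)               # subtract the temporal mean flow
U, s, Vt = np.linalg.svd(X - X_mean, full_matrices=False)

modes = U                                            # spatial POD modes (columns of U)
energy = s**2 / np.sum(s**2)                         # fraction of fluctuation energy per mode
coeffs = np.diag(s) @ Vt                             # temporal coefficients of each mode

print("energy captured by first 5 modes:", energy[:5].sum())
```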

1,110 citations


Journal ArticleDOI
TL;DR: This article reviews the use of eDNA metabarcoding for surveying animal and plant richness and the challenges in using eDNA approaches to estimate relative abundance, distilling what is known about the ability of different eDNA sample types to approximate richness in space and across time.
Abstract: The genomic revolution has fundamentally changed how we survey biodiversity on earth. High-throughput sequencing ("HTS") platforms now enable the rapid sequencing of DNA from diverse kinds of environmental samples (termed "environmental DNA" or "eDNA"). Coupling HTS with our ability to associate sequences from eDNA with a taxonomic name is called "eDNA metabarcoding" and offers a powerful molecular tool capable of noninvasively surveying species richness from many ecosystems. Here, we review the use of eDNA metabarcoding for surveying animal and plant richness, and the challenges in using eDNA approaches to estimate relative abundance. We highlight eDNA applications in freshwater, marine and terrestrial environments, and in this broad context, we distill what is known about the ability of different eDNA sample types to approximate richness in space and across time. We provide guiding questions for study design and discuss the eDNA metabarcoding workflow with a focus on primers and library preparation methods. We additionally discuss important criteria for consideration of bioinformatic filtering of data sets, with recommendations for increasing transparency. Finally, looking to the future, we discuss emerging applications of eDNA metabarcoding in ecology, conservation, invasion biology, biomonitoring, and how eDNA metabarcoding can empower citizen science and biodiversity education.

1,038 citations


Journal ArticleDOI
TL;DR: Raising global scientific and public awareness of the plight of the world’s primates and the costs of their loss to ecosystem health and human society is imperative.
Abstract: Nonhuman primates, our closest biological relatives, play important roles in the livelihoods, cultures, and religions of many societies and offer unique insights into human evolution, biology, behavior, and the threat of emerging diseases. They are an essential component of tropical biodiversity, contributing to forest regeneration and ecosystem health. Current information shows the existence of 504 species in 79 genera distributed in the Neotropics, mainland Africa, Madagascar, and Asia. Alarmingly, ~60% of primate species are now threatened with extinction and ~75% have declining populations. This situation is the result of escalating anthropogenic pressures on primates and their habitats—mainly global and local market demands, leading to extensive habitat loss through the expansion of industrial agriculture, large-scale cattle ranching, logging, oil and gas drilling, mining, dam building, and the construction of new road networks in primate range regions. Other important drivers are increased bushmeat hunting and the illegal trade of primates as pets and primate body parts, along with emerging threats, such as climate change and anthroponotic diseases. Often, these pressures act in synergy, exacerbating primate population declines. Given that primate range regions overlap extensively with a large, and rapidly growing, human population characterized by high levels of poverty, global attention is needed immediately to reverse the looming risk of primate extinctions and to attend to local human needs in sustainable ways. Raising global scientific and public awareness of the plight of the world’s primates and the costs of their loss to ecosystem health and human society is imperative.

893 citations


Journal ArticleDOI
Albert M. Sirunyan, Armen Tumasyan, Wolfgang Adam, Ece Aşılar, +2212 more (157 institutions)
TL;DR: A fully-fledged particle-flow reconstruction algorithm tuned to the CMS detector was developed and has been consistently used in physics analyses for the first time at a hadron collider as mentioned in this paper.
Abstract: The CMS apparatus was identified, a few years before the start of the LHC operation at CERN, to feature properties well suited to particle-flow (PF) reconstruction: a highly-segmented tracker, a fine-grained electromagnetic calorimeter, a hermetic hadron calorimeter, a strong magnetic field, and an excellent muon spectrometer. A fully-fledged PF reconstruction algorithm tuned to the CMS detector was therefore developed and has been consistently used in physics analyses for the first time at a hadron collider. For each collision, the comprehensive list of final-state particles identified and reconstructed by the algorithm provides a global event description that leads to unprecedented CMS performance for jet and hadronic τ decay reconstruction, missing transverse momentum determination, and electron and muon identification. This approach also allows particles from pileup interactions to be identified and enables efficient pileup mitigation methods. The data collected by CMS at a centre-of-mass energy of 8 TeV show excellent agreement with the simulation and confirm the superior PF performance at least up to an average of 20 pileup interactions.

719 citations


Journal ArticleDOI
TL;DR: Research on crisis management and resilience has sought to explain how individuals and organizations anticipate and respond to adversity, yet, as the authors discuss, there has been surprisingly little integration across the different disciplines studying these topics.
Abstract: Research on crisis management and resilience has sought to explain how individuals and organizations anticipate and respond to adversity, yet—surprisingly—there has been little integration across t...

702 citations


Journal ArticleDOI
TL;DR: In this article, the authors describe the two interwoven paths by which citizen science can improve conservation efforts, natural resource management, and environmental protection, and describe the investments needed to create a citizen science program.

646 citations


Journal ArticleDOI
19 Jul 2017-Nature
TL;DR: The results of new excavations conducted at Madjedbebe, a rock shelter in northern Australia, set a new minimum age of around 65,000 years ago for the arrival of humans in Australia, the dispersal of modern humans out of Africa, and the subsequent interactions of modern humans with Neanderthals and Denisovans.
Abstract: The time of arrival of people in Australia is an unresolved question. It is relevant to debates about when modern humans first dispersed out of Africa and when their descendants incorporated genetic material from Neanderthals, Denisovans and possibly other hominins. Humans have also been implicated in the extinction of Australia’s megafauna. Here we report the results of new excavations conducted at Madjedbebe, a rock shelter in northern Australia. Artefacts in primary depositional context are concentrated in three dense bands, with the stratigraphic integrity of the deposit demonstrated by artefact refits and by optical dating and other analyses of the sediments. Human occupation began around 65,000 years ago, with a distinctive stone tool assemblage including grinding stones, ground ochres, reflective additives and ground-edge hatchet heads. This evidence sets a new minimum age for the arrival of humans in Australia, the dispersal of modern humans out of Africa, and the subsequent interactions of modern humans with Neanderthals and Denisovans. Optical dating of sediments containing stone artefacts newly excavated at Madjedbebe, Australia, indicates that human occupation began around 65,000 years ago, thereby setting a new minimum age for the arrival of people in Australia. When did humans first colonize Australia? The date of the initial landing on the continent that is now associated with cold lager and 'Waltzing Matilda' has been highly controversial. Dates from a site called Madjedbebe in northern Australia had put the presence of modern humans in Australia at between 60,000 and 50,000 years ago, but these results have since been hotly contested. Here, the results from a comprehensive program of dating of new excavations at the site confirm that people first arrived there around 65,000 years ago. The results show that humans reached Australia well before the extinction of the Australian megafauna and the disappearance of Homo floresiensis in neighbouring Indonesia.

597 citations


Journal ArticleDOI
01 Sep 2017-Science
TL;DR: It is demonstrated that under reaction conditions, mobilized Cu ions can travel through zeolite windows and form transient ion pairs that participate in an oxygen (O2)-mediated Cu(I)→Cu(II) redox step integral to SCR.
Abstract: Copper ions exchanged into zeolites are active for the selective catalytic reduction (SCR) of nitrogen oxides (NOx) with ammonia (NH3), but the low-temperature rate dependence on copper (Cu) volumetric density is inconsistent with reaction at single sites. We combine steady-state and transient kinetic measurements, x-ray absorption spectroscopy, and first-principles calculations to demonstrate that under reaction conditions, mobilized Cu ions can travel through zeolite windows and form transient ion pairs that participate in an oxygen (O2)-mediated Cu(I)→Cu(II) redox step integral to SCR. Electrostatic tethering to framework aluminum centers limits the volume that each ion can explore and thus its capacity to form an ion pair. The dynamic, reversible formation of multinuclear sites from mobilized single atoms represents a distinct phenomenon that falls outside the conventional boundaries of a heterogeneous or homogeneous catalyst.

Journal ArticleDOI
22 Dec 2017-Science
TL;DR: In this paper, the authors present ultraviolet, optical, and infrared light curves of SSS17a extending from 10.9 hours to 18 days post-merger, showing that the late-time light curve indicates that SSS17a produced at least 0.05 solar masses of heavy elements, demonstrating that neutron star mergers play a role in rapid neutron capture (r-process) nucleosynthesis in the universe.
Abstract: On 17 August 2017, gravitational waves (GWs) were detected from a binary neutron star merger, GW170817, along with a coincident short gamma-ray burst, GRB 170817A. An optical transient source, Swope Supernova Survey 17a (SSS17a), was subsequently identified as the counterpart of this event. We present ultraviolet, optical, and infrared light curves of SSS17a extending from 10.9 hours to 18 days postmerger. We constrain the radioactively powered transient resulting from the ejection of neutron-rich material. The fast rise of the light curves, subsequent decay, and rapid color evolution are consistent with multiple ejecta components of differing lanthanide abundance. The late-time light curve indicates that SSS17a produced at least ~0.05 solar masses of heavy elements, demonstrating that neutron star mergers play a role in rapid neutron capture (r-process) nucleosynthesis in the universe.

Journal ArticleDOI
TL;DR: In this paper, the trigger system consists of two levels designed to select events of potential physics interest from a GHz (MHz) interaction rate of proton-proton (heavy ion) collisions.
Abstract: This paper describes the CMS trigger system and its performance during Run 1 of the LHC. The trigger system consists of two levels designed to select events of potential physics interest from a GHz (MHz) interaction rate of proton-proton (heavy ion) collisions. The first level of the trigger is implemented in hardware, and selects events containing detector signals consistent with an electron, photon, muon, tau lepton, jet, or missing transverse energy. A programmable menu of up to 128 object-based algorithms is used to select events for subsequent processing. The trigger thresholds are adjusted to the LHC instantaneous luminosity during data taking in order to restrict the output rate to 100 kHz, the upper limit imposed by the CMS readout electronics. The second level, implemented in software, further refines the purity of the output stream, selecting an average rate of 400 Hz for offline event storage. The objectives, strategy and performance of the trigger system during the LHC Run 1 are described.

Journal ArticleDOI
TL;DR: Data Release 13 (DR13) as discussed by the authors is the first data release from SDSS-IV; it makes publicly available the first 1390 spatially resolved integral field unit observations of nearby galaxies from Mapping Nearby Galaxies at APO (MaNGA), together with new observations from the Extended Baryon Oscillation Spectroscopic Survey (eBOSS) and improved reductions of earlier SDSS-III data.
Abstract: The fourth generation of the Sloan Digital Sky Survey (SDSS-IV) began observations in 2014 July. It pursues three core programs: the Apache Point Observatory Galactic Evolution Experiment 2 (APOGEE-2), Mapping Nearby Galaxies at APO (MaNGA), and the Extended Baryon Oscillation Spectroscopic Survey (eBOSS). As well as its core program, eBOSS contains two major subprograms: the Time Domain Spectroscopic Survey (TDSS) and the SPectroscopic IDentification of ERosita Sources (SPIDERS). This paper describes the first data release from SDSS-IV, Data Release 13 (DR13). DR13 makes publicly available the first 1390 spatially resolved integral field unit observations of nearby galaxies from MaNGA. It includes new observations from eBOSS, completing the Sloan Extended QUasar, Emission-line galaxy, Luminous red galaxy Survey (SEQUELS), which also targeted variability-selected objects and X-ray-selected objects. DR13 includes new reductions of the SDSS-III BOSS data, improving the spectrophotometric calibration and redshift classification, and new reductions of the SDSS-III APOGEE-1 data, improving stellar parameters for dwarf stars and cooler stars. DR13 provides more robust and precise photometric calibrations. Value-added target catalogs relevant for eBOSS, TDSS, and SPIDERS and an updated red-clump catalog for APOGEE are also available. This paper describes the location and format of the data and provides references to important technical papers. The SDSS web site, http://www.sdss.org, provides links to the data, tutorials, examples of data access, and extensive documentation of the reduction and analysis procedures. DR13 is the first of a scheduled set that will contain new data and analyses from the planned ∼6 yr operations of SDSS-IV.

Journal ArticleDOI
30 Mar 2017-Oncogene
TL;DR: It is shown that CAFs exposed to chemotherapy have an active role in regulating the survival and proliferation of cancer cells, highlighting the potential of exosome inhibitors as treatment options alongside chemotherapy for overcoming PDAC chemoresistance.
Abstract: Cancer-associated fibroblasts (CAFs) comprise the majority of the tumor bulk of pancreatic ductal adenocarcinomas (PDACs). Current efforts to eradicate these tumors focus predominantly on targeting the proliferation of rapidly growing cancer epithelial cells. We know that this is largely ineffective with resistance arising in most tumors following exposure to chemotherapy. Despite the long-standing recognition of the prominence of CAFs in PDAC, the effect of chemotherapy on CAFs and how they may contribute to drug resistance in neighboring cancer cells is not well characterized. Here, we show that CAFs exposed to chemotherapy have an active role in regulating the survival and proliferation of cancer cells. We found that CAFs are intrinsically resistant to gemcitabine, the chemotherapeutic standard of care for PDAC. Further, CAFs exposed to gemcitabine significantly increase the release of extracellular vesicles called exosomes. These exosomes increased chemoresistance-inducing factor, Snail, in recipient epithelial cells and promote proliferation and drug resistance. Finally, treatment of gemcitabine-exposed CAFs with an inhibitor of exosome release, GW4869, significantly reduces survival in co-cultured epithelial cells, signifying an important role of CAF exosomes in chemotherapeutic drug resistance. Collectively, these findings show the potential for exosome inhibitors as treatment options alongside chemotherapy for overcoming PDAC chemoresistance.

Journal ArticleDOI
Khachatryan, Albert M. Sirunyan, Armen Tumasyan, Wolfgang Adam, +2285 more (147 institutions)
TL;DR: In this paper, improved jet energy scale corrections, based on a data sample corresponding to an integrated luminosity of 19.7 fb^(-1) collected by the CMS experiment in proton-proton collisions at a center-of-mass energy of 8 TeV, are presented.
Abstract: Improved jet energy scale corrections, based on a data sample corresponding to an integrated luminosity of 19.7 fb^(-1) collected by the CMS experiment in proton-proton collisions at a center-of-mass energy of 8 TeV, are presented. The corrections as a function of pseudorapidity η and transverse momentum p_T are extracted from data and simulated events combining several channels and methods. They account successively for the effects of pileup, uniformity of the detector response, and residual data-simulation jet energy scale differences. Further corrections, depending on the jet flavor and distance parameter (jet size) R, are also presented. The jet energy resolution is measured in data and simulated events and is studied as a function of pileup, jet size, and jet flavor. Typical jet energy resolutions at the central rapidities are 15–20% at 30 GeV, about 10% at 100 GeV, and 5% at 1 TeV. The studies exploit events with dijet topology, as well as photon+jet, Z+jet and multijet events. Several new techniques are used to account for the various sources of jet energy scale corrections, and a full set of uncertainties, and their correlations, are provided. The final uncertainties on the jet energy scale are below 3% across the phase space considered by most analyses (p_T > 30 GeV and |η| < 5.0). In the barrel region (|η| < 1.3), an uncertainty below 1% for p_T > 30 GeV is reached, when excluding the jet flavor uncertainties, which are provided separately for different jet flavors. A new benchmark for jet energy scale determination at hadron colliders is achieved with 0.32% uncertainty for jets with p_T of the order of 165–330 GeV, and |η| < 0.8.

Journal ArticleDOI
TL;DR: In this article, the authors empirically test an integrative model linking tourists' emotional experiences, perceived overall image, satisfaction, and intention to recommend using data collected from domestic tourists visiting Sardinia, Italy.
Abstract: The purpose of this study is to empirically test an integrative model linking tourists’ emotional experiences, perceived overall image, satisfaction, and intention to recommend. The model was tested using data collected from domestic tourists visiting Sardinia, Italy. Results show that tourists’ emotional experiences act as antecedents of perceived overall image and satisfaction evaluations. In addition, overall image has a positive influence on tourist satisfaction and intention to recommend. The study expands current theorizations by examining the merits of emotions in tourist behavior models. From a practical perspective, the study offers important implications for destination marketers.

Journal ArticleDOI
TL;DR: Using univariate and multivariate skewness and kurtosis as measures of nonnormality, this study examined distributions collected from authors of articles published in Psychological Science and the American Education Research Journal and found that 74% of univariate distributions and 68% of multivariate distributions deviated from normal distributions.
Abstract: Nonnormality of univariate data has been extensively examined previously (Blanca et al., Methodology: European Journal of Research Methods for the Behavioral and Social Sciences, 9(2), 78-84, 2013; Micceri, Psychological Bulletin, 105(1), 156, 1989). However, less is known of the potential nonnormality of multivariate data although multivariate analysis is commonly used in psychological and educational research. Using univariate and multivariate skewness and kurtosis as measures of nonnormality, this study examined 1,567 univariate distributions and 254 multivariate distributions collected from authors of articles published in Psychological Science and the American Education Research Journal. We found that 74% of univariate distributions and 68% of multivariate distributions deviated from normal distributions. In a simulation study using typical values of skewness and kurtosis that we collected, we found that the resulting type I error rates were 17% in a t-test and 30% in a factor analysis under some conditions. Hence, we argue that it is time to routinely report skewness and kurtosis along with other summary statistics such as means and variances. To facilitate future reporting of skewness and kurtosis, we provide a tutorial on how to compute univariate and multivariate skewness and kurtosis by SAS, SPSS, R and a newly developed Web application.
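For readers who prefer Python over the SAS/SPSS/R tools the paper provides, a rough sketch of the same quantities is given below: per-variable skewness and excess kurtosis, plus Mardia's multivariate skewness and kurtosis. The random data are placeholders, and the formulas follow standard textbook definitions rather than the authors' web application.

```python
# Sketch of univariate and Mardia-type multivariate skewness/kurtosis (illustrative).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
X = rng.standard_normal((200, 3))            # n observations x p variables (toy data)

# univariate skewness and excess kurtosis, one value per variable (both ~0 if normal)
uni_skew = stats.skew(X, axis=0)
uni_kurt = stats.kurtosis(X, axis=0)

# Mardia's multivariate skewness b_{1,p} and kurtosis b_{2,p}
n, p = X.shape
Xc = X - X.mean(axis=0)
S_inv = np.linalg.inv(np.cov(X, rowvar=False, bias=True))
D = Xc @ S_inv @ Xc.T                        # Mahalanobis-type cross products
mardia_skew = (D ** 3).sum() / n ** 2        # ~0 under multivariate normality
mardia_kurt = np.mean(np.diag(D) ** 2)       # ~p*(p+2) under multivariate normality

print(uni_skew, uni_kurt, mardia_skew, mardia_kurt)
```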

Posted Content
Yonit Hochberg, A. N. Villano, Andrei Afanasev, +238 more (98 institutions)
TL;DR: The white paper summarizes the workshop "U.S. Cosmic Visions: New Ideas in Dark Matter" held at University of Maryland on March 23-25, 2017.
Abstract: This white paper summarizes the workshop "U.S. Cosmic Visions: New Ideas in Dark Matter" held at University of Maryland on March 23-25, 2017.

Journal ArticleDOI
TL;DR: The need for refined diagnostic tools and effective control options to scale up public health interventions and improve clinical detection and management of soil-transmitted helminthiasis is highlighted.

Book ChapterDOI
Lin Yang, Yizhe Zhang, Jianxu Chen, Siyuan Zhang, Danny Z. Chen
10 Sep 2017
TL;DR: A deep active learning framework that combines fully convolutional network (FCN) and active learning to significantly reduce annotation effort by making judicious suggestions on the most effective annotation areas is presented.
Abstract: Image segmentation is a fundamental problem in biomedical image analysis. Recent advances in deep learning have achieved promising results on many biomedical image segmentation benchmarks. However, due to large variations in biomedical images (different modalities, image settings, objects, noise, etc.), to utilize deep learning on a new application, it usually needs a new set of training data. This can incur a great deal of annotation effort and cost, because only biomedical experts can annotate effectively, and often there are too many instances in images (e.g., cells) to annotate. In this paper, we aim to address the following question: With limited effort (e.g., time) for annotation, what instances should be annotated in order to attain the best performance? We present a deep active learning framework that combines fully convolutional network (FCN) and active learning to significantly reduce annotation effort by making judicious suggestions on the most effective annotation areas. We utilize uncertainty and similarity information provided by FCN and formulate a generalized version of the maximum set cover problem to determine the most representative and uncertain areas for annotation. Extensive experiments using the 2015 MICCAI Gland Challenge dataset and a lymph node ultrasound image segmentation dataset show that, using annotation suggestions by our method, state-of-the-art segmentation performance can be achieved by using only 50% of training data.
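The selection step can be pictured with a small, hypothetical sketch: candidate regions are scored by uncertainty and mutual similarity, and a greedy, maximum-coverage style loop picks a budget of annotation suggestions. This is a simplified illustration of the idea, not the authors' exact formulation or code.

```python
# Greedy uncertainty-weighted coverage selection (simplified illustration).
import numpy as np

rng = np.random.default_rng(2)
n_candidates = 100
uncertainty = rng.random(n_candidates)                 # e.g. disagreement among FCN bootstraps
features = rng.standard_normal((n_candidates, 16))     # e.g. FCN feature descriptors per region
features /= np.linalg.norm(features, axis=1, keepdims=True)
similarity = features @ features.T                     # cosine similarity between candidates

def suggest(budget=8):
    """Greedily pick candidates that maximize uncertainty-weighted coverage of all items."""
    chosen, covered = [], np.zeros(n_candidates)
    for _ in range(budget):
        # gain: how much each candidate would raise the best coverage of every item,
        # weighted by that item's uncertainty (well-covered items contribute little)
        gain = [np.sum(uncertainty * np.maximum(similarity[c] - covered, 0.0))
                for c in range(n_candidates)]
        best = int(np.argmax(gain))
        chosen.append(best)
        covered = np.maximum(covered, similarity[best])
    return chosen

print(suggest())   # indices of regions suggested for annotation
```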

Book ChapterDOI
01 Jan 2017
TL;DR: This chapter provides a context for the study of benthic algal biomass, discusses in detail some of the more commonly used approaches to measure benthic algal biomass, and describes a field exercise to examine the influence of irradiance on algal biomass.
Abstract: Biomass is one of the most fundamental measurements made in ecology. In stream ecology, biomass is frequently used to estimate the abundance of benthic primary producers, both autotrophic and heterotrophic. In this chapter, we (1) provide a context for the study of benthic algal biomass; (2) discuss in detail some of the more commonly used approaches to measure benthic algal biomass; and (3) describe a field exercise to examine the influence of irradiance on algal biomass, whereby these approaches can be employed and compared with each other to assess their individual performance.

Journal ArticleDOI
TL;DR: This work identifies four key issues that present challenges to understanding and classifying mental disorder and discusses how the three systems’ approaches to these key issues correspond or diverge as a result of their different histories, purposes, and constituencies.
Abstract: The diagnosis of mental disorder initially appears relatively straightforward: Patients present with symptoms or visible signs of illness; health professionals make diagnoses based primarily on these symptoms and signs; and they prescribe medication, psychotherapy, or both, accordingly. However, despite a dramatic expansion of knowledge about mental disorders during the past half century, understanding of their components and processes remains rudimentary. We provide histories and descriptions of three systems with different purposes relevant to understanding and classifying mental disorder. Two major diagnostic manuals-the International Classification of Diseases and the Diagnostic and Statistical Manual of Mental Disorders-provide classification systems relevant to public health, clinical diagnosis, service provision, and specific research applications, the former internationally and the latter primarily for the United States. In contrast, the National Institute of Mental Health's Research Domain Criteria provides a framework that emphasizes integration of basic behavioral and neuroscience research to deepen the understanding of mental disorder. We identify four key issues that present challenges to understanding and classifying mental disorder: etiology, including the multiple causality of mental disorder; whether the relevant phenomena are discrete categories or dimensions; thresholds, which set the boundaries between disorder and nondisorder; and comorbidity, the fact that individuals with mental illness often meet diagnostic requirements for multiple conditions. We discuss how the three systems' approaches to these key issues correspond or diverge as a result of their different histories, purposes, and constituencies. Although the systems have varying degrees of overlap and distinguishing features, they share the goal of reducing the burden of suffering due to mental disorder.

Proceedings ArticleDOI
01 Dec 2017
TL;DR: A transient Preisach model is developed that accurately predicts minor loop trajectories and remnant polarization charge for arbitrary pulse width, voltage, and history of FeFET synapses and reveals a 10^3 to 10^6 acceleration in online learning latency over multi-state RRAM based analog synapses.
Abstract: The memory requirement of at-scale deep neural networks (DNN) dictates that synaptic weight values be stored and updated in off-chip memory such as DRAM, limiting the energy efficiency and training time. Monolithic cross-bar / pseudo cross-bar arrays with analog non-volatile memories capable of storing and updating weights on-chip offer the possibility of accelerating DNN training. Here, we harness the dynamics of voltage controlled partial polarization switching in ferroelectric-FETs (FeFET) to demonstrate such an analog synapse. We develop a transient Preisach model that accurately predicts minor loop trajectories and remnant polarization charge (P_r) for arbitrary pulse width, voltage, and history. We experimentally demonstrate a 5-bit FeFET synapse with symmetric potentiation and depression characteristics, and a 45x tunable range in conductance with 75 ns update pulses. A circuit macro-model is used to evaluate and benchmark on-chip learning performance (area, latency, energy, accuracy) of the FeFET synaptic core, revealing a 10^3 to 10^6 acceleration in online learning latency over multi-state RRAM based analog synapses.
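For intuition about the hysteresis modeling involved, below is a minimal sketch of a classical Preisach model, a weighted superposition of two-state hysterons, which is the family of models the paper's transient formulation builds on. The thresholds, weights, and voltage waveform are arbitrary illustrative choices, not the fitted FeFET device model.

```python
# Classical Preisach hysteresis sketch: history-dependent polarization from hysterons.
import numpy as np

rng = np.random.default_rng(3)
n_hysterons = 500
beta = rng.uniform(-1.0, 1.0, n_hysterons)           # "down" switching thresholds
alpha = beta + rng.uniform(0.0, 1.0, n_hysterons)    # "up" thresholds, alpha >= beta
weight = np.full(n_hysterons, 1.0 / n_hysterons)     # equal hysteron weights
state = -np.ones(n_hysterons)                        # all hysterons start in the "down" state

def polarization(v):
    """Update hysteron states for applied voltage v and return the net polarization."""
    global state
    state = np.where(v >= alpha, 1.0, state)          # switch up once the input passes alpha
    state = np.where(v <= beta, -1.0, state)          # switch down once it drops below beta
    return float(np.sum(weight * state))

# a partial (minor-loop) voltage sweep: the final polarization depends on the whole history
waveform = np.concatenate([np.linspace(0.0, 1.5, 20),
                           np.linspace(1.5, -0.5, 20),
                           np.linspace(-0.5, 0.8, 20)])
trace = [polarization(v) for v in waveform]
print("final remanent-like polarization:", trace[-1])
```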

Journal ArticleDOI
TL;DR: A state-of-the-art platform for predictive image-based, multiscale modeling with co-designed simulations and experiments that executes on the world's largest supercomputers is discussed, which can serve as the basis of Virtual Materials Testing standards and aid in the development of new material formulations.

Journal ArticleDOI
TL;DR: The current status of the rapidly evolving field of microvesicle biology is reviewed, highlighting critical regulatory roles for several small GTPases in the biology and biogenesis of shed microvesicles.
Abstract: The ability of cells to transmit bioactive molecules to recipient cells and the extracellular environment is a fundamental requirement for both normal physiology and disease pathogenesis. It has traditionally been thought that soluble factors released from cells were responsible for this cellular signaling but recent research has revealed a fundamental role for microvesicles in this process. Microvesicles are heterogeneous membrane-bound sacs that are shed from the surface of cells into the extracellular environment in a highly regulated process. They are shed following the selective incorporation of a host of molecular cargo including multiple types of proteins and nucleic acids. In addition to providing new insight into the etiology of complex human diseases, microvesicles also show great promise as a tool for advanced diagnosis and therapy as we move forward into a new age of personalized medicine. Here we review current status of the rapidly evolving field of microvesicle biology, highlighting critical regulatory roles for several small GTPases in the biology and biogenesis of shed microvesicles.

Journal ArticleDOI
TL;DR: This model reveals that mixed halide perovskites can be stabilized against phase separation by deliberately engineering carrier diffusion lengths and injected carrier densities, and explains observed non-linear intensity dependencies, as well as self-limited growth of iodide-rich domains.
Abstract: Mixed halide hybrid perovskites, CH3NH3Pb(I1-xBrx)3, represent good candidates for low-cost, high efficiency photovoltaic, and light-emitting devices. Their band gaps can be tuned from 1.6 to 2.3 eV, by changing the halide anion identity. Unfortunately, mixed halide perovskites undergo phase separation under illumination. This leads to iodide- and bromide-rich domains along with corresponding changes to the material’s optical/electrical response. Here, using combined spectroscopic measurements and theoretical modeling, we quantitatively rationalize all microscopic processes that occur during phase separation. Our model suggests that the driving force behind phase separation is the bandgap reduction of iodide-rich phases. It additionally explains observed non-linear intensity dependencies, as well as self-limited growth of iodide-rich domains. Most importantly, our model reveals that mixed halide perovskites can be stabilized against phase separation by deliberately engineering carrier diffusion lengths and injected carrier densities. Mixed halide hybrid perovskites possess tunable band gaps, however, under illumination they undergo phase separation. Using spectroscopic measurements and theoretical modelling, Draguta and Sharia et al. quantitatively rationalize the microscopic processes that occur during phase separation.

Journal ArticleDOI
J. P. Lees, V. Poireau, V. Tisserand, E. Grauges, +231 more (54 institutions)
TL;DR: Limits on the coupling strength of A^{'} to e^{+}e^{-} in the mass range m_{A^{'}}≤8 GeV are set, which exclude the values of the A^{'} coupling suggested by the dark-photon interpretation of the muon (g-2)_{μ} anomaly, as well as a broad range of parameters for the dark-sector models.
Abstract: We search for single-photon events in 53 fb^{-1} of e^{+}e^{-} collision data collected with the BABAR detector at the PEP-II B-Factory. We look for events with a single high-energy photon and a large missing momentum and energy, consistent with production of a spin-1 particle A^{'} through the process e^{+}e^{-}→γA^{'}; A^{'}→invisible. Such particles, referred to as "dark photons," are motivated by theories applying a U(1) gauge symmetry to dark matter. We find no evidence for such processes and set 90% confidence level upper limits on the coupling strength of A^{'} to e^{+}e^{-} in the mass range m_{A^{'}}≤8 GeV. In particular, our limits exclude the values of the A^{'} coupling suggested by the dark-photon interpretation of the muon (g-2)_{μ} anomaly, as well as a broad range of parameters for the dark-sector models.

Journal ArticleDOI
TL;DR: The prevalence of fluorinated chemicals in fast food packaging demonstrates their potentially significant contribution to dietary PFAS exposure and environmental contamination during production and disposal.
Abstract: Per- and polyfluoroalkyl substances (PFASs) are highly persistent synthetic chemicals, some of which have been associated with cancer, developmental toxicity, immunotoxicity, and other health effects. PFASs in grease-resistant food packaging can leach into food and increase dietary exposure. We collected ∼400 samples of food contact papers, paperboard containers, and beverage containers from fast food restaurants throughout the United States and measured total fluorine using particle-induced γ-ray emission (PIGE) spectroscopy. PIGE can rapidly and inexpensively measure total fluorine in solid-phase samples. We found that 46% of food contact papers and 20% of paperboard samples contained detectable fluorine (>16 nmol/cm2). Liquid chromatography/high-resolution mass spectrometry analysis of a subset of 20 samples found perfluorocarboxylates, perfluorosulfonates, and other known PFASs and/or unidentified polyfluorinated compounds (based on nontargeted analysis). The total peak area for PFASs was higher in 70...

Journal ArticleDOI
TL;DR: The late-time light curve indicates that SSS17a produced at least ~0.05 solar masses of heavy elements, demonstrating that neutron star mergers play a role in rapid neutron capture (r-process) nucleosynthesis in the universe.
Abstract: On 2017 August 17, gravitational waves were detected from a binary neutron star merger, GW170817, along with a coincident short gamma-ray burst, GRB170817A. An optical transient source, Swope Supernova Survey 17a (SSS17a), was subsequently identified as the counterpart of this event. We present ultraviolet, optical and infrared light curves of SSS17a extending from 10.9 hours to 18 days post-merger. We constrain the radioactively-powered transient resulting from the ejection of neutron-rich material. The fast rise of the light curves, subsequent decay, and rapid color evolution are consistent with multiple ejecta components of differing lanthanide abundance. The late-time light curve indicates that SSS17a produced at least ~0.05 solar masses of heavy elements, demonstrating that neutron star mergers play a role in r-process nucleosynthesis in the Universe.