Showing papers by "University of Wisconsin-Madison" published in 2017
••
TL;DR: ImageJ2 as mentioned in this paper is the next generation of ImageJ; it provides a host of new functionality and separates concerns, fully decoupling the data model from the user interface.
Abstract: ImageJ is an image analysis program extensively used in the biological sciences and beyond. Due to its ease of use, recordable macro language, and extensible plug-in architecture, ImageJ enjoys contributions from non-programmers, amateur programmers, and professional developers alike. Enabling such a diversity of contributors has resulted in a large community that spans the biological and physical sciences. However, a rapidly growing user base, diverging plugin suites, and technical limitations have revealed a clear need for a concerted software engineering effort to support emerging imaging paradigms, to ensure the software’s ability to handle the requirements of modern science. We rewrote the entire ImageJ codebase, engineering a redesigned plugin mechanism intended to facilitate extensibility at every level, with the goal of creating a more powerful tool that continues to serve the existing community while addressing a wider range of scientific requirements. This next-generation ImageJ, called “ImageJ2” in places where the distinction matters, provides a host of new functionality. It separates concerns, fully decoupling the data model from the user interface. It emphasizes integration with external applications to maximize interoperability. Its robust new plugin framework allows everything from image formats, to scripting languages, to visualization to be extended by the community. The redesigned data model supports arbitrarily large, N-dimensional datasets, which are increasingly common in modern image acquisition. Despite the scope of these changes, backwards compatibility is maintained such that this new functionality can be seamlessly integrated with the classic ImageJ interface, allowing users and developers to migrate to these new methods at their own pace. Scientific imaging benefits from open-source programs that advance new method development and deployment to a diverse audience. 
ImageJ has continuously evolved with this idea in mind; however, new and emerging scientific requirements have posed corresponding challenges for ImageJ’s development. The described improvements provide a framework engineered for flexibility, intended to support these requirements as well as accommodate future needs. Future efforts will focus on implementing new algorithms in this framework and expanding collaborations with other popular scientific software suites.
4,093 citations
••
02 Apr 2017
TL;DR: This work introduces the first practical demonstration of an attacker controlling a remotely hosted DNN with no such knowledge, and finds that this black-box attack strategy is capable of evading defense strategies previously found to make adversarial example crafting harder.
Abstract: Machine learning (ML) models, e.g., deep neural networks (DNNs), are vulnerable to adversarial examples: malicious inputs modified to yield erroneous model outputs, while appearing unmodified to human observers. Potential attacks include having malicious content like malware identified as legitimate or controlling vehicle behavior. Yet, all existing adversarial example attacks require knowledge of either the model internals or its training data. We introduce the first practical demonstration of an attacker controlling a remotely hosted DNN with no such knowledge. Indeed, the only capability of our black-box adversary is to observe labels given by the DNN to chosen inputs. Our attack strategy consists in training a local model to substitute for the target DNN, using inputs synthetically generated by an adversary and labeled by the target DNN. We use the local substitute to craft adversarial examples, and find that they are misclassified by the targeted DNN. To perform a real-world and properly-blinded evaluation, we attack a DNN hosted by MetaMind, an online deep learning API. We find that their DNN misclassifies 84.24% of the adversarial examples crafted with our substitute. We demonstrate the general applicability of our strategy to many ML techniques by conducting the same attack against models hosted by Amazon and Google, using logistic regression substitutes. They yield adversarial examples misclassified by Amazon and Google at rates of 96.19% and 88.94%. We also find that this black-box attack strategy is capable of evading defense strategies previously found to make adversarial example crafting harder.
2,712 citations
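The substitute-training loop described in the abstract can be sketched end-to-end on a toy problem. Everything below is illustrative: a fixed linear classifier stands in for the remotely hosted DNN, the substitute is a softmax regression rather than a deep network, and random seed data replaces the paper's Jacobian-based dataset augmentation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "remote" target: the attacker sees only its output labels.
# A fixed linear classifier stands in for the hosted DNN.
W_target = np.array([[1.0, -2.0], [-1.0, 2.0]])

def oracle(X):
    return np.argmax(X @ W_target.T, axis=1)

# Step 1: label a synthetic seed set by querying the oracle.
X = rng.normal(size=(200, 2))
y = oracle(X)

# Step 2: train a local substitute (softmax regression) on those labels.
W = np.zeros((2, 2))
for _ in range(500):
    logits = X @ W.T
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    grad = (p - np.eye(2)[y]).T @ X / len(X)   # cross-entropy gradient
    W -= 0.5 * grad

# Step 3: craft an adversarial example on the substitute (FGSM-style)
# and check that it transfers to the remote oracle.
x = np.array([[0.5, -0.5]])            # oracle labels this point class 0
step = np.sign(W[1] - W[0])            # direction raising the class-1 score
x_adv = x + 0.8 * step
print(oracle(x), oracle(x_adv))        # the label flips on the remote model
```

The key property this demonstrates is transferability: the perturbation is computed only from the local substitute, never from the target's internals.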
••
Carnegie Mellon University; Leibniz Institute for Astrophysics Potsdam; Lawrence Berkeley National Laboratory; Sternberg Astronomical Institute; New Mexico State University; Ohio State University; University of Utah; Yale University; Autonomous University of Madrid; University of Barcelona; Harvard University; Aix-Marseille University; University of Paris; Pierre-and-Marie-Curie University; Max Planck Society; University of California, Berkeley; University of California, Irvine; University of Portsmouth; University of Cambridge; Spanish National Research Council; University of La Laguna; Institut d'Astrophysique de Paris; Princeton University; University of Edinburgh; Sejong University; Kansas State University; Pennsylvania State University; National University of La Plata; National Scientific and Technical Research Council; Ohio University; Brookhaven National Laboratory; New York University; University of St Andrews; National Autonomous University of Mexico; Open University; University of Wisconsin-Madison; Chinese Academy of Sciences; University of Pittsburgh; Case Western Reserve University
TL;DR: In this article, the authors present cosmological results from the final galaxy clustering data set of the Baryon Oscillation Spectroscopic Survey, part of the Sloan Digital Sky Survey III.
Abstract: We present cosmological results from the final galaxy clustering data set of the Baryon Oscillation Spectroscopic Survey, part of the Sloan Digital Sky Survey III. Our combined galaxy sample comprises 1.2 million massive galaxies over an effective area of 9329 deg^2 and volume of 18.7 Gpc^3, divided into three partially overlapping redshift slices centred at effective redshifts 0.38, 0.51 and 0.61. We measure the angular diameter distance D_M and Hubble parameter H from the baryon acoustic oscillation (BAO) method, in combination with a cosmic microwave background prior on the sound horizon scale, after applying reconstruction to reduce non-linear effects on the BAO feature. Using the anisotropic clustering of the pre-reconstruction density field, we measure the product D_M H from the Alcock–Paczynski (AP) effect and the growth of structure, quantified by fσ_8(z), from redshift-space distortions (RSD). We combine individual measurements presented in seven companion papers into a set of consensus values and likelihoods, obtaining constraints that are tighter and more robust than those from any one method; in particular, the AP measurement from sub-BAO scales sharpens constraints from post-reconstruction BAOs by breaking degeneracy between D_M and H. Combined with Planck 2016 cosmic microwave background measurements, our distance scale measurements simultaneously imply curvature Ω_K = 0.0003 ± 0.0026 and a dark energy equation-of-state parameter w = −1.01 ± 0.06, in strong affirmation of the spatially flat cold dark matter (CDM) model with a cosmological constant (ΛCDM). Our RSD measurements of fσ_8, at 6 per cent precision, are similarly consistent with this model. When combined with supernova Ia data, we find H_0 = 67.3 ± 1.0 km s^−1 Mpc^−1 even for our most general dark energy model, in tension with some direct measurements. Adding extra relativistic species as a degree of freedom loosens the constraint only slightly, to H_0 = 67.8 ± 1.2 km s^−1 Mpc^−1. 
Assuming flat ΛCDM, we find Ω_m = 0.310 ± 0.005 and H_0 = 67.6 ± 0.5 km s^−1 Mpc^−1, and we find a 95 per cent upper limit of 0.16 eV c^−2 on the neutrino mass sum.
2,413 citations
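The flat-ΛCDM numbers quoted above fix the background distance scale. As a sketch (not the paper's consensus-likelihood machinery), one can recover approximate comoving distances at the three effective redshifts from the quoted Ω_m = 0.310 and H_0 = 67.6 via D_M = c ∫ dz/H(z):

```python
import numpy as np

# Flat-LCDM background implied by the quoted best fit (sketch values only).
Om, H0, c = 0.310, 67.6, 299792.458      # Omega_m, km/s/Mpc, c in km/s

def E(z):
    """Dimensionless Hubble rate H(z)/H0 for flat LCDM."""
    return np.sqrt(Om * (1 + z) ** 3 + (1 - Om))

def comoving_distance(z, n=10000):
    """Trapezoidal integral D_M = (c/H0) * int_0^z dz'/E(z'), in Mpc."""
    zs = np.linspace(0.0, z, n + 1)
    f = 1.0 / E(zs)
    dz = z / n
    return (c / H0) * dz * (0.5 * f[0] + f[1:-1].sum() + 0.5 * f[-1])

for z in (0.38, 0.51, 0.61):
    print(f"z={z}: D_M ~ {comoving_distance(z):.0f} Mpc, "
          f"H(z) ~ {H0 * E(z):.1f} km/s/Mpc")
```

This reproduces the background geometry only; the paper's actual constraints come from the BAO, AP, and RSD likelihoods, not from this integral.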
••
TL;DR: TrackMate is an extensible platform where developers can easily write their own detection, particle linking, visualization or analysis algorithms within the TrackMate environment and is validated for quantitative lifetime analysis of clathrin-mediated endocytosis in plant cells.
2,356 citations
••
University of the Basque Country; University of Barcelona; Technical University of Denmark; Malmö University; University of Copenhagen; SINTEF; Aarhus University; Brown University; University of Wisconsin-Madison; University of Warwick; Carnegie Mellon University; Purdue University; Karlsruhe Institute of Technology; ETH Zurich; University of Freiburg
TL;DR: The atomic simulation environment (ASE) provides modules for performing many standard simulation tasks such as structure optimization, molecular dynamics, handling of constraints and performing nudged elastic band calculations.
Abstract: The Atomic Simulation Environment (ASE) is a software package written in the Python programming language with the aim of setting up, steering, and analyzing atomistic simulations. In ASE, tasks are fully scripted in Python. The powerful syntax of Python combined with the NumPy array library makes it possible to perform very complex simulation tasks. For example, a sequence of calculations may be performed with the use of a simple "for-loop" construction. Calculations of energy, forces, stresses and other quantities are performed through a uniform interface to many external electronic structure codes or force fields. On top of this calculator interface, ASE provides modules for performing many standard simulation tasks such as structure optimization, molecular dynamics, handling of constraints and performing nudged elastic band calculations.
2,282 citations
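The "uniform calculator interface" described above can be sketched without ASE itself. Below, two toy backends (a hypothetical harmonic spring and a Morse potential, illustrative stand-ins rather than real ASE calculators) expose the same get_potential_energy method, so the driving for-loop never changes:

```python
import math

# Dependency-free sketch of the "uniform calculator interface" idea:
# two toy backends expose the same get_potential_energy() call, so the
# driving script never needs to know which backend it is using.

class SpringCalculator:
    """Harmonic bond: E = 0.5 * k * (r - r0)^2."""
    def __init__(self, k=1.0, r0=1.0):
        self.k, self.r0 = k, r0

    def get_potential_energy(self, positions):
        r = math.dist(positions[0], positions[1])
        return 0.5 * self.k * (r - self.r0) ** 2

class MorseCalculator:
    """Morse bond: E = D * (1 - exp(-a * (r - r0)))^2."""
    def __init__(self, D=1.0, a=1.0, r0=1.0):
        self.D, self.a, self.r0 = D, a, r0

    def get_potential_energy(self, positions):
        r = math.dist(positions[0], positions[1])
        return self.D * (1.0 - math.exp(-self.a * (r - self.r0))) ** 2

def dimer(d):
    """Two atoms separated by distance d along x."""
    return [(0.0, 0.0, 0.0), (d, 0.0, 0.0)]

# The "simple for-loop" workflow from the abstract: scan a bond length
# with interchangeable calculators.
for calc in (SpringCalculator(), MorseCalculator()):
    energies = [calc.get_potential_energy(dimer(d)) for d in (0.9, 1.0, 1.1)]
    print(type(calc).__name__, [round(e, 4) for e in energies])
```

In real ASE the same pattern holds, except the calculator is attached to an Atoms object and may delegate to an external electronic-structure code.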
••
TL;DR: The entire ImageJ codebase was rewritten, engineering a redesigned plugin mechanism intended to facilitate extensibility at every level, with the goal of creating a more powerful tool that continues to serve the existing community while addressing a wider range of scientific requirements.
Abstract: ImageJ is an image analysis program extensively used in the biological sciences and beyond. Due to its ease of use, recordable macro language, and extensible plug-in architecture, ImageJ enjoys contributions from non-programmers, amateur programmers, and professional developers alike. Enabling such a diversity of contributors has resulted in a large community that spans the biological and physical sciences. However, a rapidly growing user base, diverging plugin suites, and technical limitations have revealed a clear need for a concerted software engineering effort to support emerging imaging paradigms, to ensure the software's ability to handle the requirements of modern science. Due to these new and emerging challenges in scientific imaging, ImageJ is at a critical development crossroads.
We present ImageJ2, a total redesign of ImageJ offering a host of new functionality. It separates concerns, fully decoupling the data model from the user interface. It emphasizes integration with external applications to maximize interoperability. Its robust new plugin framework allows everything from image formats, to scripting languages, to visualization to be extended by the community. The redesigned data model supports arbitrarily large, N-dimensional datasets, which are increasingly common in modern image acquisition. Despite the scope of these changes, backwards compatibility is maintained such that this new functionality can be seamlessly integrated with the classic ImageJ interface, allowing users and developers to migrate to these new methods at their own pace. ImageJ2 provides a framework engineered for flexibility, intended to support these requirements as well as accommodate future needs.
2,156 citations
••
Case Western Reserve University; University of Wisconsin-Madison; Imperial College London; South Dakota School of Mines and Technology; University of Maryland, College Park; University of California, Berkeley; Lawrence Livermore National Laboratory; University of Coimbra; University of South Dakota; Yale University; University of California, Santa Barbara; Brown University; University of California, Davis; Lawrence Berkeley National Laboratory; University College London; University of Rochester; SLAC National Accelerator Laboratory; Texas A&M University; State University of New York System; University of Edinburgh
TL;DR: This search yields no evidence of WIMP nuclear recoils; constraints on spin-independent weakly interacting massive particle (WIMP)-nucleon scattering are reported using a 3.35×10^{4} kg day exposure of the Large Underground Xenon experiment.
Abstract: We report constraints on spin-independent weakly interacting massive particle (WIMP)-nucleon scattering using a 3.35×10^{4} kg day exposure of the Large Underground Xenon (LUX) experiment. A dual-phase xenon time projection chamber with 250 kg of active mass is operated at the Sanford Underground Research Facility under Lead, South Dakota (USA). With roughly fourfold improvement in sensitivity for high WIMP masses relative to our previous results, this search yields no evidence of WIMP nuclear recoils. At a WIMP mass of 50 GeV c^{-2}, WIMP-nucleon spin-independent cross sections above 2.2×10^{-46} cm^{2} are excluded at the 90% confidence level. When combined with the previously reported LUX exposure, this exclusion strengthens to 1.1×10^{-46} cm^{2} at 50 GeV c^{-2}.
1,844 citations
••
TL;DR: The Trainable Weka Segmentation (TWS), a machine learning tool that leverages a limited number of manual annotations in order to train a classifier and segment the remaining data automatically, is introduced.
Abstract: Summary: State-of-the-art light and electron microscopes are capable of acquiring large image datasets, but quantitatively evaluating the data often involves manually annotating structures of interest. This process is time-consuming and often a major bottleneck in the evaluation pipeline. To overcome this problem, we have introduced the Trainable Weka Segmentation (TWS), a machine learning tool that leverages a limited number of manual annotations in order to train a classifier and segment the remaining data automatically. In addition, TWS can provide unsupervised segmentation learning schemes (clustering) and can be customized to employ user-designed image features or classifiers. Availability and implementation: TWS is distributed as open-source software as part of the Fiji image processing distribution of ImageJ at http://imagej.net/Trainable_Weka_Segmentation. Contact: ignacio.arganda@ehu.eus. Supplementary information: Supplementary data are available at Bioinformatics online.
1,416 citations
••
Michael R. Blanton, Matthew A. Bershady, Bela Abolfathi, Franco D. Albareti +412 more • Institutions (91)
TL;DR: SDSS-IV as mentioned in this paper is a project encompassing three major spectroscopic programs: the Apache Point Observatory Galactic Evolution Experiment 2 (APOGEE-2), the Mapping Nearby Galaxies at Apache Point Observatory (MaNGA) survey, and the extended Baryon Oscillation Spectroscopic Survey (eBOSS), which includes the Time Domain Spectroscopic Survey (TDSS) subprogram.
Abstract: We describe the Sloan Digital Sky Survey IV (SDSS-IV), a project encompassing three major spectroscopic programs. The Apache Point Observatory Galactic Evolution Experiment 2 (APOGEE-2) is observing hundreds of thousands of Milky Way stars at high resolution and high signal-to-noise ratios in the near-infrared. The Mapping Nearby Galaxies at Apache Point Observatory (MaNGA) survey is obtaining spatially resolved spectroscopy for thousands of nearby galaxies (median $z\sim 0.03$). The extended Baryon Oscillation Spectroscopic Survey (eBOSS) is mapping the galaxy, quasar, and neutral gas distributions between $z\sim 0.6$ and 3.5 to constrain cosmology using baryon acoustic oscillations, redshift space distortions, and the shape of the power spectrum. Within eBOSS, we are conducting two major subprograms: the SPectroscopic IDentification of eROSITA Sources (SPIDERS), investigating X-ray AGNs and galaxies in X-ray clusters, and the Time Domain Spectroscopic Survey (TDSS), obtaining spectra of variable sources. All programs use the 2.5 m Sloan Foundation Telescope at the Apache Point Observatory; observations there began in Summer 2014. APOGEE-2 also operates a second near-infrared spectrograph at the 2.5 m du Pont Telescope at Las Campanas Observatory, with observations beginning in early 2017. Observations at both facilities are scheduled to continue through 2020. In keeping with previous SDSS policy, SDSS-IV provides regularly scheduled public data releases; the first one, Data Release 13, was made available in 2016 July.
1,200 citations
••
University of Virginia; Liverpool John Moores University; Texas Christian University; Spanish National Research Council; University of La Laguna; Johns Hopkins University; New Mexico State University; Sternberg Astronomical Institute; University of Arizona; Ohio State University; Pennsylvania State University; University of Wisconsin-Madison; Eötvös Loránd University; University of Toronto; University of Michigan; University of Texas at Austin; Leibniz Institute for Astrophysics Potsdam; Yale University; University of Colorado Boulder; New York University; Princeton University; University of Utah; Goddard Space Flight Center; Aarhus University; University of Birmingham; Harvard University; Space Telescope Science Institute; Computer Sciences Corporation; Paris Diderot University; INAF; Max Planck Society; Space Science Institute; Pierre-and-Marie-Curie University; University of Franche-Comté; Federal University of Rio de Janeiro; University of Nice Sophia Antipolis
TL;DR: No abstract was extracted for this entry; only the funding acknowledgements are available.
Funding: National Science Foundation [AST-1109178, AST-1616636]; Gemini Observatory; Spanish Ministry of Economy and Competitiveness [AYA-2011-27754]; NASA [NNX12AE17G]; Hungarian Academy of Sciences; Hungarian NKFI of the Hungarian National Research, Development and Innovation Office [K-119517]; Alfred P. Sloan Foundation; National Science Foundation; U.S. Department of Energy Office of Science
1,193 citations
••
TL;DR: The gender difference in depression represents a health disparity, especially in adolescence, yet the magnitude of the difference indicates that depression in men should not be overlooked. Cross-national analyses indicated that larger gender differences were found in nations with greater gender equity for major depression, but not for depression symptoms.
Abstract: In 2 meta-analyses on gender differences in depression in nationally representative samples, we advance previous work by including studies of depression diagnoses and symptoms to (a) estimate the magnitude of the gender difference in depression across a wide array of nations and ages; (b) use a developmental perspective to elucidate patterns of gender differences across the life span; and (c) incorporate additional theory-driven moderators (e.g., gender equity). For major depression diagnoses and depression symptoms, respectively, we meta-analyzed data from 65 and 95 articles and their corresponding national data sets, representing data from 1,716,195 and 1,922,064 people in over 90 different nations. Overall, odds ratio (OR) = 1.95, 95% confidence interval (CI) [1.88, 2.03], and d = 0.27 [0.26, 0.29]. Age was the strongest predictor of effect size. The gender difference for diagnoses emerged earlier than previously thought, with OR = 2.37 at age 12. For both meta-analyses, the gender difference peaked in adolescence (OR = 3.02 for ages 13-15, and d = 0.47 for age 16) but then declined and remained stable in adulthood. Cross-national analyses indicated that larger gender differences were found in nations with greater gender equity, for major depression, but not depression symptoms.
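As a hedged numerical aside (not part of the paper's method): odds ratios and standardized mean differences can be related through the standard logistic-distribution conversion d = ln(OR)·√3/π. The ORs (diagnoses) and d values (symptoms) above come from separate meta-analyses, so the converted values need not agree with the reported d:

```python
import math

def odds_ratio_to_d(odds_ratio):
    """Logistic-distribution conversion: d = ln(OR) * sqrt(3) / pi."""
    return math.log(odds_ratio) * math.sqrt(3.0) / math.pi

# Reported odds ratios for major depression diagnoses, converted to a
# d-scale for rough comparison (illustrative only).
for label, oratio in [("overall", 1.95), ("age 12", 2.37), ("ages 13-15", 3.02)]:
    print(f"{label}: OR = {oratio} -> d ~ {odds_ratio_to_d(oratio):.2f}")
```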
••
TL;DR: The gut microbiome of AD participants has decreased microbial diversity and is compositionally distinct from control age- and sex-matched individuals, which adds AD to the growing list of diseases associated with gut microbial alterations, as well as suggest that gut bacterial communities may be a target for therapeutic intervention.
Abstract: Alzheimer’s disease (AD) is the most common form of dementia. However, the etiopathogenesis of this devastating disease is not fully understood. Recent studies in rodents suggest that alterations in the gut microbiome may contribute to amyloid deposition, yet the microbial communities associated with AD have not been characterized in humans. Towards this end, we characterized the bacterial taxonomic composition of fecal samples from participants with and without a diagnosis of dementia due to AD. Our analyses revealed that the gut microbiome of AD participants has decreased microbial diversity and is compositionally distinct from control age- and sex-matched individuals. We identified phylum- through genus-wide differences in bacterial abundance including decreased Firmicutes, increased Bacteroidetes, and decreased Bifidobacterium in the microbiome of AD participants. Furthermore, we observed correlations between levels of differentially abundant genera and cerebrospinal fluid (CSF) biomarkers of AD. These findings add AD to the growing list of diseases associated with gut microbial alterations, as well as suggest that gut bacterial communities may be a target for therapeutic intervention.
••
TL;DR: This Review summarizes key findings and issues arising from a decade of research into the neurocognitive and neurocomputational underpinnings of semantic cognition, leading to a new framework termed controlled semantic cognition (CSC).
Abstract: Semantic cognition refers to our ability to use, manipulate and generalize knowledge that is acquired over the lifespan to support innumerable verbal and non-verbal behaviours. This Review summarizes key findings and issues arising from a decade of research into the neurocognitive and neurocomputational underpinnings of this ability, leading to a new framework that we term controlled semantic cognition (CSC). CSC offers solutions to long-standing queries in philosophy and cognitive science, and yields a convergent framework for understanding the neural and computational bases of healthy semantic cognition and its dysfunction in brain disorders.
••
Pierre-and-Marie-Curie University; Nest Labs; University of Leeds; SLAC National Accelerator Laboratory; University of Wisconsin-Madison; Lancaster University; Helmholtz-Zentrum Dresden-Rossendorf; University of Liverpool; Centro de Investigaciones en Optica; University of Glasgow; Imperial College London; University of Tokyo; University of Marburg; Yale University; University of Regensburg; University at Buffalo; University of California, Los Angeles; University of Western Australia; Syracuse University; Jet Propulsion Laboratory; California Institute of Technology; Goethe University Frankfurt; University College London; University of Duisburg-Essen; National Physical Laboratory; University of Oxford
TL;DR: The 2017 roadmap on terahertz-frequency electromagnetic radiation (100 GHz-30 THz) as discussed in this paper presents a snapshot of the state of THz science and technology in 2017 and offers an opinion on the challenges and opportunities that the future holds.
Abstract: Science and technologies based on terahertz frequency electromagnetic radiation (100 GHz–30 THz) have developed rapidly over the last 30 years. For most of the 20th Century, terahertz radiation, then referred to as sub-millimeter wave or far-infrared radiation, was mainly utilized by astronomers and some spectroscopists. Following the development of laser based terahertz time-domain spectroscopy in the 1980s and 1990s the field of THz science and technology expanded rapidly, to the extent that it now touches many areas from fundamental science to 'real world' applications. For example THz radiation is being used to optimize materials for new solar cells, and may also be a key technology for the next generation of airport security scanners. While the field was emerging it was possible to keep track of all new developments, however now the field has grown so much that it is increasingly difficult to follow the diverse range of new discoveries and applications that are appearing. At this point in time, when the field of THz science and technology is moving from an emerging to a more established and interdisciplinary field, it is apt to present a roadmap to help identify the breadth and future directions of the field. The aim of this roadmap is to present a snapshot of the present state of THz science and technology in 2017, and provide an opinion on the challenges and opportunities that the future holds. To be able to achieve this aim, we have invited a group of international experts to write 18 sections that cover most of the key areas of THz science and technology. We hope that The 2017 Roadmap on THz science and technology will prove to be a useful resource by providing a wide ranging introduction to the capabilities of THz radiation for those outside or just entering the field as well as providing perspective and breadth for those who are well established. We also feel that this review should serve as a useful guide for government and funding agencies.
••
TL;DR: The thoroughly updated antiSMASH version 4 is presented, which adds several novel features, including prediction of gene cluster boundaries using the ClusterFinder method or the newly integrated CASSIS algorithm, improved substrate specificity prediction for non-ribosomal peptide synthetase adenylation domains based on the new SANDPUMA algorithm, and several usability features have been updated and improved.
Abstract: Many antibiotics, chemotherapeutics, crop protection agents and food preservatives originate from molecules produced by bacteria, fungi or plants. In recent years, genome mining methodologies have been widely adopted to identify and characterize the biosynthetic gene clusters encoding the production of such compounds. Since 2011, the 'antibiotics and secondary metabolite analysis shell, antiSMASH' has assisted researchers in efficiently performing such analyses, both as a web server and a standalone tool. Here, we present the thoroughly updated antiSMASH version 4, which adds several novel features, including prediction of gene cluster boundaries using the ClusterFinder method or the newly integrated CASSIS algorithm, improved substrate specificity prediction for non-ribosomal peptide synthetase adenylation domains based on the new SANDPUMA algorithm, improved predictions for terpene and ribosomally synthesized and post-translationally modified peptides cluster products, reporting of sequence similarity to proteins encoded in experimentally characterized gene clusters on a per-protein basis and a domain-level alignment tool for comparative analysis of trans-AT polyketide synthase assembly line architectures. Additionally, several usability features have been updated and improved. Together, these improvements make antiSMASH up-to-date with the latest developments in natural product research and will further facilitate computational genome mining for the discovery of novel bioactive molecules.
••
Johns Hopkins University; Seattle Cancer Care Alliance; University of Colorado Boulder; University of Utah; Fox Chase Cancer Center; Brigham and Women's Hospital; Duke University; Northwestern University; University of South Florida; University of Alabama at Birmingham; Washington University in St. Louis; University of California, San Francisco; Roswell Park Cancer Institute; Vanderbilt University; University of Texas MD Anderson Cancer Center; Harvard University; University of Wisconsin-Madison; Yale Cancer Center; University of Michigan; Stanford University; Ohio State University; City of Hope National Medical Center; Memorial Sloan Kettering Cancer Center; Mayo Clinic; Case Western Reserve University; University of Tennessee System
TL;DR: This selection from the NCCN Guidelines for Non-Small Cell Lung Cancer (NSCLC) focuses on targeted therapies and immunotherapies for metastatic NSCLC, because therapeutic recommendations are rapidly changing for metastatic disease.
Abstract: This selection from the NCCN Guidelines for Non-Small Cell Lung Cancer (NSCLC) focuses on targeted therapies and immunotherapies for metastatic NSCLC, because therapeutic recommendations are rapidly changing for metastatic disease. For example, new recommendations were added for atezolizumab, ceritinib, osimertinib, and pembrolizumab for the 2017 updates.
••
TL;DR: The proposed ODIN method is based on the observation that temperature scaling and small input perturbations can separate the softmax score distributions of in- and out-of-distribution images, allowing more effective detection; it consistently outperforms the baseline approach by a large margin.
Abstract: We consider the problem of detecting out-of-distribution images in neural networks. We propose ODIN, a simple and effective method that does not require any change to a pre-trained neural network. Our method is based on the observation that using temperature scaling and adding small perturbations to the input can separate the softmax score distributions between in- and out-of-distribution images, allowing for more effective detection. We show in a series of experiments that ODIN is compatible with diverse network architectures and datasets. It consistently outperforms the baseline approach by a large margin, establishing a new state-of-the-art performance on this task. For example, ODIN reduces the false positive rate from the baseline 34.7% to 4.3% on the DenseNet (applied to CIFAR-10) when the true positive rate is 95%.
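The core scoring rule is compact. As a minimal sketch (logits-only, omitting the input-perturbation step and any real network), ODIN's temperature-scaled max-softmax score looks like:

```python
import numpy as np

def odin_score(logits, T=1000.0):
    """Temperature-scaled max-softmax score (the core of ODIN).

    The full method also perturbs the input along the gradient of the
    scaled softmax; this logits-only sketch omits that step.
    """
    z = np.asarray(logits, dtype=float) / T
    z -= z.max()                      # numerical stability
    p = np.exp(z)
    p /= p.sum()
    return float(p.max())

# A confidently classified input scores higher than one with a flat
# (out-of-distribution-like) softmax; thresholding this score yields
# the detector.
in_dist = odin_score([12.0, 1.0, 0.5])
ood = odin_score([4.1, 4.0, 3.9])
print(in_dist > ood)   # True
```

The detector then declares an input out-of-distribution whenever its score falls below a threshold chosen on validation data (e.g. at 95% true positive rate).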
••
University of California, Los Angeles; University of Utah; University of South Florida; University of Helsinki; Primary Children's Hospital; University of Groningen; Norfolk and Norwich University Hospital; Lund University; Netherlands Cancer Institute; University of Michigan; Wake Forest University; Ohio State University; Peter MacCallum Cancer Centre; University of Zurich; University of Padua; Pennsylvania State University; Saint Louis University; Tom Baker Cancer Centre; University of Washington; University of Lausanne; Guy's and St Thomas' NHS Foundation Trust; University of Kiel; Thomas Jefferson University; Sunnybrook Research Institute; Vanderbilt University; University of Queensland; Fox Chase Cancer Center; Greenville Health System; Stony Brook University; University Health Network; Memorial Sloan Kettering Cancer Center; Roswell Park Cancer Institute; Northwestern University; University of Wisconsin-Madison; Rush University Medical Center; Tel Aviv Sourasky Medical Center; Dartmouth College; Johns Hopkins University; University of Louisville; University of Barcelona; University of Sydney
TL;DR: Immediate completion lymph‐node dissection increased the rate of regional disease control and provided prognostic information but did not increase melanoma‐specific survival among patients with melanoma and sentinel‐node metastases.
Abstract: Background
Sentinel-lymph-node biopsy is associated with increased melanoma-specific survival (i.e., survival until death from melanoma) among patients with node-positive intermediate-thickness melanomas (1.2 to 3.5 mm). The value of completion lymph-node dissection for patients with sentinel-node metastases is not clear.
Methods
In an international trial, we randomly assigned patients with sentinel-node metastases detected by means of standard pathological assessment or a multimarker molecular assay to immediate completion lymph-node dissection (dissection group) or nodal observation with ultrasonography (observation group). The primary end point was melanoma-specific survival. Secondary end points included disease-free survival and the cumulative rate of nonsentinel-node metastasis.
Results
Immediate completion lymph-node dissection was not associated with increased melanoma-specific survival among 1934 patients with data that could be evaluated in an intention-to-treat analysis or among 1755 patients in t...
••
TL;DR: This paper showed that for typical psychological and psycholinguistic data, higher power is achieved without inflating Type I error rate if a model selection criterion is used to select a random effect structure that is supported by the data.
••
Federal University of Ceará1, Deakin University2, University of Toronto3, Universidade Federal do Rio Grande do Sul4, South London and Maudsley NHS Foundation Trust5, King's College London6, University of Padua7, Sunnybrook Research Institute8, University of Wisconsin-Madison9, Georgia Regents University10
TL;DR: A systematic review and meta‐analysis of studies that measured cytokine and chemokine levels in individuals with major depressive disorder (MDD) compared to healthy controls (HCs) is conducted.
Abstract: Objective
To conduct a systematic review and meta-analysis of studies that measured cytokine and chemokine levels in individuals with major depressive disorder (MDD) compared to healthy controls (HCs).
Method
The PubMed/MEDLINE, EMBASE, and PsycINFO databases were searched up until May 30, 2016. Effect sizes were estimated with random-effects models.
Results
Eighty-two studies comprising 3212 participants with MDD and 2798 HCs met inclusion criteria. Peripheral levels of interleukin-6 (IL-6), tumor necrosis factor (TNF)-alpha, IL-10, the soluble IL-2 receptor, C-C chemokine ligand 2, IL-13, IL-18, IL-12, the IL-1 receptor antagonist, and the soluble TNF receptor 2 were elevated in patients with MDD compared to HCs, whereas interferon-gamma levels were lower in MDD (Hedge's g = −0.477, P = 0.043). Levels of IL-1β, IL-2, IL-4, IL-8, the soluble IL-6 receptor (sIL-6R), IL-5, CCL-3, IL-17, and transforming growth factor-beta 1 were not significantly altered in individuals with MDD compared to HCs. Heterogeneity was large (I2: 51.6–97.7%), and sources of heterogeneity were explored (e.g., age, smoking status, and body mass index).
Conclusion
Our results further characterize a cytokine/chemokine profile associated with MDD. Future studies are warranted to further elucidate sources of heterogeneity, as well as biosignature cytokines secreted by other immune cells.
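Pooled estimates of the kind reported above (random-effects models with I² heterogeneity) are commonly computed with the DerSimonian-Laird estimator. This is a minimal sketch using made-up effect sizes and variances, not the study's data:

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Random-effects pooled effect via the DerSimonian-Laird estimator.

    effects   : per-study effect sizes (e.g. Hedges' g)
    variances : per-study sampling variances
    Returns (pooled effect, standard error, tau^2, I^2 in percent).
    """
    y = np.asarray(effects, dtype=float)
    v = np.asarray(variances, dtype=float)
    w = 1.0 / v                                   # inverse-variance weights
    theta_fe = np.sum(w * y) / np.sum(w)          # fixed-effect pooled estimate
    q = np.sum(w * (y - theta_fe) ** 2)           # Cochran's Q
    df = len(y) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                 # between-study variance
    w_re = 1.0 / (v + tau2)                       # random-effects weights
    theta_re = np.sum(w_re * y) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    i2 = 100.0 * max(0.0, (q - df) / q) if q > 0 else 0.0  # heterogeneity, %
    return theta_re, se, tau2, i2
```

Large I² values like the 51.6-97.7% reported above indicate that most of the observed variance reflects true between-study differences rather than sampling error, which motivates the moderator analyses (age, smoking, BMI) mentioned in the abstract.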
••
Good Samaritan Hospital1, Hospital Universitari Arnau de Vilanova2, University of Wisconsin-Madison3, Ohio State University4, Brigham and Women's Hospital5, University of California, San Diego6, Instituto Politécnico Nacional7, Cleveland Clinic Lerner College of Medicine8, University of Pennsylvania9, Johns Hopkins University School of Medicine10, Harvard University11, Mayo Clinic12
TL;DR: Evidence supports a causal association of sleep apnea with the incidence and morbidity of hypertension, coronary heart disease, arrhythmia, heart failure, and stroke; research that has addressed the effect of sleep apnea treatment on cardiovascular disease and clinical endpoints is also reviewed.
••
University of California, San Francisco1, Moffitt Cancer Center2, University of Michigan3, Mayo Clinic4, Roswell Park Cancer Institute5, University of Tennessee Health Science Center6, Northwestern University7, Washington University in St. Louis8, Vanderbilt University9, Yale Cancer Center10, Seattle Cancer Care Alliance11, City of Hope National Medical Center12, Duke University13, Ohio State University14, Fox Chase Cancer Center15, Harvard University16, Case Western Reserve University17, University of Texas MD Anderson Cancer Center18, Stanford University19, University of Wisconsin-Madison20, University of California, San Diego21, Pancreatic Cancer Action Network22, Memorial Sloan Kettering Cancer Center23, University of Alabama at Birmingham24, University of Utah25, University of Colorado Boulder26, Dana-Farber Cancer Institute27
TL;DR: The NCCN Guidelines for Pancreatic Adenocarcinoma focus on diagnosis and treatment with systemic therapy, radiation therapy, and surgical resection, as well as on management of locally advanced unresectable and metastatic disease.
Abstract: Ductal adenocarcinoma and its variants account for most pancreatic malignancies. High-quality multiphase imaging can help to preoperatively distinguish between patients eligible for resection with curative intent and those with unresectable disease. Systemic therapy is used in the neoadjuvant or adjuvant pancreatic cancer setting, as well as in the management of locally advanced unresectable and metastatic disease. Clinical trials are critical for making progress in treatment of pancreatic cancer. The NCCN Guidelines for Pancreatic Adenocarcinoma focus on diagnosis and treatment with systemic therapy, radiation therapy, and surgical resection.
••
University of Manchester1, Imperial College London2, Central Manchester University Hospitals NHS Foundation Trust3, Harvard University4, Ford Motor Company5, King's College London6, University Medical Center Groningen7, University of Cambridge8, University of Oxford9, The Royal Marsden NHS Foundation Trust10, University of Leeds11, University of Michigan12, European Organisation for Research and Treatment of Cancer13, Institute of Cancer Research14, University College London15, United States Military Academy16, VU University Amsterdam17, University of Wisconsin-Madison18, Maastricht University19, Institut Gustave Roussy20, Robarts Research Institute21, Memorial Sloan Kettering Cancer Center22, Newcastle University23, University of Leicester24, Mount Vernon Hospital25, Hofstra University26, Johns Hopkins University27, University of Birmingham28, University of Antwerp29, Duke University30, Brighton and Sussex Medical School31, University of Sheffield32, University of Texas at Austin33
TL;DR: Experts assembled to review, debate and summarize the challenges of IB validation and qualification produced 14 key recommendations for accelerating the clinical translation of IBs, which highlight the role of parallel (rather than sequential) tracks of technical validation, biological/clinical validation and assessment of cost-effectiveness.
Abstract: Imaging biomarkers (IBs) are integral to the routine management of patients with cancer. IBs used daily in oncology include clinical TNM stage, objective response and left ventricular ejection fraction. Other CT, MRI, PET and ultrasonography biomarkers are used extensively in cancer research and drug development. New IBs need to be established either as useful tools for testing research hypotheses in clinical trials and research studies, or as clinical decision-making tools for use in healthcare, by crossing 'translational gaps' through validation and qualification. Important differences exist between IBs and biospecimen-derived biomarkers and, therefore, the development of IBs requires a tailored 'roadmap'. Recognizing this need, Cancer Research UK (CRUK) and the European Organisation for Research and Treatment of Cancer (EORTC) assembled experts to review, debate and summarize the challenges of IB validation and qualification. This consensus group has produced 14 key recommendations for accelerating the clinical translation of IBs, which highlight the role of parallel (rather than sequential) tracks of technical (assay) validation, biological/clinical validation and assessment of cost-effectiveness; the need for IB standardization and accreditation systems; the need to continually revisit IB precision; an alternative framework for biological/clinical validation of IBs; and the essential requirements for multicentre studies to qualify IBs for clinical use.
••
TL;DR: This evidence-based guideline was developed using the Grading of Recommendations, Assessment, Development, and Evaluation approach to describe the strength of recommendations and the quality of evidence on pediatric obesity.
Abstract: Cosponsoring associations
The European Society of Endocrinology and the Pediatric Endocrine Society. This guideline was funded by the Endocrine Society.
Objective
To formulate clinical practice guidelines for the assessment, treatment, and prevention of pediatric obesity.
Participants
The participants include an Endocrine Society-appointed Task Force of 6 experts, a methodologist, and a medical writer.
Evidence
This evidence-based guideline was developed using the Grading of Recommendations, Assessment, Development, and Evaluation approach to describe the strength of recommendations and the quality of evidence. The Task Force commissioned 2 systematic reviews and used the best available evidence from other published systematic reviews and individual studies.
Consensus process
One group meeting, several conference calls, and e-mail communications enabled consensus. Endocrine Society committees and members and co-sponsoring organizations reviewed and commented on preliminary drafts of this guideline.
Conclusion
Pediatric obesity remains an ongoing serious international health concern affecting ∼17% of US children and adolescents, threatening their adult health and longevity. Pediatric obesity has its basis in genetic susceptibilities influenced by a permissive environment starting in utero and extending through childhood and adolescence. Endocrine etiologies for obesity are rare and usually are accompanied by attenuated growth patterns. Pediatric comorbidities are common and long-term health complications often result; screening for comorbidities of obesity should be applied in a hierarchal, logical manner for early identification before more serious complications result. Genetic screening for rare syndromes is indicated only in the presence of specific historical or physical features. The psychological toll of pediatric obesity on the individual and family necessitates screening for mental health issues and counseling as indicated.
The prevention of pediatric obesity by promoting healthful diet, activity, and environment should be a primary goal, as achieving effective, long-lasting results with lifestyle modification once obesity occurs is difficult. Although some behavioral and pharmacotherapy studies report modest success, additional research into accessible and effective methods for preventing and treating pediatric obesity is needed. The use of weight loss medications during childhood and adolescence should be restricted to clinical trials. Increasing evidence demonstrates the effectiveness of bariatric surgery in the most seriously affected mature teenagers who have failed lifestyle modification, but the use of surgery requires experienced teams with resources for long-term follow-up. Adolescents undergoing lifestyle therapy, medication regimens, or bariatric surgery for obesity will need cohesive planning to help them effectively transition to adult care, with continued necessary monitoring, support, and intervention. Transition programs for obesity are an uncharted area requiring further research for efficacy. Despite a significant increase in research on pediatric obesity since the initial publication of these guidelines 8 years ago, further study is needed of the genetic and biological factors that increase the risk of weight gain and influence the response to therapeutic interventions. Also needed are more studies to better understand the genetic and biological factors that cause an obese individual to manifest one comorbidity vs another or to be free of comorbidities. Furthermore, continued investigation into the most effective methods of preventing and treating obesity and into methods for changing environmental and economic factors that will lead to worldwide cultural changes in diet and activity should be priorities. 
Particular attention to determining ways to effect systemic changes in food environments and total daily mobility, as well as methods for sustaining healthy body mass index changes, is of importance.
••
25 May 2017
TL;DR: Z.E. Sikorski, J. Pokorny, and S. Damodaran discuss the physical and chemical interactions of components in food systems, one chapter of a comprehensive food chemistry reference.
Abstract: Introduction to Food Chemistry, O.R. Fennema, S. Damodaran, and K.L. Parkin Major Food Components Water and Ice, D.S. Reid and O.R. Fennema Carbohydrates, J.N. BeMiller and K.L. Huber Lipids, D.J. McClements and E.A. Decker Amino Acids, Peptides, and Proteins, S. Damodaran Enzymes, K.L. Parkin Minor Food Components Vitamins, J.F. Gregory III Minerals, D.D. Miller Colorants, S.J. Schwartz, J.H. von Elbe, and M.M. Giusti Flavors, R.C. Lindsay Food Additives, R.C. Lindsay Bioactive Substances: Nutraceuticals and Toxicants, C.T. Ho, M.M. Rafi, and G. Ghai Food Systems Dispersed Systems: Basic Considerations, P. Walstra and T. van Vliet Physical and Chemical Interactions of Components in Food Systems, Z.E. Sikorski, J. Pokorny, and S. Damodaran Characteristics of Milk, H.E. Swaisgood Postmortem Physiology of Edible Muscle Tissues, G. Strasburg, Y.L. Xiong, and W. Chiang Postharvest Physiology of Edible Plant Tissues, J.K. Brecht, M.A. Ritenour, N.F. Haard, and G.W. Chism Impact of Biotechnology on Food Supply and Quality, M. Newell-McGloughlin Appendices Appendix A: International System of Units (SI), The Modernized Metric System Appendix B: Conversion Factors (Non-SI Units to SI Units) Appendix C: Greek Alphabet Appendix D: Calculating Relative Polarities of Compounds Using Fragmental Constant Approach to Predict Log P Values. Index
••
TL;DR: Three new genome-wide significant nonsynonymous variants associated with Alzheimer's disease are observed, providing additional evidence that the microglia-mediated innate immune response contributes directly to the development of Alzheimer's Disease.
Abstract: We identified rare coding variants associated with Alzheimer's disease in a three-stage case–control study of 85,133 subjects. In stage 1, we genotyped 34,174 samples using a whole-exome microarray. In stage 2, we tested associated variants (P < 1 × 10⁻⁴) in 35,962 independent samples using de novo genotyping and imputed genotypes. In stage 3, we used an additional 14,997 samples to test the most significant stage 2 associations (P < 5 × 10⁻⁸) using imputed genotypes. We observed three new genome-wide significant nonsynonymous variants associated with Alzheimer's disease: a protective variant in PLCG2 (rs72824905: p.Pro522Arg, P = 5.38 × 10⁻¹⁰, odds ratio (OR) = 0.68, minor allele frequency (MAF) in cases = 0.0059, MAF in controls = 0.0093), a risk variant in ABI3 (rs616338: p.Ser209Phe, P = 4.56 × 10⁻¹⁰, OR = 1.43, MAF in cases = 0.011, MAF in controls = 0.008), and a new genome-wide significant variant in TREM2 (rs143332484: p.Arg62His, P = 1.55 × 10⁻¹⁴, OR = 1.67, MAF in cases = 0.0143, MAF in controls = 0.0089), a known susceptibility gene for Alzheimer's disease. These protein-altering changes are in genes highly expressed in microglia and highlight an immune-related protein–protein interaction network enriched for previously identified risk genes in Alzheimer's disease. These genetic findings provide additional evidence that the microglia-mediated innate immune response contributes directly to the development of Alzheimer's disease.
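As a rough check on figures like these, a crude allele-count odds ratio can be derived from the reported minor-allele frequencies; the published ORs are covariate-adjusted, so this illustrative helper (not from the paper) only approximates them:

```python
def minor_allele_or(maf_cases, maf_controls):
    """Crude allele-count odds ratio from minor-allele frequencies.

    Equivalent to the 2x2 allele-table OR (minor vs. major allele,
    cases vs. controls); cohort sizes cancel out when working with
    frequencies, so they are not needed here.
    """
    odds_cases = maf_cases / (1.0 - maf_cases)
    odds_controls = maf_controls / (1.0 - maf_controls)
    return odds_cases / odds_controls
```

With the PLCG2 frequencies above (0.0059 in cases vs. 0.0093 in controls) this gives roughly 0.63, in the same protective direction as the reported adjusted OR of 0.68; the ABI3 frequencies give a crude OR above 1, matching its reported risk direction.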
••
TL;DR: Six mycotoxins are regularly found in food, posing unpredictable and ongoing food safety problems worldwide; the toxicity of the six, the foods commonly contaminated by one or more of them, and current methods for their detection and analysis are summarized.
Abstract: Mycotoxins are toxic secondary metabolites produced by certain filamentous fungi (molds). These low molecular weight compounds (usually less than 1000 Daltons) are naturally occurring and practically unavoidable. They can enter our food chain either directly from plant-based food components contaminated with mycotoxins or by indirect contamination from the growth of toxigenic fungi on food. Mycotoxins can accumulate in maturing corn, cereals, soybeans, sorghum, peanuts, and other food and feed crops in the field and in grain during transportation. Consumption of mycotoxin-contaminated food or feed can cause acute or chronic toxicity in human and animals. In addition to concerns over adverse effects from direct consumption of mycotoxin-contaminated foods and feeds, there is also public health concern over the potential ingestion of animal-derived food products, such as meat, milk, or eggs, containing residues or metabolites of mycotoxins. Members of three fungal genera, Aspergillus, Fusarium, and Penicillium, are the major mycotoxin producers. While over 300 mycotoxins have been identified, six (aflatoxins, trichothecenes, zearalenone, fumonisins, ochratoxins, and patulin) are regularly found in food, posing unpredictable and ongoing food safety problems worldwide. This review summarizes the toxicity of the six mycotoxins, foods commonly contaminated by one or more of them, and the current methods for detection and analysis of these mycotoxins.
••
TL;DR: A fully-fledged particle-flow reconstruction algorithm tuned to the CMS detector was developed and has been consistently used in physics analyses for the first time at a hadron collider as mentioned in this paper.
Abstract: The CMS apparatus was identified, a few years before the start of the LHC operation at CERN, to feature properties well suited to particle-flow (PF) reconstruction: a highly-segmented tracker, a fine-grained electromagnetic calorimeter, a hermetic hadron calorimeter, a strong magnetic field, and an excellent muon spectrometer. A fully-fledged PF reconstruction algorithm tuned to the CMS detector was therefore developed and has been consistently used in physics analyses for the first time at a hadron collider. For each collision, the comprehensive list of final-state particles identified and reconstructed by the algorithm provides a global event description that leads to unprecedented CMS performance for jet and hadronic τ decay reconstruction, missing transverse momentum determination, and electron and muon identification. This approach also allows particles from pileup interactions to be identified and enables efficient pileup mitigation methods. The data collected by CMS at a centre-of-mass energy of 8 TeV show excellent agreement with the simulation and confirm the superior PF performance at least up to an average of 20 pileup interactions.
••
TL;DR: This guideline aims to promote knowledge and education in the preoperative, intraoperative and postoperative setting, not only among anaesthesiologists but also among all other healthcare professionals involved in the care of surgical patients.
Abstract: The purpose of this guideline is to present evidence-based and consensus-based recommendations for the prevention and treatment of postoperative delirium. The cornerstones of the guideline are the preoperative identification and handling of patients at risk, adequate intraoperative care, postoperative detection of delirium and management of delirious patients. The scope of this guideline does not cover ICU delirium. Considering that many medical disciplines are involved in the treatment of surgical patients, a team-based approach should be implemented into daily practice. This guideline aims to promote knowledge and education in the preoperative, intraoperative and postoperative setting not only among anaesthesiologists but also among all other healthcare professionals involved in the care of surgical patients.