
Showing papers by "University of Trento" published in 2016


Journal ArticleDOI
Daniel J. Klionsky1, Kotb Abdelmohsen2, Akihisa Abe3, Joynal Abedin4 +2519 more · Institutions (695)
TL;DR: In this paper, the authors present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes.
Abstract: In 2008 we published the first set of guidelines for standardizing research in autophagy. Since then, research on this topic has continued to accelerate, and many new scientists have entered the field. Our knowledge base and relevant new technologies have also been expanding. Accordingly, it is important to update these guidelines for monitoring autophagy in different organisms. Various reviews have described the range of assays that have been used for this purpose. Nevertheless, there continues to be confusion regarding acceptable methods to measure autophagy, especially in multicellular eukaryotes. For example, a key point that needs to be emphasized is that there is a difference between measurements that monitor the numbers or volume of autophagic elements (e.g., autophagosomes or autolysosomes) at any stage of the autophagic process versus those that measure flux through the autophagy pathway (i.e., the complete process including the amount and rate of cargo sequestered and degraded). In particular, a block in macroautophagy that results in autophagosome accumulation must be differentiated from stimuli that increase autophagic activity, defined as increased autophagy induction coupled with increased delivery to, and degradation within, lysosomes (in most higher eukaryotes and some protists such as Dictyostelium) or the vacuole (in plants and fungi). In other words, it is especially important that investigators new to the field understand that the appearance of more autophagosomes does not necessarily equate with more autophagy. In fact, in many cases, autophagosomes accumulate because of a block in trafficking to lysosomes without a concomitant change in autophagosome biogenesis, whereas an increase in autolysosomes may reflect a reduction in degradative activity. It is worth emphasizing here that lysosomal digestion is a stage of autophagy and evaluating its competence is a crucial part of the evaluation of autophagic flux, or complete autophagy. Here, we present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes. These guidelines are not meant to be a formulaic set of rules, because the appropriate assays depend in part on the question being asked and the system being used. In addition, we emphasize that no individual assay is guaranteed to be the most appropriate one in every situation, and we strongly recommend the use of multiple assays to monitor autophagy. Along these lines, because of the potential for pleiotropic effects due to blocking autophagy through genetic manipulation, it is imperative to target by gene knockout or RNA interference more than one autophagy-related protein. In addition, some individual Atg proteins, or groups of proteins, are involved in other cellular pathways implying that not all Atg proteins can be used as a specific marker for an autophagic process. In these guidelines, we consider these various methods of assessing autophagy and what information can, or cannot, be obtained from them. Finally, by discussing the merits and limits of particular assays, we hope to encourage technical innovation in the field.

5,187 citations


Journal ArticleDOI
B. P. Abbott1, Richard J. Abbott1, T. D. Abbott2, M. R. Abernathy3 +970 more · Institutions (114)
TL;DR: This second gravitational-wave observation provides improved constraints on stellar populations and on deviations from general relativity.
Abstract: We report the observation of a gravitational-wave signal produced by the coalescence of two stellar-mass black holes. The signal, GW151226, was observed by the twin detectors of the Laser Interferometer Gravitational-Wave Observatory (LIGO) on December 26, 2015 at 03:38:53 UTC. The signal was initially identified within 70 s by an online matched-filter search targeting binary coalescences. Subsequent off-line analyses recovered GW151226 with a network signal-to-noise ratio of 13 and a significance greater than 5σ. The signal persisted in the LIGO frequency band for approximately 1 s, increasing in frequency and amplitude over about 55 cycles from 35 to 450 Hz, and reached a peak gravitational strain of $3.4^{+0.7}_{-0.9} \times 10^{-22}$. The inferred source-frame initial black hole masses are $14.2^{+8.3}_{-3.7} M_\odot$ and $7.5^{+2.3}_{-2.3} M_\odot$, and the final black hole mass is $20.8^{+6.1}_{-1.7} M_\odot$. We find that at least one of the component black holes has spin greater than 0.2. This source is located at a luminosity distance of $440^{+180}_{-190}$ Mpc corresponding to a redshift of $0.09^{+0.03}_{-0.04}$. All uncertainties define a 90% credible interval. This second gravitational-wave observation provides improved constraints on stellar populations and on deviations from general relativity.
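
As an illustration of how the quoted component masses relate to the waveform, the chirp mass, which sets the leading-order frequency evolution of the inspiral, can be computed from the median source-frame masses above. The snippet below is a worked example added here for illustration; the resulting value is not quoted in this abstract, and uncertainties are ignored.

```python
# Chirp mass from the median source-frame component masses quoted above
# (illustrative calculation; the quoted uncertainties are ignored).
m1, m2 = 14.2, 7.5  # solar masses

chirp_mass = (m1 * m2) ** 0.6 / (m1 + m2) ** 0.2
print(f"chirp mass ~ {chirp_mass:.1f} solar masses")  # ~8.9
```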

3,448 citations


Journal ArticleDOI
B. P. Abbott1, Richard J. Abbott1, T. D. Abbott2, M. R. Abernathy1 +976 more · Institutions (107)
TL;DR: It is found that the final remnant's mass and spin, as determined from the low-frequency and high-frequency phases of the signal, are mutually consistent with the binary black-hole solution in general relativity.
Abstract: The LIGO detection of GW150914 provides an unprecedented opportunity to study the two-body motion of a compact-object binary in the large-velocity, highly nonlinear regime, and to witness the final merger of the binary and the excitation of uniquely relativistic modes of the gravitational field. We carry out several investigations to determine whether GW150914 is consistent with a binary black-hole merger in general relativity. We find that the final remnant’s mass and spin, as determined from the low-frequency (inspiral) and high-frequency (postinspiral) phases of the signal, are mutually consistent with the binary black-hole solution in general relativity. Furthermore, the data following the peak of GW150914 are consistent with the least-damped quasinormal mode inferred from the mass and spin of the remnant black hole. By using waveform models that allow for parametrized general-relativity violations during the inspiral and merger phases, we perform quantitative tests on the gravitational-wave phase in the dynamical regime and we determine the first empirical bounds on several high-order post-Newtonian coefficients. We constrain the graviton Compton wavelength, assuming that gravitons are dispersed in vacuum in the same way as particles with mass, obtaining a 90%-confidence lower bound of $10^{13}$ km. In conclusion, within our statistical uncertainties, we find no evidence for violations of general relativity in the genuinely strong-field regime of gravity.

1,421 citations


Journal ArticleDOI
B. P. Abbott1, Richard J. Abbott1, T. D. Abbott2, Matthew Abernathy3 +978 more · Institutions (112)
TL;DR: The first observational run of the Advanced LIGO detectors, from September 12, 2015 to January 19, 2016, saw the first detections of gravitational waves from binary black hole mergers as discussed by the authors.
Abstract: The first observational run of the Advanced LIGO detectors, from September 12, 2015 to January 19, 2016, saw the first detections of gravitational waves from binary black hole mergers. In this paper we present full results from a search for binary black hole merger signals with total masses up to 100 M⊙ and detailed implications from our observations of these systems. Our search, based on general-relativistic models of gravitational wave signals from binary black hole systems, unambiguously identified two signals, GW150914 and GW151226, with a significance of greater than 5σ over the observing period. It also identified a third possible signal, LVT151012, with substantially lower significance, which has an 87% probability of being of astrophysical origin. We provide detailed estimates of the parameters of the observed systems. Both GW150914 and GW151226 provide an unprecedented opportunity to study the two-body motion of a compact-object binary in the large velocity, highly nonlinear regime. We do not observe any deviations from general relativity, and place improved empirical bounds on several high-order post-Newtonian coefficients. From our observations we infer stellar-mass binary black hole merger rates lying in the range 9-240 Gpc$^{-3}$ yr$^{-1}$. These observations are beginning to inform astrophysical predictions of binary black hole formation rates, and indicate that future observing runs of the Advanced detector network will yield many more gravitational wave detections.

1,172 citations


Journal ArticleDOI
TL;DR: A new approach to the development of a plant disease recognition model, based on leaf image classification and the use of deep convolutional networks, which is able to recognize 13 different types of plant diseases as well as healthy leaves.
Abstract: The latest generation of convolutional neural networks (CNNs) has achieved impressive results in the field of image classification. This paper presents a new approach to the development of a plant disease recognition model, based on leaf image classification and the use of deep convolutional networks. The novel way of training and the methodology used facilitate a quick and easy system implementation in practice. The developed model is able to distinguish 13 different types of plant diseases from healthy leaves, with the ability to distinguish plant leaves from their surroundings. To our knowledge, this method for plant disease recognition has been proposed for the first time. All essential steps required for implementing this disease recognition model are fully described throughout the paper, starting from gathering the images used to create a database, assessed by agricultural experts. Caffe, a deep learning framework developed by the Berkeley Vision and Learning Center, was used to perform the deep CNN training. The experimental results on the developed model achieved a precision between 91% and 98% for separate class tests, and 96.3% on average.
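
The abstract describes the pipeline only at a high level, and the original work used the Caffe framework. The following is a minimal, hypothetical sketch (in PyTorch, chosen here for brevity) of a small convolutional classifier for leaf images with 13 disease classes plus a healthy class; the layer sizes, image resolution, and all other details are assumptions, not the authors' architecture.

```python
import torch
import torch.nn as nn

# Minimal CNN for leaf-image classification (hypothetical sketch, not the
# authors' Caffe model): 13 disease classes plus 1 healthy class = 14 outputs.
class LeafCNN(nn.Module):
    def __init__(self, num_classes: int = 14):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(64, 128, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(128, num_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

model = LeafCNN()
batch = torch.randn(4, 3, 224, 224)   # a batch of 4 RGB leaf images (toy input)
print(model(batch).shape)             # torch.Size([4, 14]) -> per-class scores
```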

1,135 citations


Journal ArticleDOI
TL;DR: Analysis of whole-exome sequencing data of metastatic biopsies from patients observed substantial genomic overlap between castration-resistant tumors that were histologically characterized as prostate adenocarcinomas and neuroendocrine prostate cancer (CRPC-NE), supporting the emergence of an alternative, 'AR-indifferent' cell state through divergent clonal evolution as a mechanism of treatment resistance in advanced prostate cancer.
Abstract: An increasingly recognized resistance mechanism to androgen receptor (AR)-directed therapy in prostate cancer involves epithelial plasticity, in which tumor cells demonstrate low to absent AR expression and often have neuroendocrine features. The etiology and molecular basis for this 'alternative' treatment-resistant cell state remain incompletely understood. Here, by analyzing whole-exome sequencing data of metastatic biopsies from patients, we observed substantial genomic overlap between castration-resistant tumors that were histologically characterized as prostate adenocarcinomas (CRPC-Adeno) and neuroendocrine prostate cancer (CRPC-NE); analysis of biopsy samples from the same individuals over time points to a model most consistent with divergent clonal evolution. Genome-wide DNA methylation analysis revealed marked epigenetic differences between CRPC-NE tumors and CRPC-Adeno, and also designated samples of CRPC-Adeno with clinical features of AR independence as CRPC-NE, suggesting that epigenetic modifiers may play a role in the induction and/or maintenance of this treatment-resistant state. This study supports the emergence of an alternative, 'AR-indifferent' cell state through divergent clonal evolution as a mechanism of treatment resistance in advanced prostate cancer.

1,095 citations


Journal ArticleDOI
TL;DR: It is shown that schizophrenia is polygenic and the utility of this resource of gene expression and its genetic regulation for mechanistic interpretations of genetic liability for brain diseases is highlighted.
Abstract: Over 100 genetic loci harbor schizophrenia-associated variants, yet how these variants confer liability is uncertain. The CommonMind Consortium sequenced RNA from the dorsolateral prefrontal cortex of schizophrenia cases (N = 258) and control subjects (N = 279), creating a resource of gene expression and its genetic regulation. Using this resource, we find that ~20% of schizophrenia loci have variants that could contribute to altered gene expression and liability. In five loci, only a single gene was involved: FURIN, TSNARE1, CNTN4, CLCN3, or SNAP91. Altering expression of FURIN, TSNARE1, or CNTN4 changes neurodevelopment in zebrafish; knockdown of FURIN in human neural progenitor cells yields abnormal migration. The 693 genes showing significant case/control differential expression all have fold changes ≤ 1.33, and an independent cohort yields similar results. Gene co-expression implicates a network relevant for schizophrenia. Our findings show that schizophrenia is polygenic and highlight the utility of this resource for mechanistic interpretations of genetic liability for brain diseases.

907 citations


Journal ArticleDOI
B. P. Abbott1, Richard J. Abbott1, T. D. Abbott2, Matthew Abernathy1 +984 more · Institutions (116)
TL;DR: The data around the time of the event were analyzed coherently across the LIGO network using a suite of accurate waveform models that describe gravitational waves from a compact binary system in general relativity.
Abstract: On September 14, 2015, the Laser Interferometer Gravitational-wave Observatory (LIGO) detected a gravitational-wave transient (GW150914); we characterise the properties of the source and its parameters. The data around the time of the event were analysed coherently across the LIGO network using a suite of accurate waveform models that describe gravitational waves from a compact binary system in general relativity. GW150914 was produced by a nearly equal mass binary black hole of $36^{+5}_{-4} M_\odot$ and $29^{+4}_{-4} M_\odot$ (for each parameter we report the median value and the range of the 90% credible interval). The dimensionless spin magnitude of the more massive black hole is bound to be $< 0.7$ (at 90% probability). The luminosity distance to the source is $410^{+160}_{-180}$ Mpc, corresponding to a redshift $0.09^{+0.03}_{-0.04}$ assuming standard cosmology. The source location is constrained to an annulus section of $590$ deg$^2$, primarily in the southern hemisphere. The binary merges into a black hole of $62^{+4}_{-4} M_\odot$ and spin $0.67^{+0.05}_{-0.07}$. This black hole is significantly more massive than any other known in the stellar-mass regime.

874 citations


Journal ArticleDOI
Michele Armano1, Heather Audley2, G. Auger3, J. Baird4, Massimo Bassan5, Pierre Binétruy3, M. Born2, Daniele Bortoluzzi6, N. Brandt7, M. Caleno1, L. Carbone6, Antonella Cavalleri8, A. Cesarini6, Giacomo Ciani6, G. Congedo6, A. M. Cruise9, Karsten Danzmann2, M. de Deus Silva1, R. De Rosa, M. Diaz-Aguilo10, L. Di Fiore, Ingo Diepholz2, G. Dixon9, Rita Dolesi6, N. Dunbar7, Luigi Ferraioli11, Valerio Ferroni6, Walter Fichter, E. D. Fitzsimons12, R. Flatscher7, M. Freschi1, A. F. García Marín2, C. García Marirrodriga1, R. Gerndt7, Lluis Gesa10, Ferran Gibert6, Domenico Giardini11, R. Giusteri6, F. Guzmán2, Aniello Grado13, Catia Grimani14, A. Grynagier, J. Grzymisch1, I. Harrison15, Gerhard Heinzel2, M. Hewitson2, Daniel Hollington4, D. Hoyland9, Mauro Hueller6, Henri Inchauspe3, Oliver Jennrich1, Ph. Jetzer16, Ulrich Johann7, B. Johlander1, Nikolaos Karnesis2, B. Kaune2, N. Korsakova2, Christian J. Killow17, J. A. Lobo10, Ivan Lloro10, L. Liu6, J. P. López-Zaragoza10, R. Maarschalkerweerd15, Davor Mance11, V. Martín10, L. Martin-Polo1, J. Martino3, F. Martin-Porqueras1, S. Madden1, Ignacio Mateos10, Paul McNamara1, José F. F. Mendes15, L. Mendes1, A. Monsky2, Daniele Nicolodi6, Miquel Nofrarías10, S. Paczkowski2, Michael Perreur-Lloyd17, Antoine Petiteau3, P. Pivato6, Eric Plagnol3, P. Prat3, U. Ragnit1, B. Rais3, Juan Ramos-Castro18, J. Reiche2, D. I. Robertson17, H. Rozemeijer1, F. Rivas10, G. Russano6, J Sanjuán10, P. Sarra, A. Schleicher7, D. Shaul4, Jacob Slutsky19, Carlos F. Sopuerta10, Ruggero Stanga20, F. Steier2, T. J. Sumner4, D. Texier1, James Ira Thorpe19, C. Trenkel7, Michael Tröbs2, H. B. Tu6, Daniele Vetrugno6, Stefano Vitale6, V Wand2, Gudrun Wanner2, H. Ward17, C. Warren7, Peter Wass4, D. Wealthy7, W. J. Weber6, L. Wissel2, A. Wittchen2, A. Zambotti6, C. Zanoni6, Tobias Ziegler7, Peter Zweifel11 
TL;DR: The first results of the LISA Pathfinder in-flight experiment demonstrate that two free-falling reference test masses, such as those needed for a space-based gravitational wave observatory like LISA, can be put in free fall with a relative acceleration noise whose amplitude spectral density is about $5\ \mathrm{fm\,s^{-2}/\sqrt{Hz}}$ between 0.7 and 20 mHz, well below the LISA Pathfinder requirement.
Abstract: We report the first results of the LISA Pathfinder in-flight experiment. The results demonstrate that two free-falling reference test masses, such as those needed for a space-based gravitational wave observatory like LISA, can be put in free fall with a relative acceleration noise with a square root of the power spectral density of $5.2 \pm 0.1\ \mathrm{fm\,s^{-2}/\sqrt{Hz}}$, or $(0.54 \pm 0.01) \times 10^{-15}\ g/\sqrt{\mathrm{Hz}}$, with $g$ the standard gravity, for frequencies between 0.7 and 20 mHz. This value is lower than the LISA Pathfinder requirement by more than a factor of 5 and within a factor of 1.25 of the requirement for the LISA mission, and is compatible with Brownian noise from viscous damping due to the residual gas surrounding the test masses. Above 60 mHz the acceleration noise is dominated by interferometer displacement readout noise at a level of $(34.8 \pm 0.3)\ \mathrm{fm/\sqrt{Hz}}$, about 2 orders of magnitude better than requirements. At $f \leq 0.5$ mHz we observe a low-frequency tail that stays below $12\ \mathrm{fm\,s^{-2}/\sqrt{Hz}}$ down to 0.1 mHz. This performance would allow for a space-based gravitational wave observatory with a sensitivity close to what was originally foreseen for LISA.

523 citations


Journal ArticleDOI
TL;DR: A novel approach based on deep learning for active classification of electrocardiogram (ECG) signals, which learns a suitable feature representation from the raw ECG data in an unsupervised way using stacked denoising autoencoders (SDAEs) with a sparsity constraint.
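
The TL;DR names the core technique: stacked denoising autoencoders with a sparsity constraint, trained on raw ECG data. Below is a minimal sketch (in PyTorch, assumed here for brevity) of a single denoising autoencoder layer with an L1 sparsity penalty on the hidden code; the layer sizes, noise level, and penalty weight are assumptions, and the stacking, fine-tuning, and active-classification stages are omitted.

```python
import torch
import torch.nn as nn

# One denoising autoencoder layer with an L1 sparsity penalty on the code.
# The described approach stacks several such layers and then uses the learned
# features for classification; those stages are omitted in this sketch.
class DenoisingAE(nn.Module):
    def __init__(self, n_in: int = 256, n_hidden: int = 64):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(n_in, n_hidden), nn.Sigmoid())
        self.decoder = nn.Linear(n_hidden, n_in)

    def forward(self, x):
        code = self.encoder(x)
        return self.decoder(code), code

ae = DenoisingAE()
opt = torch.optim.Adam(ae.parameters(), lr=1e-3)

x = torch.randn(32, 256)               # stand-in for raw ECG segments
noisy = x + 0.1 * torch.randn_like(x)  # corrupt the input
recon, code = ae(noisy)
loss = nn.functional.mse_loss(recon, x) + 1e-3 * code.abs().mean()  # reconstruction + sparsity
opt.zero_grad()
loss.backward()
opt.step()
print("training loss for one step:", float(loss))
```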

507 citations


Journal ArticleDOI
M. Aguilar, L. Ali Cavasonza1, Behcet Alpat2, G. Ambrosi2 +265 more · Institutions (39)
TL;DR: In the absolute rigidity range ∼60 to ∼500 GV, the antiproton $\bar{p}$, proton $p$, and positron $e^{+}$ fluxes are found to have nearly identical rigidity dependence, while the electron $e^{-}$ flux exhibits a different rigidity dependence.
Abstract: A precision measurement by AMS of the antiproton flux and the antiproton-to-proton flux ratio in primary cosmic rays in the absolute rigidity range from 1 to 450 GV is presented, based on $3.49 \times 10^{5}$ antiproton events and $2.42 \times 10^{9}$ proton events. The fluxes and flux ratios of charged elementary particles in cosmic rays are also presented. In the absolute rigidity range ∼60 to ∼500 GV, the antiproton $\bar{p}$, proton $p$, and positron $e^{+}$ fluxes are found to have nearly identical rigidity dependence and the electron $e^{-}$ flux exhibits a different rigidity dependence. Below 60 GV, the $\bar{p}/p$, $\bar{p}/e^{+}$, and $p/e^{+}$ flux ratios each reaches a maximum. From ∼60 to ∼500 GV, the $\bar{p}/p$, $\bar{p}/e^{+}$, and $p/e^{+}$ flux ratios show no rigidity dependence. These are new observations of the properties of elementary particles in the cosmos.

Journal ArticleDOI
TL;DR: CoSMoMVPA is a lightweight MVPA (MVP analysis) toolbox implemented in the intersection of the Matlab and GNU Octave languages, which treats both fMRI and M/EEG data as first-class citizens.
Abstract: Recent years have seen an increase in the popularity of multivariate pattern (MVP) analysis of functional magnetic resonance imaging (fMRI) data, and, to a much lesser extent, magneto- and electro-encephalography (M/EEG) data. We present CoSMoMVPA, a lightweight MVPA (MVP analysis) toolbox implemented in the intersection of the Matlab and GNU Octave languages, which treats both fMRI and M/EEG data as first-class citizens. CoSMoMVPA supports all state-of-the-art MVP analysis techniques, including searchlight analyses, classification, correlations, representational similarity analysis, and the time generalization method. These can be used to address both data-driven and hypothesis-driven questions about neural organization and representations, both within and across: space, time, frequency bands, neuroimaging modalities, individuals, and species. It uses a uniform data representation of fMRI data in the volume or on the surface, and of M/EEG data at the sensor and source level. Through various external toolboxes, it directly supports reading and writing a variety of fMRI and M/EEG neuroimaging formats, and, where applicable, can convert between them. As a result, it can be integrated readily in existing pipelines and used with existing preprocessed datasets. CoSMoMVPA overloads the traditional volumetric searchlight concept to support neighborhoods for M/EEG and surface-based fMRI data, which supports localization of multivariate effects of interest across space, time, and frequency dimensions. CoSMoMVPA also provides a generalized approach to multiple comparison correction across these dimensions using Threshold-Free Cluster Enhancement with state-of-the-art clustering and permutation techniques. CoSMoMVPA is highly modular and uses abstractions to provide a uniform interface for a variety of MVP measures. Typical analyses require a few lines of code, making it accessible to beginner users. At the same time, expert programmers can easily extend its functionality. CoSMoMVPA comes with extensive documentation, including a variety of runnable demonstration scripts and analysis exercises (with example data and solutions). It uses best software engineering practices including version control, distributed development, an automated test suite, and continuous integration testing. It can be used with the proprietary Matlab and the free GNU Octave software, and it complies with open source distribution platforms such as NeuroDebian. CoSMoMVPA is Free/Open Source Software under the permissive MIT license. Website: cosmomvpa.org
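
CoSMoMVPA itself is written for Matlab/GNU Octave, so its own API is not reproduced here. As a conceptual illustration only, the Python/scikit-learn sketch below shows the kind of cross-validated multivariate pattern classification the toolbox automates; the data shapes, classifier choice, and injected signal are arbitrary assumptions, and the searchlight and TFCE machinery mentioned above is not shown.

```python
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import StratifiedKFold, cross_val_score

# Toy MVP analysis: decode two experimental conditions from response patterns
# (rows = trials, columns = voxels or sensors), with cross-validation.
rng = np.random.default_rng(0)
n_trials, n_features = 80, 500
X = rng.standard_normal((n_trials, n_features))
y = np.repeat([0, 1], n_trials // 2)   # condition labels
X[y == 1, :10] += 0.5                  # weak multivariate signal in 10 features

cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
accuracy = cross_val_score(LinearSVC(dual=False), X, y, cv=cv).mean()
print("mean cross-validated decoding accuracy:", accuracy)
```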

Journal ArticleDOI
TL;DR: The current understanding of non-coding variants in cancer is reviewed, including the great diversity of the mutation types — from single nucleotide variants to large genomic rearrangements — and the wide range of mechanisms by which they affect gene expression to promote tumorigenesis.
Abstract: Genomic analyses of cancer genomes have largely focused on mutations in protein-coding regions, but the functional importance of alterations to non-coding regions is becoming increasingly appreciated through whole-genome sequencing. This Review discusses our current understanding of non-coding sequence variants in cancer — both somatic mutations and germline variants, and their interplay — including their identification, computational and experimental evidence for functional impact, and their diverse mechanisms of action for dysregulating coding genes and non-coding RNAs.

Journal ArticleDOI
TL;DR: A computational framework for prediction tasks using quantitative microbiome profiles, including species-level relative abundances and presence of strain-specific markers, is developed, which can be considered a first step toward defining general microbial dysbiosis.
Abstract: Shotgun metagenomic analysis of the human associated microbiome provides a rich set of microbial features for prediction and biomarker discovery in the context of human diseases and health conditions. However, the use of such high-resolution microbial features presents new challenges, and validated computational tools for learning tasks are lacking. Moreover, classification rules have scarcely been validated in independent studies, posing questions about the generality and generalization of disease-predictive models across cohorts. In this paper, we comprehensively assess approaches to metagenomics-based prediction tasks and for quantitative assessment of the strength of potential microbiome-phenotype associations. We develop a computational framework for prediction tasks using quantitative microbiome profiles, including species-level relative abundances and presence of strain-specific markers. A comprehensive meta-analysis, with particular emphasis on generalization across cohorts, was performed in a collection of 2424 publicly available metagenomic samples from eight large-scale studies. Cross-validation revealed good disease-prediction capabilities, which were in general improved by feature selection and use of strain-specific markers instead of species-level taxonomic abundance. In cross-study analysis, models transferred between studies were in some cases less accurate than models tested by within-study cross-validation. Interestingly, the addition of healthy (control) samples from other studies to training sets improved disease prediction capabilities. Some microbial species (most notably Streptococcus anginosus) seem to characterize general dysbiotic states of the microbiome rather than connections with a specific disease. Our results in modelling features of the “healthy” microbiome can be considered a first step toward defining general microbial dysbiosis. The software framework, microbiome profiles, and metadata for thousands of samples are publicly available at http://segatalab.cibio.unitn.it/tools/metaml.
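
As a rough sketch of the prediction task described (the actual MetAML framework, profiles, and metadata are available at the URL above), the snippet below cross-validates a random forest on a matrix of species-level relative abundances; the data here are synthetic placeholders, and the classifier settings are assumptions rather than the published configuration.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import StratifiedKFold, cross_val_score

# Disease prediction from quantitative microbiome profiles (sketch only):
# rows = metagenomic samples, columns = species-level relative abundances.
rng = np.random.default_rng(1)
n_samples, n_species = 200, 300
X = rng.dirichlet(np.ones(n_species), size=n_samples)   # each row sums to 1
y = rng.integers(0, 2, size=n_samples)                   # disease vs. control labels

clf = RandomForestClassifier(n_estimators=500, random_state=0)
cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
auc = cross_val_score(clf, X, y, cv=cv, scoring="roc_auc").mean()

# With these random placeholder labels the AUC is ~0.5; the point is the
# cross-validation pipeline, which mirrors the evaluation described above.
print("cross-validated AUC:", auc)
```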

Journal ArticleDOI
B. P. Abbott1, Richard J. Abbott1, T. D. Abbott2, Matthew Abernathy1 +977 more · Institutions (106)
TL;DR: In this paper, the results of a matched-filter search using relativistic models of compact-object binaries that recovered GW150914 as the most significant event during the coincident observations between the two LIGO detectors were reported.
Abstract: On September 14, 2015, at 09:50:45 UTC the two detectors of the Laser Interferometer Gravitational-Wave Observatory (LIGO) simultaneously observed the binary black hole merger GW150914. We report the results of a matched-filter search using relativistic models of compact-object binaries that recovered GW150914 as the most significant event during the coincident observations between the two LIGO detectors from September 12 to October 20, 2015. GW150914 was observed with a matched-filter signal-to-noise ratio of 24 and a false alarm rate estimated to be less than 1 event per 203,000 years, equivalent to a significance greater than 5.1σ.
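
For reference, the matched-filter signal-to-noise ratio quoted above is defined through the standard noise-weighted inner product between the data and a template; this is the textbook definition, added here for context and not reproduced from the paper itself:

$$ \langle a, b \rangle = 4\,\mathrm{Re}\int_0^{\infty} \frac{\tilde a(f)\,\tilde b^{*}(f)}{S_n(f)}\,df, \qquad \rho = \max_{t_c}\frac{|\langle s, h(t_c) \rangle|}{\sqrt{\langle h, h \rangle}}, $$

where $s$ is the detector data, $h$ the template waveform, $S_n(f)$ the detector noise power spectral density, and $t_c$ the coalescence time over which the filter output is maximized; GW150914 was recovered with a network value of $\rho \approx 24$.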

Journal ArticleDOI
TL;DR: In this paper, the elastic modulus and dynamic storage moduli of 3D printed parts along three different build orientations were increased by the presence of graphene nanoplatelets (xGnP) in the ABS matrix.
Abstract: For the first time, graphene nanoplatelets (xGnP) were incorporated at 4 wt% in acrylonitrile–butadiene–styrene (ABS) filaments obtained by a solvent-free process consisting of melt compounding and extrusion. Nanocomposite filaments were then used to feed a fused deposition modelling (FDM) machine to obtain specimens with various build orientations. The elastic modulus and dynamic storage moduli of 3D printed parts along three different build orientations were increased by the presence of xGnP in the ABS matrix. At the same time, a decrease in both stress and strain at break was observed when xGnP is added to ABS. Moreover, a higher thermal stability was induced on 3D printed parts by xGnP, as indicated by a reduction in both coefficient of linear thermal expansion and creep compliance. A comparison between 3D printed and compression moulded parts highlighted the importance of the orientation effects induced by the fused deposition modelling process.


Journal ArticleDOI
TL;DR: In this article, a genetically engineered mouse model and human prostate cancer transcriptome data were integrated to show that N-Myc overexpression leads to the development of poorly differentiated, invasive prostate cancer that is molecularly similar to human NEPC.

Journal ArticleDOI
TL;DR: A method for detection and reconstruction of gravitational wave (GW) transients with networks of advanced detectors is presented, with significantly improved algorithms that enhance both low-latency searches with rapid localization of GW events and high-confidence detection of a broad range of transient GW sources.
Abstract: We present a method for detection and reconstruction of gravitational wave (GW) transients with networks of advanced detectors. Originally designed to search for transients with the initial GW detectors, it uses significantly improved algorithms, which enhance both the low-latency searches with rapid localization of GW events for electromagnetic follow-up and the high-confidence detection of a broad range of transient GW sources. In this paper, we present the analytic framework of the method. Following a short description of the core analysis algorithms, we introduce a novel approach to the reconstruction of the GW polarization from a pattern of detector responses to a GW signal. This polarization pattern is a unique signature of an arbitrary GW signal that can be measured independently from the other source parameters. The polarization measurements enable rapid reconstruction of the GW waveforms and sky localization, and help identify the source origin.
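
The polarization reconstruction described above starts from the standard linear response of each detector to the two GW polarizations; the relation below is textbook material written here schematically, not an equation quoted from the paper:

$$ x_k(t) = F_k^{+}(\theta,\phi,\psi)\,h_{+}(t) + F_k^{\times}(\theta,\phi,\psi)\,h_{\times}(t) + n_k(t), $$

where $x_k$ is the strain recorded by detector $k$, $F_k^{+,\times}$ are its antenna patterns for a source at sky position $(\theta,\phi)$ with polarization angle $\psi$, and $n_k$ is the detector noise; the pattern of responses across the network jointly constrains $h_+$, $h_\times$, and the sky location.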

Journal ArticleDOI
27 Jan 2016
TL;DR: This paper reviews and highlights some of the most recent advances in this field, including clustered, thinned, sparse, and time-modulated arrays, and their proposed design methodologies.
Abstract: The proliferation of wireless services is driving innovative phased array solutions that are able to provide better cost/performance tradeoffs. In this context, the use of irregular array architectures provides a viable solution. This paper reviews and highlights some of the most recent advances in this field, including clustered, thinned, sparse, and time-modulated arrays, and their proposed design methodologies.
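
As a minimal illustration of one of the irregular architectures mentioned (a thinned array), the sketch below computes the array factor of a uniformly excited linear array in which a random subset of elements is simply switched off; the element count, spacing, and thinning mask are arbitrary assumptions, not a design from the reviewed literature.

```python
import numpy as np

# Array factor of a thinned, uniformly excited linear array (sketch).
n_elements = 32
d_over_lambda = 0.5                               # half-wavelength element spacing
rng = np.random.default_rng(0)
active = rng.random(n_elements) < 0.7             # randomly switch ~30% of elements off

theta = np.linspace(-np.pi / 2, np.pi / 2, 1801)  # observation angle from broadside
n = np.arange(n_elements)
phase = 2j * np.pi * d_over_lambda * np.outer(np.sin(theta), n)
af = np.abs(np.exp(phase) @ active)               # sum only over the active elements
af_db = 20 * np.log10(af / af.max() + 1e-12)

print("active elements:", int(active.sum()), "of", n_elements)
print("normalized array factor at broadside:", af_db[len(theta) // 2], "dB")  # 0 dB peak
```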

Journal ArticleDOI
TL;DR: PanPhlAn as mentioned in this paper is a pangenome-based phylogenomic analysis tool that uses metagenomic data to achieve strain-level microbial profiling resolution, which is used for pathogen discovery, epidemiology and population genomics.
Abstract: Identifying microbial strains and characterizing their functional potential is essential for pathogen discovery, epidemiology and population genomics. We present pangenome-based phylogenomic analysis (PanPhlAn; http://segatalab.cibio.unitn.it/tools/panphlan), a tool that uses metagenomic data to achieve strain-level microbial profiling resolution. PanPhlAn recognized outbreak strains, produced the largest strain-level population genomic study of human-associated bacteria and, in combination with metatranscriptomics, profiled the transcriptional activity of strains in complex communities.

Journal ArticleDOI
TL;DR: A critical review of the recent advances in DA approaches for remote sensing is provided and an overview of DA methods divided into four categories: 1) invariant feature selection, 2) representation matching, 3) adaptation of classifiers, and 4) selective sampling are presented.
Abstract: The success of the supervised classification of remotely sensed images acquired over large geographical areas or at short time intervals strongly depends on the representativity of the samples used to train the classification algorithm and to define the model. When training samples are collected from an image or a spatial region that is different from the one used for mapping, spectral shifts between the two distributions are likely to make the model fail. Such shifts are generally due to differences in acquisition and atmospheric conditions or to changes in the nature of the object observed. To design classification methods that are robust to data set shifts, recent remote sensing literature has considered solutions based on domain adaptation (DA) approaches. Inspired by machine-learning literature, several DA methods have been proposed to solve specific problems in remote sensing data classification. This article provides a critical review of the recent advances in DA approaches for remote sensing and presents an overview of DA methods divided into four categories: 1) invariant feature selection, 2) representation matching, 3) adaptation of classifiers, and 4) selective sampling. We provide an overview of recent methodologies, examples of applications of the considered techniques to real remote sensing images characterized by very high spatial and spectral resolution as well as possible guidelines for the selection of the method to use in real application scenarios.
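
As one concrete example of the "representation matching" category, the sketch below applies correlation alignment (re-coloring source-domain features so their covariance matches the target domain) before a classifier would be trained; this is a generic technique used here for illustration, not necessarily one of the specific methods reviewed, and the data are synthetic placeholders.

```python
import numpy as np

def _matrix_power(cov, exponent):
    """Symmetric matrix power via eigendecomposition (cov assumed PSD)."""
    vals, vecs = np.linalg.eigh(cov)
    vals = np.clip(vals, 1e-12, None)
    return (vecs * vals ** exponent) @ vecs.T

def coral_align(Xs, Xt, ridge=1e-3):
    """Whiten source features, then re-color them with the target covariance."""
    Cs = np.cov(Xs, rowvar=False) + ridge * np.eye(Xs.shape[1])
    Ct = np.cov(Xt, rowvar=False) + ridge * np.eye(Xt.shape[1])
    return Xs @ _matrix_power(Cs, -0.5) @ _matrix_power(Ct, 0.5)

# Toy source/target "spectra" with different second-order statistics.
rng = np.random.default_rng(0)
Xs = 2.0 * rng.standard_normal((500, 20)) + 1.0
Xt = 0.5 * rng.standard_normal((400, 20)) - 0.3
Xs_aligned = coral_align(Xs, Xt)

# After alignment the source feature variance matches the target variance (~0.25).
print(np.cov(Xs_aligned, rowvar=False)[0, 0], np.cov(Xt, rowvar=False)[0, 0])
```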

Proceedings Article
04 Nov 2016
TL;DR: This paper proposes a framework for language learning that relies on multi-agent communication in the context of referential games, in which a sender and a receiver see a pair of images, the sender transmits a message about the target, and the receiver must rely on this message to identify it.
Abstract: The current mainstream approach to train natural language systems is to expose them to large amounts of text. This passive learning is problematic if we are interested in developing interactive machines, such as conversational agents. We propose a framework for language learning that relies on multi-agent communication. We study this learning in the context of referential games. In these games, a sender and a receiver see a pair of images. The sender is told one of them is the target and is allowed to send a message from a fixed, arbitrary vocabulary to the receiver. The receiver must rely on this message to identify the target. Thus, the agents develop their own language interactively out of the need to communicate. We show that two networks with simple configurations are able to learn to coordinate in the referential game. We further explore how to make changes to the game environment to cause the "word meanings" induced in the game to better reflect intuitive semantic properties of the images. In addition, we present a simple strategy for grounding the agents' code into natural language. Both of these are necessary steps towards developing machines that are able to communicate with humans productively.
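
The game structure described (sender sees the target, emits a symbol from a fixed vocabulary, receiver picks an image) can be sketched as below. This is a deliberately stripped-down, untrained version with random policies, so the end-to-end training the paper performs is omitted, and all sizes are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
feat_dim, vocab_size = 64, 10

# Random, untrained "policies", used only to illustrate the game protocol.
sender_W = rng.standard_normal((vocab_size, feat_dim))    # image features -> symbol scores
receiver_E = rng.standard_normal((vocab_size, feat_dim))  # symbol embeddings

def play_round():
    images = rng.standard_normal((2, feat_dim))          # a pair of candidate images
    target = int(rng.integers(0, 2))                     # the sender is told the target
    symbol = int(np.argmax(sender_W @ images[target]))   # one symbol from the fixed vocabulary
    scores = images @ receiver_E[symbol]                 # receiver scores both images
    guess = int(np.argmax(scores))                       # receiver's choice
    return int(guess == target)                          # reward: 1 if the target was identified

rewards = [play_round() for _ in range(1000)]
print("success rate of untrained agents:", np.mean(rewards))  # ~0.5, i.e. chance level
```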

Journal ArticleDOI
B. P. Abbott1, Richard J. Abbott1, T. D. Abbott2, Matthew Abernathy1 +953 more · Institutions (106)
TL;DR: It is concluded that the stochastic gravitational-wave background from binary black holes, created from the incoherent superposition of all the merging binaries in the Universe, is potentially measurable by the Advanced LIGO and Advanced Virgo detectors operating at their projected final sensitivity.
Abstract: The LIGO detection of the gravitational wave transient GW150914, from the inspiral and merger of two black holes with masses $\gtrsim 30\, \text{M}_\odot$, suggests a population of binary black holes with relatively high mass. This observation implies that the stochastic gravitational-wave background from binary black holes, created from the incoherent superposition of all the merging binaries in the Universe, could be higher than previously expected. Using the properties of GW150914, we estimate the energy density of such a background from binary black holes. In the most sensitive part of the Advanced LIGO/Virgo band for stochastic backgrounds (near 25 Hz), we predict $\Omega_\text{GW}(f=25\,\mathrm{Hz}) = 1.1_{-0.9}^{+2.7} \times 10^{-9}$ with 90% confidence. This prediction is robustly demonstrated for a variety of formation scenarios with different parameters. The differences between models are small compared to the statistical uncertainty arising from the currently poorly constrained local coalescence rate. We conclude that this background is potentially measurable by the Advanced LIGO/Virgo detectors operating at their projected final sensitivity.
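
The predicted quantity $\Omega_\text{GW}(f)$ is the standard dimensionless energy-density spectrum of a stochastic background; the definition below is standard in the field and is given here for context rather than reproduced from the paper:

$$ \Omega_\text{GW}(f) = \frac{f}{\rho_c}\,\frac{d\rho_\text{GW}}{df}, \qquad \rho_c = \frac{3 H_0^2 c^2}{8\pi G}, $$

i.e. the gravitational-wave energy density per logarithmic frequency interval, expressed in units of the critical energy density of the Universe.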

Journal ArticleDOI
B. P. Abbott1, Richard J. Abbott1, T. D. Abbott2, M. R. Abernathy1 +999 more · Institutions (109)
TL;DR: The transient noise backgrounds used to determine the significance of the event (designated GW150914) are described and the results of investigations into potential correlated or uncorrelated sources of transient noise in the detectors around the time of theevent are presented.
Abstract: On 14 September 2015, a gravitational wave signal from a coalescing black hole binary system was observed by the Advanced LIGO detectors. This paper describes the transient noise backgrounds used to determine the significance of the event (designated GW150914) and presents the results of investigations into potential correlated or uncorrelated sources of transient noise in the detectors around the time of the event. The detectors were operating nominally at the time of GW150914. We have ruled out environmental influences and non-Gaussian instrument noise at either LIGO detector as the cause of the observed gravitational wave signal.

Journal ArticleDOI
TL;DR: In this article, the authors exploit the different modes of a silicon ring resonator as an extra dimension for photons to generate topologically robust optical isolators and driven-dissipative analog of the 4D quantum Hall effect.
Abstract: Recent technological advances in integrated photonics have spurred on the study of topological phenomena in engineered bosonic systems. Indeed, the controllability of silicon ring-resonator arrays has opened up new perspectives for building lattices for photons with topologically nontrivial bands and integrating them into photonic devices for practical applications. Here, we push these developments even further by exploiting the different modes of a silicon ring resonator as an extra dimension for photons. Tunneling along this synthetic dimension is implemented via an external time-dependent modulation that allows for the generation of engineered gauge fields. We show how this approach can be used to generate a variety of exciting topological phenomena in integrated photonics, ranging from a topologically-robust optical isolator in a spatially one-dimensional (1D) ring-resonator chain to a driven-dissipative analog of the 4D quantum Hall effect in a spatially 3D resonator lattice. Our proposal paves the way towards the use of topological effects in the design of novel photonic lattices supporting many frequency channels and displaying higher connectivities.
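
Schematically, and as a generic tight-binding form rather than an equation taken from the paper, tunneling along the synthetic dimension spanned by the ring-resonator modes can be written as a hopping Hamiltonian whose phases, imprinted by the external time-dependent modulation, play the role of a gauge field:

$$ H = -J \sum_{m} \left( e^{i\varphi_m}\, \hat a^{\dagger}_{m+1} \hat a_{m} + \mathrm{h.c.} \right), $$

where $\hat a_m$ annihilates a photon in the $m$-th mode of the resonator, the hopping amplitude $J$ is set by the modulation strength, and the phases $\varphi_m$ act as an effective vector potential along the synthetic (mode-index) dimension.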

Journal ArticleDOI
TL;DR: In this article, the authors advocate for the adherence of a plural valuation culture and its establishment as a common practice, by contesting and complementing ineffective and discriminatory single-value approaches.
Abstract: We are increasingly confronted with severe social and economic impacts of environmental degradation all over the world. From a valuation perspective, environmental problems and conflicts originate from trade-offs between values. The urgency and importance of integrating nature's diverse values in decisions and actions stand out more than ever. Valuation, in its broad sense of 'assigning importance', is inherently part of most decisions on natural resource and land use. Scholars from different traditions, while moving from heuristic interdisciplinary debate to applied transdisciplinary science, now acknowledge the need for combining multiple disciplines and methods to represent the diverse set of values of nature. This growing group of scientists and practitioners share the ambition to explore how combinations of ecological, socio-cultural and economic valuation tools can support real-life resource and land use decision-making. The current sustainability challenges and the ineffectiveness of single-value approaches to offer relief demonstrate that continuing along a single path is not an option. We advocate for the adoption of a plural valuation culture and its establishment as a common practice, by contesting and complementing ineffective and discriminatory single-value approaches. In policy and decision contexts with a willingness to improve sustainability, integrated valuation approaches can be blended into existing processes, whereas in contexts of power asymmetries or environmental conflicts, integrated valuation can promote the inclusion of diverse values through action research and support the struggle for social and environmental justice. The special issue and this editorial synthesis paper bring together lessons from pioneering case studies and research papers, synthesizing the main challenges and setting out priorities for the years to come for the field of integrated valuation.

Journal ArticleDOI
B. P. Abbott1, Richard J. Abbott1, T. D. Abbott2, M. R. Abernathy1 +1619 more · Institutions (220)
TL;DR: In this article, the sky localization of the first observed compact binary merger is presented, where the authors describe the low-latency analysis of the LIGO data and present a sky localization map.
Abstract: A gravitational-wave (GW) transient was identified in data recorded by the Advanced Laser Interferometer Gravitational-wave Observatory (LIGO) detectors on 2015 September 14. The event, initially designated G184098 and later given the name GW150914, is described in detail elsewhere. By prior arrangement, preliminary estimates of the time, significance, and sky location of the event were shared with 63 teams of observers covering radio, optical, near-infrared, X-ray, and gamma-ray wavelengths with ground- and space-based facilities. In this Letter we describe the low-latency analysis of the GW data and present the sky localization of the first observed compact binary merger. We summarize the follow-up observations reported by 25 teams via private Gamma-ray Coordinates Network circulars, giving an overview of the participating facilities, the GW sky localization coverage, the timeline, and depth of the observations. As this event turned out to be a binary black hole merger, there is little expectation of a detectable electromagnetic (EM) signature. Nevertheless, this first broadband campaign to search for a counterpart of an Advanced LIGO source represents a milestone and highlights the broad capabilities of the transient astronomy community and the observing strategies that have been developed to pursue neutron star binary merger events. Detailed investigations of the EM data and results of the EM follow-up campaign are being disseminated in papers by the individual teams.

Journal ArticleDOI
01 Oct 2016 - Icarus
TL;DR: In this article, a high-resolution shape model of the nucleus of the comet 67P/Churyumov-Gerasimenko was used to estimate the porosity of the surface of the cometary nucleus.

Proceedings ArticleDOI
27 Jun 2016
TL;DR: This work introduces a strategy to dynamically select face regions useful for robust HR estimation, inspired by recent advances in matrix completion theory, which significantly outperforms state-of-the-art HR estimation methods in naturalistic conditions.
Abstract: Recent studies in computer vision have shown that, while practically invisible to a human observer, skin color changes due to blood flow can be captured on face videos and, surprisingly, be used to estimate the heart rate (HR). While considerable progress has been made in the last few years, many issues still remain open. In particular, state-of-the-art approaches are not robust enough to operate in natural conditions (e.g. in case of spontaneous movements, facial expressions, or illumination changes). In contrast to previous approaches that estimate the HR by processing all the skin pixels inside a fixed region of interest, we introduce a strategy to dynamically select face regions useful for robust HR estimation. Our approach, inspired by recent advances in matrix completion theory, allows us to predict the HR while simultaneously discovering the best regions of the face to be used for estimation. A thorough experimental evaluation conducted on public benchmarks suggests that the proposed approach significantly outperforms state-of-the-art HR estimation methods in naturalistic conditions.
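
A drastically simplified version of remote HR estimation (a mean skin-color trace plus a spectral peak, without the dynamic region selection and matrix-completion machinery the paper introduces) can illustrate the underlying principle. The sampling rate, band limits, and synthetic signal below are assumptions made for this sketch.

```python
import numpy as np

def estimate_hr(skin_signal, fps):
    """Estimate heart rate (bpm) from a 1-D skin-color trace of a face video.

    Simplified illustration: remove the mean, take the FFT, and pick the
    strongest frequency in a plausible heart-rate band (0.7-4 Hz = 42-240 bpm).
    """
    x = skin_signal - skin_signal.mean()
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fps)
    spectrum = np.abs(np.fft.rfft(x))
    band = (freqs >= 0.7) & (freqs <= 4.0)
    return 60.0 * freqs[band][np.argmax(spectrum[band])]

# Synthetic example: a 72 bpm (1.2 Hz) pulse buried in noise, 30 s at 30 fps.
rng = np.random.default_rng(0)
fps = 30.0
t = np.arange(0, 30, 1.0 / fps)
trace = 0.02 * np.sin(2 * np.pi * 1.2 * t) + 0.05 * rng.standard_normal(t.size)
print("estimated HR:", estimate_hr(trace, fps), "bpm")  # close to 72 bpm
```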