
Showing papers by "University of Trento" published in 2015


Journal ArticleDOI
Fausto Acernese1, M. Agathos2, Kazuhiro Agatsuma2, D. Aisa3  +230 moreInstitutions (19)
TL;DR: Advanced Virgo is the project to upgrade the Virgo interferometric detector of gravitational waves, with the aim of increasing the number of observable galaxies (and thus the detection rate) by three orders of magnitude.
Abstract: Advanced Virgo is the project to upgrade the Virgo interferometric detector of gravitational waves, with the aim of increasing the number of observable galaxies (and thus the detection rate) by three orders of magnitude. The project is now in an advanced construction phase and the assembly and integration will be completed by the end of 2015. Advanced Virgo will be part of a network, alongside the two Advanced LIGO detectors in the US and GEO HF in Germany, with the goal of contributing to the early detection of gravitational waves and to opening a new window of observation on the universe. In this paper we describe the main features of the Advanced Virgo detector and outline the status of the construction.
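The three-orders-of-magnitude figure follows from a standard scaling argument: the surveyed volume, and hence the number of observable galaxies for a roughly uniform galaxy density, grows as the cube of the detector's range. A minimal sketch, assuming the nominal tenfold strain-sensitivity improvement (the 10x factor is an assumption used here only for illustration):

```python
# Detection volume scales as the cube of the horizon distance, so a
# 10x improvement in strain sensitivity (a 10x larger range) gives a
# 10^3 = 1000x increase in accessible galaxies and detection rate.

def volume_gain(sensitivity_gain: float) -> float:
    """Factor by which the surveyed volume grows for a given
    linear improvement in detector range."""
    return sensitivity_gain ** 3

print(volume_gain(10.0))  # -> 1000.0
```

This is why a modest-sounding sensitivity upgrade translates into a dramatic jump in expected detections.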

3,004 citations


Journal ArticleDOI
Dan R. Robinson1, Eliezer M. Van Allen2, Eliezer M. Van Allen3, Yi-Mi Wu1, Nikolaus Schultz4, Robert J. Lonigro1, Juan Miguel Mosquera, Bruce Montgomery5, Mary-Ellen Taplin2, Colin C. Pritchard5, Gerhardt Attard6, Gerhardt Attard7, Himisha Beltran, Wassim Abida4, Robert K. Bradley5, Jake Vinson4, Xuhong Cao1, Pankaj Vats1, Lakshmi P. Kunju1, Maha Hussain1, Felix Y. Feng1, Scott A. Tomlins, Kathleen A. Cooney1, David Smith1, Christine Brennan1, Javed Siddiqui1, Rohit Mehra1, Yu Chen8, Yu Chen4, Dana E. Rathkopf4, Dana E. Rathkopf8, Michael J. Morris4, Michael J. Morris8, Stephen B. Solomon4, Jeremy C. Durack4, Victor E. Reuter4, Anuradha Gopalan4, Jianjiong Gao4, Massimo Loda, Rosina T. Lis2, Michaela Bowden9, Michaela Bowden2, Stephen P. Balk10, Glenn C. Gaviola9, Carrie Sougnez3, Manaswi Gupta3, Evan Y. Yu5, Elahe A. Mostaghel5, Heather H. Cheng5, Hyojeong Mulcahy5, Lawrence D. True11, Stephen R. Plymate5, Heidi Dvinge5, Roberta Ferraldeschi6, Roberta Ferraldeschi7, Penny Flohr7, Penny Flohr6, Susana Miranda6, Susana Miranda7, Zafeiris Zafeiriou7, Zafeiris Zafeiriou6, Nina Tunariu7, Nina Tunariu6, Joaquin Mateo7, Joaquin Mateo6, Raquel Perez-Lopez6, Raquel Perez-Lopez7, Francesca Demichelis8, Francesca Demichelis12, Brian D. Robinson, Marc H. Schiffman8, David M. Nanus, Scott T. Tagawa, Alexandros Sigaras8, Kenneth Eng8, Olivier Elemento8, Andrea Sboner8, Elisabeth I. Heath13, Howard I. Scher4, Howard I. Scher8, Kenneth J. Pienta14, Philip W. Kantoff2, Johann S. de Bono7, Johann S. de Bono6, Mark A. Rubin, Peter S. Nelson, Levi A. Garraway2, Levi A. Garraway3, Charles L. Sawyers4, Arul M. Chinnaiyan 
21 May 2015-Cell
TL;DR: This cohort study provides clinically actionable information that could impact treatment decisions for affected individuals and identifies new genomic alterations in PIK3CA/B, R-spondin, BRAF/RAF1, APC, β-catenin, and ZBTB16/PLZF.

2,713 citations


Journal ArticleDOI
TL;DR: An overview of the key aspects of graphene and related materials is provided, ranging from fundamental research challenges to a variety of applications in a large number of sectors, highlighting the steps necessary to take GRMs from a state of raw potential to a point where they might revolutionize multiple industries.
Abstract: We present the science and technology roadmap for graphene, related two-dimensional crystals, and hybrid systems, targeting an evolution in technology, that might lead to impacts and benefits reaching into most areas of society. This roadmap was developed within the framework of the European Graphene Flagship and outlines the main targets and research areas as best understood at the start of this ambitious project. We provide an overview of the key aspects of graphene and related materials (GRMs), ranging from fundamental research challenges to a variety of applications in a large number of sectors, highlighting the steps necessary to take GRMs from a state of raw potential to a point where they might revolutionize multiple industries. We also define an extensive list of acronyms in an effort to standardize the nomenclature in this emerging field.

2,560 citations


Journal ArticleDOI
TL;DR: Improvements in the underlying pipeline for identifying marker genes and in the profiling procedure resulted in much improved quantitative performance (higher correlation with true abundances, lower false positive and false negative rates).
Abstract:
- Profiling of all domains of life. Marker and quasi-marker genes are now identified not only for microbes (Bacteria and Archaea), but also for viruses and eukaryotic microbes (Fungi, Protozoa) that are crucial components of microbial communities.
- A 6-fold increase in the number of considered species. Markers are now identified from >16,000 reference genomes and >7,000 unique species, dramatically expanding the comprehensiveness of the method. The new pipeline for identifying marker genes is also scalable to the quickly increasing number of reference genomes. See Supplementary Tables 1-3.
- Introduction of the concept of quasi-markers, allowing more comprehensive and accurate profiling. For species with fewer than 200 markers, MetaPhlAn2 adopts additional quasi-marker sequences (Supplementary Note 2) that are occasionally present in other genomes (because of vertical conservation or horizontal transfer). At profiling time, if no other markers of the potentially confounding species are detected, the corresponding quasi-markers are used to improve the quality and accuracy of the profiling.
- Addition of strain-specific barcoding for microbial strain tracking. MetaPhlAn2 includes a completely new feature that exploits marker combinations to perform species-specific and genus-specific "barcoding" of strains in metagenomic samples (Supplementary Note 7). This feature can be used for culture-free pathogen tracking in epidemiology studies and strain tracking across microbiome samples. See Supplementary Figs. 12-20.
- Strain-level identification for organisms with sequenced genomes. When a microbiome includes strains that are very close to one of those already sequenced, MetaPhlAn2 is now able to identify such strains and readily reports their abundances. See Supplementary Note 7, Supplementary Table 13, and Supplementary Fig. 21.
- Improvement of false positive and false negative rates. Improvements in the underlying pipeline for identifying marker genes (including the increased number of adopted genomes and the use of quasi-markers) and in the profiling procedure resulted in much improved quantitative performance (higher correlation with true abundances, lower false positive and false negative rates). See the validation on synthetic metagenomes in Supplementary Note 4.
- Estimation of the percentage of reads mapped against known reference genomes. MetaPhlAn2 is now able to estimate the number of reads that would map against the genomes of each clade detected as present, for which an estimate of relative abundance is provided in the default output. See Supplementary Note 3 for details.
- Integration of MetaPhlAn2 with post-processing and visualization tools. The MetaPhlAn2 package now includes a set of post-processing and visualization tools (the "utils" subfolder of the MetaPhlAn2 repository). Multiple MetaPhlAn2 profiles can be merged into an abundance table ("merge_metaphlan_tables.py"), exported as BIOM files, and visualized as heatmaps ("metaphlan_hclust_heatmap.py" or the integrated "hclust2" package), GraPhlAn plots ("export2graphlan.py" and the GraPhlAn package), Krona plots ("metaphlan2krona.py"), and single-microbe barplots across samples and conditions ("plot_bug.py").
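Conceptually, merging multiple per-sample profiles into one abundance table (the operation that `merge_metaphlan_tables.py` automates) amounts to taking the union of clades across samples and zero-filling absent taxa. A minimal pure-Python sketch of that idea follows; the clade names and abundances are invented for the example, and this is not the actual script's code or file format:

```python
# Toy illustration of merging per-sample taxonomic profiles into one
# abundance table, the operation performed by merge_metaphlan_tables.py.
# Clade names and abundance values below are hypothetical.

def merge_profiles(profiles: dict) -> dict:
    """Map each clade to its per-sample relative abundances,
    filling 0.0 where a clade was not detected in a sample."""
    samples = sorted(profiles)
    clades = sorted({c for p in profiles.values() for c in p})
    return {c: [profiles[s].get(c, 0.0) for s in samples] for c in clades}

profiles = {
    "sample1": {"k__Bacteria|p__Firmicutes": 60.0,
                "k__Bacteria|p__Bacteroidetes": 40.0},
    "sample2": {"k__Bacteria|p__Firmicutes": 75.0,
                "k__Bacteria|p__Actinobacteria": 25.0},
}
table = merge_profiles(profiles)
print(table["k__Bacteria|p__Firmicutes"])      # -> [60.0, 75.0]
print(table["k__Bacteria|p__Actinobacteria"])  # -> [0.0, 25.0]
```

The merged table is what downstream tools such as the heatmap and GraPhlAn exporters consume.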

1,618 citations


Journal ArticleDOI
Georges Aad1, Brad Abbott2, Jalal Abdallah3, Ovsat Abdinov4  +5117 moreInstitutions (314)
TL;DR: A measurement of the Higgs boson mass is presented based on the combined data samples of the ATLAS and CMS experiments at the CERN LHC in the H→γγ and H→ZZ→4ℓ decay channels.
Abstract: A measurement of the Higgs boson mass is presented based on the combined data samples of the ATLAS and CMS experiments at the CERN LHC in the H→γγ and H→ZZ→4l decay channels. The results are obtained from a simultaneous fit to the reconstructed invariant mass peaks in the two channels and for the two experiments. The measured masses from the individual channels and the two experiments are found to be consistent among themselves. The combined measured mass of the Higgs boson is mH=125.09±0.21 (stat)±0.11 (syst) GeV.
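The two quoted uncertainties combine in quadrature into a single total error bar. A minimal sketch of that arithmetic (the mass value itself comes from the simultaneous profile-likelihood fit described above, not from this shortcut):

```python
import math

# Combine the quoted statistical and systematic uncertainties on the
# Higgs boson mass in quadrature to get the total error bar.
m_h = 125.09   # GeV, combined ATLAS+CMS measurement
stat = 0.21    # GeV, statistical uncertainty
syst = 0.11    # GeV, systematic uncertainty

total = math.sqrt(stat**2 + syst**2)
print(f"mH = {m_h} +/- {total:.2f} GeV")  # -> mH = 125.09 +/- 0.24 GeV
```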

1,567 citations


Journal ArticleDOI
J. Aasi1, J. Abadie1, B. P. Abbott1, Richard J. Abbott1  +884 moreInstitutions (98)
TL;DR: In this paper, the authors review the performance of the LIGO instruments during this epoch, the work done to characterize the detectors and their data, and the effect that transient and continuous noise artefacts have on the sensitivity of the detectors to a variety of astrophysical sources.
Abstract: In 2009–2010, the Laser Interferometer Gravitational-Wave Observatory (LIGO) operated together with international partners Virgo and GEO600 as a network to search for gravitational waves (GWs) of astrophysical origin. The sensitivity of these detectors was limited by a combination of noise sources inherent to the instrumental design and its environment, often localized in time or frequency, that couple into the GW readout. Here we review the performance of the LIGO instruments during this epoch, the work done to characterize the detectors and their data, and the effect that transient and continuous noise artefacts have on the sensitivity of LIGO to a variety of astrophysical sources.

1,266 citations


Journal ArticleDOI
TL;DR: In this article, a comprehensive review of various printing technologies, commonly used substrates, and electronic materials is presented, with solution/dry and contact/noncontact printing technologies assessed on the basis of technological, materials, and process-related developments in the field.
Abstract: Printing sensors and electronics over flexible substrates is an area of significant interest due to low-cost fabrication and the possibility of obtaining multifunctional electronics over large areas. Over the years, a number of printing technologies have been developed to pattern a wide range of electronic materials on diverse substrates. As further expansion of printed technologies is expected in the future for sensors and electronics, it is opportune to review the common features, the complementarities, and the challenges associated with various printing technologies. This paper presents a comprehensive review of various printing technologies, commonly used substrates, and electronic materials. Various solution/dry printing and contact/noncontact printing technologies have been assessed on the basis of technological, materials, and process-related developments in the field. Critical challenges in various printing techniques and potential research directions have been highlighted. Possibilities of merging various printing methodologies have been explored to extend lab-developed standalone systems to high-speed roll-to-roll production lines for system-level integration.

951 citations


Journal ArticleDOI
30 Sep 2015
TL;DR: This position paper argues that a new shift is necessary in computing, taking the control of computing applications, data, and services away from central nodes to the other logical extreme of the Internet, the edge; the authors refer to this vision of human-centered, edge-device-based computing as Edge-centric Computing.
Abstract: In many aspects of human activity, there has been a continuous struggle between the forces of centralization and decentralization. Computing exhibits the same phenomenon; we have gone from mainframes to PCs and local networks in the past, and over the last decade we have seen a centralization and consolidation of services and applications in data centers and clouds. We position that a new shift is necessary. Technological advances such as powerful dedicated connection boxes deployed in most homes, high-capacity mobile end-user devices and powerful wireless networks, along with growing user concerns about trust, privacy, and autonomy, require taking the control of computing applications, data, and services away from some central nodes (the "core") to the other logical extreme (the "edge") of the Internet. We also position that this development can help blur the boundary between man and machine, and embrace social computing in which humans are part of the computation and decision-making loop, resulting in a human-centered system design. We refer to this vision of human-centered, edge-device-based computing as Edge-centric Computing. We elaborate in this position paper on this vision and present the research challenges associated with its implementation.

844 citations


Journal ArticleDOI
Damian Smedley1, Syed Haider2, Steffen Durinck3, Luca Pandini4, Paolo Provero4, Paolo Provero5, James E. Allen6, Olivier Arnaiz7, Mohammad Awedh8, Richard Baldock9, Giulia Barbiera4, Philippe Bardou10, Tim Beck11, Andrew Blake, Merideth Bonierbale12, Anthony J. Brookes11, Gabriele Bucci4, Iwan Buetti4, Sarah W. Burge6, Cédric Cabau10, Joseph W. Carlson13, Claude Chelala14, Charalambos Chrysostomou11, Davide Cittaro4, Olivier Collin15, Raul Cordova12, Rosalind J. Cutts14, Erik Dassi16, Alex Di Genova17, Anis Djari10, Anthony Esposito18, Heather Estrella18, Eduardo Eyras19, Eduardo Eyras20, Julio Fernandez-Banet18, Simon A. Forbes1, Robert C. Free11, Takatomo Fujisawa, Emanuela Gadaleta14, Jose Manuel Garcia-Manteiga4, David Goodstein13, Kristian Gray6, José Afonso Guerra-Assunção14, Bernard Haggarty9, Dong Jin Han21, Byung Woo Han21, Todd W. Harris22, Jayson Harshbarger, Robert K. Hastings11, Richard D. Hayes13, Claire Hoede10, Shen Hu23, Zhi-Liang Hu24, Lucie N. Hutchins, Zhengyan Kan18, Hideya Kawaji, Aminah Keliet10, Arnaud Kerhornou6, Sunghoon Kim21, Rhoda Kinsella6, Christophe Klopp10, Lei Kong25, Daniel Lawson6, Dejan Lazarevic4, Ji Hyun Lee21, Thomas Letellier10, Chuan-Yun Li25, Pietro Liò26, Chu Jun Liu25, Jie Luo6, Alejandro Maass17, Jérôme Mariette10, Thomas Maurel6, Stefania Merella4, Azza M. Mohamed8, François Moreews10, Ibounyamine Nabihoudine10, Nelson Ndegwa27, Céline Noirot10, Cristian Perez-Llamas19, Michael Primig28, Alessandro Quattrone16, Hadi Quesneville10, Davide Rambaldi4, James M. Reecy24, Michela Riba4, Steven Rosanoff6, Amna A. Saddiq8, Elisa Salas12, Olivier Sallou15, Rebecca Shepherd1, Reinhard Simon12, Linda Sperling7, William Spooner29, Daniel M. Staines6, Delphine Steinbach10, Kevin R. Stone, Elia Stupka4, Jon W. Teague1, Abu Z. Dayem Ullah14, Jun Wang25, Doreen Ware29, Marie Wong-Erasmus, Ken Youens-Clark29, Amonida Zadissa6, Shi Jian Zhang25, Arek Kasprzyk4, Arek Kasprzyk8 
TL;DR: The latest version of the BioMart Community Portal comes with many new databases created by the ever-growing community, along with better support and extensibility for data analysis and visualization tools.
Abstract: The BioMart Community Portal (www.biomart.org) is a community-driven effort to provide a unified interface to biomedical databases that are distributed worldwide. The portal provides access to numerous database projects supported by 30 scientific organizations. It includes over 800 different biological datasets spanning genomics, proteomics, model organisms, cancer data, ontology information and more. All resources available through the portal are independently administered and funded by their host organizations. The BioMart data federation technology provides a unified interface to all the available data. The latest version of the portal comes with many new databases that have been created by our ever-growing community. It also comes with better support and extensibility for data analysis and visualization tools. A new addition to our toolbox, the enrichment analysis tool is now accessible through graphical and web service interface. The BioMart community portal averages over one million requests per day. Building on this level of service and the wealth of information that has become available, the BioMart Community Portal has introduced a new, more scalable and cheaper alternative to the large data stores maintained by specialized organizations.

664 citations


Journal ArticleDOI
TL;DR: This review shows how to build the atomic nucleus from the ground up, with examples including the structure of light nuclei, the electroweak response of nuclei relevant in electron and neutrino scattering, and the properties of dense nucleonic matter.
Abstract: Quantum Monte Carlo techniques aim at providing a description of complex quantum systems such as nuclei and nucleonic matter from first principles, i.e., realistic nuclear interactions and currents. The methods are similar to those used for many-electron systems in quantum chemistry and condensed matter physics, but are extended to include spin-isospin, tensor, spin-orbit, and three-body interactions. This review shows how to build the atomic nucleus from the ground up. Examples include the structure of light nuclei, electroweak response of nuclei relevant in electron and neutrino scattering, and the properties of dense nucleonic matter.
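The flavor of these methods can be conveyed by variational Monte Carlo, the simplest member of the quantum Monte Carlo family. Below is a toy sketch for a single particle in a 1D harmonic oscillator (ħ = m = ω = 1) with a Gaussian trial wavefunction; this illustrates only the shared estimate-energy-as-a-sample-average idea, not the nuclear interactions or currents of the review:

```python
import math
import random

# Variational Monte Carlo for H = -1/2 d^2/dx^2 + 1/2 x^2 with trial
# wavefunction psi(x) = exp(-alpha x^2). The local energy is
#   E_L(x) = alpha + x^2 (1/2 - 2 alpha^2),
# so at alpha = 0.5 it is exactly 0.5, the true ground-state energy.

def local_energy(x: float, alpha: float) -> float:
    return alpha + x * x * (0.5 - 2.0 * alpha * alpha)

def vmc_energy(alpha: float, steps: int = 20000, seed: int = 0) -> float:
    rng = random.Random(seed)
    x, e_sum = 0.0, 0.0
    for _ in range(steps):
        # Metropolis move sampling |psi|^2 = exp(-2 alpha x^2)
        x_new = x + rng.uniform(-1.0, 1.0)
        if rng.random() < math.exp(-2.0 * alpha * (x_new**2 - x**2)):
            x = x_new
        e_sum += local_energy(x, alpha)
    return e_sum / steps

print(vmc_energy(0.5))  # -> 0.5 exactly: E_L is constant for this alpha
```

Minimizing the sampled energy over the variational parameter is the same logic the nuclear calculations apply, with vastly richer wavefunctions and interactions.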

602 citations


Journal ArticleDOI
18 Jun 2015-PeerJ
TL;DR: GraPhlAn (Graphical Phylogenetic Analysis), a computational tool that produces high-quality, compact visualizations of microbial genomes and metagenomes, is developed as an open-source command-driven tool in order to be easily integrated into complex, publication-quality bioinformatics pipelines.
Abstract: The increased availability of genomic and metagenomic data poses challenges at multiple analysis levels, including visualization of very large-scale microbial and microbial community data paired with rich metadata. We developed GraPhlAn (Graphical Phylogenetic Analysis), a computational tool that produces high-quality, compact visualizations of microbial genomes and metagenomes. This includes phylogenies spanning up to thousands of taxa, annotated with metadata ranging from microbial community abundances to microbial physiology or host and environmental phenotypes. GraPhlAn has been developed as an open-source command-driven tool in order to be easily integrated into complex, publication-quality bioinformatics pipelines. It can be executed either locally or through an online Galaxy web application. We present several examples including taxonomic and phylogenetic visualization of microbial communities, metabolic functions, and biomarker discovery that illustrate GraPhlAn's potential for modern microbial and community genomics.

Proceedings ArticleDOI
06 Oct 2015
TL;DR: This work proposes Appearance and Motion DeepNet (AMDN) which utilizes deep neural networks to automatically learn feature representations, and introduces a novel double fusion framework, combining both the benefits of traditional early fusion and late fusion strategies.
Abstract: We present a novel unsupervised deep learning framework for anomalous event detection in complex video scenes. While most existing works merely use hand-crafted appearance and motion features, we propose Appearance and Motion DeepNet (AMDN) which utilizes deep neural networks to automatically learn feature representations. To exploit the complementary information of both appearance and motion patterns, we introduce a novel double fusion framework, combining both the benefits of traditional early fusion and late fusion strategies. Specifically, stacked denoising autoencoders are proposed to separately learn both appearance and motion features as well as a joint representation (early fusion). Based on the learned representations, multiple one-class SVM models are used to predict the anomaly scores of each input, which are then integrated with a late fusion strategy for final anomaly detection. We evaluate the proposed method on two publicly available video surveillance datasets, showing competitive performance with respect to state of the art approaches.
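The late-fusion step reduces to a simple idea: each one-class model emits an anomaly score, and the final decision thresholds a combination of those scores. A toy sketch with invented scores, weights, and threshold (the paper uses one-class SVMs on learned deep features; none of that is reproduced here):

```python
# Toy late fusion of anomaly scores from several detectors (e.g. an
# appearance model, a motion model, and a joint model). The scores,
# weights, and threshold below are hypothetical.

def fuse_scores(scores, weights):
    """Weighted sum of per-model anomaly scores."""
    return sum(s * w for s, w in zip(scores, weights))

def is_anomalous(scores, weights, threshold=0.5):
    return fuse_scores(scores, weights) > threshold

# appearance, motion, joint-representation scores for one video patch
scores = [0.9, 0.7, 0.8]
weights = [0.3, 0.3, 0.4]
print(round(fuse_scores(scores, weights), 2))  # -> 0.8
print(is_anomalous(scores, weights))           # -> True
```

Early fusion, by contrast, happens before scoring, by learning a joint representation from the concatenated appearance and motion features.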

Journal ArticleDOI
M. Aguilar, D. Aisa1, Behcet Alpat, A. Alvino  +308 moreInstitutions (42)
TL;DR: The detailed variation with rigidity of the helium flux spectral index is presented for the first time; the spectral index progressively hardens at rigidities larger than 100 GV.
Abstract: Knowledge of the precise rigidity dependence of the helium flux is important in understanding the origin, acceleration, and propagation of cosmic rays. A precise measurement of the helium flux in primary cosmic rays with rigidity (momentum/charge) from 1.9 GV to 3 TV based on 50 million events is presented and compared to the proton flux. The detailed variation with rigidity of the helium flux spectral index is presented for the first time. The spectral index progressively hardens at rigidities larger than 100 GV. The rigidity dependence of the helium flux spectral index is similar to that of the proton spectral index though the magnitudes are different. Remarkably, the spectral index of the proton to helium flux ratio increases with rigidity up to 45 GV and then becomes constant; the flux ratio above 45 GV is well described by a single power law.
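For a flux following a power law Φ ∝ R^γ, the spectral index between two rigidities is γ = ln(Φ2/Φ1)/ln(R2/R1), and "hardening" means γ becomes less negative at high rigidity. A small sketch with invented flux values (illustrative numbers only, not AMS data):

```python
import math

# Spectral index of a power-law flux Phi ~ R^gamma, estimated from
# two (rigidity, flux) points. All numbers are hypothetical.

def spectral_index(r1, phi1, r2, phi2):
    return math.log(phi2 / phi1) / math.log(r2 / r1)

# Below ~100 GV: flux falls steeply (softer spectrum)
g_low = spectral_index(10.0, 1.0, 100.0, 10.0**-2.8)
# Above ~100 GV: flux falls less steeply (harder spectrum)
g_high = spectral_index(100.0, 1.0, 1000.0, 10.0**-2.6)

print(round(g_low, 2), round(g_high, 2))  # -> -2.8 -2.6
```

Here g_high > g_low, which is exactly what "the spectral index progressively hardens" describes.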

Journal ArticleDOI
Vardan Khachatryan1, Albert M. Sirunyan1, Armen Tumasyan1, Wolfgang Adam2  +2802 moreInstitutions (215)
04 Jun 2015-Nature
TL;DR: The first observation of the strange B meson (B_s^0) decaying into two oppositely charged muons (μ+ and μ−) is reported, together with evidence for the corresponding B^0 decay and measurements of both branching fractions.
Abstract: The standard model of particle physics describes the fundamental particles and their interactions via the strong, electromagnetic and weak forces. It provides precise predictions for measurable quantities that can be tested experimentally. The probabilities, or branching fractions, of the strange B meson (B_s^0) and the B^0 meson decaying into two oppositely charged muons (μ+ and μ−) are especially interesting because of their sensitivity to theories that extend the standard model. The standard model predicts that the B_s^0 → μ+μ− and B^0 → μ+μ− decays are very rare, with about four of the former occurring for every billion B_s^0 mesons produced, and one of the latter occurring for every ten billion B^0 mesons. A difference in the observed branching fractions with respect to the predictions of the standard model would provide a direction in which the standard model should be extended. Before the Large Hadron Collider (LHC) at CERN started operating, no evidence for either decay mode had been found. Upper limits on the branching fractions were an order of magnitude above the standard model predictions. The CMS (Compact Muon Solenoid) and LHCb (Large Hadron Collider beauty) collaborations have performed a joint analysis of the data from proton-proton collisions that they collected in 2011 at a centre-of-mass energy of seven teraelectronvolts and in 2012 at eight teraelectronvolts. Here we report the first observation of the B_s^0 → μ+μ− decay, with a statistical significance exceeding six standard deviations, and the best measurement so far of its branching fraction. Furthermore, we obtained evidence for the B^0 → μ+μ− decay with a statistical significance of three standard deviations. Both measurements are statistically compatible with standard model predictions and allow stringent constraints to be placed on theories beyond the standard model.
The LHC experiments will resume taking data in 2015, recording proton-proton collisions at a centre-of-mass energy of 13 teraelectronvolts, which will approximately double the production rates of B_s^0 and B^0 mesons and lead to further improvements in the precision of these crucial tests of the standard model.

Proceedings ArticleDOI
18 Mar 2015
TL;DR: This paper describes how RAISE has been collected and organized, discusses how digital image forensics and many other multimedia research areas may benefit from this new publicly available benchmark dataset, and tests a very recent forensic technique for JPEG compression detection.
Abstract: Digital forensics is a relatively new research area which aims at authenticating digital media by detecting possible digital forgeries. Indeed, the ever-increasing availability of multimedia data on the web, coupled with the great advances reached by computer graphical tools, makes the modification of an image and the creation of visually compelling forgeries an easy task for any user. This in turn creates the need for reliable tools to validate the trustworthiness of the represented information. In such a context, we present here RAISE, a large dataset of 8156 high-resolution raw images, depicting various subjects and scenarios, properly annotated and available together with accompanying metadata. Such a wide collection of untouched and diverse data is intended to become a powerful resource for, but not limited to, forensic researchers by providing a common benchmark for a fair comparison, testing and evaluation of existing and next-generation forensic algorithms. In this paper we describe how RAISE has been collected and organized, discuss how digital image forensics and many other multimedia research areas may benefit from this new publicly available benchmark dataset, and test a very recent forensic technique for JPEG compression detection.

Journal ArticleDOI
TL;DR: This review summarizes recent advancements in the interphase tailoring of fiber-reinforced polymer composites, along with future opportunities and challenges in engineering the fiber/matrix interphase.

Journal ArticleDOI
23 Jan 2015-Science
TL;DR: Images from the OSIRIS scientific imaging system onboard Rosetta show that the nucleus of 67P/Churyumov-Gerasimenko consists of two lobes connected by a short neck, which raises the question of whether the two lobes represent a contact binary formed 4.5 billion years ago, or a single body where a gap has evolved via mass loss.
Abstract: Images from the OSIRIS scientific imaging system onboard Rosetta show that the nucleus of 67P/Churyumov-Gerasimenko consists of two lobes connected by a short neck. The nucleus has a bulk density less than half that of water. Activity at a distance from the Sun of >3 astronomical units is predominantly from the neck, where jets have been seen consistently. The nucleus rotates about the principal axis of momentum. The surface morphology suggests that the removal of larger volumes of material, possibly via explosive release of subsurface pressure or via creation of overhangs by sublimation, may be a major mass loss process. The shape raises the question of whether the two lobes represent a contact binary formed 4.5 billion years ago, or a single body where a gap has evolved via mass loss.

Journal ArticleDOI
TL;DR: A study of the spin-parity and tensor structure of the interactions of the recently discovered Higgs boson is performed using the H → ZZ, Zγ*, γ*γ* → 4ℓ, H → WW → ℓνℓν, and H → γγ decay modes.
Abstract: The study of the spin-parity and tensor structure of the interactions of the recently discovered Higgs boson is performed using the H → ZZ, Zγ*, γ*γ* → 4ℓ, H → WW → ℓνℓν, and H → γγ decay modes. The full dataset recorded by the CMS experiment during the LHC Run 1 is used, corresponding to an integrated luminosity of up to 5.1 inverse femtobarns at a center-of-mass energy of 7 TeV and up to 19.7 inverse femtobarns at 8 TeV. A wide range of spin-two models is excluded at a 99% confidence level or higher, or at a 99.87% confidence level for the minimal gravity-like couplings, regardless of whether assumptions are made on the production mechanism. Any mixed-parity spin-one state is excluded in the ZZ and WW modes at a greater than 99.999% confidence level. Under the hypothesis that the resonance is a spin-zero boson, the tensor structure of the interactions of the Higgs boson with two vector bosons ZZ, Zγ, γγ, and WW is investigated and limits on eleven anomalous contributions are set. Tighter constraints on anomalous HVV interactions are obtained by combining the HZZ and HWW measurements. All observations are consistent with the expectations for the standard model Higgs boson with quantum numbers J^PC = 0^++.

Journal ArticleDOI
08 Oct 2015-Nature
TL;DR: This work identifies the host transmembrane protein SERINC5, and to a lesser extent SERINC3, as a potent inhibitor of HIV-1 particle infectivity that is counteracted by Nef.
Abstract: HIV-1 Nef, a protein important for the development of AIDS, has well-characterized effects on host membrane trafficking and receptor downregulation. By an unidentified mechanism, Nef increases the intrinsic infectivity of HIV-1 virions in a host-cell-dependent manner. Here we identify the host transmembrane protein SERINC5, and to a lesser extent SERINC3, as a potent inhibitor of HIV-1 particle infectivity that is counteracted by Nef. SERINC5 localizes to the plasma membrane, where it is efficiently incorporated into budding HIV-1 virions and impairs subsequent virion penetration of susceptible target cells. Nef redirects SERINC5 to a Rab7-positive endosomal compartment and thereby excludes it from HIV-1 particles. The ability to counteract SERINC5 was conserved in Nef encoded by diverse primate immunodeficiency viruses, as well as in the structurally unrelated glycosylated Gag from murine leukaemia virus. These examples of functional conservation and convergent evolution emphasize the fundamental importance of SERINC5 as a potent anti-retroviral factor.

Journal ArticleDOI
TL;DR: Tumor DNA samples from the blood of 97 patients with castration-resistant prostate cancer were analyzed; androgen receptor amplifications were present from the beginning and correlated with abiraterone resistance, suggesting that detection of these amplifications should be useful for identifying abiraterone-resistant cancers before starting treatment.
Abstract: Androgen receptor (AR) gene aberrations are rare in prostate cancer before primary hormone treatment but emerge with castration resistance. To determine AR gene status using a minimally invasive assay that could have broad clinical utility, we developed a targeted next-generation sequencing approach amenable to plasma DNA, covering all AR coding bases and genomic regions that are highly informative in prostate cancer. We sequenced 274 plasma samples from 97 castration-resistant prostate cancer patients treated with abiraterone at two institutions. We controlled for normal DNA in patients' circulation and detected a sufficiently high tumor DNA fraction to quantify AR copy number state in 217 samples (80 patients). Detection of AR copy number gain and point mutations in plasma were inversely correlated, supported further by the enrichment of nonsynonymous versus synonymous mutations in AR copy number normal as opposed to AR gain samples. Whereas AR copy number was unchanged from before treatment to progression and no mutant AR alleles showed signal for acquired gain, we observed emergence of T878A or L702H AR amino acid changes in 13% of tumors at progression on abiraterone. Patients with AR gain or T878A or L702H before abiraterone (45%) were 4.9 and 7.8 times less likely to have a ≥50% or ≥90% decline in prostate-specific antigen (PSA), respectively, and had significantly worse overall [hazard ratio (HR), 7.33; 95% confidence interval (CI), 3.51 to 15.34; P = 1.3 × 10^−9] and progression-free (HR, 3.73; 95% CI, 2.17 to 6.41; P = 5.6 × 10^−7) survival. Evaluation of plasma AR by next-generation sequencing could identify cancers with primary resistance to abiraterone.

Journal ArticleDOI
Alessandra Rotundi1, Alessandra Rotundi2, Holger Sierks3, Vincenzo Della Corte1, Marco Fulle1, Pedro J. Gutiérrez4, Luisa Lara4, Cesare Barbieri, Philippe Lamy5, Rafael Rodrigo4, Rafael Rodrigo6, Detlef Koschny7, Hans Rickman8, Hans Rickman9, H. U. Keller10, José Juan López-Moreno4, Mario Accolla1, Mario Accolla2, Jessica Agarwal3, Michael F. A'Hearn11, Nicolas Altobelli7, Francesco Angrilli12, M. Antonietta Barucci13, Jean-Loup Bertaux14, Ivano Bertini12, Dennis Bodewits11, E. Bussoletti2, Luigi Colangeli15, M. Cosi16, Gabriele Cremonese1, Jean-François Crifo14, Vania Da Deppo, Björn Davidsson9, Stefano Debei12, Mariolino De Cecco17, Francesca Esposito1, M. Ferrari2, M. Ferrari1, Sonia Fornasier13, F. Giovane18, Bo Å. S. Gustafson19, Simon F. Green20, Olivier Groussin5, Eberhard Grün3, Carsten Güttler3, M. Herranz4, Stubbe F. Hviid21, Wing Ip22, Stavro Ivanovski1, José M. Jerónimo4, Laurent Jorda5, J. Knollenberg21, R. Kramm3, Ekkehard Kührt21, Michael Küppers7, Monica Lazzarin, Mark Leese20, Antonio C. López-Jiménez4, F. Lucarelli2, Stephen C. Lowry23, Francesco Marzari12, Elena Mazzotta Epifani1, J. Anthony M. McDonnell20, J. Anthony M. McDonnell23, Vito Mennella1, Harald Michalik, A. Molina24, R. Morales4, Fernando Moreno4, Stefano Mottola21, Giampiero Naletto, Nilda Oklay3, Jose Luis Ortiz4, Ernesto Palomba1, Pasquale Palumbo1, Pasquale Palumbo2, Jean-Marie Perrin14, Jean-Marie Perrin25, J. E. Rodriguez4, L. Sabau26, Colin Snodgrass20, Colin Snodgrass3, Roberto Sordini1, Nicolas Thomas27, Cecilia Tubiana3, Jean-Baptiste Vincent3, Paul R. Weissman28, K. P. Wenzel7, Vladimir Zakharov13, John C. Zarnecki20, John C. Zarnecki6 
23 Jan 2015-Science
TL;DR: In this article, the GIADA (Grain Impact Analyser and Dust Accumulator) experiment on the European Space Agency's Rosetta spacecraft orbiting comet 67P/Churyumov-Gerasimenko was used to detect 35 outflowing grains of mass 10⁻¹⁰ to 10⁻⁷ kilograms.
Abstract: Critical measurements for understanding accretion and the dust/gas ratio in the solar nebula, where planets were forming 4.5 billion years ago, are being obtained by the GIADA (Grain Impact Analyser and Dust Accumulator) experiment on the European Space Agency’s Rosetta spacecraft orbiting comet 67P/Churyumov-Gerasimenko. Between 3.6 and 3.4 astronomical units inbound, GIADA and OSIRIS (Optical, Spectroscopic, and Infrared Remote Imaging System) detected 35 outflowing grains of mass 10⁻¹⁰ to 10⁻⁷ kilograms, and 48 grains of mass 10⁻⁵ to 10⁻² kilograms, respectively. Combined with gas data from the MIRO (Microwave Instrument for the Rosetta Orbiter) and ROSINA (Rosetta Orbiter Spectrometer for Ion and Neutral Analysis) instruments, we find a dust/gas mass ratio of 4 ± 2 averaged over the sunlit nucleus surface. A cloud of larger grains also encircles the nucleus in bound orbits from the previous perihelion. The largest orbiting clumps are meter-sized, confirming the dust/gas ratio of 3 inferred at perihelion from models of dust comae and trails.
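A toy mass-budget computation shows why the large OSIRIS-detected grains dominate the dust side of such a ratio: when grain counts fall off slowly with mass, the total mass is carried by the heaviest decade. The per-decade counts below are illustrative placeholders, not the published detections.

```python
# Per-decade grain counts (illustrative placeholders): GIADA-like small
# grains (1e-10 to 1e-7 kg) and OSIRIS-like large grains (1e-5 to 1e-2 kg).
grain_counts = {1e-10: 12, 1e-9: 10, 1e-8: 8, 1e-7: 5,
                1e-5: 20, 1e-4: 15, 1e-3: 8, 1e-2: 5}

total_dust_mass = sum(m * n for m, n in grain_counts.items())
heaviest_share = (1e-2 * 5) / total_dust_mass  # mass carried by the 1e-2 kg grains

print(f"total dust mass: {total_dust_mass:.2e} kg")
print(f"share in the heaviest decade: {heaviest_share:.0%}")
```

With these numbers the heaviest decade alone carries over 80% of the dust mass, which is why constraining the large-grain population matters for the dust/gas ratio.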

Journal ArticleDOI
TL;DR: Experimental results with three Radarsat-2 images in quad polarization mode indicate that classification accuracies could be significantly increased by integrating spatial and polarimetric features using ensemble learning strategies.
Abstract: Fully polarimetric synthetic aperture radar (PolSAR) offers all-weather, day-and-night observation and high-resolution imaging. The collected data are usually stored as Sinclair, coherence, or covariance matrices, which are directly related to the physical properties of natural media and to the backscattering mechanism. Additional information on the nature of the scattering medium can be exploited through polarimetric decomposition theorems. Accordingly, PolSAR image classification has attracted increasing attention from the remote sensing community in recent years. However, these polarimetric measurements and parameters cannot provide sufficient information for accurate PolSAR image classification in some scenarios, e.g., in complex urban areas where different scattering media may exhibit similar PolSAR responses for several unavoidable reasons. Inspired by the complementarity between spectral and spatial features, which brings remarkable improvements in optical image classification, complementary information between polarimetric and spatial features may also contribute to PolSAR image classification. Therefore, the roles of textural features such as contrast, dissimilarity, homogeneity, local range, and morphological profiles (MPs) in PolSAR image classification are investigated using two advanced ensemble learning (EL) classifiers: Random Forest and Rotation Forest. The supervised Wishart classifier and support vector machines (SVMs) are used as benchmark classifiers for evaluation and comparison. Experimental results with three Radarsat-2 images in quad polarization mode indicate that classification accuracies can be significantly increased by integrating spatial and polarimetric features using ensemble learning strategies. Rotation Forest achieves better accuracy than SVM and Random Forest, while Random Forest is much faster than Rotation Forest.
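As a minimal sketch of the textural features named above, the following pure-NumPy code computes contrast, dissimilarity, and homogeneity from a gray-level co-occurrence matrix (GLCM). The helper names (`glcm`, `texture_features`) and the toy patches are illustrative assumptions, not the paper's Radarsat-2 pipeline.

```python
import numpy as np

def glcm(img, levels, dx=1, dy=0):
    """Gray-level co-occurrence matrix for one pixel offset (dx, dy)."""
    h, w = img.shape
    P = np.zeros((levels, levels))
    for y in range(h - dy):
        for x in range(w - dx):
            P[img[y, x], img[y + dy, x + dx]] += 1
    return P / P.sum()  # normalize to joint probabilities

def texture_features(P):
    """Contrast, dissimilarity, and homogeneity from a normalized GLCM."""
    i, j = np.indices(P.shape)
    contrast = (P * (i - j) ** 2).sum()
    dissimilarity = (P * np.abs(i - j)).sum()
    homogeneity = (P / (1.0 + (i - j) ** 2)).sum()
    return contrast, dissimilarity, homogeneity

# A flat patch yields zero contrast; a checkerboard patch is maximally rough.
flat = np.zeros((8, 8), dtype=int)
checker = np.indices((8, 8)).sum(axis=0) % 2
for name, patch in [("flat", flat), ("checker", checker)]:
    c, d, h = texture_features(glcm(patch, levels=2))
    print(f"{name}: contrast={c:.2f} dissimilarity={d:.2f} homogeneity={h:.2f}")
```

Feature vectors like these, stacked alongside the polarimetric channels, are what an ensemble classifier such as Random Forest would consume per pixel.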

Journal ArticleDOI
TL;DR: A review of the state-of-the-art and most recent advances of compressive sensing and related methods as applied to electromagnetics can be found in this article, where a wide set of applicative scenarios comprising the diagnosis and synthesis of antenna arrays, the estimation of directions of arrival, and the solution of inverse scattering and radar imaging problems are reviewed.
Abstract: Several problems arising in electromagnetics can be directly formulated or suitably recast for an effective solution within the compressive sensing (CS) framework. This has motivated a great interest in developing and applying CS methodologies to several conventional and innovative electromagnetic scenarios. This work is aimed at presenting, to the best of the authors’ knowledge, a review of the state-of-the-art and most recent advances of CS formulations and related methods as applied to electromagnetics. Toward this end, a wide set of applicative scenarios comprising the diagnosis and synthesis of antenna arrays, the estimation of directions of arrival, and the solution of inverse scattering and radar imaging problems are reviewed. Current challenges and trends in the application of CS to the solution of traditional and new electromagnetic problems are also discussed.
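A canonical recovery routine of the kind CS formulations build on is Orthogonal Matching Pursuit. The sketch below (random Gaussian sensing matrix, hand-picked sparse support) is a generic illustration of sparse recovery from few measurements, not any specific electromagnetic formulation from the review.

```python
import numpy as np

def omp(A, y, sparsity):
    """Orthogonal Matching Pursuit: recover a sparse x with y ~= A @ x."""
    residual, support = y.copy(), []
    for _ in range(sparsity):
        # Greedily pick the dictionary atom most correlated with the residual.
        support.append(int(np.argmax(np.abs(A.T @ residual))))
        # Least-squares refit on the chosen atoms, then update the residual.
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x = np.zeros(A.shape[1])
    x[support] = coef
    return x

rng = np.random.default_rng(0)
n, m = 40, 100                                  # measurements << unknowns
A = rng.standard_normal((n, m)) / np.sqrt(n)    # random sensing matrix
x_true = np.zeros(m)
x_true[[7, 42, 88]] = [1.5, -2.0, 1.0]          # 3-sparse ground truth
y = A @ x_true                                  # noiseless measurements

x_hat = omp(A, y, sparsity=3)
print("recovery error:", np.linalg.norm(x_hat - x_true))
```

With 40 measurements of a 3-sparse 100-dimensional signal, the greedy selection recovers the support exactly in the noiseless case, mirroring the underdetermined-but-sparse setting the reviewed electromagnetic problems are recast into.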

Journal ArticleDOI
23 Jan 2015-Science
TL;DR: Images of comet 67P/Churyumov-Gerasimenko acquired by the OSIRIS imaging system onboard the European Space Agency’s Rosetta spacecraft offer some support for subsurface fluidization models and mass loss through the ejection of large chunks of material.
Abstract: Images of comet 67P/Churyumov-Gerasimenko acquired by the OSIRIS (Optical, Spectroscopic and Infrared Remote Imaging System) imaging system onboard the European Space Agency’s Rosetta spacecraft at scales of better than 0.8 meter per pixel show a wide variety of different structures and textures. The data show the importance of airfall, surface dust transport, mass wasting, and insolation weathering for cometary surface evolution, and they offer some support for subsurface fluidization models and mass loss through the ejection of large chunks of material.

Journal ArticleDOI
TL;DR: In this article, the authors argue that current levels of youth unemployment need to be understood in the context of increased labor market flexibility, an expansion of higher education, youth migration, and family legacies of long-term unemployment.
Abstract: Current levels of youth unemployment need to be understood in the context of increased labor market flexibility, an expansion of higher education, youth migration, and family legacies of long-term unemployment. Compared with previous recessions, Europe-wide policies and investments aimed at supporting national policies have increased significantly. By mapping these developments and debates, we illustrate the different factors shaping the future of European labor markets. We argue that understanding youth unemployment requires a holistic approach that combines an analysis of changes in the economic sphere around labor market flexibility, skills attainment, and employer demand with an understanding of the impact of family legacies on the increasingly polarized trajectories of young people today. The success of EU policy initiatives and investments will depend on the ability of national actors to implement them effectively.

Journal ArticleDOI
TL;DR: The first direct search for lepton-flavour-violating decays of the recently discovered Higgs boson (H) is described in this paper; the search is performed in the H→μτ_e and H→μτ_h channels, where τ_e and τ_h are tau leptons reconstructed in the electronic and hadronic decay channels, respectively.

Journal ArticleDOI
TL;DR: This article establishes a consolidated analysis framework that advances the fundamental understanding of Web service composition building blocks in terms of concepts, models, languages, productivity support techniques, and tools and reviews the state of the art in service composition from an unprecedented, holistic perspective.
Abstract: Web services are a consolidated reality of the modern Web, with tremendous and increasing impact on everyday computing tasks. They have turned the Web into the largest, most accepted, and most vivid distributed computing platform ever. Yet the use and integration of Web services into composite services or applications, which is a delicate and conceptually nontrivial task, has still not unleashed its full potential. A consolidated analysis framework that advances the fundamental understanding of Web service composition building blocks in terms of concepts, models, languages, productivity support techniques, and tools is required. Such a framework is necessary to enable effective exploration, understanding, assessment, comparison, and selection of service composition models, languages, techniques, platforms, and tools. This article establishes such a framework and reviews the state of the art in service composition from an unprecedented, holistic perspective.
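The core idea of service composition, wiring individual services so that the output of one feeds the input of the next, can be sketched in a few lines. The "services" below are plain hypothetical functions standing in for real Web service invocations; production composition languages (e.g. BPEL) add parallel flows, compensation, and fault handling on top of this chaining.

```python
from functools import reduce

def compose(*services):
    """Chain services: the output message of one feeds the next."""
    return lambda msg: reduce(lambda m, svc: svc(m), services, msg)

# Hypothetical component services in a travel-booking composite.
geocode = lambda city: {"city": city, "code": city[:3].upper()}
find_flights = lambda loc: {**loc, "flights": [f"{loc['code']}-001"]}
book_cheapest = lambda res: f"booked {res['flights'][0]} to {res['city']}"

travel_booking = compose(geocode, find_flights, book_cheapest)
print(travel_booking("Trento"))   # -> booked TRE-001 to Trento
```

The composite behaves as a single service with the first service's input type and the last service's output type, which is exactly the abstraction the surveyed composition models formalize.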

Journal ArticleDOI
TL;DR: The majority of advanced, treatment-resistant tumors across tumor types harbor biologically informative alterations, and the establishment of a clinical trial for WES of metastatic tumors with prospective follow-up of patients can help identify candidate predictive biomarkers of response.
Abstract: Importance Understanding molecular mechanisms of response and resistance to anticancer therapies requires prospective patient follow-up and clinical and functional validation of both common and low-frequency mutations. We describe a whole-exome sequencing (WES) precision medicine trial focused on patients with advanced cancer. Objective To understand how WES data affect therapeutic decision making in patients with advanced cancer and to identify novel biomarkers of response. Design, Setting, and Patients Patients with metastatic and treatment-resistant cancer were prospectively enrolled at a single academic center for paired metastatic tumor and normal tissue WES during a 19-month period (February 2013 through September 2014). A comprehensive computational pipeline was used to detect point mutations, indels, and copy number alterations. Mutations were categorized as category 1, 2, or 3 on the basis of actionability; clinical reports were generated and discussed in precision tumor board. Patients were observed for 7 to 25 months for correlation of molecular information with clinical response. Main Outcomes and Measures Feasibility, use of WES for decision making, and identification of novel biomarkers. Results A total of 154 tumor-normal pairs from 97 patients with a range of metastatic cancers were sequenced, with a mean coverage of 95X and an average of 16 somatic alterations detected per patient. In total, 16 mutations were category 1 (targeted therapy available), 98 were category 2 (biologically relevant), and 1474 were category 3 (unknown significance). Overall, WES provided informative results in 91 cases (94%), including alterations for which there is an approved drug, there are therapies in clinical or preclinical development, or they are considered drivers and potentially actionable (category 1-2); however, treatment was guided in only 5 patients (5%) on the basis of these recommendations because of access to clinical trials and/or off-label use of drugs. 
Among unexpected findings, a patient with prostate cancer with an exceptional response to treatment was identified who harbored a somatic hemizygous deletion of the DNA repair gene FANCA and putative partial loss of function of the second allele through a germline missense variant. Follow-up experiments established that loss of FANCA function was associated with platinum hypersensitivity both in vitro and in patient-derived xenografts, thus providing biologic rationale and functional evidence for his extreme clinical response. Conclusions and Relevance The majority of advanced, treatment-resistant tumors across tumor types harbor biologically informative alterations. The establishment of a clinical trial for WES of metastatic tumors with prospective follow-up of patients can help identify candidate predictive biomarkers of response.
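The category-1/2/3 actionability triage described above can be sketched as a simple lookup. The gene sets here are illustrative placeholders, not the trial's actual actionability knowledge base.

```python
# Sketch of the three-tier actionability triage; gene lists are
# illustrative placeholders, not a clinical knowledge base.
TARGETABLE = {"BRAF", "EGFR", "ALK"}              # approved targeted drug
BIOLOGICALLY_RELEVANT = {"TP53", "FANCA", "PTEN"} # known cancer biology

def categorize(gene):
    """Map a somatically altered gene to category 1, 2, or 3."""
    if gene in TARGETABLE:
        return 1          # category 1: targeted therapy available
    if gene in BIOLOGICALLY_RELEVANT:
        return 2          # category 2: biologically relevant
    return 3              # category 3: unknown significance

mutations = ["BRAF", "FANCA", "ZZZ3", "TP53"]
report = {g: categorize(g) for g in mutations}
print(report)
```

In the trial, reports built from such a categorization were then discussed at the precision tumor board, with category 1-2 findings driving therapeutic recommendations.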

Journal ArticleDOI
TL;DR: DECAF is presented, a detailed analysis of the correlations between participants' self-assessments and their physiological responses and single-trial classification results for valence, arousal and dominance are presented, with performance evaluation against existing data sets.
Abstract: In this work, we present DECAF, a multimodal data set for decoding user physiological responses to affective multimedia content. Different from data sets such as DEAP [15] and MAHNOB-HCI [31], DECAF contains (1) brain signals acquired using the magnetoencephalogram (MEG) sensor, which requires little physical contact with the user’s scalp and consequently facilitates naturalistic affective responses, and (2) explicit and implicit emotional responses of 30 participants to 40 one-minute music video segments used in [15] and 36 movie clips, thereby enabling comparisons between the EEG and MEG modalities as well as between movie and music stimuli for affect recognition. In addition to MEG data, DECAF comprises synchronously recorded near-infrared (NIR) facial videos, horizontal electrooculogram (hEOG), electrocardiogram (ECG), and trapezius electromyogram (tEMG) peripheral physiological responses. To demonstrate DECAF’s utility, we present (i) a detailed analysis of the correlations between participants’ self-assessments and their physiological responses and (ii) single-trial classification results for valence, arousal, and dominance, with performance evaluation against existing data sets. DECAF also contains time-continuous emotion annotations for movie clips from seven users, which we use to demonstrate dynamic emotion prediction.
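A minimal single-trial classification sketch in the spirit of the valence/arousal experiments: a nearest-class-mean classifier on synthetic feature vectors standing in for the MEG and peripheral descriptors. All numbers are simulated, not DECAF data, and the classifier is a deliberately simple stand-in for the methods used in the paper.

```python
import numpy as np

# Toy single-trial classification: nearest-class-mean on synthetic
# "physiological feature" vectors (simulated, not DECAF data).
rng = np.random.default_rng(1)
n_per_class, dim = 40, 6
low = rng.normal(0.0, 1.0, (n_per_class, dim))    # low-valence trials
high = rng.normal(1.5, 1.0, (n_per_class, dim))   # high-valence trials

X = np.vstack([low, high])
y = np.array([0] * n_per_class + [1] * n_per_class)

means = np.stack([X[y == c].mean(axis=0) for c in (0, 1)])

def predict(trials):
    """Assign each trial to the class with the nearest mean vector."""
    d = np.linalg.norm(trials[:, None, :] - means[None, :, :], axis=2)
    return d.argmin(axis=1)

accuracy = (predict(X) == y).mean()
print(f"training accuracy: {accuracy:.2f}")
```

Real single-trial affect recognition would extract features from the MEG, ECG, hEOG, and tEMG channels and evaluate with cross-validation rather than training-set accuracy.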

Journal ArticleDOI
TL;DR: A new approach, Hibernus, is proposed, which enables computation to be sustained during intermittent supply by reactively hibernating: saving system state only once, when power is about to be lost, and then sleeping until the supply recovers.
Abstract: A key challenge for the future of energy-harvesting systems is the discontinuous power supply that is often generated. We propose a new approach, Hibernus, which enables computation to be sustained during intermittent supply. The approach has low energy and time overheads, achieved by reactively hibernating: saving system state only once, when power is about to be lost, and then sleeping until the supply recovers. We validate the approach experimentally on a processor with FRAM nonvolatile memory, allowing it to reactively hibernate using only the energy stored in its decoupling capacitance. Compared to a recently proposed technique, the approach reduces processor time and energy overheads by 76%–100% and 49%–79%, respectively.
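The reactive-hibernation policy can be sketched behaviorally: save state once when the supply voltage crosses a low threshold, sleep, and restore when it recovers. The threshold values and voltage trace below are hypothetical; the real system uses hardware voltage monitoring and FRAM writes rather than this Python model.

```python
# Behavioral sketch of reactive hibernation (assumed thresholds).
V_HIBERNATE, V_RESTORE = 2.2, 2.8   # hypothetical trigger voltages

class Hibernus:
    def __init__(self):
        self.saved_state = None
        self.hibernating = False
        self.saves = 0

    def tick(self, voltage, state):
        """One supply-monitoring step; returns the active state (or None while asleep)."""
        if not self.hibernating and voltage < V_HIBERNATE:
            self.saved_state = state      # single snapshot to nonvolatile memory
            self.saves += 1
            self.hibernating = True       # then sleep until the supply recovers
        elif self.hibernating and voltage > V_RESTORE:
            self.hibernating = False      # supply recovered: restore and resume
            return self.saved_state
        return state if not self.hibernating else None

mcu = Hibernus()
trace = [3.0, 2.5, 2.1, 1.8, 2.0, 3.0]   # supply dips, then recovers
state = 42
for v in trace:
    out = mcu.tick(v, state)
print("snapshots written:", mcu.saves)    # state saved only once per outage
```

The single snapshot per outage is what distinguishes this reactive scheme from periodic-checkpointing approaches, which write state many times regardless of whether power is actually lost.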