
Showing papers by "University of Perugia published in 2016"


Journal ArticleDOI
Daniel J. Klionsky1, Kotb Abdelmohsen2, Akihisa Abe3, Joynal Abedin4  +2519 more · Institutions (695)
TL;DR: In this paper, the authors present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes.
Abstract: In 2008 we published the first set of guidelines for standardizing research in autophagy. Since then, research on this topic has continued to accelerate, and many new scientists have entered the field. Our knowledge base and relevant new technologies have also been expanding. Accordingly, it is important to update these guidelines for monitoring autophagy in different organisms. Various reviews have described the range of assays that have been used for this purpose. Nevertheless, there continues to be confusion regarding acceptable methods to measure autophagy, especially in multicellular eukaryotes. For example, a key point that needs to be emphasized is that there is a difference between measurements that monitor the numbers or volume of autophagic elements (e.g., autophagosomes or autolysosomes) at any stage of the autophagic process versus those that measure flux through the autophagy pathway (i.e., the complete process including the amount and rate of cargo sequestered and degraded). In particular, a block in macroautophagy that results in autophagosome accumulation must be differentiated from stimuli that increase autophagic activity, defined as increased autophagy induction coupled with increased delivery to, and degradation within, lysosomes (in most higher eukaryotes and some protists such as Dictyostelium) or the vacuole (in plants and fungi). In other words, it is especially important that investigators new to the field understand that the appearance of more autophagosomes does not necessarily equate with more autophagy. In fact, in many cases, autophagosomes accumulate because of a block in trafficking to lysosomes without a concomitant change in autophagosome biogenesis, whereas an increase in autolysosomes may reflect a reduction in degradative activity. It is worth emphasizing here that lysosomal digestion is a stage of autophagy and evaluating its competence is a crucial part of the evaluation of autophagic flux, or complete autophagy. 
Here, we present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes. These guidelines are not meant to be a formulaic set of rules, because the appropriate assays depend in part on the question being asked and the system being used. In addition, we emphasize that no individual assay is guaranteed to be the most appropriate one in every situation, and we strongly recommend the use of multiple assays to monitor autophagy. Along these lines, because of the potential for pleiotropic effects due to blocking autophagy through genetic manipulation, it is imperative to target by gene knockout or RNA interference more than one autophagy-related protein. In addition, some individual Atg proteins, or groups of proteins, are involved in other cellular pathways implying that not all Atg proteins can be used as a specific marker for an autophagic process. In these guidelines, we consider these various methods of assessing autophagy and what information can, or cannot, be obtained from them. Finally, by discussing the merits and limits of particular assays, we hope to encourage technical innovation in the field.
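The flux-versus-accumulation distinction stressed above can be made concrete with a toy calculation. One flux assay discussed in such guidelines compares an autophagosome marker measured with and without a lysosomal inhibitor; the function and numbers below are purely illustrative and are not taken from the guidelines themselves:

```python
# Toy flux readout (illustrative numbers): if lysosomal degradation is intact,
# blocking the lysosome makes the autophagosome marker (e.g. LC3-II) pile up;
# if trafficking is already blocked, inhibition adds almost nothing.

def autophagic_flux(marker_with_inhibitor: float, marker_without: float) -> float:
    """Net flux estimate: marker accumulated during lysosomal blockade."""
    return marker_with_inhibitor - marker_without

# Scenario A: genuine flux -- marker rises sharply under inhibition.
flux_active = autophagic_flux(8.0, 2.0)
# Scenario B: trafficking block -- many autophagosomes, but little extra
# accumulation when the lysosome is inhibited.
flux_blocked = autophagic_flux(8.5, 8.0)

print(flux_active, flux_blocked)
```

Scenario B illustrates the guidelines' warning that "more autophagosomes" alone does not demonstrate more autophagy.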

5,187 citations


Journal ArticleDOI
B. P. Abbott1, Richard J. Abbott1, T. D. Abbott2, M. R. Abernathy3  +970 more · Institutions (114)
TL;DR: This second gravitational-wave observation provides improved constraints on stellar populations and on deviations from general relativity.
Abstract: We report the observation of a gravitational-wave signal produced by the coalescence of two stellar-mass black holes. The signal, GW151226, was observed by the twin detectors of the Laser Interferometer Gravitational-Wave Observatory (LIGO) on December 26, 2015 at 03:38:53 UTC. The signal was initially identified within 70 s by an online matched-filter search targeting binary coalescences. Subsequent off-line analyses recovered GW151226 with a network signal-to-noise ratio of 13 and a significance greater than 5σ. The signal persisted in the LIGO frequency band for approximately 1 s, increasing in frequency and amplitude over about 55 cycles from 35 to 450 Hz, and reached a peak gravitational strain of 3.4^{+0.7}_{−0.9} × 10^{−22}. The inferred source-frame initial black hole masses are 14.2^{+8.3}_{−3.7} M⊙ and 7.5^{+2.3}_{−2.3} M⊙, and the final black hole mass is 20.8^{+6.1}_{−1.7} M⊙. We find that at least one of the component black holes has spin greater than 0.2. This source is located at a luminosity distance of 440^{+180}_{−190} Mpc, corresponding to a redshift of 0.09^{+0.03}_{−0.04}. All uncertainties define a 90% credible interval. This second gravitational-wave observation provides improved constraints on stellar populations and on deviations from general relativity.
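The "matched-filter search" described above correlates the data against waveform templates and normalizes by the noise level. A heavily simplified white-noise sketch follows; the toy chirp template, sample rate, and injected amplitude are invented for illustration and bear no relation to the actual LIGO pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy chirp-like template (illustrative only; real searches use relativistic
# waveform banks and whitening against the measured noise spectrum).
t = np.linspace(0.0, 1.0, 4096)
template = np.sin(2 * np.pi * (35.0 * t + 100.0 * t**2)) * t  # frequency sweeps upward

sigma = 1.0                                   # white-noise standard deviation
data = 0.2 * template + rng.normal(0.0, sigma, t.size)

def matched_filter_snr(d, h, noise_sigma):
    """Matched-filter SNR for white noise: rho = <d, h> / (sigma * ||h||)."""
    return float(np.dot(d, h) / (noise_sigma * np.linalg.norm(h)))

snr_signal = matched_filter_snr(data, template, sigma)              # signal present
snr_noise = matched_filter_snr(rng.normal(0.0, sigma, t.size),      # pure noise
                               template, sigma)
print(round(snr_signal, 1), round(snr_noise, 1))
```

Against pure noise the statistic hovers near zero, while the injected signal stands out, which is the basic principle behind the quoted network signal-to-noise ratio of 13.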

3,448 citations


Journal ArticleDOI
TL;DR: This updated version of mclust adds new covariance structures, dimension reduction capabilities for visualisation, model selection criteria, initialisation strategies for the EM algorithm, and bootstrap-based inference, making it a full-featured R package for data analysis via finite mixture modelling.
Abstract: Finite mixture models are being used increasingly to model a wide variety of random phenomena for clustering, classification and density estimation. mclust is a powerful and popular package which allows modelling of data as a Gaussian finite mixture with different covariance structures and different numbers of mixture components, for a variety of purposes of analysis. Recently, version 5 of the package has been made available on CRAN. This updated version adds new covariance structures, dimension reduction capabilities for visualisation, model selection criteria, initialisation strategies for the EM algorithm, and bootstrap-based inference, making it a full-featured R package for data analysis via finite mixture modelling.
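mclust itself is an R package; for readers working in Python, a rough analogue of its BIC-driven selection over both the number of components and the covariance structure can be sketched with scikit-learn (synthetic data; the scikit-learn dependency and its four covariance types stand in for mclust's richer family of models):

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
# Two well-separated Gaussian clusters in 2-D.
X = np.vstack([
    rng.normal([0, 0], 0.5, size=(150, 2)),
    rng.normal([5, 5], 0.5, size=(150, 2)),
])

# Grid over component counts and covariance structures, scored by BIC
# (lower is better) -- mirroring mclust's default selection strategy.
best = None
for k in range(1, 5):
    for cov in ("full", "tied", "diag", "spherical"):
        gm = GaussianMixture(n_components=k, covariance_type=cov,
                             random_state=0).fit(X)
        bic = gm.bic(X)
        if best is None or bic < best[0]:
            best = (bic, k, cov)

print(best[1])  # number of components chosen by BIC
```

On this synthetic example BIC recovers the two generating clusters, which is the same model-selection logic the abstract describes.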

1,841 citations


Journal ArticleDOI
B. P. Abbott1, Richard J. Abbott1, T. D. Abbott2, M. R. Abernathy1  +976 more · Institutions (107)
TL;DR: It is found that the final remnant's mass and spin, as determined from the low-frequency and high-frequency phases of the signal, are mutually consistent with the binary black-hole solution in general relativity.
Abstract: The LIGO detection of GW150914 provides an unprecedented opportunity to study the two-body motion of a compact-object binary in the large-velocity, highly nonlinear regime, and to witness the final merger of the binary and the excitation of uniquely relativistic modes of the gravitational field. We carry out several investigations to determine whether GW150914 is consistent with a binary black-hole merger in general relativity. We find that the final remnant’s mass and spin, as determined from the low-frequency (inspiral) and high-frequency (postinspiral) phases of the signal, are mutually consistent with the binary black-hole solution in general relativity. Furthermore, the data following the peak of GW150914 are consistent with the least-damped quasinormal mode inferred from the mass and spin of the remnant black hole. By using waveform models that allow for parametrized general-relativity violations during the inspiral and merger phases, we perform quantitative tests on the gravitational-wave phase in the dynamical regime and we determine the first empirical bounds on several high-order post-Newtonian coefficients. We constrain the graviton Compton wavelength, assuming that gravitons are dispersed in vacuum in the same way as particles with mass, obtaining a 90%-confidence lower bound of 10^{13} km. In conclusion, within our statistical uncertainties, we find no evidence for violations of general relativity in the genuinely strong-field regime of gravity.

1,421 citations


Journal ArticleDOI
B. P. Abbott1, Richard J. Abbott1, T. D. Abbott2, Matthew Abernathy3  +978 more · Institutions (112)
TL;DR: The first observational run of the Advanced LIGO detectors, from September 12, 2015 to January 19, 2016, saw the first detections of gravitational waves from binary black hole mergers as discussed by the authors.
Abstract: The first observational run of the Advanced LIGO detectors, from September 12, 2015 to January 19, 2016, saw the first detections of gravitational waves from binary black hole mergers. In this paper we present full results from a search for binary black hole merger signals with total masses up to 100 M⊙ and detailed implications from our observations of these systems. Our search, based on general-relativistic models of gravitational wave signals from binary black hole systems, unambiguously identified two signals, GW150914 and GW151226, with a significance of greater than 5σ over the observing period. It also identified a third possible signal, LVT151012, with substantially lower significance, which has an 87% probability of being of astrophysical origin. We provide detailed estimates of the parameters of the observed systems. Both GW150914 and GW151226 provide an unprecedented opportunity to study the two-body motion of a compact-object binary in the large-velocity, highly nonlinear regime. We do not observe any deviations from general relativity, and place improved empirical bounds on several high-order post-Newtonian coefficients. From our observations we infer stellar-mass binary black hole merger rates lying in the range 9–240 Gpc^{−3} yr^{−1}. These observations are beginning to inform astrophysical predictions of binary black hole formation rates, and indicate that future observing runs of the Advanced detector network will yield many more gravitational wave detections.
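A merger-rate interval like the one quoted above translates into an expected number of detections via rate × sensitive comoving volume × live time. The sketch below uses a purely hypothetical sensitive volume and run duration to show how wide the resulting prediction is:

```python
# Back-of-the-envelope use of an inferred rate interval (here 9-240 Gpc^-3 yr^-1):
# expected detections = rate x sensitive comoving volume x observing time.
def expected_detections(rate_per_gpc3_yr: float, volume_gpc3: float, years: float) -> float:
    return rate_per_gpc3_yr * volume_gpc3 * years

# Hypothetical sensitive volume of 0.1 Gpc^3 over a 4-month run (my assumptions,
# not figures from the paper):
low = expected_detections(9.0, 0.1, 4 / 12)
high = expected_detections(240.0, 0.1, 4 / 12)
print(round(low, 1), round(high, 1))
```

The order-of-magnitude spread between the low and high estimates reflects the rate uncertainty after only two confident detections.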

1,172 citations


Journal ArticleDOI
TL;DR: Population modeling and cage experiments indicate that a CRISPR-Cas9 construct targeting one of these loci meets the minimum requirement for a gene drive targeting female reproduction in an insect population, which could expedite the development of gene drives to suppress mosquito populations to levels that do not support malaria transmission.
Abstract: Gene drive systems that enable super-Mendelian inheritance of a transgene have the potential to modify insect populations over a timeframe of a few years. We describe CRISPR-Cas9 endonuclease constructs that function as gene drive systems in Anopheles gambiae, the main vector for malaria. We identified three genes (AGAP005958, AGAP011377 and AGAP007280) that confer a recessive female-sterility phenotype upon disruption, and inserted into each locus CRISPR-Cas9 gene drive constructs designed to target and edit each gene. For each targeted locus we observed a strong gene drive at the molecular level, with transmission rates to progeny of 91.4 to 99.6%. Population modeling and cage experiments indicate that a CRISPR-Cas9 construct targeting one of these loci, AGAP007280, meets the minimum requirement for a gene drive targeting female reproduction in an insect population. These findings could expedite the development of gene drives to suppress mosquito populations to levels that do not support malaria transmission.
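The super-Mendelian transmission at the heart of the abstract can be illustrated with a minimal deterministic allele-frequency recursion. This is my simplification, not the paper's population model: it ignores fitness costs (notably the recessive female sterility that is central to the suppression strategy) and treats transmission rate as the only force:

```python
def next_freq(q: float, d: float) -> float:
    """One generation of random mating with drive transmission rate d.

    Genotype frequencies: q^2 (DD), 2q(1-q) (Dd), (1-q)^2 (dd).
    DD parents always transmit the drive; Dd parents transmit it with
    probability d (d = 0.5 is ordinary Mendelian inheritance; the paper
    reports transmission rates of 0.914 to 0.996).
    """
    return q * q + 2.0 * q * (1.0 - q) * d

def generations_to_reach(q0: float, d: float, target: float) -> int:
    q, n = q0, 0
    while q < target:
        q = next_freq(q, d)
        n += 1
    return n

# Mendelian transmission (d = 0.5) leaves the frequency unchanged, whereas a
# 95% drive sweeps from 1% to 99% of alleles in a handful of generations:
drive = generations_to_reach(0.01, 0.95, 0.99)
print(drive)
```

Under these assumptions the drive allele fixes in roughly ten generations, which is why the abstract can speak of modifying insect populations "over a timeframe of a few years".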

955 citations


Journal ArticleDOI
B. P. Abbott1, Richard J. Abbott1, T. D. Abbott2, Matthew Abernathy1  +984 more · Institutions (116)
TL;DR: The data around the time of the event were analyzed coherently across the LIGO network using a suite of accurate waveform models that describe gravitational waves from a compact binary system in general relativity.
Abstract: On September 14, 2015, the Laser Interferometer Gravitational-wave Observatory (LIGO) detected a gravitational-wave transient (GW150914); we characterise the properties of the source and its parameters. The data around the time of the event were analysed coherently across the LIGO network using a suite of accurate waveform models that describe gravitational waves from a compact binary system in general relativity. GW150914 was produced by a nearly equal-mass binary black hole with component masses $36^{+5}_{-4} M_\odot$ and $29^{+4}_{-4} M_\odot$ (for each parameter we report the median value and the range of the 90% credible interval). The dimensionless spin magnitude of the more massive black hole is constrained to be less than $0.7$ (at 90% probability). The luminosity distance to the source is $410^{+160}_{-180}$ Mpc, corresponding to a redshift $0.09^{+0.03}_{-0.04}$ assuming standard cosmology. The source location is constrained to an annulus section of $590$ deg$^2$, primarily in the southern hemisphere. The binary merges into a black hole of $62^{+4}_{-4} M_\odot$ and spin $0.67^{+0.05}_{-0.07}$. This black hole is significantly more massive than any other known in the stellar-mass regime.

874 citations


Journal ArticleDOI
Fengpeng An1, Guangpeng An, Qi An2, Vito Antonelli3  +226 more · Institutions (55)
TL;DR: The Jiangmen Underground Neutrino Observatory (JUNO) as mentioned in this paper is a 20kton multi-purpose underground liquid scintillator detector with the determination of neutrino mass hierarchy (MH) as a primary physics goal.
Abstract: The Jiangmen Underground Neutrino Observatory (JUNO), a 20 kton multi-purpose underground liquid scintillator detector, was proposed with the determination of the neutrino mass hierarchy (MH) as a primary physics goal. The excellent energy resolution and the large fiducial volume anticipated for the JUNO detector offer exciting opportunities for addressing many important topics in neutrino and astro-particle physics. In this document, we present the physics motivations and the anticipated performance of the JUNO detector for various proposed measurements. Following an introduction summarizing the current status and open issues in neutrino physics, we discuss how the detection of antineutrinos generated by a cluster of nuclear power plants allows the determination of the neutrino MH at a 3–4σ significance with six years of running of JUNO. The measurement of the antineutrino spectrum with excellent energy resolution will also lead to the precise determination of the neutrino oscillation parameters $\sin^2\theta_{12}$, $\Delta m_{21}^{2}$, and $|\Delta m_{ee}^{2}|$ to an accuracy of better than 1%, which will play a crucial role in the future unitarity test of the MNSP matrix. The JUNO detector is capable of observing not only antineutrinos from the power plants, but also neutrinos/antineutrinos from terrestrial and extra-terrestrial sources, including supernova burst neutrinos, diffuse supernova neutrino background, geoneutrinos, atmospheric neutrinos, and solar neutrinos. As a result of JUNO's large size, excellent energy resolution, and vertex reconstruction capability, interesting new data on these topics can be collected.
For example, a neutrino burst from a typical core-collapse supernova at a distance of 10 kpc would lead to ∼5000 inverse-beta-decay events and ∼2000 all-flavor neutrino–proton elastic scattering events in JUNO, which are of crucial importance for understanding the mechanism of supernova explosion and for exploring novel phenomena such as collective neutrino oscillations. Detection of neutrinos from all past core-collapse supernova explosions in the visible universe with JUNO would further provide valuable information on the cosmic star-formation rate and the average core-collapse neutrino energy spectrum. Antineutrinos originating from the radioactive decay of uranium and thorium in the Earth can be detected in JUNO with a rate of ∼400 events per year, significantly improving the statistics of existing geoneutrino event samples. Atmospheric neutrino events collected in JUNO can provide independent inputs for determining the MH and the octant of the $\theta_{23}$ mixing angle. Detection of the $^{7}$Be and $^{8}$B solar neutrino events at JUNO would shed new light on the solar metallicity problem and examine the transition region between the vacuum- and matter-dominated neutrino oscillations. Regarding light sterile neutrino topics, sterile neutrinos with $10^{-5}\,\mathrm{eV}^2 < \Delta m_{41}^{2} < 10^{-2}\,\mathrm{eV}^2$ and a sufficiently large mixing angle $\theta_{14}$ could be identified through a precise measurement of the reactor antineutrino energy spectrum. Meanwhile, JUNO can also provide excellent opportunities to test the eV-scale sterile neutrino hypothesis, using either radioactive neutrino sources or a cyclotron-produced neutrino beam. The JUNO detector is also sensitive to several other beyond-the-standard-model physics scenarios.
Examples include the search for proton decay via the $p \to K^{+} + \bar{\nu}$ decay channel, the search for neutrinos resulting from dark-matter annihilation in the Sun, the search for violation of Lorentz invariance via the sidereal modulation of the reactor neutrino event rate, and the search for the effects of non-standard interactions. The proposed construction of the JUNO detector will provide a unique facility to address many outstanding crucial questions in particle physics and astrophysics in a timely and cost-effective fashion. It holds great potential for further advancing our quest to understand the fundamental properties of neutrinos, one of the building blocks of our Universe.
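The baseline-dependent oscillation physics underlying JUNO's reactor measurement can be illustrated with the textbook two-flavour survival probability (JUNO's actual mass-hierarchy analysis requires the full three-flavour treatment; the parameter values and the ~53 km baseline below are commonly quoted illustrative figures, not results from this paper):

```python
import math

# Two-flavour survival probability: P = 1 - sin^2(2*theta) * sin^2(1.27 * dm2 * L / E)
# with dm2 in eV^2, L in km, E in GeV.
def survival_prob(sin2_2theta: float, dm2_ev2: float, L_km: float, E_gev: float) -> float:
    return 1.0 - sin2_2theta * math.sin(1.27 * dm2_ev2 * L_km / E_gev) ** 2

# Illustrative solar-sector numbers: sin^2(2*theta_12) ~ 0.85,
# dm21^2 ~ 7.5e-5 eV^2, a ~53 km reactor baseline, 4 MeV antineutrino energy.
p = survival_prob(0.85, 7.5e-5, 53.0, 0.004)
print(round(p, 3))
```

At this baseline and energy the survival probability dips well below one, which is why a medium-baseline detector with excellent energy resolution can map the oscillated spectrum precisely.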

807 citations


Journal ArticleDOI
25 Jan 2016-PLOS ONE
TL;DR: The higher BP in SSA is maintained over decades, suggesting limited efficacy of prevention strategies in such group in Europe, and the lower BP in Muslim populations suggests that yet untapped lifestyle and behavioral habits may reveal advantages towards the development of hypertension.
Abstract: Background: People of Sub-Saharan African (SSA) and South Asian (SA) ethnic minorities living in Europe have a higher risk of stroke than native Europeans (EU). The study objective is to provide an assessment of gender-specific absolute differences in office systolic (SBP) and diastolic (DBP) blood pressure (BP) levels between SSA, SA, and EU. Methods and Findings: We performed a systematic review and meta-analysis of observational studies conducted in Europe that examined BP in non-selected adult SSA, SA and EU subjects. Medline, PubMed, Embase, Web of Science, and Scopus were searched from their inception through January 31st, 2015, for relevant articles. Outcome measures were mean SBP and DBP differences between minorities and EU, using a random-effects model and tested for heterogeneity. Twenty-one studies involving 9,070 SSA, 18,421 SA, and 130,380 EU were included. Compared with EU, SSA had higher values of both SBP (3.38 mmHg, 95% CI 1.28 to 5.48 mmHg; and 6.00 mmHg, 95% CI 2.22 to 9.78, in men and women respectively) and DBP (3.29 mmHg, 95% CI 1.80 to 4.78; 5.35 mmHg, 95% CI 3.04 to 7.66). SA had lower SBP than EU (-4.57 mmHg, 95% CI -6.20 to -2.93; -2.97 mmHg, 95% CI -5.45 to -0.49) but similar DBP values. Meta-analysis by subgroup showed that SA originating from countries where Islam is the main religion had lower SBP and DBP values than EU. In multivariate meta-regression analyses, the SBP difference between minorities and EU populations was influenced by panethnicity and diabetes prevalence. Conclusions: 1) The higher BP in SSA is maintained over decades, suggesting limited efficacy of prevention strategies in such groups in Europe; 2) the lower BP in Muslim populations suggests that yet untapped lifestyle and behavioral habits may reveal advantages against the development of hypertension; 3) the additive effect of diabetes emphasizes the need for new strategies for the control of hypertension in groups with a high prevalence of diabetes.
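The random-effects pooling used in a meta-analysis like this one can be sketched with the DerSimonian-Laird estimator. The three study means and standard errors below are made up for illustration and are not the paper's data:

```python
import math

# DerSimonian-Laird random-effects pooling of study-level mean differences.
def random_effects_pool(means, ses):
    w = [1 / se**2 for se in ses]                              # fixed-effect weights
    fixed = sum(wi * m for wi, m in zip(w, means)) / sum(w)
    q = sum(wi * (m - fixed) ** 2 for wi, m in zip(w, means))  # Cochran's Q
    df = len(means) - 1
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                              # between-study variance
    w_star = [1 / (se**2 + tau2) for se in ses]                # random-effects weights
    pooled = sum(wi * m for wi, m in zip(w_star, means)) / sum(w_star)
    se_pooled = math.sqrt(1 / sum(w_star))
    return pooled, se_pooled

# Hypothetical SBP differences (mmHg) from three studies:
pooled, se = random_effects_pool([3.1, 6.2, 1.5], [0.9, 1.1, 1.4])
ci = (pooled - 1.96 * se, pooled + 1.96 * se)
print(round(pooled, 2), [round(x, 2) for x in ci])
```

When the studies are heterogeneous (large Q), the between-study variance tau² widens the pooled confidence interval relative to a fixed-effect analysis, which is why random-effects models are preferred for ethnically diverse study populations.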

792 citations


Journal ArticleDOI
TL;DR: A review of landslide-climate studies can be found in this paper, where the authors examine advantages and limits of the approaches adopted to evaluate the effects of climate variations on landslides, including prospective modelling and retrospective methods that use landslide and climate records.

710 citations


Journal ArticleDOI
Vardan Khachatryan1, Albert M. Sirunyan1, Armen Tumasyan1, Wolfgang Adam  +2283 more · Institutions (141)
TL;DR: Combined fits to CMS UE proton–proton data at 7 TeV and to UE proton–antiproton data from the CDF experiment at lower sqrt(s) are used to study the UE models and constrain their parameters, thereby providing improved predictions for proton–proton collisions at 13 TeV.
Abstract: New sets of parameters ("tunes") for the underlying-event (UE) modeling of the PYTHIA8, PYTHIA6 and HERWIG++ Monte Carlo event generators are constructed using different parton distribution functions. Combined fits to CMS UE data at sqrt(s) = 7 TeV and to UE data from the CDF experiment at lower sqrt(s), are used to study the UE models and constrain their parameters, providing thereby improved predictions for proton-proton collisions at 13 TeV. In addition, it is investigated whether the values of the parameters obtained from fits to UE observables are consistent with the values determined from fitting observables sensitive to double-parton scattering processes. Finally, comparisons of the UE tunes to "minimum bias" (MB) events, multijet, and Drell-Yan (q q-bar to Z / gamma* to lepton-antilepton + jets) observables at 7 and 8 TeV are presented, as well as predictions of MB and UE observables at 13 TeV.

Journal ArticleDOI
TL;DR: The Commission has identified ten essential and achievable goals and ten accompanying, mutually additive, and synergistic key actions that—if implemented effectively and broadly—will make substantial contributions to the management of blood pressure globally.

Journal ArticleDOI
TL;DR: In this paper, the authors provide a review of the main commercialized insulation materials (conventional, alternative and advanced) for the building sector through a holistic and multidisciplinary approach, considering thermal properties, acoustic properties, reaction to fire and water vapor resistance; environmental issues were also taken into account by means of Life Cycle Assessment approach.
Abstract: The energy consumption of a building is strongly dependent on the characteristics of its envelope. The thermal performance of external walls represents a key factor to increase the energy efficiency of the construction sector and to reduce greenhouse gas emissions. Thermal insulation is undoubtedly one of the best ways to reduce the energy consumption due to both winter heating and summer cooling. Insulation materials play an important role in this scenario since the selection of the correct material, its thickness and its position make it possible to obtain good indoor thermal comfort conditions and adequate energy savings. Thermal properties are extremely important, but they are not the only ones to be considered when designing a building envelope: sound insulation, resistance to fire, water vapor permeability and impact on the environment and on human health need to be carefully assessed too. The purpose of the paper is to provide a review of the main commercialized insulation materials (conventional, alternative and advanced) for the building sector through a holistic and multidisciplinary approach, considering thermal properties, acoustic properties, reaction to fire and water vapor resistance; environmental issues were also taken into account by means of a Life Cycle Assessment approach. A comparative analysis was performed, also considering unconventional insulation materials that are not yet present in the market. Finally, a case study was conducted evaluating both thermal transmittance and dynamic thermal properties of one lightweight and three heavyweight walls, with different types of insulating materials and ways of installation (external, internal or cavity insulation).
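The thermal transmittance evaluated in the case study follows the standard series-resistance formula U = 1 / (R_si + Σ dᵢ/λᵢ + R_se). The layer build-up and material values below are typical illustrative figures of my choosing, not the walls analysed in the paper:

```python
# Steady-state U-value of a layered wall. The surface resistances
# R_si = 0.13 and R_se = 0.04 m^2K/W are the usual values for external walls.
def u_value(layers, r_si=0.13, r_se=0.04):
    """layers: list of (thickness_m, conductivity_W_per_mK) tuples."""
    r_total = r_si + r_se + sum(d / lam for d, lam in layers)
    return 1.0 / r_total

wall = [
    (0.015, 0.90),   # internal plaster
    (0.25,  0.36),   # brick masonry
    (0.10,  0.035),  # insulation board
    (0.02,  1.00),   # external render
]
print(round(u_value(wall), 3))  # W/(m^2 K)
```

Note how the 10 cm insulation layer dominates the total resistance: dropping it roughly quadruples the U-value, which is the quantitative basis for the abstract's claim that insulation thickness is a primary design lever.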

Journal ArticleDOI
TL;DR: A comprehensive analysis of surface texture metrology for metal additive manufacturing has been performed in this paper, where the results of this analysis are divided into sections that address specific areas of interest: industrial domain; additive manufacturing processes and materials; types of surface investigated; surface measurement technology and surface texture characterisation.
Abstract: A comprehensive analysis of literature pertaining to surface texture metrology for metal additive manufacturing has been performed. This review paper structures the results of this analysis into sections that address specific areas of interest: industrial domain; additive manufacturing processes and materials; types of surface investigated; surface measurement technology and surface texture characterisation. Each section reports on how frequently specific techniques, processes or materials have been utilised and discusses how and why they are employed. Based on these results, possible optimisation of methods and reporting is suggested and the areas that may have significant potential for future research are highlighted.

Journal ArticleDOI
M. Aguilar, L. Ali Cavasonza1, Behcet Alpat2, G. Ambrosi2  +265 more · Institutions (39)
TL;DR: In the absolute rigidity range ∼60 to ∼500 GV, the antiproton p̄, proton p, and positron e⁺ fluxes are found to have nearly identical rigidity dependence, while the electron e⁻ flux exhibits a different rigidity dependence.
Abstract: A precision measurement by AMS of the antiproton flux and the antiproton-to-proton flux ratio in primary cosmic rays in the absolute rigidity range from 1 to 450 GV is presented, based on 3.49 × 10⁵ antiproton events and 2.42 × 10⁹ proton events. The fluxes and flux ratios of charged elementary particles in cosmic rays are also presented. In the absolute rigidity range ∼60 to ∼500 GV, the antiproton p̄, proton p, and positron e⁺ fluxes are found to have nearly identical rigidity dependence, and the electron e⁻ flux exhibits a different rigidity dependence. Below 60 GV, the (p̄/p), (p̄/e⁺), and (p/e⁺) flux ratios each reach a maximum. From ∼60 to ∼500 GV, the (p̄/p), (p̄/e⁺), and (p/e⁺) flux ratios show no rigidity dependence. These are new observations of the properties of elementary particles in the cosmos.

Journal ArticleDOI
Marco Ajello1, Andrea Albert2, W. B. Atwood3, Guido Barbiellini4  +155 more · Institutions (45)
TL;DR: The Fermi Large Area Telescope (LAT) has provided the most detailed view to date of the emission toward the Galactic center (GC) in high-energy gamma-rays as mentioned in this paper.
Abstract: The Fermi Large Area Telescope (LAT) has provided the most detailed view to date of the emission toward the Galactic center (GC) in high-energy gamma-rays. This paper describes the analysis of data ...

Journal ArticleDOI
Fabio Acero1, M. Ackermann, Marco Ajello2, Andrea Albert3  +166 more · Institutions (37)
TL;DR: In this article, the authors describe the development of the Galactic Interstellar Emission Model (GIEM) which is the standard adopted by the LAT Collaboration and is publicly available, based on a linear combination of maps for interstellar gas column density in Galactocentric annuli and for the inverse-Compton emission produced in the Galaxy.
Abstract: Most of the celestial γ rays detected by the Large Area Telescope (LAT) on board the Fermi Gamma-ray Space Telescope originate from the interstellar medium when energetic cosmic rays interact with interstellar nucleons and photons. Conventional point-source and extended-source studies rely on the modeling of this diffuse emission for accurate characterization. Here, we describe the development of the Galactic Interstellar Emission Model (GIEM), which is the standard adopted by the LAT Collaboration and is publicly available. This model is based on a linear combination of maps for interstellar gas column density in Galactocentric annuli and for the inverse-Compton emission produced in the Galaxy. In the GIEM, we also include large-scale structures like Loop I and the Fermi bubbles. The measured gas emissivity spectra confirm that the cosmic-ray proton density decreases with Galactocentric distance beyond 5 kpc from the Galactic Center. The measurements also suggest a softening of the proton spectrum with Galactocentric distance. We observe that the Fermi bubbles have boundaries with a shape similar to a catenary at latitudes below 20° and we observe an enhanced emission toward their base extending in the north and south Galactic directions and located within ∼4° of the Galactic Center.

Journal ArticleDOI
TL;DR: An underappreciated risk is demonstrated for the treatment of allo-HSCT recipients with antibiotics that may exacerbate GVHD in the colon, and this study suggests that not all antibiotic regimens are appropriate for treating transplant patients.
Abstract: Intestinal bacteria may modulate the risk of infection and graft-versus-host disease (GVHD) after allogeneic hematopoietic stem cell transplantation (allo-HSCT). Allo-HSCT recipients often develop neutropenic fever, which is treated with antibiotics that may target anaerobic bacteria in the gut. We retrospectively examined 857 allo-HSCT recipients and found that treatment of neutropenic fever with imipenem-cilastatin and piperacillin-tazobactam antibiotics was associated with increased GVHD-related mortality at 5 years (21.5% for imipenem-cilastatin-treated patients versus 13.1% for untreated patients, P = 0.025; 19.8% for piperacillin-tazobactam-treated patients versus 11.9% for untreated patients, P = 0.007). However, two other antibiotics also used to treat neutropenic fever, aztreonam and cefepime, were not associated with GVHD-related mortality (P = 0.78 and P = 0.98, respectively). Analysis of stool specimens from allo-HSCT recipients showed that piperacillin-tazobactam administration was associated with perturbation of gut microbial composition. Studies in mice demonstrated aggravated GVHD mortality with imipenem-cilastatin or piperacillin-tazobactam compared to aztreonam (P < 0.01 and P < 0.05, respectively). We found pathological evidence for increased GVHD in the colon of imipenem-cilastatin-treated mice (P < 0.05), but no difference in the concentration of short-chain fatty acids or numbers of regulatory T cells. Notably, imipenem-cilastatin treatment of mice with GVHD led to loss of the protective mucus lining of the colon (P < 0.01) and the compromising of intestinal barrier function (P < 0.05). Sequencing of mouse stool specimens showed an increase in Akkermansia muciniphila (P < 0.001), a commensal bacterium with mucus-degrading capabilities, raising the possibility that mucus degradation may contribute to murine GVHD. We demonstrate an underappreciated risk for the treatment of allo-HSCT recipients with antibiotics that may exacerbate GVHD in the colon.

Journal ArticleDOI
B. P. Abbott1, Richard J. Abbott1, T. D. Abbott2, Matthew Abernathy1  +977 more · Institutions (106)
TL;DR: In this paper, the results of a matched-filter search using relativistic models of compact-object binaries that recovered GW150914 as the most significant event during the coincident observations between the two LIGO detectors were reported.
Abstract: On September 14, 2015, at 09:50:45 UTC the two detectors of the Laser Interferometer Gravitational-Wave Observatory (LIGO) simultaneously observed the binary black hole merger GW150914. We report the results of a matched-filter search using relativistic models of compact-object binaries that recovered GW150914 as the most significant event during the coincident observations between the two LIGO detectors from September 12 to October 20, 2015. GW150914 was observed with a matched-filter signal-to-noise ratio of 24 and a false alarm rate estimated to be less than 1 event per 203,000 years, equivalent to a significance greater than 5.1σ.
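Quoting a significance "greater than 5.1σ" restates a false-alarm probability as the equivalent one-sided Gaussian tail. The conversion itself is a one-liner to invert; the sketch below demonstrates it on round-number probabilities (the collaboration's actual background estimation, which produces the quoted false alarm rate, is far more involved):

```python
import math

# One-sided Gaussian tail: Q(z) = erfc(z / sqrt(2)) / 2.
# Invert it numerically by bisection to turn a tail probability into "sigmas".
def sigma_from_p(p: float) -> float:
    lo, hi = 0.0, 10.0
    for _ in range(200):
        mid = (lo + hi) / 2
        if math.erfc(mid / math.sqrt(2)) / 2 > p:
            lo = mid     # tail still too heavy: need a larger threshold
        else:
            hi = mid
    return (lo + hi) / 2

print(round(sigma_from_p(1.35e-3), 2))   # a ~1-in-740 tail is about 3 sigma
print(round(sigma_from_p(2.9e-7), 2))    # a ~3-in-10-million tail is about 5 sigma
```

Bisection is used here simply to stay dependency-free; scipy users would call the inverse survival function directly.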


Journal ArticleDOI
TL;DR: A new dataset of 5,000 histological images of human colorectal cancer including eight different types of tissue is presented and an optimal classification strategy is found that markedly outperformed traditional methods, improving the state of the art for tumour-stroma separation and setting a new standard for multiclass tissue separation.
Abstract: Automatic recognition of different tissue types in histological images is an essential part of the digital pathology toolbox. Texture analysis is commonly used to address this problem, mainly in the context of estimating the tumour/stroma ratio on histological samples. However, although histological images typically contain more than two tissue types, only a few studies have addressed the multi-class problem. For colorectal cancer, one of the most prevalent tumour types, there are in fact no published results on multiclass texture separation. In this paper, we present a new dataset of 5,000 histological images of human colorectal cancer including eight different types of tissue. We used this set to assess the classification performance of a wide range of texture descriptors and classifiers. As a result, we found an optimal classification strategy that markedly outperformed traditional methods, improving the state of the art for tumour-stroma separation from 96.9% to 98.6% accuracy and setting a new standard for multiclass tissue separation (87.4% accuracy for eight classes). We make our dataset of histological images publicly available under a Creative Commons license and encourage other researchers to use it as a benchmark for their studies.
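The pipeline the paper benchmarks — texture descriptors feeding a classifier — can be sketched on synthetic data. Everything below is illustrative: the histogram descriptor, the nearest-centroid classifier, and the two synthetic "tissue" distributions are stand-ins for the far richer descriptors and classifiers evaluated in the study:

```python
import numpy as np

rng = np.random.default_rng(1)

def texture_descriptor(img, bins=16):
    """Normalized gray-level histogram as a crude texture feature."""
    hist, _ = np.histogram(img, bins=bins, range=(0.0, 1.0))
    return hist / hist.sum()

# Two synthetic "tissue types" with different intensity statistics.
def sample(cls):
    return rng.beta(2, 5, (32, 32)) if cls == 0 else rng.beta(5, 2, (32, 32))

# Train: compute descriptors and one centroid per class.
train = [(texture_descriptor(sample(c)), c) for c in (0, 1) for _ in range(20)]
centroids = [np.mean([f for f, c in train if c == k], axis=0) for k in (0, 1)]

def classify(img):
    f = texture_descriptor(img)
    return int(np.argmin([np.linalg.norm(f - c) for c in centroids]))

correct = sum(classify(sample(c)) == c for c in (0, 1) for _ in range(25))
print(f"accuracy on held-out samples: {correct / 50:.2f}")
```

The study's multiclass setting replaces the two synthetic classes with eight annotated tissue types and compares many descriptor/classifier combinations; the descriptor-then-classifier structure is the same.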

Journal ArticleDOI
03 Feb 2016
TL;DR: In this paper, the authors studied nonconvex distributed optimization in multi-agent networks with time-varying (nonsymmetric) connectivity and proposed an algorithmic framework for the distributed minimization of the sum of a smooth (possibly nonconvex and nonseparable) function, the agents' sum-utility, plus a convex regularizer.
Abstract: We study nonconvex distributed optimization in multiagent networks with time-varying (nonsymmetric) connectivity. We introduce the first algorithmic framework for the distributed minimization of the sum of a smooth (possibly nonconvex and nonseparable) function—the agents’ sum-utility—plus a convex (possibly nonsmooth and nonseparable) regularizer. The latter is usually employed to enforce some structure in the solution, typically sparsity. The proposed method hinges on successive convex approximation techniques while leveraging dynamic consensus as a mechanism to distribute the computation among the agents: each agent first solves (possibly inexactly) a local convex approximation of the nonconvex original problem, and then performs local averaging operations. Asymptotic convergence to (stationary) solutions of the nonconvex problem is established. Our algorithmic framework is then customized to a variety of convex and nonconvex problems in several fields, including signal processing, communications, networking, and machine learning. Numerical results show that the new method compares favorably to existing distributed algorithms on both convex and nonconvex problems.
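The compute-then-average pattern underlying such methods can be illustrated on a toy problem. This sketch uses plain decentralized gradient steps on private quadratics with consensus averaging over a ring; the paper's actual framework (successive convex approximation with dynamic consensus tracking and a nonsmooth regularizer) is considerably more general:

```python
import numpy as np

# Four agents, each holding a private cost f_i(x) = (x - a_i)^2.
# The network goal is to minimize sum_i f_i, whose minimizer is mean(a).
a = np.array([1.0, 3.0, 5.0, 7.0])   # private targets
x = np.zeros(4)                       # agents' local estimates

# Doubly stochastic mixing weights for a 4-node ring.
W = np.array([[0.50, 0.25, 0.00, 0.25],
              [0.25, 0.50, 0.25, 0.00],
              [0.00, 0.25, 0.50, 0.25],
              [0.25, 0.00, 0.25, 0.50]])

step = 0.1
for _ in range(200):
    grad = 2.0 * (x - a)              # each agent's local gradient
    x = W @ (x - step * grad)         # local step, then neighbor averaging

# All agents end up near the global minimizer mean(a) = 4.0.
print(np.round(x, 3))
```

With a constant step size the agents settle near, but not exactly at, the global minimizer; removing this residual disagreement is precisely what gradient-tracking mechanisms such as the paper's dynamic consensus are designed to do.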

Journal ArticleDOI
TL;DR: In this largest meta-analysis of hypertensive patients, the nocturnal BP fall provided substantial prognostic information, independent of 24-hour SBP levels; heterogeneity was low for the systolic night-to-day ratio and reverse/reduced dipping and moderate for extreme dippers.
Abstract: The prognostic importance of the nocturnal systolic blood pressure (SBP) fall, adjusted for average 24-hour SBP levels, is unclear. The Ambulatory Blood Pressure Collaboration in Patients With Hypertension (ABC-H) examined this issue in a meta-analysis of 17 312 hypertensives from 3 continents. Risks were computed for the systolic night-to-day ratio and for different dipping patterns (extreme, reduced, and reverse dippers) relative to normal dippers. ABC-H investigators provided multivariate adjusted hazard ratios (HRs), with and without adjustment for 24-hour SBP, for total cardiovascular events (CVEs), coronary events, strokes, cardiovascular mortality, and total mortality. Average 24-hour SBP varied from 131 to 140 mm Hg and systolic night-to-day ratio from 0.88 to 0.93. There were 1769 total CVEs, 916 coronary events, 698 strokes, 450 cardiovascular deaths, and 903 total deaths. After adjustment for 24-hour SBP, the systolic night-to-day ratio predicted all outcomes: from a 1-SD increase, summary HRs were 1.12 to 1.23. Reverse dipping also predicted all end points: HRs were 1.57 to 1.89. Reduced dippers, relative to normal dippers, had a significant 27% higher risk for total CVEs. Risks for extreme dippers were significantly influenced by antihypertensive treatment (P<0.001): untreated patients had increased risk of total CVEs (HR, 1.92), whereas treated patients had borderline lower risk (HR, 0.72) than normal dippers. For CVEs, heterogeneity was low for systolic night-to-day ratio and reverse/reduced dipping and moderate for extreme dippers. Quality of included studies was moderate to high, and publication bias was undetectable. In conclusion, in this largest meta-analysis of hypertensive patients, the nocturnal BP fall provided substantial prognostic information, independent of 24-hour SBP levels.
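For readers unfamiliar with the terminology, the night-to-day ratio and dipping categories can be computed from daytime and night-time SBP averages. The thresholds below are the conventional ones (10% and 20% nocturnal falls), stated here as an assumption rather than taken from the meta-analysis itself:

```python
def dipping_pattern(day_sbp, night_sbp):
    """Classify the nocturnal dipping pattern from average SBP values.

    Uses the commonly cited cut-offs on the night-to-day ratio:
    > 1.0 reverse, 0.9-1.0 reduced, 0.8-0.9 normal, <= 0.8 extreme.
    """
    ratio = night_sbp / day_sbp
    if ratio > 1.0:
        return 'reverse dipper'   # BP rises at night
    if ratio > 0.9:
        return 'reduced dipper'   # nocturnal fall under 10%
    if ratio > 0.8:
        return 'normal dipper'    # fall between 10% and 20%
    return 'extreme dipper'       # fall of 20% or more

print(dipping_pattern(135, 128))  # ratio ~ 0.95
print(dipping_pattern(140, 112))  # ratio = 0.80
```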

Journal ArticleDOI
TL;DR: The authors' ad hoc findings provide preliminary evidence of the safety and therapeutic benefit of HSC-GT, which resulted in protection from CNS demyelination in eight patients and, in at least three patients, amelioration of peripheral nervous system abnormalities, with signs of remyelination at both sites.

Journal ArticleDOI
TL;DR: An uncertainty principle for graph signals is derived, and conditions for the recovery of band-limited signals from a subset of samples are illustrated, showing an interesting link between the uncertainty principle and sampling and proposing alternative signal recovery algorithms.
Abstract: In many applications, the observations can be represented as a signal defined over the vertices of a graph. The analysis of such signals requires the extension of standard signal processing tools. In this paper, first, we provide a class of graph signals that are maximally concentrated on the graph domain and on its dual. Then, building on this framework, we derive an uncertainty principle for graph signals and illustrate the conditions for the recovery of band-limited signals from a subset of samples. We show an interesting link between uncertainty principle and sampling and propose alternative signal recovery algorithms, including a generalization to frame-based reconstruction methods. After showing that the performance of signal recovery algorithms is significantly affected by the location of samples, we suggest and compare a few alternative sampling strategies. Finally, we provide the conditions for perfect recovery of a useful signal corrupted by sparse noise, showing that this problem is also intrinsically related to vertex-frequency localization properties.
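The band-limited sampling result can be illustrated numerically. In this sketch (the standard setup, not the paper's specific algorithms or sampling-set designs), a signal spanned by the first K Laplacian eigenvectors is recovered by least squares from samples on M ≥ K vertices:

```python
import numpy as np

rng = np.random.default_rng(2)
N, K = 12, 3

# Ring graph plus a few chords (guaranteed connected) and its Laplacian.
A = np.zeros((N, N))
for i in range(N):
    A[i, (i + 1) % N] = A[(i + 1) % N, i] = 1.0
for i, j in [(0, 5), (2, 7), (3, 9)]:
    A[i, j] = A[j, i] = 1.0
L = np.diag(A.sum(axis=1)) - A

_, U = np.linalg.eigh(L)          # eigenvectors = graph Fourier basis
coeffs = rng.normal(size=K)
x = U[:, :K] @ coeffs             # band-limited signal (bandwidth K)

S = rng.choice(N, size=8, replace=False)   # sampled vertices (M = 8)
y = x[S]                                   # observed samples

# Least-squares reconstruction of the K spectral coefficients.
coeffs_hat, *_ = np.linalg.lstsq(U[S, :K], y, rcond=None)
x_hat = U[:, :K] @ coeffs_hat
err = np.max(np.abs(x - x_hat))
print("max reconstruction error:", err)
```

As the abstract notes, which vertices are sampled matters: recovery requires the sampled rows of the band-limited basis to be well conditioned, which motivates the paper's comparison of sampling strategies.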

Journal ArticleDOI
Abstract: a Department of Digestive and Liver Disease, Ospedale Sant’Andrea, Rome, Italy; b NET Centre, St. Vincent’s University and Department of Clinical Medicine, St. James Hospital and Trinity College, Dublin, Ireland; c Department of Radiology, Section for Molecular Imaging, University Hospital, Uppsala, Sweden; d Netherlands Cancer Centre, Lijnden, The Netherlands; e NET Centre, Umbria Regional Cancer Network, Università degli Studi di Perugia, Perugia, Italy; f Gastroenterology Department, Hampshire Hospitals NHS Trust, Hampshire, UK; g Department of Endocrine and Metabolic Sciences, University of Genoa, Genoa, Italy; h Pancreatic Diseases Branch, Kyushu University Hospital, Fukuoka, Japan; i Department of Radiology, Memorial Sloan Kettering Cancer Center, New York, N.Y., USA; j Department of Endocrinology, Peking Union Medical College Hospital, Beijing, China; k Department of Internal Medicine, Division of Endocrinology, Erasmus Medical Center, Rotterdam, The Netherlands; l Department of Visceral and Transplant Surgery, Campus Virchow Klinikum, Charité Universitätsmedizin Berlin, Berlin, Germany; m Department of Gastroenterology, Beaujon Hospital, Clichy, France

Journal ArticleDOI
TL;DR: The original version of this consensus statement on mechanical thrombectomy was approved at the European Stroke Organisation-Karolinska Stroke Update conference in Stockholm, 16–18 November 2014 and has later, during 2015, been updated with new clinical trials data in accordance with a decision made at the conference.
Abstract: The original version of this consensus statement on mechanical thrombectomy was approved at the European Stroke Organisation (ESO)-Karolinska Stroke Update conference in Stockholm, 16-18 November 2014. The statement was later updated during 2015 with new clinical trial data, in accordance with a decision made at the conference. Revisions were made at a face-to-face meeting during the ESO Winter School in Berne in February and through email exchanges, and the final version was then approved by each society. The recommendations are identical to the original version, with evidence levels upgraded by 20 February 2015 and confirmed by 15 May 2015. The purpose of the ESO-Karolinska Stroke Update meetings is to provide updates on recent stroke therapy research and to discuss how the results may be implemented into clinical routine. Selected topics are discussed at consensus sessions, for which a consensus statement is prepared and discussed by the participants at the meeting. The statements are advisory to the ESO guidelines committee. This consensus statement includes recommendations on mechanical thrombectomy after acute stroke. The statement is supported by ESO, the European Society of Minimally Invasive Neurological Therapy (ESMINT), the European Society of Neuroradiology (ESNR), and the European Academy of Neurology (EAN).

Journal ArticleDOI
TL;DR: EffectorP is the first prediction program for fungal effectors based on machine learning and will facilitate functional fungal effector studies and improve the understanding of effectors in plant-pathogen interactions.
Abstract: Eukaryotic filamentous plant pathogens secrete effector proteins that modulate the host cell to facilitate infection. Computational effector candidate identification and subsequent functional characterization delivers valuable insights into plant-pathogen interactions. However, effector prediction in fungi has been challenging due to a lack of unifying sequence features such as conserved N-terminal sequence motifs. Fungal effectors are commonly predicted from secretomes based on criteria such as small size and cysteine richness, an approach that suffers from poor accuracy. We present EffectorP, which pioneers the application of machine learning to fungal effector prediction. EffectorP improves fungal effector prediction from secretomes based on a robust signal of sequence-derived properties, achieving sensitivity and specificity of over 80%. Features that discriminate fungal effectors from secreted noneffectors are predominantly sequence length, molecular weight and protein net charge, as well as cysteine, serine and tryptophan content. We demonstrate that EffectorP is powerful when combined with in planta expression data for predicting high-priority effector candidates. EffectorP is the first prediction program for fungal effectors based on machine learning. Our findings will facilitate functional fungal effector studies and improve our understanding of effectors in plant-pathogen interactions. EffectorP is available at http://effectorp.csiro.au.
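The discriminative properties the paper identifies (length, molecular weight, net charge, residue content) are straightforward to compute from a protein sequence. This sketch uses approximate average residue masses and a crude charge rule; it illustrates the feature types only and is not EffectorP's actual feature extraction or model:

```python
# Approximate average residue masses in daltons (water subtracted).
MW = {'A': 71.08, 'C': 103.14, 'D': 115.09, 'E': 129.12, 'F': 147.18,
      'G': 57.05, 'H': 137.14, 'I': 113.16, 'K': 128.17, 'L': 113.16,
      'M': 131.19, 'N': 114.10, 'P': 97.12, 'Q': 128.13, 'R': 156.19,
      'S': 87.08, 'T': 101.10, 'V': 99.13, 'W': 186.21, 'Y': 163.18}

def features(seq):
    """Sequence-derived features of the kind used to separate effectors
    from secreted noneffectors."""
    n = len(seq)
    return {
        'length': n,
        'mol_weight': sum(MW[aa] for aa in seq) + 18.02,  # add back water
        # Crude net charge at neutral pH: (K + R) minus (D + E);
        # histidine and the termini are ignored in this sketch.
        'net_charge': sum(seq.count(aa) for aa in 'KR')
                      - sum(seq.count(aa) for aa in 'DE'),
        'cys_frac': seq.count('C') / n,
        'ser_frac': seq.count('S') / n,
        'trp_frac': seq.count('W') / n,
    }

f = features("MKCSSCWDEK")   # hypothetical toy sequence
print(f)
```

In an EffectorP-style workflow, vectors like these would be computed for every secreted protein and fed to a trained classifier rather than thresholded by hand.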

Journal ArticleDOI
B. P. Abbott1, Richard J. Abbott1, T. D. Abbott2, Matthew Abernathy1  +953 moreInstitutions (106)
TL;DR: It is concluded that the stochastic gravitational-wave background from binary black holes, created from the incoherent superposition of all the merging binaries in the Universe, is potentially measurable by the Advanced LIGO and Advanced Virgo detectors operating at their projected final sensitivity.
Abstract: The LIGO detection of the gravitational wave transient GW150914, from the inspiral and merger of two black holes with masses $\gtrsim 30\, \text{M}_\odot$, suggests a population of binary black holes with relatively high mass. This observation implies that the stochastic gravitational-wave background from binary black holes, created from the incoherent superposition of all the merging binaries in the Universe, could be higher than previously expected. Using the properties of GW150914, we estimate the energy density of such a background from binary black holes. In the most sensitive part of the Advanced LIGO/Virgo band for stochastic backgrounds (near 25 Hz), we predict $\Omega_\text{GW}(f=25\,\text{Hz}) = 1.1_{-0.9}^{+2.7} \times 10^{-9}$ with 90% confidence. This prediction is robustly demonstrated for a variety of formation scenarios with different parameters. The differences between models are small compared to the statistical uncertainty arising from the currently poorly constrained local coalescence rate. We conclude that this background is potentially measurable by the Advanced LIGO/Virgo detectors operating at their projected final sensitivity.
