
Showing papers by "Sapienza University of Rome" published in 2019


Journal ArticleDOI
Peter A. R. Ade1, James E. Aguirre2, Z. Ahmed3, Simone Aiola4  +276 moreInstitutions (53)
TL;DR: The Simons Observatory (SO) is a new cosmic microwave background experiment being built on Cerro Toco in Chile, due to begin observations in the early 2020s.
Abstract: The Simons Observatory (SO) is a new cosmic microwave background experiment being built on Cerro Toco in Chile, due to begin observations in the early 2020s. We describe the scientific goals of the experiment, motivate the design, and forecast its performance. SO will measure the temperature and polarization anisotropy of the cosmic microwave background in six frequency bands centered at: 27, 39, 93, 145, 225 and 280 GHz. The initial configuration of SO will have three small-aperture 0.5-m telescopes and one large-aperture 6-m telescope, with a total of 60,000 cryogenic bolometers. Our key science goals are to characterize the primordial perturbations, measure the number of relativistic species and the mass of neutrinos, test for deviations from a cosmological constant, improve our understanding of galaxy evolution, and constrain the duration of reionization. The small aperture telescopes will target the largest angular scales observable from Chile, mapping ≈ 10% of the sky to a white noise level of 2 μK-arcmin in combined 93 and 145 GHz bands, to measure the primordial tensor-to-scalar ratio, r, at a target level of σ(r)=0.003. The large aperture telescope will map ≈ 40% of the sky at arcminute angular resolution to an expected white noise level of 6 μK-arcmin in combined 93 and 145 GHz bands, overlapping with the majority of the Large Synoptic Survey Telescope sky region and partially with the Dark Energy Spectroscopic Instrument. With up to an order of magnitude lower polarization noise than maps from the Planck satellite, the high-resolution sky maps will constrain cosmological parameters derived from the damping tail, gravitational lensing of the microwave background, the primordial bispectrum, and the thermal and kinematic Sunyaev-Zel'dovich effects, and will aid in delensing the large-angle polarization signal to measure the tensor-to-scalar ratio. The survey will also provide a legacy catalog of 16,000 galaxy clusters and more than 20,000 extragalactic sources.
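As a quick reading aid (not from the paper), the sketch below converts the quoted sky fractions into survey areas, assuming the ≈10% and ≈40% figures at face value.

```python
# Illustrative conversion of the quoted sky fractions into survey areas;
# full sky = 4*pi steradians ~= 41,253 deg^2.
import math

FULL_SKY_DEG2 = 4 * math.pi * (180 / math.pi) ** 2

for survey, fsky in [("small-aperture (~10%)", 0.10), ("large-aperture (~40%)", 0.40)]:
    print(f"{survey}: ~{fsky * FULL_SKY_DEG2:,.0f} deg^2")
# small-aperture (~10%): ~4,125 deg^2
# large-aperture (~40%): ~16,501 deg^2
```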

1,027 citations


Journal ArticleDOI
TL;DR: This review addresses the need to understand the processes and role of oxidative stress (OS) in neurodegenerative diseases, with a focus on the pivotal role played by OS in mitochondrial dysfunction.
Abstract: Oxidative stress is proposed as a regulatory element in ageing and various neurological disorders. The excess of oxidants causes a reduction of antioxidants, which in turn produce an oxidation–reduction imbalance in organisms. Paucity of the antioxidant system generates oxidative stress, characterized by elevated levels of reactive species (oxygen, hydroxyl free radical, and so on). Mitochondria play a key role in ATP supply to cells via oxidative phosphorylation, as well as synthesis of essential biological molecules. Various redox reactions catalyzed by enzymes take place in the oxidative phosphorylation process. An inefficient oxidative phosphorylation may generate reactive oxygen species (ROS), leading to mitochondrial dysfunction. Mitochondrial redox metabolism, phospholipid metabolism, and proteolytic pathways are found to be the major and potential source of free radicals. A lower concentration of ROS is essential for normal cellular signaling, whereas a higher concentration and long-time exposure to ROS cause damage to cellular macromolecules such as DNA, lipids and proteins, ultimately resulting in necrosis and apoptotic cell death. Normal and proper functioning of the central nervous system (CNS) is entirely dependent on the chemical integrity of the brain. It is well established that the brain consumes a large amount of oxygen and is highly rich in lipid content, becoming prone to oxidative stress. A high consumption of oxygen leads to excessive production of ROS. Apart from this, the neuronal membranes are found to be rich in polyunsaturated fatty acids, which are highly susceptible to ROS. Various neurodegenerative diseases such as Parkinson’s disease (PD), Alzheimer’s disease (AD), Huntington’s disease (HD), and amyotrophic lateral sclerosis (ALS), among others, can be the result of biochemical alteration (due to oxidative stress) in biomolecular components. There is a need to understand the processes and role of oxidative stress in neurodegenerative diseases. This review is an effort towards improving our understanding of the pivotal role played by OS in neurodegenerative disorders.

920 citations


Journal ArticleDOI
TL;DR: A consensus scheme for diagnosing malnutrition in adults in clinical settings on a global scale is proposed and it is recommended that the etiologic criteria be used to guide intervention and anticipated outcomes.
Abstract: Rationale: This initiative is focused on building a global consensus around core diagnostic criteria for malnutrition in adults in clinical settings. Methods: In January 2016, the Global Leadership Initiative on Malnutrition (GLIM) was convened by several of the major global clinical nutrition societies. GLIM appointed a core leadership committee and a supporting working group with representatives bringing additional global diversity and expertise. Empirical consensus was reached through a series of face-to-face meetings, telephone conferences, and e-mail communications. Results: A two-step approach for the malnutrition diagnosis was selected, i.e., first, screening to identify "at risk" status by the use of any validated screening tool, and second, assessment for diagnosis and grading the severity of malnutrition. The malnutrition criteria for consideration were retrieved from existing approaches for screening and assessment. Potential criteria were subjected to a ballot among the GLIM core and supporting working group members. The top five ranked criteria included three phenotypic criteria (non-volitional weight loss, low body mass index, and reduced muscle mass) and two etiologic criteria (reduced food intake or assimilation, and inflammation or disease burden). To diagnose malnutrition, at least one phenotypic criterion and one etiologic criterion should be present. Phenotypic metrics for grading severity as Stage 1 (moderate) and Stage 2 (severe) malnutrition are proposed. It is recommended that the etiologic criteria be used to guide intervention and anticipated outcomes. The recommended approach supports classification of malnutrition into four etiology-related diagnosis categories. Conclusion: A consensus scheme for diagnosing malnutrition in adults in clinical settings on a global scale is proposed. Next steps are to secure further collaboration and endorsements from leading nutrition professional societies, to identify overlaps with syndromes like cachexia and sarcopenia, and to promote dissemination, validation studies, and feedback. The diagnostic construct should be re-considered every 3–5 years.
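The two-step logic described in the Results can be summarized in a short sketch. The function below paraphrases the abstract under stated simplifications (criterion names follow the text; thresholds and the Stage 1/Stage 2 severity grading are deliberately omitted) and is not the published GLIM algorithm.

```python
# Hedged sketch of the two-step GLIM logic paraphrased from the abstract.
from dataclasses import dataclass

@dataclass
class GlimFindings:
    weight_loss: bool          # non-volitional weight loss
    low_bmi: bool              # low body mass index
    reduced_muscle_mass: bool  # reduced muscle mass
    reduced_intake: bool       # reduced food intake or assimilation
    inflammation: bool         # inflammation or disease burden

def glim_malnutrition(screen_positive: bool, f: GlimFindings) -> bool:
    """Step 1: any validated screening tool; step 2: >=1 phenotypic AND >=1 etiologic criterion."""
    if not screen_positive:
        return False
    phenotypic = f.weight_loss or f.low_bmi or f.reduced_muscle_mass
    etiologic = f.reduced_intake or f.inflammation
    return phenotypic and etiologic
```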

885 citations


Journal ArticleDOI
TL;DR: This initiative is focused on building a global consensus around core diagnostic criteria for malnutrition in adults in clinical settings.
Abstract: Rationale This initiative is focused on building a global consensus around core diagnostic criteria for malnutrition in adults in clinical settings.

827 citations


Journal ArticleDOI
Abstract: Christian Maaser,a Andreas Sturm,b Stephan R. Vavricka,c Torsten Kucharzik,d Gionata Fiorino,e Vito Annese,f Emma Calabrese,g Daniel C. Baumgart,h Dominik Bettenworth,i Paula Borralho Nunes,j, Johan Burisch,k, Fabiana Castiglione,l Rami Eliakim,m Pierre Ellul,n Yago González-Lama,o Hannah Gordon,p Steve Halligan,q Konstantinos Katsanos,r Uri Kopylov,m Paulo G. Kotze,s Eduards Krustiņš,t Andrea Laghi,u Jimmy K. Limdi,v Florian Rieder,w Jordi Rimola,x Stuart A. Taylor,y Damian Tolan,z Patrick van Rheenen,aa Bram Verstockt,bb, Jaap Stokercc; on behalf of the European Crohn’s and Colitis Organisation [ECCO] and the European Society of Gastrointestinal and Abdominal Radiology [ESGAR]

779 citations


Journal ArticleDOI
B. P. Abbott1, Richard J. Abbott2, T. D. Abbott, Fausto Acernese3  +1157 moreInstitutions (70)
TL;DR: In this paper, the authors improved initial estimates of the binary's properties, including component masses, spins, and tidal parameters, using the known source location, improved modeling, and recalibrated Virgo data.
Abstract: On August 17, 2017, the Advanced LIGO and Advanced Virgo gravitational-wave detectors observed a low-mass compact binary inspiral. The initial sky localization of the source of the gravitational-wave signal, GW170817, allowed electromagnetic observatories to identify NGC 4993 as the host galaxy. In this work, we improve initial estimates of the binary's properties, including component masses, spins, and tidal parameters, using the known source location, improved modeling, and recalibrated Virgo data. We extend the range of gravitational-wave frequencies considered down to 23 Hz, compared to 30 Hz in the initial analysis. We also compare results inferred using several signal models, which are more accurate and incorporate additional physical effects as compared to the initial analysis. We improve the localization of the gravitational-wave source to a 90% credible region of 16 deg². We find tighter constraints on the masses, spins, and tidal parameters, and continue to find no evidence for nonzero component spins. The component masses are inferred to lie between 1.00 and 1.89 M⊙ when allowing for large component spins, and to lie between 1.16 and 1.60 M⊙ (with a total mass 2.73^{+0.04}_{−0.01} M⊙) when the spins are restricted to be within the range observed in Galactic binary neutron stars. Using a precessing model and allowing for large component spins, we constrain the dimensionless spins of the components to be less than 0.50 for the primary and 0.61 for the secondary. Under minimal assumptions about the nature of the compact objects, our constraints for the tidal deformability parameter Λ are (0, 630) when we allow for large component spins, and 300^{+420}_{−230} (using a 90% highest posterior density interval) when restricting the magnitude of the component spins, ruling out several equation-of-state models at the 90% credible level. Finally, with LIGO and GEO600 data, we use a Bayesian analysis to place upper limits on the amplitude and spectral energy density of a possible postmerger signal.
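For readers unfamiliar with the quoted mass ranges, the snippet below computes the chirp mass, the best-measured mass combination for an inspiral, for a few illustrative component-mass pairs chosen (by this note, not by the paper) to sum to the quoted low-spin total of 2.73 M⊙.

```python
# Illustrative chirp-mass values for assumed component-mass pairs summing to ~2.73 Msun.
def chirp_mass(m1: float, m2: float) -> float:
    """(m1*m2)**(3/5) / (m1+m2)**(1/5), in the same units as the inputs."""
    return (m1 * m2) ** 0.6 / (m1 + m2) ** 0.2

for m1, m2 in [(1.57, 1.16), (1.46, 1.27), (1.365, 1.365)]:
    print(f"m1={m1:.3f}, m2={m2:.3f} -> chirp mass ~ {chirp_mass(m1, m2):.3f} Msun")
```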

715 citations


Journal ArticleDOI
Andrea Cossarizza1, Hyun-Dong Chang, Andreas Radbruch, Andreas Acs2  +459 moreInstitutions (160)
TL;DR: These guidelines are a consensus work of a considerable number of members of the immunology and flow cytometry community, providing the theory and key practical aspects of flow cytometry and enabling immunologists to avoid the common errors that often undermine immunological data.
Abstract: These guidelines are a consensus work of a considerable number of members of the immunology and flow cytometry community. They provide the theory and key practical aspects of flow cytometry enabling immunologists to avoid the common errors that often undermine immunological data. Notably, there are comprehensive sections of all major immune cell types with helpful Tables detailing phenotypes in murine and human cells. The latest flow cytometry techniques and applications are also described, featuring examples of the data that can be generated and, importantly, how the data can be analysed. Furthermore, there are sections detailing tips, tricks and pitfalls to avoid, all written and peer-reviewed by leading experts in the field, making this an essential research companion.

698 citations


Proceedings ArticleDOI
15 Jun 2019
TL;DR: This model learns the semantic labels in a supervised fashion, and broadens its understanding of the data by learning from self-supervised signals how to solve a jigsaw puzzle on the same images, which helps the network to learn the concepts of spatial correlation while acting as a regularizer for the classification task.
Abstract: Human adaptability relies crucially on the ability to learn and merge knowledge both from supervised and unsupervised learning: the parents point out few important concepts, but then the children fill in the gaps on their own. This is particularly effective, because supervised learning can never be exhaustive and thus learning autonomously allows to discover invariances and regularities that help to generalize. In this paper we propose to apply a similar approach to the task of object recognition across domains: our model learns the semantic labels in a supervised fashion, and broadens its understanding of the data by learning from self-supervised signals how to solve a jigsaw puzzle on the same images. This secondary task helps the network to learn the concepts of spatial correlation while acting as a regularizer for the classification task. Multiple experiments on the PACS, VLCS, Office-Home and digits datasets confirm our intuition and show that this simple method outperforms previous domain generalization and adaptation solutions. An ablation study further illustrates the inner workings of our approach.
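A minimal sketch of the multi-task idea described above, assuming a shared feature extractor with two linear heads (one for semantic labels, one for jigsaw-permutation indices). Module names, the permutation count, and the loss weight `alpha` are illustrative assumptions, not the authors' code.

```python
# PyTorch-style sketch: supervised classification + self-supervised jigsaw task.
import torch.nn as nn

class JigsawClassifier(nn.Module):
    def __init__(self, backbone: nn.Module, feat_dim: int,
                 num_classes: int, num_permutations: int = 30):
        super().__init__()
        self.backbone = backbone                               # shared features
        self.cls_head = nn.Linear(feat_dim, num_classes)       # semantic labels
        self.jig_head = nn.Linear(feat_dim, num_permutations)  # permutation ids

    def forward(self, x):
        f = self.backbone(x)
        return self.cls_head(f), self.jig_head(f)

def joint_loss(model, ordered_imgs, labels, shuffled_imgs, perm_ids, alpha=0.7):
    """Supervised classification loss plus weighted self-supervised jigsaw loss."""
    ce = nn.CrossEntropyLoss()
    cls_logits, _ = model(ordered_imgs)   # object recognition on intact images
    _, jig_logits = model(shuffled_imgs)  # permutation recognition on shuffled tiles
    return ce(cls_logits, labels) + alpha * ce(jig_logits, perm_ids)
```

The self-supervised term acts as the regularizer mentioned in the TL;DR: it is computed on the same images, so no extra labels are needed.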

678 citations


Journal ArticleDOI
TL;DR: In this article, the authors overview the physics of exotic dark compact objects and their observational status, including the observational evidence for black holes with current and future experiments.
Abstract: Very compact objects probe extreme gravitational fields and may be the key to understand outstanding puzzles in fundamental physics. These include the nature of dark matter, the fate of spacetime singularities, or the loss of unitarity in Hawking evaporation. The standard astrophysical description of collapsing objects tells us that massive, dark and compact objects are black holes. Any observation suggesting otherwise would be an indication of beyond-the-standard-model physics. Null results strengthen and quantify the Kerr black hole paradigm. The advent of gravitational-wave astronomy and precise measurements with very long baseline interferometry allow one to finally probe into such foundational issues. We overview the physics of exotic dark compact objects and their observational status, including the observational evidence for black holes with current and future experiments.

572 citations


Journal ArticleDOI
A. Abada1, Marcello Abbrescia2, Marcello Abbrescia3, Shehu S. AbdusSalam4  +1491 moreInstitutions (239)
TL;DR: In this article, the authors present the second volume of the Future Circular Collider Conceptual Design Report, devoted to the electron-positron collider FCC-ee, and present the accelerator design, performance reach, a staged operation scenario, the underlying technologies, civil engineering, technical infrastructure, and an implementation plan.
Abstract: In response to the 2013 Update of the European Strategy for Particle Physics, the Future Circular Collider (FCC) study was launched, as an international collaboration hosted by CERN. This study covers a highest-luminosity high-energy lepton collider (FCC-ee) and an energy-frontier hadron collider (FCC-hh), which could, successively, be installed in the same 100 km tunnel. The scientific capabilities of the integrated FCC programme would serve the worldwide community throughout the 21st century. The FCC study also investigates an LHC energy upgrade, using FCC-hh technology. This document constitutes the second volume of the FCC Conceptual Design Report, devoted to the electron-positron collider FCC-ee. After summarizing the physics discovery opportunities, it presents the accelerator design, performance reach, a staged operation scenario, the underlying technologies, civil engineering, technical infrastructure, and an implementation plan. FCC-ee can be built with today’s technology. Most of the FCC-ee infrastructure could be reused for FCC-hh. Combining concepts from past and present lepton colliders and adding a few novel elements, the FCC-ee design promises outstandingly high luminosity. This will make the FCC-ee a unique precision instrument to study the heaviest known particles (Z, W and H bosons and the top quark), offering great direct and indirect sensitivity to new physics.

526 citations


Journal ArticleDOI
TL;DR: This work is an updated overview of apigenin, focusing on its health-promoting effects/therapeutic functions and, in particular, results of in vivo research, and an introduction to its chemistry.
Abstract: Several plant bioactive compounds have exhibited functional activities that suggest they could play a remarkable role in preventing a wide range of chronic diseases. The largest group of naturally-occurring polyphenols are the flavonoids, including apigenin. The present work is an updated overview of apigenin, focusing on its health-promoting effects/therapeutic functions and, in particular, results of in vivo research. In addition to an introduction to its chemistry, nutraceutical features have also been described. The main findings from in vivo research, including animal models and human studies, are summarized. The beneficial indications are reported and discussed in detail, including effects in diabetes, amnesia and Alzheimer’s disease, depression and insomnia, cancer, etc. Finally, data on flavonoids from the main public databases are gathered to highlight apigenin’s key role in dietary assessment and in the evaluation of a formulated diet, to determine exposure and to investigate its health effects in vivo.

Journal ArticleDOI
TL;DR: It is argued that implementation of agreed-upon standards for models in biodiversity assessments would promote transparency and repeatability, eventually leading to higher quality of the models and the inferences used in assessments.
Abstract: Demand for models in biodiversity assessments is rising, but which models are adequate for the task? We propose a set of best-practice standards and detailed guidelines enabling scoring of studies based on species distribution models for use in biodiversity assessments. We reviewed and scored 400 modeling studies over the past 20 years using the proposed standards and guidelines. We detected low model adequacy overall, but with a marked tendency of improvement over time in model building and, to a lesser degree, in biological data and model evaluation. We argue that implementation of agreed-upon standards for models in biodiversity assessments would promote transparency and repeatability, eventually leading to higher quality of the models and the inferences used in assessments. We encourage broad community participation toward the expansion and ongoing development of the proposed standards and guidelines.

Journal ArticleDOI
TL;DR: Patients with chronic atrophic gastritis or intestinal metaplasia (IM) are at risk for gastric adenocarcinoma, and identification and surveillance of patients with precancerous gastric conditions is cost-effective.
Abstract: Patients with chronic atrophic gastritis or intestinal metaplasia (IM) are at risk for gastric adenocarcinoma. This underscores the importance of diagnosis and risk stratification for these patients. High definition endoscopy with chromoendoscopy (CE) is better than high definition white-light endoscopy alone for this purpose. Virtual CE can guide biopsies for staging atrophic and metaplastic changes and can target neoplastic lesions. Biopsies should be taken from at least two topographic sites (antrum and corpus) and labelled in two separate vials. For patients with mild to moderate atrophy restricted to the antrum there is no evidence to recommend surveillance. In patients with IM at a single location but with a family history of gastric cancer, incomplete IM, or persistent Helicobacter pylori gastritis, endoscopic surveillance with CE and guided biopsies may be considered in 3 years. Patients with advanced stages of atrophic gastritis should be followed up with a high quality endoscopy every 3 years. In patients with dysplasia, in the absence of an endoscopically defined lesion, immediate high quality endoscopic reassessment with CE is recommended. Patients with an endoscopically visible lesion harboring low or high grade dysplasia or carcinoma should undergo staging and treatment. H. pylori eradication heals nonatrophic chronic gastritis, may lead to regression of atrophic gastritis, and reduces the risk of gastric cancer in patients with these conditions, and it is recommended. H. pylori eradication is also recommended for patients with neoplasia after endoscopic therapy. In intermediate to high risk regions, identification and surveillance of patients with precancerous gastric conditions is cost-effective.
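The surveillance rules listed above can be read as a small decision procedure. The sketch below paraphrases them under stated simplifications (free-text return values, simplified inputs, no staging system encoded) and is not a substitute for the guideline.

```python
# Hedged decision-procedure sketch paraphrasing the surveillance rules above.
def surveillance_recommendation(*, visible_lesion: bool, dysplasia: bool,
                                advanced_atrophy: bool, antrum_only_mild_moderate: bool,
                                im_single_location: bool, family_history: bool,
                                incomplete_im: bool, persistent_h_pylori: bool) -> str:
    if visible_lesion:
        return "staging and treatment"
    if dysplasia:
        return "immediate high-quality endoscopic reassessment with chromoendoscopy"
    if advanced_atrophy:
        return "high-quality endoscopy every 3 years"
    if im_single_location and (family_history or incomplete_im or persistent_h_pylori):
        return "chromoendoscopy with guided biopsies may be considered in 3 years"
    if antrum_only_mild_moderate:
        return "no surveillance recommended"
    return "not covered by this simplified sketch"
```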

Journal ArticleDOI
TL;DR: In patients with a recent history of embolic stroke of undetermined source, dabigatran was not superior to aspirin in preventing recurrent stroke, but there were more clinically relevant nonmajor bleeding events in the dabigatran group.
Abstract: BACKGROUND: Cryptogenic strokes constitute 20 to 30% of ischemic strokes, and most cryptogenic strokes are considered to be embolic and of undetermined source. An earlier randomized trial showed that rivaroxaban is no more effective than aspirin in preventing recurrent stroke after a presumed embolic stroke from an undetermined source. Whether dabigatran would be effective in preventing recurrent strokes after this type of stroke was unclear. METHODS: We conducted a multicenter, randomized, double-blind trial of dabigatran at a dose of 150 mg or 110 mg twice daily as compared with aspirin at a dose of 100 mg once daily in patients who had had an embolic stroke of undetermined source. The primary outcome was recurrent stroke. The primary safety outcome was major bleeding. RESULTS: A total of 5390 patients were enrolled at 564 sites and were randomly assigned to receive dabigatran (2695 patients) or aspirin (2695 patients). During a median follow-up of 19 months, recurrent strokes occurred in 177 patients (6.6%) in the dabigatran group (4.1% per year) and in 207 patients (7.7%) in the aspirin group (4.8% per year) (hazard ratio, 0.85; 95% confidence interval [CI], 0.69 to 1.03; P = 0.10). Ischemic strokes occurred in 172 patients (4.0% per year) and 203 patients (4.7% per year), respectively (hazard ratio, 0.84; 95% CI, 0.68 to 1.03). Major bleeding occurred in 77 patients (1.7% per year) in the dabigatran group and in 64 patients (1.4% per year) in the aspirin group (hazard ratio, 1.19; 95% CI, 0.85 to 1.66). Clinically relevant nonmajor bleeding occurred in 70 patients (1.6% per year) and 41 patients (0.9% per year), respectively. CONCLUSIONS: In patients with a recent history of embolic stroke of undetermined source, dabigatran was not superior to aspirin in preventing recurrent stroke. The incidence of major bleeding was not greater in the dabigatran group than in the aspirin group, but there were more clinically relevant nonmajor bleeding events in the dabigatran group. (Funded by Boehringer Ingelheim; RE-SPECT ESUS ClinicalTrials.gov number, NCT02239120.).
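The per-year rates quoted above can be reproduced approximately from the raw counts and the median follow-up. The check below assumes the simple estimate events / (patients × follow-up years) rather than the trial's time-to-event analysis, so small differences from the quoted values are expected.

```python
# Approximate reproduction of the quoted annualized event rates.
def annualized_rate(events: int, patients: int, followup_years: float) -> float:
    return events / (patients * followup_years)

followup_years = 19 / 12  # median follow-up of 19 months
print(f"recurrent stroke, dabigatran: {annualized_rate(177, 2695, followup_years):.1%}/yr")  # quoted: 4.1%/yr
print(f"recurrent stroke, aspirin:    {annualized_rate(207, 2695, followup_years):.1%}/yr")  # quoted: 4.8%/yr
```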


Journal ArticleDOI
B. P. Abbott1, Richard J. Abbott1, T. D. Abbott2, Sheelu Abraham3  +1215 moreInstitutions (134)
TL;DR: In this paper, the mass, spin, and redshift distributions of binary black hole (BBH) mergers are analyzed using phenomenological population models applied to the Advanced LIGO and Advanced Virgo observations.
Abstract: We present results on the mass, spin, and redshift distributions with phenomenological population models using the 10 binary black hole (BBH) mergers detected in the first and second observing runs completed by Advanced LIGO and Advanced Virgo. We constrain properties of the BBH mass spectrum using models with a range of parameterizations of the BBH mass and spin distributions. We find that the mass distribution of the more massive BH in such binaries is well approximated by models with no more than 1% of BHs more massive than 45 M⊙ and a power-law index of (90% credibility). We also show that BBHs are unlikely to be composed of BHs with large spins aligned to the orbital angular momentum. Modeling the evolution of the BBH merger rate with redshift, we show that it is flat or increasing with redshift with 93% probability. Marginalizing over uncertainties in the BBH population, we find robust estimates of the BBH merger rate density of R = (90% credibility). As the BBH catalog grows in future observing runs, we expect that uncertainties in the population model parameters will shrink, potentially providing insights into the formation of BHs via supernovae, binary interactions of massive stars, stellar cluster dynamics, and the formation history of BHs across cosmic time.

Journal ArticleDOI
Albert M. Sirunyan, Armen Tumasyan, Wolfgang Adam1, Federico Ambrogi1  +2265 moreInstitutions (153)
TL;DR: Combined measurements of the production and decay rates of the Higgs boson, as well as its couplings to vector bosons and fermions, are presented and constraints are placed on various two Higgs doublet models.
Abstract: Combined measurements of the production and decay rates of the Higgs boson, as well as its couplings to vector bosons and fermions, are presented. The analysis uses the LHC proton–proton collision data set recorded with the CMS detector in 2016 at $\sqrt{s}=13\,\text{TeV}$, corresponding to an integrated luminosity of 35.9 $\text{fb}^{-1}$. The combination is based on analyses targeting the five main Higgs boson production mechanisms (gluon fusion, vector boson fusion, and associated production with a $\mathrm{W}$ or $\mathrm{Z}$ boson, or a top quark-antiquark pair) and the following decay modes: $\mathrm{H}\rightarrow\gamma\gamma$, $\mathrm{ZZ}$, $\mathrm{WW}$, $\tau\tau$, $\mathrm{bb}$, and $\mu\mu$. Searches for invisible Higgs boson decays are also considered. The best-fit ratio of the signal yield to the standard model expectation is measured to be $\mu = 1.17 \pm 0.10$, assuming a Higgs boson mass of $125.09\,\text{GeV}$. Additional results are given for various assumptions on the scaling behavior of the production and decay modes, including generic parametrizations based on ratios of cross sections and branching fractions or couplings. The results are compatible with the standard model predictions in all parametrizations considered. In addition, constraints are placed on various two Higgs doublet models.
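As a rough reading aid for the best-fit signal strength, the snippet below treats the quoted uncertainty as a symmetric Gaussian and computes the deviation from the standard-model expectation of one; the paper's likelihood-based treatment is more detailed.

```python
# Back-of-the-envelope compatibility check for the quoted signal strength.
mu, sigma = 1.17, 0.10
print(f"deviation from the SM expectation (mu = 1): {(mu - 1.0) / sigma:.1f} sigma")  # ~1.7 sigma
```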

Journal ArticleDOI
Roel Aaij, C. Abellán Beteta1, Bernardo Adeva2, Marco Adinolfi3  +858 moreInstitutions (57)
TL;DR: This is the most precise measurement of R_{K} to date and is compatible with the standard model at the level of 2.5 standard deviations.
Abstract: A measurement of the ratio of branching fractions of the decays B^{+} → K^{+}μ^{+}μ^{−} and B^{+} → K^{+}e^{+}e^{−} is presented. The proton-proton collision data used correspond to an integrated luminosity of 5.0 fb^{−1} recorded with the LHCb experiment at center-of-mass energies of 7, 8, and 13 TeV. For the dilepton mass-squared range 1.1 < q^{2} < 6.0 GeV^{2}/c^{4} the ratio of branching fractions is measured to be R_{K} = 0.846^{+0.060}_{−0.054}{}^{+0.016}_{−0.014}, where the first uncertainty is statistical and the second systematic. This is the most precise measurement of R_{K} to date and is compatible with the standard model at the level of 2.5 standard deviations.
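The quoted 2.5 standard deviations can be roughly reproduced by adding the statistical and systematic uncertainties in quadrature; the check below uses the upper (toward the standard-model value of 1) errors and is only an approximation to the collaboration's likelihood treatment.

```python
# Rough consistency check of the quoted tension with the standard model (R_K = 1).
rk = 0.846
stat_up, syst_up = 0.060, 0.016
sigma_up = (stat_up**2 + syst_up**2) ** 0.5
print(f"(1 - R_K) / sigma = {(1.0 - rk) / sigma_up:.1f} sigma")  # ~2.5 sigma
```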

Journal ArticleDOI
TL;DR: It is already possible to envision the need to move beyond 5G and design a new architecture incorporating innovative technologies to satisfy new needs at both the individual and societal levels.
Abstract: With its ability to provide a single platform enabling a variety of services, such as enhanced mobile broadband communications, virtual reality, automated driving, and the Internet of Things, 5G represents a breakthrough in the design of communication networks. Nevertheless, considering the increasing requests for new services and predicting the development of new technologies within a decade, it is already possible to envision the need to move beyond 5G and design a new architecture incorporating innovative technologies to satisfy new needs at both the individual and societal levels.

Journal ArticleDOI
B. P. Abbott1, Richard J. Abbott1, T. D. Abbott2, Fausto Acernese3  +1237 moreInstitutions (131)
TL;DR: In this paper, the authors place constraints on the dipole radiation and possible deviations from GR in the post-Newtonian coefficients that govern the inspiral regime of a binary neutron star inspiral.
Abstract: The recent discovery by Advanced LIGO and Advanced Virgo of a gravitational wave signal from a binary neutron star inspiral has enabled tests of general relativity (GR) with this new type of source. This source, for the first time, permits tests of strong-field dynamics of compact binaries in the presence of matter. In this Letter, we place constraints on the dipole radiation and possible deviations from GR in the post-Newtonian coefficients that govern the inspiral regime. Bounds on modified dispersion of gravitational waves are obtained; in combination with information from the observed electromagnetic counterpart we can also constrain effects due to large extra dimensions. Finally, the polarization content of the gravitational wave signal is studied. The results of all tests performed here show good agreement with GR.

Journal ArticleDOI
01 Jan 2019-Pain
TL;DR: The most common conditions of peripheral neuropathic pain are trigeminal neuralgia, peripheral nerve injury, painful polyneuropathy, postherpetic neural gia, and painful radiculopathy.
Abstract: The upcoming 11th revision of the International Statistical Classification of Diseases and Related Health Problems (ICD) of the World Health Organization (WHO) offers a unique opportunity to improve the representation of painful disorders. For this purpose, the International Association for the Study of Pain (IASP) has convened an interdisciplinary task force of pain specialists. Here, we present the case for a reclassification of nervous system lesions or diseases associated with persistent or recurrent pain for ≥3 months. The new classification lists the most common conditions of peripheral neuropathic pain: trigeminal neuralgia, peripheral nerve injury, painful polyneuropathy, postherpetic neuralgia, and painful radiculopathy. Conditions of central neuropathic pain include pain caused by spinal cord or brain injury, poststroke pain, and pain associated with multiple sclerosis. Diseases not explicitly mentioned in the classification are captured in residual categories of ICD-11. Conditions of chronic neuropathic pain are either insufficiently defined or missing in the current version of the ICD, despite their prevalence and clinical importance. We provide the short definitions of diagnostic entities for which we submitted more detailed content models to the WHO. Definitions and content models were established in collaboration with the Classification Committee of the IASP's Neuropathic Pain Special Interest Group (NeuPSIG). Up to 10% of the general population experience neuropathic pain. The majority of these patients do not receive satisfactory relief with existing treatments. A precise classification of chronic neuropathic pain in ICD-11 is necessary to document this public health need and the therapeutic challenges related to chronic neuropathic pain.

Journal ArticleDOI
A. Abada1, Marcello Abbrescia2, Marcello Abbrescia3, Shehu S. AbdusSalam4  +1496 moreInstitutions (238)
TL;DR: In this paper, the authors describe the detailed design and preparation of a construction project for a post-LHC circular energy frontier collider in collaboration with national institutes, laboratories and universities worldwide, and enhanced by a strong participation of industrial partners.
Abstract: Particle physics has arrived at an important moment of its history. The discovery of the Higgs boson, with a mass of 125 GeV, completes the matrix of particles and interactions that has constituted the “Standard Model” for several decades. This model is a consistent and predictive theory, which has so far proven successful at describing all phenomena accessible to collider experiments. However, several experimental facts do require the extension of the Standard Model and explanations are needed for observations such as the abundance of matter over antimatter, the striking evidence for dark matter and the non-zero neutrino masses. Theoretical issues such as the hierarchy problem, and, more in general, the dynamical origin of the Higgs mechanism, do likewise point to the existence of physics beyond the Standard Model. This report contains the description of a novel research infrastructure based on a highest-energy hadron collider with a centre-of-mass collision energy of 100 TeV and an integrated luminosity of at least a factor of 5 larger than the HL-LHC. It will extend the current energy frontier by almost an order of magnitude. The mass reach for direct discovery will reach several tens of TeV, and allow, for example, to produce new particles whose existence could be indirectly exposed by precision measurements during the earlier preceding e+e– collider phase. This collider will also precisely measure the Higgs self-coupling and thoroughly explore the dynamics of electroweak symmetry breaking at the TeV scale, to elucidate the nature of the electroweak phase transition. WIMPs as thermal dark matter candidates will be discovered, or ruled out. As a single project, this particle collider infrastructure will serve the world-wide physics community for about 25 years and, in combination with a lepton collider (see FCC conceptual design report volume 2), will provide a research tool until the end of the 21st century. Collision energies beyond 100 TeV can be considered when using high-temperature superconductors. The European Strategy for Particle Physics (ESPP) update 2013 stated “To stay at the forefront of particle physics, Europe needs to be in a position to propose an ambitious post-LHC accelerator project at CERN by the time of the next Strategy update”. The FCC study has implemented the ESPP recommendation by developing a long-term vision for an “accelerator project in a global context”. This document describes the detailed design and preparation of a construction project for a post-LHC circular energy frontier collider “in collaboration with national institutes, laboratories and universities worldwide”, and enhanced by a strong participation of industrial partners. Now, a coordinated preparation effort can be based on a core of an ever-growing consortium of already more than 135 institutes worldwide. The technology for constructing a high-energy circular hadron collider can be brought to the technology readiness level required for constructing within the coming ten years through a focused R&D programme. 
The FCC-hh concept comprises in the baseline scenario a power-saving, low-temperature superconducting magnet system based on an evolution of the Nb3Sn technology pioneered at the HL-LHC, an energy-efficient cryogenic refrigeration infrastructure based on a neon-helium (Nelium) light gas mixture, a high-reliability and low loss cryogen distribution infrastructure based on Invar, high-power distributed beam transfer using superconducting elements and local magnet energy recovery and re-use technologies that are already gradually introduced at other CERN accelerators. On a longer timescale, high-temperature superconductors can be developed together with industrial partners to achieve an even more energy efficient particle collider or to reach even higher collision energies.The re-use of the LHC and its injector chain, which also serve for a concurrently running physics programme, is an essential lever to come to an overall sustainable research infrastructure at the energy frontier. Strategic R&D for FCC-hh aims at minimising construction cost and energy consumption, while maximising the socio-economic impact. It will mitigate technology-related risks and ensure that industry can benefit from an acceptable utility. Concerning the implementation, a preparatory phase of about eight years is both necessary and adequate to establish the project governance and organisation structures, to build the international machine and experiment consortia, to develop a territorial implantation plan in agreement with the host-states’ requirements, to optimise the disposal of land and underground volumes, and to prepare the civil engineering project. Such a large-scale, international fundamental research infrastructure, tightly involving industrial partners and providing training at all education levels, will be a strong motor of economic and societal development in all participating nations. The FCC study has implemented a set of actions towards a coherent vision for the world-wide high-energy and particle physics community, providing a collaborative framework for topically complementary and geographically well-balanced contributions. This conceptual design report lays the foundation for a subsequent infrastructure preparatory and technical design phase.

Journal ArticleDOI
A. Abada1, Marcello Abbrescia2, Marcello Abbrescia3, Shehu S. AbdusSalam4  +1501 moreInstitutions (239)
TL;DR: In this article, the physics opportunities of the Future Circular Collider (FCC) were reviewed, covering its e+e-, pp, ep and heavy ion programs, and the measurement capabilities of each FCC component, addressing the study of electroweak, Higgs and strong interactions.
Abstract: We review the physics opportunities of the Future Circular Collider, covering its e+e-, pp, ep and heavy ion programmes. We describe the measurement capabilities of each FCC component, addressing the study of electroweak, Higgs and strong interactions, the top quark and flavour, as well as phenomena beyond the Standard Model. We highlight the synergy and complementarity of the different colliders, which will contribute to a uniquely coherent and ambitious research programme, providing an unmatchable combination of precision and sensitivity to new physics.

Journal ArticleDOI
Roel Aaij, C. Abellán Beteta1, Bernardo Adeva2, Marco Adinolfi3  +877 moreInstitutions (60)
TL;DR: In this paper, a new pentaquark state, P_{c}(4312)+, was discovered with a statistical significance of 7.3σ in a data sample of Λ_{b}^{0}→J/ψpK^{-} decays, which is an order of magnitude larger than that previously analyzed by the LHCb Collaboration.
Abstract: A narrow pentaquark state, P_{c}(4312)^{+}, decaying to J/ψp, is discovered with a statistical significance of 7.3σ in a data sample of Λ_{b}^{0}→J/ψpK^{-} decays, which is an order of magnitude larger than that previously analyzed by the LHCb Collaboration. The P_{c}(4450)^{+} pentaquark structure formerly reported by LHCb is confirmed and observed to consist of two narrow overlapping peaks, P_{c}(4440)^{+} and P_{c}(4457)^{+}, where the statistical significance of this two-peak interpretation is 5.4σ. The proximity of the Σ_{c}^{+}D̄^{0} and Σ_{c}^{+}D̄^{*0} thresholds to the observed narrow peaks suggests that they play an important role in the dynamics of these states.
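To make the final sentence concrete, the snippet below adds approximate PDG masses (rounded values assumed by this note, not numbers from the paper) to show how close the Σ_{c}^{+}D̄^{0} and Σ_{c}^{+}D̄^{*0} thresholds sit to the observed peaks.

```python
# Threshold arithmetic behind the last sentence, using approximate PDG masses in MeV.
M_SIGMA_C_PLUS = 2452.7   # Sigma_c(2455)+
M_DBAR0 = 1864.8          # anti-D0
M_DSTARBAR0 = 2006.9      # anti-D*(2007)0

print(f"Sigma_c+ Dbar0  threshold ~ {M_SIGMA_C_PLUS + M_DBAR0:.0f} MeV  (cf. Pc(4312)+)")
print(f"Sigma_c+ Dbar*0 threshold ~ {M_SIGMA_C_PLUS + M_DSTARBAR0:.0f} MeV  (cf. Pc(4457)+)")
```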

Journal ArticleDOI
TL;DR: A comprehensive review of the state of the art in this active field, with a due balance between theoretical, experimental and technological results, can be found in this article, where significant achievements are presented in tables or in schematic figures, in order to convey a global perspective of the several horizons that fall under the name of photonic quantum information.
Abstract: Photonic quantum technologies represent a promising platform for several applications, ranging from long-distance communications to the simulation of complex phenomena. Indeed, the advantages offered by single photons do make them the candidate of choice for carrying quantum information in a broad variety of areas with a versatile approach. Furthermore, recent technological advances are now enabling first concrete applications of photonic quantum information processing. The goal of this manuscript is to provide the reader with a comprehensive review of the state of the art in this active field, with a due balance between theoretical, experimental and technological results. When more convenient, we will present significant achievements in tables or in schematic figures, in order to convey a global perspective of the several horizons that fall under the name of photonic quantum information.

Journal ArticleDOI
Vagheesh M. Narasimhan1, Nick Patterson2, Nick Patterson3, Priya Moorjani4, Nadin Rohland3, Nadin Rohland1, Rebecca Bernardos1, Swapan Mallick3, Swapan Mallick1, Swapan Mallick5, Iosif Lazaridis1, Nathan Nakatsuka6, Nathan Nakatsuka1, Iñigo Olalde1, Mark Lipson1, Alexander M. Kim1, Luca M. Olivieri, Alfredo Coppa7, Massimo Vidale8, James Mallory9, Vyacheslav Moiseyev10, Egor Kitov11, Egor Kitov10, Janet Monge12, Nicole Adamski5, Nicole Adamski1, Neel Alex4, Nasreen Broomandkhoshbacht1, Nasreen Broomandkhoshbacht5, Francesca Candilio13, Kimberly Callan5, Kimberly Callan1, Olivia Cheronet13, Olivia Cheronet14, Brendan J. Culleton15, Matthew Ferry5, Matthew Ferry1, Daniel Fernandes, Suzanne Freilich14, Beatriz Gamarra13, Daniel Gaudio13, Mateja Hajdinjak16, Eadaoin Harney1, Eadaoin Harney5, Thomas K. Harper15, Denise Keating13, Ann Marie Lawson1, Ann Marie Lawson5, Matthew Mah3, Matthew Mah5, Matthew Mah1, Kirsten Mandl14, Megan Michel5, Megan Michel1, Mario Novak13, Jonas Oppenheimer1, Jonas Oppenheimer5, Niraj Rai17, Niraj Rai18, Kendra Sirak13, Kendra Sirak19, Kendra Sirak1, Viviane Slon16, Kristin Stewardson5, Kristin Stewardson1, Fatma Zalzala5, Fatma Zalzala1, Zhao Zhang1, Gaziz Akhatov, Anatoly N. Bagashev, Alessandra Bagnera, Bauryzhan Baitanayev, Julio Bendezu-Sarmiento20, Arman A. Bissembaev, Gian Luca Bonora, T Chargynov21, T. A. Chikisheva10, Petr K. Dashkovskiy22, Anatoly P. Derevianko10, Miroslav Dobeš23, Katerina Douka24, Katerina Douka16, Nadezhda Dubova10, Meiram N. Duisengali, Dmitry Enshin, Andrey Epimakhov25, Alexey Fribus26, Dorian Q. Fuller27, Dorian Q. Fuller28, Alexander Goryachev, Andrey Gromov10, S. P. Grushin22, Bryan Hanks29, Margaret A. Judd29, Erlan Kazizov, Aleksander Khokhlov30, Aleksander P. Krygin, Elena Kupriyanova31, Pavel Kuznetsov30, Donata Luiselli32, Farhod Maksudov33, Aslan M. Mamedov, Talgat B. Mamirov, Christopher Meiklejohn34, Deborah C. Merrett35, Roberto Micheli, Oleg Mochalov30, Samariddin Mustafokulov33, Ayushi Nayak16, Davide Pettener32, Richard Potts36, Dmitry Razhev, Marina Petrovna Rykun37, Stefania Sarno32, Tatyana M. Savenkova, Kulyan Sikhymbaeva, Sergey Mikhailovich Slepchenko, Oroz A. Soltobaev21, Nadezhda Stepanova10, Svetlana V. Svyatko10, Svetlana V. Svyatko9, Kubatbek Tabaldiev, Maria Teschler-Nicola14, Maria Teschler-Nicola38, Alexey A. Tishkin22, Vitaly V. Tkachev, Sergey Vasilyev10, Petr Velemínský39, Dmitriy Voyakin, Antonina Yermolayeva, Muhammad Zahir16, Muhammad Zahir40, Valery S. Zubkov, A. V. Zubova10, Vasant Shinde41, Carles Lalueza-Fox42, Matthias Meyer16, David W. Anthony43, Nicole Boivin16, Kumarasamy Thangaraj18, Douglas J. Kennett15, Douglas J. Kennett44, Michael D. Frachetti45, Ron Pinhasi14, Ron Pinhasi13, David Reich 
06 Sep 2019-Science
TL;DR: It is shown that Steppe ancestry then integrated further south in the first half of the second millennium BCE, contributing up to 30% of the ancestry of modern groups in South Asia, supporting the idea that the archaeologically documented dispersal of domesticates was accompanied by the spread of people from multiple centers of domestication.
Abstract: By sequencing 523 ancient humans, we show that the primary source of ancestry in modern South Asians is a prehistoric genetic gradient between people related to early hunter-gatherers of Iran and Southeast Asia. After the Indus Valley Civilization's decline, its people mixed with individuals in the southeast to form one of the two main ancestral populations of South Asia, whose direct descendants live in southern India. Simultaneously, they mixed with descendants of Steppe pastoralists who, starting around 4000 years ago, spread via Central Asia to form the other main ancestral population. The Steppe ancestry in South Asia has the same profile as that in Bronze Age Eastern Europe, tracking a movement of people that affected both regions and that likely spread the distinctive features shared between Indo-Iranian and Balto-Slavic languages.

Journal ArticleDOI
Albert M. Sirunyan1, Armen Tumasyan1, Wolfgang Adam, Federico Ambrogi  +2298 moreInstitutions (160)
TL;DR: In this article, a search for invisible decays of a Higgs boson via vector boson fusion is performed using proton-proton collision data collected with the CMS detector at the LHC in 2016 at a center-of-mass energy root s = 13 TeV, corresponding to an integrated luminosity of 35.9fb(-1).

Journal ArticleDOI
TL;DR: It is important to address the risk factors for sarcopenia, particularly low physical activity and sedentary behavior in the general population, using a life‐long approach.
Abstract: The term sarcopenia was introduced in 1988. The original definition was a "muscle loss" of the appendicular muscle mass in older people as measured by dual energy x-ray absorptiometry (DXA). In 2010, the definition was altered to be low muscle mass together with low muscle function and this was agreed upon as reported in a number of consensus papers. The Society of Sarcopenia, Cachexia and Wasting Disorders supports the recommendations of more recent consensus conferences, i.e. that rapid screening, such as with the SARC-F questionnaire, should be utilized with a formal diagnosis being made by measuring grip strength or chair stand together with DXA estimation of appendicular muscle mass (indexed for height²). Assessments of the utility of ultrasound and creatine dilution techniques are ongoing. Use of ultrasound may not be easily reproducible. Primary sarcopenia is aging associated (mediated) loss of muscle mass. Secondary sarcopenia (or disease-related sarcopenia) has predominantly focused on loss of muscle mass without the emphasis on muscle function. Diseases that can cause muscle wasting (i.e. secondary sarcopenia) include malignant cancer, COPD, heart failure, and renal failure and others. Management of sarcopenia should consist of resistance exercise in combination with a protein intake of 1 to 1.5 g/kg/day. There is insufficient evidence that vitamin D and anabolic steroids are beneficial. These recommendations apply to both primary (age-related) sarcopenia and secondary (disease related) sarcopenia. Secondary sarcopenia also needs appropriate treatment of the underlying disease. It is important that primary care health professionals become aware of and make the diagnosis of age-related and disease-related sarcopenia. It is important to address the risk factors for sarcopenia, particularly low physical activity and sedentary behavior in the general population, using a life-long approach. There is a need for more clinical research into the appropriate measurement for muscle mass and the management of sarcopenia. Accordingly, this position statement provides recommendations on the management of sarcopenia and how to progress the knowledge and recognition of sarcopenia.
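A minimal sketch of the two quantities the recommendations rely on: DXA appendicular muscle mass indexed for height squared, and the 1–1.5 g/kg/day protein target. The example weight, height, and lean mass are hypothetical, and no diagnostic cut-offs are encoded because the abstract does not give them.

```python
# Illustrative calculations only; inputs are hypothetical and no cut-offs are applied.
def asm_index(appendicular_lean_kg: float, height_m: float) -> float:
    return appendicular_lean_kg / height_m ** 2      # kg/m^2

def protein_target_g_per_day(weight_kg: float) -> tuple:
    return (1.0 * weight_kg, 1.5 * weight_kg)        # 1-1.5 g/kg/day

print(f"ASM index: {asm_index(19.5, 1.70):.1f} kg/m^2")
print("daily protein target: %.0f-%.0f g" % protein_target_g_per_day(70.0))
```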

Journal ArticleDOI
TL;DR: This update of evidence-based guidelines (GL) aims to translate current evidence and expert opinion into recommendations for multidisciplinary teams responsible for the optimal nutritional and metabolic management of adult patients with liver disease.

Journal ArticleDOI
TL;DR: Clinical criteria for a large number of inborn errors of immunity (IEI), designed by expert panels with external review, are presented and applied to novel entries and to the verification of existing data sets from 2014, yielding a substantial refinement.