Showing papers by "University of Los Andes", published in 2019
••
Australian National University, University of Melbourne, Ghent University, Pontifical Catholic University of Chile, University of Pamplona, University of Puerto Rico, State University of Campinas, Florida International University, Spanish National Research Council, Imperial College London, University of Los Andes, Alexander von Humboldt Biological Resources Research Institute, Zoological Society of London, North-West University, Smithsonian Institution, Colorado State University, Universidad San Francisco de Quito, Museum für Naturkunde, Massey University, University of Maryland, College Park, University of Florida, University of the Republic, Cornell University, Georgia Institute of Technology, National Autonomous University of Mexico, University of Pittsburgh, Instituto Politécnico Nacional, Andrés Bello National University, University of Nevada, Reno, Zoo Miami, Natural History Museum
TL;DR: A global, quantitative assessment of the amphibian chytridiomycosis panzootic demonstrates its role in the decline of at least 501 amphibian species over the past half-century and represents the greatest recorded loss of biodiversity attributable to a disease.
Abstract: Anthropogenic trade and development have broken down dispersal barriers, facilitating the spread of diseases that threaten Earth's biodiversity. We present a global, quantitative assessment of the amphibian chytridiomycosis panzootic, one of the most impactful examples of disease spread, and demonstrate its role in the decline of at least 501 amphibian species over the past half-century, including 90 presumed extinctions. The effects of chytridiomycosis have been greatest in large-bodied, range-restricted anurans in wet climates in the Americas and Australia. Declines peaked in the 1980s, and only 12% of declined species show signs of recovery, whereas 39% are experiencing ongoing decline. There is risk of further chytridiomycosis outbreaks in new areas. The chytridiomycosis panzootic represents the greatest recorded loss of biodiversity attributable to a disease.
680 citations
••
A. Abada, Marcello Abbrescia, Shehu S. AbdusSalam, +1491 more • Institutions (239)
TL;DR: In this article, the authors present the second volume of the Future Circular Collider Conceptual Design Report, devoted to the electron-positron collider FCC-ee: after summarizing the physics discovery opportunities, it covers the accelerator design, performance reach, a staged operation scenario, the underlying technologies, civil engineering, technical infrastructure, and an implementation plan.
Abstract: In response to the 2013 Update of the European Strategy for Particle Physics, the Future Circular Collider (FCC) study was launched, as an international collaboration hosted by CERN. This study covers a highest-luminosity high-energy lepton collider (FCC-ee) and an energy-frontier hadron collider (FCC-hh), which could, successively, be installed in the same 100 km tunnel. The scientific capabilities of the integrated FCC programme would serve the worldwide community throughout the 21st century. The FCC study also investigates an LHC energy upgrade, using FCC-hh technology. This document constitutes the second volume of the FCC Conceptual Design Report, devoted to the electron-positron collider FCC-ee. After summarizing the physics discovery opportunities, it presents the accelerator design, performance reach, a staged operation scenario, the underlying technologies, civil engineering, technical infrastructure, and an implementation plan. FCC-ee can be built with today’s technology. Most of the FCC-ee infrastructure could be reused for FCC-hh. Combining concepts from past and present lepton colliders and adding a few novel elements, the FCC-ee design promises outstandingly high luminosity. This will make the FCC-ee a unique precision instrument to study the heaviest known particles (Z, W and H bosons and the top quark), offering great direct and indirect sensitivity to new physics.
526 citations
••
TL;DR: Combined measurements of the production and decay rates of the Higgs boson, as well as its couplings to vector bosons and fermions, are presented and constraints are placed on various two Higgs doublet models.
Abstract: Combined measurements of the production and decay rates of the Higgs boson, as well as its couplings to vector bosons and fermions, are presented. The analysis uses the LHC proton–proton collision data set recorded with the CMS detector in 2016 at $\sqrt{s}=13\,\text {Te}\text {V} $ , corresponding to an integrated luminosity of 35.9 ${\,\text {fb}^{-1}} $ . The combination is based on analyses targeting the five main Higgs boson production mechanisms (gluon fusion, vector boson fusion, and associated production with a $\mathrm {W}$ or $\mathrm {Z}$ boson, or a top quark-antiquark pair) and the following decay modes: $\mathrm {H} \rightarrow \gamma \gamma $ , $\mathrm {Z}\mathrm {Z}$ , $\mathrm {W}\mathrm {W}$ , $\mathrm {\tau }\mathrm {\tau }$ , $\mathrm {b} \mathrm {b} $ , and $\mathrm {\mu }\mathrm {\mu }$ . Searches for invisible Higgs boson decays are also considered. The best-fit ratio of the signal yield to the standard model expectation is measured to be $\mu =1.17\pm 0.10$ , assuming a Higgs boson mass of $125.09\,\text {Ge}\text {V} $ . Additional results are given for various assumptions on the scaling behavior of the production and decay modes, including generic parametrizations based on ratios of cross sections and branching fractions or couplings. The results are compatible with the standard model predictions in all parametrizations considered. In addition, constraints are placed on various two Higgs doublet models.
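A quick back-of-envelope reading of the headline number, using only the values quoted in the abstract (plain Python, not CMS software): the measured signal strength sits 1.7 standard deviations above the standard-model expectation of μ = 1, i.e. within the usual 2σ compatibility band.

```python
# Compatibility of the measured Higgs signal strength with the SM expectation.
# Numbers quoted from the abstract: mu = 1.17 +/- 0.10; the SM predicts mu = 1.
mu, sigma, mu_sm = 1.17, 0.10, 1.0

pull = (mu - mu_sm) / sigma  # deviation in units of the total uncertainty
print(f"pull = {pull:.1f} sigma")  # prints: pull = 1.7 sigma
```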
451 citations
••
A. Abada, Marcello Abbrescia, Shehu S. AbdusSalam, +1496 more • Institutions (238)
TL;DR: In this paper, the authors describe the detailed design and preparation of a construction project for a post-LHC circular energy frontier collider in collaboration with national institutes, laboratories and universities worldwide, and enhanced by a strong participation of industrial partners.
Abstract: Particle physics has arrived at an important moment of its history. The discovery of the Higgs boson, with a mass of 125 GeV, completes the matrix of particles and interactions that has constituted the “Standard Model” for several decades. This model is a consistent and predictive theory, which has so far proven successful at describing all phenomena accessible to collider experiments. However, several experimental facts require the extension of the Standard Model, and explanations are needed for observations such as the abundance of matter over antimatter, the striking evidence for dark matter and the non-zero neutrino masses. Theoretical issues such as the hierarchy problem and, more generally, the dynamical origin of the Higgs mechanism likewise point to the existence of physics beyond the Standard Model. This report contains the description of a novel research infrastructure based on a highest-energy hadron collider with a centre-of-mass collision energy of 100 TeV and an integrated luminosity at least a factor of 5 larger than the HL-LHC. It will extend the current energy frontier by almost an order of magnitude. The mass reach for direct discovery will extend to several tens of TeV and will allow, for example, the production of new particles whose existence could be indirectly exposed by precision measurements during the preceding e+e– collider phase. This collider will also precisely measure the Higgs self-coupling and thoroughly explore the dynamics of electroweak symmetry breaking at the TeV scale, to elucidate the nature of the electroweak phase transition. WIMPs as thermal dark matter candidates will be discovered, or ruled out. As a single project, this particle collider infrastructure will serve the world-wide physics community for about 25 years and, in combination with a lepton collider (see FCC conceptual design report volume 2), will provide a research tool until the end of the 21st century.
Collision energies beyond 100 TeV can be considered when using high-temperature superconductors. The European Strategy for Particle Physics (ESPP) update 2013 stated “To stay at the forefront of particle physics, Europe needs to be in a position to propose an ambitious post-LHC accelerator project at CERN by the time of the next Strategy update”. The FCC study has implemented the ESPP recommendation by developing a long-term vision for an “accelerator project in a global context”. This document describes the detailed design and preparation of a construction project for a post-LHC circular energy frontier collider “in collaboration with national institutes, laboratories and universities worldwide”, and enhanced by a strong participation of industrial partners. A coordinated preparation effort can now build on an ever-growing consortium of more than 135 institutes worldwide. The technology for constructing a high-energy circular hadron collider can be brought to the technology readiness level required for construction within the coming ten years through a focused R&D programme. In the baseline scenario, the FCC-hh concept comprises a power-saving, low-temperature superconducting magnet system based on an evolution of the Nb3Sn technology pioneered at the HL-LHC, an energy-efficient cryogenic refrigeration infrastructure based on a neon-helium (Nelium) light gas mixture, a high-reliability and low-loss cryogen distribution infrastructure based on Invar, high-power distributed beam transfer using superconducting elements, and local magnet energy recovery and re-use technologies that are already being gradually introduced at other CERN accelerators.
On a longer timescale, high-temperature superconductors can be developed together with industrial partners to achieve an even more energy-efficient particle collider or to reach even higher collision energies. The re-use of the LHC and its injector chain, which also serve a concurrently running physics programme, is an essential lever towards an overall sustainable research infrastructure at the energy frontier. Strategic R&D for FCC-hh aims at minimising construction cost and energy consumption, while maximising the socio-economic impact. It will mitigate technology-related risks and ensure that industry can benefit from an acceptable utility. Concerning the implementation, a preparatory phase of about eight years is both necessary and adequate to establish the project governance and organisation structures, to build the international machine and experiment consortia, to develop a territorial implantation plan in agreement with the host-states’ requirements, to optimise the disposal of land and underground volumes, and to prepare the civil engineering project. Such a large-scale, international fundamental research infrastructure, tightly involving industrial partners and providing training at all education levels, will be a strong motor of economic and societal development in all participating nations. The FCC study has implemented a set of actions towards a coherent vision for the world-wide high-energy and particle physics community, providing a collaborative framework for topically complementary and geographically well-balanced contributions. This conceptual design report lays the foundation for a subsequent infrastructure preparatory and technical design phase.
425 citations
••
A. Abada, Marcello Abbrescia, Shehu S. AbdusSalam, +1501 more • Institutions (239)
TL;DR: In this article, the physics opportunities of the Future Circular Collider (FCC) are reviewed, covering its e+e-, pp, ep and heavy-ion programmes, and the measurement capabilities of each FCC component, addressing the study of electroweak, Higgs and strong interactions.
Abstract: We review the physics opportunities of the Future Circular Collider, covering its e+e-, pp, ep and heavy ion programmes. We describe the measurement capabilities of each FCC component, addressing the study of electroweak, Higgs and strong interactions, the top quark and flavour, as well as phenomena beyond the Standard Model. We highlight the synergy and complementarity of the different colliders, which will contribute to a uniquely coherent and ambitious research programme, providing an unmatchable combination of precision and sensitivity to new physics.
407 citations
••
Virginia Commonwealth University, University College London, Harvard University, Stanford University, University of Queensland, Broad Institute, King's College London, University of Glasgow, Brown University, SUNY Downstate Medical Center, Arizona State University, Heidelberg University, Icahn School of Medicine at Mount Sinai, University of North Carolina at Chapel Hill, University of Los Andes, Universidad Autónoma de Nuevo León, Indiana University
TL;DR: To facilitate and promote research in multi-ancestry and admixed cohorts, key methodological considerations are outlined and opportunities, challenges, solutions, and areas in need of development are highlighted.
350 citations
••
TL;DR: In this article, a search for invisible decays of a Higgs boson via vector boson fusion is performed using proton-proton collision data collected with the CMS detector at the LHC in 2016 at a center-of-mass energy √s = 13 TeV, corresponding to an integrated luminosity of 35.9 fb-1.
347 citations
••
Joint Genome Institute, University of Liverpool, Radboud University Nijmegen, University of Guelph, Catholic University of Leuven, Arizona State University, University of Cape Town, European Bioinformatics Institute, Cairo University, Vanderbilt University, University of South Florida, Colorado State University, University of Michigan, University of California, Davis, University of Auvergne, University of Southern California, University of Queensland, University of Arizona, Texas A&M University, National Institute of Genetics, University of Alicante, Kyoto University, Université Paris-Saclay, University of Chicago, University of Los Andes, Universidad Miguel Hernández de Elche, University of Maryland, Baltimore, University of Hawaii at Manoa, Ohio State University, École Polytechnique Fédérale de Lausanne, University of British Columbia, University of Exeter, Oregon State University, Australian Institute of Marine Science, University of California, Irvine, University of Tennessee, University of Delaware, Max Planck Society, Montana State University, University of California, San Diego, J. Craig Venter Institute
TL;DR: The MIUViG (Minimum Information about an Uncultivated Virus Genome) standard presented in this paper was developed within the Genomic Standards Consortium framework and includes virus origin, genome quality, genome annotation, taxonomic classification, biogeographic distribution and in silico host prediction.
Abstract: We present an extension of the Minimum Information about any (x) Sequence (MIxS) standard for reporting sequences of uncultivated virus genomes. Minimum Information about an Uncultivated Virus Genome (MIUViG) standards were developed within the Genomic Standards Consortium framework and include virus origin, genome quality, genome annotation, taxonomic classification, biogeographic distribution and in silico host prediction. Community-wide adoption of MIUViG standards, which complement the Minimum Information about a Single Amplified Genome (MISAG) and Metagenome-Assembled Genome (MIMAG) standards for uncultivated bacteria and archaea, will improve the reporting of uncultivated virus genomes in public databases. In turn, this should enable more robust comparative studies and a systematic exploration of the global virosphere.
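The reporting categories listed in the abstract can be pictured as a minimal metadata record. The sketch below is illustrative only: the keys paraphrase the categories named above, the values are invented, and this is not the official MIUViG checklist schema.

```python
# Hypothetical MIUViG-style record. Keys mirror the field categories named in
# the abstract; values are invented placeholders, not real data.
miuvig_record = {
    "virus_origin": "metagenome-assembled (seawater sample)",
    "genome_quality": "high-quality draft",
    "genome_annotation": "42 predicted ORFs",
    "taxonomic_classification": "Caudovirales (unclassified family)",
    "biogeographic_distribution": "detected in 12 ocean metagenomes",
    "in_silico_host_prediction": "Pelagibacter sp. (CRISPR spacer match)",
}

# A submission checker might simply verify that required fields are present.
required = {"virus_origin", "genome_quality", "in_silico_host_prediction"}
assert required <= set(miuvig_record)
print(f"{len(miuvig_record)} MIUViG fields recorded")
```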
318 citations
••
Dalhousie University, McMaster University, University of Rochester Medical Center, University of Washington, McGill University, University of Florida, University of Pennsylvania, Wayne State University, Beaumont Hospital, University of Helsinki, University of Pittsburgh, University of Modena and Reggio Emilia, Pontifical Catholic University of Chile, University of Los Andes, University of Minnesota
TL;DR: These evidence-based guidelines from the American Society of Hematology intend to support decision making about preventing VTE in patients undergoing surgery by making conditional recommendations for mechanical prophylaxis over no prophylaxis and against inferior vena cava filters.
271 citations
••
University of Leeds, University of Edinburgh, University College London, University of Exeter, Imperial College London, National University of Saint Anthony the Abbot in Cuzco, Universidad Autónoma Gabriel René Moreno, National Institute of Amazonian Research, Universidade do Estado de Mato Grosso, Universidade Federal do Acre, University of Los Andes, University of Washington, Environmental Change Institute, Centre national de la recherche scientifique, Museu Paraense Emílio Goeldi, Lancaster University, University of Lorraine, Universidad Nacional de la Amazonía Peruana, Smithsonian Institution, University of Montpellier, James Cook University, Wageningen University and Research Centre, Agro ParisTech, Naturalis, University of Amsterdam, Federal University of Western Pará, State University of Campinas, National Institute for Space Research, Florida International University, University of São Paulo, Tropenbos International, Amazon.com, Federal University of Pará, Michigan Technological University, University of Texas at Austin, Venezuelan Institute for Scientific Research, Polytechnic University of Valencia, Royal Museum for Central Africa, Tecnológico de Antioquia, George Mason University, Universidad del Tolima, National University of Colombia, Paul Sabatier University, Georgetown University, University of La Serena, Forestry Commission, Federal University of Alagoas, Duke University, Van Hall Larenstein University of Applied Sciences, University of Nottingham
TL;DR: A slow shift to a more dry‐affiliated Amazonia is underway, with changes in compositional dynamics consistent with climate‐change drivers, but yet to significantly impact whole‐community composition.
Abstract: Most of the planet's diversity is concentrated in the tropics, which includes many regions undergoing rapid climate change. Yet, while climate‐induced biodiversity changes are widely documented elsewhere, few studies have addressed this issue for lowland tropical ecosystems. Here we investigate whether the floristic and functional composition of intact lowland Amazonian forests have been changing by evaluating records from 106 long‐term inventory plots spanning 30 years. We analyse three traits that have been hypothesized to respond to different environmental drivers (increase in moisture stress and atmospheric CO2 concentrations): maximum tree size, biogeographic water‐deficit affiliation and wood density. Tree communities have become increasingly dominated by large‐statured taxa, but to date there has been no detectable change in mean wood density or water deficit affiliation at the community level, despite most forest plots having experienced an intensification of the dry season. However, among newly recruited trees, dry‐affiliated genera have become more abundant, while the mortality of wet‐affiliated genera has increased in those plots where the dry season has intensified most. Thus, a slow shift to a more dry‐affiliated Amazonia is underway, with changes in compositional dynamics (recruits and mortality) consistent with climate‐change drivers, but yet to significantly impact whole‐community composition. The Amazon observational record suggests that the increase in atmospheric CO2 is driving a shift within tree communities to large‐statured species and that climate changes to date will impact forest composition, but long generation times of tropical trees mean that biodiversity change is lagging behind climate change.
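The community-level trait means the study tracks (e.g. mean wood density) are conventionally computed as abundance-weighted averages over the taxa in a plot. A minimal sketch with invented genera and numbers, not data from the paper:

```python
# Abundance-weighted community mean of a trait (e.g. wood density, g/cm^3).
# Genera, stem counts, and trait values are invented for illustration.
stems = {"GenusA": 30, "GenusB": 50, "GenusC": 20}            # stems in a plot
wood_density = {"GenusA": 0.55, "GenusB": 0.70, "GenusC": 0.40}

total = sum(stems.values())
cwm = sum(stems[g] * wood_density[g] for g in stems) / total

print(f"community-weighted mean wood density: {cwm:.3f}")  # prints: 0.595
```

Tracking this quantity across repeated censuses of the same plot is what lets a slow compositional shift be separated from year-to-year noise.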
263 citations
••
TL;DR: A systematic review, organized around the steps required to achieve traffic classification with ML techniques, identifies the procedures followed by existing works and outlines future directions for ML-based traffic classification.
Abstract: Traffic analysis is a collection of strategies intended to find relationships, patterns, anomalies, and misconfigurations, among other things, in Internet traffic. In particular, traffic classification is a subgroup of strategies in this field that aims at identifying the application’s name or type of Internet traffic. Nowadays, traffic classification has become a challenging task due to the rise of new technologies, such as traffic encryption and encapsulation, which decrease the performance of classical traffic classification strategies. Machine learning (ML) gains interest as a new direction in this field, showing signs of future success, such as knowledge extraction from encrypted traffic and more accurate Quality of Service management. ML is fast becoming a key tool to build traffic classification solutions in real network traffic scenarios; in this sense, the purpose of this investigation is to explore the elements that allow this technique to work in the traffic classification field. Therefore, a systematic review is introduced, based on the steps to achieve traffic classification by using ML techniques. The main aim is to understand and identify the procedures followed by the existing works to achieve their goals. As a result, this survey paper finds a set of trends derived from the analysis performed on this domain; in this manner, the authors expect to outline future directions for ML-based traffic classification.
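To make the surveyed pipeline concrete, here is a minimal sketch of flow-based traffic classification: per-flow statistical features (which remain usable when payloads are encrypted) feed a supervised model that predicts the application type. The features, numbers, and the hand-rolled nearest-centroid classifier are illustrative stand-ins for whatever features and ML technique a given work actually uses.

```python
# Minimal ML traffic-classification sketch: per-flow statistical features
# -> supervised model -> predicted application type. All values are invented.
import math

# Invented per-flow features: (mean packet size in bytes, packets per second)
train = {
    "video": [(1200, 80.0), (1150, 70.0)],
    "dns":   [(90, 25.0), (110, 30.0)],
    "bulk":  [(600, 10.0), (650, 12.0)],
}

# "Training": average each class's feature vectors into a centroid.
centroids = {
    label: tuple(sum(col) / len(col) for col in zip(*flows))
    for label, flows in train.items()
}

def classify(flow):
    """Predict the application type of a flow via its nearest centroid."""
    return min(centroids, key=lambda lbl: math.dist(flow, centroids[lbl]))

print(classify((1100, 75.0)))  # prints: video
```

In practice the feature set is far richer (inter-arrival times, flow duration, byte counts per direction) and the model is trained on labeled traces, but the structure of the pipeline is the same.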
••
Northwestern University, University of California, San Diego, University of Illinois at Urbana–Champaign, Colorado State University, University of Colorado Boulder, City University of New York, Dartmouth College, University of Texas at Austin, University of Wisconsin-Madison, University of Minnesota, National Scientific and Technical Research Council, University of Los Andes, University of Arizona, J. Craig Venter Institute
TL;DR: The findings indicate that mammalian gut microbiome plasticity in response to dietary shifts over both the lifespan of an individual host and the evolutionary history of a given host species is constrained by host physiological evolution, and the gut microbiome cannot be considered separately from host physiology when describing host nutritional strategies and the emergence of host dietary niches.
Abstract: Over the past decade several studies have reported that the gut microbiomes of mammals with similar dietary niches exhibit similar compositional and functional traits. However, these studies rely heavily on samples from captive individuals and often confound host phylogeny, gut morphology, and diet. To more explicitly test the influence of host dietary niche on the mammalian gut microbiome we use 16S rRNA gene amplicon sequencing and shotgun metagenomics to compare the gut microbiota of 18 species of wild non-human primates classified as either folivores or closely related non-folivores, evenly distributed throughout the primate order and representing a range of gut morphological specializations. While folivory results in some convergent microbial traits, collectively we show that the influence of host phylogeny on both gut microbial composition and function is much stronger than that of host dietary niche. This pattern does not result from differences in host geographic location or actual dietary intake at the time of sampling, but instead appears to result from differences in host physiology. These findings indicate that mammalian gut microbiome plasticity in response to dietary shifts over both the lifespan of an individual host and the evolutionary history of a given host species is constrained by host physiological evolution. Therefore, the gut microbiome cannot be considered separately from host physiology when describing host nutritional strategies and the emergence of host dietary niches.
••
TL;DR: Assessment of the safety and efficacy of the intra‐articular injection of single or repeated umbilical cord‐derived (UC) MSCs in knee OA found that repeated UC‐MSC treatment is safe and superior to the active comparator at 1‐year follow‐up.
Abstract: Knee osteoarthritis (OA) is a leading cause of pain and disability. Although conventional treatments show modest benefits, pilot and phase I/II trials with bone marrow (BM) and adipose-derived (AD) mesenchymal stromal cells (MSCs) point to the feasibility, safety, and occurrence of clinical and structural improvement in focal or diffuse disease. This study aimed to assess the safety and efficacy of the intra-articular injection of single or repeated umbilical cord-derived (UC) MSCs in knee OA. UC-MSCs were cultured in an International Organization for Standardization 9001:2015 certified Good Manufacturing Practice-type laboratory. Patients with symptomatic knee OA were randomized to receive hyaluronic acid at baseline and 6 months (HA, n = 8), a single UC-MSC dose (20 × 10⁶) at baseline (MSC-1, n = 9), or repeated UC-MSC doses at baseline and 6 months (20 × 10⁶ × 2; MSC-2, n = 9). Clinical scores and magnetic resonance images (MRIs) were assessed throughout the 12-month follow-up. No severe adverse events were reported. Only MSC-treated patients experienced significant pain and function improvements from baseline (p = .001). At 12 months, the Western Ontario and McMaster Universities Arthritis Index (WOMAC-A; pain subscale) reached significantly lower levels of pain in the MSC-2-treated group (1.1 ± 1.3) as compared with the HA group (4.3 ± 3.5; p = .04). The pain Visual Analog scale was significantly lower in the MSC-2 group versus the HA group (2.4 ± 2.1 vs. 22.1 ± 9.8, p = .03) at 12 months. For total WOMAC, MSC-2 had lower scores than HA at 12 months (4.2 ± 3.9 vs. 15.2 ± 11, p = .05). No differences in MRI scores were detected. In a phase I/II trial (NCT02580695), repeated UC-MSC treatment is safe and superior to the active comparator in knee OA at 1-year follow-up. Stem Cells Translational Medicine 2019;8:215–224.
••
TL;DR: In this article, the performance of missing transverse momentum (p_T^miss) reconstruction algorithms for the CMS experiment is presented, using proton-proton collisions at a center-of-mass energy of 13 TeV, collected at the CERN LHC in 2016.
Abstract: The performance of missing transverse momentum (p_T^miss) reconstruction algorithms for the CMS experiment is presented, using proton-proton collisions at a center-of-mass energy of 13 TeV, collected at the CERN LHC in 2016. The data sample corresponds to an integrated luminosity of 35.9 fb-1. The results include measurements of the scale and resolution of p_T^miss, and detailed studies of events identified with anomalous p_T^miss. The performance is presented of a p_T^miss reconstruction algorithm that mitigates the effects of multiple proton-proton interactions, using the "pileup per particle identification" method. The performance is shown of an algorithm used to estimate the compatibility of the reconstructed p_T^miss with the hypothesis that it originates from resolution effects.
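At its core, p_T^miss is the magnitude of the negative vector sum of the transverse momenta of all reconstructed particles in the event. A toy computation with an invented particle list (not CMS software, and ignoring the calibration and pileup-mitigation steps the paper studies):

```python
# Toy computation of missing transverse momentum (p_T^miss): the negative
# vector sum of the (px, py) of all reconstructed particles. Values invented.
import math

particles = [  # (px, py) in GeV
    (40.0, 10.0),
    (-25.0, 5.0),
    (-5.0, -30.0),
]

mex = -sum(px for px, _ in particles)   # x component of the missing momentum
mey = -sum(py for _, py in particles)   # y component
pt_miss = math.hypot(mex, mey)          # magnitude

print(f"pTmiss = {pt_miss:.1f} GeV")  # prints: pTmiss = 18.0 GeV
```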
••
Transylvania University, Charité, Swiss Institute of Allergy and Asthma Research, Wrocław Medical University, National Institutes of Health, Istanbul Medeniyet University, Saga Group, Hacettepe University, University of Cologne, Complutense University of Madrid, University of Manchester, National and Kapodistrian University of Athens, University of Southampton, Hospital Clínico San Carlos, University of Messina, University of Marburg, St Mary's Hospital, University Hospital Southampton NHS Foundation Trust, University of Edinburgh, Medical University of Graz, University of Amsterdam, Erasmus University Rotterdam, University of Los Andes
TL;DR: The European Academy of Allergy and Clinical Immunology has developed a clinical practice guideline providing evidence‐based recommendations for the use of house dust mite (HDM) AIT as add‐on treatment for HDM‐driven allergic asthma.
Abstract: Allergen immunotherapy (AIT) has been in use for the treatment of allergic disease for more than 100 years. Asthma treatment relies mainly on corticosteroids and other controllers recommended to achieve and maintain asthma control, prevent exacerbations, and improve quality of life. AIT is underused in asthma, both in children and in adults. Notably, patients with allergic asthma not adequately controlled on pharmacotherapy (including biologics) represent an unmet health need. The European Academy of Allergy and Clinical Immunology has developed a clinical practice guideline providing evidence-based recommendations for the use of house dust mite (HDM) AIT as add-on treatment for HDM-driven allergic asthma. This guideline was developed by a multi-disciplinary working group using the Grading of Recommendations Assessment, Development and Evaluation (GRADE) approach. HDM AIT was evaluated separately by route of administration and for children and adults: subcutaneous (SCIT) and sublingual AIT (SLIT), drops, and tablets. Recommendations were formulated for each. The important prerequisites for successful treatment with HDM AIT are (a) selection of patients most likely to respond to AIT and (b) use of allergen extracts and desensitization protocols of proven efficacy. To date, only AIT with the HDM SLIT-tablet has demonstrated a robust effect in adults for critical end points (exacerbations, asthma control, and safety). Thus, it is recommended as an add-on to regular asthma therapy for adults with controlled or partially controlled HDM-driven allergic asthma (conditional recommendation, moderate-quality evidence). HDM SCIT is recommended for adults and children, and SLIT drops are recommended for children with controlled HDM-driven allergic asthma as the add-on to regular asthma therapy to decrease symptoms and medication needs (conditional recommendation, low-quality evidence).
••
TL;DR: A search for Higgs boson pair production using the combined results from four final states: bbγγ, bbττ, bbbb, and bbVV, where V represents a W or Z boson, is performed using data collected in 2016 by the CMS experiment from LHC proton-proton collisions.
Abstract: This Letter describes a search for Higgs boson pair production using the combined results from four final states: bbγγ, bbττ, bbbb, and bbVV, where V represents a W or Z boson. The search is performed using data collected in 2016 by the CMS experiment from LHC proton-proton collisions at √s = 13 TeV, corresponding to an integrated luminosity of 35.9 fb-1. Limits are set on the Higgs boson pair production cross section. A 95% confidence level observed (expected) upper limit on the nonresonant production cross section is set at 22.2 (12.8) times the standard model value. A search for narrow resonances decaying to Higgs boson pairs is also performed in the mass range 250–3000 GeV. No evidence for a signal is observed, and upper limits are set on the resonance production cross section.
••
San Diego State University, Charles University in Prague, Cairo University, Monash University, University of Notre Dame, Delft University of Technology, University of Liverpool, Utrecht University, Kavli Institute of Nanoscience, Aix-Marseille University, New York University, University of California, Davis, University of Otago, University of Groningen, Katholieke Universiteit Leuven, University of Los Andes, Norwegian Institute of Public Health, Universidad Mayor, Saint Petersburg State University of Information Technologies, Mechanics and Optics, National Autonomous University of Mexico, Technical University of Denmark, University of Jordan, University of Alicante, University of Illinois at Urbana–Champaign, Autonomous University of Barcelona, Pontifical Catholic University of Chile, University of Warsaw, Dartmouth College, University of Khartoum, University of Chicago, University of Barcelona, Scripps Research Institute, University College Cork, University of Tampere, Northern Illinois University, University of Sydney, Columbus Zoo and Aquarium, McGill University, University of Western Australia, University of Colorado Boulder, Virginia Tech, Tel Aviv University, Instituto Superior Técnico, Makerere University, University of California, San Diego, Hawaii Pacific University, Stockholm University, University of California, Irvine, University of Buenos Aires, Fudan University, University of Padua, University of Pittsburgh, Ebonyi State University, Andrés Bello National University, University of Copenhagen, Radboud University Nijmegen
TL;DR: It is concluded that crAssphage is a benign cosmopolitan virus that may have coevolved with the human lineage and is an integral part of the normal human gut virome.
Abstract: Microbiomes are vast communities of microorganisms and viruses that populate all natural ecosystems. Viruses have been considered to be the most variable component of microbiomes, as supported by virome surveys and examples of high genomic mosaicism. However, recent evidence suggests that the human gut virome is remarkably stable compared with that of other environments. Here, we investigate the origin, evolution and epidemiology of crAssphage, a widespread human gut virus. Through a global collaboration, we obtained DNA sequences of crAssphage from more than one-third of the world's countries and showed that the phylogeography of crAssphage is locally clustered within countries, cities and individuals. We also found fully colinear crAssphage-like genomes in both Old-World and New-World primates, suggesting that the association of crAssphage with primates may be millions of years old. Finally, by exploiting a large cohort of more than 1,000 individuals, we tested whether crAssphage is associated with bacterial taxonomic groups of the gut microbiome, diverse human health parameters and a wide range of dietary factors. We identified strong correlations with different clades of bacteria that are related to Bacteroidetes and weak associations with several diet categories, but no significant association with health or disease. We conclude that crAssphage is a benign cosmopolitan virus that may have coevolved with the human lineage and is an integral part of the normal human gut virome.
••
A. Abada1, Marcello Abbrescia2, Marcello Abbrescia3, Shehu S. AbdusSalam4 +1496 more•Institutions (238)
TL;DR: The third volume of the FCC Conceptual Design Report as discussed by the authors is devoted to the hadron collider FCC-hh, and summarizes the physics discovery opportunities, presents the FCC-hh accelerator design, performance reach, and staged operation plan, discusses the underlying technologies, the civil engineering and technical infrastructure, and also sketches a possible implementation.
Abstract: In response to the 2013 Update of the European Strategy for Particle Physics (EPPSU), the Future Circular Collider (FCC) study was launched as a world-wide international collaboration hosted by CERN. The FCC study covered an energy-frontier hadron collider (FCC-hh), a highest-luminosity high-energy lepton collider (FCC-ee), the corresponding 100 km tunnel infrastructure, as well as the physics opportunities of these two colliders, and a high-energy LHC, based on FCC-hh technology. This document constitutes the third volume of the FCC Conceptual Design Report, devoted to the hadron collider FCC-hh. It summarizes the FCC-hh physics discovery opportunities, presents the FCC-hh accelerator design, performance reach, and staged operation plan, discusses the underlying technologies, the civil engineering and technical infrastructure, and also sketches a possible implementation. Combining ingredients from the Large Hadron Collider (LHC), the high-luminosity LHC upgrade and adding novel technologies and approaches, the FCC-hh design aims at significantly extending the energy frontier to 100 TeV. Its unprecedented centre-of-mass collision energy will make the FCC-hh a unique instrument to explore physics beyond the Standard Model, offering great direct sensitivity to new physics and discoveries.
••
Helge Bruelheide1, Jürgen Dengler2, Jürgen Dengler3, Borja Jiménez-Alfaro4 +181 more•Institutions (100)
TL;DR: The sPlot database as mentioned in this paper contains 1,121,244 vegetation plots, which comprise 23,586,216 records of plant species and their relative cover or abundance in plots collected worldwide between 1885 and 2015.
Abstract: Aims: Vegetation-plot records provide information on the presence and cover or abundance of plants co-occurring in the same community. Vegetation-plot data are spread across research groups, environmental agencies and biodiversity research centers and, thus, are rarely accessible at continental or global scales. Here we present the sPlot database, which collates vegetation plots worldwide to allow for the exploration of global patterns in taxonomic, functional and phylogenetic diversity at the plant community level. Results: sPlot version 2.1 contains records from 1,121,244 vegetation plots, which comprise 23,586,216 records of plant species and their relative cover or abundance in plots collected worldwide between 1885 and 2015. We complemented the information for each plot by retrieving climate and soil conditions and the biogeographic context (e.g., biomes) from external sources, and by calculating community-weighted means and variances of traits using gap-filled data from the global plant trait database TRY. Moreover, we created a phylogenetic tree for 50,167 out of the 54,519 species identified in the plots. We present the first maps of global patterns of community richness and community-weighted means of key traits. Conclusions: The availability of vegetation plot data in sPlot offers new avenues for vegetation analysis at the global scale.
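The community-weighted means and variances mentioned in the abstract are standard abundance-weighted summaries of species trait values per plot; a minimal sketch of the calculation (variable names and the toy numbers are illustrative, not taken from the sPlot codebase):

```python
def community_weighted_mean(abundances, trait_values):
    """Abundance-weighted mean of a trait across the species in one plot."""
    total = sum(abundances)
    return sum(a * t for a, t in zip(abundances, trait_values)) / total

def community_weighted_variance(abundances, trait_values):
    """Abundance-weighted variance of a trait around the community-weighted mean."""
    cwm = community_weighted_mean(abundances, trait_values)
    total = sum(abundances)
    return sum(a * (t - cwm) ** 2 for a, t in zip(abundances, trait_values)) / total

# Toy plot: three species with relative cover 50%, 30%, 20% and a
# hypothetical trait value (e.g. gap-filled from TRY) per species.
cover = [0.5, 0.3, 0.2]
trait = [10.0, 20.0, 30.0]
cwm = community_weighted_mean(cover, trait)  # 0.5*10 + 0.3*20 + 0.2*30 = 17.0
cwv = community_weighted_variance(cover, trait)
```

Because the weights are normalized inside the functions, the cover values need not sum to one, which matches how plot data report percentage cover.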
••
TL;DR: It is demonstrated that Mott nanodevices retain a memory of previous resistive switching events long after the insulating resistance has recovered, and it is found that the intrinsic metastability of first-order phase transitions is the origin of this phenomenon.
Abstract: Resistive switching, a phenomenon in which the resistance of a device can be modified by applying an electric field, is at the core of emerging technologies such as neuromorphic computing and resistive memories. Among the different types of resistive switching, threshold firing is one of the most promising, as it may enable the implementation of artificial spiking neurons. Threshold firing is observed in Mott insulators featuring an insulator-to-metal transition, which can be triggered by applying an external voltage: the material becomes conducting (‘fires’) if a threshold voltage is exceeded. The dynamics of this induced transition have been thoroughly studied, and its underlying mechanism and characteristic time are well documented. By contrast, there is little knowledge regarding the opposite transition: the process by which the system returns to the insulating state after the voltage is removed. Here we show that Mott nanodevices retain a memory of previous resistive switching events long after the insulating resistance has recovered. We demonstrate that, although the device returns to its insulating state within 50 to 150 nanoseconds, it is possible to re-trigger the insulator-to-metal transition by using subthreshold voltages for a much longer time (up to several milliseconds). We find that the intrinsic metastability of first-order phase transitions is the origin of this phenomenon, and so it is potentially present in all Mott systems. This effect constitutes a new type of volatile memory in Mott-based devices, with potential applications in resistive memories, solid-state frequency discriminators and neuromorphic circuits. Mott materials feature scale-less relaxation dynamics after the insulator-to-metal transition that make its electric triggering dependent on recent switching events.
••
TL;DR: Observed patterns in the relationship between microbiome and virome diversity in 21 adult monozygotic twin pairs selected for high or low microbiome concordance support a strong role of the microbiome in patterning the virome.
••
TL;DR: In this paper, a search for supersymmetric particles in the final state with multiple jets and large missing transverse momentum was performed using a sample of proton-proton collisions collected with the CMS detector.
Abstract: Results are reported from a search for supersymmetric particles in the final state with multiple jets and large missing transverse momentum. The search uses a sample of proton-proton collisions at $ \sqrt{s} $ = 13 TeV collected with the CMS detector in 2016–2018, corresponding to an integrated luminosity of 137 fb$^{−1}$, representing essentially the full LHC Run 2 data sample. The analysis is performed in a four-dimensional search region defined in terms of the number of jets, the number of tagged bottom quark jets, the scalar sum of jet transverse momenta, and the magnitude of the vector sum of jet transverse momenta. No significant excess in the event yield is observed relative to the expected background contributions from standard model processes. Limits on the pair production of gluinos and squarks are obtained in the framework of simplified models for supersymmetric particle production and decay processes. Assuming the lightest supersymmetric particle to be a neutralino, lower limits on the gluino mass as large as 2000 to 2310 GeV are obtained at 95% confidence level, while lower limits on the squark mass as large as 1190 to 1630 GeV are obtained, depending on the production scenario.
••
Albert M. Sirunyan1, Robin Erbacher2, C. A. Carrillo Montoya3, Wagner Carvalho4 +2322 more•Institutions (159)
TL;DR: In this article, a measurement of the light-by-light scattering process in ultra-peripheral PbPb collisions at a centre-of-mass energy per nucleon pair of 5.02 TeV is reported.
••
TL;DR: The authors study how the proximate and distal sociocultural environments affect the well-established relationship between entrepreneurial self-efficacy and entrepreneurial intentions, focusing on the insti...
Abstract: We study how the proximate and distal sociocultural environments affect the well-established relationship between entrepreneurial self-efficacy and entrepreneurial intentions. We focus on the insti...
••
01 Sep 2019
TL;DR: An approach to train recurrent neural networks with Long Short-Term Memory (LSTM) architecture in order to predict sequences of next events, their timestamps, and their associated resource pools is proposed.
Abstract: Deep learning techniques have recently found applications in the field of predictive business process monitoring. These techniques allow us to predict, among other things, what the next events in a case will be, when they will occur, and which resources will trigger them. They also allow us to generate entire execution traces of a business process, or even entire event logs, which opens up the possibility of using such models for process simulation. This paper addresses the question of how to use deep learning techniques to train accurate models of business process behavior from event logs. The paper proposes an approach to train recurrent neural networks with a Long Short-Term Memory (LSTM) architecture in order to predict sequences of next events, their timestamps, and their associated resource pools. An experimental evaluation on real-life event logs shows that the proposed approach outperforms previously proposed LSTM architectures targeted at this problem.
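A next-event model of this kind is trained on trace prefixes paired with the event that follows them. As a minimal illustration of that preprocessing step, each trace in the event log can be expanded into (prefix, next-activity) pairs (the activity names below are invented, and the LSTM itself is omitted):

```python
def make_training_pairs(trace, start="<s>"):
    """Expand one trace (an ordered list of activity labels) into
    (prefix, next_activity) pairs for next-event prediction."""
    padded = [start] + trace  # artificial start symbol so the first event is predictable
    return [(padded[:i], padded[i]) for i in range(1, len(padded))]

# Toy event log with two traces (activity names are invented for illustration).
log = [["register", "check", "approve"], ["register", "reject"]]
pairs = [p for trace in log for p in make_training_pairs(trace)]
# First pair: (["<s>"], "register") -- from the empty prefix, predict "register".
```

In the full approach the prefixes would additionally carry timestamps and resource labels, and each prefix would be encoded (e.g. one-hot or embedded) before being fed to the LSTM.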
••
TL;DR: A Grading of Recommendations, Assessment, Development and Evaluation (GRADE) summary of findings (SoF) table format is developed that displays the critical information from a network meta-analysis (NMA) and facilitates understanding of NMA findings and health decision-making.
••
University of London1, Spanish National Research Council2, Umeå University3, Yale University4, National Institutes of Health5, University of São Paulo6, University of Los Andes7, Ho Chi Minh City Medicine and Pharmacy University8, Duy Tan University9, National Taiwan University10, Monash University11, Nagasaki University12, University of Tsukuba13, University of Valencia14, Oulu University Hospital15, University of Oulu16, Fudan University17, Seoul National University18, Health Canada19, University of Ottawa20, Swiss Tropical and Public Health Institute21, University of Basel22, Harvard University23, Kyoto University24, Anhui Medical University25, Shanghai Jiao Tong University26, Queensland University of Technology27
TL;DR: Several city indicators modify the effect of heat, with a higher mortality impact associated with increases in population density, fine particles, gross domestic product (GDP) and Gini index (a measure of income inequality), whereas higher levels of green spaces were linked with a decreased effect of heat.
Abstract: BACKGROUND: The health burden associated with temperature is expected to increase due to a warming climate. Populations living in cities are likely to be particularly at risk, but the role of urban ...
••
01 Mar 2019
TL;DR: The proposed oversampling strategy showed superior results on average when compared with SMOTE and other variants, demonstrating the importance of selecting the right attributes when defining the neighborhood in SMOTE-based oversampling methods.
Abstract: In this work, the Synthetic Minority Over-sampling Technique (SMOTE) approach is adapted for high-dimensional binary settings. A novel distance metric is proposed for the computation of the neighborhood for each minority sample, which takes into account only a subset of the available attributes that are relevant for the task. Three variants for the distance metric are explored: Euclidean, Manhattan, and Chebyshev distances, and four different ranking strategies: Fisher Score, Mutual Information, Eigenvector Centrality, and Correlation Score. Our proposal was compared with various oversampling techniques on low- and high-dimensional datasets with the presence of class-imbalance, including a case study on Natural Language Processing (NLP). The proposed oversampling strategy showed superior results on average when compared with SMOTE and other variants, demonstrating the importance of selecting the right attributes when defining the neighborhood in SMOTE-based oversampling methods.
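The core idea, computing SMOTE neighborhoods on a relevant attribute subset while interpolating over all attributes, can be sketched as follows (a correlation-score ranking and Manhattan distance are chosen for brevity; names and details are illustrative, not the paper's implementation):

```python
import numpy as np

def subset_smote(X, y, minority=1, n_features=2, k=2, n_synthetic=3, seed=0):
    """SMOTE variant: neighborhoods use a Manhattan distance restricted to the
    n_features attributes most correlated with the class label, while synthetic
    samples interpolate over all attributes."""
    rng = np.random.default_rng(seed)
    # Correlation-score ranking of attributes against the class label.
    scores = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(X.shape[1])])
    top = np.argsort(scores)[::-1][:n_features]
    X_min = X[y == minority]
    synthetic = []
    for _ in range(n_synthetic):
        i = rng.integers(len(X_min))
        # Manhattan distance computed on the selected attribute subset only.
        d = np.abs(X_min[:, top] - X_min[i, top]).sum(axis=1)
        d[i] = np.inf  # exclude the sample itself
        j = rng.choice(np.argsort(d)[:k])
        gap = rng.random()  # interpolate between the sample and a neighbor
        synthetic.append(X_min[i] + gap * (X_min[j] - X_min[i]))
    return np.array(synthetic)

# Toy imbalanced set: four majority (y=0) and three minority (y=1) samples.
X = np.array([[0.0, 0.1, 9.0],
              [0.2, 0.0, 7.0],
              [0.1, 0.3, 8.0],
              [0.3, 0.2, 6.0],
              [5.0, 5.2, 1.0],
              [5.1, 4.9, 2.0],
              [4.8, 5.1, 3.0]])
y = np.array([0, 0, 0, 0, 1, 1, 1])
new_samples = subset_smote(X, y)  # three synthetic minority samples
```

Since each synthetic sample is a convex combination of two minority samples, it always lies within the attribute-wise range of the minority class; in high dimensions the attribute subset keeps the neighborhood computation focused on task-relevant features.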
••
TL;DR: In this paper, the authors argue that the search for explainable models and interpretable decisions in AI must be reformulated in terms of the broader project of offering a pragmatic and naturalistic account of understanding in AI.
Abstract: In this paper I argue that the search for explainable models and interpretable decisions in AI must be reformulated in terms of the broader project of offering a pragmatic and naturalistic account of understanding in AI. Intuitively, the purpose of providing an explanation of a model or a decision is to make it understandable to its stakeholders. But without a previous grasp of what it means to say that an agent understands a model or a decision, the explanatory strategies will lack a well-defined goal. Aside from providing a clearer objective for XAI, focusing on understanding also allows us to relax the factivity condition on explanation, which is impossible to fulfill in many machine learning models, and to focus instead on the pragmatic conditions that determine the best fit between a model and the methods and devices deployed to understand it. After an examination of the different types of understanding discussed in the philosophical and psychological literature, I conclude that interpretative or approximation models not only provide the best way to achieve the objectual understanding of a machine learning model, but are also a necessary condition to achieve post hoc interpretability. This conclusion is partly based on the shortcomings of the purely functionalist approach to post hoc interpretability that seems to be predominant in most recent literature.
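The interpretative or approximation models the paper favors can be illustrated with a global linear surrogate: an inspectable model fitted to the predictions of an opaque one (a generic sketch, not tied to any particular XAI tool; the black-box function is an invented stand-in for a trained model):

```python
import numpy as np

# An opaque "black box": an invented nonlinear function standing in for a
# trained machine learning model whose internals we cannot inspect.
def black_box(X):
    return 3.0 * X[:, 0] - 2.0 * X[:, 1] + 0.1 * np.sin(X[:, 0] * X[:, 1])

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 2))
y_hat = black_box(X)  # we only observe the model's predictions

# Global linear surrogate fitted by least squares to those predictions:
# an approximation model whose coefficients (roughly 3 and -2 here) are
# understandable even though the black box itself is not.
A = np.column_stack([X, np.ones(len(X))])
coef, *_ = np.linalg.lstsq(A, y_hat, rcond=None)
```

The surrogate is faithful only on average, not pointwise, which is exactly why the paper argues for judging such models by pragmatic fit between the model and the understanding they afford rather than by a strict factivity condition.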
••
Albert M. Sirunyan1, Robin Erbacher2, C. A. Carrillo Montoya3, Dave M Newbold4 +2319 more•Institutions (158)
TL;DR: In this article, a search for direct production of the supersymmetric partners of electrons or muons is presented in final states with two opposite-charge, same-flavour leptons (electrons and muons), no jets, and large missing transverse momentum.