
Showing papers by "Royal Holloway, University of London" published in 2019


Journal ArticleDOI
A. Abada1, Marcello Abbrescia2, Marcello Abbrescia3, Shehu S. AbdusSalam4 +1491 more · Institutions (239)
TL;DR: In this article, the authors present the second volume of the Future Circular Collider Conceptual Design Report, devoted to the electron-positron collider FCC-ee, and present the accelerator design, performance reach, a staged operation scenario, the underlying technologies, civil engineering, technical infrastructure, and an implementation plan.
Abstract: In response to the 2013 Update of the European Strategy for Particle Physics, the Future Circular Collider (FCC) study was launched, as an international collaboration hosted by CERN. This study covers a highest-luminosity high-energy lepton collider (FCC-ee) and an energy-frontier hadron collider (FCC-hh), which could, successively, be installed in the same 100 km tunnel. The scientific capabilities of the integrated FCC programme would serve the worldwide community throughout the 21st century. The FCC study also investigates an LHC energy upgrade, using FCC-hh technology. This document constitutes the second volume of the FCC Conceptual Design Report, devoted to the electron-positron collider FCC-ee. After summarizing the physics discovery opportunities, it presents the accelerator design, performance reach, a staged operation scenario, the underlying technologies, civil engineering, technical infrastructure, and an implementation plan. FCC-ee can be built with today’s technology. Most of the FCC-ee infrastructure could be reused for FCC-hh. Combining concepts from past and present lepton colliders and adding a few novel elements, the FCC-ee design promises outstandingly high luminosity. This will make the FCC-ee a unique precision instrument to study the heaviest known particles (Z, W and H bosons and the top quark), offering great direct and indirect sensitivity to new physics.

526 citations


Journal ArticleDOI
A. Abada1, Marcello Abbrescia2, Marcello Abbrescia3, Shehu S. AbdusSalam4 +1496 more · Institutions (238)
TL;DR: In this paper, the authors describe the detailed design and preparation of a construction project for a post-LHC circular energy frontier collider in collaboration with national institutes, laboratories and universities worldwide, and enhanced by a strong participation of industrial partners.
Abstract: Particle physics has arrived at an important moment of its history. The discovery of the Higgs boson, with a mass of 125 GeV, completes the matrix of particles and interactions that has constituted the “Standard Model” for several decades. This model is a consistent and predictive theory, which has so far proven successful at describing all phenomena accessible to collider experiments. However, several experimental facts do require the extension of the Standard Model, and explanations are needed for observations such as the abundance of matter over antimatter, the striking evidence for dark matter and the non-zero neutrino masses. Theoretical issues such as the hierarchy problem, and, more generally, the dynamical origin of the Higgs mechanism, likewise point to the existence of physics beyond the Standard Model. This report contains the description of a novel research infrastructure based on a highest-energy hadron collider with a centre-of-mass collision energy of 100 TeV and an integrated luminosity at least a factor of 5 larger than that of the HL-LHC. It will extend the current energy frontier by almost an order of magnitude. The mass reach for direct discovery will be several tens of TeV, allowing, for example, the production of new particles whose existence could be indirectly exposed by precision measurements during the preceding e+e– collider phase. This collider will also precisely measure the Higgs self-coupling and thoroughly explore the dynamics of electroweak symmetry breaking at the TeV scale, to elucidate the nature of the electroweak phase transition. WIMPs as thermal dark matter candidates will be discovered, or ruled out. As a single project, this particle collider infrastructure will serve the world-wide physics community for about 25 years and, in combination with a lepton collider (see FCC conceptual design report volume 2), will provide a research tool until the end of the 21st century. Collision energies beyond 100 TeV can be considered when using high-temperature superconductors. The European Strategy for Particle Physics (ESPP) update 2013 stated “To stay at the forefront of particle physics, Europe needs to be in a position to propose an ambitious post-LHC accelerator project at CERN by the time of the next Strategy update”. The FCC study has implemented the ESPP recommendation by developing a long-term vision for an “accelerator project in a global context”. This document describes the detailed design and preparation of a construction project for a post-LHC circular energy frontier collider “in collaboration with national institutes, laboratories and universities worldwide”, and enhanced by a strong participation of industrial partners. A coordinated preparation effort can now be based on a core of an ever-growing consortium of already more than 135 institutes worldwide. The technology for constructing a high-energy circular hadron collider can be brought to the technology readiness level required for construction within the coming ten years through a focused R&D programme.
The FCC-hh concept comprises, in the baseline scenario: a power-saving, low-temperature superconducting magnet system based on an evolution of the Nb3Sn technology pioneered at the HL-LHC; an energy-efficient cryogenic refrigeration infrastructure based on a neon-helium (Nelium) light gas mixture; a high-reliability, low-loss cryogen distribution infrastructure based on Invar; high-power distributed beam transfer using superconducting elements; and local magnet energy recovery and re-use technologies that are already being introduced gradually at other CERN accelerators. On a longer timescale, high-temperature superconductors can be developed together with industrial partners to achieve an even more energy-efficient particle collider or to reach even higher collision energies. The re-use of the LHC and its injector chain, which also serve a concurrently running physics programme, is an essential lever in arriving at an overall sustainable research infrastructure at the energy frontier. Strategic R&D for FCC-hh aims at minimising construction cost and energy consumption, while maximising the socio-economic impact. It will mitigate technology-related risks and ensure that industry can benefit from an acceptable utility. Concerning the implementation, a preparatory phase of about eight years is both necessary and adequate to establish the project governance and organisation structures, to build the international machine and experiment consortia, to develop a territorial implantation plan in agreement with the host-states’ requirements, to optimise the disposal of land and underground volumes, and to prepare the civil engineering project. Such a large-scale, international fundamental research infrastructure, tightly involving industrial partners and providing training at all education levels, will be a strong motor of economic and societal development in all participating nations. The FCC study has implemented a set of actions towards a coherent vision for the world-wide high-energy and particle physics community, providing a collaborative framework for topically complementary and geographically well-balanced contributions. This conceptual design report lays the foundation for a subsequent infrastructure preparatory and technical design phase.

425 citations


Journal ArticleDOI
A. Abada1, Marcello Abbrescia2, Marcello Abbrescia3, Shehu S. AbdusSalam4 +1501 more · Institutions (239)
TL;DR: In this article, the physics opportunities of the Future Circular Collider (FCC) were reviewed, covering its e+e-, pp, ep and heavy ion programmes, and the measurement capabilities of each FCC component, addressing the study of electroweak, Higgs and strong interactions.
Abstract: We review the physics opportunities of the Future Circular Collider, covering its e+e-, pp, ep and heavy ion programmes. We describe the measurement capabilities of each FCC component, addressing the study of electroweak, Higgs and strong interactions, the top quark and flavour, as well as phenomena beyond the Standard Model. We highlight the synergy and complementarity of the different colliders, which will contribute to a uniquely coherent and ambitious research programme, providing an unmatchable combination of precision and sensitivity to new physics.

407 citations


Journal ArticleDOI
TL;DR: This is the first set of consistently dated marine sediment cores enabling paleoclimate scientists to evaluate leads/lags between circulation and climate changes over vast regions of the Atlantic Ocean.
Abstract: Rapid changes in ocean circulation and climate have been observed in marine-sediment and ice cores over the last glacial period and deglaciation, highlighting the non-linear character of the climate system and underlining the possibility of rapid climate shifts in response to anthropogenic greenhouse gas forcing. To date, these rapid changes in climate and ocean circulation are still not fully explained. One obstacle hindering progress in our understanding of the interactions between past ocean circulation and climate changes is the difficulty of accurately dating marine cores. Here, we present a set of 92 marine sediment cores from the Atlantic Ocean for which we have established age-depth models that are consistent with the Greenland GICC05 ice core chronology, and computed the associated dating uncertainties, using a new deposition modeling technique. This is the first set of consistently dated marine sediment cores enabling paleoclimate scientists to evaluate leads/lags between circulation and climate changes over vast regions of the Atlantic Ocean. Moreover, this data set is of direct use in paleoclimate modeling studies.
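The core idea of an age-depth model can be sketched in a few lines. This is a minimal illustration with invented tie points, not the paper's Bayesian deposition-modeling technique, which also propagates dating uncertainties:

import numpy as np

# Hypothetical dated tie points for a single core: depths (cm) and calendar
# ages (years before 2000 CE). Real cores use radiocarbon dates and
# stratigraphic alignment to the Greenland GICC05 ice-core chronology.
tie_depths_cm = np.array([0.0, 120.0, 310.0, 545.0])
tie_ages_b2k = np.array([0.0, 3200.0, 11700.0, 24000.0])

def age_at_depth(depth_cm: float) -> float:
    """Interpolate a calendar age for any depth between the tie points."""
    return float(np.interp(depth_cm, tie_depths_cm, tie_ages_b2k))

print(age_at_depth(200.0))  # age estimate between the second and third tie points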

399 citations


Journal ArticleDOI
TL;DR: The increase in the methane burden began in 2007, with the mean global mole fraction in remote surface background air rising from about 1775 ppb in 2006 to 1850 ppb by 2017, at rates not observed since the 1980s as discussed by the authors.
Abstract: Atmospheric methane grew very rapidly in 2014 (12.7±0.5 ppb/yr), 2015 (10.1±0.7 ppb/yr), 2016 (7.0±0.7 ppb/yr) and 2017 (7.7±0.7 ppb/yr), at rates not observed since the 1980s. The increase in the methane burden began in 2007, with the mean global mole fraction in remote surface background air rising from about 1775 ppb in 2006 to 1850 ppb in 2017. Simultaneously, the 13C/12C isotopic ratio (expressed as δ13C-CH4) has shifted in a new trend to more negative values that has been observed worldwide for over a decade. The causes of methane's recent mole fraction increase are therefore either a change in the relative proportions (and totals) of emissions from biogenic, thermogenic and pyrogenic sources, especially in the tropics and sub-tropics, or a decline in the atmospheric sink of methane, or both. Unfortunately, with limited measurement data sets, it is not currently possible to be more definitive. The climate warming impact of the observed methane increase over the past decade, if continued at >5 ppb/yr in the coming decades, is sufficient to challenge the Paris Agreement, which requires sharp cuts in the atmospheric methane burden. However, anthropogenic methane emissions are relatively large and thus offer attractive targets for rapid reduction, which is essential if the Paris Agreement aims are to be attained.
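As a quick consistency check of the figures quoted above: rising from about 1775 ppb in 2006 to 1850 ppb in 2017 implies a mean growth rate of roughly 6.8 ppb/yr, below the 7-13 ppb/yr annual rates reported for 2014-2017, reflecting the slower growth in the earlier part of the period:

# Mean methane growth rate implied by the quoted endpoint mole fractions.
mean_rate_ppb_per_yr = (1850 - 1775) / (2017 - 2006)
print(f"{mean_rate_ppb_per_yr:.1f} ppb/yr")  # ~6.8 ppb/yr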

329 citations


Journal ArticleDOI
Georges Aad1, Alexander Kupco2, Samuel Webb3, Timo Dreyer4 +3380 more · Institutions (206)
TL;DR: In this article, a search for high-mass dielectron and dimuon resonances in the mass range of 250 GeV to 6 TeV was performed at the Large Hadron Collider.

248 citations


Journal ArticleDOI
Morad Aaboud, Georges Aad1, Brad Abbott2, Dale Charles Abbott3 +2936 more · Institutions (198)
TL;DR: An exclusion limit on the H→invisible branching ratio of 0.26(0.17_{-0.05}^{+0.07}) at 95% confidence level is observed (expected) in combination with the results at sqrt[s]=7 and 8 TeV.
Abstract: Dark matter particles, if sufficiently light, may be produced in decays of the Higgs boson. This Letter presents a statistical combination of searches for H→invisible decays where H is produced according to the standard model via vector boson fusion, Z(ll)H, and W/Z(had)H, all performed with the ATLAS detector using 36.1 fb^{-1} of pp collisions at a center-of-mass energy of sqrt[s]=13 TeV at the LHC. In combination with the results at sqrt[s]=7 and 8 TeV, an exclusion limit on the H→invisible branching ratio of 0.26(0.17_{-0.05}^{+0.07}) at 95% confidence level is observed (expected).

234 citations


Journal ArticleDOI
Naihui Zhou1, Yuxiang Jiang2, Timothy Bergquist3, Alexandra J. Lee4 +185 more · Institutions (71)
TL;DR: The third CAFA challenge, CAFA3, that featured an expanded analysis over the previous CAFA rounds, both in terms of volume of data analyzed and the types of analysis performed, concluded that while predictions of the molecular function and biological process annotations have slightly improved over time, those of the cellular component have not.
Abstract: The Critical Assessment of Functional Annotation (CAFA) is an ongoing, global, community-driven effort to evaluate and improve the computational annotation of protein function. Here, we report on the results of the third CAFA challenge, CAFA3, which featured an expanded analysis over the previous CAFA rounds, both in terms of the volume of data analyzed and the types of analysis performed. In a major new development, computational predictions and assessment goals drove some of the experimental assays, resulting in new functional annotations for more than 1000 genes. Specifically, we performed experimental whole-genome mutation screening in Candida albicans and Pseudomonas aeruginosa genomes, which provided us with genome-wide experimental data for genes associated with biofilm formation and motility. We further performed targeted assays on selected genes in Drosophila melanogaster, which we suspected of being involved in long-term memory. We conclude that while predictions of the molecular function and biological process annotations have slightly improved over time, those of the cellular component have not. Term-centric prediction of experimental annotations remains equally challenging; although the performance of the top methods is significantly better than the expectations set by baseline methods in C. albicans and D. melanogaster, it leaves considerable room and need for improvement. Finally, we report that the CAFA community now involves a broad range of participants with expertise in bioinformatics, biological experimentation, biocuration, and bio-ontologies, working together to improve functional annotation, computational function prediction, and our ability to manage big data in the era of large experimental screens.
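For readers unfamiliar with how such predictions are scored, CAFA's headline protein-centric metric is Fmax, the best F-measure over all prediction-score thresholds. Below is a simplified sketch; the official assessment additionally propagates predicted terms over the Gene Ontology graph, which is omitted here:

import numpy as np

def f_max(pred_scores, true_terms, thresholds=np.linspace(0.01, 1.0, 100)):
    """Simplified protein-centric Fmax.
    pred_scores: dict protein -> dict of term -> score in [0, 1]
    true_terms:  dict protein -> set of experimentally annotated terms
    """
    best = 0.0
    for t in thresholds:
        precisions, recalls = [], []
        for prot, truth in true_terms.items():
            pred = {term for term, s in pred_scores.get(prot, {}).items() if s >= t}
            if pred:  # precision is averaged only over proteins with predictions
                precisions.append(len(pred & truth) / len(pred))
            recalls.append(len(pred & truth) / len(truth) if truth else 0.0)
        if precisions:
            pr, rc = np.mean(precisions), np.mean(recalls)
            if pr + rc > 0:
                best = max(best, 2 * pr * rc / (pr + rc))
    return best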

227 citations


Journal ArticleDOI
Georges Aad1, Alexander Kupco2, Samuel Webb3, Timo Dreyer4 +2962 more · Institutions (195)
TL;DR: In this article, an improved energy clustering algorithm is introduced, and its implications for the measurement and identification of prompt electrons and photons are discussed in detail, along with the corrections and calibrations that affect performance, including energy calibration and identification and isolation efficiencies.
Abstract: This paper describes the reconstruction of electrons and photons with the ATLAS detector, employed for measurements and searches exploiting the complete LHC Run 2 dataset. An improved energy clustering algorithm is introduced, and its implications for the measurement and identification of prompt electrons and photons are discussed in detail. Corrections and calibrations that affect performance, including energy calibration, identification and isolation efficiencies, and the measurement of the charge of reconstructed electron candidates are determined using up to 81 fb−1 of proton-proton collision data collected at √s=13 TeV between 2015 and 2017.

227 citations


Journal ArticleDOI
TL;DR: The hope is that the renewed sociotechnical frame for the IS discipline discussed in the paper holds potential to contribute to the enduring strength of the diverse, distinctive, yet unified discipline.
Abstract: The sociotechnical perspective is often seen as one of the foundational viewpoints—or an “axis of cohesion”— for the Information Systems (IS) discipline, contributing to both its distinctiveness and its ability to coherently expand its boundaries. However, our review of papers in the two leading IS journals from 2000 to 2016 suggests that IS research has lost sight of the discipline’s sociotechnical character—a character that was widely acknowledged at the discipline’s inception. This is a problem because an axis of cohesion can be fundamental to a discipline’s long-term vitality. In order to address this issue, we offer ways to renew the sociotechnical perspective so that it can continue to serve as a distinctive and coherent foundation for the discipline. Our hope is that the renewed sociotechnical frame for the IS discipline discussed in the paper holds potential to contribute to the enduring strength of our diverse, distinctive, yet unified discipline. It also prompts members of the discipline to think more deeply about what it means to be an IS scholar.

225 citations


Journal ArticleDOI
Georges Aad1, Alexander Kupco2, Samuel Webb3, Timo Dreyer4 +2961 more · Institutions (196)
TL;DR: In this article, the algorithms used by the ATLAS Collaboration during Run 2 of the Large Hadron Collider (LHC) to identify jets containing b-hadrons are presented, and their performance is evaluated in the s ...
Abstract: The algorithms used by the ATLAS Collaboration during Run 2 of the Large Hadron Collider to identify jets containing b-hadrons are presented. The performance of the algorithms is evaluated in the s ...

Journal ArticleDOI
TL;DR: Findings indicated a gender difference in reasons for camouflaging, with autistic women more likely to endorse “conventional” reasons (e.g. getting by in formal settings such as work), which have implications for understanding camouflaging in autistic adults.
Abstract: Camouflaging entails 'masking' in or 'passing' social situations. Research suggests camouflaging behaviours are common in autistic people, and may negatively impact mental health. To enhance understanding of camouflaging, this study examined the reasons, contexts and costs of camouflaging. 262 autistic people completed measures of camouflaging behaviours, camouflaging contexts (e.g. work vs. family), camouflaging reasons (e.g. to make friends) and mental health symptoms. Findings indicated a gender difference in reasons for camouflaging, with autistic women more likely to endorse "conventional" reasons (e.g. getting by in formal settings such as work). Both camouflaging highly across contexts and 'switching' between camouflaging in some contexts but not others were related to poorer mental health. These findings have implications for understanding camouflaging in autistic adults.

Proceedings Article
14 Aug 2019
TL;DR: In this article, the authors argue that results are commonly inflated due to two pervasive sources of experimental bias: spatial bias, caused by distributions of training and testing data that are not representative of a real-world deployment, and temporal bias, caused by incorrect time splits of training and testing sets.
Abstract: Is Android malware classification a solved problem? Published F1 scores of up to 0.99 appear to leave very little room for improvement. In this paper, we argue that results are commonly inflated due to two pervasive sources of experimental bias: "spatial bias" caused by distributions of training and testing data that are not representative of a real-world deployment; and "temporal bias" caused by incorrect time splits of training and testing sets, leading to impossible configurations. We propose a set of space and time constraints for experiment design that eliminates both sources of bias. We introduce a new metric that summarizes the expected robustness of a classifier in a real-world setting, and we present an algorithm to tune its performance. Finally, we demonstrate how this allows us to evaluate mitigation strategies for time decay such as active learning. We have implemented our solutions in TESSERACT, an open source evaluation framework for comparing malware classifiers in a realistic setting. We used TESSERACT to evaluate three Android malware classifiers from the literature on a dataset of 129K applications spanning over three years. Our evaluation confirms that earlier published results are biased, while also revealing counter-intuitive performance and showing that appropriate tuning can lead to significant improvements.
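The temporal constraint at the heart of the argument is easy to state in code. The sketch below is illustrative only (it is not the TESSERACT API): every training sample must predate every test sample, so a classifier is never trained on applications "from the future":

from datetime import datetime

def temporal_split(samples, split_date):
    """samples: iterable of (features, label, timestamp) tuples."""
    train = [s for s in samples if s[2] < split_date]
    test = [s for s in samples if s[2] >= split_date]
    return train, test

apps = [("feats_a", 0, datetime(2014, 5, 1)),
        ("feats_b", 1, datetime(2015, 7, 9)),
        ("feats_c", 0, datetime(2016, 2, 3))]
train, test = temporal_split(apps, datetime(2015, 1, 1))
print(len(train), len(test))  # 1 training sample, 2 test samples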

Journal ArticleDOI
11 Apr 2019-Nature
TL;DR: Analysis of records from 414 societies that span the past 10,000 years from 30 regions around the world reveals that moralizing gods follow—rather than precede—large increases in social complexity.
Abstract: The origins of religion and of complex societies represent evolutionary puzzles1–8. The ‘moralizing gods’ hypothesis offers a solution to both puzzles by proposing that belief in morally concerned supernatural agents culturally evolved to facilitate cooperation among strangers in large-scale societies9–13. Although previous research has suggested an association between the presence of moralizing gods and social complexity3,6,7,9–18, the relationship between the two is disputed9–13,19–24, and attempts to establish causality have been hampered by limitations in the availability of detailed global longitudinal data. To overcome these limitations, here we systematically coded records from 414 societies that span the past 10,000 years from 30 regions around the world, using 51 measures of social complexity and 4 measures of supernatural enforcement of morality. Our analyses not only confirm the association between moralizing gods and social complexity, but also reveal that moralizing gods follow—rather than precede—large increases in social complexity. Contrary to previous predictions9,12,16,18, powerful moralizing ‘big gods’ and prosocial supernatural punishment tend to appear only after the emergence of ‘megasocieties’ with populations of more than around one million people. Moralizing gods are not a prerequisite for the evolution of social complexity, but they may help to sustain and expand complex multi-ethnic empires after they have become established. By contrast, rituals that facilitate the standardization of religious traditions across large populations25,26 generally precede the appearance of moralizing gods. This suggests that ritual practices were more important than the particular content of religious belief to the initial rise of social complexity. Belief in moralizing gods followed the expansion of human societies and may have been preceded by doctrinal rituals that contributed to the initial rise of social complexity.

Journal ArticleDOI
Morad Aaboud, Georges Aad1, Brad Abbott2, Dale Charles Abbott3 +3001 more · Institutions (220)
TL;DR: In this paper, the decays B0s → μ+μ− and B0 → μ+μ− have been studied using 26.3 fb−1 of 13 TeV LHC proton-proton collision data collected with the ATLAS detector in 2015 and 2016.
Abstract: A study of the decays B0s → μ+μ− and B0 → μ+μ− has been performed using 26.3 fb−1 of 13 TeV LHC proton-proton collision data collected with the ATLAS detector in 2015 and 2016. Since the detector resolut ...

Journal ArticleDOI
TL;DR: Magnetic force microscopy (MFM) has become a truly widespread and commonly used characterization technique that has been applied to a variety of research and industrial applications as discussed by the authors, where the main advantages of the method include its high spatial resolution (typically ∼50 nm), ability to work at variable temperature and in applied magnetic fields, versatility, and simplicity in operation, with almost no need for sample preparation.
Abstract: Since it was first demonstrated in 1987, magnetic force microscopy (MFM) has become a truly widespread and commonly used characterization technique that has been applied to a variety of research and industrial applications. Some of the main advantages of the method include its high spatial resolution (typically ∼50 nm), ability to work at variable temperature and in applied magnetic fields, versatility, and simplicity in operation, with almost no need for sample preparation. However, for most commercial systems, the technique has historically provided only qualitative information, and the number of available modes was typically limited, thus not reflecting the experimental demands. Additionally, the range of samples under study was largely restricted to “classic” ferromagnetic samples (typically, thin films or patterned nanostructures). Throughout this Perspective article, the recent progress and development of MFM is described, followed by a summary of the current state-of-the-art techniques and objects for study. Finally, the future of this fascinating field is discussed in the context of emerging instrumental and material developments. Aspects including quantitative MFM, the accurate interpretation of MFM images, new instrumentation, probe-engineering alternatives, and applications of MFM to new (often interdisciplinary) areas of materials science, physics, and biology will be discussed. We first describe the physical principles of MFM, specifically paying attention to common artifacts frequently occurring in MFM measurements; then, we present a comprehensive review of the recent developments in MFM modes, instrumentation, and the main application areas; finally, the importance of the technique is speculated upon for emerging fields, and those anticipated to emerge, including skyrmions, 2D materials, and topological insulators.
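The physical principle the article builds on can be summarized by the standard small-gradient relation for phase-detection MFM (a textbook result, not a formula specific to this Perspective): for a cantilever with quality factor Q and spring constant k, the measured phase shift tracks the vertical gradient of the tip-sample force,

\Delta\phi \approx -\frac{Q}{k}\,\frac{\partial F_z}{\partial z},

which is why MFM images map force gradients rather than the stray field itself, and why quantitative MFM requires careful probe calibration.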

Journal ArticleDOI
A. Abada1, Marcello Abbrescia2, Marcello Abbrescia3, Shehu S. AbdusSalam4 +1496 more · Institutions (238)
TL;DR: The third volume of the FCC Conceptual Design Report as discussed by the authors is devoted to the hadron collider FCC-hh, and summarizes the physics discovery opportunities, presents the FCC-HH accelerator design, performance reach, and staged operation plan, discusses the underlying technologies, the civil engineering and technical infrastructure, and also sketches a possible implementation.
Abstract: In response to the 2013 Update of the European Strategy for Particle Physics (EPPSU), the Future Circular Collider (FCC) study was launched as a world-wide international collaboration hosted by CERN. The FCC study covered an energy-frontier hadron collider (FCC-hh), a highest-luminosity high-energy lepton collider (FCC-ee), the corresponding 100 km tunnel infrastructure, as well as the physics opportunities of these two colliders, and a high-energy LHC, based on FCC-hh technology. This document constitutes the third volume of the FCC Conceptual Design Report, devoted to the hadron collider FCC-hh. It summarizes the FCC-hh physics discovery opportunities, presents the FCC-hh accelerator design, performance reach, and staged operation plan, discusses the underlying technologies, the civil engineering and technical infrastructure, and also sketches a possible implementation. Combining ingredients from the Large Hadron Collider (LHC), the high-luminosity LHC upgrade and adding novel technologies and approaches, the FCC-hh design aims at significantly extending the energy frontier to 100 TeV. Its unprecedented centre-of-mass collision energy will make the FCC-hh a unique instrument to explore physics beyond the Standard Model, offering great direct sensitivity to new physics and discoveries.

Journal ArticleDOI
David Curtin1, Marco Drewes2, Matthew McCullough3, Patrick Meade4, Rabindra N. Mohapatra5, Jessie Shelton6, Brian Shuve7, Brian Shuve8, Elena Accomando9, Cristiano Alpigiani10, Stefan Antusch11, J. C. Arteaga-Velázquez12, Brian Batell13, Martin Bauer14, Nikita Blinov7, Karen S. Caballero-Mora, Jae Hyeok Chang4, Eung Jin Chun15, Raymond T. Co16, Timothy Cohen17, Peter Cox18, Nathaniel Craig19, Csaba Csáki20, Yanou Cui21, Francesco D'Eramo22, Luigi Delle Rose23, P. S. Bhupal Dev24, Keith R. Dienes5, Keith R. Dienes25, Jeff A. Dror26, Jeff A. Dror27, Rouven Essig4, Jared A. Evans6, Jared A. Evans28, Jason L. Evans15, Arturo Fernandez Tellez29, Oliver Fischer30, Thomas Flacke, Anthony Fradette31, Claudia Frugiuele32, Elina Fuchs32, Tony Gherghetta33, Gian F. Giudice3, Dmitry Gorbunov34, Rajat Gupta35, Claudia Hagedorn36, Lawrence J. Hall27, Lawrence J. Hall26, Philip Harris37, Juan Carlos Helo38, Juan Carlos Helo39, Martin Hirsch40, Yonit Hochberg41, Anson Hook5, Alejandro Ibarra42, Alejandro Ibarra15, Seyda Ipek43, Sunghoon Jung44, Simon Knapen27, Simon Knapen26, Eric Kuflik41, Zhen Liu, Salvator Lombardo20, Henry Lubatti10, David McKeen45, Emiliano Molinaro46, Stefano Moretti47, Stefano Moretti9, Natsumi Nagata18, Matthias Neubert48, Matthias Neubert20, Jose Miguel No49, Jose Miguel No50, Emmanuel Olaiya47, Gilad Perez32, Michael E. Peskin7, David Pinner51, David Pinner52, Maxim Pospelov31, Maxim Pospelov53, Matthew Reece52, Dean J. Robinson28, Mario Rodriguez Cahuantzi29, R. Santonico54, Matthias Schlaffer32, Claire H. Shepherd-Themistocleous47, Andrew Spray, Daniel Stolarski55, Martin A. Subieta Vasquez56, Raman Sundrum5, Andrea Thamm3, Brooks Thomas57, Yuhsin Tsai5, Brock Tweedie13, Stephen M. West58, Charles Young7, Felix Yu48, Bryan Zaldivar50, Bryan Zaldivar59, Yongchao Zhang60, Yongchao Zhang24, Kathryn M. Zurek27, Kathryn M. Zurek26, Kathryn M. Zurek3, José Zurita30 
University of Toronto1, Université catholique de Louvain2, CERN3, C. N. Yang Institute for Theoretical Physics4, University of Maryland, College Park5, University of Illinois at Urbana–Champaign6, Stanford University7, Harvey Mudd College8, University of Southampton9, University of Washington10, University of Basel11, Universidad Michoacana de San Nicolás de Hidalgo12, University of Pittsburgh13, Heidelberg University14, Korea Institute for Advanced Study15, University of Michigan16, University of Oregon17, University of Tokyo18, University of California, Santa Barbara19, Cornell University20, University of California, Riverside21, University of Padua22, University of Florence23, Washington University in St. Louis24, University of Arizona25, Lawrence Berkeley National Laboratory26, University of California, Berkeley27, University of Cincinnati28, Benemérita Universidad Autónoma de Puebla29, Karlsruhe Institute of Technology30, University of Victoria31, Weizmann Institute of Science32, University of Minnesota33, Moscow Institute of Physics and Technology34, Durham University35, University of Southern Denmark36, Massachusetts Institute of Technology37, Valparaiso University38, University of La Serena39, Spanish National Research Council40, Hebrew University of Jerusalem41, Technische Universität München42, University of California, Irvine43, Seoul National University44, TRIUMF45, Aarhus University46, Rutherford Appleton Laboratory47, University of Mainz48, King's College London49, Autonomous University of Madrid50, Brown University51, Harvard University52, Perimeter Institute for Theoretical Physics53, University of Rome Tor Vergata54, Carleton University55, Higher University of San Andrés56, Lafayette College57, Royal Holloway, University of London58, University of Grenoble59, Université libre de Bruxelles60
TL;DR: A model-independent approach is developed to describe the sensitivity of MATHUSLA to BSM LLP signals, and a general discussion of the top-down and bottom-up motivations for LLP searches is synthesized to demonstrate the exceptional strength and breadth of the physics case for the construction of the MATHUSLA detector.
Abstract: We examine the theoretical motivations for long-lived particle (LLP) signals at the LHC in a comprehensive survey of standard model (SM) extensions. LLPs are a common prediction of a wide range of theories that address unsolved fundamental mysteries such as naturalness, dark matter, baryogenesis and neutrino masses, and represent a natural and generic possibility for physics beyond the SM (BSM). In most cases the LLP lifetime can be treated as a free parameter from the μm scale up to the Big Bang Nucleosynthesis limit of ∼10^7 m. Neutral LLPs with lifetimes above ∼100 m are particularly difficult to probe, as the sensitivity of the LHC main detectors is limited by challenging backgrounds, triggers, and small acceptances. MATHUSLA is a proposal for a minimally instrumented, large-volume surface detector near ATLAS or CMS. It would search for neutral LLPs produced in HL-LHC collisions by reconstructing displaced vertices (DVs) in a low-background environment, extending the sensitivity of the main detectors by orders of magnitude in the long-lifetime regime. We study the LLP physics opportunities afforded by a MATHUSLA-like detector at the HL-LHC, assuming backgrounds can be rejected as expected. We develop a model-independent approach to describe the sensitivity of MATHUSLA to BSM LLP signals, and compare it to DV and missing energy searches at ATLAS or CMS. We then explore the BSM motivations for LLPs in considerable detail, presenting a large number of new sensitivity studies. While our discussion is especially oriented towards the long-lifetime regime at MATHUSLA, this survey underlines the importance of a varied LLP search program at the LHC in general. By synthesizing these results into a general discussion of the top-down and bottom-up motivations for LLP searches, it is our aim to demonstrate the exceptional strength and breadth of the physics case for the construction of the MATHUSLA detector.
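The long-lifetime argument can be made quantitative with the standard decay-in-flight probability (a textbook relation, not quoted from the paper): an LLP with boost βγ and proper lifetime τ decays inside a detector volume spanning distances L1 to L2 from the interaction point with probability

P_{\text{decay}} = e^{-L_1/(\beta\gamma c\tau)} - e^{-L_2/(\beta\gamma c\tau)} \approx \frac{L_2 - L_1}{\beta\gamma c\tau} \quad \text{for } \beta\gamma c\tau \gg L_2,

so in the long-lifetime regime the sensitivity falls only linearly with lifetime, and a large decay volume such as MATHUSLA's buys reach directly.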

Journal ArticleDOI
TL;DR: The Sustainable Development Goals and the New Urban Agenda recognise the role of cities in achieving sustainable development as discussed by the authors. However, these agendas were agreed and signed by national government authorities.
Abstract: The Sustainable Development Goals and the New Urban Agenda recognise the role of cities in achieving sustainable development. However, these agendas were agreed and signed by national gover...

Journal ArticleDOI
Morad Aaboud, Alexander Kupco1, Samuel Webb2, Timo Dreyer3 +2969 more · Institutions (195)
TL;DR: Algorithms used for the reconstruction and identification of electrons in the central region of the ATLAS detector at the Large Hadron Collider (LHC) are presented in this article; these algorithms a ...
Abstract: Algorithms used for the reconstruction and identification of electrons in the central region of the ATLAS detector at the Large Hadron Collider (LHC) are presented in this paper; these algorithms a ...

Journal ArticleDOI
TL;DR: The complex interplay between abiotic stressors, host trees, insect herbivores and their natural enemies makes it very difficult to predict overall consequences of climate change on forest health, so process-based models are needed to simulate pest population dynamics under climate change scenarios.
Abstract: Climate change is a multi-faceted phenomenon, including elevated CO2, warmer temperatures, more severe droughts and more frequent storms. All these components can affect forest pests directly, or indirectly through interactions with host trees and natural enemies. Most of the responses of forest insect herbivores to climate change are expected to be positive, with shorter generation time, higher fecundity and survival, leading to increased range expansion and outbreaks. Forest insect pests can also benefit from synergistic effects of several climate change pressures, such as hotter droughts or warmer storms. However, lesser known negative effects are also likely, such as lethal effects of heat waves or thermal shocks, less palatable host tissues or more abundant parasitoids and predators. The complex interplay between abiotic stressors, host trees, insect herbivores and their natural enemies makes it very difficult to predict overall consequences of climate change on forest health. This calls for the development of process-based models to simulate pest population dynamics under climate change scenarios.
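As a toy illustration of the process-based modelling the authors call for, the sketch below couples discrete-time logistic population dynamics to a Gaussian thermal performance curve; all parameter values are invented for illustration only:

import numpy as np

def growth_rate(temp_c, r_max=0.15, t_opt=24.0, width=8.0):
    """Hypothetical thermal performance curve for the intrinsic growth rate."""
    return r_max * np.exp(-((temp_c - t_opt) / width) ** 2)

def simulate(temps, n0=100.0, carrying_capacity=1e5):
    """Discrete-time logistic pest dynamics driven by a temperature series."""
    n = n0
    for t in temps:
        n += growth_rate(t) * n * (1 - n / carrying_capacity)
    return n

baseline = simulate(np.full(50, 18.0))    # 50 years at 18 degrees C
warmed = simulate(np.full(50, 20.5))      # the same half-century, 2.5 degrees warmer
print(f"{baseline:.0f} vs {warmed:.0f}")  # warming raises the final abundance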

Journal ArticleDOI
TL;DR: Sequenceserver is a tool for running BLAST and visually inspecting BLAST results for biological interpretation and uses simple algorithms to prevent potential analysis errors and provides flexible text-based and visual outputs to support researcher productivity.
Abstract: Comparing newly obtained and previously known nucleotide and amino-acid sequences underpins modern biological research. BLAST is a well-established tool for such comparisons but is challenging to use on new data sets. We combined a user-centric design philosophy with sustainable software development approaches to create Sequenceserver, a tool for running BLAST and visually inspecting BLAST results for biological interpretation. Sequenceserver uses simple algorithms to prevent potential analysis errors and provides flexible text-based and visual outputs to support researcher productivity. Our software can be rapidly installed for use by individuals or on shared servers.

Posted ContentDOI
Naihui Zhou1, Yuxiang Jiang2, Timothy Bergquist3, Alexandra J. Lee4 +178 more · Institutions (67)
29 May 2019-bioRxiv
TL;DR: It is reported that the CAFA community now involves a broad range of participants with expertise in bioinformatics, biological experimentation, biocuration, and bio-ontologies, working together to improve functional annotation, computational function prediction, and the ability to manage big data in the era of large experimental screens.
Abstract: The Critical Assessment of Functional Annotation (CAFA) is an ongoing, global, community-driven effort to evaluate and improve the computational annotation of protein function. Here we report on the results of the third CAFA challenge, CAFA3, which featured an expanded analysis over the previous CAFA rounds, both in terms of the volume of data analyzed and the types of analysis performed. In a major new development, computational predictions and assessment goals drove some of the experimental assays, resulting in new functional annotations for more than 1000 genes. Specifically, we performed experimental whole-genome mutation screening in Candida albicans and Pseudomonas aeruginosa genomes, which provided us with genome-wide experimental data for genes associated with biofilm formation and motility (P. aeruginosa only). We further performed targeted assays on selected genes in Drosophila melanogaster, which we suspected of being involved in long-term memory. We conclude that, while predictions of the molecular function and biological process annotations have slightly improved over time, those of the cellular component have not. Term-centric prediction of experimental annotations remains equally challenging; although the performance of the top methods is significantly better than expectations set by baseline methods in C. albicans and D. melanogaster, it leaves considerable room and need for improvement. We finally report that the CAFA community now involves a broad range of participants with expertise in bioinformatics, biological experimentation, biocuration, and bio-ontologies, working together to improve functional annotation, computational function prediction, and our ability to manage big data in the era of large experimental screens.

Journal ArticleDOI
TL;DR: Although the lion's share of scholarship in management and organization studies conceives of organizations as entities within which communication occurs, “Communication Constitutes Organization” (C... as mentioned in this paper).
Abstract: Although the lion’s share of scholarship in management and organization studies conceives of organizations as entities within which communication occurs, “Communication Constitutes Organization” (C...

Journal ArticleDOI
TL;DR: In 2018, the International Union of Geological Sciences formally ratified a proposal to subdivide the Holocene into three stages/ages, along with their equivalent subseries/subepochs, each anchored by a Global boundary Stratotype Section and Point (GSSP) as mentioned in this paper.
Abstract: The Holocene, which currently spans ~11 700 years, is the shortest series/epoch within the geological time scale (GTS), yet it contains a rich archive of evidence in stratigraphical contexts that are frequently continuous and often preserved at high levels of resolution. On 14 June 2018, the Executive Committee of the International Union of Geological Sciences formally ratified a proposal to subdivide the Holocene into three stages/ages, along with their equivalent subseries/subepochs, each anchored by a Global boundary Stratotype Section and Point (GSSP). The new stages are the Greenlandian (Lower/Early Holocene Subseries/Subepoch) with its GSSP in the Greenland NGRIP2 ice core and dated at 11 700 a b2k (before 2000 CE); the Northgrippian (Middle Holocene Subseries/Subepoch) with its GSSP in the Greenland NGRIP1 ice core and dated at 8236 a b2k; and the Meghalayan (Upper/Late Holocene Subseries/Subepoch) with its GSSP in a speleothem from Mawmluh Cave, north-eastern India, with a date of 4250 a b2k. We explain the nomenclature of the new divisions, describe the procedures involved in the ratification process, designate auxiliary stratotypes to support the GSSPs and consider the implications of the subdivision for defining the Anthropocene as a new unit within the GTS.
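The three ratified stages and their GSSP base ages given above reduce to a simple lookup; a minimal illustration (not an official stratigraphic tool):

def stage_for_age(age_b2k: float) -> str:
    """Assign an age in years before 2000 CE to its Holocene stage."""
    if age_b2k > 11700:
        return "Pleistocene (pre-Holocene)"
    if age_b2k > 8236:
        return "Greenlandian"   # Lower/Early Holocene, GSSP in NGRIP2 ice core
    if age_b2k > 4250:
        return "Northgrippian"  # Middle Holocene, GSSP in NGRIP1 ice core
    return "Meghalayan"         # Upper/Late Holocene, GSSP at Mawmluh Cave

print(stage_for_age(10000))  # Greenlandian
print(stage_for_age(5000))   # Northgrippian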

Journal ArticleDOI
TL;DR: In this paper, the authors project that under a global 1.5°C scenario, warming will increase the number of days above 0°C in the Antarctic Peninsula, with up to 130 such days each year in the northern Peninsula.
Abstract: Warming of the Antarctic Peninsula in the latter half of the 20th century was greater than in any other terrestrial environment in the Southern Hemisphere, and obvious cryospheric and biological consequences have been observed. Under a global 1.5°C scenario, warming in the Antarctic Peninsula is likely to increase the number of days above 0°C, with up to 130 such days each year in the northern Peninsula. Ocean turbulence will increase, making the circumpolar deep water (CDW) both warmer and shallower, delivering heat to the sea surface and to coastal margins. Thinning and recession of the marine margins of glaciers and ice caps are expected to accelerate to terrestrial limits, increasing iceberg production, after which glacier retreat may slow on land. Ice shelves will experience a continued increase in meltwater production and consequent structural change, but not imminent regional collapses. Marine biota can respond in multiple ways to climatic changes, with effects complicated by past resource extraction activities. Southward distribution shifts have been observed in multiple taxa during the last century and these are likely to continue. Exposed (ice-free) terrestrial areas will expand, providing new habitats for native and non-native organisms, but with a potential loss of genetic diversity. While native terrestrial biota are likely to benefit from modest warming, the greatest threat to native biodiversity is from non-native terrestrial species.

Journal ArticleDOI
Morad Aaboud, Alexander Kupco, Samuel Webb1, Timo Dreyer +2921 more · Institutions (67)
TL;DR: In this article, the authors measured the yield and nuclear modification factor (R_AA) using Pb+Pb data at √s_NN = 5.02 TeV and 25 pb−1 of pp data at ...

Journal ArticleDOI
TL;DR: In this article, leptonic decays of W bosons extracted from 13 TeV proton-proton collisions at the LHC are used to search for heavy neutral leptons (HNLs) that are produced through mixing with muon or electron neutrinos.
Abstract: The problems of neutrino masses, matter-antimatter asymmetry, and dark matter could be successfully addressed by postulating right-handed neutrinos with Majorana masses below the electroweak scale. In this work, leptonic decays of W bosons extracted from 32.9 fb−1 to 36.1 fb−1 of 13 TeV proton–proton collisions at the LHC are used to search for heavy neutral leptons (HNLs) that are produced through mixing with muon or electron neutrinos. The search is conducted using the ATLAS detector in both prompt and displaced leptonic decay signatures. The prompt signature requires three leptons produced at the interaction point (either μμe or eeμ) with a veto on same-flavour opposite-charge topologies. The displaced signature comprises a prompt muon from the W boson decay and the requirement of a dilepton vertex (either μμ or μe) displaced in the transverse plane by 4–300 mm from the interaction point. The search sets constraints on the HNL mixing to muon and electron neutrinos for HNL masses in the range 4.5–50 GeV.

Journal ArticleDOI
Georges Aad1, Brad Abbott2, Dale Charles Abbott3, Ovsat Abdinov4 +2951 more · Institutions (199)
TL;DR: This Letter describes the observation of the light-by-light scattering process, γγ→γγ, in Pb+Pb collisions at sqrt[s_{NN}]=5.02 TeV, and the observed excess of events over the expected background has a significance of 8.2 standard deviations.
Abstract: This Letter describes the observation of the light-by-light scattering process, γγ→γγ, in Pb+Pb collisions at sqrt[s_{NN}]=5.02 TeV. The analysis is conducted using a data sample corresponding to an integrated luminosity of 1.73 nb^{-1}, collected in November 2018 by the ATLAS experiment at the LHC. Light-by-light scattering candidates are selected in events with two photons produced exclusively, each with transverse energy E_{T}^{γ}>3 GeV and pseudorapidity |η_{γ}|<2.4, diphoton invariant mass above 6 GeV, and small diphoton transverse momentum and acoplanarity. After applying all selection criteria, 59 candidate events are observed for a background expectation of 12±3 events. The observed excess of events over the expected background has a significance of 8.2 standard deviations. The measured fiducial cross section is 78±13(stat)±7(syst)±3(lumi) nb.
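For intuition about the quoted significance, a single-bin counting estimate with 59 observed events over an expected background of 12±3 can be computed with the standard asymptotic formula including a Gaussian background uncertainty (Cowan's one-bin result). This crude estimate gives roughly 6σ; the published 8.2σ comes from the full profile-likelihood fit to kinematic distributions, which this sketch does not reproduce:

import math

def z_counting(n_obs: float, b: float, sigma_b: float) -> float:
    """One-bin asymptotic significance with background uncertainty sigma_b."""
    t1 = n_obs * math.log(n_obs * (b + sigma_b**2) / (b**2 + n_obs * sigma_b**2))
    t2 = (b**2 / sigma_b**2) * math.log(
        1 + sigma_b**2 * (n_obs - b) / (b * (b + sigma_b**2)))
    return math.sqrt(2 * (t1 - t2))

print(f"{z_counting(59, 12, 3):.1f} sigma")  # ~6.3 sigma from counting alone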

Journal ArticleDOI
Georges Aad, Brad Abbott1, Dale Charles Abbott2, Ovsat Abdinov3 +2952 more · Institutions (60)
TL;DR: In this paper, a search for a heavy charged-boson resonance decaying into a charged lepton (electron or muon) and a neutrino is reported, where the observed transverse mass distribution computed from the lepton and missing transverse momenta is consistent with the distribution expected from the Standard Model.
Abstract: A search for a heavy charged-boson resonance decaying into a charged lepton (electron or muon) and a neutrino is reported. A data sample of 139 fb−1 of proton-proton collisions at √s=13 TeV collected with the ATLAS detector at the LHC during 2015–2018 is used in the search. The observed transverse mass distribution computed from the lepton and missing transverse momenta is consistent with the distribution expected from the Standard Model, and upper limits on the cross section for pp→W′→lν are extracted (l=e or μ). These vary between 1.3 pb and 0.05 fb depending on the resonance mass in the range between 0.15 and 7.0 TeV at 95% confidence level for the electron and muon channels combined. Gauge bosons with a mass below 6.0 and 5.1 TeV are excluded in the electron and muon channels, respectively, in a model with a resonance that has couplings to fermions identical to those of the Standard Model W boson. Cross-section limits are also provided for resonances with several fixed Γ/m values in the range between 1% and 15%. Model-independent limits are derived in single-bin signal regions defined by a varying minimum transverse mass threshold. The resulting visible cross-section upper limits range between 4.6 (15) pb and 22 (22) ab as the threshold increases from 130 (110) GeV to 5.1 (5.1) TeV in the electron (muon) channel.
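The transverse mass used in this search is the standard variable for leptonic decays of a heavy charged boson; in LaTeX notation,

m_T = \sqrt{2\,p_T^{\ell}\,E_T^{\text{miss}}\,(1 - \cos\Delta\phi_{\ell,\text{miss}})},

where p_T^ℓ is the lepton transverse momentum, E_T^miss the missing transverse momentum, and Δφ the azimuthal angle between them. For a two-body W′→lν decay the m_T distribution has a Jacobian edge near the resonance mass, which is why the search is performed in this variable.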