Author

Georges Aad

Bio: Georges Aad is an academic researcher from Aix-Marseille University. The author has contributed to research topics including the Large Hadron Collider and the Higgs boson. The author has an h-index of 135 and has co-authored 1,121 publications receiving 88,811 citations. Previous affiliations of Georges Aad include Centre national de la recherche scientifique & University of Udine.
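
For readers unfamiliar with the metric, the h-index quoted above is the largest h such that h of the author's publications each have at least h citations. A minimal illustrative sketch in Python (the example citation counts are invented):

```python
def h_index(citations: list[int]) -> int:
    """Largest h such that at least h papers have >= h citations."""
    cites = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(cites, start=1):
        if c >= rank:
            h = rank  # this paper still has at least `rank` citations
        else:
            break
    return h

# Example: three papers with 10, 5, and 1 citations -> h-index 2.
print(h_index([10, 5, 1]))  # 2
```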


Papers
Journal ArticleDOI
01 Apr 2010
TL;DR: This work discusses how Detector Control System information about the status of individual sub-detectors should be used when assessing data quality, and the technicalities of producing that summary.
Abstract: At the ATLAS experiment, the Detector Control System (DCS) is used to oversee detector conditions and supervise the running of equipment. It is essential that information from the DCS about the status of individual sub-detectors be extracted and taken into account when determining the quality of data taken and its suitability for different analyses. DCS information is written to the ATLAS conditions database and then summarised to provide a status flag for each sub-detector and displayed on the web. We discuss how this DCS information should be used, and the technicalities of making this summary.
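
As a rough illustration of the summarising step described in the abstract, the sketch below reduces per-channel readings to a single worst-case status flag per sub-detector. The status names, ordering, and channel layout here are assumptions for illustration, not the ATLAS DCS schema:

```python
from enum import IntEnum

class Status(IntEnum):
    # Hypothetical flag ordering: higher value = worse condition.
    OK = 0
    WARNING = 1
    ERROR = 2

def summarise(channel_status: dict[str, Status]) -> Status:
    """A sub-detector's summary flag is its worst channel status."""
    return max(channel_status.values(), default=Status.OK)

# Invented example: two readout modules of one sub-detector.
pixel_channels = {"module_01": Status.OK, "module_02": Status.WARNING}
print(summarise(pixel_channels).name)  # WARNING
```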

12 citations

Journal ArticleDOI
TL;DR: In this paper, production yields of leptonically decaying $W^\pm$ bosons, normalised by the total number of minimum-bias events and the nuclear thickness function, are measured using data recorded by the ATLAS experiment at the LHC in 2015.
Abstract: A measurement of $W^\pm$ boson production in Pb+Pb collisions at $\sqrt{s_{\mathrm{NN}}} = 5.02~\text{TeV}$ is reported using data recorded by the ATLAS experiment at the LHC in 2015, corresponding to a total integrated luminosity of $0.49\,\mathrm{nb}^{-1}$. The $W^\pm$ bosons are reconstructed in the electron or muon leptonic decay channels. Production yields of leptonically decaying $W^\pm$ bosons, normalised by the total number of minimum-bias events and the nuclear thickness function, are measured within a fiducial region defined by the detector acceptance and the main kinematic requirements. These normalised yields are measured separately for $W^+$ and $W^-$ bosons, and are presented as a function of the absolute value of pseudorapidity of the charged lepton and of the collision centrality. The lepton charge asymmetry is also measured as a function of the absolute value of lepton pseudorapidity. In addition, nuclear modification factors are calculated using the $W^\pm$ boson production cross-sections measured in pp collisions. The results are compared with predictions based on next-to-leading-order calculations with CT14 parton distribution functions as well as with predictions obtained with the EPPS16 and nCTEQ15 nuclear parton distribution functions. No dependence of normalised production yields on centrality and a good agreement with predictions are observed for mid-central and central collisions. For peripheral collisions, the data agree with predictions within 1.7 (0.9) standard deviations for $W^-$ ($W^+$) bosons.
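
For orientation, the normalised yields and nuclear modification factors described above follow the conventional heavy-ion definitions sketched below (our notation, assuming the standard definitions rather than transcribing the paper's equations):

```latex
% Yield of W bosons per minimum-bias event, normalised by the mean
% nuclear thickness function <T_AA> of the centrality class:
N_W^{\mathrm{norm}} \;=\; \frac{N_W}{N_{\mathrm{evt}}\,\langle T_{\mathrm{AA}}\rangle}

% Nuclear modification factor: the normalised Pb+Pb yield divided by
% the pp cross-section; R_AA = 1 corresponds to no nuclear modification.
R_{\mathrm{AA}} \;=\; \frac{N_W^{\mathrm{norm}}}{\sigma_{pp \to W}}
```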

11 citations

01 May 2009
TL;DR: In this paper, the Hidden Valley scenario is used to explore the challenges that long-lived particles with long decay paths pose to the trigger and reconstruction capabilities of the ATLAS apparatus.
Abstract: Neutral particles with long decay paths that decay to many-particle final states represent, from an experimental point of view, a challenge both for the trigger and for the reconstruction capabilities of the ATLAS apparatus. The Hidden Valley scenario serves as an excellent setting for the purpose of exploring the challenges to the trigger posed by long-lived particles.
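
A toy calculation shows why the decay path matters: the lab-frame decay position follows an exponential law with mean length βγcτ, so the fraction of particles decaying inside a given detector shell, and hence the signature available to the trigger, depends strongly on the lifetime. A minimal sketch with invented numbers:

```python
import math

def decay_fraction(r_inner_m: float, r_outer_m: float,
                   beta_gamma: float, ctau_m: float) -> float:
    """Fraction of particles decaying between two radii, given the
    exponential decay law with mean lab-frame length L = beta*gamma*c*tau."""
    L = beta_gamma * ctau_m
    return math.exp(-r_inner_m / L) - math.exp(-r_outer_m / L)

# Toy example: ctau = 1 m, beta*gamma = 3 -> mean decay length 3 m.
# Fraction decaying inside a calorimeter-like shell between 2 m and 4 m:
print(f"{decay_fraction(2.0, 4.0, 3.0, 1.0):.3f}")  # ~0.250
```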

11 citations

Journal ArticleDOI
Georges Aad, Brad Abbott, Dale Charles Abbott, A. Abed Abud, +5,211 more · Institutions (274)
TL;DR: The combination of measurements of the W boson polarization in top quark decays performed by the ATLAS and CMS collaborations is presented in this paper, where the measurements are based on proton-proton collision data.
Abstract: The combination of measurements of the W boson polarization in top quark decays performed by the ATLAS and CMS collaborations is presented. The measurements are based on proton-proton collision data …

11 citations

15 Nov 2017
TL;DR: In this article, the authors presented a set of observables sensitive to the anomalous production of hadronic jets and missing momentum in the plane transverse to the proton beams at the Large Hadron Collider.
Abstract: Observables sensitive to the anomalous production of events containing hadronic jets and missing momentum in the plane transverse to the proton beams at the Large Hadron Collider are presented. The observables are defined as a ratio of cross-sections for events containing jets and large missing transverse momentum to events containing jets and a pair of charged leptons from the decay of a Z/γ* boson. This definition minimises experimental and theoretical systematic uncertainties in the measurements. This ratio is measured differentially with respect to a number of kinematic properties of the hadronic system in two phase-space regions: one inclusive single-jet region and one region sensitive to vector-boson-fusion topologies. The data are found to be in agreement with the Standard Model predictions and used to constrain a variety of theoretical models for dark-matter production, including simplified models, effective field theory models, and invisible decays of the Higgs boson. The measurements use 3.2 fb⁻¹ of proton–proton collision data recorded by the ATLAS experiment at a centre-of-mass energy of 13 TeV and are fully corrected for detector effects, meaning that the data can be used to constrain new-physics models beyond those shown in this paper.
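
Schematically, the ratio observable has the form below (notation assumed from the abstract, not the paper's exact definition):

```latex
% Numerator and denominator share the jet selections, so correlated
% experimental and theoretical uncertainties largely cancel in the ratio.
R^{\mathrm{miss}} \;=\;
  \frac{\sigma\!\left(\text{jets} + p_{\mathrm{T}}^{\mathrm{miss}}\right)}
       {\sigma\!\left(\text{jets} + \ell^{+}\ell^{-}\ \text{from}\ Z/\gamma^{*}\right)}
```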

11 citations


Cited by
Journal ArticleDOI


08 Dec 2001 · BMJ
TL;DR: There is, I think, something ethereal about i, the square root of minus one: an odd beast, an intruder hovering on the edge of reality.
Abstract: There is, I think, something ethereal about i —the square root of minus one. I remember first hearing about it at school. It seemed an odd beast at that time—an intruder hovering on the edge of reality. Usually familiarity dulls this sense of the bizarre, but in the case of i it was the reverse: over the years the sense of its surreal nature intensified. It seemed that it was impossible to write mathematics that described the real world in …

33,785 citations

Journal ArticleDOI
TL;DR: Machine learning addresses many of the same research questions as the fields of statistics, data mining, and psychology, but with differences of emphasis.
Abstract: Machine Learning is the study of methods for programming computers to learn. Computers are applied to a wide range of tasks, and for most of these it is relatively easy for programmers to design and implement the necessary software. However, there are many tasks for which this is difficult or impossible. These can be divided into four general categories. First, there are problems for which there exist no human experts. For example, in modern automated manufacturing facilities, there is a need to predict machine failures before they occur by analyzing sensor readings. Because the machines are new, there are no human experts who can be interviewed by a programmer to provide the knowledge necessary to build a computer system. A machine learning system can study recorded data and subsequent machine failures and learn prediction rules. Second, there are problems where human experts exist, but where they are unable to explain their expertise. This is the case in many perceptual tasks, such as speech recognition, hand-writing recognition, and natural language understanding. Virtually all humans exhibit expert-level abilities on these tasks, but none of them can describe the detailed steps that they follow as they perform them. Fortunately, humans can provide machines with examples of the inputs and correct outputs for these tasks, so machine learning algorithms can learn to map the inputs to the outputs. Third, there are problems where phenomena are changing rapidly. In finance, for example, people would like to predict the future behavior of the stock market, of consumer purchases, or of exchange rates. These behaviors change frequently, so that even if a programmer could construct a good predictive computer program, it would need to be rewritten frequently. A learning program can relieve the programmer of this burden by constantly modifying and tuning a set of learned prediction rules. Fourth, there are applications that need to be customized for each computer user separately. Consider, for example, a program to filter unwanted electronic mail messages. Different users will need different filters. It is unreasonable to expect each user to program his or her own rules, and it is infeasible to provide every user with a software engineer to keep the rules up-to-date. A machine learning system can learn which mail messages the user rejects and maintain the filtering rules automatically. Machine learning addresses many of the same research questions as the fields of statistics, data mining, and psychology, but with differences of emphasis. Statistics focuses on understanding the phenomena that have generated the data, often with the goal of testing different hypotheses about those phenomena. Data mining seeks to find patterns in the data that are understandable by people. Psychological studies of human learning aspire to understand the mechanisms underlying the various learning behaviors exhibited by people (concept learning, skill acquisition, strategy change, etc.).
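
As an illustration of the fourth category, a minimal sketch of a learned mail filter: a naive Bayes classifier over word counts (scikit-learn assumed available; the messages and labels are invented):

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

# Invented training data: 1 = rejected by this user, 0 = kept.
messages = ["win a free prize now", "meeting moved to 3pm",
            "free offer click now", "lunch tomorrow?"]
labels = [1, 0, 1, 0]

# Learn the user's filtering rules from examples rather than hand-coding them.
vectorizer = CountVectorizer()
X = vectorizer.fit_transform(messages)
model = MultinomialNB().fit(X, labels)

new = vectorizer.transform(["free prize meeting"])
print(model.predict(new))  # e.g. [1]: filtered as unwanted
```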

13,246 citations

Journal ArticleDOI
01 Apr 1988 · Nature
TL;DR: In this paper, a sedimentological core and petrographic characterisation of samples from eleven boreholes from the Lower Carboniferous of Bowland Basin (Northwest England) is presented.
Abstract: Deposits of clastic carbonate-dominated (calciclastic) sedimentary slope systems in the rock record have been identified mostly as linearly-consistent carbonate apron deposits, even though most ancient clastic carbonate slope deposits fit submarine fan systems better. Calciclastic submarine fans are consequently rarely described and poorly understood, and very little is known about mud-dominated calciclastic submarine fan systems in particular. Presented in this study is a sedimentological core and petrographic characterisation of samples from eleven boreholes from the Lower Carboniferous of the Bowland Basin (Northwest England) that reveals a >250 m thick calciturbidite complex deposited in a calciclastic submarine fan setting. Seven facies are recognised from core and thin-section characterisation and are grouped into three carbonate turbidite sequences: 1) calciturbidites, comprising mostly high- to low-density, wavy-laminated, bioclast-rich facies; 2) low-density densite mudstones, characterised by planar-laminated and unlaminated mud-dominated facies; and 3) calcidebrites, which are muddy or hyper-concentrated debris-flow deposits occurring as poorly-sorted, chaotic, mud-supported floatstones. These …

9,929 citations

Journal ArticleDOI
Georges Aad, T. Abajyan, Brad Abbott, Jalal Abdallah, +2,964 more · Institutions (200)
TL;DR: In this article, a search for the Standard Model Higgs boson in proton-proton collisions with the ATLAS detector at the LHC is presented, reporting an excess of events with a significance of 5.9 standard deviations, corresponding to a background fluctuation probability of 1.7 × 10⁻⁹.
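
The quoted fluctuation probability is the one-sided Gaussian tail corresponding to the significance; a quick sanity check, assuming SciPy is available:

```python
from scipy.stats import norm

p = norm.sf(5.9)       # one-sided p-value for a 5.9 sigma excess
z = norm.isf(1.7e-9)   # significance corresponding to p = 1.7e-9
print(f"p(5.9 sigma) ~ {p:.1e}")  # ~1.8e-9
print(f"z(1.7e-9)    ~ {z:.2f}")  # ~5.91, i.e. 5.9 sigma after rounding
```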

9,282 citations