scispace - formally typeset
Author

Yang Gao

Bio: Yang Gao is an academic researcher at the University of Surrey. The author has contributed to research on topics including the Large Hadron Collider and the Higgs boson, has an h-index of 168, and has co-authored 2,047 publications receiving 146,301 citations. Previous affiliations of Yang Gao include China Agricultural University and the University of Kassel.


Papers
Journal ArticleDOI
Roel Aaij, Bernardo Adeva, Marco Adinolfi, C. Adrover, and 570 more (48 institutions)
TL;DR: In this article, a beam imaging method based on beam-gas and beam-beam interactions is used to determine the absolute scale of LHCb's luminosity measurements for proton-proton collisions at the LHC at a centre-of-mass energy of 7 TeV.
Abstract: Absolute luminosity measurements are of general interest for colliding-beam experiments at storage rings. These measurements are necessary to determine the absolute cross-sections of reaction processes and are valuable to quantify the performance of the accelerator. LHCb has applied two methods to determine the absolute scale of its luminosity measurements for proton-proton collisions at the LHC with a centre-of-mass energy of 7 TeV. In addition to the classic "van der Meer scan" method, a novel technique has been developed which makes use of direct imaging of the individual beams using beam-gas and beam-beam interactions. This beam imaging method is made possible by the high resolution of the LHCb vertex detector and the close proximity of the detector to the beams, and allows beam parameters such as positions, angles and widths to be determined. The results of the two methods have comparable precision and are in good agreement. Combining the two methods, an overall precision of 3.5% in the absolute luminosity determination is reached. The techniques used to transport the absolute luminosity calibration to the full data-taking period are presented.
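The combined 3.5% figure reflects the averaging of two calibrations of comparable precision. As a minimal sketch only, assuming two independent measurements combined by standard inverse-variance weighting (the paper's actual combination accounts for correlations between the methods, and the uncertainties below are hypothetical):

```python
# Inverse-variance combination of two independent measurements.
# Illustrative values only; not the paper's actual uncertainties.
def combine(measurements):
    """measurements: list of (value, absolute uncertainty) pairs."""
    weights = [1.0 / err**2 for _, err in measurements]
    wsum = sum(weights)
    mean = sum(val * w for (val, _), w in zip(measurements, weights)) / wsum
    err = wsum ** -0.5  # uncertainty of the weighted mean
    return mean, err

# Two hypothetical relative luminosity calibrations, each good to ~5%:
mean, err = combine([(1.00, 0.05), (1.00, 0.05)])
print(round(err, 4))  # prints 0.0354, i.e. ~3.5% combined precision
```

Two equally precise, independent measurements improve the precision by a factor of √2, which is how two ~5%-level calibrations can reach the ~3.5% level quoted above.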

98 citations

Journal ArticleDOI
Georges Aad, Brad Abbott, Dale Charles Abbott, Ovsat Abdinov, and 2,952 more (60 institutions)
TL;DR: In this paper, a search for a heavy charged-boson resonance decaying into a charged lepton (electron or muon) and a neutrino is reported, where the observed transverse mass distribution computed from the lepton and missing transverse momenta is consistent with the distribution expected from the Standard Model.
Abstract: A search for a heavy charged-boson resonance decaying into a charged lepton (electron or muon) and a neutrino is reported. A data sample of 139 fb−1 of proton-proton collisions at √s=13 TeV collected with the ATLAS detector at the LHC during 2015–2018 is used in the search. The observed transverse mass distribution computed from the lepton and missing transverse momenta is consistent with the distribution expected from the Standard Model, and upper limits on the cross section for pp→W′→lν are extracted (l=e or μ). These vary between 1.3 pb and 0.05 fb depending on the resonance mass in the range between 0.15 and 7.0 TeV at 95% confidence level for the electron and muon channels combined. Gauge bosons with a mass below 6.0 and 5.1 TeV are excluded in the electron and muon channels, respectively, in a model with a resonance that has couplings to fermions identical to those of the Standard Model W boson. Cross-section limits are also provided for resonances with several fixed Γ/m values in the range between 1% and 15%. Model-independent limits are derived in single-bin signal regions defined by a varying minimum transverse mass threshold. The resulting visible cross-section upper limits range between 4.6 (15) pb and 22 (22) ab as the threshold increases from 130 (110) GeV to 5.1 (5.1) TeV in the electron (muon) channel.

98 citations

Journal ArticleDOI
Georges Aad, T. Abajyan, Brad Abbott, J. Abdallah, and 2,889 more (195 institutions)
TL;DR: In this paper, a search for long-lived particles is performed using a data sample of 4.7 fb−1 of proton-proton collisions at the LHC.

98 citations

Journal ArticleDOI
TL;DR: In this paper, a search for microscopic black hole production and decay in pp collisions at a centre-of-mass energy of 7 TeV was conducted by the CMS Collaboration at the LHC, using a data sample corresponding to an integrated luminosity of 35 pb−1.

98 citations

Journal ArticleDOI
Morad Aaboud, Alexander Kupco, Samuel Webb, Timo Dreyer, and 2,958 more (58 institutions)
TL;DR: In this paper, a search for heavy charged long-lived particles was performed using a data sample of 36.1 fb−1 of proton-proton collisions at √s = 13 TeV collected by the ATLAS experiment at the Large Hadron Collider.
Abstract: A search for heavy charged long-lived particles is performed using a data sample of 36.1 fb−1 of proton-proton collisions at √s = 13 TeV collected by the ATLAS experiment at the Large Hadron Collider. The search is based on observables related to ionization energy loss and time of flight, which are sensitive to the velocity of heavy charged particles traveling significantly slower than the speed of light. Multiple search strategies for a wide range of lifetimes, corresponding to path lengths of a few meters, are defined as model independently as possible, by referencing several representative physics cases that yield long-lived particles within supersymmetric models, such as gluinos/squarks (R-hadrons), charginos and staus. No significant deviations from the expected Standard Model background are observed. Upper limits at 95% confidence level are provided on the production cross sections of long-lived R-hadrons as well as directly pair-produced staus and charginos. These results translate into lower limits on the masses of long-lived gluino, sbottom and stop R-hadrons, as well as staus and charginos, of 2000, 1250, 1340, 430, and 1090 GeV, respectively.

98 citations


Cited by
Journal ArticleDOI


08 Dec 2001 - BMJ
TL;DR: There is, I think, something ethereal about i, the square root of minus one: an odd beast, an intruder hovering on the edge of reality.
Abstract: There is, I think, something ethereal about i —the square root of minus one. I remember first hearing about it at school. It seemed an odd beast at that time—an intruder hovering on the edge of reality. Usually familiarity dulls this sense of the bizarre, but in the case of i it was the reverse: over the years the sense of its surreal nature intensified. It seemed that it was impossible to write mathematics that described the real world in …

33,785 citations

Journal Article
TL;DR: This book by a teacher of statistics (as well as a consultant for "experimenters") is a comprehensive study of the philosophical background for the statistical design of experiment.
Abstract: THE DESIGN AND ANALYSIS OF EXPERIMENTS. By Oscar Kempthorne. New York, John Wiley and Sons, Inc., 1952. 631 pp. $8.50. This book by a teacher of statistics (as well as a consultant for "experimenters") is a comprehensive study of the philosophical background for the statistical design of experiment. It is necessary to have some facility with algebraic notation and manipulation to be able to use the volume intelligently. The problems are presented from the theoretical point of view, without such practical examples as would be helpful for those not acquainted with mathematics. The mathematical justification for the techniques is given. As a somewhat advanced treatment of the design and analysis of experiments, this volume will be interesting and helpful for many who approach statistics theoretically as well as practically. With emphasis on the "why," and with description given broadly, the author relates the subject matter to the general theory of statistics and to the general problem of experimental inference. MARGARET J. ROBERTSON

13,333 citations

Journal ArticleDOI
TL;DR: Machine learning addresses many of the same research questions as the fields of statistics, data mining, and psychology, but with differences of emphasis.
Abstract: Machine Learning is the study of methods for programming computers to learn. Computers are applied to a wide range of tasks, and for most of these it is relatively easy for programmers to design and implement the necessary software. However, there are many tasks for which this is difficult or impossible. These can be divided into four general categories. First, there are problems for which there exist no human experts. For example, in modern automated manufacturing facilities, there is a need to predict machine failures before they occur by analyzing sensor readings. Because the machines are new, there are no human experts who can be interviewed by a programmer to provide the knowledge necessary to build a computer system. A machine learning system can study recorded data and subsequent machine failures and learn prediction rules. Second, there are problems where human experts exist, but where they are unable to explain their expertise. This is the case in many perceptual tasks, such as speech recognition, hand-writing recognition, and natural language understanding. Virtually all humans exhibit expert-level abilities on these tasks, but none of them can describe the detailed steps that they follow as they perform them. Fortunately, humans can provide machines with examples of the inputs and correct outputs for these tasks, so machine learning algorithms can learn to map the inputs to the outputs. Third, there are problems where phenomena are changing rapidly. In finance, for example, people would like to predict the future behavior of the stock market, of consumer purchases, or of exchange rates. These behaviors change frequently, so that even if a programmer could construct a good predictive computer program, it would need to be rewritten frequently. A learning program can relieve the programmer of this burden by constantly modifying and tuning a set of learned prediction rules. 
Fourth, there are applications that need to be customized for each computer user separately. Consider, for example, a program to filter unwanted electronic mail messages. Different users will need different filters. It is unreasonable to expect each user to program his or her own rules, and it is infeasible to provide every user with a software engineer to keep the rules up-to-date. A machine learning system can learn which mail messages the user rejects and maintain the filtering rules automatically. Machine learning addresses many of the same research questions as the fields of statistics, data mining, and psychology, but with differences of emphasis. Statistics focuses on understanding the phenomena that have generated the data, often with the goal of testing different hypotheses about those phenomena. Data mining seeks to find patterns in the data that are understandable by people. Psychological studies of human learning aspire to understand the mechanisms underlying the various learning behaviors exhibited by people (concept learning, skill acquisition, strategy change, etc.).
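The mail-filtering application in the fourth category can be made concrete with a toy sketch. This is a hypothetical illustration of learning filter rules from user-labeled examples, not the abstract's own method: it simply counts word occurrences in rejected versus accepted mail and votes on new messages by shared vocabulary.

```python
# Toy mail filter: learn per-word counts from messages the user has
# already labeled, then score new mail. Hypothetical data throughout.
from collections import Counter

def train(labeled):
    """labeled: list of (words, was_rejected) pairs."""
    rejected_words, accepted_words = Counter(), Counter()
    for words, was_rejected in labeled:
        (rejected_words if was_rejected else accepted_words).update(words)
    return rejected_words, accepted_words

def looks_rejected(words, rejected_words, accepted_words):
    # Crude vote: does the message share more vocabulary with
    # previously rejected mail than with accepted mail?
    score = sum(rejected_words[w] - accepted_words[w] for w in words)
    return score > 0

rej, acc = train([
    (["win", "prize", "now"], True),
    (["cheap", "prize"], True),
    (["meeting", "agenda"], False),
])
print(looks_rejected(["claim", "prize"], rej, acc))  # prints True
```

The point of the abstract's example survives even in this toy form: the rules are maintained automatically from the user's own accept/reject behavior, with no per-user programming.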

13,246 citations

Journal ArticleDOI
Claude Amsler, Michael Doser, Mario Antonelli, D. M. Asner, and 173 more (86 institutions)
TL;DR: This biennial Review summarizes much of particle physics, using data from previous editions.

12,798 citations