scispace - formally typeset
Author

Yang Gao

Bio: Yang Gao is an academic researcher from the University of Surrey. The author has contributed to research in topics: Large Hadron Collider & Higgs boson. The author has an h-index of 168, has co-authored 2047 publications, and has received 146,301 citations. Previous affiliations of Yang Gao include China Agricultural University & University of Kassel.


Papers
Journal ArticleDOI
Georges Aad, Brad Abbott, Jalal Abdallah, S. Abdel Khalek, +2926 more (197 institutions)
TL;DR: In this article, the authors performed searches for heavy long-lived charged particles using a data sample of 19.1 fb⁻¹ from proton-proton collisions at a centre-of-mass energy of √s = 8 TeV collected by the ATLAS detector at the Large Hadron Collider.
Abstract: Searches for heavy long-lived charged particles are performed using a data sample of 19.1 fb⁻¹ from proton-proton collisions at a centre-of-mass energy of √s = 8 TeV collected by the ATLAS detector at the Large Hadron Collider. No excess is observed above the estimated background, and limits are placed on the mass of long-lived particles in various supersymmetric models. Long-lived tau sleptons in models with gauge-mediated symmetry breaking are excluded up to masses between 440 and 385 GeV for tan β between 10 and 50, with a 290 GeV limit in the case where only direct tau slepton production is considered. In the context of simplified LeptoSUSY models, where sleptons are stable and have a mass of 300 GeV, squark and gluino masses are excluded up to 1500 and 1360 GeV, respectively. Directly produced charginos, in simplified models where they are nearly degenerate with the lightest neutralino, are excluded up to a mass of 620 GeV. R-hadrons, composites containing a gluino, bottom squark or top squark, are excluded up to masses of 1270, 845 and 900 GeV, respectively, using the full detector, and up to 1260, 835 and 870 GeV using an approach disregarding information from the muon spectrometer.

129 citations

Journal ArticleDOI
Georges Aad, Brad Abbott, Jalal Abdallah, S. Abdel Khalek, +3038 more (175 institutions)
TL;DR: In this article, a measurement of jet activity in tt̄ events produced in proton-proton collisions at a centre-of-mass energy of 7 TeV is presented, using 2.05 fb⁻¹ of integrated luminosity collected by the ATLAS detector at the Large Hadron Collider.
Abstract: A measurement of the jet activity in tt̄ events produced in proton-proton collisions at a centre-of-mass energy of 7 TeV is presented, using 2.05 fb⁻¹ of integrated luminosity collected by the ATLAS detector at the Large Hadron Collider. The tt̄ events are selected in the dilepton decay channel with two identified b-jets from the top quark decays. Events are vetoed if they contain an additional jet with transverse momentum above a threshold in a central rapidity interval. The fraction of events surviving the jet veto is presented as a function of this threshold for four different central rapidity interval definitions. An alternative measurement is also performed, in which events are vetoed if the scalar transverse momentum sum of the additional jets in each rapidity interval is above a threshold. In both measurements, the data are corrected for detector effects and compared to the theoretical models implemented in MC@NLO, POWHEG, ALPGEN and SHERPA. The experimental uncertainties are often smaller than the spread of theoretical predictions, allowing deviations between data and theory to be observed in some regions of phase space.
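The "fraction of events surviving the jet veto" described above (often called the gap fraction) is simple to state concretely. The sketch below is an illustrative toy, not the ATLAS analysis code; the event lists and threshold values are invented for the example:

```python
def gap_fraction(events, threshold):
    """Fraction of events with no additional jet above the pT threshold (GeV).

    `events` is a list of per-event lists holding the transverse momenta of
    the additional jets falling in the chosen rapidity interval.
    """
    passed = sum(1 for jets in events if all(pt < threshold for pt in jets))
    return passed / len(events)

# Toy sample of four events (additional-jet pTs in GeV, values invented).
events = [[12.0, 45.0], [30.0], [], [80.0, 20.0]]
print(gap_fraction(events, 50.0))   # 0.75: only the 80 GeV jet fails the veto
```

The alternative measurement in the abstract replaces the per-jet condition with a cut on the scalar sum of the additional-jet pTs, i.e. `sum(jets) < threshold` in place of the `all(...)` test.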

129 citations

Journal ArticleDOI
Roel Aaij, Bernardo Adeva, Marco Adinolfi, A. A. Affolder, +693 more (62 institutions)
TL;DR: In this paper, the differential cross-section as a function of rapidity has been measured for the exclusive production of $J/\psi$ and $\psi(2S)$ mesons in proton-proton collisions at $\sqrt{s}=7$ TeV, using data collected by the LHCb experiment.
Abstract: The differential cross-section as a function of rapidity has been measured for the exclusive production of $J/\psi$ and $\psi(2S)$ mesons in proton-proton collisions at $\sqrt{s}=7$ TeV, using data collected by the LHCb experiment, corresponding to an integrated luminosity of 930 pb$^{-1}$. The cross-sections times branching fractions to two muons having pseudorapidities between 2.0 and 4.5 are measured to be $$\begin{array}{rl} \sigma_{pp\rightarrow J/\psi\rightarrow{\mu^+}{\mu^-}}(2.0<\eta_{\mu^\pm }<4.5)=&291\pm 7\pm19 {\rm \ pb},\\ \sigma_{pp\rightarrow\psi(2S)\rightarrow{\mu^+}{\mu^-}}(2.0<\eta_{\mu^\pm}<4.5)=&6.5\pm 0.9\pm 0.4 {\rm \ pb},\end{array}$$ where the first uncertainty is statistical and the second is systematic. The measurements agree with next-to-leading order QCD predictions as well as with models that include saturation effects.
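The quoted results carry separate statistical and systematic uncertainties. When a single uncertainty is wanted, the two components are commonly combined in quadrature; this is a standard convention, not a step taken in the abstract itself:

```python
import math

def combine(stat, syst):
    """Combine statistical and systematic uncertainties in quadrature."""
    return math.sqrt(stat ** 2 + syst ** 2)

# sigma(pp -> J/psi -> mu+ mu-) = 291 +/- 7 (stat) +/- 19 (syst) pb
total = combine(7.0, 19.0)
print(f"291 +/- {total:.1f} pb")   # 291 +/- 20.2 pb
```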

129 citations

Journal ArticleDOI
Roel Aaij, Gregory Ciezarek, P. Collins, G. Collazuol, +767 more (70 institutions)
TL;DR: In this paper, the data sample of Λ_b^0 → J/ψ p K⁻ decays acquired with the LHCb detector from 7 and 8 TeV pp collisions, corresponding to an integrated luminosity of 3 fb⁻¹, is inspected for the presence of J/ψ p or J/ψ K⁻ contributions with minimal assumptions about K⁻p contributions.
Abstract: The data sample of Λ_b^0 → J/ψ p K⁻ decays acquired with the LHCb detector from 7 and 8 TeV pp collisions, corresponding to an integrated luminosity of 3 fb⁻¹, is inspected for the presence of J/ψ p or J/ψ K⁻ contributions with minimal assumptions about K⁻p contributions. It is demonstrated at more than nine standard deviations that Λ_b^0 → J/ψ p K⁻ decays cannot be described with K⁻p contributions alone, and that J/ψ p contributions play a dominant role in this incompatibility. These model-independent results support the previously obtained model-dependent evidence for P_c^+ → J/ψ p charmonium-pentaquark states in the same data sample.

129 citations

Journal ArticleDOI
TL;DR: In this paper, higher-order harmonic coefficients of charged particles were analyzed using the event-plane, multiparticle-cumulant, and Lee-Yang-zeros methods, which provide different sensitivities to initial-state fluctuations.
Abstract: Measurements are presented by the CMS Collaboration at the Large Hadron Collider (LHC) of the higher-order harmonic coefficients that describe the azimuthal anisotropy of charged particles emitted in √(s_NN) = 2.76 TeV PbPb collisions. Expressed in terms of the Fourier components of the azimuthal distribution, the n = 3-6 harmonic coefficients are presented for charged particles as a function of their transverse momentum (0.3 …
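The Fourier expansion referred to in the abstract is conventionally written as follows, where the harmonic coefficients v_n quantify the azimuthal anisotropy and Ψ_n is the nth-order event-plane angle (this is the standard convention in heavy-ion physics, not a formula taken from the abstract itself):

```latex
\frac{dN}{d\phi} \propto 1 + 2\sum_{n=1}^{\infty} v_n \cos\!\big[n(\phi - \Psi_n)\big]
```

In this notation v_2 is the elliptic flow coefficient, and the n = 3-6 coefficients measured here probe initial-state fluctuations.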

128 citations


Cited by
Journal ArticleDOI


08 Dec 2001 - BMJ
TL;DR: There is, I think, something ethereal about i, the square root of minus one: an odd beast, an intruder hovering on the edge of reality.
Abstract: There is, I think, something ethereal about i —the square root of minus one. I remember first hearing about it at school. It seemed an odd beast at that time—an intruder hovering on the edge of reality. Usually familiarity dulls this sense of the bizarre, but in the case of i it was the reverse: over the years the sense of its surreal nature intensified. It seemed that it was impossible to write mathematics that described the real world in …
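The "odd beast" in question is perfectly concrete in computation. As a small aside (not part of the essay), Python writes i as the literal `1j`, and its defining arithmetic checks out directly:

```python
# i is written 1j in Python; complex-valued math lives in the cmath module.
import cmath

print((1j) ** 2)                  # (-1+0j): i squared is minus one
print(cmath.sqrt(-1))             # 1j: the principal square root of -1
print(cmath.exp(1j * cmath.pi))   # approximately -1, up to floating-point rounding
```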

33,785 citations

Journal Article
TL;DR: This book by a teacher of statistics (as well as a consultant for "experimenters") is a comprehensive study of the philosophical background for the statistical design of experiment.
Abstract: THE DESIGN AND ANALYSIS OF EXPERIMENTS. By Oscar Kempthorne. New York, John Wiley and Sons, Inc., 1952. 631 pp. $8.50. This book by a teacher of statistics (as well as a consultant for "experimenters") is a comprehensive study of the philosophical background for the statistical design of experiment. It is necessary to have some facility with algebraic notation and manipulation to be able to use the volume intelligently. The problems are presented from the theoretical point of view, without such practical examples as would be helpful for those not acquainted with mathematics. The mathematical justification for the techniques is given. As a somewhat advanced treatment of the design and analysis of experiments, this volume will be interesting and helpful for many who approach statistics theoretically as well as practically. With emphasis on the "why," and with description given broadly, the author relates the subject matter to the general theory of statistics and to the general problem of experimental inference. MARGARET J. ROBERTSON

13,333 citations

Journal ArticleDOI
TL;DR: Machine learning addresses many of the same research questions as the fields of statistics, data mining, and psychology, but with differences of emphasis.
Abstract: Machine Learning is the study of methods for programming computers to learn. Computers are applied to a wide range of tasks, and for most of these it is relatively easy for programmers to design and implement the necessary software. However, there are many tasks for which this is difficult or impossible. These can be divided into four general categories. First, there are problems for which there exist no human experts. For example, in modern automated manufacturing facilities, there is a need to predict machine failures before they occur by analyzing sensor readings. Because the machines are new, there are no human experts who can be interviewed by a programmer to provide the knowledge necessary to build a computer system. A machine learning system can study recorded data and subsequent machine failures and learn prediction rules. Second, there are problems where human experts exist, but where they are unable to explain their expertise. This is the case in many perceptual tasks, such as speech recognition, hand-writing recognition, and natural language understanding. Virtually all humans exhibit expert-level abilities on these tasks, but none of them can describe the detailed steps that they follow as they perform them. Fortunately, humans can provide machines with examples of the inputs and correct outputs for these tasks, so machine learning algorithms can learn to map the inputs to the outputs. Third, there are problems where phenomena are changing rapidly. In finance, for example, people would like to predict the future behavior of the stock market, of consumer purchases, or of exchange rates. These behaviors change frequently, so that even if a programmer could construct a good predictive computer program, it would need to be rewritten frequently. A learning program can relieve the programmer of this burden by constantly modifying and tuning a set of learned prediction rules. 
Fourth, there are applications that need to be customized for each computer user separately. Consider, for example, a program to filter unwanted electronic mail messages. Different users will need different filters. It is unreasonable to expect each user to program his or her own rules, and it is infeasible to provide every user with a software engineer to keep the rules up-to-date. A machine learning system can learn which mail messages the user rejects and maintain the filtering rules automatically. Machine learning addresses many of the same research questions as the fields of statistics, data mining, and psychology, but with differences of emphasis. Statistics focuses on understanding the phenomena that have generated the data, often with the goal of testing different hypotheses about those phenomena. Data mining seeks to find patterns in the data that are understandable by people. Psychological studies of human learning aspire to understand the mechanisms underlying the various learning behaviors exhibited by people (concept learning, skill acquisition, strategy change, etc.).
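The mail-filter scenario in the fourth category can be sketched in a few lines. This is an illustrative toy (a word-frequency log-odds score with add-one smoothing), not any particular production system; the class name, messages, and threshold are invented for the example:

```python
import math
from collections import Counter

class MailFilter:
    """Toy filter that learns word frequencies from kept vs. rejected mail."""

    def __init__(self):
        self.kept = Counter()
        self.rejected = Counter()

    def train(self, message, rejected):
        """Record the words of one message under the user's verdict."""
        target = self.rejected if rejected else self.kept
        target.update(message.lower().split())

    def score(self, message):
        """Sum of smoothed log-odds per word; positive means reject-like."""
        s = 0.0
        for w in message.lower().split():
            r = (self.rejected[w] + 1) / (sum(self.rejected.values()) + 1)
            k = (self.kept[w] + 1) / (sum(self.kept.values()) + 1)
            s += math.log(r / k)
        return s

f = MailFilter()
f.train("cheap pills buy now", rejected=True)
f.train("meeting agenda attached", rejected=False)
print(f.score("buy cheap pills") > 0)    # True: looks like rejected mail
print(f.score("meeting agenda") < 0)     # True: looks like kept mail
```

As each new verdict arrives, another `train` call updates the counts, which is exactly the "maintain the filtering rules automatically" behaviour the passage describes.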

13,246 citations

Journal ArticleDOI
Claude Amsler, Michael Doser, Mario Antonelli, D. M. Asner, +173 more (86 institutions)
TL;DR: This biennial Review summarizes much of particle physics, using data from previous editions.

12,798 citations