scispace - formally typeset
Author

Richard Polifka

Other affiliations: University of Coimbra, Politehnica University of Bucharest, CERN
Bio: Richard Polifka is an academic researcher from Charles University in Prague. He has contributed to research on the Large Hadron Collider and the Higgs boson, has an h-index of 75, and has co-authored 336 publications receiving 28,838 citations. His previous affiliations include the University of Coimbra and the Politehnica University of Bucharest.


Papers
Journal ArticleDOI
Georges Aad, T. Abajyan, Brad Abbott, Jalal Abdallah +2,964 more (200 institutions)
TL;DR: In this article, a search for the Standard Model Higgs boson in proton-proton collisions with the ATLAS detector at the LHC is presented. The observed excess has a significance of 5.9 standard deviations, corresponding to a background-only fluctuation probability of 1.7×10−9.

9,282 citations

Journal ArticleDOI
Georges Aad, T. Abajyan, Brad Abbott, Jalal Abdallah +2,942 more (201 institutions)
TL;DR: In this paper, the spin and parity quantum numbers of the Higgs boson were studied using collision data collected by the ATLAS experiment at the LHC, and the results favour the Standard Model spin-parity assignment.

608 citations

Journal ArticleDOI
Halina Abramowicz, I. Abt, Leszek Adamczyk +325 more (55 institutions)
TL;DR: A combination of all inclusive deep inelastic cross sections previously published by the H1 and ZEUS collaborations at HERA for neutral and charged current scattering for zero beam polarisation is presented in this paper.
Abstract: A combination is presented of all inclusive deep inelastic cross sections previously published by the H1 and ZEUS collaborations at HERA for neutral and charged current $e^{\pm}p$ scattering for zero beam polarisation. The data were taken at proton beam energies of 920, 820, 575 and 460 GeV and an electron beam energy of 27.5 GeV. The data correspond to an integrated luminosity of about 1 fb$^{-1}$ and span six orders of magnitude in negative four-momentum-transfer squared, $Q^2$, and Bjorken $x$. The correlations of the systematic uncertainties were evaluated and taken into account for the combination. The combined cross sections were input to QCD analyses at leading order, next-to-leading order and at next-to-next-to-leading order, providing a new set of parton distribution functions, called HERAPDF2.0. In addition to the experimental uncertainties, model and parameterisation uncertainties were assessed for these parton distribution functions. Variants of HERAPDF2.0 with an alternative gluon parameterisation, HERAPDF2.0AG, and using fixed-flavour-number schemes, HERAPDF2.0FF, are presented. The analysis was extended by including HERA data on charm and jet production, resulting in the variant HERAPDF2.0Jets. The inclusion of jet-production cross sections made a simultaneous determination of these parton distributions and the strong coupling constant possible, resulting in $\alpha_s(M_Z)=0.1183 \pm 0.0009 {\rm(exp)} \pm 0.0005{\rm (model/parameterisation)} \pm 0.0012{\rm (hadronisation)} ^{+0.0037}_{-0.0030}{\rm (scale)}$. An extraction of $xF_3^{\gamma Z}$ and results on electroweak unification and scaling violations are also presented.

514 citations

Journal ArticleDOI
Georges Aad, T. Abajyan, Brad Abbott, Jalal Abdallah +2,942 more (200 institutions)
TL;DR: In this article, the production properties and couplings of the recently discovered Higgs boson using the decays into boson pairs were measured using the complete pp collision data sample recorded by the ATLAS experiment at the CERN Large Hadron Collider at centre-of-mass energies of 7 TeV and 8 TeV, corresponding to an integrated luminosity of about 25/fb.

513 citations

Journal ArticleDOI
Morad Aaboud, Georges Aad, Brad Abbott, Jalal Abdallah +2,845 more (197 institutions)
TL;DR: This paper presents a short overview of the changes to the trigger and data acquisition systems during the first long shutdown of the LHC and shows the performance of the trigger system and its components based on the 2015 proton–proton collision data.
Abstract: During 2015 the ATLAS experiment recorded 3.8 fb$^{-1}$ of proton-proton collision data at a centre-of-mass energy of 13 TeV. The ATLAS trigger system is a crucial component of the experiment, respons ...

488 citations


Cited by
Journal ArticleDOI


08 Dec 2001-BMJ
TL;DR: There is, I think, something ethereal about i, the square root of minus one: an odd beast, an intruder hovering on the edge of reality.
Abstract: There is, I think, something ethereal about i —the square root of minus one. I remember first hearing about it at school. It seemed an odd beast at that time—an intruder hovering on the edge of reality. Usually familiarity dulls this sense of the bizarre, but in the case of i it was the reverse: over the years the sense of its surreal nature intensified. It seemed that it was impossible to write mathematics that described the real world in …

33,785 citations

Journal ArticleDOI
TL;DR: Machine learning addresses many of the same research questions as the fields of statistics, data mining, and psychology, but with differences of emphasis.
Abstract: Machine Learning is the study of methods for programming computers to learn. Computers are applied to a wide range of tasks, and for most of these it is relatively easy for programmers to design and implement the necessary software. However, there are many tasks for which this is difficult or impossible. These can be divided into four general categories. First, there are problems for which there exist no human experts. For example, in modern automated manufacturing facilities, there is a need to predict machine failures before they occur by analyzing sensor readings. Because the machines are new, there are no human experts who can be interviewed by a programmer to provide the knowledge necessary to build a computer system. A machine learning system can study recorded data and subsequent machine failures and learn prediction rules. Second, there are problems where human experts exist, but where they are unable to explain their expertise. This is the case in many perceptual tasks, such as speech recognition, hand-writing recognition, and natural language understanding. Virtually all humans exhibit expert-level abilities on these tasks, but none of them can describe the detailed steps that they follow as they perform them. Fortunately, humans can provide machines with examples of the inputs and correct outputs for these tasks, so machine learning algorithms can learn to map the inputs to the outputs. Third, there are problems where phenomena are changing rapidly. In finance, for example, people would like to predict the future behavior of the stock market, of consumer purchases, or of exchange rates. These behaviors change frequently, so that even if a programmer could construct a good predictive computer program, it would need to be rewritten frequently. A learning program can relieve the programmer of this burden by constantly modifying and tuning a set of learned prediction rules. 
Fourth, there are applications that need to be customized for each computer user separately. Consider, for example, a program to filter unwanted electronic mail messages. Different users will need different filters. It is unreasonable to expect each user to program his or her own rules, and it is infeasible to provide every user with a software engineer to keep the rules up-to-date. A machine learning system can learn which mail messages the user rejects and maintain the filtering rules automatically. Machine learning addresses many of the same research questions as the fields of statistics, data mining, and psychology, but with differences of emphasis. Statistics focuses on understanding the phenomena that have generated the data, often with the goal of testing different hypotheses about those phenomena. Data mining seeks to find patterns in the data that are understandable by people. Psychological studies of human learning aspire to understand the mechanisms underlying the various learning behaviors exhibited by people (concept learning, skill acquisition, strategy change, etc.).
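The mail-filtering scenario above can be made concrete with a small sketch. This is illustrative only and not from the article: it assumes a naive Bayes bag-of-words classifier with add-one smoothing, and the class name, word lists, and training messages are all invented for the example.

```python
from collections import defaultdict
import math

class NaiveBayesFilter:
    """Minimal per-user mail filter: learns 'spam'/'ham' word statistics
    from the messages a user keeps or rejects (naive Bayes sketch)."""

    def __init__(self):
        self.word_counts = {"spam": defaultdict(int), "ham": defaultdict(int)}
        self.label_counts = {"spam": 0, "ham": 0}

    def train(self, message, label):
        # Record which words appear in a message with a known label.
        self.label_counts[label] += 1
        for word in set(message.lower().split()):
            self.word_counts[label][word] += 1

    def predict(self, message):
        # Score each label by log prior plus log likelihood of the words,
        # with add-one (Laplace) smoothing so unseen words do not zero out.
        total = sum(self.label_counts.values())
        scores = {}
        for label in ("spam", "ham"):
            score = math.log(self.label_counts[label] / total)
            denom = self.label_counts[label] + 2
            for word in set(message.lower().split()):
                score += math.log((self.word_counts[label][word] + 1) / denom)
            scores[label] = score
        return max(scores, key=scores.get)

f = NaiveBayesFilter()
f.train("win a free prize now", "spam")
f.train("free offer click now", "spam")
f.train("meeting agenda for monday", "ham")
f.train("lunch on monday", "ham")
print(f.predict("free prize"))  # -> spam
```

As the abstract notes, the point is that the rules are maintained automatically: each kept or rejected message is another `train` call, so the filter adapts to each user without hand-written rules.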

13,246 citations

Journal ArticleDOI
01 Apr 1988-Nature
TL;DR: In this paper, a sedimentological core and petrographic characterisation of samples from eleven boreholes from the Lower Carboniferous of Bowland Basin (Northwest England) is presented.
Abstract: Deposits of clastic carbonate-dominated (calciclastic) sedimentary slope systems in the rock record have been identified mostly as linearly-consistent carbonate apron deposits, even though most ancient clastic carbonate slope deposits better fit submarine fan systems. Calciclastic submarine fans are consequently rarely described and are poorly understood, and very little is known about mud-dominated calciclastic submarine fan systems in particular. This study presents a sedimentological core and petrographic characterisation of samples from eleven boreholes from the Lower Carboniferous of the Bowland Basin (Northwest England) that reveals a >250 m thick calciturbidite complex deposited in a calciclastic submarine fan setting. Seven facies are recognised from core and thin-section characterisation and are grouped into three carbonate turbidite sequences: 1) calciturbidites, comprising mostly high- to low-density, wavy-laminated bioclast-rich facies; 2) low-density turbidite mudstones, characterised by planar-laminated and unlaminated mud-dominated facies; and 3) calcidebrites, which are muddy or hyper-concentrated debris-flow deposits occurring as poorly-sorted, chaotic, mud-supported floatstones.

9,929 citations

Journal ArticleDOI
TL;DR: MadGraph5_aMC@NLO is presented: a computer program capable of handling the computation of tree-level and next-to-leading-order cross sections (parton-level fixed order, shower-matched, merged) in a unified framework whose defining features are flexibility, a high level of parallelisation, and human intervention limited to input physics quantities.
Abstract: We discuss the theoretical bases that underpin the automation of the computations of tree-level and next-to-leading order cross sections, of their matching to parton shower simulations, and of the merging of matched samples that differ by light-parton multiplicities. We present a computer program, MadGraph5_aMC@NLO, capable of handling all these computations — parton-level fixed order, shower-matched, merged — in a unified framework whose defining features are flexibility, high level of parallelisation, and human intervention limited to input physics quantities. We demonstrate the potential of the program by presenting selected phenomenological applications relevant to the LHC and to a 1-TeV $e^+e^-$ collider. While next-to-leading order results are restricted to QCD corrections to SM processes in the first public version, we show that from the user viewpoint no changes have to be expected in the case of corrections due to any given renormalisable Lagrangian, and that the implementation of these is well under way.

6,509 citations

Journal ArticleDOI
TL;DR: In this paper, the authors present updated leading-order, next-to-leading-order and next-to-next-to-leading-order parton distribution functions (MSTW 2008) determined from a global analysis of hard-scattering data within the standard framework of leading-twist fixed-order collinear factorisation in the $\overline{\mathrm{MS}}$ scheme.
Abstract: We present updated leading-order, next-to-leading order and next-to-next-to-leading order parton distribution functions (“MSTW 2008”) determined from global analysis of hard-scattering data within the standard framework of leading-twist fixed-order collinear factorisation in the $\overline{\mathrm{MS}}$ scheme. These parton distributions supersede the previously available “MRST” sets and should be used for the first LHC data taking and for the associated theoretical calculations. New data sets fitted include CCFR/NuTeV dimuon cross sections, which constrain the strange-quark and -antiquark distributions, and Tevatron Run II data on inclusive jet production, the lepton charge asymmetry from W decays and the Z rapidity distribution. Uncertainties are propagated from the experimental errors on the fitted data points using a new dynamic procedure for each eigenvector of the covariance matrix. We discuss the major changes compared to previous MRST fits, briefly compare to parton distributions obtained by other fitting groups, and give predictions for the W and Z total cross sections at the Tevatron and LHC.

3,546 citations