Author

Zahari Kassabov

Bio: Zahari Kassabov is an academic researcher at the University of Cambridge. The author has contributed to research on the topics of partons and the electroweak interaction. The author has an h-index of 8 and has co-authored 16 publications receiving 1,776 citations. Previous affiliations of Zahari Kassabov include the University of Turin and the University of Milan.

Papers
Journal ArticleDOI
TL;DR: In this article, the authors provide an updated recommendation for the usage of sets of parton distribution functions (PDFs) and the assessment of PDF and PDF+$\alpha_s$ uncertainties suitable for applications at the LHC Run II.
Abstract: We provide an updated recommendation for the usage of sets of parton distribution functions (PDFs) and the assessment of PDF and PDF+$\alpha_s$ uncertainties suitable for applications at the LHC Run II. We review developments since the previous PDF4LHC recommendation, and discuss and compare the new generation of PDFs, which include substantial information from experimental data from the Run I of the LHC. We then propose a new prescription for the combination of a suitable subset of the available PDF sets, which is presented in terms of a single combined PDF set. We finally discuss tools which allow for the delivery of this combined set in terms of optimized sets of Hessian eigenvectors or Monte Carlo replicas, and their usage, and provide some examples of their application to LHC phenomenology.

1,098 citations
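
The abstract above describes delivering the combined PDF4LHC set either as Monte Carlo replicas or as optimized Hessian eigenvectors. As a reading aid, here is a minimal numpy sketch of the standard master formulas for evaluating a PDF uncertainty (and adding an alpha_s variation in quadrature) from such member sets; it is not the authors' tooling, and every numerical value below is an invented placeholder.

import numpy as np

def mc_uncertainty(replica_values):
    # Monte Carlo delivery: central value = replica mean, uncertainty = replica standard deviation.
    replica_values = np.asarray(replica_values)
    return replica_values.mean(), replica_values.std(ddof=1)

def hessian_uncertainty(central, eigenvector_values):
    # Symmetric Hessian delivery: uncertainty = sqrt(sum_k (F_k - F_0)^2).
    shifts = np.asarray(eigenvector_values) - central
    return central, np.sqrt(np.sum(shifts ** 2))

def with_alphas(pdf_unc, obs_alphas_up, obs_alphas_down):
    # PDF+alpha_s uncertainty: add half of the alpha_s up/down spread in quadrature.
    return np.hypot(pdf_unc, 0.5 * abs(obs_alphas_up - obs_alphas_down))

rng = np.random.default_rng(0)
toy_replicas = 50.0 + rng.normal(0.0, 1.5, size=100)      # invented cross-section values in pb
central_mc, unc_mc = mc_uncertainty(toy_replicas)
print(f"MC delivery:      {central_mc:.2f} +- {unc_mc:.2f} pb")

toy_eigenvectors = 50.0 + rng.normal(0.0, 0.3, size=30)   # invented eigenvector-member values
_, unc_hessian = hessian_uncertainty(50.0, toy_eigenvectors)
print(f"Hessian delivery: 50.00 +- {unc_hessian:.2f} pb")

print(f"PDF+alpha_s:      +- {with_alphas(unc_mc, 51.0, 49.4):.2f} pb")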

Journal ArticleDOI
TL;DR: The methodology is developed, and it is shown that, when applied to a large Monte Carlo PDF set obtained as a combination of several underlying sets, the methodology leads to a Hessian representation in terms of a rather smaller set of parameters (MC-H PDFs), thereby providing an alternative implementation of the recently suggested Meta-PDF idea.
Abstract: We develop a methodology for the construction of a Hessian representation of Monte Carlo sets of parton distributions, based on the use of a subset of the Monte Carlo PDF replicas as an unbiased linear basis, and of a genetic algorithm for the determination of the optimal basis. We validate the methodology by first showing that it faithfully reproduces a native Monte Carlo PDF set (NNPDF3.0), and then that, if applied to a Hessian PDF set (MMHT14) which was transformed into a Monte Carlo set, it gives back the starting PDFs with minimal information loss. We then show that, when applied to a large Monte Carlo PDF set obtained as a combination of several underlying sets, the methodology leads to a Hessian representation in terms of a rather smaller set of parameters (MC-H PDFs), thereby providing an alternative implementation of the recently suggested Meta-PDF idea and a Hessian version of the recently suggested PDF compression algorithm (CMC-PDFs). The mc2hessian conversion code is made publicly available together with (through LHAPDF6) a Hessian representation of the NNPDF3.0 set and the MC-H PDF set.

102 citations

Journal ArticleDOI
TL;DR: In this article, a methodology for the construction of a Hessian representation of Monte Carlo sets of parton distributions, based on the use of a subset of the Monte Carlo PDF replicas as an unbiased linear basis, and of a genetic algorithm for the determination of the optimal basis, is presented.
Abstract: We develop a methodology for the construction of a Hessian representation of Monte Carlo sets of parton distributions, based on the use of a subset of the Monte Carlo PDF replicas as an unbiased linear basis, and of a genetic algorithm for the determination of the optimal basis. We validate the methodology by first showing that it faithfully reproduces a native Monte Carlo PDF set (NNPDF3.0), and then that, if applied to a Hessian PDF set (MMHT14) which was transformed into a Monte Carlo set, it gives back the starting PDFs with minimal information loss. We then show that, when applied to a large Monte Carlo PDF set obtained as a combination of several underlying sets, the methodology leads to a Hessian representation in terms of a rather smaller set of parameters (CMC-H PDFs), thereby providing an alternative implementation of the recently suggested Meta-PDF idea and a Hessian version of the recently suggested PDF compression algorithm (CMC-PDFs). The mc2hessian conversion code is made publicly available together with (through LHAPDF6) a Hessian representation of the NNPDF3.0 set and the CMC-H PDF set.

92 citations
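
Both records above describe the same methodology: replica differences of a Monte Carlo PDF set are expanded in a linear basis made of a subset of the replicas, and orthogonal Hessian eigenvectors are then built from that basis. The numpy sketch below illustrates only this core linear-algebra step, under simplifying assumptions: the basis is picked at random (the paper selects it with a genetic algorithm), and the toy "replicas" are low-rank correlated noise standing in for real PDF grids. It is an illustration of the idea, not the mc2hessian code.

import numpy as np

rng = np.random.default_rng(1)
n_rep, n_grid, n_basis = 100, 500, 20        # replicas, (x, Q, flavour) grid points, basis size

# Toy replicas with low-rank correlated fluctuations, standing in for the fact that
# real PDF replicas are smooth, strongly correlated functions of x and Q.
modes = rng.normal(size=(10, n_grid))
replicas = 1.0 + 0.05 * rng.normal(size=(n_rep, 10)) @ modes

central = replicas.mean(axis=0)
diffs = replicas - central                   # d_k = f_k - f_0, shape (n_rep, n_grid)

# Random subset of replicas used as the linear basis (the paper optimizes this
# choice with a genetic algorithm instead).
basis = diffs[rng.choice(n_rep, size=n_basis, replace=False)]

# Least-squares coefficients expressing every replica difference in that basis.
coeffs, *_ = np.linalg.lstsq(basis.T, diffs.T, rcond=None)   # shape (n_basis, n_rep)

# Orthogonal (symmetric Hessian) eigenvector directions from an SVD of the coefficients,
# normalized so that the quadrature sum reproduces the variance of the reconstructed replicas.
u, s, _ = np.linalg.svd(coeffs, full_matrices=False)
eigenvectors = (basis.T @ u) * (s / np.sqrt(n_rep - 1))      # shape (n_grid, n_basis)

mc_sigma = diffs.std(axis=0, ddof=1)
hessian_sigma = np.sqrt((eigenvectors ** 2).sum(axis=1))
# With these low-rank toy replicas the basis captures all fluctuations, so the deviation is
# at machine precision; for real sets the basis size controls the accuracy.
print("worst relative deviation from the MC uncertainty:",
      np.max(np.abs(hessian_sigma - mc_sigma) / mc_sigma))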

Journal ArticleDOI
TL;DR: In this article, the authors formulate a general approach to the inclusion of theoretical uncertainties, specifically those related to the missing higher order uncertainty (MHOU), in the determination of parton distribution functions (PDFs), and demonstrate how, under quite generic assumptions, theory uncertainties can be included as an extra contribution to the covariance matrix when determining PDFs.
Abstract: We formulate a general approach to the inclusion of theoretical uncertainties, specifically those related to the missing higher order uncertainty (MHOU), in the determination of parton distribution functions (PDFs). We demonstrate how, under quite generic assumptions, theory uncertainties can be included as an extra contribution to the covariance matrix when determining PDFs from data. We then review, clarify, and systematize the use of renormalization and factorization scale variations as a means to estimate MHOUs consistently in deep inelastic and hadronic processes. We define a set of prescriptions for constructing a theory covariance matrix using scale variations, which can be used in global fits of data from a wide range of different processes, based on choosing a set of independent scale variations suitably correlated within and across processes. We set up an algebraic framework for the choice and validation of an optimal prescription by comparing the estimate of MHOU encoded in the next-to-leading order (NLO) theory covariance matrix to the observed shifts between NLO and NNLO predictions. We perform a NLO PDF determination which includes the MHOU, assess the impact of the inclusion of MHOUs on the PDF central values and uncertainties, and validate the results by comparison to the known shift between NLO and NNLO PDFs. We finally study the impact of the inclusion of MHOUs in a global PDF determination on LHC cross-sections, and provide guidelines for their use in precision phenomenology. In addition, we also compare the results based on the theory covariance matrix formalism to those obtained by performing PDF determinations based on different scale choices.

65 citations
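
The key object in the abstract above is a theory covariance matrix built from renormalization- and factorization-scale variations and added to the experimental covariance matrix in the fit. The sketch below shows that structure in its simplest form; the shifts are toy numbers, and the plain average over variation points stands in for the specific point prescriptions and normalizations defined in the paper.

import numpy as np

def theory_covariance(central, varied):
    # S_ij = (1/N) * sum_k Delta_i(k) Delta_j(k), where Delta(k) is the shift of the
    # predictions at scale-variation point k relative to the central scales.
    deltas = np.asarray(varied) - np.asarray(central)    # shape (N_variations, N_data)
    return deltas.T @ deltas / len(deltas)

def chi2(data, theory, cov_exp, cov_theory):
    # chi2 with the theory covariance matrix added to the experimental one.
    diff = data - theory
    return diff @ np.linalg.solve(cov_exp + cov_theory, diff)

central = np.array([1.00, 2.00, 3.00])                   # toy central-scale predictions
varied = np.array([[1.05, 2.08, 3.10],                   # e.g. both scales doubled
                   [0.96, 1.93, 2.91]])                  # e.g. both scales halved
S = theory_covariance(central, varied)
C = np.diag([0.02, 0.03, 0.05]) ** 2                     # toy experimental covariance
data = np.array([1.02, 2.05, 3.01])

print("theory covariance matrix:")
print(S)
print("chi2 with MHOU included:", chi2(data, central, C, S))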


Cited by
Proceedings ArticleDOI
01 Jan 2007
TL;DR: In this paper, a preliminary set of updated NLO parton distributions is presented, including for the first time a quantitative extraction of the strange quark and antiquark distributions and their uncertainties from CCFR and NuTeV dimuon cross sections, with additional jet data from HERA and the Tevatron improving the gluon extraction.
Abstract: We present a preliminary set of updated NLO parton distributions. For the first time we have a quantitative extraction of the strange quark and antiquark distributions and their uncertainties determined from CCFR and NuTeV dimuon cross sections. Additional jet data from HERA and the Tevatron improve our gluon extraction. Lepton asymmetry data and neutrino structure functions improve the flavour separation, particularly constraining the down quark valence distribution.

1,288 citations

Journal ArticleDOI
TL;DR: In this article, the authors provide an updated recommendation for the usage of sets of parton distribution functions (PDFs) and the assessment of PDF and PDF+$\alpha_s$ uncertainties suitable for applications at the LHC Run II.
Abstract: We provide an updated recommendation for the usage of sets of parton distribution functions (PDFs) and the assessment of PDF and PDF+$\alpha_s$ uncertainties suitable for applications at the LHC Run II. We review developments since the previous PDF4LHC recommendation, and discuss and compare the new generation of PDFs, which include substantial information from experimental data from the Run I of the LHC. We then propose a new prescription for the combination of a suitable subset of the available PDF sets, which is presented in terms of a single combined PDF set. We finally discuss tools which allow for the delivery of this combined set in terms of optimized sets of Hessian eigenvectors or Monte Carlo replicas, and their usage, and provide some examples of their application to LHC phenomenology.

1,098 citations

Journal ArticleDOI
TL;DR: NNPDF3.1, as discussed by the authors, updates NNPDF3.0, the first global set of PDFs determined using a methodology validated by a closure test; the update is motivated by recent progress in methodology and available data and, on the methodological side, parametrizes and determines the charm PDF alongside the light-quark and gluon ones, thereby increasing from seven to eight the number of independent PDFs.
Abstract: We present a new set of parton distributions, NNPDF3.1, which updates NNPDF3.0, the first global set of PDFs determined using a methodology validated by a closure test. The update is motivated by recent progress in methodology and available data, and involves both. On the methodological side, we now parametrize and determine the charm PDF alongside the light-quark and gluon ones, thereby increasing from seven to eight the number of independent PDFs. On the data side, we now include the D0 electron and muon W asymmetries from the final Tevatron dataset, the complete LHCb measurements of W and Z production in the forward region at 7 and 8 TeV, and new ATLAS and CMS measurements of inclusive jet and electroweak boson production. We also include for the first time top-quark pair differential distributions and the transverse momentum of the Z bosons from ATLAS and CMS. We investigate the impact of parametrizing charm and provide evidence that the accuracy and stability of the PDFs are thereby improved. We study the impact of the new data by producing a variety of determinations based on reduced datasets. We find that both improvements have a significant impact on the PDFs, with some substantial reductions in uncertainties, but with the new PDFs generally in agreement with the previous set at the one-sigma level. The most significant changes are seen in the light-quark flavor separation, and in increased precision in the determination of the gluon. We explore the implications of NNPDF3.1 for LHC phenomenology at Run II, compare with recent LHC measurements at 13 TeV, provide updated predictions for Higgs production cross-sections and discuss the strangeness and charm content of the proton in light of our improved dataset and methodology. The NNPDF3.1 PDFs are delivered for the first time both as Hessian sets, and as optimized Monte Carlo sets with a compressed number of replicas.

1,014 citations

Journal ArticleDOI
TL;DR: A new set of parton distributions, NNPDF3.1, is presented, updating NNPDF3.0, the first global set of PDFs determined using a methodology validated by a closure test; the impact of parametrizing charm is investigated, and evidence is provided that the accuracy and stability of the PDFs are thereby improved.
Abstract: We present a new set of parton distributions, NNPDF3.1, which updates NNPDF3.0, the first global set of PDFs determined using a methodology validated by a closure test. The update is motivated by recent progress in methodology and available data, and involves both. On the methodological side, we now parametrize and determine the charm PDF alongside the light-quark and gluon ones, thereby increasing from seven to eight the number of independent PDFs. On the data side, we now include the D0 electron and muon W asymmetries from the final Tevatron dataset, the complete LHCb measurements of W and Z production in the forward region at 7 and 8 TeV, and new ATLAS and CMS measurements of inclusive jet and electroweak boson production. We also include for the first time top-quark pair differential distributions and the transverse momentum of the Z bosons from ATLAS and CMS. We investigate the impact of parametrizing charm and provide evidence that the accuracy and stability of the PDFs are thereby improved. We study the impact of the new data by producing a variety of determinations based on reduced datasets. We find that both improvements have a significant impact on the PDFs, with some substantial reductions in uncertainties, but with the new PDFs generally in agreement with the previous set at the one-sigma level. The most significant changes are seen in the light-quark flavor separation, and in increased precision in the determination of the gluon. We explore the implications of NNPDF3.1 for LHC phenomenology at Run II, compare with recent LHC measurements at 13 TeV, provide updated predictions for Higgs production cross-sections and discuss the strangeness and charm content of the proton in light of our improved dataset and methodology. The NNPDF3.1 PDFs are delivered for the first time both as Hessian sets, and as optimized Monte Carlo sets with a compressed number of replicas.

921 citations
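
Both NNPDF3.1 records above note that the set is delivered as Hessian and as compressed Monte Carlo sets. As a usage illustration, assuming the LHAPDF6 Python bindings and the standard NNPDF31_nnlo_as_0118 grid are installed, the following sketch evaluates the gluon for every member and estimates the Monte Carlo PDF uncertainty; the kinematic point and flavour are arbitrary examples.

import numpy as np
import lhapdf

# Load every member of the (assumed installed) NNPDF3.1 NNLO Monte Carlo set.
members = lhapdf.mkPDFs("NNPDF31_nnlo_as_0118")

x, q = 0.01, 100.0                                          # example momentum fraction and scale in GeV
gluon = np.array([pdf.xfxQ(21, x, q) for pdf in members])   # x*g(x, Q) for every member

# Member 0 is the central PDF; for a Monte Carlo delivery the remaining members are
# replicas and the PDF uncertainty is their standard deviation (for a Hessian delivery
# one would instead add the eigenvector shifts in quadrature, as in the master formulas
# sketched earlier on this page).
replicas = gluon[1:]
print(f"xg(x={x}, Q={q} GeV) = {replicas.mean():.4f} +- {replicas.std(ddof=1):.4f}")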
