
Performance of the CMS Detector During the LHC Run 2

01 Jan 2016-Acta Physica Polonica B (Jagiellonian University)-Vol. 47, Iss: 6, pp 1451
TL;DR: After the April 2015 restart of the LHC, CMS activity in the second half of the year focused on three fronts: analyses of Run 1 data, detector-upgrade studies for the future phases of the LHC, and collecting and analysing proton-proton and heavy-ion data; this paper highlights where work completed during Long Shutdown 1 improved the performance of the detector and its hardware and software infrastructure.
Abstract: With the restart of LHC operations in April 2015, Run 2 began. For the CMS Collaboration [1], the second half of the year was driven by intense activity on three main fronts: analyses based on Run 1 data, studies for the upgrade of the detector in view of the future phases of the LHC and, of course, collecting and analysing proton-proton and heavy-ion data. This paper focuses on the latter, highlighting examples where the interventions completed during the LHC Long Shutdown 1 (LS1) translated into an improvement of the performance of the CMS detector and its hardware and software infrastructure.

Summary (2 min read)

1. Introduction

  • With beams circulating in the LHC again in April 2015, Run 2 started.
  • This paper focuses on the latter, highlighting examples where the interventions completed during the LHC Long Shutdown 1 (LS1) translated into an improvement of the performance of the CMS detector and its hardware and software infrastructure.

2. Operations

  • Sustained magnet operation has been difficult since the beginning of data taking, due to an apparent build-up of contaminants in the filters, adsorbers, turbines and heat exchangers of the cold box.
  • The CMS collaboration exploited the initial data to advance in the commissioning of the detector.
  • The detector availability has been excellent with all the subsystems participating with a fraction of working channels above 97%.

3. Detector updates since LHC Run 1 and the LHC Run 2 challenge

  • Run 2 data taking differs from Run 1 in two principal aspects: the centre-of-mass energy has been increased from 8 TeV to 13 TeV and the bunch spacing went from 50 ns to 25 ns.
  • The challenge is to provide an effective pile-up (PU) mitigation mechanism, both for in-time and out-of-time pile-up, while coping with a higher physics event rate and an increased radiation level.
  • During the LHC LS1, a number of upgrades were completed, both on the hardware and the software side, in order to preserve the excellent CMS performance achieved during Run 1.
  • In the following sections the improvements in the signal reconstruction of the electromagnetic calorimeter (ECAL) and the upgrade of the Trigger and Data Acquisition systems are described.

3.1. The ECAL Performance

  • The CMS ECAL [2] is a high-resolution, hermetic, and homogeneous electromagnetic calorimeter made of 75,848 scintillating lead tungstate crystals.
  • The increased pile-up rate and reduced bunch spacing increase the probability that a single calorimeter cell is hit by particles in successive bunch crossings, and make it more difficult to differentiate contributions from preceding and trailing bunches.
  • The very precise and reproducible pulse shaping of the ECAL electronics makes it possible to fit the 10 digitized samples with additional pulse hypotheses at different bunch crossings, in order to estimate the energy of the in-time deposit and remove the out-of-time contribution.
  • All these different methods are needed to obtain the excellent energy resolution that was exploited during LHC Run 1 for new physics searches.

3.2. DAQ, Trigger, Monitoring and Computing

  • The CMS experiment has installed a two-stage upgrade to their calorimeter trigger to ensure that the trigger rates can be controlled and the thresholds can stay low, so that physics data collection will not be compromised.
  • The full HLT farm, comprising three generations of processing nodes, now provides a processing budget of about 200 ms per event at an input rate of 100 kHz.
  • The new design achieved a much better separation of responsibilities between the DAQ and DQM teams, but it required a deep re-thinking of the processing logic of the online DQM data in order to provide short-latency monitoring.
  • Computing infrastructure and software: during the first months of collisions, the CMS computing infrastructure demonstrated the ability to sustain a load as high as ∼150k jobs running in parallel.

4. Summary

  • The first few months of data taking at 13 TeV were mainly devoted to the recommissioning of the CMS detector after the LS1.
  • All the systems proved to be able to cope with the new and challenging beam conditions: higher energy and reduced bunch spacing.
  • An integrated luminosity of 2.7 fb−1 has been analyzed, not without surprises: even if not significant enough to claim a discovery, an excess in the di-photon mass spectrum around 750 GeV was revealed.
  • The first few months of data taking in 2016 will possibly give some confirmation of the 2015 results.


Available on CMS information server CMS CR -2016/055
The Compact Muon Solenoid Experiment
Mailing address: CMS CERN, CH-1211 GENEVA 23, Switzerland
Conference Report
08 April 2016 (v2, 02 May 2016)
Run2 performance of the CMS detector
Federico De Guio for the CMS Collaboration
Abstract
Short description of the Run 2 performance of the CMS detector.
Presented at Epiphany 2016 XXII Cracow EPIPHANY Conference on the Physics in LHC Run2

Performance of the CMS detector in Run 2
Federico De Guio on behalf of the CMS collaboration
CERN
Highlights of the performance of the CMS detector in the LHC Run 2.
1. Introduction
With beams circulating in the LHC again in April 2015, Run 2 started.
For the CMS [1] collaboration, the second half of the year was driven by
intense activity on three main fronts: analyses based on Run 1 data,
studies for the upgrade of the detector in view of the future phases of
the LHC and, of course, collecting and analysing proton-proton and
heavy-ion data. This paper focuses on the latter, highlighting examples
where the interventions completed during the LHC Long Shutdown 1 (LS1)
translated into an improvement of the performance of the CMS detector
and its hardware and software infrastructure.
2. Operations
Sustained magnet operation has been difficult since the beginning of
data taking, due to an apparent build-up of contaminants in the filters,
adsorbers, turbines and heat exchangers of the cold box. Besides very
intensive diagnostic measurements, which are complicated by the very
nature of cryogenic installations, an invasive programme of filter,
absorber and turbine replacement has been undertaken, using the
pre-scheduled technical stops of the LHC where possible.
The CMS collaboration exploited the initial data to advance in the com-
missioning of the detector. Continuous changes in data taking conditions
made the first period quite demanding in terms of organization for online
and offline operation teams, with frequent changes of trigger menus, cali-
brations, etc.
Presented at XXII Cracow EPIPHANY Conference on the Physics in LHC Run2, 7-9
Jan 2016, Institute of Nuclear Physics IFJ PAN, Krakow (Poland)

Wednesday November 4th marked the end of the high-energy 2015 proton
run. During the full 2015 data-taking period, the LHC delivered
4.1 fb−1, of which 3.7 fb−1 were recorded by CMS. Out of these,
2.7 fb−1 have been taken at full field, B = 3.8 T, and are usable for
analysis. The detector availability has been excellent, with all the
subsystems participating with a fraction of working channels above 97%.
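As a quick sanity check, the fractions implied by these luminosity figures can be computed directly; this is simple arithmetic on the numbers quoted above, with illustrative variable names:

```python
# 2015 luminosity figures quoted in the text, in fb^-1
delivered = 4.1   # delivered by the LHC
recorded = 3.7    # recorded by CMS
full_field = 2.7  # recorded at full field (B = 3.8 T), usable for analysis

recording_eff = recorded / delivered   # fraction of delivered luminosity recorded
usable_frac = full_field / delivered   # fraction usable for physics analysis

print(f"recording efficiency: {recording_eff:.0%}")  # prints "recording efficiency: 90%"
print(f"usable fraction:      {usable_frac:.0%}")    # prints "usable fraction:      66%"
```

That is, CMS recorded about 90% of the delivered luminosity, and roughly two thirds of the delivered luminosity was taken at full magnetic field.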
Fig. 1. Cumulative curves for the luminosity delivered by LHC (azure), recorded
by CMS (orange) and certified as good for physics analysis during stable beams
(light orange). The green histogram shows the recorded luminosity while CMS was
taking data with full magnetic field (3.8 T).
3. Detector updates since LHC Run 1 and the LHC Run 2
challenge
Run 2 data taking differs from Run 1 in two principal aspects: the
centre-of-mass energy has been increased from 8 TeV to 13 TeV and the
bunch spacing went from 50 ns to 25 ns. The challenge is to provide an
effective pile-up (PU) mitigation mechanism, both for in-time and
out-of-time pile-up, while coping with a higher physics event rate and an
increased radiation level. With this in mind, during the LHC LS1 a
number of upgrades were completed, both on the hardware and the software
side, in order to preserve the excellent CMS performance achieved during
Run 1. In the following sections the improvements in the signal reconstruction of

the electromagnetic calorimeter (ECAL) and the upgrade of the Trigger and
Data Acquisition systems are described.
3.1. The ECAL Performance
The CMS ECAL [2] is a high-resolution, hermetic, and homogeneous
electromagnetic calorimeter made of 75,848 scintillating lead tungstate
crystals. An important challenge for CMS ECAL operation in LHC Run 2 is
the increased rate of PU collisions and the reduced LHC bunch spacing of
25 ns. These increase the probability that a single calorimeter cell is
hit by particles in successive bunch crossings, and make it more
difficult to differentiate contributions from preceding and trailing
bunches. The pulse from each crystal is sampled every 25 ns and a buffer
of 10 digitized values is used to reconstruct the energy deposit. The
very precise and reproducible pulse shaping of the ECAL electronics
makes it possible to fit the 10 digitized samples with additional pulse
hypotheses at different bunch crossings, in order to estimate the energy
of the in-time deposit and remove the out-of-time contribution. The
described method proved to be very effective in measuring the amplitude
of the in-time pulse shape. An example of a fitted pulse for simulated
events with 20 average pile-up interactions and 25 ns bunch spacing is
reported in Fig. 2.
Fig. 2. ECAL barrel pulse shape: dots represent the 10 digitized samples; the red
distribution represents the fitted in-time pulse and the other, lighter colours the
fitted out-of-time pulses with positive amplitude. The dark blue histogram
represents the sum of all the fitted contributions.
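The multi-pulse fit described above can be illustrated with a toy sketch: one shifted pulse template per bunch-crossing hypothesis is fitted simultaneously to the 10 samples, with all amplitudes constrained to be non-negative. This is a minimal illustration, not the actual CMS reconstruction code; the pulse parametrization, timing values and amplitudes below are all invented for the example.

```python
import numpy as np
from scipy.optimize import nnls

def pulse_template(t, t0, rise, decay):
    """Idealized unit-peak pulse shape starting at t0
    (hypothetical parametrization, not the measured ECAL pulse)."""
    dt = t - t0
    out = np.where(dt > 0, (dt / rise) * np.exp(-dt / decay), 0.0)
    peak = out.max()
    return out / peak if peak > 0 else out

samples_t = np.arange(10) * 25.0      # 10 samples, one every 25 ns
bx_offsets = np.arange(-2, 3) * 25.0  # in-time and 4 out-of-time hypotheses
t0_intime = 75.0                      # assumed in-time pulse start

# one template (column) per bunch-crossing hypothesis
templates = np.column_stack([
    pulse_template(samples_t, t0_intime + off, rise=25.0, decay=60.0)
    for off in bx_offsets
])

# toy waveform: in-time amplitude 5.0 plus an out-of-time pulse from
# the preceding bunch crossing with amplitude 1.5 (no noise)
true_amps = np.array([0.0, 1.5, 5.0, 0.0, 0.0])
waveform = templates @ true_amps

# simultaneous non-negative least-squares fit of all hypotheses
fitted_amps, residual = nnls(templates, waveform)
in_time_amp = fitted_amps[2]  # amplitude of the offset-zero (in-time) hypothesis
```

In the real detector the templates are measured pulse shapes, the samples carry correlated noise, and the fit is a weighted minimization, but the non-negativity constraint on the per-crossing amplitudes plays the same role as in this sketch.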
The calibration of the CMS ECAL relies on physics references such as
the di-photon invariant mass of neutral meson decays (π⁰ and η into γγ), the

ratio between the tracker-based momentum and the ECAL reconstructed
energy for electrons from Z and W decays, the di-electron invariant mass
of Z decays, as well as the azimuthal symmetry of the energy flow in
minimum bias events. All these different methods are needed to obtain
the excellent energy resolution that was exploited during LHC Run 1 for
new physics searches. The triggers, data flow and calibration procedures
for all methods have been optimized for operation at LHC Run 2, and the
analysis of the data collected in 2015 confirmed that an energy
resolution close to 1% in the central barrel is within reach, as shown
in Fig. 3.
Fig. 3. Relative electron (ECAL) energy resolution unfolded in bins of
pseudorapidity η for the barrel and the endcaps. Electrons from
Z → e+e− decays are used. The resolution is shown for low-bremsstrahlung
electrons (R9 > 0.94, with R9 = E3×3/ESupercluster). The resolution σE/E
is extracted from an unbinned likelihood fit to Z → e+e− events, using a
Breit-Wigner function convoluted with a Gaussian as the signal model.
The resolution is plotted separately for data and MC events. The MC is
generated assuming the calibration precision that was achieved with the
amount of data collected in Run 1.
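The extraction described in the caption can be sketched with a toy unbinned maximum-likelihood fit: a Breit-Wigner convoluted with a Gaussian is the Voigt profile, so the Gaussian width σ can be fitted directly to the di-electron mass sample. All numbers below (Z mass and width, true smearing, sample size) are toy values chosen for illustration, not CMS results.

```python
import numpy as np
from scipy.special import voigt_profile
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(42)

M_Z, GAMMA_Z = 91.19, 2.495  # Z pole mass and width in GeV (rounded values)
TRUE_SIGMA = 1.2             # toy Gaussian detector resolution in GeV

# toy di-electron masses: Breit-Wigner (Cauchy, HWHM = Gamma/2) smeared
# with a Gaussian of width TRUE_SIGMA
masses = M_Z + rng.standard_cauchy(20000) * GAMMA_Z / 2.0
masses = masses + rng.normal(0.0, TRUE_SIGMA, size=masses.size)

def nll(sigma):
    # unbinned negative log-likelihood; BW convoluted with a Gaussian
    # is exactly the Voigt profile
    pdf = voigt_profile(masses - M_Z, sigma, GAMMA_Z / 2.0)
    return -np.sum(np.log(pdf))

res = minimize_scalar(nll, bounds=(0.1, 5.0), method="bounded")
fitted_sigma = res.x  # recovers TRUE_SIGMA up to statistical precision
```

In the real measurement the fit is performed per pseudorapidity and R9 category, and the mass scale is fitted along with the resolution; the one-parameter fit above only illustrates the likelihood construction.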
3.2. DAQ, Trigger, Monitoring and Computing
An intensive programme of upgrade and consolidation covering the CMS
trigger, the data acquisition (DAQ), the data quality monitoring (DQM)
and the computing systems was carried out during LS1, with the general
goal of improving the performance of the overall infrastructure: from
doubling the High Level Trigger (HLT) bandwidth to improving the CMS

Citations
31 Aug 2018
TL;DR: The author searches for a new heavy neutral gauge boson (Z′), predicted at the TeV scale and therefore potentially produced in pp collisions at the Large Hadron Collider (LHC), in the tau-pair final state.
Abstract: New heavy neutral gauge bosons, generically referred as Z′ bosons, are predicted in several theoretical scenarios beyond the standard model (BSM), such as Grand Unification Theories (GUT), Supersymmetry (SUSY) models and Superstring models. In these kinds of scenarios, the breaking of extended symmetry usually results in an extra U(1)′ symmetry group. This additional symmetry would give rise to the existence of a Z′ boson with a mass that should be in the TeV scale and therefore should be produced in pp collisions at the Large Hadron Collider (LHC). There are some scenarios that predict a generational coupling dependence, where the Z′ boson would decay preferentially to the third generation of fermions, for instance the topcolor-assisted technicolor (TAT) models. These scenarios are a motivation to search for Z′ bosons decaying into tau pairs, which was the main goal of this PhD dissertation. If a Z′-like resonance were found to decay also in the other fermion-pair final states (ee or μμ), the search for Z′ → ττ would be also very interesting, since it would reveal the nature of the couplings. Since a Z′ boson might be produced as a result of the proton-proton collisions at the LHC, it might be observed by the CMS and ATLAS experiments as a massive resonance in the invariant mass distribution of its decay products, which, in case of the Z′ → ττ channel, are two oppositely-charged high pT taus. The search for Z′ decaying into tau-pairs involves four experimental signatures since the τ lepton can decay leptonically (τe, τμ) or hadronically (τh): τeτμ, τeτh, τμτh and τhτh. The τeτh and τμτh channels have a significant sensitivity due to the high reconstruction efficiency of light leptons in CMS and a relatively low QCD background contribution. However, the dihadronic tau channel has the best sensitivity since it has the highest expected signal yield, but it has a high background contribution coming from QCD multijet production. 
During the Run II of the LHC, the evidence of a Z′ boson has been excluded and, the CMS and ATLAS experiments have constrained its existence in a wide range of mass; in the particular case of the Z′ → ττ search, ATLAS has excluded its existence for masses below 2.42 TeV using the data collected during 2015 and 2016, while CMS has excluded it for masses below 2.1 TeV, using the data collected during 2015. In this dissertation, the search for Z′ bosons in the dihadronic tau final state performed using data collected by CMS during 2016, is presented. This data corresponds to pp collisions at centre-of-mass energy of 13 TeV, with an integrated luminosity of 35.9 fb−1. As a result of this analysis, expected exclusion limits have been established for the mass of the ZSSM and ZTAT bosons.

6 citations


Cites methods from "Performance of the CMS Detector Dur..."

  • ...As an example, the energy resolution obtained in the EB is close to 1% for all electrons that come from Z decays [59]....


Dissertation
01 Jan 2018

5 citations


Cites methods from "Performance of the CMS Detector Dur..."

  • ...The response of the detector has been measured at above 97% with the working channels [12]....



Frequently Asked Questions (1)
Q1. What are the contributions in this paper?

De Guio, on behalf of the CMS Collaboration, presents highlights of the performance of the CMS detector in LHC Run 2, as reported at the XXII Cracow EPIPHANY Conference on the Physics in LHC Run2.