ATLAS Collaboration (Aad, G., et al.), Cabrera Urban, S., Castillo Gimenez, V., Costa, M. J., Fassi, F., Ferrer, A., et al. (2014). Monitoring and data quality assessment of the ATLAS liquid argon calorimeter. J. Instrum., 9, P07024, 55 pp.
Abstract: The liquid argon calorimeter is a key component of the ATLAS detector installed at the CERN Large Hadron Collider. The primary purpose of this calorimeter is the measurement of electron and photon kinematic properties. It also provides a crucial input for measuring jets and missing transverse momentum. An advanced data monitoring procedure was designed to quickly identify issues that would affect detector performance and ensure that only the best quality data are used for physics analysis. This article presents the validation procedure developed during the 2011 and 2012 LHC data-taking periods, in which more than 98% of the proton-proton luminosity recorded by ATLAS at a centre-of-mass energy of 7-8 TeV had calorimeter data quality suitable for physics analysis.
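A validation procedure like this ultimately reduces to per-luminosity-block bookkeeping: each block of recorded data carries quality flags, and the quoted 98% figure is a luminosity-weighted good fraction. A minimal toy sketch of that bookkeeping, with invented flags and numbers (not the actual ATLAS data-quality infrastructure):

```python
# Toy sketch of data-quality bookkeeping: each luminosity block carries a
# per-system flag, and the usable fraction is the luminosity in good blocks.
# Flags and numbers are invented for illustration.
def good_fraction(lumi_blocks):
    # lumi_blocks: list of (delivered_luminosity, calorimeter_flag) pairs.
    total = sum(lumi for lumi, _ in lumi_blocks)
    good = sum(lumi for lumi, flag in lumi_blocks if flag == "good")
    return good / total if total else 0.0

blocks = [(10.0, "good"), (10.0, "good"), (0.4, "bad")]
print(round(good_fraction(blocks), 3))  # → 0.98
```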
|
ATLAS Collaboration (Aad, G., et al.), Cabrera Urban, S., Castillo Gimenez, V., Costa, M. J., Fassi, F., Ferrer, A., et al. (2013). Characterisation and mitigation of beam-induced backgrounds observed in the ATLAS detector during the 2011 proton-proton run. J. Instrum., 8, P07004, 72 pp.
Abstract: This paper presents a summary of beam-induced backgrounds observed in the ATLAS detector and discusses methods to tag and remove background-contaminated events in data. Trigger-rate-based monitoring of beam-related backgrounds is presented. The correlations of backgrounds with machine conditions, such as the residual pressure in the beam pipe, are discussed. Results from dedicated beam-background simulations are shown, and their qualitative agreement with data is evaluated. Data taken during the passage of unpaired, i.e. non-colliding, proton bunches are used to obtain background-enriched data samples. These are used to identify characteristic features of beam-induced backgrounds, which are then exploited to develop dedicated background-tagging tools. These tools, based on observables in the Pixel detector, the muon spectrometer, and the calorimeters, are described in detail and their efficiencies are evaluated. Finally, an example of an application of these techniques to a monojet analysis is given, which demonstrates the importance of such event-cleaning techniques for some new-physics searches.
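Tagging tools of this kind reduce to cuts on discriminating observables; one characteristic handle is timing, since beam-halo deposits arrive out of time with the collision. A hypothetical, heavily simplified event-cleaning cut in that spirit (field names and thresholds are invented, not the paper's tools):

```python
# Hypothetical sketch of an event-cleaning cut in the spirit of beam-background
# tagging: events whose leading calorimeter cluster is far too early relative
# to the collision time are flagged as beam-halo-like and removed.
def is_beam_background(event, early_time_ns=-5.0):
    # event: dict with "clusters" = list of (cluster_energy, cluster_time_ns).
    clusters = sorted(event["clusters"], reverse=True)  # leading cluster first
    if not clusters:
        return False
    _, time_ns = clusters[0]
    return time_ns < early_time_ns

events = [
    {"clusters": [(120.0, 0.3), (40.0, -0.8)]},  # collision-like, in time
    {"clusters": [(200.0, -12.0)]},              # halo-like, early
]
cleaned = [e for e in events if not is_beam_background(e)]
print(len(cleaned))  # → 1
```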
|
Johannesson, G., Ruiz de Austri, R., Vincent, A. C., Moskalenko, I. V., Orlando, E., Porter, T. A., et al. (2016). Bayesian analysis of cosmic-ray propagation: evidence against homogeneous diffusion. Astrophys. J., 824(1), 16, 19 pp.
Abstract: We present the results of the most complete scan of the parameter space for cosmic-ray (CR) injection and propagation. We perform a Bayesian search of the main GALPROP parameters, using the MultiNest nested sampling algorithm, augmented by the BAMBI neural network machine-learning package. This is the first study to separate out low-mass isotopes (p, p̄, and He) from the usual light elements (Be, B, C, N, and O). We find that the propagation parameters that best fit the p, p̄, and He data are significantly different from those that fit the light elements, including the B/C and ¹⁰Be/⁹Be secondary-to-primary ratios normally used to calibrate propagation parameters. This suggests that each set of species is probing a very different interstellar medium, and that the standard approach of calibrating propagation parameters using B/C can lead to incorrect results. We present posterior distributions and best-fit parameters for the propagation of both sets of nuclei, as well as for the injection abundances of elements from H to Si. The input GALDEF files with these new parameters will be included in an upcoming public GALPROP update.
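The core of such a scan is a likelihood over propagation parameters explored under a prior, and the paper's headline result is that two species groups prefer different best-fit values. A toy grid-scan stand-in (not GALPROP or MultiNest; model and data are invented) illustrating how two datasets can pull a single diffusion index to different values:

```python
# Toy illustration (not GALPROP/MultiNest): fit a single "diffusion index"
# delta to two mock secondary-to-primary ratio datasets via a grid posterior
# scan with a flat prior. All functions and numbers are hypothetical.
import math

def model(rigidity, delta):
    # Hypothetical power-law ratio ~ R^-delta, as in simple diffusion models.
    return rigidity ** (-delta)

def log_likelihood(data, delta):
    # Gaussian log-likelihood assuming fixed 5% relative uncertainties.
    ll = 0.0
    for rigidity, y in data:
        sigma = 0.05 * y
        ll += -0.5 * ((y - model(rigidity, delta)) / sigma) ** 2
    return ll

def best_fit(data, grid):
    # With a flat prior on the grid, the posterior mode is the max-likelihood point.
    return max(grid, key=lambda delta: log_likelihood(data, delta))

# Mock "light element" data generated with delta = 0.45 ...
light = [(r, r ** -0.45) for r in (5.0, 10.0, 50.0, 100.0)]
# ... and mock "p/He" data generated with delta = 0.30.
phe = [(r, r ** -0.30) for r in (5.0, 10.0, 50.0, 100.0)]

grid = [i / 100 for i in range(10, 71)]  # delta in [0.10, 0.70]
print(best_fit(light, grid), best_fit(phe, grid))  # two distinct best fits
```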
|
Kasieczka, G., et al., & Sanz, V. (2021). The LHC Olympics 2020: a community challenge for anomaly detection in high energy physics. Rep. Prog. Phys., 84(12), 124201, 64 pp.
Abstract: A new paradigm for data-driven, model-agnostic new physics searches at colliders is emerging, and aims to leverage recent breakthroughs in anomaly detection and machine learning. In order to develop and benchmark new anomaly detection methods within this framework, it is essential to have standard datasets. To this end, we have created the LHC Olympics 2020, a community challenge accompanied by a set of simulated collider events. Participants in these Olympics have developed their methods using an R&D dataset and then tested them on black boxes: datasets with an unknown anomaly (or not). Methods made use of modern machine learning tools and were based on unsupervised learning (autoencoders, generative adversarial networks, normalizing flows), weakly supervised learning, and semi-supervised learning. This paper will review the LHC Olympics 2020 challenge, including an overview of the competition, a description of methods deployed in the competition, lessons learned from the experience, and implications for data analyses with future datasets as well as future colliders.
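Among the unsupervised approaches, autoencoders score events by reconstruction error: a model fitted on background-like events reconstructs them well, so poorly reconstructed events become anomaly candidates. A minimal linear stand-in for that idea (a one-component projection found by power iteration; toy data, not the LHC Olympics events):

```python
# Minimal sketch of the reconstruction-error idea behind autoencoder-based
# anomaly detection, using a linear one-component "autoencoder": the dominant
# axis of the background is found by power iteration, and events are scored
# by how badly the one-dimensional bottleneck reconstructs them.
import math

def top_direction(events, iters=200):
    # Power iteration on the uncentred covariance sum_e e e^T: repeatedly
    # apply it to a vector and renormalise to converge on the dominant axis.
    d = len(events[0])
    v = [1.0] * d
    for _ in range(iters):
        w = [0.0] * d
        for e in events:
            proj = sum(ei * vi for ei, vi in zip(e, v))
            for j in range(d):
                w[j] += e[j] * proj
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    return v

def reconstruction_error(e, v):
    # "Encode" to the scalar projection onto v, "decode" by scaling v back,
    # and score the event by the norm of what the bottleneck could not keep.
    proj = sum(ei * vi for ei, vi in zip(e, v))
    residual = [ei - proj * vi for ei, vi in zip(e, v)]
    return math.sqrt(sum(x * x for x in residual))

# Background-like events lie along the direction (1, 2); the "signal" does not.
background = [(x, 2 * x) for x in (1.0, 2.0, 3.0, 4.0, 5.0)]
signal = (4.0, -1.0)

v = top_direction(background)
scores = [reconstruction_error(e, v) for e in background]
print(max(scores), reconstruction_error(signal, v))
```

Background scores come out near zero while the off-axis signal event scores far higher, which is the separation an autoencoder-based search cuts on.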
|
CALICE Collaboration (Lai, S., et al.), & Irles, A. (2024). Software compensation for highly granular calorimeters using machine learning. J. Instrum., 19(4), P04037, 28 pp.
Abstract: A neural network for software compensation was developed for the highly granular CALICE Analogue Hadronic Calorimeter (AHCAL). The neural network uses spatial, temporal, and energy information from the AHCAL, which is expected to improve sensitivity to the shower development and to the neutron fraction of the hadron shower. The method produced a depth-dependent energy weighting and a time-dependent threshold that enhance energy deposits consistent with the timescale of evaporation neutrons. It was also observed to learn an energy weighting indicative of a longitudinal leakage correction. Finally, the method produced a linear detector response and outperformed a published control method in resolution for every particle energy studied.
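Software compensation is, at heart, a weighting scheme: hadron-like deposits (low energy density, neutron-rich) are upweighted relative to dense electromagnetic ones so that the summed response is flattened. A deliberately crude sketch of that weighting (not the CALICE network; weights, units, and the density cut are invented):

```python
# Toy sketch of the software-compensation idea (not the CALICE neural network):
# hits with low energy density, typical of the hadronic component, receive a
# larger weight than dense electromagnetic hits, flattening the response.
def compensated_energy(hits, em_weight=1.0, had_weight=1.3, density_cut=5.0):
    # hits: list of (energy, energy_density) pairs in arbitrary units.
    total = 0.0
    for energy, density in hits:
        weight = em_weight if density > density_cut else had_weight
        total += weight * energy
    return total

# A shower whose hadronic component would otherwise be undercounted:
shower = [(10.0, 8.0), (10.0, 8.0), (10.0, 2.0)]  # two EM-like, one hadron-like
print(compensated_energy(shower))  # → 33.0, vs 30.0 unweighted
```

A neural network generalises this by learning the weights as continuous functions of depth, time, and density instead of a hard threshold.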
|