|
Guadilla, V., Tain, J. L., Algora, A., Agramunt, J., Gelletly, W., Jordan, D., et al. (2018). Characterization and performance of the DTAS detector. Nucl. Instrum. Methods Phys. Res. A, 910, 79–89.
Abstract: DTAS is a segmented total absorption γ-ray spectrometer developed for the DESPEC experiment at FAIR. It is composed of up to eighteen NaI(Tl) crystals. In this work we study the performance of this detector with laboratory sources and also under real experimental conditions. We present a procedure to reconstruct offline the sum of the energy deposited in all the crystals of the spectrometer, which is complicated by the effect of NaI(Tl) light-yield non-proportionality. The use of a system to correct for time variations of the gain in individual detector modules, based on a light pulse generator, is demonstrated. We also describe an event-based method to evaluate the summing-pileup electronic distortion in segmented spectrometers. All of this allows a careful characterization of the detector with Monte Carlo simulations that is needed to calculate the response function for the analysis of total absorption gamma-ray spectroscopy data. Special attention was paid to the interaction of neutrons with the spectrometer, since they are a source of contamination in studies of beta-delayed neutron emitting nuclei.
|
|
|
ATLAS Collaboration (Aad, G., et al.), Cabrera Urban, S., Castillo Gimenez, V., Costa, M. J., Fassi, F., Ferrer, A., et al. (2013). Characterisation and mitigation of beam-induced backgrounds observed in the ATLAS detector during the 2011 proton-proton run. J. Instrum., 8, P07004–72pp.
Abstract: This paper presents a summary of beam-induced backgrounds observed in the ATLAS detector and discusses methods to tag and remove background contaminated events in data. Trigger-rate based monitoring of beam-related backgrounds is presented. The correlations of backgrounds with machine conditions, such as residual pressure in the beam-pipe, are discussed. Results from dedicated beam-background simulations are shown, and their qualitative agreement with data is evaluated. Data taken during the passage of unpaired, i.e. non-colliding, proton bunches are used to obtain background-enriched data samples. These are used to identify characteristic features of beam-induced backgrounds, which are then exploited to develop dedicated background tagging tools. These tools, based on observables in the Pixel detector, the muon spectrometer and the calorimeters, are described in detail and their efficiencies are evaluated. Finally an example of an application of these techniques to a monojet analysis is given, which demonstrates the importance of such event cleaning techniques for some new physics searches.
|
|
|
LHCb Collaboration (Aaij, R., et al.), Jashal, B. K., Martinez-Vidal, F., Oyanguren, A., Remon Alepuz, C., & Ruiz Vidal, J. (2022). Centrality determination in heavy-ion collisions with the LHCb detector. J. Instrum., 17(5), P05009–31pp.
Abstract: The centrality of heavy-ion collisions is directly related to the created medium in these interactions. A procedure to determine the centrality of collisions with the LHCb detector is implemented for lead-lead collisions at √s_NN = 5 TeV and lead-neon fixed-target collisions at √s_NN = 69 GeV. The energy deposits in the electromagnetic calorimeter are used to determine and define the centrality classes. The correspondence between the number of participants and the centrality for the lead-lead collisions is in good agreement with the correspondence found in other experiments, and the centrality measurements for the lead-neon collisions presented here are performed for the first time in fixed-target collisions at the LHC.
|
|
|
Etxebeste, A., Dauvergne, D., Fontana, M., Letang, J. M., Llosa, G., Muñoz, E., et al. (2020). CCMod: a GATE module for Compton camera imaging simulation. Phys. Med. Biol., 65(5), 055004–17pp.
Abstract: Compton cameras are gamma-ray imaging systems which have been proposed for a wide variety of applications such as medical imaging, nuclear decommissioning or homeland security. In the design and optimization of such a system Monte Carlo simulations play an essential role. In this work, we propose a generic module to perform Monte Carlo simulations and analyses of Compton Camera imaging which is included in the open-source GATE/Geant4 platform. Several digitization stages have been implemented within the module to mimic the performance of the most commonly employed detectors (e.g. monolithic blocks, pixelated scintillator crystals, strip detectors...). Time coincidence sorter and sequence coincidence reconstruction are also available in order to aim at providing modules to facilitate the comparison and reproduction of the data taken with different prototypes. All processing steps may be performed during the simulation (on-the-fly mode) or as a post-process of the output files (offline mode). The predictions of the module have been compared with experimental data in terms of energy spectra, angular resolution, efficiency and back-projection image reconstruction. Consistent results within a 3-sigma interval were obtained for the energy spectra except for low energies where small differences arise. The angular resolution measure for incident photons of 1275 keV was also in good agreement between both data sets with a value close to 13 degrees. Moreover, with the aim of demonstrating the versatility of such a tool the performance of two different Compton camera designs was evaluated and compared.
|
|
|
de Salas, P. F., Gariazzo, S., Lesgourgues, J., & Pastor, S. (2017). Calculation of the local density of relic neutrinos. J. Cosmol. Astropart. Phys., 09(9), 034–24pp.
Abstract: Nonzero neutrino masses are required by the existence of flavour oscillations, with values of the order of at least 50 meV. We consider the gravitational clustering of relic neutrinos within the Milky Way, and use the N-one-body simulation technique to compute their density enhancement factor in the neighbourhood of the Earth with respect to the average cosmic density. Compared to previous similar studies, we pushed the simulation down to smaller neutrino masses, and included an improved treatment of the baryonic and dark matter distributions in the Milky Way. Our results are important for future experiments aiming at detecting the cosmic neutrino background, such as the Princeton Tritium Observatory for Light, Early-universe, Massive-neutrino Yield (PTOLEMY) proposal. We calculate the impact of neutrino clustering in the Milky Way on the expected event rate for a PTOLEMY-like experiment. We find that the effect of clustering remains negligible for the minimal normal hierarchy scenario, while it enhances the event rate by 10 to 20% (resp. a factor 1.7 to 2.5) for the minimal inverted hierarchy scenario (resp. a degenerate scenario with 150 meV masses). Finally we compute the impact on the event rate of a possible fourth sterile neutrino with a mass of 1.3 eV.
|
|