|
ATLAS Collaboration (Aad, G., et al.), Cabrera Urban, S., Castillo Gimenez, V., Costa, M. J., Fassi, F., Ferrer, A., et al. (2014). Monitoring and data quality assessment of the ATLAS liquid argon calorimeter. J. Instrum., 9, P07024–55pp.
Abstract: The liquid argon calorimeter is a key component of the ATLAS detector installed at the CERN Large Hadron Collider. The primary purpose of this calorimeter is the measurement of electron and photon kinematic properties. It also provides a crucial input for measuring jets and missing transverse momentum. An advanced data monitoring procedure was designed to quickly identify issues that would affect detector performance and to ensure that only the best quality data are used for physics analysis. This article presents the validation procedure developed during the 2011 and 2012 LHC data-taking periods, in which more than 98% of the proton-proton luminosity recorded by ATLAS at centre-of-mass energies of 7 and 8 TeV had calorimeter data quality suitable for physics analysis.
|
|
|
Garcia, A. R., Martinez, T., Cano-Ott, D., Castilla, J., Guerrero, C., Marin, J., et al. (2012). MONSTER: a time of flight spectrometer for beta-delayed neutron emission measurements. J. Instrum., 7, C05012–12pp.
Abstract: The knowledge of the beta-decay properties of nuclei contributes decisively to our understanding of nuclear phenomena: the beta-delayed neutron emission of neutron-rich nuclei plays an important role in r-process nucleosynthesis and constitutes a probe of the nuclear structure of very neutron-rich nuclei, providing information about the high-energy part of the full beta-strength (S-beta) function. In addition, beta-delayed neutrons are essential for the control and safety of nuclear reactors. In order to determine the neutron energy spectra and emission probabilities of neutron precursors, a MOdular Neutron time-of-flight SpectromeTER (MONSTER) has been proposed for the DESPEC experiment at the future FAIR facility. The design of MONSTER and the status of its construction are reported in this work.
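In a time-of-flight spectrometer of this kind, the neutron kinetic energy follows from the measured flight time over a known path length. As a minimal sketch of that kinematic relation (not MONSTER's actual analysis code; the path length and time values below are illustrative assumptions), the relativistic conversion is:

```python
import math

M_N_MEV = 939.565        # neutron rest-mass energy [MeV]
C_M_PER_NS = 0.299792458 # speed of light [m/ns]

def neutron_energy_mev(flight_path_m: float, tof_ns: float) -> float:
    """Relativistic neutron kinetic energy from time of flight."""
    beta = flight_path_m / (C_M_PER_NS * tof_ns)
    if not 0.0 < beta < 1.0:
        raise ValueError("unphysical time of flight")
    gamma = 1.0 / math.sqrt(1.0 - beta * beta)
    return M_N_MEV * (gamma - 1.0)

# e.g. a 2 m flight path and a 100 ns flight time give roughly 2.1 MeV
energy = neutron_energy_mev(2.0, 100.0)
```

For MeV-scale beta-delayed neutrons the relativistic correction to the classical formula is small, but keeping the exact expression costs nothing.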
|
|
|
Super-Kamiokande Collaboration (Abe, K., et al.), & Molina Sedgwick, S. (2022). Neutron tagging following atmospheric neutrino events in a water Cherenkov detector. J. Instrum., 17(10), P10029–41pp.
Abstract: We present the development of neutron-tagging techniques in Super-Kamiokande IV using a neural network analysis. The detection efficiency of neutron capture on hydrogen is estimated to be 26%, with a mis-tag rate of 0.016 per neutrino event. The uncertainty of the tagging efficiency is estimated to be 9.0%. Measurement of the tagging efficiency with data from an Americium-Beryllium calibration source agrees with this value within 10%. The tagging procedure was performed on 3,244.4 days of SK-IV atmospheric neutrino data, identifying 18,091 neutrons in 26,473 neutrino events. The fitted neutron capture lifetime was measured as 218 ± 9 μs.
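The quoted capture lifetime (218 ± 9 μs) comes from the distribution of delays between the neutrino event and the tagged capture. As an illustrative sketch only (not the collaboration's fit, which must also handle the mis-tag background), a pure-exponential unbinned maximum-likelihood estimate of the lifetime reduces to the mean delay above any threshold:

```python
import random

def fit_capture_lifetime(dt_us, t_min=0.0):
    """Unbinned ML lifetime for a pure exponential: for delays above a
    threshold t_min, the estimator is the mean excess delay."""
    sel = [t for t in dt_us if t >= t_min]
    return sum(sel) / len(sel) - t_min

# toy data drawn with a true lifetime of 218 us
random.seed(1)
sample = [random.expovariate(1.0 / 218.0) for _ in range(50000)]
tau_hat = fit_capture_lifetime(sample)
```

With 18,091 tagged neutrons the statistical error on such a mean is a few μs, consistent in order of magnitude with the quoted ±9 μs once backgrounds and efficiencies enter.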
|
|
|
Ortega, P. G., Torres-Espallardo, I., Cerutti, F., Ferrari, A., Gillam, J. E., Lacasta, C., et al. (2015). Noise evaluation of Compton camera imaging for proton therapy. Phys. Med. Biol., 60(5), 1845–1863.
Abstract: Compton cameras have emerged as an alternative for real-time dose monitoring techniques in Particle Therapy (PT), based on the detection of prompt gammas. As a consequence of the Compton scattering process, the gamma origin point can be restricted to the surface of a cone (Compton cone). Through image reconstruction techniques, the distribution of the gamma emitters can be estimated, using cone-surface backprojections of the Compton cones through the image space, along with more sophisticated statistical methods to improve the image quality. To calculate the Compton cone required for image reconstruction, either two interactions, the last being photoelectric absorption, or three scatter interactions are needed. Because of the high energy of the photons in PT, the first option might not be adequate, as the photon is in general not absorbed; however, the second option is less efficient. That is the reason to resort to spectral reconstructions, where the incoming energy is considered as a variable in the reconstruction inverse problem. Along with prompt gammas, secondary neutrons and scattered photons, which are not strongly correlated with the dose map, can also reach the imaging detector and produce false events. These events deteriorate the image quality. Also, high-intensity beams can produce particle accumulation in the camera, which leads to an increase of random coincidences, i.e. events which gather measurements from different incoming particles. The noise scenario is expected to be different if double or triple events are used, and consequently the reconstructed images can be affected differently by spurious data. The aim of the present work is to study the effect of false events on the reconstructed image, evaluating their impact on the determination of the beam particle ranges. A simulation study that includes misidentified events (neutrons and random coincidences) in the final image of a Compton telescope for PT monitoring is presented.
The complete chain of detection, from the beam particle entering a phantom to the event classification, is simulated using FLUKA. The particle range is then estimated from the reconstructed image obtained with two- and three-event algorithms based on Maximum Likelihood Expectation Maximization. The neutron background and the random coincidences due to a therapeutic-like time structure are analyzed for mono-energetic proton beams. The time structure of the beam, which affects the rate of particles entering the detector, is included in the simulations.
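The Maximum Likelihood Expectation Maximization (MLEM) update used in such reconstructions has a standard multiplicative form: each voxel intensity is rescaled by the backprojected ratio of measured to forward-projected counts, normalized by the voxel sensitivity. A minimal dense-matrix sketch (the system matrix and toy numbers below are assumptions for illustration; real Compton-camera reconstruction builds the system response from the cone geometry):

```python
import numpy as np

def mlem(system, counts, n_iter=50):
    """Standard MLEM update: lam <- (lam / sens) * A^T (y / (A lam))."""
    n_vox = system.shape[1]
    lam = np.ones(n_vox)               # flat initial image
    sens = system.sum(axis=0)          # per-voxel sensitivity
    for _ in range(n_iter):
        proj = system @ lam            # forward projection
        ratio = counts / np.maximum(proj, 1e-12)
        lam *= (system.T @ ratio) / np.maximum(sens, 1e-12)
    return lam

# toy check: a well-conditioned 2-voxel system with noiseless counts
# should recover the true emission map
A = np.array([[1.0, 0.2],
              [0.2, 1.0]])
truth = np.array([5.0, 1.0])
img = mlem(A, A @ truth)
```

The multiplicative form guarantees non-negative images, which is one reason MLEM is the workhorse for emission tomography problems like this one.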
|
|
|
Double Chooz Collaboration (Abrahao, T., et al.), & Novella, P. (2018). Novel event classification based on spectral analysis of scintillation waveforms in Double Chooz. J. Instrum., 13, P01031–26pp.
Abstract: Liquid scintillators are a common choice for neutrino physics experiments, but their capability to perform background rejection by scintillation pulse-shape discrimination is generally limited in large detectors. This paper describes a novel approach for pulse-shape-based event classification developed in the context of the Double Chooz reactor antineutrino experiment. Unlike previous implementations, this method uses the Fourier power spectra of the scintillation pulse shapes to obtain event-wise information. A classification variable built from spectral information was able to achieve an unprecedented performance, despite the lack of optimization at the detector design level. Several examples of event classification are provided, ranging from differentiation between the detector volumes and an efficient rejection of instrumental light noise, to some sensitivity to the particle type, such as stopping muons, ortho-positronium formation, alpha particles, as well as electrons and positrons. In combination with other techniques the method is expected to allow for a versatile and more efficient background rejection in the future, especially if detector optimization is taken into account at the design level.
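The core idea of a spectral classification variable can be sketched very simply: take the Fourier power spectrum of the digitized waveform and summarize where the power sits in frequency. The variable below (fraction of power above a cut-off bin) is an illustrative stand-in, not the paper's actual definition; the toy exponential pulses are assumed shapes:

```python
import numpy as np

def spectral_variable(waveform, split_bin=None):
    """Toy pulse-shape variable: fraction of Fourier power above a split
    frequency, with the DC component excluded. Pulses with long
    scintillation tails concentrate their power at low frequencies."""
    power = np.abs(np.fft.rfft(waveform)) ** 2
    power = power[1:]                  # drop the DC bin
    if split_bin is None:
        split_bin = len(power) // 4
    return power[split_bin:].sum() / power.sum()

t = np.arange(256, dtype=float)
fast = np.exp(-t / 5.0)    # sharp pulse: relatively more HF power
slow = np.exp(-t / 40.0)   # long tail: power pushed to low frequencies
```

Event classes with different decay-time structure then separate in this one-dimensional variable, which is the same logic the Fourier-based method exploits event by event.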
|
|