NEXT Collaboration (Simón, A., et al.), Gómez-Cadenas, J. J., Álvarez, V., Benlloch-Rodríguez, J. M., Botas, A., Cárcel, S., et al. (2017). Application and performance of an ML-EM algorithm in NEXT. J. Instrum., 12, P08009–22pp.
Abstract: The goal of the NEXT experiment is the observation of neutrinoless double beta decay in Xe-136 using a gaseous xenon TPC with electroluminescent amplification and specialized photodetector arrays for calorimetry and tracking. The NEXT Collaboration is exploring a number of reconstruction algorithms to exploit the full potential of the detector. This paper describes one of them: the Maximum Likelihood Expectation Maximization (ML-EM) method, a generic iterative algorithm to find maximum-likelihood estimates of parameters that has been applied to solve many different types of complex inverse problems. In particular, we discuss a two-dimensional version of the method in which the photosensor signals integrated over time are used to reconstruct a transverse projection of the event. First results show that, when applied to detector simulation data, the algorithm achieves nearly optimal energy resolution (better than 0.5% FWHM at the Q value of Xe-136) for events distributed over the full active volume of the TPC.
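The ML-EM method named in the abstract is the standard multiplicative fixed-point iteration for a linear Poisson inverse problem. A minimal sketch of that generic update is below; it is not the NEXT collaboration's implementation, and the system matrix `A`, the signal vector `y`, and the pixelization are illustrative assumptions:

```python
import numpy as np

def ml_em(A, y, n_iters=50):
    """Generic ML-EM iteration for a linear Poisson inverse problem.

    A : (n_sensors, n_pixels) system matrix -- assumed-known probability
        that light emitted from pixel j produces signal in sensor i.
    y : (n_sensors,) time-integrated photosensor signals.
    Returns the estimated emission image, shape (n_pixels,).
    """
    lam = np.ones(A.shape[1])            # flat, strictly positive start
    sens = A.sum(axis=0)                 # per-pixel detection sensitivity
    for _ in range(n_iters):
        expected = A @ lam               # forward projection of estimate
        ratio = y / np.maximum(expected, 1e-12)
        # Multiplicative update: preserves non-negativity automatically.
        lam *= (A.T @ ratio) / np.maximum(sens, 1e-12)
    return lam
```

Because the update is multiplicative, the estimate stays non-negative at every iteration, which is one reason ML-EM is popular for emission-tomography-style reconstructions such as the transverse-projection problem described in the abstract.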
|
Ahlburg, P., Marinas, C., et al. (2020). EUDAQ – a data acquisition software framework for common beam telescopes. J. Instrum., 15(1), P01038–30pp.
Abstract: EUDAQ is a generic data acquisition software developed for use in conjunction with common beam telescopes at charged particle beam lines. Providing high-precision reference tracks for performance studies of new sensors, beam telescopes are essential for the research and development towards future detectors for high-energy physics. As beam time is a highly limited resource, EUDAQ has been designed with reliability and ease-of-use in mind. It enables flexible integration of different independent devices under test via their specific data acquisition systems into a top-level framework. EUDAQ controls all components globally, handles the data flow centrally and synchronises and records the data streams. Over the past decade, EUDAQ has been deployed as part of a wide range of successful test beam campaigns and detector development applications.
|
Gololo, M. G. D., Carrio Argos, F., & Mellado, B. (2022). Tile Computer-on-Module for the ATLAS Tile Calorimeter Phase-II upgrades. J. Instrum., 17(6), P06020–14pp.
Abstract: The Tile PreProcessor (TilePPr) is the core element of the Tile Calorimeter (TileCal) off-detector electronics for the High-Luminosity Large Hadron Collider (HL-LHC). The TilePPr comprises FPGA-based boards to operate and read out the TileCal on-detector electronics. The Tile Computer on Module (TileCoM) mezzanine is embedded within the TilePPr to carry out three main functionalities: remote configuration of the on-detector electronics and the TilePPr FPGAs, interfacing the TilePPr with the ATLAS Trigger and Data Acquisition (TDAQ) system, and interfacing the TilePPr with the ATLAS Detector Control System (DCS) by providing monitoring data. The TileCoM is a 10-layer board with a Zynq UltraScale+ ZU2CG system-on-chip for data processing, interface components for integration with the TilePPr, and a power supply connected to the Advanced Telecommunications Computing Architecture (ATCA) carrier. An embedded CentOS Linux system is deployed on the TileCoM to implement the functionalities required for the HL-LHC. In this paper we present the hardware and firmware developments of the TileCoM system in terms of remote programming and interfacing with the ATLAS TDAQ and DCS systems.
|
ATLAS Collaboration (Aad, G., et al.), Akiot, A., Amos, K. R., Aparisi Pozo, J. A., Bailey, A. J., Bouchhar, N., et al. (2023). Fast b-tagging at the high-level trigger of the ATLAS experiment in LHC Run 3. J. Instrum., 18(11), P11006–38pp.
Abstract: The ATLAS experiment relies on real-time hadronic jet reconstruction and b-tagging to record fully hadronic events containing b-jets. These algorithms require track reconstruction, which is computationally expensive and could overwhelm the high-level-trigger farm, even at the reduced event rate that passes the ATLAS first-stage, hardware-based trigger. In LHC Run 3, ATLAS has mitigated these computational demands by introducing a fast neural-network-based b-tagger, which acts as a low-precision filter using input from hadronic jets and tracks. It runs after the hardware trigger and before the remaining high-level-trigger reconstruction. This design relies on the negligible cost of neural-network inference as compared to track reconstruction, and on the cost reduction from limiting tracking to specific regions of the detector. In the case of Standard Model HH → bb̄bb̄ production, a key signature relying on b-jet triggers, the filter lowers the input rate to the remaining high-level trigger by a factor of five at the small cost of reducing the overall signal efficiency by roughly 2%.
|
Peppa, V., Thomson, R. M., Enger, S. A., Fonseca, G. P., Lee, C. N., Lucero, J. N. E., et al. (2023). A MC-based anthropomorphic test case for commissioning model-based dose calculation in interstitial breast Ir-192 HDR brachytherapy. Med. Phys., 50(7), 4675–4687.
Abstract: Purpose: To provide the first clinical test case for commissioning of Ir-192 brachytherapy model-based dose calculation algorithms (MBDCAs) according to the AAPM TG-186 report workflow. Acquisition and Validation Methods: A computational patient phantom model was generated from a clinical multi-catheter Ir-192 HDR breast brachytherapy case. Regions of interest (ROIs) were contoured and digitized on the patient CT images, and the model was written to a series of DICOM CT images using MATLAB. The model was imported into two commercial treatment planning systems (TPSs) currently incorporating an MBDCA. Identical treatment plans were prepared using a generic Ir-192 HDR source and the TG-43-based algorithm of each TPS. This was followed by dose-to-medium-in-medium calculations using the MBDCA option of each TPS. Monte Carlo (MC) simulation was performed in the model using three different codes and information parsed from the treatment plan exported in DICOM radiation therapy (RT) format. Results were found to agree within statistical uncertainty, and the dataset with the lowest uncertainty was assigned as the reference MC dose distribution. Data Format and Usage Notes: The dataset is available online. Files include the treatment plan for each TPS in DICOM RT format, reference MC dose data in RT Dose format, as well as a guide for database users and all files necessary to repeat the MC simulations. Potential Applications: The dataset facilitates the commissioning of brachytherapy MBDCAs using TPS-embedded tools and establishes a methodology for the development of future clinical test cases. It is also useful to non-MBDCA adopters for intercomparing MBDCAs and exploring their benefits and limitations, as well as to brachytherapy researchers in need of a dosimetric and/or a DICOM RT information parsing benchmark. Limitations include specificity in terms of radionuclide, source model, clinical scenario, and MBDCA version used for its preparation.
|