Calefice, L., Hennequin, A., Henry, L., Jashal, B. K., Mendoza, D., Oyanguren, A., et al. (2022). Effect of the high-level trigger for detecting long-lived particles at LHCb. Front. Big Data, 5, 1008737 (13pp).
Abstract: Long-lived particles (LLPs) appear in many extensions of the Standard Model, but they are challenging to search for with current detectors due to their very displaced vertices. This study evaluated the ability of the trigger algorithms used in the Large Hadron Collider beauty (LHCb) experiment to detect long-lived particles and attempted to adapt them to enhance the sensitivity of this experiment to undiscovered long-lived particles. A model with a Higgs portal to a dark sector is tested, and the sensitivity reach is discussed. In the LHCb tracking system, the tracking station farthest from the collision point is the scintillating fibre tracker, the SciFi detector. One of the challenges in track reconstruction is dealing with the large number of hits in the LHCb detector and the resulting combinatorics. A dedicated algorithm has been developed to cope with the large data output. When fully implemented, this algorithm would greatly increase the available statistics for any long-lived particle search in the forward region and would additionally improve the sensitivity of analyses dealing with long-lived Standard Model particles such as K_S^0 or Λ^0 hadrons.
Dimmock, M. R., Nikulin, D. A., Gillam, J. E., & Nguyen, C. V. (2012). An OpenCL Implementation of Pinhole Image Reconstruction. IEEE Trans. Nucl. Sci., 59(4), 1738–1749.
Abstract: A C++/OpenCL software platform for emission image reconstruction of data from pinhole cameras has been developed. The software incorporates a new, accurate but computationally costly, probability distribution function for operating on list-mode data from detector stacks. The platform architecture is more general than that of previous works, supporting advanced models such as arbitrary probability distributions, collimation geometries, and detector stack geometries. The software was implemented such that all performance-critical operations occur on OpenCL devices, generally GPUs. The performance of the software is tested on several commodity CPU and GPU devices.
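The abstract does not name the reconstruction algorithm used on the list-mode data; a common choice for list-mode emission reconstruction is an MLEM-style multiplicative update, sketched below in NumPy with a randomly generated, purely illustrative system matrix (the real platform runs these operations on OpenCL devices).

```python
import numpy as np

rng = np.random.default_rng(0)
n_events, n_voxels = 50, 8

# p[i, j]: probability that an emission from voxel j produced detected event i.
# Illustrative random values; the paper uses a detailed pinhole/detector-stack model.
p = rng.random((n_events, n_voxels))

sens = p.sum(axis=0)        # voxel sensitivities s_j
lam = np.ones(n_voxels)     # initial image estimate (uniform)

for _ in range(20):         # list-mode MLEM iterations
    fwd = p @ lam                           # expected contribution to each event
    lam *= (p.T @ (1.0 / fwd)) / sens       # multiplicative update
```

The multiplicative form keeps the estimate non-negative at every iteration, which is one reason this family of updates maps well onto simple per-voxel GPU kernels.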
Vidal, F. P., et al., & Albiol, F. (2025). X-ray simulations with gVXR in education, digital twinning, experiment planning, and data analysis. Nucl. Instrum. Methods Phys. Res. B, 568, 165804 (32pp).
Abstract: gVirtualXray (gVXR) is an open-source framework that relies on the Beer-Lambert law to simulate X-ray images in real time on a graphics processing unit (GPU) using triangular meshes. A wide range of programming languages is supported (C/C++, Python, R, Ruby, Tcl, C#, Java, and GNU Octave). Simulations generated with gVXR have been benchmarked against Monte Carlo (MC) simulations, real radiographs, real digitally reconstructed radiographs (DRRs), and X-ray computed tomography (CT), using clinically realistic phantoms (i.e., complex structures and materials). It has been used in a wide range of applications, including real-time medical simulators, proposing a new densitometric radiographic modality in clinical imaging, studying noise removal techniques in fluoroscopy, teaching particle physics and X-ray imaging to undergraduate students in engineering and X-ray computed tomography to master's students, and predicting image quality and artifacts in materials science. gVXR has also been used to produce large numbers of realistic simulated images for optimisation problems and for training machine learning algorithms. This paper presents a comprehensive review of such applications of gVXR.
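The Beer-Lambert law underlying gVXR's per-pixel attenuation model can be sketched as follows; the material coefficients and path lengths here are illustrative values, not taken from the paper (gVXR itself computes the per-material path lengths by ray-tracing triangular meshes on the GPU).

```python
import numpy as np

# Beer-Lambert law along one ray: I = I0 * exp(-sum_i mu_i * t_i)
# where mu_i is the linear attenuation coefficient of material i (1/cm)
# and t_i is the ray's path length through that material (cm).
mu = np.array([0.2, 0.5])    # illustrative coefficients, e.g. soft tissue, bone
t = np.array([10.0, 2.0])    # illustrative path lengths through each material
I0 = 1.0                     # incident beam intensity

I = I0 * np.exp(-np.sum(mu * t))   # transmitted intensity at the detector pixel
```

Because the exponent is a simple sum over materials crossed by the ray, the model evaluates in closed form per pixel, which is what makes real-time GPU simulation feasible compared with Monte Carlo transport.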