Lerendegui-Marco, J., Balibrea-Correa, J., Babiano-Suarez, V., Ladarescu, I., & Domingo-Pardo, C. (2022). Towards machine learning aided real-time range imaging in proton therapy. Sci Rep, 12(1), 2735 (17 pp.).
Abstract: Compton imaging represents a promising technique for range verification in proton therapy treatments. In this work, we report on the advantageous aspects of the i-TED detector for proton-range monitoring, based on the results of the first Monte Carlo study of its applicability to this field. i-TED is an array of Compton cameras originally designed for neutron-capture nuclear physics experiments, which are characterized by gamma-ray energies spanning up to 5-6 MeV, rather low gamma-ray emission yields and very intense neutron-induced gamma-ray backgrounds. Our developments to cope with these three aspects are concomitant with those required in the field of hadron therapy, especially in terms of high efficiency for real-time monitoring, low sensitivity to neutron backgrounds and reliable performance at high gamma-ray energies. We find that signal-to-background ratios can be appreciably improved with i-TED thanks to its light-weight design and the low neutron-capture cross sections of its LaCl3 crystals, when compared to other similar systems based on LYSO, CdZnTe or LaBr3. Its high time resolution (CRT ~ 500 ps) represents an additional advantage for background suppression when operated in pulsed hadron-therapy mode. Each i-TED Compton module features two detection planes of very large LaCl3 monolithic crystals, thereby achieving a high coincidence efficiency of 0.2% for a point-like 1 MeV gamma-ray source at 5 cm distance. This yields sufficient statistics for reliable image reconstruction with an array of four i-TED detectors at clinical intensities of 10^8 protons per treatment point. A two-plane design was preferred over three planes owing to the higher attainable efficiency for double time coincidences than for threefold events. The loss of full-energy events for high-energy gamma rays is compensated by machine-learning-based algorithms, which enhance the signal-to-total ratio by up to a factor of 2.
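The two-plane event reconstruction described in this abstract rests on standard Compton kinematics: the energies deposited in the scatterer and absorber planes fix the opening angle of a cone on which the source must lie. A minimal sketch in Python (the function name and the full-absorption assumption are ours, not taken from the paper):

```python
import math

ME_C2 = 0.511  # electron rest energy in MeV

def compton_cone_angle(e1_mev, e2_mev):
    """Opening angle (radians) of the Compton cone for a two-plane event.

    e1_mev: energy deposited in the scatterer plane
    e2_mev: energy deposited in the absorber plane (full absorption assumed,
            so the incident energy is e1 + e2)
    Returns None for kinematically impossible energy pairs.
    """
    e_total = e1_mev + e2_mev
    # Compton formula: cos(theta) = 1 - me*c^2 * (1/E_scattered - 1/E_incident)
    cos_theta = 1.0 - ME_C2 * (1.0 / e2_mev - 1.0 / e_total)
    if -1.0 <= cos_theta <= 1.0:
        return math.acos(cos_theta)
    return None
```

For a 1 MeV gamma depositing 0.3 MeV in the scatterer, the cone half-angle comes out near 39 degrees; energy pairs that violate Compton kinematics (a useful background filter) return None.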
Tortajada, S., Albiol, F., Caballero, L., Albiol, A., & Leganes-Nieto, J. L. (2023). A portable geometry-independent tomographic system for gamma-ray, a next generation of nuclear waste characterization. Sci Rep, 13(1), 12284 (10 pp.).
Abstract: One of the main activities of the nuclear industry is the characterisation of radioactive waste based on the detection of gamma radiation. Large volumes of radioactive waste are classified according to their average activity, but the radioactivity often exceeds the maximum allowed by regulators in specific parts of the bulk. In addition, detection is currently based on static systems in which the geometry of the bulk is fixed and well known. Furthermore, these systems are not portable and depend on transporting the waste to the places where the detection systems are located. However, there are situations where the geometry varies and where moving the waste is complex, especially in compromised situations. We present a new model for nuclear waste management based on a portable, geometry-independent tomographic system for three-dimensional image reconstruction of gamma radiation. The system relies on a combination of a gamma camera and a visible-light camera that makes it possible to visualise radioactivity using augmented reality and computer vision techniques. This novel tomographic system has the potential to be a disruptive innovation in the nuclear industry for nuclear waste management.
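Three-dimensional reconstruction from gamma-camera data of the kind described above is typically posed as an iterative inverse problem, with MLEM (maximum-likelihood expectation maximization) the textbook choice. A toy single-iteration sketch, assuming a known system matrix; the paper does not specify its reconstruction algorithm, so this is illustrative only:

```python
def mlem_update(activity, sys_matrix, measured):
    """One MLEM iteration for emission tomography.

    activity:   current activity estimate per voxel (list of floats)
    sys_matrix: sys_matrix[i][j] = probability that a decay in voxel j
                is recorded in detector bin i
    measured:   counts observed in each detector bin
    """
    n_pix, n_det = len(activity), len(measured)
    # Forward projection: expected counts given the current estimate.
    proj = [sum(sys_matrix[i][j] * activity[j] for j in range(n_pix))
            for i in range(n_det)]
    # Sensitivity: total detection probability of each voxel (column sums).
    sens = [sum(sys_matrix[i][j] for i in range(n_det)) for j in range(n_pix)]
    new = []
    for j in range(n_pix):
        # Back-project the measured/expected ratio into voxel j.
        ratio = sum(sys_matrix[i][j] * measured[i] / proj[i]
                    for i in range(n_det) if proj[i] > 0)
        new.append(activity[j] * ratio / sens[j] if sens[j] > 0 else 0.0)
    return new
```

The geometry independence claimed in the abstract would enter through the system matrix, which the visible camera and pose estimation would let one build on the fly for an arbitrary waste configuration.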
Penas, J., Alejo, A., Bembibre, A., Apiñaniz, J. I., Garcia-Garcia, E., Guerrero, C., et al. (2024). Production of carbon-11 for PET preclinical imaging using a high-repetition rate laser-driven proton source. Sci Rep, 14(1), 11448 (12 pp.).
Abstract: Most advanced medical imaging techniques, such as positron-emission tomography (PET), require tracers that are produced in conventional particle accelerators. This paper focuses on the evaluation of a potential alternative technology based on laser-driven ion acceleration for the production of radioisotopes for PET imaging. We report for the first time the use of a high-repetition-rate, ultra-intense laser system for the production of carbon-11 in multi-shot operation. Proton bunches with energies up to 10-14 MeV were systematically accelerated in long series at pulse rates between 0.1 and 1 Hz using a PW-class laser. These protons were used to activate a boron target via the 11B(p,n)11C nuclear reaction. A peak activity of 234 kBq was obtained in multi-shot operation with laser pulses with an energy of 25 J. Significant carbon-11 production was also achieved for lower pulse energies. The experimental carbon-11 activities measured in this work are comparable to the levels required for preclinical PET, which would be feasible by operating at the repetition rate of current state-of-the-art technology (10 Hz). The scalability of next-generation laser-driven accelerators in terms of this parameter for sustained operation over time could increase these overall levels into the clinical PET range.
Gomez-Cadenas, J. J., Martin-Albo, J., Menendez, J., Mezzetto, M., Monrabal, F., & Sorel, M. (2024). The search for neutrinoless double-beta decay. Riv. Nuovo Cimento, 46, 619–692.
Abstract: Neutrinos are the only particles in the Standard Model that could be Majorana fermions, that is, completely neutral fermions that are their own antiparticles. The most sensitive known experimental method to verify whether neutrinos are Majorana particles is the search for neutrinoless double-beta decay. The last two decades have witnessed the development of a vigorous program of neutrinoless double-beta decay experiments, spanning several isotopes and developing different strategies to handle the backgrounds masking a possible signal. In addition, remarkable progress has been made in the understanding of the nuclear matrix elements of neutrinoless double-beta decay, thus reducing a substantial part of the theoretical uncertainties affecting the particle-physics interpretation of the process. On the other hand, the negative results by several experiments, combined with the hints that the neutrino mass ordering could be normal, may imply very long lifetimes for the neutrinoless double-beta decay process. In this report, we review the main aspects of this process, the recent progress on theoretical ideas and the experimental state of the art. We then consider the experimental challenges to be addressed to increase the sensitivity to detect the process in the likely case that lifetimes are much longer than currently explored, and discuss a selection of the most promising experimental efforts.
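The lifetime-to-mass translation underlying such sensitivity discussions uses the standard relation (T_1/2)^-1 = G |M|^2 (m_bb / m_e)^2, where G is the phase-space factor and M the nuclear matrix element whose uncertainty the review discusses. A sketch of the inversion; the input values in the test are placeholders, not numbers from the review:

```python
import math

M_E_EV = 0.511e6  # electron rest mass in eV

def m_bb_from_halflife(t_half_yr, g_phase_space_per_yr, nme):
    """Effective Majorana mass (eV) implied by a half-life,
    inverting (T_1/2)^-1 = G * |M|^2 * (m_bb / m_e)^2.

    t_half_yr:           half-life (or lower limit) in years
    g_phase_space_per_yr: phase-space factor G in 1/yr
    nme:                 nuclear matrix element |M| (dimensionless)
    """
    return M_E_EV / (nme * math.sqrt(t_half_yr * g_phase_space_per_yr))
```

The formula makes the review's point quantitative: the probed mass scales only as the inverse square root of the half-life, so each order of magnitude in m_bb reach costs two orders of magnitude in half-life sensitivity, and the spread in |M| propagates directly into the inferred mass.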
Dorigo, T., Ramos, A., Ruiz de Austri, R., et al. (2023). Toward the end-to-end optimization of particle physics instruments with differentiable programming. Rev. Phys., 10, 100085.
Abstract: The full optimization of the design and operation of instruments whose functioning relies on the interaction of radiation with matter is a super-human task, due to the large dimensionality of the space of possible choices for geometry, detection technology, materials, data acquisition, and information-extraction techniques, and the interdependence of the related parameters. On the other hand, massive potential gains in performance over standard, “experience-driven” layouts are in principle within our reach if an objective function fully aligned with the final goals of the instrument is maximized through a systematic search of the configuration space. The stochastic nature of the involved quantum processes makes the modeling of these systems an intractable problem from a classical statistics point of view, yet the construction of a fully differentiable pipeline and the use of deep learning techniques may allow the simultaneous optimization of all design parameters.
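The differentiable-pipeline idea can be illustrated on a toy one-parameter detector model: a smooth objective trading detection efficiency against a resolution penalty, maximized by gradient ascent through an analytic derivative. Everything below (the model, its coefficients) is our illustration, not from the paper:

```python
import math

def efficiency(t):
    # Toy absorber model: fraction of gammas interacting in thickness t.
    return 1.0 - math.exp(-t)

def resolution_cost(t):
    # Toy penalty: a thicker crystal degrades resolution linearly.
    return 1.0 + 0.5 * t

def objective(t):
    # End-to-end figure of merit to maximize.
    return efficiency(t) / resolution_cost(t)

def grad_objective(t):
    # Analytic derivative of the objective: the "differentiable
    # pipeline" in miniature (quotient rule applied by hand).
    eff, cost = 1.0 - math.exp(-t), 1.0 + 0.5 * t
    d_eff, d_cost = math.exp(-t), 0.5
    return (d_eff * cost - eff * d_cost) / cost ** 2

def optimize(t0=0.5, lr=0.5, steps=200):
    # Plain gradient ascent on the design parameter.
    t = t0
    for _ in range(steps):
        t += lr * grad_objective(t)
    return t
```

In a realistic setting the hand-written derivative is replaced by automatic differentiation through simulation surrogates, and the single thickness by the full vector of geometry, material, and readout parameters; the optimization loop itself is unchanged.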