Muñoz, E., Barrio, J., Bernabeu, J., Etxebeste, A., Lacasta, C., Llosa, G., et al. (2018). Study and comparison of different sensitivity models for a two-plane Compton camera. Phys. Med. Biol., 63(13), 135004–19pp.
Abstract: Given the strong variations in the sensitivity of Compton cameras for the detection of events originating from different points in the field of view (FoV), sensitivity correction is often necessary in Compton image reconstruction. Several approaches for the calculation of the sensitivity matrix have been proposed in the literature. While most of these models are easily implemented and can be useful in many cases, they usually assume high angular coverage over the scattered photon, which is not the case for our prototype. In this work, we have derived an analytical model that allows us to calculate a detailed sensitivity matrix, which has been compared to other sensitivity models in the literature. Specifically, the proposed model describes the probability of measuring a useful event in a two-plane Compton camera, including the most relevant physical processes involved. The model has been used to obtain an expression for the system and sensitivity matrices for iterative image reconstruction. These matrices have been validated against Monte Carlo simulations taken as a reference. In order to study the impact of the sensitivity, images reconstructed with our sensitivity model and with other models have been compared. Images have been reconstructed from several simulated sources, including point-like sources and extended activity distributions, and also from experimental data measured with Na-22 sources. Results show that our sensitivity model is the best suited for our prototype. Although other models in the literature perform successfully in many scenarios, they are not applicable in all the geometrical configurations of interest for our system. In general, our model effectively recovers the intensity of point-like sources at different positions in the FoV and reconstructs regions of homogeneous activity with minimal variance. Moreover, it can be employed for all Compton camera configurations, including those with low angular coverage over the scatterer.
Keywords: Compton camera imaging; MLEM; Monte Carlo simulations; image quality
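The sensitivity matrix discussed above enters iterative MLEM reconstruction as a per-voxel normalization of the update. As a minimal sketch of that mechanism (not the authors' code — the system matrix `A`, event counts `y`, and all names are illustrative placeholders):

```python
import numpy as np

def mlem(A, y, n_iter=50, sensitivity=None):
    """MLEM update with an explicit sensitivity correction.

    A : (n_events, n_voxels) system matrix; A[i, j] is the probability
        that activity in voxel j produces measured event i.
    y : (n_events,) measured counts.
    sensitivity : (n_voxels,) detection sensitivity s_j; defaults to the
        column sums of A (a uniform-coverage assumption).
    """
    n_vox = A.shape[1]
    s = A.sum(axis=0) if sensitivity is None else sensitivity
    lam = np.ones(n_vox)  # flat initial activity estimate
    for _ in range(n_iter):
        proj = A @ lam                        # forward projection
        ratio = y / np.maximum(proj, 1e-12)   # measured / expected
        lam *= (A.T @ ratio) / np.maximum(s, 1e-12)  # sensitivity-normalized update
    return lam
```

A poor sensitivity model biases `s`, which is exactly why the abstract reports intensity distortions for point-like sources when the wrong model is used.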
|
Gimenez-Alventosa, V., Gimenez, V., Ballester, F., Vijande, J., & Andreo, P. (2018). Correction factors for ionization chamber measurements with the 'Valencia' and 'large field Valencia' brachytherapy applicators. Phys. Med. Biol., 63(12), 125004–10pp.
Abstract: Treatment of small skin lesions using HDR brachytherapy applicators is a widely used technique. The shielded applicators currently available in clinical practice are based on a tungsten-alloy cup that collimates the source-emitted radiation into a small region, hence protecting nearby tissues. The goal of this manuscript is to evaluate the correction factors required for dose measurements with a plane-parallel ionization chamber typically used in clinical brachytherapy for the 'Valencia' and 'large field Valencia' shielded applicators. Monte Carlo simulations have been performed using the PENELOPE-2014 system to determine the absorbed dose deposited in a water phantom and in the chamber active volume with a Type A uncertainty of the order of 0.1%. The average energies of the photon spectra arriving at the surface of the water phantom differ by approximately 10%, being 384 keV for the 'Valencia' and 343 keV for the 'large field Valencia'. The ionization chamber correction factors have been obtained for both applicators using three methods, their values depending on the applicator being considered. Using a depth-independent global chamber perturbation correction factor and no shift of the effective point of measurement yields depth-dose differences of up to 1% for the 'Valencia' applicator. Calculations using a depth-dependent global perturbation factor, or a shift of the effective point of measurement combined with a constant partial perturbation factor, result in differences of about 0.1% for both applicators. The results emphasize the relevance of carrying out detailed Monte Carlo studies for each shielded brachytherapy applicator and ionization chamber.
|
Calatayud-Jordan, J., Candela-Juan, C., Palma, J. D., Pujades-Claumarchirant, M. C., Soriano, A., Gracia-Ochoa, M., et al. (2021). Influence of the simultaneous calibration of multiple ring dosimeters on the individual absorbed dose. J. Radiol. Prot., 41(2), 384–397.
Abstract: Ring dosimeters for personal dosimetry are calibrated in accredited laboratories following ISO 4037-3 guidelines. The simultaneous irradiation of multiple dosimeters would save time, but has to be carefully studied, since the scattering conditions could change and influence the absorbed dose in nearby dosimeters. Monte Carlo simulations using PENELOPE-2014 were performed to explore the need to increase the uncertainty of H-p (0.07) in the simultaneous irradiation of three and five DXT-RAD 707H-2 (Thermo Scientific) ring dosimeters with beam qualities: N-30, N-80 and N-300. Results show that the absorbed dose in each dosimeter is compatible with each of the others and with the reference simulation (a single dosimeter), with a coverage probability of 95% (k = 2). Comparison with experimental data yielded consistent results with the same coverage probability. Therefore, five ring dosimeters can be simultaneously irradiated with beam qualities ranging, at least, between N-30 and N-300 with a negligible impact on the uncertainty of H-p (0.07).
Keywords: ring dosimeters; personal dosimetry; calibration; Monte Carlo; ISO 4037
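The compatibility criterion the abstract relies on (agreement within expanded uncertainties at coverage factor k = 2, i.e. roughly 95% coverage probability) can be written out explicitly. A minimal sketch with illustrative numbers, not the paper's data:

```python
import math

def compatible(d1, u1, d2, u2, k=2.0):
    """Two measurements d1 +/- u1 and d2 +/- u2 (standard uncertainties)
    are compatible if their difference lies within the expanded
    (coverage factor k) uncertainty of the difference."""
    return abs(d1 - d2) <= k * math.hypot(u1, u2)

# Illustrative dose readings (arbitrary units), not values from the paper:
print(compatible(1.02, 0.02, 0.99, 0.02))  # |0.03| <= 2 * 0.028 -> True
```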
|
Garcia-Cases, F., Perez-Calatayud, J., Ballester, F., Vijande, J., & Granero, D. (2018). Peripheral dose around a mobile linac for intraoperative radiotherapy: radiation protection aspects. J. Radiol. Prot., 38(4), 1393–1411.
Abstract: The aim of this work is to analyse the scattered radiation produced by the mobile accelerator Mobetron 1000. To do so, detailed Monte Carlo simulations were performed using two different codes, Penelope2008 and Geant4, and compared with measurements. To quantify the attenuation of the scattered radiation by the internal structures of the accelerator head, some of the main structural shielding of the Mobetron 1000 has been incorporated into the simulation geometry. Some discrepancies between calculated and measured dose values were found. These differences can be traced back to the importance of the radiation component due to low-energy scattered electrons, which encouraged us to perform additional calculations to separate the role played by this component. Ambient dose equivalent, H*(10), outside the operating room (OR) has been evaluated using Geant4. H*(10) has been measured inside and outside the OR, its values being compatible with those reported in the literature once the low-energy electron component is removed. With respect to the role played by neutrons, estimations of neutron H*(10) using Geant4, together with H*(10) measurements, have been performed for the 12 MeV electron beam. The values obtained agree with the experimental values in the literature and are much smaller than those registered for conventional accelerators. This study is a useful tool for the clinical user to investigate the radiation protection issues arising from the use of these accelerators in ORs without structural shielding. These results will also make it possible to better establish the maximum number of treatments that could be performed while ensuring adequate radiological protection of workers and the public in the hospital.
|
Penas, J., Alejo, A., Bembibre, A., Apiñaniz, J. I., Garcia-Garcia, E., Guerrero, C., et al. (2024). Production of carbon-11 for PET preclinical imaging using a high-repetition rate laser-driven proton source. Sci Rep, 14(1), 11448–12pp.
Abstract: Most advanced medical imaging techniques, such as positron-emission tomography (PET), require tracers that are produced in conventional particle accelerators. This paper focuses on the evaluation of a potential alternative technology based on laser-driven ion acceleration for the production of radioisotopes for PET imaging. We report for the first time the use of a high-repetition rate, ultra-intense laser system for the production of carbon-11 in multi-shot operation. Proton bunches with energies up to 10–14 MeV were systematically accelerated in long series at pulse rates between 0.1 and 1 Hz using a PW-class laser. These protons were used to activate a boron target via the ¹¹B(p,n)¹¹C nuclear reaction. A peak activity of 234 kBq was obtained in multi-shot operation with laser pulses with an energy of 25 J. Significant carbon-11 production was also achieved for lower pulse energies. The experimental carbon-11 activities measured in this work are comparable to the levels required for preclinical PET, which would be feasible by operating at the repetition rate of current state-of-the-art technology (10 Hz). The scalability of next-generation laser-driven accelerators in terms of this parameter for sustained operation over time could increase these overall levels into the clinical PET range.
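The scaling with repetition rate mentioned at the end follows from simple activation-decay bookkeeping: each shot adds a batch of ¹¹C nuclei, which then decay with the ~20.4-minute half-life, so the accumulated activity is a geometric sum that saturates over time. A hedged sketch (per-shot yield and rates are illustrative, not the measured values):

```python
import math

C11_HALF_LIFE_S = 20.36 * 60  # carbon-11 half-life, ~20.36 min

def activity_after_shots(n_shots, rep_rate_hz, nuclei_per_shot,
                         half_life_s=C11_HALF_LIFE_S):
    """Activity (Bq) just after the n-th shot, assuming each shot produces
    a fixed number of 11C nuclei and shots are evenly spaced in time."""
    lam = math.log(2) / half_life_s
    r = math.exp(-lam / rep_rate_hz)  # decay factor between consecutive shots
    n_nuclei = nuclei_per_shot * (1 - r**n_shots) / (1 - r)  # geometric sum
    return lam * n_nuclei
```

For many shots the sum saturates at `nuclei_per_shot / (1 - r)`, which is why raising the repetition rate toward 10 Hz could push activities well beyond the multi-shot levels reported here.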
|
Tortajada, S., Albiol, F., Caballero, L., Albiol, A., & Leganes-Nieto, J. L. (2023). A portable geometry-independent tomographic system for gamma-ray, a next generation of nuclear waste characterization. Sci Rep, 13(1), 12284–10pp.
Abstract: One of the main activities of the nuclear industry is the characterisation of radioactive waste based on the detection of gamma radiation. Large volumes of radioactive waste are classified according to their average activity, but often the radioactivity exceeds the maximum allowed by regulators in specific parts of the bulk. In addition, radiation detection is currently based on static systems in which the geometry of the bulk is fixed and well known. Furthermore, these systems are not portable and depend on the transport of waste to the places where the detection systems are located. However, there are situations where the geometry varies and where moving waste is complex, especially in compromised scenarios. We present a new model for nuclear waste management based on a portable, geometry-independent tomographic system for three-dimensional image reconstruction of gamma radiation. The system relies on a combination of a gamma-ray camera and a visible camera that makes it possible to visualise radioactivity using augmented reality and computer vision techniques. This novel tomographic system has the potential to be a disruptive innovation in nuclear waste management.
|
Muñoz, E., Ros, A., Borja-Lloret, M., Barrio, J., Dendooven, P., Oliver, J. F., et al. (2021). Proton range verification with MACACO II Compton camera enhanced by a neural network for event selection. Sci Rep, 11(1), 9325–12pp.
Abstract: The extent of applicability of hadron therapy for tumor treatment is currently limited by the lack of reliable online monitoring techniques. An active topic of investigation is the development of monitoring systems based on the detection of secondary radiation produced during treatment. MACACO, a multi-layer Compton camera based on LaBr3 scintillator crystals and SiPMs, is being developed at IFIC-Valencia for this purpose. This work reports the results obtained from measurements of a 150 MeV proton beam impinging on a PMMA target. A neural network trained on Monte Carlo simulations is used for event selection, increasing the signal-to-background ratio before image reconstruction. Images of the measured prompt gamma distributions are reconstructed by means of a spectral reconstruction code, through which the 4.439 MeV spectral line is resolved. Images of the emission distribution at this energy are reconstructed, allowing calculation of the distal fall-off and identification of target displacements of 3 mm.
|
Khachatryan, M., et al., & Coloma, P. (2021). Electron-beam energy reconstruction for neutrino oscillation measurements. Nature, 599(7886), 565–570.
Abstract: Neutrinos exist in one of three types or 'flavours'-electron, muon and tau neutrinos-and oscillate from one flavour to another when propagating through space. This phenomenon is one of the few that cannot be described using the standard model of particle physics (reviewed in ref. (1)), and so its experimental study can provide new insight into the nature of our Universe (reviewed in ref. (2)). Neutrinos oscillate as a function of their propagation distance (L) divided by their energy (E). Therefore, experiments extract oscillation parameters by measuring their energy distribution at different locations. As accelerator-based oscillation experiments cannot directly measure E, the interpretation of these experiments relies heavily on phenomenological models of neutrino-nucleus interactions to infer E. Here we exploit the similarity of electron-nucleus and neutrino-nucleus interactions, and use electron scattering data with known beam energies to test energy reconstruction methods and interaction models. We find that even in simple interactions where no pions are detected, only a small fraction of events reconstruct to the correct incident energy. More importantly, widely used interaction models reproduce the reconstructed energy distribution only qualitatively and the quality of the reproduction varies strongly with beam energy. This shows both the need and the pathway to improve current models to meet the requirements of next-generation, high-precision experiments such as Hyper-Kamiokande (Japan)(3) and DUNE (USA)(4). Electron scattering measurements are shown to be reproduced only qualitatively by state-of-the-art lepton-nucleus energy reconstruction models, indicating that improvements to these particle-interaction models are required to ensure the accuracy of future high-precision neutrino oscillation experiments.
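The L/E dependence mentioned in the abstract is made explicit by the standard two-flavour oscillation probability (a textbook formula, not specific to this paper):

```latex
P(\nu_\alpha \to \nu_\beta) \simeq \sin^2(2\theta)\,
\sin^2\!\left( \frac{1.27\,\Delta m^2\,[\mathrm{eV}^2]\;L\,[\mathrm{km}]}{E\,[\mathrm{GeV}]} \right)
```

Since L is fixed by the experiment's geometry, any bias in the reconstructed E propagates directly into the extracted θ and Δm², which is why the energy-reconstruction tests reported here matter.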
|
Wilson, J. N., et al., & Algora, A. (2021). Angular momentum generation in nuclear fission. Nature, 590(7847), 566–570.
Abstract: When a heavy atomic nucleus splits (fission), the resulting fragments are observed to emerge spinning(1); this phenomenon has been a mystery in nuclear physics for over 40 years(2,3). The internal generation of typically six or seven units of angular momentum in each fragment is particularly puzzling for systems that start with zero, or almost zero, spin. There are currently no experimental observations that enable decisive discrimination between the many competing theories for the mechanism that generates the angular momentum(4-12). Nevertheless, the consensus is that excitation of collective vibrational modes generates the intrinsic spin before the nucleus splits (pre-scission). Here we show that there is no significant correlation between the spins of the fragment partners, which leads us to conclude that angular momentum in fission is actually generated after the nucleus splits (post-scission). We present comprehensive data showing that the average spin is strongly mass-dependent, varying in saw-tooth distributions. We observe no notable dependence of fragment spin on the mass or charge of the partner nucleus, confirming the uncorrelated post-scission nature of the spin mechanism. To explain these observations, we propose that the collective motion of nucleons in the ruptured neck of the fissioning system generates two independent torques, analogous to the snapping of an elastic band. A parameterization based on occupation of angular momentum states according to statistical theory describes the full range of experimental data well. This insight into the role of spin in nuclear fission is not only important for the fundamental understanding and theoretical description of fission, but also has consequences for the gamma-ray heating problem in nuclear reactors(13,14), for the study of the structure of neutron-rich isotopes(15,16), and for the synthesis and stability of super-heavy elements(17,18). 
Gamma-ray spectroscopy experiments on the origin of spin in the products of nuclear fission of spin-zero nuclei suggest that the fission fragments acquire their spin after scission, rather than before.
|
Conde, D., Castillo, F. L., Escobar, C., García, C., Garcia Navarro, J. E., Sanz, V., et al. (2023). Forecasting Geomagnetic Storm Disturbances and Their Uncertainties Using Deep Learning. Space Weather, 21(11), e2023SW003474–27pp.
Abstract: Severe space weather produced by disturbed conditions on the Sun results in harmful effects both for humans in space and on high-latitude flights, and for technological systems such as spacecraft or communications. Also, geomagnetically induced currents (GICs) flowing in long ground-based conductors, such as power networks, potentially threaten critical infrastructures on Earth. The first step in developing an alarm system against GICs is to forecast them. This is a challenging task given the highly non-linear dependence of the magnetosphere's response to these perturbations. In the last few years, modern machine-learning models have been shown to be very good at predicting magnetic activity indices. However, such complex models are on the one hand difficult to tune, and on the other hand known to bring along potentially large prediction uncertainties which are generally difficult to estimate. In this work we aim at predicting the SYM-H index characterizing geomagnetic storms multiple hours ahead, using public interplanetary magnetic field (IMF) data from the Sun-Earth L1 Lagrange point and SYM-H data. We implement a type of machine-learning model called a long short-term memory (LSTM) network. Our scope is to estimate the prediction uncertainties coming from a deep-learning model in the context of forecasting the SYM-H index. These uncertainties will be essential for setting reliable alarm thresholds. The resulting uncertainties turn out to be sizable at the critical stages of geomagnetic storms. Our methodology also includes an efficient optimization of important hyper-parameters of the LSTM network, as well as robustness tests.
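The LSTM network used here is described only at a high level. As a hedged illustration of the underlying recurrence (the standard LSTM gate equations in plain numpy — the shapes, the sequence-to-one readout, and all names are assumptions, not the authors' architecture):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b):
    """One LSTM cell step (standard gate equations).

    x : (n_in,) input features at time t (e.g. IMF components + past SYM-H)
    h, c : (n_hid,) hidden and cell states
    W : (4*n_hid, n_in), U : (4*n_hid, n_hid), b : (4*n_hid,)
    """
    n_hid = h.shape[0]
    z = W @ x + U @ h + b
    i = sigmoid(z[:n_hid])              # input gate
    f = sigmoid(z[n_hid:2 * n_hid])     # forget gate
    g = np.tanh(z[2 * n_hid:3 * n_hid])  # candidate cell update
    o = sigmoid(z[3 * n_hid:])          # output gate
    c_new = f * c + i * g
    h_new = o * np.tanh(c_new)
    return h_new, c_new

def forecast(seq, W, U, b, w_out):
    """Run the cell over a sequence and map the last hidden state to a scalar
    (e.g. the SYM-H value a fixed number of hours ahead)."""
    n_hid = w_out.shape[0]
    h = np.zeros(n_hid)
    c = np.zeros(n_hid)
    for x in seq:
        h, c = lstm_step(x, h, c, W, U, b)
    return float(w_out @ h)
```

The prediction uncertainties studied in the paper sit on top of a recurrence like this one, e.g. by producing a distribution over outputs rather than the single scalar sketched here.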
|