|
Oliver, S., Rodriguez Bosca, S., & Gimenez-Alventosa, V. (2024). Enabling particle transport on CAD-based geometries for radiation simulations with penRed. Comput. Phys. Commun., 298, 109091–11pp.
Abstract: Geometry construction is a fundamental aspect of any radiation transport simulation, regardless of the Monte Carlo code being used. Typically, this process is tedious, time-consuming, and error-prone. The conventional approach involves defining geometries using mathematical objects or surfaces. However, this method comes with several limitations, especially when dealing with complex models, particularly those with organic shapes. Furthermore, since each code employs its own format and methodology for defining geometries, sharing and reproducing simulations among researchers becomes a challenging task. Consequently, many codes have implemented support for simulating over geometries constructed via Computer-Aided Design (CAD) tools. Unfortunately, this feature is lacking in penRed and other PENELOPE physics-based codes. Therefore, the objective of this work is to implement such support within the penRed framework. New version program summary Program Title: Parallel Engine for Radiation Energy Deposition (penRed) CPC Library link to program files: https://doi.org/10.17632/rkw6tvtngy.2 Developer's repository link: https://github.com/PenRed/PenRed Code Ocean capsule: https://codeocean.com/capsule/1041417/tree Licensing provisions: GNU Affero General Public License v3 Programming language: C++ standard 2011. Journal reference of previous version: V. Gimenez-Alventosa, V. Gimenez Gomez, S. Oliver, PenRed: An extensible and parallel Monte-Carlo framework for radiation transport based on PENELOPE, Computer Physics Communications 267 (2021) 108065. https://doi.org/10.1016/j.cpc.2021.108065. Does the new version supersede the previous version?: Yes Reasons for the new version: Implements the capability to simulate on CAD-constructed geometries, among many other features and fixes. Summary of revisions: All changes applied through the code versions are summarized in the file CHANGELOG.md in the repository package.
Nature of problem: While Monte Carlo codes have proven valuable in simulating complex radiation scenarios, they rely heavily on accurate geometrical representations. Like many other Monte Carlo codes, penRed employs simple quadric surfaces such as planes, spheres and cylinders to define geometries. Although these geometric models offer a certain level of flexibility, they have limitations when it comes to simulating highly intricate and irregular shapes. Anatomic structures, for example, require detailed representations of organs, tissues and bones, which are difficult to achieve using basic geometric objects. Similarly, complex devices or intricate mechanical systems may have designs that cannot be accurately represented within the constraints of such geometric models. Moreover, as the complexity of the model increases, the geometry construction process becomes more difficult, tedious, time-consuming and error-prone [2]. Also, since each Monte Carlo geometry library uses its own format and construction method, reproducing the same geometry across different codes is a challenging task. Solution method: To address the problems stated above, the objective of this work is to implement the capability to simulate using irregular and adaptable meshed geometries in the penRed framework. Such meshes can be constructed using Computer-Aided Design (CAD) tools, whose use is widespread and streamlines the design process. This feature has been implemented in a new geometry module named “MESH_BODY”, specific to this kind of geometry, which is freely available within the official penRed package from version 1.9.3b onwards.
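Particle tracking on such meshed geometries ultimately reduces to intersecting particle rays with the triangles of the CAD surface mesh. As a purely illustrative sketch (the Möller–Trumbore algorithm in plain Python, not penRed's actual C++ implementation), the core geometric test looks like:

```python
def _sub(a, b): return (a[0] - b[0], a[1] - b[1], a[2] - b[2])
def _dot(a, b): return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]
def _cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def ray_triangle_intersect(origin, direction, v0, v1, v2, eps=1e-12):
    """Moller-Trumbore ray/triangle test: returns the distance t along
    the ray to the intersection point, or None on a miss."""
    e1, e2 = _sub(v1, v0), _sub(v2, v0)
    p = _cross(direction, e2)
    det = _dot(e1, p)
    if abs(det) < eps:              # ray parallel to the triangle plane
        return None
    inv = 1.0 / det
    s = _sub(origin, v0)
    u = _dot(s, p) * inv            # first barycentric coordinate
    if u < 0.0 or u > 1.0:
        return None
    q = _cross(s, e1)
    v = _dot(direction, q) * inv    # second barycentric coordinate
    if v < 0.0 or u + v > 1.0:
        return None
    t = _dot(e2, q) * inv           # distance along the ray
    return t if t > eps else None
```

A tracking loop would run this test against candidate triangles (typically pruned with a spatial acceleration structure) and advance the particle to the nearest positive t.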
|
|
|
Muñoz, E., Barrio, J., Bernabeu, J., Etxebeste, A., Lacasta, C., Llosa, G., et al. (2018). Study and comparison of different sensitivity models for a two-plane Compton camera. Phys. Med. Biol., 63(13), 135004–19pp.
Abstract: Given the strong variations in the sensitivity of Compton cameras for the detection of events originating from different points in the field of view (FoV), sensitivity correction is often necessary in Compton image reconstruction. Several approaches for the calculation of the sensitivity matrix have been proposed in the literature. While most of these models are easily implemented and can be useful in many cases, they usually assume high angular coverage over the scattered photon, which is not the case for our prototype. In this work, we have derived an analytical model that allows us to calculate a detailed sensitivity matrix, which has been compared to other sensitivity models in the literature. Specifically, the proposed model describes the probability of measuring a useful event in a two-plane Compton camera, including the most relevant physical processes involved. The model has been used to obtain an expression for the system and sensitivity matrices for iterative image reconstruction. These matrices have been validated taking Monte Carlo simulations as a reference. In order to study the impact of the sensitivity, images reconstructed with our sensitivity model and with other models have been compared. Images have been reconstructed from several simulated sources, including point-like sources and extended distributions of activity, and also from experimental data measured with Na-22 sources. Results show that our sensitivity model is the best suited for our prototype. Although other models in the literature perform successfully in many scenarios, they are not applicable in all the geometrical configurations of interest for our system. In general, our model allows us to effectively recover the intensity of point-like sources at different positions in the FoV and to reconstruct regions of homogeneous activity with minimal variance.
Moreover, it can be employed for all Compton camera configurations, including those with low angular coverage over the scatterer.
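For context, the event kinematics underlying any such sensitivity model come from the Compton formula: the two measured energies fix the opening angle of the event cone. A minimal sketch (assuming full absorption of the scattered photon, energies in keV; an illustration of the standard kinematics, not the authors' code):

```python
import math

MEC2_KEV = 510.999  # electron rest energy, keV

def compton_cone_angle(e_scatterer, e_absorber):
    """Opening angle (rad) of the Compton cone for a two-interaction
    event: e_scatterer is the energy deposited in the scatter plane,
    e_absorber the scattered-photon energy fully absorbed downstream.
    Assumes no escaping energy."""
    e0 = e_scatterer + e_absorber            # reconstructed incident energy
    cos_theta = 1.0 - MEC2_KEV * (1.0 / e_absorber - 1.0 / e0)
    if not -1.0 <= cos_theta <= 1.0:
        raise ValueError("kinematically forbidden energy pair")
    return math.acos(cos_theta)
```

For a 511 keV annihilation photon (as from the Na-22 sources above), the measured energy pair places the source somewhere on the surface of this cone; reconstruction intersects many such cones.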
|
|
|
Campanario, F., & Kubocz, M. (2014). Higgs boson CP-properties of the gluonic contributions in Higgs plus three jet production via gluon fusion at the LHC. J. High Energy Phys., 10(10), 173–16pp.
Abstract: In high-energy hadronic collisions, a general CP-violating Higgs boson Phi with accompanying jets can be efficiently produced via gluon fusion, which is mediated by heavy quark loops. In this article, we study the dominant sub-channel gg -> ggg Phi of the gluon fusion production process with triple real emission corrections at order alpha(5)(s). We go beyond the heavy top-quark approximation and include the full mass dependence of the top- and bottom-quark contributions. Furthermore, in a specific model we demonstrate the features of our program and show the impact of bottom-quark loop contributions in combination with large values of tan beta on differential distributions sensitive to CP-measurements of the Higgs boson.
|
|
|
Pujades, M. C., Granero, D., Vijande, J., Ballester, F., Perez-Calatayud, J., Papagiannis, P., et al. (2014). Air-kerma evaluation at the maze entrance of HDR brachytherapy facilities. J. Radiol. Prot., 34(4), 741–753.
Abstract: In the absence of procedures for evaluating the design of brachytherapy (BT) facilities for radiation protection purposes, the methodology used for external beam radiotherapy facilities is often adapted. The purpose of this study is to adapt the NCRP 151 methodology for estimating the air-kerma rate at the door in BT facilities. Such methodology was checked against Monte Carlo (MC) techniques using the code Geant4. Five different facility designs were studied for Ir-192 and Co-60 HDR applications to account for several different bunker layouts. For the estimation of the lead thickness needed at the door, the use of transmission data for the real spectra at the door instead of the ones emitted by Ir-192 and Co-60 will reduce the lead thickness by a factor of five for Ir-192 and ten for Co-60. This will significantly lighten the door and hence simplify construction and operating requirements for all bunkers. The adaptation proposed in this study to estimate the air-kerma rate at the door depends on the complexity of the maze: it provides good results for bunkers with a maze (i.e. similar to those used for linacs for which the NCRP 151 methodology was developed) but fails for less conventional designs. For those facilities, a specific Monte Carlo study is in order for reasons of safety and cost-effectiveness.
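The door-shielding argument above follows the usual tenth-value-layer (TVL) bookkeeping of NCRP 151: the required lead thickness scales with the logarithm of the transmission factor, so a softer spectrum at the door (smaller TVL) means a proportionally thinner door. A schematic sketch of that relation (the TVL value is an input from tables or Monte Carlo, not computed here; the numbers in the test are placeholders, not values from the paper):

```python
import math

def lead_thickness_mm(transmission_goal, tvl_mm):
    """NCRP 151-style broad-beam estimate: the barrier must provide
    n = -log10(B) tenth-value layers, so thickness t = n * TVL.
    transmission_goal is the required transmission factor B; tvl_mm is
    the tenth-value layer of lead for the spectrum actually reaching
    the door (tabulated or Monte Carlo-derived)."""
    n_tvl = -math.log10(transmission_goal)
    return n_tvl * tvl_mm
```

Because thickness is linear in the TVL, a spectrum-specific TVL five times smaller than the bare-source one yields a door five times thinner for the same transmission goal, which is the kind of saving the study reports.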
|
|
|
Zhang, X., Xiao, Y. T., & Gimeno, B. (2020). Multipactor Suppression by a Resonant Static Magnetic Field on a Dielectric Surface. IEEE Trans. Electron Devices, 67(12), 5723–5728.
Abstract: In this article, we study the suppression of the multipactor phenomenon on a dielectric surface by a resonant static magnetic field. A homemade Monte Carlo algorithm is developed for multipactor simulations on a dielectric surface driven by two orthogonal radio frequency (RF) electric field components. When the static magnetic field is perpendicular to the tangential and normal RF electric fields, it is shown that if the normal electric field lags the tangential electric field by pi/2, the superposition of the normal and tangential electric fields will trigger a gyro-acceleration of the electron cloud and restrain the multipactor discharge effectively. By contrast, when the normal electric field is in advance of the tangential electric field by pi/2, the difference between the normal and tangential electric fields drives gyro-motion of the electron cloud. Consequently, two enhanced discharge zones are inevitable. The suppression effects of the resonant static magnetic field that is parallel to the tangential RF electric field or to the normal RF electric field are also presented.
|
|
|
Guadilla, V., Algora, A., Tain, J. L., Agramunt, J., Jordan, D., Monserrate, M., et al. (2017). Characterization of a cylindrical plastic beta-detector with Monte Carlo simulations of optical photons. Nucl. Instrum. Methods Phys. Res. A, 854, 134–138.
Abstract: In this work we report on the Monte Carlo study performed to understand and reproduce experimental measurements of a new plastic beta-detector with cylindrical geometry. Since energy deposition simulations differ from the experimental measurements for such a geometry, we show how the simulation of production and transport of optical photons allows one to obtain the shapes of the experimental spectra. Moreover, taking into account the computational effort associated with this kind of simulation, we develop a method to convert simulated energy deposition into collected light, depending only on the interaction point in the detector. This method represents a useful solution when extensive simulations have to be done, as in the case of the calculation of the response function of the spectrometer in a total absorption gamma-ray spectroscopy analysis.
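The conversion method described above can be pictured as a precomputed, position-dependent light-collection efficiency applied to each energy deposit. A schematic sketch (the table, the single axial coordinate, and the linear interpolation are illustrative assumptions; the paper's actual parametrization may differ):

```python
import bisect

def light_collected(e_dep, z, z_grid, eff_grid):
    """Convert deposited energy into collected light via a
    position-dependent collection efficiency, linearly interpolated
    from a precomputed table (z_grid, eff_grid) along the detector
    axis. The table would come from one expensive optical-photon
    simulation, which is the cost-saving idea."""
    if z <= z_grid[0]:                            # clamp below the table
        return e_dep * eff_grid[0]
    if z >= z_grid[-1]:                           # clamp above the table
        return e_dep * eff_grid[-1]
    i = bisect.bisect_right(z_grid, z) - 1        # bracketing interval
    frac = (z - z_grid[i]) / (z_grid[i + 1] - z_grid[i])
    eff = eff_grid[i] + frac * (eff_grid[i + 1] - eff_grid[i])
    return e_dep * eff
```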
|
|
|
Oliver, S., Vijande, J., Tejedor-Aguilar, N., Miro, R., Rovira-Escutia, J. J., Ballester, F., et al. (2023). Monte Carlo flattening filter design to high energy intraoperative electron beam homogenization. Radiat. Phys. Chem., 212, 111102–6pp.
Abstract: Intraoperative radiotherapy using mobile linear accelerators is used for a wide variety of malignancies. However, when large fields are used in combination with high energies, a deterioration of the dose profile flatness is measured with respect to smaller fields and lower energies. Indeed, for the LIAC HWL of Sordina, this deterioration is observed for the 12 MeV beam combined with the 10 cm (or larger) diameter applicator. To solve this problem, a flattening filter has been designed and validated, evaluating the feasibility of its usage at the upper part of the applicator. The design of the filter was based on Monte Carlo simulations because of their accuracy in modeling components of clinical devices. The LIAC 10 cm diameter applicator was modeled and simulated independently by two different research groups using two different MC codes, reproducing the heterogeneity of the 12 MeV energy beam. Then, an iterative process of filter design was carried out. Finally, the MC-designed conical filter with the optimal size and height to obtain the desired flattened beam was built in-house using a 3D printer. During the experimental validation of the applicator-filter combination, percentage depth dose, beam profiles, and absolute and peripheral dose measurements were performed to demonstrate the effectiveness of adding the filter to the applicator. These measurements show that the beam has been flattened, from 5.9% with the standard configuration to 1.6% for the configuration with the filter, without a significant increase of the peripheral dose. Consequently, the new filter-applicator LIAC configuration can also be used in a conventional surgery room. Reductions of 16% in the output dose and of 1.1 mm in the D50 of the percentage depth dose were measured with respect to the original configuration. This work is a proof-of-concept that demonstrates that it is possible to add a filter able to flatten the beam delivered by the Sordina LIAC HWL. Future studies will focus on more refined technical solutions fully compatible with the integrity of the applicator, including its sterilization, to be safely introduced into clinical practice.
|
|
|
Rivard, M. J., Granero, D., Perez-Calatayud, J., & Ballester, F. (2010). Influence of photon energy spectra from brachytherapy sources on Monte Carlo simulations of kerma and dose rates in water and air. Med. Phys., 37(2), 869–876.
Abstract: Methods: For Ir-192, I-125, and Pd-103, the authors considered from two to five published spectra. Spherical sources approximating common brachytherapy sources were assessed. Kerma and dose results from GEANT4, MCNP5, and PENELOPE-2008 were compared for water and air. The dosimetric influence of Ir-192, I-125, and Pd-103 spectral choice was determined. Results: For the spectra considered, there were no statistically significant differences between kerma or dose results based on Monte Carlo code choice when using the same spectrum. Water-kerma differences of about 2%, 2%, and 0.7% were observed due to spectrum choice for Ir-192, I-125, and Pd-103, respectively (independent of radial distance), when accounting for photon yield per Bq. Similar differences were observed for air-kerma rate. However, their ratio (as used in the dose-rate constant) did not significantly change when the various photon spectra were selected because the differences compensated each other when dividing dose rate by air-kerma strength. Conclusions: Given the standardization of radionuclide data available from the National Nuclear Data Center (NNDC) and the rigorous infrastructure for performing and maintaining the data set evaluations, NNDC spectra are suggested for brachytherapy simulations in medical physics applications.
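The spectrum dependence reported above enters through a simple weighted sum: collision kerma per decay accumulates yield × energy × mass energy-transfer coefficient over the spectrum lines, so two published spectra with different photon yields give different kerma per Bq even under identical transport physics. A schematic illustration (the coefficient lookup and the spectrum values in the test are placeholders for tabulated data, not real nuclide data):

```python
def air_kerma_per_decay(spectrum, mu_tr_over_rho):
    """Spectrum-weighted collision kerma per decay:
    K = sum_i y_i * E_i * (mu_tr/rho)_air(E_i).
    spectrum: iterable of (energy, photons-per-decay) pairs;
    mu_tr_over_rho: callable returning the mass energy-transfer
    coefficient of air at a given energy (tabulated in practice)."""
    return sum(y * e * mu_tr_over_rho(e) for e, y in spectrum)
```

Dividing a dose rate computed this way by an air-kerma strength computed from the same spectrum cancels most of the yield dependence, which is why the dose-rate constant in the abstract is insensitive to the spectral choice.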
|
|
|
Gimenez-Alventosa, V., Gimenez, V., & Oliver, S. (2021). PenRed: An extensible and parallel Monte-Carlo framework for radiation transport based on PENELOPE. Comput. Phys. Commun., 267, 108065–12pp.
Abstract: Monte Carlo methods provide detailed and accurate results for radiation transport simulations. Unfortunately, the high computational cost of these methods limits their usage in real-time applications. Moreover, existing computer codes do not provide a methodology for adapting these kinds of simulations to specific problems without advanced knowledge of the corresponding code system, and this restricts their applicability. To help solve these current limitations, we present PenRed, a general-purpose, standalone, extensible and modular framework code based on PENELOPE for parallel Monte Carlo simulations of electron-photon transport through matter. It has been implemented in the C++ programming language and takes advantage of modern object-oriented technologies. In addition, PenRed offers the capability to read and process DICOM images as well as to construct and simulate image-based voxelized geometries, so as to facilitate its usage in medical applications. Our framework has been successfully verified against the original PENELOPE Fortran code. Furthermore, the implemented parallelism has been tested showing a significant improvement in the simulation time without any loss in precision of results. Program summary Program title: PenRed: Parallel Engine for Radiation Energy Deposition. CPC Library link to program files: https://doi.org/10.17632/rkw6tvtngy.1 Licensing provision: GNU Affero General Public License (AGPL). Programming language: C++ standard 2011. Nature of problem: Monte Carlo simulations usually require a huge amount of computation time to achieve low statistical uncertainties. In addition, many applications necessitate particular characteristics or the extraction of specific quantities from the simulation. However, most available Monte Carlo codes do not provide an efficient parallel and truly modular structure which allows users to easily customise their code to suit their needs without an in-depth knowledge of the code system.
Solution method: PenRed is a fully parallel, modular and customizable framework for Monte Carlo simulations of the passage of radiation through matter. It is based on the PENELOPE [1] code system, from which it inherits its unique physics models and tracking algorithms for charged particles. PenRed has been coded in C++ following an object-oriented programming paradigm restricted to the C++11 standard. Our engine implements parallelism via a double approach: on the one hand, by using standard C++ threads for shared memory, improving the access and usage of the memory, and, on the other hand, via the MPI standard for distributed memory infrastructures. Notice that both kinds of parallelism can be combined in the same simulation. Moreover, both threads and MPI processes can be balanced using the built-in load-balancing system (RUPER-LB [30]) to maximise the performance on heterogeneous infrastructures. In addition, PenRed provides a modular structure with methods designed to easily extend its functionality. Thus, users can create their own independent modules to adapt our engine to their needs without changing the original modules. Furthermore, user extensions will take advantage of the built-in parallelism without any extra effort or knowledge of parallel programming. Additional comments including restrictions and unusual features: PenRed has been compiled on Linux systems with g++ of GCC versions 4.8.5, 7.3.1, 8.3.1 and 9; clang version 3.4.2 and the Intel C++ compiler (icc) version 19.0.5.281. Since it is a C++11-standard compliant code, PenRed should compile with any compiler with C++11 support. In addition, if the code is compiled without MPI support, it does not require any non-standard library. To enable MPI capabilities, the user needs to install any available MPI implementation, such as OpenMPI [24] or MPICH [25], which can be found in the repositories of any Linux distribution.
Finally, to provide DICOM processing support, PenRed can optionally be compiled using the DICOM toolkit (dcmtk) [32] library. Thus, PenRed has only two optional dependencies: an MPI implementation and the dcmtk library.
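The shared-memory half of the scheme described above (independent workers, independently seeded histories, partial tallies merged at the end) can be caricatured in a few lines of Python; this is an illustration of the idea only, with a toy path-length tally standing in for a real transport loop, not penRed's C++11 thread implementation:

```python
import concurrent.futures as cf
import math
import random

def simulate_histories(seed, n, mu=0.2):
    """Toy 'history loop': sample exponential photon path lengths for
    an attenuation coefficient mu (1/cm) and tally the mean free path.
    Each worker owns an independently seeded RNG, the same idea used to
    keep concurrently simulated histories statistically independent."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        # inverse-CDF sampling; 1 - U keeps the argument of log in (0, 1]
        total += -math.log(1.0 - rng.random()) / mu
    return total / n

def parallel_mean_free_path(n_workers=4, histories_per_worker=100_000):
    """Split the histories across workers and average the partial
    tallies, mimicking the merge step of a shared-memory run."""
    with cf.ThreadPoolExecutor(max_workers=n_workers) as pool:
        parts = list(pool.map(simulate_histories, range(n_workers),
                              [histories_per_worker] * n_workers))
    return sum(parts) / len(parts)
```

A distributed-memory layer (MPI in PenRed's case) repeats the same split-and-merge one level up, across processes instead of threads.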
|
|
|
Borja-Lloret, M., Barrientos, L., Bernabeu, J., Lacasta, C., Muñoz, E., Ros, A., et al. (2023). Influence of the background in Compton camera images for proton therapy treatment monitoring. Phys. Med. Biol., 68(14), 144001–16pp.
Abstract: Objective. Background events are one of the most relevant contributions to image degradation in Compton camera imaging for hadron therapy treatment monitoring. A study of the background and its contribution to image degradation is important to define future strategies to reduce the background in the system. Approach. In this simulation study, the percentage of different kinds of events and their contribution to the reconstructed image in a two-layer Compton camera have been evaluated. To this end, GATE v8.2 simulations of a proton beam impinging on a PMMA phantom have been carried out, for different proton beam energies and at different beam intensities. Main results. For a simulated Compton camera made of lanthanum(III) bromide monolithic crystals, coincidences caused by neutrons arriving from the phantom are the most common type of background produced by secondary radiation in the Compton camera, causing between 13% and 33% of the detected coincidences, depending on the beam energy. Results also show that random coincidences are a significant cause of image degradation at high beam intensities, and their influence on the reconstructed images is studied for values of the time coincidence window from 500 ps to 100 ns. Significance. Results indicate the timing capabilities required to retrieve the fall-off position with good precision. Still, the noise observed in the image when no randoms are considered makes us consider further background rejection methods.
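The strong dependence of the randoms contribution on the coincidence window follows from the standard accidental-coincidence estimate for two detectors with uncorrelated singles rates. A back-of-the-envelope sketch (the rates in the test are illustrative, not values from the paper):

```python
def random_coincidence_rate(rate1_hz, rate2_hz, window_s):
    """Expected accidental-coincidence rate for two detectors with
    uncorrelated singles rates r1, r2 and coincidence window tau:
    R = 2 * tau * r1 * r2."""
    return 2.0 * window_s * rate1_hz * rate2_hz
```

Since the rate is linear in the window, shrinking it from 100 ns to 500 ps cuts the expected randoms by a factor of 200, which is why the timing capability of the system matters so much at high beam intensities.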
|
|