Gimenez-Alventosa, V., Gimenez, V., & Oliver, S. (2021). PenRed: An extensible and parallel Monte-Carlo framework for radiation transport based on PENELOPE. Comput. Phys. Commun., 267, 108065–12pp.
Abstract: Monte Carlo methods provide detailed and accurate results for radiation transport simulations. Unfortunately, the high computational cost of these methods limits their use in real-time applications. Moreover, existing computer codes do not provide a methodology for adapting these kinds of simulations to specific problems without advanced knowledge of the corresponding code system, and this restricts their applicability. To help overcome these limitations, we present PenRed, a general-purpose, standalone, extensible and modular framework code based on PENELOPE for parallel Monte Carlo simulations of electron-photon transport through matter. It has been implemented in the C++ programming language and takes advantage of modern object-oriented technologies. In addition, PenRed offers the capability to read and process DICOM images as well as to construct and simulate image-based voxelized geometries, so as to facilitate its usage in medical applications. Our framework has been successfully verified against the original PENELOPE Fortran code. Furthermore, the implemented parallelism has been tested, showing a significant improvement in simulation time without any loss of precision in the results. Program summary: Program title: PenRed: Parallel Engine for Radiation Energy Deposition. CPC Library link to program files: https://doi.org/10.17632/rkw6tvtngy.1. Licensing provision: GNU Affero General Public License (AGPL). Programming language: C++ (2011 standard). Nature of problem: Monte Carlo simulations usually require a huge amount of computation time to achieve low statistical uncertainties. In addition, many applications necessitate particular characteristics or the extraction of specific quantities from the simulation. However, most available Monte Carlo codes do not provide an efficient parallel and truly modular structure that allows users to easily customise their code to suit their needs without an in-depth knowledge of the code system.
Solution method: PenRed is a fully parallel, modular and customizable framework for Monte Carlo simulations of the passage of radiation through matter. It is based on the PENELOPE [1] code system, from which it inherits its unique physics models and tracking algorithms for charged particles. PenRed has been coded in C++ following an object-oriented programming paradigm restricted to the C++11 standard. Our engine implements parallelism via a double approach: on the one hand, by using standard C++ threads for shared memory, improving the access and usage of memory, and, on the other hand, via the MPI standard for distributed-memory infrastructures. Notice that both kinds of parallelism can be combined in the same simulation. Moreover, both threads and MPI processes can be balanced using the built-in load-balancing system (RUPER-LB [30]) to maximise performance on heterogeneous infrastructures. In addition, PenRed provides a modular structure with methods designed to easily extend its functionality. Thus, users can create their own independent modules to adapt our engine to their needs without changing the original modules. Furthermore, user extensions take advantage of the built-in parallelism without any extra effort or knowledge of parallel programming. Additional comments including restrictions and unusual features: PenRed has been compiled on Linux systems with g++ from GCC versions 4.8.5, 7.3.1, 8.3.1 and 9; clang version 3.4.2; and the Intel C++ compiler (icc) version 19.0.5.281. Since it is a C++11-standard-compliant code, PenRed should compile with any compiler that supports C++11. In addition, if the code is compiled without MPI support, it does not require any non-standard library. To enable MPI capabilities, the user needs to install any available MPI implementation, such as Open MPI [24] or MPICH [25], which can be found in the repositories of any Linux distribution.
Finally, to provide DICOM processing support, PenRed can optionally be compiled with the DICOM ToolKit (DCMTK) [32] library. Thus, PenRed has only two optional dependencies: an MPI implementation and the DCMTK library.
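The history-splitting idea behind PenRed's shared-memory parallelism can be sketched in a few lines. This is an illustrative Python toy, not PenRed's C++ code: each worker runs an independent batch of histories with its own RNG seed, and the per-worker tallies are merged at the end.

```python
# Toy sketch of parallel Monte Carlo tallying (illustrative only, not PenRed):
# workers score a dummy "energy deposit" per history; independent seeds give
# independent random streams, and merged moments give the combined uncertainty.
import random
from concurrent.futures import ThreadPoolExecutor
from math import sqrt

def simulate_histories(seed, n_histories):
    """Score a toy energy deposit per history; return (sum, sum_sq, n)."""
    rng = random.Random(seed)
    s = s2 = 0.0
    for _ in range(n_histories):
        e = rng.expovariate(1.0)  # stand-in for a sampled energy deposit
        s += e
        s2 += e * e
    return s, s2, n_histories

def run_parallel(n_workers=4, histories_per_worker=10_000, base_seed=12345):
    # Each worker gets a distinct seed, mimicking independent RNG streams.
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        parts = list(pool.map(
            lambda i: simulate_histories(base_seed + i, histories_per_worker),
            range(n_workers)))
    s = sum(p[0] for p in parts)
    s2 = sum(p[1] for p in parts)
    n = sum(p[2] for p in parts)
    mean = s / n
    var = s2 / n - mean * mean  # population variance from pooled moments
    return mean, sqrt(var / n)  # mean and its standard error

mean, sem = run_parallel()
```

Merging raw sums and sums of squares (rather than per-worker means) is what lets the combined standard error be computed exactly, which is the sense in which parallelisation need not degrade the precision of the result.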
Aiola, S., Amhis, Y., Billoir, P., Jashal, B. K., Henry, L., Oyanguren, A., et al. (2021). Hybrid seeding: A standalone track reconstruction algorithm for scintillating fibre tracker at LHCb. Comput. Phys. Commun., 260, 107713–5pp.
Abstract: We describe the Hybrid seeding, a stand-alone pattern recognition algorithm aiming at finding charged particle trajectories for the LHCb upgrade. A significant improvement to the charged particle reconstruction efficiency is accomplished by exploiting the knowledge of the LHCb magnetic field and the position of energy deposits in the scintillating fibre tracker detector. Moreover, we achieve a low fake rate and a small contribution to the overall timing budget of the LHCb real-time data processing.
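As a toy illustration of seeding-style pattern recognition (not the Hybrid seeding algorithm itself; the hit layout and the chi-squared cut below are invented), one can score straight-line track candidates built from detector-layer hits and reject fake hit combinations by their fit residuals:

```python
# Toy track seeding sketch (illustrative, not the LHCb algorithm): candidates
# are lists of (z, x) hits; a least-squares line fit scores each candidate and
# a residual cut separates genuine tracks from random hit combinations.
def fit_line(hits):
    """Least-squares fit of x = a + b*z; returns (a, b, chi2-like residual)."""
    n = len(hits)
    sz = sum(z for z, x in hits); sx = sum(x for z, x in hits)
    szz = sum(z * z for z, x in hits); szx = sum(z * x for z, x in hits)
    d = n * szz - sz * sz
    b = (n * szx - sz * sx) / d
    a = (sx - b * sz) / n
    chi2 = sum((x - (a + b * z)) ** 2 for z, x in hits)
    return a, b, chi2

def select_candidates(candidates, chi2_cut=0.1):
    # Keep only hit combinations whose residual sum passes the quality cut.
    return [c for c in candidates if fit_line(c)[2] < chi2_cut]

true_track = [(0.0, 1.0), (1.0, 1.5), (2.0, 2.0)]  # lies on x = 1 + 0.5*z
fake = [(0.0, 1.0), (1.0, 3.0), (2.0, 0.5)]        # random combination
kept = select_candidates([true_track, fake])
```

The real algorithm additionally exploits the magnetic-field model to predict hit positions between layers, which is what drives both the efficiency gain and the low fake rate quoted in the abstract.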
KM3NeT Collaboration (Aiello, S., et al.), Calvo, D., Coleiro, A., Colomer, M., Gozzini, S. R., Hernandez-Rey, J. J., et al. (2020). The Control Unit of the KM3NeT Data Acquisition System. Comput. Phys. Commun., 256, 107433–16pp.
Abstract: The KM3NeT Collaboration runs a multi-site neutrino observatory in the Mediterranean Sea. Water Cherenkov particle detectors, deep in the sea and far off the coasts of France and Italy, are already taking data while incremental construction progresses. The Data Acquisition Control software operates the off-shore detectors as well as the testing and qualification stations for their components. The software, named Control Unit, is highly modular. It can undergo upgrades and reconfiguration with the acquisition running. Interplay with the central database of the Collaboration is handled in a way that allows for data taking even if Internet links fail. In order to simplify the management of computing resources in the long term, and to cope with possible hardware failures of one or more computers, the KM3NeT Control Unit software features a custom dynamic resource provisioning and failover technology, which is especially important for ensuring continuity in case of rare transient events in multi-messenger astronomy. The software architecture relies on ubiquitous tools and broadly adopted technologies and has been successfully tested on several operating systems.
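The failover behaviour described above can be sketched with a minimal heartbeat monitor. All names and thresholds below are invented for illustration and are not the Control Unit's actual design:

```python
# Minimal heartbeat-based failover sketch (invented names, not KM3NeT code):
# a standby host takes over a role when the active host misses too many
# consecutive heartbeats.
class FailoverMonitor:
    def __init__(self, hosts, max_missed=3):
        self.hosts = list(hosts)            # priority order: first is active
        self.max_missed = max_missed
        self.missed = {h: 0 for h in hosts}

    def active(self):
        return self.hosts[0]

    def heartbeat(self, host, ok):
        # A missed heartbeat increments the counter; a good one resets it.
        self.missed[host] = 0 if ok else self.missed[host] + 1
        if self.missed[self.active()] >= self.max_missed:
            # Demote the failed active host; the next host takes over.
            failed = self.hosts.pop(0)
            self.hosts.append(failed)
            self.missed[failed] = 0

mon = FailoverMonitor(["cu-main", "cu-backup"])
for _ in range(3):
    mon.heartbeat("cu-main", ok=False)
# After three missed heartbeats, "cu-backup" becomes the active host.
```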
Gonzalez-Iglesias, D., Gimeno, B., Esperante, D., Martinez-Reviriego, P., Martin-Luna, P., Fuster-Martinez, N., et al. (2024). Non-resonant ultra-fast multipactor regime in dielectric-assist accelerating structures. Results Phys., 56, 107245–12pp.
Abstract: The objective of this work is the evaluation of the risk of suffering a multipactor discharge in an S-band dielectric-assist accelerating (DAA) structure for a compact low-energy linear particle accelerator dedicated to hadrontherapy treatments. A DAA structure consists of ultra-low loss dielectric cylinders and disks with irises which are periodically arranged in a metallic enclosure, with the advantage of having an extremely high quality factor and very high shunt impedance at room temperature, and it is therefore proposed as a potential alternative to conventional disk-loaded copper structures. However, it has been observed that these structures suffer from multipactor discharges. In fact, multipactor is one of the main problems of these devices, as it limits the maximum accelerating gradient. Because of this, the analysis of multipactor risk in the early design steps of DAA cavities is crucial to ensure the correct performance of the device after fabrication. In this paper, we present a comprehensive and detailed study of multipactor in our DAA design through numerical simulations performed with an in-house developed code based on the Monte-Carlo method. The phenomenology of the multipactor (resonant electron trajectories, electron flight time between impacts, etc.) is described in detail for different values of the accelerating gradient. It has been found that in these structures an ultra-fast non-resonant multipactor appears, which is different from the types of multipactor theoretically studied in the scientific literature. In addition, the effect of several low electron emission coatings on the multipactor threshold is investigated. Furthermore, a novel design based on the modification of the DAA cell geometry for multipactor mitigation is introduced, which shows a significant increase in the accelerating gradient handling capabilities of our prototype.
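The onset condition behind a multipactor discharge, namely geometric growth of the electron population when the effective secondary emission yield (SEY) per wall impact exceeds unity, can be illustrated with a deliberately crude sketch. The numbers below are arbitrary; a real simulation, like the one in the paper, tracks electron trajectories and impact energies.

```python
# Crude multipactor-onset sketch (illustrative numbers, not the paper's code):
# each wall impact multiplies the electron count by the average SEY, so the
# population grows geometrically when SEY > 1 and dies out when SEY < 1.
def population_after(n_impacts, sey, n0=1.0):
    """Electron count after n_impacts wall collisions with average yield sey."""
    return n0 * sey ** n_impacts

def discharge_risk(sey, n_impacts=50, threshold=1e3):
    # A low-SEY coating (sey < 1) suppresses the avalanche entirely,
    # which is why such coatings raise the multipactor threshold.
    return population_after(n_impacts, sey) > threshold

risky = discharge_risk(1.3)   # growing population: avalanche
safe = discharge_risk(0.8)    # decaying population: no discharge
```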
Valdes-Cortez, C., Mansour, I., Rivard, M. J., Ballester, F., Mainegra-Hing, E., Thomson, R. M., et al. (2021). A study of Type B uncertainties associated with the photoelectric effect in low-energy Monte Carlo simulations. Phys. Med. Biol., 66(10), 105014–14pp.
Abstract: Purpose. To estimate Type B uncertainties in absorbed-dose calculations arising from the different implementations of low-energy photon cross-sections (<200 keV) in current state-of-the-art Monte Carlo (MC) codes. Methods. MC simulations are carried out using three codes widely used in the low-energy domain: PENELOPE-2018, EGSnrc, and MCNP. Three dosimetry-relevant quantities are considered: mass energy-absorption coefficients for water, air, graphite, and their respective ratios; absorbed dose; and photon-fluence spectra. The absorbed dose and the photon-fluence spectra are scored in a spherical water phantom of 15 cm radius. Benchmark simulations using similar cross-sections have been performed. The differences observed between these quantities when different cross-sections are considered are taken to be a good estimator for the corresponding Type B uncertainties. Results. A conservative Type B uncertainty for the absorbed dose (k = 2) of 1.2%-1.7% (<50 keV), 0.6%-1.2% (50-100 keV), and 0.3% (100-200 keV) is estimated. The photon-fluence spectrum does not present clinically relevant differences that merit considering additional Type B uncertainties, except for energies below 25 keV, where a Type B uncertainty of 0.5% is obtained. Below 30 keV, mass energy-absorption coefficients show Type B uncertainties (k = 2) of about 1.5% (water and air) and 2% (graphite), diminishing in all materials at larger energies and reaching values of about 1% (40-50 keV) and 0.5% (50-75 keV). With respect to their ratios, the only significant Type B uncertainties are observed for the water-to-graphite ratio at energies below 30 keV, being about 0.7% (k = 2). Conclusions. In contrast with the intermediate (about 500 keV) or high (about 1 MeV) energy domains, Type B uncertainties due to the different cross-section implementations cannot be considered subdominant with respect to Type A uncertainties, or even to other sources of Type B uncertainties (tally volume averaging, manufacturing tolerances, etc.). Therefore, the values reported here should be accommodated within the uncertainty budget in low-energy photon dosimetry studies.
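A minimal worked example of how such a cross-section Type B term enters a quadrature uncertainty budget. The 1.7% figure is the paper's conservative below-50-keV value (k = 2); the other component values are invented for illustration.

```python
# Quadrature combination of uncertainty components, all at the same coverage
# factor k. Only the 1.7% cross-section term comes from the paper; the Type A
# term (0.5%) and the other Type B terms (0.8%, 0.6%) are invented examples.
from math import sqrt

def combined_uncertainty(type_a, type_b_components):
    """Combine independent uncertainty components in quadrature (in %)."""
    return sqrt(type_a ** 2 + sum(u ** 2 for u in type_b_components))

u = combined_uncertainty(0.5, [1.7, 0.8, 0.6])
# The 1.7% cross-section term dominates the budget, illustrating the paper's
# point that it cannot be treated as subdominant at low energies.
```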
Albaladejo, M., Bibrzycki, L., Dawid, S. M., Fernandez-Ramirez, C., Gonzalez-Solis, S., Hiller Blin, A. N., et al. (2022). Novel approaches in hadron spectroscopy. Prog. Part. Nucl. Phys., 127, 103981–75pp.
Abstract: The last two decades have witnessed the discovery of a myriad of new and unexpected hadrons. The future holds more surprises for us, thanks to new-generation experiments. Understanding the signals and determining the properties of the states requires a parallel theoretical effort. To make full use of available and forthcoming data, careful amplitude modeling is required, together with a sound treatment of the statistical uncertainties and a systematic survey of the model dependencies. We review the contributions made by the Joint Physics Analysis Center to the field of hadron spectroscopy.
Davesne, D., Pastore, A., & Navarro, J. (2021). Linear response theory with finite-range interactions. Prog. Part. Nucl. Phys., 120, 103870–55pp.
Abstract: This review focuses on the calculation of infinite nuclear matter response functions using phenomenological finite-range interactions, with or without tensor terms. These include the Gogny and Nakada families, which are commonly used in the literature. Because of the finite range, the main technical difficulty stems from the exchange terms of the particle-hole interaction. We first present results based on the so-called Landau and Landau-like approximations of the particle-hole interaction. Then, we review two methods which in principle provide numerically exact response functions. The first is based on a multipolar expansion of both the particle-hole interaction and the particle-hole propagator; the second consists of a continued-fraction expansion of the response function. The numerical precision can be pushed to any degree of accuracy, but it is shown that two or three terms suffice to obtain converged results. Finally, we apply the formalism to the determination of possible finite-size instabilities induced by a finite-range interaction.
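The RPA-type resummation at the heart of such response-function calculations can be written schematically as below. This is the textbook single-channel form in the Landau approximation, with notation assumed for illustration; it is not the paper's full finite-range result, whose exchange terms are precisely what require the multipolar or continued-fraction expansions.

```latex
% Schematic RPA response for one particle-hole channel: the free response
% \chi_0 is dressed by the residual particle-hole interaction V_{ph}.
\chi(q,\omega) \;=\; \frac{\chi_0(q,\omega)}{1 - V_{ph}(q)\,\chi_0(q,\omega)}
```

A finite-size instability then corresponds to a pole of this response at zero energy for some finite momentum transfer q, i.e. a zero of the denominator at omega = 0.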
Carrasco-Ribelles, L. A., Pardo-Mas, J. R., Tortajada, S., Saez, C., Valdivieso, B., & Garcia-Gomez, J. M. (2021). Predicting morbidity by local similarities in multi-scale patient trajectories. J. Biomed. Inform., 120, 103837–9pp.
Abstract: Patient Trajectories (PTs) are a method of representing the temporal evolution of patients. They can include information from different sources and be used in socio-medical or clinical domains. PTs have generally been used to generate and study the most common trajectories in, for instance, the development of a disease. On the other hand, healthcare predictive models generally rely on static snapshots of patient information. Only a few works on prediction in healthcare have been found that use PTs, and therefore benefit from their temporal dimension. All of them, however, have used PTs created from single-source information. Therefore, the use of longitudinal multi-scale data to build PTs and use them to obtain predictions about health conditions is yet to be explored. Our hypothesis is that local similarities between small chunks of PTs can identify similar patients with respect to their future morbidities. The objectives of this work are (1) to develop a methodology to identify local similarities between PTs before the occurrence of morbidities, in order to predict them for new query individuals; and (2) to validate this methodology on risk prediction of cardiovascular disease (CVD) occurrence in patients with diabetes. We have proposed a novel formal definition of PTs based on sequences of longitudinal multi-scale data. Moreover, a dynamic programming methodology to identify local alignments on PTs for predicting future morbidities is proposed. Both the proposed methodology for PT definition and the alignment algorithm are generic and can be applied in any clinical domain. We validated this solution for predicting CVD in patients with diabetes and achieved a precision of 0.33, a recall of 0.72 and a specificity of 0.38. Therefore, the proposed solution in the diabetes use case can be of utmost utility for secondary screening.
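The local-similarity idea can be sketched with a Smith-Waterman-style local alignment over event sequences. The scoring parameters and event codes below are invented for illustration; the paper's actual algorithm aligns multi-scale longitudinal data rather than plain symbol matches.

```python
# Smith-Waterman-style local alignment sketch (illustrative parameters and
# event codes, not the authors' algorithm): the best-scoring local match
# between two trajectories measures their strongest shared sub-pattern.
def local_alignment_score(a, b, match=2, mismatch=-1, gap=-1):
    """Best local-alignment score between event sequences a and b."""
    rows, cols = len(a) + 1, len(b) + 1
    H = [[0] * cols for _ in range(rows)]  # DP matrix, floored at zero
    best = 0
    for i in range(1, rows):
        for j in range(1, cols):
            s = match if a[i - 1] == b[j - 1] else mismatch
            H[i][j] = max(0,
                          H[i - 1][j - 1] + s,   # align a[i-1] with b[j-1]
                          H[i - 1][j] + gap,     # gap in b
                          H[i][j - 1] + gap)     # gap in a
            best = max(best, H[i][j])
    return best

# The shared run (hba1c_high, statin, visit) scores 3 matches * 2 = 6.
query = ["visit", "hba1c_high", "statin", "visit"]
ref = ["hba1c_high", "statin", "visit", "cvd_event"]
score = local_alignment_score(query, ref)
```

Flooring the matrix at zero is what makes the alignment local: similarity is rewarded only over the best-matching chunk, so two patients can be deemed similar even if their full trajectories diverge elsewhere.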
Weber, M., et al., & Esperante, D. (2024). DONES EVO: Risk mitigation for the IFMIF-DONES facility. Nucl. Mater. Energy, 38, 101622–5pp.
Abstract: The International Fusion Materials Irradiation Facility - DEMO Oriented Neutron Source (IFMIF-DONES) is a scientific infrastructure aimed at providing an intense neutron source for the qualification of materials to be used in future fusion power reactors. Its implementation is critical for the construction of the fusion DEMOnstration Power Plant (DEMO). IFMIF-DONES is a unique facility requiring a broad set of technologies. Although most of the necessary technologies have already been validated, there are still some aspects that introduce risks in the evolution of the project. In order to mitigate these risks, a consortium of companies, with the support of research centres and the funding of the CDTI (Centre for the Development of Industrial Technology and Innovation), has launched the DONES EVO Programme, which comprises six lines of research:
• Improvement of signal transmission and integrity (planning and integration risks)
• Optimisation of RF conditioning processes (planning and reliability risks)
• Development of a reliable beam extraction device (reliability risks)
• Development of technologies for the production of medical isotopes (reliability risks)
• Improvement of critical parts of the lithium purification system (safety and reliability risks)
• Validation of the manufacture of critical components with special materials (reliability risks)
DONES EVO will focus on developing the appropriate response to the risks identified in the IFMIF-DONES project through research and prototyping around the associated technologies.
LHCb Collaboration (Aaij, R., et al.), Jashal, B. K., Martinez-Vidal, F., Oyanguren, A., Remon Alepuz, C., & Ruiz Vidal, J. (2021). Search for the doubly charmed baryon Omega_cc^+. Sci. China-Phys. Mech. Astron., 64(10), 101062–12pp.
Abstract: A search for the doubly charmed baryon Omega_cc^+ with the decay mode Omega_cc^+ -> Xi_c^+ K^- pi^+ is performed using proton-proton collision data at a centre-of-mass energy of 13 TeV collected by the LHCb experiment from 2016 to 2018, corresponding to an integrated luminosity of 5.4 fb^-1. No significant signal is observed within the invariant mass range of 3.6 to 4.0 GeV/c^2. Upper limits are set on the ratio R of the production cross-section times the total branching fraction of the Omega_cc^+ -> Xi_c^+ K^- pi^+ decay with respect to the Xi_cc^++ -> Lambda_c^+ K^- pi^+ pi^+ decay. Upper limits at 95% credibility level for R in the range 0.005 to 0.11 are obtained for different hypotheses on the Omega_cc^+ mass and lifetime, in the rapidity range from 2.0 to 4.5 and the transverse momentum range from 4 to 15 GeV/c.