Balibrea-Correa, J., Lerendegui-Marco, J., Babiano-Suarez, V., Caballero, L., Calvo, D., Ladarescu, I., et al. (2021). Machine Learning aided 3D-position reconstruction in large LaCl3 crystals. Nucl. Instrum. Methods Phys. Res. A, 1001, 165249–17pp.
Abstract: We investigate five different models to reconstruct the 3D gamma-ray hit coordinates in five large LaCl3(Ce) monolithic crystals optically coupled to pixelated silicon photomultipliers. These scintillators have a base surface of 50 × 50 mm² and five different thicknesses, from 10 mm to 30 mm. Four of these models are analytical prescriptions and one is based on a Convolutional Neural Network. Average resolutions close to 1–2 mm FWHM are obtained in the transverse crystal plane for crystal thicknesses between 10 mm and 20 mm using the analytical models. For thicker crystals, average resolutions of about 3–5 mm FWHM are obtained. Depth-of-interaction resolutions between 1 mm and 4 mm are achieved, depending on the distance of the interaction point to the photosensor surface. We propose a Machine Learning algorithm to correct for linearity distortions and pin-cushion effects. The latter allows one to keep a large field of view of about 70%–80% of the crystal surface, regardless of crystal thickness. This work is aimed at optimizing the performance of the so-called Total Energy Detector with Compton imaging capability (i-TED) for time-of-flight neutron-capture cross-section measurements.
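As an illustration of the kind of analytical prescription mentioned in the abstract, the classic centre-of-gravity (Anger logic) estimator for the transverse hit position can be sketched as follows. The 8 × 8 pixel array, the 6 mm pitch and the plain unweighted centroid are assumptions made for the example, not details taken from the paper.

```python
import numpy as np

def centroid_position(charge_map, pitch_mm=6.0):
    """Estimate the transverse (x, y) hit position from an SiPM pixel
    charge map with a centre-of-gravity (Anger logic) rule.

    Illustrative sketch of one classic analytical prescription; the
    array size and pixel pitch are hypothetical, not the paper's.
    """
    q = np.asarray(charge_map, dtype=float)
    ny, nx = q.shape
    # Pixel-centre coordinates, with the origin at the crystal centre.
    xs = (np.arange(nx) - (nx - 1) / 2.0) * pitch_mm
    ys = (np.arange(ny) - (ny - 1) / 2.0) * pitch_mm
    total = q.sum()
    # Charge-weighted mean of the pixel coordinates along each axis.
    x = (q.sum(axis=0) * xs).sum() / total
    y = (q.sum(axis=1) * ys).sum() / total
    return x, y
```

A uniform charge map yields the crystal centre (0, 0), while charge concentrated in a corner pixel yields that pixel's coordinates; the pin-cushion distortion corrected in the paper arises precisely because this simple centroid compresses positions near the crystal edges.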
Villanueva-Domingo, P., Mena, O., & Palomares-Ruiz, S. (2021). A Brief Review on Primordial Black Holes as Dark Matter. Front. Astron. Space Sci., 8, 681084–10pp.
Abstract: Primordial black holes (PBHs) are a natural candidate for one of the components of the dark matter (DM) in the Universe. In this review, we discuss the basics of their formation, abundance and signatures. Some of their characteristic signals are examined, such as the emission of particles due to Hawking evaporation and the accretion of surrounding matter, effects which could leave an imprint on the evolution of the Universe and the formation of structures. The most relevant probes capable of constraining their masses and population are discussed.
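The Hawking-evaporation signature mentioned in the abstract is controlled by two textbook formulas: the Hawking temperature T = ħc³/(8πGMk_B) and the photons-only evaporation time τ = 5120πG²M³/(ħc⁴). The sketch below simply evaluates them; the constants and formulas are standard results, not taken from the review, and real evaporation is faster once additional particle species are emitted.

```python
import math

# CODATA values of the fundamental constants, SI units.
HBAR = 1.054571817e-34  # J s
C = 2.99792458e8        # m / s
G = 6.67430e-11         # m^3 / (kg s^2)
K_B = 1.380649e-23      # J / K

def hawking_temperature(mass_kg):
    """Hawking temperature T = hbar c^3 / (8 pi G M k_B), in kelvin."""
    return HBAR * C**3 / (8.0 * math.pi * G * mass_kg * K_B)

def evaporation_time(mass_kg):
    """Photons-only evaporation time t = 5120 pi G^2 M^3 / (hbar c^4),
    in seconds. Emission of other particle species shortens this, so
    it is an upper-bound sketch rather than a precise lifetime."""
    return 5120.0 * math.pi * G**2 * mass_kg**3 / (HBAR * C**4)
```

The inverse scaling T ∝ 1/M and the steep τ ∝ M³ are what make light PBHs evaporate early while heavier ones survive to the present day, which is why evaporation constrains only the low-mass end of the PBH window.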
Gimenez-Alventosa, V., Gimenez, V., & Oliver, S. (2021). PenRed: An extensible and parallel Monte-Carlo framework for radiation transport based on PENELOPE. Comput. Phys. Commun., 267, 108065–12pp.
Abstract: Monte Carlo methods provide detailed and accurate results for radiation transport simulations. Unfortunately, the high computational cost of these methods limits their usage in real-time applications. Moreover, existing computer codes do not provide a methodology for adapting these kinds of simulations to specific problems without advanced knowledge of the corresponding code system, and this restricts their applicability. To help overcome these limitations, we present PenRed, a general-purpose, standalone, extensible and modular framework code based on PENELOPE for parallel Monte Carlo simulations of electron-photon transport through matter. It has been implemented in the C++ programming language and takes advantage of modern object-oriented technologies. In addition, PenRed offers the capability to read and process DICOM images as well as to construct and simulate image-based voxelized geometries, so as to facilitate its usage in medical applications. Our framework has been successfully verified against the original PENELOPE Fortran code. Furthermore, the implemented parallelism has been tested, showing a significant improvement in simulation time without any loss of precision in the results.
Program summary
Program title: PenRed: Parallel Engine for Radiation Energy Deposition.
CPC Library link to program files: https://doi.org/10.17632/rkw6tvtngy.1
Licensing provisions: GNU Affero General Public License (AGPL).
Programming language: C++ (2011 standard).
Nature of problem: Monte Carlo simulations usually require a huge amount of computation time to achieve low statistical uncertainties. In addition, many applications necessitate particular characteristics or the extraction of specific quantities from the simulation. However, most available Monte Carlo codes do not provide an efficient parallel and truly modular structure that allows users to easily customise the code to suit their needs without an in-depth knowledge of the code system.
Solution method: PenRed is a fully parallel, modular and customizable framework for Monte Carlo simulations of the passage of radiation through matter. It is based on the PENELOPE [1] code system, from which it inherits its unique physics models and tracking algorithms for charged particles. PenRed has been coded in C++ following an object-oriented programming paradigm restricted to the C++11 standard. Our engine implements parallelism via a double approach: on the one hand, standard C++ threads for shared memory, improving memory access and usage; on the other hand, the MPI standard for distributed-memory infrastructures. Note that both kinds of parallelism can be combined in the same simulation. Moreover, both threads and MPI processes can be balanced using the built-in load-balancing system (RUPER-LB [30]) to maximise performance on heterogeneous infrastructures. In addition, PenRed provides a modular structure with methods designed to easily extend its functionality. Thus, users can create their own independent modules to adapt our engine to their needs without changing the original modules. Furthermore, user extensions take advantage of the built-in parallelism without any extra effort or knowledge of parallel programming.
Additional comments including restrictions and unusual features: PenRed has been compiled on Linux systems with g++ (GCC versions 4.8.5, 7.3.1, 8.3.1 and 9), clang version 3.4.2 and the Intel C++ compiler (icc) version 19.0.5.281. Since it is a C++11-standard-compliant code, PenRed should compile with any compiler that supports C++11. In addition, if the code is compiled without MPI support, it does not require any non-standard library. To enable MPI capabilities, the user needs to install any available MPI implementation, such as OpenMPI [24] or MPICH [25], which can be found in the repositories of any Linux distribution.
Finally, to provide DICOM processing support, PenRed can optionally be compiled with the DICOM toolkit (dcmtk) [32] library. Thus, PenRed has only two optional dependencies: an MPI implementation and the dcmtk library.
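The shared-memory side of the "double approach" to parallelism described above, splitting Monte Carlo histories into chunks with independent random-number streams per worker, can be illustrated on a toy slab-transmission problem. This is a generic sketch of the idea, not PenRed's actual API; the photon model here is pure exponential attenuation, so the transmitted fraction should converge to exp(-mu * thickness).

```python
import concurrent.futures
import numpy as np

def transmitted_count(seed, n, mu, thickness):
    """Count photons that cross a slab without interacting.

    Each worker owns an independent RNG stream, as in thread-parallel
    MC engines: histories are split into chunks with separate seeds,
    so the result is reproducible regardless of thread scheduling.
    """
    rng = np.random.default_rng(seed)
    # Sample free paths from the exponential attenuation law.
    free_paths = rng.exponential(scale=1.0 / mu, size=n)
    return int((free_paths > thickness).sum())

def parallel_transmission(n_total=200_000, n_workers=4, mu=1.0, thickness=1.0):
    """Estimate the uncollided transmission fraction exp(-mu * thickness)
    by splitting histories across a thread pool."""
    chunk = n_total // n_workers
    with concurrent.futures.ThreadPoolExecutor(max_workers=n_workers) as ex:
        counts = ex.map(transmitted_count, range(n_workers),
                        [chunk] * n_workers, [mu] * n_workers,
                        [thickness] * n_workers)
        return sum(counts) / (chunk * n_workers)
```

Because each chunk carries its own seed, partial results can simply be summed, which is also what makes the scheme extend naturally to a second, distributed-memory level (one seed range per MPI rank) as PenRed does.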
Carrasco-Ribelles, L. A., Pardo-Mas, J. R., Tortajada, S., Saez, C., Valdivieso, B., & Garcia-Gomez, J. M. (2021). Predicting morbidity by local similarities in multi-scale patient trajectories. J. Biomed. Inform., 120, 103837–9pp.
Abstract: Patient Trajectories (PTs) are a method of representing the temporal evolution of patients. They can include information from different sources and be used in socio-medical or clinical domains. PTs have generally been used to generate and study the most common trajectories in, for instance, the development of a disease. On the other hand, healthcare predictive models generally rely on static snapshots of patient information. Only a few works on prediction in healthcare have been found that use PTs and therefore benefit from their temporal dimension. All of them, however, have used PTs created from single-source information. Therefore, the use of longitudinal multi-scale data to build PTs and to obtain predictions about health conditions from them is yet to be explored. Our hypothesis is that local similarities in small chunks of PTs can identify patients who are similar with respect to their future morbidities. The objectives of this work are (1) to develop a methodology to identify local similarities between PTs before the occurrence of morbidities in order to predict them for new query individuals; and (2) to validate this methodology on risk prediction of cardiovascular disease (CVD) occurrence in patients with diabetes. We propose a novel formal definition of PTs based on sequences of longitudinal multi-scale data, together with a dynamic-programming methodology to identify local alignments on PTs for predicting future morbidities. Both the proposed PT definition and the alignment algorithm are generic and can be applied in any clinical domain. We validated this solution for predicting CVD in patients with diabetes and achieved a precision of 0.33, a recall of 0.72 and a specificity of 0.38. The proposed solution can therefore be of utmost utility for secondary screening in the diabetes use case.
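The dynamic-programming local-alignment idea can be sketched with the classic Smith-Waterman recursion, which finds the best-scoring pair of similar subsequences between two sequences. The scoring values and the toy string inputs below are illustrative stand-ins for the paper's multi-scale PT event sequences and its actual similarity function.

```python
def local_alignment_score(a, b, match=2, mismatch=-1, gap=-1):
    """Smith-Waterman local-alignment score between two event sequences.

    Illustrates the dynamic-programming recursion behind local PT
    alignment; the scoring parameters are hypothetical, not the paper's.
    """
    n, m = len(a), len(b)
    # H[i][j]: best score of a local alignment ending at a[i-1], b[j-1].
    H = [[0] * (m + 1) for _ in range(n + 1)]
    best = 0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            s = match if a[i - 1] == b[j - 1] else mismatch
            # The floor at 0 is what makes the alignment *local*:
            # a bad prefix is dropped instead of penalising the rest.
            H[i][j] = max(0,
                          H[i - 1][j - 1] + s,   # align the two events
                          H[i - 1][j] + gap,     # gap in sequence b
                          H[i][j - 1] + gap)     # gap in sequence a
            best = max(best, H[i][j])
    return best
```

For prediction, such a score between the query patient's recent trajectory chunk and the pre-morbidity chunks of reference patients would rank the most similar references, whose observed outcomes then inform the risk estimate.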
Coloma, P., Lopez-Pavon, J., Rosauro-Alcaraz, S., & Urrea, S. (2021). New physics from oscillations at the DUNE near detector, and the role of systematic uncertainties. J. High Energy Phys., 08(8), 065–33pp.
Abstract: We study the capabilities of the DUNE near detector to probe deviations from unitarity of the leptonic mixing matrix, the 3+1 sterile-neutrino formalism and Non-Standard Interactions affecting neutrino production and detection. We clarify the relation and possible mappings among the three formalisms at short-baseline experiments, and we add to current analyses in the literature the study of the νμ → ντ appearance channel. We study in detail the impact of spectral uncertainties on the sensitivity to new physics at the DUNE near detector, an effect which has been widely overlooked in the literature. Our analysis shows that it plays an important role in the results and, in particular, that it can lead to a strong reduction in the sensitivity to sterile neutrinos from νμ → νe transitions, by more than two orders of magnitude. This stresses the importance of a joint experimental and theoretical effort to improve our understanding of neutrino-nucleus cross sections, as well as of hadron-production uncertainties and beam-focusing effects. Nevertheless, even with our conservative and more realistic implementation of systematic uncertainties, we find that an improvement over current bounds in the new-physics frameworks considered is generally expected if spectral uncertainties are below the 5% level.
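In the two-flavour short-baseline limit, the 3+1 sterile-neutrino appearance channels studied here are governed by the standard probability P(νμ → νe) = sin²(2θμe) sin²(1.27 Δm²₄₁ L/E), with Δm² in eV², L in km and E in GeV. The sketch below evaluates this textbook formula; the parameter values used in the usage note are illustrative, not the paper's fit results.

```python
import math

def appearance_probability(sin2_2theta, dm2_ev2, L_km, E_GeV):
    """Two-flavour short-baseline appearance probability
    P = sin^2(2*theta) * sin^2(1.27 * dm2[eV^2] * L[km] / E[GeV]).

    Generic 3+1 approximation; mixing angle and mass splitting
    are free parameters to be constrained by data.
    """
    return sin2_2theta * math.sin(1.27 * dm2_ev2 * L_km / E_GeV) ** 2
```

At a near detector the baseline L is fixed, so the oscillatory sin² factor imprints a characteristic L/E shape on the event spectrum; this is exactly why percent-level spectral uncertainties, which can mimic or hide that shape, degrade the sterile-neutrino sensitivity so strongly.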