Balibrea-Correa, J., Lerendegui-Marco, J., Babiano-Suarez, V., Caballero, L., Calvo, D., Ladarescu, I., et al. (2021). Machine Learning aided 3D-position reconstruction in large LaCl3 crystals. Nucl. Instrum. Methods Phys. Res. A, 1001, 165249–17pp.
Abstract: We investigate five different models to reconstruct the 3D gamma-ray hit coordinates in five large LaCl3(Ce) monolithic crystals optically coupled to pixelated silicon photomultipliers. These scintillators have a base surface of 50 × 50 mm² and five different thicknesses, from 10 mm to 30 mm. Four of these models are analytical prescriptions and one is based on a Convolutional Neural Network. Average resolutions close to 1–2 mm FWHM are obtained in the transverse crystal plane for crystal thicknesses between 10 mm and 20 mm using analytical models. For thicker crystals, average resolutions of about 3–5 mm FWHM are obtained. Depth-of-interaction resolutions between 1 mm and 4 mm are achieved, depending on the distance of the interaction point to the photosensor surface. We propose a Machine Learning algorithm to correct for linearity distortions and pin-cushion effects. The latter allows one to keep a large field of view of about 70%–80% of the crystal surface, regardless of crystal thickness. This work is aimed at optimizing the performance of the so-called Total Energy Detector with Compton imaging capability (i-TED) for time-of-flight neutron capture cross-section measurements.
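As an illustration of the class of analytical position-reconstruction models mentioned in this abstract, a minimal charge-weighted centroid (Anger-logic) estimator can be sketched in Python; the 8×8 pixel layout and 6.25 mm pitch are assumptions for illustration, not values taken from the paper:

```python
import numpy as np

def centroid_position(charges, pitch=6.25):
    """Estimate the transverse (x, y) hit position from a square SiPM
    charge map via a charge-weighted centroid (Anger logic).
    `pitch` is a hypothetical pixel pitch in mm."""
    charges = np.asarray(charges, dtype=float)
    n = charges.shape[0]
    # Pixel-centre coordinates, centred on the crystal axis.
    coords = (np.arange(n) - (n - 1) / 2.0) * pitch
    total = charges.sum()
    x = (charges.sum(axis=0) * coords).sum() / total  # sum rows -> x profile
    y = (charges.sum(axis=1) * coords).sum() / total  # sum columns -> y profile
    return x, y

# Synthetic Gaussian light spot centred 5 mm off-axis in x on an 8x8 map.
xx, yy = np.meshgrid(*(2 * [(np.arange(8) - 3.5) * 6.25]))
spot = np.exp(-((xx - 5.0) ** 2 + yy ** 2) / (2 * 8.0 ** 2))
x_hat, y_hat = centroid_position(spot)
```

The truncation of the light distribution at the crystal edge biases the centroid toward the centre, which is the kind of pin-cushion distortion the paper corrects with a Machine Learning step.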
|
Villanueva-Domingo, P., Mena, O., & Palomares-Ruiz, S. (2021). A Brief Review on Primordial Black Holes as Dark Matter. Front. Astron. Space Sci., 8, 681084–10pp.
Abstract: Primordial black holes (PBHs) represent a natural candidate for one of the components of the dark matter (DM) in the Universe. In this review, we shall discuss the basics of their formation, abundance and signatures. Some of their characteristic signals are examined, such as the emission of particles due to Hawking evaporation and the accretion of the surrounding matter, effects which could leave an impact in the evolution of the Universe and the formation of structures. The most relevant probes capable of constraining their masses and population are discussed.
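The Hawking-evaporation signatures mentioned in this review follow from two standard leading-order formulas, sketched here; the lifetime estimate ignores greybody factors and the particle content of the emission:

```python
import math

HBAR = 1.054571817e-34   # J s
C = 2.99792458e8         # m/s
G = 6.67430e-11          # m^3 kg^-1 s^-2
KB = 1.380649e-23        # J/K

def hawking_temperature(mass_kg):
    """Hawking temperature T = hbar c^3 / (8 pi G M k_B), in kelvin."""
    return HBAR * C**3 / (8 * math.pi * G * mass_kg * KB)

def evaporation_time(mass_kg):
    """Leading-order evaporation lifetime t ~ 5120 pi G^2 M^3 / (hbar c^4),
    in seconds; greybody factors and emitted species are neglected."""
    return 5120 * math.pi * G**2 * mass_kg**3 / (HBAR * C**4)

# A ~1e12 kg PBH is hot (~1e11 K) yet survives far longer than the
# age of the Universe (~4.4e17 s), so it would still be evaporating today.
T = hawking_temperature(1e12)
t = evaporation_time(1e12)
```

Note the steep M³ scaling of the lifetime: doubling the mass lengthens the evaporation time by a factor of eight, which is why constraints from evaporation apply only to the lightest PBH mass window.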
|
Gimenez-Alventosa, V., Gimenez, V., & Oliver, S. (2021). PenRed: An extensible and parallel Monte-Carlo framework for radiation transport based on PENELOPE. Comput. Phys. Commun., 267, 108065–12pp.
Abstract: Monte Carlo methods provide detailed and accurate results for radiation transport simulations. Unfortunately, the high computational cost of these methods limits their use in real-time applications. Moreover, existing computer codes do not provide a methodology for adapting these kinds of simulations to specific problems without advanced knowledge of the corresponding code system, which restricts their applicability. To help overcome these limitations, we present PenRed, a general-purpose, standalone, extensible and modular framework code based on PENELOPE for parallel Monte Carlo simulations of electron-photon transport through matter. It has been implemented in the C++ programming language and takes advantage of modern object-oriented technologies. In addition, PenRed offers the capability to read and process DICOM images as well as to construct and simulate image-based voxelized geometries, so as to facilitate its use in medical applications. Our framework has been successfully verified against the original PENELOPE Fortran code. Furthermore, the implemented parallelism has been tested, showing a significant improvement in simulation time without any loss of precision in the results.
Program summary. Program title: PenRed: Parallel Engine for Radiation Energy Deposition. CPC Library link to program files: https://doi.org/10.17632/rkw6tvtngy.1. Licensing provisions: GNU Affero General Public License (AGPL). Programming language: C++ (2011 standard). Nature of problem: Monte Carlo simulations usually require a huge amount of computation time to achieve low statistical uncertainties. In addition, many applications necessitate particular characteristics or the extraction of specific quantities from the simulation. However, most available Monte Carlo codes do not provide an efficient parallel and truly modular structure that allows users to easily customise the code to suit their needs without in-depth knowledge of the code system.
Solution method: PenRed is a fully parallel, modular and customizable framework for Monte Carlo simulations of the passage of radiation through matter. It is based on the PENELOPE [1] code system, from which it inherits its unique physics models and tracking algorithms for charged particles. PenRed has been coded in C++ following an object-oriented programming paradigm restricted to the C++11 standard. Our engine implements parallelism via a double approach: on the one hand, standard C++ threads for shared memory, improving memory access and usage; on the other hand, the MPI standard for distributed-memory infrastructures. Both kinds of parallelism can be combined in the same simulation. Moreover, both threads and MPI processes can be balanced using the built-in load-balancing system (RUPER-LB [30]) to maximise performance on heterogeneous infrastructures. In addition, PenRed provides a modular structure with methods designed to easily extend its functionality. Thus, users can create their own independent modules to adapt our engine to their needs without changing the original modules. Furthermore, user extensions take advantage of the built-in parallelism without any extra effort or knowledge of parallel programming. Additional comments including restrictions and unusual features: PenRed has been compiled on Linux systems with g++ (GCC) versions 4.8.5, 7.3.1, 8.3.1 and 9; clang version 3.4.2; and the Intel C++ compiler (icc) version 19.0.5.281. Since it is a C++11-standard-compliant code, PenRed should compile with any compiler with C++11 support. In addition, if the code is compiled without MPI support, it does not require any non-standard library. To enable MPI capabilities, the user needs to install any available MPI implementation, such as OpenMPI [24] or MPICH [25], which can be found in the repositories of any Linux distribution.
Finally, to provide DICOM processing support, PenRed can optionally be compiled with the DICOM toolkit (dcmtk) [32] library. Thus, PenRed has only two optional dependencies: an MPI implementation and the dcmtk library.
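The hybrid parallelism described above relies on the fact that Monte Carlo histories are independent, so workers with separate RNG streams can run concurrently and have their tallies merged at the end. A minimal Python analogue of this scheme (not PenRed code, which uses C++ threads and MPI) for photon transmission through an absorbing slab:

```python
import random
from concurrent.futures import ThreadPoolExecutor

MU = 0.2   # hypothetical attenuation coefficient, 1/mm
D = 10.0   # hypothetical slab thickness, mm

def worker(seed, n):
    """Simulate n photon histories with an independent RNG stream and
    return how many traverse the slab without interacting."""
    rng = random.Random(seed)
    transmitted = 0
    for _ in range(n):
        # Free path sampled from the exponential attenuation law.
        if rng.expovariate(MU) > D:
            transmitted += 1
    return transmitted

def simulate(n_total=200_000, n_workers=4):
    """Split the histories across workers and merge the tallies."""
    n = n_total // n_workers
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        counts = pool.map(worker, range(n_workers), [n] * n_workers)
    return sum(counts) / (n * n_workers)

frac = simulate()  # analytic transmitted fraction is exp(-MU * D)
```

Giving each worker its own seeded generator keeps the run reproducible regardless of scheduling, which is the same property a load-balanced thread/MPI simulation needs.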
|
Carrasco-Ribelles, L. A., Pardo-Mas, J. R., Tortajada, S., Saez, C., Valdivieso, B., & Garcia-Gomez, J. M. (2021). Predicting morbidity by local similarities in multi-scale patient trajectories. J. Biomed. Inform., 120, 103837–9pp.
Abstract: Patient Trajectories (PTs) are a method of representing the temporal evolution of patients. They can include information from different sources and be used in socio-medical or clinical domains. PTs have generally been used to generate and study the most common trajectories in, for instance, the development of a disease. Healthcare predictive models, on the other hand, generally rely on static snapshots of patient information. Only a few works on prediction in healthcare have been found that use PTs, and therefore benefit from their temporal dimension, and all of them used PTs created from single-source information. Therefore, the use of longitudinal multi-scale data to build PTs and to obtain predictions about health conditions from them is yet to be explored. Our hypothesis is that local similarities in small chunks of PTs can identify patients who are similar with respect to their future morbidities. The objectives of this work are (1) to develop a methodology to identify local similarities between PTs before the occurrence of morbidities, in order to predict these on new query individuals; and (2) to validate this methodology on risk prediction of cardiovascular disease (CVD) occurrence in patients with diabetes. We propose a novel formal definition of PTs based on sequences of longitudinal multi-scale data, together with a dynamic programming methodology to identify local alignments on PTs for predicting future morbidities. Both the proposed PT definition and the alignment algorithm are generic and can be applied in any clinical domain. We validated this solution for predicting CVD in patients with diabetes, achieving a precision of 0.33, a recall of 0.72 and a specificity of 0.38. The proposed solution can therefore be of utmost utility for secondary screening in the diabetes use case.
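The paper's alignment algorithm is not reproduced here, but the idea of scoring local similarities between trajectory chunks can be sketched with a generic Smith-Waterman-style dynamic program over event sequences; the event names and scoring values below are illustrative, not the paper's:

```python
def local_align(a, b, match=2, mismatch=-1, gap=-1):
    """Best local-alignment score between two event sequences.
    H[i][j] holds the best alignment score ending at a[i-1], b[j-1];
    the floor at 0 is what makes the alignment local."""
    H = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    best = 0
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            s = match if a[i - 1] == b[j - 1] else mismatch
            H[i][j] = max(0,
                          H[i - 1][j - 1] + s,   # align the two events
                          H[i - 1][j] + gap,     # skip an event in a
                          H[i][j - 1] + gap)     # skip an event in b
            best = max(best, H[i][j])
    return best

# Hypothetical event sequences: a query patient vs. a reference PT.
query = ["visit", "HbA1c_high", "statin", "visit"]
ref = ["admission", "HbA1c_high", "statin", "visit", "CVD"]
score = local_align(query, ref)
```

A high local score against reference trajectories that later developed CVD is the kind of signal the proposed methodology uses to flag a query patient.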
|
Coloma, P., Lopez-Pavon, J., Rosauro-Alcaraz, S., & Urrea, S. (2021). New physics from oscillations at the DUNE near detector, and the role of systematic uncertainties. J. High Energy Phys., 08(8), 065–33pp.
Abstract: We study the capabilities of the DUNE near detector to probe deviations from unitarity of the leptonic mixing matrix, the 3+1 sterile-neutrino formalism and Non-Standard Interactions affecting neutrino production and detection. We clarify the relation and possible mappings among the three formalisms at short-baseline experiments, and we add to current analyses in the literature the study of the νμ → ντ appearance channel. We study in detail the impact of spectral uncertainties on the sensitivity to new physics using the DUNE near detector, which has been widely overlooked in the literature. Our analysis shows that this plays an important role in the results and, in particular, that it can lead to a strong reduction in the sensitivity to sterile neutrinos from νμ → νe transitions, by more than two orders of magnitude. This stresses the importance of a joint experimental and theoretical effort to improve our understanding of neutrino-nucleus cross sections, as well as hadron production uncertainties and beam focusing effects. Nevertheless, even with our conservative and more realistic implementation of systematic uncertainties, we find that an improvement over current bounds in the new physics frameworks considered is generally expected if spectral uncertainties are below the 5% level.
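Why spectral uncertainties dilute sensitivity can be seen in a toy Gaussian χ² in which an uncorrelated per-bin shape systematic is profiled out analytically, inflating the statistical variance bin by bin; this is a simplified stand-in for the paper's likelihood, with made-up bin contents:

```python
def chi2(obs, pred, sigma_shape=0.05):
    """Gaussian chi^2 where an uncorrelated fractional shape systematic
    of size sigma_shape has been profiled out analytically: for each bin
    the minimum over the nuisance pull is (o - p)^2 / (o + (sigma * p)^2),
    i.e. the systematic simply adds to the statistical variance."""
    return sum((o - p) ** 2 / (o + (sigma_shape * p) ** 2)
               for o, p in zip(obs, pred))

pred = [1e5] * 10                 # hypothetical high-statistics spectrum
obs = [1.03 * p for p in pred]    # a flat 3% excess in every bin
stat_only = chi2(obs, pred, sigma_shape=0.0)  # hugely significant
with_sys = chi2(obs, pred, sigma_shape=0.05)  # absorbed by the systematic
```

With Poisson errors alone the 3% distortion is unmistakable, while a 5% bin-to-bin shape uncertainty reduces it to statistical noise, mirroring the orders-of-magnitude sensitivity loss the analysis reports.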
Keywords: Beyond Standard Model; Neutrino Physics
|
Fernandez Casani, A., Orduña, J. M., Sanchez, J., & Gonzalez de la Hoz, S. (2021). A Reliable Large Distributed Object Store Based Platform for Collecting Event Metadata. J. Grid Comput., 19(3), 39–19pp.
Abstract: The Large Hadron Collider (LHC) is about to enter its third run at unprecedented energies. The experiments at the LHC face computational challenges with enormous data volumes that need to be analysed by thousands of physics users. The ATLAS EventIndex project, currently running in production, builds a complete catalogue of particle collisions, or events, for the ATLAS experiment at the LHC. The distributed nature of the experiment data model is exploited by running jobs at over one hundred Grid data centers worldwide. Millions of files with petabytes of data are indexed, extracting a small quantity of metadata per event, which is conveyed by a data collection system in real time to a central Hadoop instance at CERN. After a successful first implementation based on a messaging system, some issues suggested performance bottlenecks at the challenging higher rates of the next runs of the experiment. In this work we characterize the weaknesses of the previous messaging system regarding complexity, scalability, performance and resource consumption. A new approach based on an object-based storage method was designed and implemented, taking into account the lessons learned and leveraging the ATLAS experience with this kind of system. We present the experiment that we ran for three months in the real worldwide production scenario in order to evaluate the messaging and object store approaches. The results show that the new object-based storage method can efficiently support large-scale data collection for big-data environments like the next runs of the ATLAS experiment at the LHC.
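The batching idea behind an object-store collector, as opposed to sending one message per event, can be sketched as follows; the class and field names, record format and flush threshold are illustrative, not the EventIndex implementation:

```python
import io
import json

class ObjectBuffer:
    """Toy event-metadata collector: instead of one message per event,
    records are appended to an in-memory buffer and flushed as a single
    immutable object once a size threshold is reached, trading per-event
    messaging overhead for fewer, larger writes."""

    def __init__(self, max_bytes=1024):
        self.max_bytes = max_bytes
        self.buf = io.BytesIO()
        self.objects = []  # stand-in for the remote object store

    def add(self, record):
        """Append one event-metadata record, flushing first if full."""
        line = (json.dumps(record) + "\n").encode()
        if self.buf.tell() + len(line) > self.max_bytes:
            self.flush()
        self.buf.write(line)

    def flush(self):
        """Seal the current buffer as one immutable object."""
        if self.buf.tell():
            self.objects.append(self.buf.getvalue())
            self.buf = io.BytesIO()

store = ObjectBuffer(max_bytes=256)
for i in range(100):
    store.add({"run": 358031, "event": i, "guid": f"file-{i % 4}"})
store.flush()  # seal the final partial object
```

Every record survives the batching, but the number of store operations drops by the batching factor, which is the scalability gain the paper measures against the messaging system.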
Keywords: Grid computing; Hadoop file system; Object-Based storage
|
Zornoza, J. D. (2021). Review on Indirect Dark Matter Searches with Neutrino Telescopes. Universe, 7(11), 415–10pp.
Abstract: The search for dark matter is one of the hottest topics in physics today. The fact that about 80% of the matter of the Universe is of unknown nature has triggered an intense experimental activity to detect this kind of matter, and a no less intense effort on the theory side to explain it. Given that we do not know the properties of dark matter well, searches on several fronts are mandatory. Neutrino telescopes are part of this experimental quest and offer specific advantages. Among the targets in which to look for dark matter, the Sun and the Galactic Centre are the most promising ones. Considering models of dark matter densities in the Sun, neutrino telescopes have set the best limits on the spin-dependent WIMP-proton scattering cross section. Moreover, they are competitive in the constraints on the thermally averaged annihilation cross section for high WIMP masses when looking at the Galactic Centre. Other results are also reviewed.
Keywords: dark matter; neutrino telescopes; IceCube; ANTARES; KM3NeT; SuperK
|
Escribano, P., Hirsch, M., Nava, J., & Vicente, A. (2022). Observable flavor violation from spontaneous lepton number breaking. J. High Energy Phys., 01(1), 098–31pp.
Abstract: We propose a simple model of spontaneous lepton number violation with potentially large flavor-violating decays, including the possibility that majoron-emitting decays, such as μ → eJ, saturate the experimental bounds. In this model the majoron is a singlet-doublet admixture. The model generates a type-I seesaw for neutrino masses and also contains a vector-like lepton. As a by-product, the model can explain the anomalous (g−2)μ in parts of its parameter space, where one expects the branching ratio of the Higgs to muons to change with respect to Standard Model expectations. However, the explanation of the muon g−2 anomaly would lead to tension with recent astrophysical bounds on the majoron coupling to muons.
Keywords: Beyond Standard Model; Neutrino Physics; Global Symmetries
|
de Azcarraga, J. A. (2022). The new Spanish educational legislation: why public education will not improve. Rev. Esp. Pedagog., 80(281), 111–129.
Abstract: This paper provides some reasons that explain, in the view of the author, why the present eagerness of the Spanish Educational Authorities to reform all levels of education, from primary school to the universities, will not improve the quality of the Spanish educational system.
|
Coves, A., Maestre, H., Archiles, R., Andres, M. V., & Gimeno, B. (2022). Surface-Impedance Formulation for Hollow-Core Waveguides Based on Subwavelength Gratings. IEEE Access, 10, 18843–18854.
Abstract: A rigorous Surface Impedance (SI) formulation for planar waveguides is presented. This modal technique splits the modal analysis of the waveguide into two steps: first, we obtain the characteristic equations of the modes as a function of the SI; second, we obtain the surface-impedance values using either analytical or numerical methods. We validate the technique by comparison with well-known analytical cases: the parallel-plate waveguide with losses and the dielectric slab waveguide. Then, we analyze an optical hollow-core waveguide defined by two high-contrast subwavelength gratings, validating our results by comparison with reported values. Finally, we show the potential of our formulation with the analysis of a THz hollow-core waveguide defined by two surface-relief subwavelength gratings, including material losses in our formulation.
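One of the analytic validation cases mentioned, the dielectric slab waveguide, can be solved directly: for the fundamental even TE mode of a symmetric slab, bisection of the textbook characteristic equation u·tan(u) = sqrt(R² − u²) yields the effective index. This is a generic check case, not the SI formulation itself; the refractive indices and dimensions below are illustrative:

```python
import math

def slab_te0_neff(n_core, n_clad, half_width, wavelength):
    """Effective index of the fundamental (even) TE mode of a symmetric
    dielectric slab. With u = kappa*a and R = k0*a*sqrt(n1^2 - n2^2),
    the characteristic equation u*tan(u) = sqrt(R^2 - u^2) has its
    first root in (0, pi/2), found here by bisection (f is increasing)."""
    k0 = 2 * math.pi / wavelength
    R = k0 * half_width * math.sqrt(n_core ** 2 - n_clad ** 2)
    f = lambda u: u * math.tan(u) - math.sqrt(max(R * R - u * u, 0.0))
    lo, hi = 1e-9, min(R, math.pi / 2 - 1e-9)
    for _ in range(80):
        mid = 0.5 * (lo + hi)
        if f(mid) > 0:
            hi = mid
        else:
            lo = mid
    u = 0.5 * (lo + hi)
    kappa = u / half_width
    beta = math.sqrt(n_core ** 2 * k0 ** 2 - kappa ** 2)
    return beta / k0

# Guided mode lies between the cladding and core indices.
neff = slab_te0_neff(1.5, 1.0, 0.5, 1.0)
```

As the slab widens relative to the wavelength the mode confines, and the effective index approaches the core index, a limit that any SI-based solver should also reproduce.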
|