|
Benisty, D., Olmo, G. J., & Rubiera-Garcia, D. (2021). Singularity-Free and Cosmologically Viable Born-Infeld Gravity with Scalar Matter. Symmetry-Basel, 13(11), 2108–24pp.
Abstract: The early cosmology, driven by a single scalar field, both massless and massive, in the context of Eddington-inspired Born-Infeld gravity, is explored. We show the existence of nonsingular solutions of bouncing and loitering type (depending on the sign of the gravitational theory's parameter, epsilon) replacing the Big Bang singularity, and discuss their properties. In addition, in the massive case, we find some new features of the cosmological evolution depending on the value of the mass parameter, including asymmetries in the expansion/contraction phases, or a continuous transition from a contracting phase to an expanding one via an intermediate loitering phase. We also provide a combined analysis of cosmic chronometers, standard candles, BAO, and CMB data to constrain the model, finding that for roughly |epsilon| ≲ 5 × 10^-8 m^2 the model is compatible with the latest observations while successfully removing the Big Bang singularity. This bound is several orders of magnitude stronger than the most stringent constraints currently available in the literature.
|
|
|
n_TOF Collaboration (Giubrone, G., et al.), & Tain, J. L. (2011). The Role of Fe and Ni for S-process Nucleosynthesis and Innovative Nuclear Technologies. J. Korean Phys. Soc., 59(2), 2106–2109.
Abstract: The accurate measurement of neutron capture cross sections of all Fe and Ni isotopes is important for disentangling the contribution of the s-process and the r-process to the stellar nucleosynthesis of elements in the mass range 60 < A < 120. At the same time, Fe and Ni are important components of structural materials and improved neutron cross section data is relevant in the design of new nuclear systems. With the aim of obtaining improved capture data on all stable iron and nickel isotopes, a program of measurements has been launched at the CERN Neutron Time of Flight Facility n_TOF.
|
|
|
Aldana, M., & Lledo, M. A. (2023). The Fuzzy Bit. Symmetry-Basel, 15(12), 2103–25pp.
Abstract: In this paper, the formulation of Quantum Mechanics in terms of fuzzy logic and fuzzy sets is explored. A result by Pykacz, which establishes a correspondence between (quantum) logics (lattices with certain properties) and certain families of fuzzy sets, is applied to the Birkhoff-von Neumann logic, the lattice of projectors of a Hilbert space. Three cases are considered: the qubit, two qubits entangled, and a qutrit 'nested' inside the two entangled qubits. The membership functions of the fuzzy sets are explicitly computed and all the connectives of the fuzzy sets are interpreted as operations with these particular membership functions. In this way, a complete picture of the standard quantum logic in terms of fuzzy sets is obtained for the systems considered.
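For the lattice of projectors referenced in this abstract, the membership functions arise from the Born rule: the degree to which a state ρ belongs to the fuzzy set associated with a projector P is Tr(ρP). A minimal NumPy sketch of this correspondence for the single-qubit case (the paper's explicit constructions are richer; the state and projector below are chosen here only for illustration):

```python
import numpy as np

def membership(rho, projector):
    """Born-rule value Tr(rho P): the degree to which the state rho
    'belongs' to the fuzzy set attached to the projector P."""
    return float(np.real(np.trace(rho @ projector)))

# Illustration: the |+> qubit state half-belongs to the fuzzy set of |0>.
plus = np.array([1.0, 1.0]) / np.sqrt(2.0)
rho_plus = np.outer(plus, plus.conj())   # density matrix |+><+|
proj_zero = np.diag([1.0, 0.0])          # projector |0><0|
```

Here `membership(rho_plus, proj_zero)` yields 0.5, reflecting the graded (fuzzy) truth value that quantum propositions take on superposition states.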
|
|
|
n_TOF Collaboration, Kappeler, F., Mengoni, A., Mosconi, M., Fujii, K., Heil, M., et al. (2011). Neutron Studies for Dating the Universe. J. Korean Phys. Soc., 59(2), 2094–2099.
Abstract: The neutron capture cross sections of (186)Os and (187)Os are of key importance for defining the s-process abundance of (187)Os at the formation of the solar system. This quantity can be used to determine the radiogenic abundance component of (187)Os from the decay of (187)Re (t(1/2) = 41.2 Gyr) and to infer the time-duration of the nucleosynthesis in our galaxy (Re/Os cosmochronometer). The neutron capture cross sections of (186)Os, (187)Os, and (188)Os have been measured at the CERN n_TOF facility from 1 eV to 1 MeV, covering the entire energy range of astrophysical interest. From these data, Maxwellian averaged capture cross sections have been calculated with uncertainties between 3.3 and 4.7%. Additional information was obtained by measuring the inelastic scattering cross section of (187)Os at the Karlsruhe 3.7 MV Van de Graaff accelerator and by neutron resonance analyses of the n_TOF capture data to establish a comprehensive experimental basis for the Hauser-Feshbach (HF) statistical model. Consistent HF calculations for the capture and inelastic reaction channels were performed to determine the stellar enhancement factors, which are required to correct the Maxwellian averaged cross sections for the effect of thermally populated excited states. The consequences of this analysis for the s-process component of the (187)Os abundance and the related impact on the evaluation of the time-duration of Galactic nucleosynthesis via the Re/Os cosmochronometer are discussed.
|
|
|
Baker, M. J., Bordes, J., Hong-Mo, C., & Tsun, T. S. (2011). Mass Hierarchy, Mixing, CP-Violation And Higgs Decay – Or Why Rotation Is Good For Us. Int. J. Mod. Phys. A, 26(13), 2087–2124.
Abstract: The idea of a rank-one rotating mass matrix (R2M2) is reviewed detailing how it leads to ready explanations both for the fermion mass hierarchy and for the distinctive mixing patterns between up and down fermion states, which can be and have been tested against experiment and shown to be fully consistent with existing data. Further, R2M2 is seen to offer, as by-products: (i) a new solution to the strong CP problem in QCD by linking the theta-angle there to the Kobayashi-Maskawa CP-violating phase in the CKM matrix, and (ii) some novel predictions of possible anomalies in Higgs decay observable in principle at the LHC. A special effort is made to answer some questions raised.
|
|
|
Real, D., Calvo, D., Zornoza, J. D., Manzaneda, M., Gozzini, R., Ricolfe-Viala, C., et al. (2024). Fast Coincidence Filter for Silicon Photomultiplier Dark Count Rate Rejection. Sensors, 24(7), 2084–12pp.
Abstract: Silicon Photomultipliers find applications across various fields. One potential Silicon Photomultiplier application domain is neutrino telescopes, where they may enhance the angular resolution. However, the elevated dark count rate associated with Silicon Photomultipliers represents a significant challenge to their widespread utilization. To address this issue, it is proposed to use Silicon Photomultipliers and Photomultiplier Tubes together. The Photomultiplier Tube signals serve as a trigger to mitigate the dark count rate, thereby preventing undue saturation of the available bandwidth. This paper presents an investigation into a fast and resource-efficient method for filtering the Silicon Photomultiplier dark count rate. A low-resource, fast coincidence filter has been developed, which removes the Silicon Photomultiplier dark count rate by using the Photomultiplier Tube input signals as a trigger. The architecture of the coincidence filter, together with the first results obtained, which validate the effectiveness of this method, is presented.
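The coincidence idea described in this abstract (keep an SiPM hit only if it is time-coincident with a PMT trigger, discarding the rest as dark counts) can be sketched in a few lines. This is an illustrative software model with assumed timestamp inputs, not the FPGA architecture of the paper, and the window width is a placeholder:

```python
import bisect

def coincidence_filter(sipm_hits, pmt_triggers, window_ns=10.0):
    """Keep only SiPM hit times within +/- window_ns of some PMT
    trigger time; all other hits are treated as dark counts.
    Both input lists must be sorted in ascending time order."""
    kept = []
    for t in sipm_hits:
        # Binary-search for the PMT triggers bracketing this hit.
        i = bisect.bisect_left(pmt_triggers, t)
        nearby = []
        if i < len(pmt_triggers):
            nearby.append(pmt_triggers[i])
        if i > 0:
            nearby.append(pmt_triggers[i - 1])
        if any(abs(t - p) <= window_ns for p in nearby):
            kept.append(t)
    return kept
```

For example, with PMT triggers at 100 ns and 500 ns, hits at 95, 103, and 505 ns survive a 10 ns window while hits at 250 and 900 ns are rejected; the binary search keeps the per-hit cost logarithmic in the trigger count.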
|
|
|
SCiMMA and SNEWS Collaborations (Baxter, A. L., et al.), & Colomer, M. (2022). Collaborative experience between scientific software projects using Agile Scrum development. Softw.-Pract. Exp., 52, 2077–2096.
Abstract: Developing sustainable software for the scientific community requires expertise in software engineering and domain science. This can be challenging due to the unique needs of scientific software, the insufficient resources for software engineering practices in the scientific community, and the complexity of developing for evolving scientific contexts. While open-source software can partially address these concerns, it can introduce complicating dependencies and delay development. These issues can be reduced if scientists and software developers collaborate. We present a case study wherein scientists from the SuperNova Early Warning System collaborated with software developers from the Scalable Cyberinfrastructure for Multi-Messenger Astrophysics project. The collaboration addressed the difficulties of open-source software development, but presented additional risks to each team. For the scientists, there was a concern of relying on external systems and lacking control in the development process. For the developers, there was a risk in supporting a user-group while maintaining core development. These issues were mitigated by creating a second Agile Scrum framework in parallel with the developers' ongoing Agile Scrum process. This Agile collaboration promoted communication, ensured that the scientists had an active role in development, and allowed the developers to evaluate and implement the scientists' software requirements. The collaboration provided benefits for each group: the scientists actuated their development by using an existing platform, and the developers utilized the scientists' use-case to improve their systems. This case study suggests that scientists and software developers can avoid scientific computing issues by collaborating and that Agile Scrum methods can address emergent concerns.
|
|
|
LHCb Collaboration (Aaij, R., et al.), Jashal, B. K., Martinez-Vidal, F., Oyanguren, A., Remon Alepuz, C., & Ruiz Vidal, J. (2022). Identification of charm jets at LHCb. J. Instrum., 17(2), P02028–23pp.
Abstract: The identification of charm jets is achieved at LHCb for data collected in 2015-2018 using a method based on the properties of displaced vertices reconstructed and matched with jets. The performance of this method is determined using a dijet calibration dataset recorded by the LHCb detector and selected such that the jets are unbiased in quantities used in the tagging algorithm. The charm-tagging efficiency is reported as a function of the transverse momentum of the jet. The measured efficiencies are compared to those obtained from simulation and found to be in good agreement.
|
|
|
Guadilla, V., Algora, A., Estienne, M., Fallot, M., Gelletly, W., Porta, A., et al. (2024). First measurements with a new β-electron detector for spectral shape studies. J. Instrum., 19(2), P02027–21pp.
Abstract: The shape of the electron spectrum emitted in β decay carries a wealth of information about nuclear structure and fundamental physics. In spite of that, few dedicated measurements have been made of β-spectrum shapes. In this work we present a newly developed detector for β electrons based on a telescope concept. A thick plastic scintillator is employed in coincidence with a thin silicon detector. The first measurements employing this detector have been carried out with mono-energetic electrons from the high-energy-resolution electron-beam spectrometer at Bordeaux. Here we report on the good reproduction of the experimental spectra of mono-energetic electrons using Monte Carlo simulations. This is a crucial step for future experiments, where a detailed Monte Carlo characterization of the detector is needed to determine the shape of the β-electron spectra by deconvolution of the measured spectra with the response function of the detector. A chamber to contain two telescope assemblies has been designed for future β-decay experiments at the Ion Guide Isotope Separator On-Line facility in Jyväskylä, aimed at improving our understanding of reactor antineutrino spectra.
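The deconvolution step mentioned in this abstract rests on the forward model measured = R · true, where R is the detector response matrix obtained from the Monte Carlo characterization. A hedged NumPy sketch of that relation follows; the naive inversion is shown only for a well-conditioned square R (real spectrum analyses use regularized unfolding), and the matrices are illustrative:

```python
import numpy as np

def fold(true_spectrum, response):
    """Forward model: column j of the response matrix gives the
    probability distribution over measured bins for an electron
    in true-energy bin j, so measured = R @ true."""
    return response @ true_spectrum

def unfold(measured, response):
    """Naive deconvolution by direct solve; adequate only for a
    square, well-conditioned response matrix."""
    return np.linalg.solve(response, measured)
```

With a toy 2-bin response R = [[0.9, 0.1], [0.1, 0.9]] and a true spectrum [100, 50], folding gives the smeared spectrum [95, 55], and unfolding recovers the original — the point being that the quality of the recovered β spectrum hinges entirely on how well R is known.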
|
|
|
Conde, D., Castillo, F. L., Escobar, C., García, C., Garcia Navarro, J. E., Sanz, V., et al. (2023). Forecasting Geomagnetic Storm Disturbances and Their Uncertainties Using Deep Learning. Space Weather, 21(11), e2023SW003474–27pp.
Abstract: Severe space weather produced by disturbed conditions on the Sun results in harmful effects both for humans in space and on high-latitude flights, and for technological systems such as spacecraft or communications. Also, geomagnetically induced currents (GICs) flowing in long ground-based conductors, such as power networks, potentially threaten critical infrastructures on Earth. The first step in developing an alarm system against GICs is to forecast them. This is a challenging task given the highly non-linear dependence of the magnetosphere's response on these perturbations. In the last few years, modern machine-learning models have been shown to be very good at predicting magnetic activity indices. However, such complex models are on the one hand difficult to tune, and on the other hand known to bring along potentially large prediction uncertainties which are generally difficult to estimate. In this work we aim at predicting the SYM-H index characterizing geomagnetic storms multiple hours ahead, using public interplanetary magnetic field (IMF) data from the Sun-Earth L1 Lagrange point and SYM-H data. We implement a type of machine-learning model called a long short-term memory (LSTM) network. Our goal is to estimate the prediction uncertainties coming from a deep-learning model in the context of forecasting the SYM-H index. These uncertainties will be essential for setting reliable alarm thresholds. The resulting uncertainties turn out to be sizable at the critical stages of geomagnetic storms. Our methodology also includes an efficient optimization of important hyper-parameters of the LSTM network and robustness tests.
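The forecasting setup described in this abstract (predicting SYM-H several hours ahead from past IMF and SYM-H values) requires turning the time series into supervised samples before any LSTM is trained. A minimal sketch of that windowing step, with array shapes, function name, and parameters chosen here for illustration rather than taken from the paper:

```python
import numpy as np

def make_windows(features, target, lookback, horizon):
    """Build (X, y) pairs for multi-step-ahead forecasting: each
    sample uses `lookback` past time steps of the input features
    (e.g. IMF components plus SYM-H) to predict the target value
    `horizon` steps ahead.
    features: (T, F) array; target: (T,) array."""
    X, y = [], []
    for t in range(lookback, len(target) - horizon + 1):
        X.append(features[t - lookback:t])   # past window of inputs
        y.append(target[t + horizon - 1])    # value `horizon` steps later
    return np.asarray(X), np.asarray(y)
```

Each `X[i]` is a `(lookback, n_features)` slice fed to the LSTM, and `y[i]` is the SYM-H value `horizon` steps after the end of that slice; prediction-uncertainty estimates are then obtained on top of a model trained on these pairs.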
|
|