|
Gomez-Cadenas, J. J., Martin-Albo, J., Menendez, J., Mezzetto, M., Monrabal, F., & Sorel, M. (2024). The search for neutrinoless double-beta decay. Riv. Nuovo Cimento, 46, 619–692.
Abstract: Neutrinos are the only particles in the Standard Model that could be Majorana fermions, that is, completely neutral fermions that are their own antiparticles. The most sensitive known experimental method to verify whether neutrinos are Majorana particles is the search for neutrinoless double-beta decay. The last two decades have witnessed the development of a vigorous program of neutrinoless double-beta decay experiments, spanning several isotopes and developing different strategies to handle the backgrounds masking a possible signal. In addition, remarkable progress has been made in the understanding of the nuclear matrix elements of neutrinoless double-beta decay, thus reducing a substantial part of the theoretical uncertainties affecting the particle-physics interpretation of the process. On the other hand, the negative results of several experiments, combined with hints that the neutrino mass ordering could be normal, may imply very long lifetimes for the neutrinoless double-beta decay process. In this report, we review the main aspects of this process, recent progress in theoretical ideas, and the experimental state of the art. We then consider the experimental challenges that must be addressed to increase the sensitivity to the process in the likely case that lifetimes are much longer than those currently explored, and discuss a selection of the most promising experimental efforts.
|
|
|
Garcia Canal, C. A., Tarutina, T., & Vento, V. (2023). Analysis of Nuclear Effects in Structure Functions and Their Connection with the Binding Energy of Nuclei. Braz. J. Phys., 53(6), 161 (8 pp.).
Abstract: We describe nuclear effects in the structure functions of nuclei in deep inelastic scattering (DIS) by means of a multiplicative factor β_A(x) which differentiates the structure function of the bound nucleons from that of the free nucleons. Our analysis determines that β_A(x) establishes a relation between the quark-gluon dynamics expressed by the bound-nucleon structure functions and the nuclear dynamics as described by the well-known semi-empirical Bethe–Weizsäcker mass formula. This relation corroborates a connection between the underlying quark-gluon dynamics and the phenomenological nuclear dynamics.
|
|
|
Fanchiotti, H., Garcia Canal, C. A., Mayosky, M., Veiga, A., & Vento, V. (2023). The Geometric Phase in Classical Systems and in the Equivalent Quantum Hermitian and Non-Hermitian PT-Symmetric Systems. Braz. J. Phys., 53(6), 143 (11 pp.).
Abstract: The decomplexification procedure allows one to show mathematically (stricto sensu) the equivalence (isomorphism) between the quantum dynamics of a system with a finite number of basis states and a classical dynamical system. This unique way of connecting different dynamics was used in the past to analyze the relationship between the well-known geometric phase present in quantum evolution, discovered by Berry, and its generalizations, and their classical analogs, the Hannay phases. Here, this analysis is carried out for several quantum Hermitian and non-Hermitian PT-symmetric Hamiltonians and compared with the Hannay-phase analysis in their isomorphically equivalent classical systems. As the equivalence ends, in the classical domain, with oscillator dynamics, we exploit the analogy to propose resonant electric circuits coupled through a gyrator that reproduce the geometric phase of the theoretical solutions in simulated laboratory experiments.
|
|
|
Albiol, F., Corbi, A., & Albiol, A. (2017). Evaluation of modern camera calibration techniques for conventional diagnostic X-ray imaging settings. Radiol. Phys. Technol., 10(1), 68–81.
Abstract: We explore three different alternatives for obtaining intrinsic and extrinsic parameters in conventional diagnostic X-ray frameworks: the direct linear transform (DLT), the Zhang method, and the Tsai approach. We analyze and describe the computational, operational, and mathematical background differences for these algorithms when they are applied to ordinary radiograph acquisition. For our study, we developed an initial 3D calibration frame with tin cross-shaped fiducials at specific locations. The three studied methods enable the derivation of projection matrices from 3D-to-2D point correspondences. We propose a set of metrics to compare the efficiency of each technique. One of these metrics consists of the calculation of the detector pixel density, which can also be included as part of the quality control sequence in general X-ray settings. The results show a clear superiority of the DLT approach, both in accuracy and operational suitability. We paid special attention to the Zhang calibration method. Although this technique has been extensively implemented in the field of computer vision, it has rarely been tested in depth in common radiograph production scenarios. Zhang's approach can operate on much simpler and more affordable 2D calibration frames, which were also tested in our research. We experimentally confirm that even three or four plane-image correspondences achieve accurate focal lengths.
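The DLT step evaluated in this paper can be sketched in a few lines. The following is a generic textbook formulation of the homogeneous least-squares DLT (solving for the 3x4 projection matrix via SVD), not the authors' implementation; the function name and the synthetic fiducial coordinates are illustrative assumptions.

```python
import numpy as np

def dlt_projection_matrix(X, x):
    """Estimate the 3x4 projection matrix P from n >= 6 known 3D fiducial
    positions X (n, 3) and their measured 2D image points x (n, 2).

    Each correspondence contributes two rows to a homogeneous system
    A p = 0, where p is the flattened P; the solution is the right
    singular vector of A with the smallest singular value.
    """
    n = X.shape[0]
    A = np.zeros((2 * n, 12))
    for i in range(n):
        Xh = np.append(X[i], 1.0)   # homogeneous 3D point (X, Y, Z, 1)
        u, v = x[i]
        # Row from the v-coordinate constraint
        A[2 * i, 4:8] = -Xh
        A[2 * i, 8:12] = v * Xh
        # Row from the u-coordinate constraint
        A[2 * i + 1, 0:4] = Xh
        A[2 * i + 1, 8:12] = -u * Xh
    _, _, Vt = np.linalg.svd(A)
    return Vt[-1].reshape(3, 4)     # P is defined only up to scale
```

For noise-free correspondences this recovers P exactly up to scale; with real, noisy fiducial detections one would normally precondition the points (Hartley normalization) before building A, and intrinsics such as the focal length can then be extracted from P by RQ decomposition.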
|
|
|
Fernandez Casani, A., Orduña, J. M., Sanchez, J., & Gonzalez de la Hoz, S. (2021). A Reliable Large Distributed Object Store Based Platform for Collecting Event Metadata. J. Grid Comput., 19(3), 39 (19 pp.).
Abstract: The Large Hadron Collider (LHC) is about to enter its third run at unprecedented energies. The experiments at the LHC face computational challenges with enormous data volumes that need to be analysed by thousands of physics users. The ATLAS EventIndex project, currently running in production, builds a complete catalogue of particle collisions, or events, for the ATLAS experiment at the LHC. The distributed nature of the experiment data model is exploited by running jobs at over one hundred Grid data centers worldwide. Millions of files with petabytes of data are indexed, extracting a small quantity of metadata per event, which is conveyed in real time by a data collection system to a central Hadoop instance at CERN. After a successful first implementation based on a messaging system, several issues pointed to performance bottlenecks at the challenging higher rates of the next runs of the experiment. In this work we characterize the weaknesses of the previous messaging system regarding complexity, scalability, performance, and resource consumption. A new approach based on an object-based storage method was designed and implemented, taking into account the lessons learned and leveraging the ATLAS experience with this kind of system. We present the experiment that we ran for three months in the real worldwide production scenario in order to evaluate the messaging and object store approaches. The results of the experiment show that the new object-based storage method can efficiently support large-scale data collection for big data environments like the next runs of the ATLAS experiment at the LHC.
|
|
|
Ikeno, N., Toledo, G., Liang, W. H., & Oset, E. (2023). Consistency of the Molecular Picture of Omega(2012) with the Latest Belle Results. Few-Body Syst., 64(3), 55 (6 pp.).
Abstract: We report the results of research on the Ω(2012) state based on the molecular picture and discuss the consistency of this picture with the Belle experimental results. We study the interaction of the K̄Ξ*, ηΩ (s-wave) and K̄Ξ (d-wave) channels within a coupled-channel unitary approach, and obtain the mass and width of the Ω(2012) state and the decay ratio R^{K̄Ξ}_{ΞπK̄}. We also present a mechanism for Ω_c → π⁺ Ω(2012) production through an external-emission Cabibbo-favored weak decay mode, where the Ω(2012) is dynamically generated from the above interaction. We find that the results obtained in the molecular picture are consistent with all Belle experimental data.
|
|
|
Valcarce, A., Vijande, J., Richard, J. M., & Garcilazo, H. (2018). Stability of Heavy Tetraquarks. Few-Body Syst., 59(2), 9 (7 pp.).
Abstract: We discuss the stability of tetraquark systems with two different masses. After some reminders about the stability of very asymmetric QQq̄q̄ tetraquarks, we demonstrate that in the all-heavy limit q → Q the system becomes unstable for standard color-additive models. We also analyze the consequences of symmetry breaking for QqQ̄q̄ configurations: we find a kind of metastability between the lowest threshold, QQ̄ + qq̄, and the highest one, Qq̄ + Q̄q, and we calculate the width of the resonance. Our results are consistent with the experimental observation of narrow hadrons lying well above their lowest decay threshold.
|
|
|
Blanton, T. D., Romero-Lopez, F., & Sharpe, S. R. (2019). Implementing the three-particle quantization condition including higher partial waves. J. High Energy Phys., 03(3), 106 (56 pp.).
Abstract: We present an implementation of the relativistic three-particle quantization condition including both s- and d-wave two-particle channels. For this, we develop a systematic expansion of the three-particle K matrix, K_df,3, about threshold, which is the generalization of the effective range expansion of the two-particle K matrix, K_2. Relativistic invariance plays an important role in this expansion. We find that d-wave two-particle channels first enter at quadratic order. We explain how to implement the resulting multichannel quantization condition, and present several examples of its application. We derive the leading dependence of the threshold three-particle state on the two-particle d-wave scattering amplitude, and use this to test our implementation. We show how strong two-particle d-wave interactions can lead to significant effects on the finite-volume three-particle spectrum, including the possibility of a generalized three-particle Efimov-like bound state. We also explore the application to the 3π⁺ system, which is accessible to lattice QCD simulations, where we study the sensitivity of the spectrum to the components of K_df,3. Finally, we investigate the circumstances under which the quantization condition has unphysical solutions.
|
|
|
LHCb Collaboration (Aaij, R., et al.), Jaimes Elles, S. J., Jashal, B. K., Martinez-Vidal, F., Oyanguren, A., Rebollo De Miguel, M., et al. (2024). Study of B_c⁺ → χ_c π⁺ decays. J. High Energy Phys., 02(2), 173 (30 pp.).
Abstract: A study of B_c⁺ → χ_c π⁺ decays is reported using proton-proton collision data collected with the LHCb detector at centre-of-mass energies of 7, 8, and 13 TeV, corresponding to an integrated luminosity of 9 fb⁻¹. The decay B_c⁺ → χ_c2 π⁺ is observed for the first time, with a significance exceeding seven standard deviations. The branching fraction relative to the B_c⁺ → J/ψ π⁺ decay is measured to be B(B_c⁺ → χ_c2 π⁺)/B(B_c⁺ → J/ψ π⁺) = 0.37 ± 0.06 ± 0.02 ± 0.01, where the first uncertainty is statistical, the second is systematic, and the third is due to the knowledge of the χ_c2 → J/ψ γ branching fraction. No significant B_c⁺ → χ_c1 π⁺ signal is observed, and an upper limit on the ratio of branching fractions B(B_c⁺ → χ_c1 π⁺)/B(B_c⁺ → χ_c2 π⁺) < 0.49 is set at the 90% confidence level.
|
|
|
SCiMMA and SNEWS Collaborations (Baxter, A. L., et al.), & Colomer, M. (2022). Collaborative experience between scientific software projects using Agile Scrum development. Softw. Pract. Exp., 52, 2077–2096.
Abstract: Developing sustainable software for the scientific community requires expertise in both software engineering and domain science. This can be challenging due to the unique needs of scientific software, the insufficient resources for software engineering practices in the scientific community, and the complexity of developing for evolving scientific contexts. While open-source software can partially address these concerns, it can introduce complicating dependencies and delay development. These issues can be reduced if scientists and software developers collaborate. We present a case study wherein scientists from the SuperNova Early Warning System collaborated with software developers from the Scalable Cyberinfrastructure for Multi-Messenger Astrophysics project. The collaboration addressed the difficulties of open-source software development, but presented additional risks to each team. For the scientists, there was a concern about relying on external systems and lacking control over the development process. For the developers, there was a risk in supporting a user group while maintaining core development. These issues were mitigated by creating a second Agile Scrum framework in parallel with the developers' ongoing Agile Scrum process. This Agile collaboration promoted communication, ensured that the scientists had an active role in development, and allowed the developers to evaluate and implement the scientists' software requirements. The collaboration provided benefits for each group: the scientists advanced their development by using an existing platform, and the developers used the scientists' use case to improve their systems. This case study suggests that scientists and software developers can avoid scientific computing issues by collaborating, and that Agile Scrum methods can address emergent concerns.
|
|