Baran, J., Brzezinski, K., et al. (2024). Feasibility of the J-PET to monitor the range of therapeutic proton beams. Phys. Med., 118, 103301–9pp.
Abstract: Purpose: The aim of this work is to investigate the feasibility of the Jagiellonian Positron Emission Tomography (J-PET) scanner for intra-treatment proton beam range monitoring. Methods: Monte Carlo simulation studies with GATE and PET image reconstruction with CASToR were performed in order to compare six J-PET scanner geometries. We simulated proton irradiation of a PMMA phantom with a Single Pencil Beam (SPB) and Spread-Out Bragg Peak (SOBP) of various ranges. The sensitivity and precision of each scanner were calculated, and, considering the setup's cost-effectiveness, we indicated potentially optimal geometries for a J-PET scanner prototype dedicated to proton beam range assessment. Results: The investigations indicate that the double-layer cylindrical and triple-layer dual-head configurations are the most promising for clinical application. We found that the scanner sensitivity is of the order of 10⁻⁵ coincidences per primary proton, while the precision of the range assessment for both SPB and SOBP irradiation plans was below 1 mm. Among the scanners with the same number of detector modules, the best results were found for the triple-layer dual-head geometry. Conclusions: Our simulation studies demonstrate the feasibility of the J-PET detector for PET-based proton beam therapy range monitoring, with sensitivity and precision sufficient to enable pre-clinical tests in a clinical proton therapy environment. Considering sensitivity, precision and cost-effectiveness, the double-layer cylindrical and triple-layer dual-head J-PET geometry configurations seem promising for future clinical application.
Ramirez-Uribe, S., Hernandez-Pinto, R. J., Rodrigo, G., & Sborlini, G. F. R. (2022). From Five-Loop Scattering Amplitudes to Open Trees with the Loop-Tree Duality. Symmetry-Basel, 14(12), 2571–14pp.
Abstract: Characterizing multiloop topologies is an important step towards developing novel methods at high perturbative orders in quantum field theory. In this article, we exploit the Loop-Tree Duality (LTD) formalism to analyse multiloop topologies that appear for the first time at five loops. Explicitly, we open the loops into connected trees and group them according to their topological properties. Then, we identify a kernel generator, the so-called N7MLT universal topology, that allows us to describe any scattering amplitude of up to five loops. Furthermore, we provide factorization and recursion relations that enable us to write these multiloop topologies in terms of simpler subtopologies, including several subsets of Feynman diagrams with an arbitrary number of loops. Our approach takes advantage of many symmetries present in the graphical description of the original fundamental five-loop topologies. The results obtained in this article might shed light on a more efficient determination of higher-order corrections to the running couplings, which are crucial in the current and future precision physics program.
Ayala, C., Cvetic, G., & Kogerler, R. (2017). Lattice-motivated holomorphic nearly perturbative QCD. J. Phys. G, 44(7), 075001–30pp.
Abstract: Newer lattice results indicate that, in the Landau gauge at low spacelike momenta, the gluon propagator and the ghost dressing function are finite and non-zero. This leads to a definition of the QCD running coupling, in a specific scheme, that goes to zero at low spacelike momenta. We construct a running coupling which fulfills these conditions and at the same time reproduces to high precision the perturbative behavior at high momenta. The coupling is constructed in such a way that it reflects qualitatively correctly the holomorphic (analytic) behavior of spacelike observables in the complex plane of the squared momenta, as dictated by the general principles of quantum field theories. Further, we require the coupling to reproduce correctly the nonstrange semihadronic decay rate of the tau lepton, which is the best measured low-momentum QCD observable with small higher-twist effects. Subsequent application of Borel sum rules to the V + A spectral functions of tau lepton decays, as measured by the OPAL Collaboration, determines the values of the gluon condensate and of the V + A six-dimensional condensate, and reproduces the data to a significantly higher precision than the usual MS-bar running coupling.
Peinado, E., Reig, M., Srivastava, R., & Valle, J. W. F. (2020). Dirac neutrinos from Peccei-Quinn symmetry: A fresh look at the axion. Mod. Phys. Lett. A, 35(21), 2050176–9pp.
Abstract: We show that a very simple solution to the strong CP problem naturally leads to Dirac neutrinos. Small effective neutrino masses emerge from a type-I Dirac seesaw mechanism. Neutrino mass limits probe the axion parameters in regions currently inaccessible to conventional searches.
Dias, A. G., Leite, J., Valle, J. W. F., & Vaquera-Araujo, C. A. (2020). Reloading the axion in a 3-3-1 setup. Phys. Lett. B, 810, 135829–12pp.
Abstract: We generalize the idea of the axion to an extended electroweak gauge symmetry setup. We propose a minimal axion extension of the Singer-Valle-Schechter (SVS) theory, in which the standard model fits in SU(3)_L ⊗ U(1)_X, the number of families results from anomaly cancellation, and the Peccei-Quinn (PQ) solution to the strong-CP problem is implemented. Neutrino masses arise from a type-I Dirac seesaw mechanism, suppressed by the ratio of the SVS and PQ scales, suggesting the existence of new physics at a moderate SVS scale. Novel features include an enhanced axion coupling to photons when compared to the DFSZ axion, as well as flavor-changing axion couplings to quarks.
LHCb Collaboration(Aaij, R. et al), Jashal, B. K., Martinez-Vidal, F., Oyanguren, A., Remon Alepuz, C., & Ruiz Vidal, J. (2022). Centrality determination in heavy-ion collisions with the LHCb detector. J. Instrum., 17(5), P05009–31pp.
Abstract: The centrality of heavy-ion collisions is directly related to the created medium in these interactions. A procedure to determine the centrality of collisions with the LHCb detector is implemented for lead-lead collisions at √s_NN = 5 TeV and lead-neon fixed-target collisions at √s_NN = 69 GeV. The energy deposits in the electromagnetic calorimeter are used to define the centrality classes. The correspondence between the number of participants and the centrality for the lead-lead collisions is in good agreement with the correspondence found in other experiments, and the centrality measurements for the lead-neon collisions presented here are the first performed in fixed-target collisions at the LHC.
LHCb Collaboration(Aaij, R. et al), Garcia Martin, L. M., Henry, L., Jashal, B. K., Martinez-Vidal, F., Oyanguren, A., et al. (2019). Measurement of the electron reconstruction efficiency at LHCb. J. Instrum., 14, P11023–20pp.
Abstract: The single-electron track-reconstruction efficiency is calibrated using a sample corresponding to 1.3 fb⁻¹ of pp collision data recorded with the LHCb detector in 2017. This measurement exploits B⁺ → J/ψ(e⁺e⁻)K⁺ decays, where one of the electrons is fully reconstructed and paired with the kaon, while the other electron is reconstructed using only the information of the vertex detector. Despite this partial reconstruction, kinematic and geometric constraints allow the B meson mass to be reconstructed and the signal to be well separated from backgrounds. This in turn allows the electron reconstruction efficiency to be measured by matching the partial track segment found in the vertex detector to tracks found by LHCb's regular reconstruction algorithms. The agreement between data and simulation is evaluated, and corrections are derived for simulated electrons in bins of kinematics. These correction factors allow LHCb to measure branching fractions involving single electrons with a systematic uncertainty below 1%.
Renner, J., Cervera-Villanueva, A., Hernando, J. A., Izmaylov, A., Monrabal, F., Muñoz, J., et al. (2015). Improved background rejection in neutrinoless double beta decay experiments using a magnetic field in a high pressure xenon TPC. J. Instrum., 10, P12020–19pp.
Abstract: We demonstrate that the application of an external magnetic field could lead to improved background rejection in neutrinoless double-beta (0νββ) decay experiments using a high-pressure xenon (HPXe) TPC. HPXe chambers are capable of imaging electron tracks, a feature that enhances the separation between signal events (the two electrons emitted in the 0νββ decay of Xe-136) and background events, arising chiefly from single electrons with kinetic energy compatible with the end-point of the 0νββ decay (Q_ββ). Applying an external magnetic field of sufficiently high intensity (in the range of 0.5–1 Tesla for operating pressures in the range of 5–15 atmospheres) causes the electrons to produce helical tracks. Assuming the tracks can be properly reconstructed, the sign of the curvature can be determined at several points along these tracks, and this information can be used to separate signal (0νββ) events, containing two electrons producing a track with two different directions of curvature, from background (single-electron) events, producing a track that should spiral in a single direction. Due to electron multiple scattering, this strategy is not perfectly efficient on an event-by-event basis, but a statistical estimator can be constructed which rejects background events by one order of magnitude at a moderate cost (about 30%) in signal efficiency. Combining this estimator with the excellent energy resolution and topological signature identification characteristic of the HPXe TPC, it is possible to reach a background rate of less than one count per ton-year of exposure. Such a low background rate is an essential feature of the next generation of 0νββ experiments, which aim to fully explore the inverse hierarchy of neutrino masses.
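The curvature-sign separation described in this abstract can be illustrated with a toy geometric sketch: estimate the sign of the local curvature from consecutive point triplets and ask whether the two halves of the track curl in opposite directions. This is a deliberate simplification assuming idealized 2D tracks, not the paper's statistical estimator:

```python
import math

def curvature_signs(points):
    """Sign of the 2D cross product for each consecutive point triplet:
    +1 if the track turns counter-clockwise there, -1 if clockwise."""
    signs = []
    for (x0, y0), (x1, y1), (x2, y2) in zip(points, points[1:], points[2:]):
        cross = (x1 - x0) * (y2 - y1) - (y1 - y0) * (x2 - x1)
        signs.append(1 if cross > 0 else -1 if cross < 0 else 0)
    return signs

def looks_like_two_directions(points):
    """Toy two-electron discriminant: a signal-like track reverses its
    net curvature, so the two halves of the track curl opposite ways."""
    s = curvature_signs(points)
    half = len(s) // 2
    return sum(s[:half]) * sum(s[half:]) < 0

# A circular arc (single-electron-like) curls one way only;
# a sine-shaped track (two-electron-like) reverses curvature mid-way.
arc = [(math.cos(0.1 * i), math.sin(0.1 * i)) for i in range(20)]
s_track = [(0.2 * i, math.sin(0.2 * i)) for i in range(32)]
print(looks_like_two_directions(arc))      # -> False
print(looks_like_two_directions(s_track))  # -> True
```

In the paper, a statistical combination over many sampled points absorbs the multiple-scattering noise; in this sketch the half-track majority vote plays that role.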
NEXT Collaboration(Alvarez, V. et al), Carcel, S., Cervera-Villanueva, A., Diaz, J., Ferrario, P., Gil, A., et al. (2013). Operation and first results of the NEXT-DEMO prototype using a silicon photomultiplier tracking array. J. Instrum., 8, P09011–20pp.
Abstract: NEXT-DEMO is a high-pressure xenon gas TPC which acts as a technological test-bed and demonstrator for the NEXT-100 neutrinoless double beta decay experiment. In its current configuration the apparatus fully implements the NEXT-100 design concept: an asymmetric TPC with an energy plane made of photomultipliers and a tracking plane made of silicon photomultipliers (SiPMs) coated with TPB. The detector in this new configuration has been used to reconstruct the characteristic signature of electrons in dense gas, demonstrating the ability to identify the MIP and “blob” regions. Moreover, the SiPM tracking plane allows for the definition of a large fiducial region in which an excellent energy resolution of 1.82% FWHM at 511 keV has been measured (a value which extrapolates to 0.83% at the xenon Q_ββ).
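The quoted extrapolation can be checked under the usual assumption that the fractional energy resolution scales as 1/√E (counting statistics of the detected light), taking the Xe-136 Q-value to be about 2458 keV:

```python
import math

def extrapolate_fwhm(fwhm_pct, e_ref_kev, e_target_kev):
    """Scale a fractional FWHM resolution as 1/sqrt(E), the behaviour
    expected when Poisson counting statistics dominate the resolution."""
    return fwhm_pct * math.sqrt(e_ref_kev / e_target_kev)

# 1.82% FWHM measured at 511 keV, extrapolated to ~2458 keV
print(round(extrapolate_fwhm(1.82, 511.0, 2458.0), 2))  # -> 0.83
```

This reproduces the 0.83% figure quoted in the abstract.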
Carrasco-Ribelles, L. A., Pardo-Mas, J. R., Tortajada, S., Saez, C., Valdivieso, B., & Garcia-Gomez, J. M. (2021). Predicting morbidity by local similarities in multi-scale patient trajectories. J. Biomed. Inform., 120, 103837–9pp.
Abstract: Patient Trajectories (PTs) are a method of representing the temporal evolution of patients. They can include information from different sources and be used in socio-medical or clinical domains. PTs have generally been used to generate and study the most common trajectories in, for instance, the development of a disease. Healthcare predictive models, on the other hand, generally rely on static snapshots of patient information. Only a few works on prediction in healthcare use PTs, and therefore benefit from their temporal dimension; all of them, however, build PTs from single-source information. The use of longitudinal multi-scale data to build PTs and obtain predictions about health conditions is therefore yet to be explored. Our hypothesis is that local similarities between small chunks of PTs can identify similar patients with respect to their future morbidities. The objectives of this work are (1) to develop a methodology to identify local similarities between PTs before the occurrence of morbidities, in order to predict these morbidities in new query individuals; and (2) to validate this methodology on risk prediction of cardiovascular disease (CVD) occurrence in patients with diabetes. We propose a novel formal definition of PTs based on sequences of longitudinal multi-scale data, together with a dynamic programming methodology that identifies local alignments between PTs for predicting future morbidities. Both the proposed PT definition and the alignment algorithm are generic and can be applied in any clinical domain. We validated this solution for predicting CVD in patients with diabetes, achieving a precision of 0.33, a recall of 0.72 and a specificity of 0.38. The proposed solution can therefore be of utmost utility for secondary screening in the diabetes use case.
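The local-alignment step can be sketched with a Smith-Waterman-style dynamic program over trajectories encoded as event sequences; the scoring scheme and event encoding below are illustrative assumptions, not the authors' implementation:

```python
def local_alignment_score(traj_a, traj_b, match=2, mismatch=-1, gap=-1):
    """Smith-Waterman-style local alignment between two patient
    trajectories encoded as sequences of hashable events.
    Returns the best local alignment score (0 if nothing aligns)."""
    n, m = len(traj_a), len(traj_b)
    H = [[0] * (m + 1) for _ in range(n + 1)]
    best = 0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            s = match if traj_a[i - 1] == traj_b[j - 1] else mismatch
            H[i][j] = max(0,
                          H[i - 1][j - 1] + s,  # align the two events
                          H[i - 1][j] + gap,    # gap in traj_b
                          H[i][j - 1] + gap)    # gap in traj_a
            best = max(best, H[i][j])
    return best

# Toy trajectories: the shared chunk ("dx_diabetes", "rx_statin")
# dominates the score even though the surrounding events differ.
query = ["visit", "dx_diabetes", "rx_statin", "lab_hba1c"]
ref = ["dx_diabetes", "rx_statin", "visit"]
print(local_alignment_score(query, ref))  # -> 4
```

Under this sketch, a query patient whose trajectory contains a high-scoring local chunk against the pre-morbidity trajectories of patients who later developed CVD would be flagged as at risk.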