Brzezinski, K. et al. (2023). Detection of range shifts in proton beam therapy using the J-PET scanner: a patient simulation study. Phys. Med. Biol., 68(14), 145016–17pp.
Abstract: Objective. The Jagiellonian positron emission tomography (J-PET) technology, based on plastic scintillators, has been proposed as a cost-effective tool for detecting range deviations during proton therapy. This study investigates the feasibility of using J-PET for range monitoring by means of a detailed Monte Carlo simulation study of 95 patients who underwent proton therapy at the Cyclotron Centre Bronowice (CCB) in Krakow, Poland. Approach. Discrepancies between prescribed and delivered treatments were artificially introduced in the simulations by means of shifts in patient positioning and in the calibration curve relating Hounsfield units to relative proton stopping power. A dual-layer, cylindrical J-PET geometry was simulated in an in-room monitoring scenario and a triple-layer, dual-head geometry in an in-beam protocol. The distribution of range shifts in reconstructed PET activity was visualized in the beam's eye view. Linear prediction models were constructed from all patients in the cohort, using the mean shift in reconstructed PET activity as a predictor of the mean proton range deviation. Main results. Maps of deviations in the range of reconstructed PET distributions showed agreement with those of deviations in dose range in most patients. The linear prediction model showed a good fit, with coefficient of determination r² = 0.84 (in-room) and 0.75 (in-beam). Residual standard error was below 1 mm: 0.33 mm (in-room) and 0.23 mm (in-beam). Significance. The precision of the proposed prediction models shows the sensitivity of the proposed J-PET scanners to shifts in proton range for a wide range of clinical treatment plans. Furthermore, it motivates the use of such models as a tool for predicting proton range deviations and opens up new prospects for investigations into the use of intra-treatment PET images for predicting clinical metrics that aid in the assessment of the quality of delivered treatment.
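The fit quality quoted above (r² and residual standard error) comes from an ordinary least-squares linear model. A minimal sketch of that computation, using synthetic stand-in data rather than the cohort values from the paper:

```python
import numpy as np

# Synthetic stand-ins for the per-patient quantities in the study: mean shift
# in reconstructed PET activity range (predictor) vs. mean proton range
# deviation (response), both in mm. The actual values come from the
# simulated 95-patient cohort and are not reproduced here.
rng = np.random.default_rng(0)
pet_shift = rng.uniform(-10, 10, size=95)              # mm
range_dev = 1.0 * pet_shift + rng.normal(0, 0.4, 95)   # mm

# Ordinary least-squares fit of a linear prediction model.
slope, intercept = np.polyfit(pet_shift, range_dev, 1)
pred = slope * pet_shift + intercept

# Goodness of fit: coefficient of determination r^2 and residual standard
# error (n - 2 degrees of freedom for a two-parameter linear model).
resid = range_dev - pred
r2 = 1 - np.sum(resid**2) / np.sum((range_dev - range_dev.mean()) ** 2)
rse = np.sqrt(np.sum(resid**2) / (len(range_dev) - 2))
print(f"r^2 = {r2:.2f}, residual standard error = {rse:.2f} mm")
```

With real data, the fitted slope, r², and residual standard error play exactly the roles reported in the abstract: the residual standard error bounds how precisely a PET-derived shift predicts the dose range deviation.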
Gisbert, H., & Pich, A. (2018). Direct CP violation in K⁰ → ππ: Standard Model status. Rep. Prog. Phys., 81(7), 076201–22pp.
Abstract: In 1988 the NA31 experiment presented the first evidence of direct CP violation in the K⁰ → ππ decay amplitudes. A clear signal with a 7.2σ statistical significance was later established with the full data samples from the NA31, E731, NA48 and KTeV experiments, confirming that CP violation is associated with a ΔS = 1 quark transition, as predicted by the Standard Model. However, the theoretical prediction for the measured ratio ε′/ε has been a subject of strong controversy over the years. Although the underlying physics was already clarified in 2001, the recent release of improved lattice data has revived the theoretical debate. We review the current status, discussing in detail the different ingredients that enter into the calculation of this observable and the reasons why seemingly contradictory predictions were obtained in the past by several groups. An update of the Standard Model prediction is presented and the prospects for future improvements are analysed. Taking into account all known short-distance and long-distance contributions, one obtains Re(ε′/ε) = (15 ± 7) × 10⁻⁴, in good agreement with the experimental measurement.
Curtin, D., et al., & Hirsch, M. (2019). Long-lived particles at the energy frontier: the MATHUSLA physics case. Rep. Prog. Phys., 82(11), 116201–133pp.
Abstract: We examine the theoretical motivations for long-lived particle (LLP) signals at the LHC in a comprehensive survey of standard model (SM) extensions. LLPs are a common prediction of a wide range of theories that address unsolved fundamental mysteries such as naturalness, dark matter, baryogenesis and neutrino masses, and represent a natural and generic possibility for physics beyond the SM (BSM). In most cases the LLP lifetime can be treated as a free parameter from the μm scale up to the Big Bang Nucleosynthesis limit of ∼10⁷ m. Neutral LLPs with lifetimes above ∼100 m are particularly difficult to probe, as the sensitivity of the LHC main detectors is limited by challenging backgrounds, triggers, and small acceptances. MATHUSLA is a proposal for a minimally instrumented, large-volume surface detector near ATLAS or CMS. It would search for neutral LLPs produced in HL-LHC collisions by reconstructing displaced vertices (DVs) in a low-background environment, extending the sensitivity of the main detectors by orders of magnitude in the long-lifetime regime. We study the LLP physics opportunities afforded by a MATHUSLA-like detector at the HL-LHC, assuming backgrounds can be rejected as expected. We develop a model-independent approach to describe the sensitivity of MATHUSLA to BSM LLP signals, and compare it to DV and missing energy searches at ATLAS or CMS. We then explore the BSM motivations for LLPs in considerable detail, presenting a large number of new sensitivity studies. While our discussion is especially oriented towards the long-lifetime regime at MATHUSLA, this survey underlines the importance of a varied LLP search program at the LHC in general. By synthesizing these results into a general discussion of the top-down and bottom-up motivations for LLP searches, it is our aim to demonstrate the exceptional strength and breadth of the physics case for the construction of the MATHUSLA detector.
Kasieczka, G., et al., & Sanz, V. (2021). The LHC Olympics 2020: a community challenge for anomaly detection in high energy physics. Rep. Prog. Phys., 84(12), 124201–64pp.
Abstract: A new paradigm for data-driven, model-agnostic new physics searches at colliders is emerging that aims to leverage recent breakthroughs in anomaly detection and machine learning. In order to develop and benchmark new anomaly detection methods within this framework, it is essential to have standard datasets. To this end, we have created the LHC Olympics 2020, a community challenge accompanied by a set of simulated collider events. Participants in these Olympics have developed their methods using an R&D dataset and then tested them on black boxes: datasets with an unknown anomaly (or not). Methods made use of modern machine learning tools and were based on unsupervised learning (autoencoders, generative adversarial networks, normalizing flows), weakly supervised learning, and semi-supervised learning. This paper reviews the LHC Olympics 2020 challenge, including an overview of the competition, a description of methods deployed in the competition, lessons learned from the experience, and implications for data analyses with future datasets as well as future colliders.
Borsato, M., et al., Zurita, J., Henry, L., Jashal, B. K., & Oyanguren, A. (2022). Unleashing the full power of LHCb to probe stealth new physics. Rep. Prog. Phys., 85(2), 024201–45pp.
Abstract: In this paper, we describe the potential of the LHCb experiment to detect stealth physics. This refers to dynamics beyond the standard model that would elude searches that focus on energetic objects or precision measurements of known processes. Stealth signatures include long-lived particles and light resonances that are produced very rarely or together with overwhelming backgrounds. We discuss why LHCb is equipped to discover this kind of physics at the Large Hadron Collider and provide examples of well-motivated theoretical models that can be probed in great detail at the experiment.
AbdusSalam, S. S., et al., & Eberhardt, O. (2022). Simple and statistically sound recommendations for analysing physical theories. Rep. Prog. Phys., 85(5), 052201–11pp.
Abstract: Physical theories that depend on many parameters or are tested against data from many different experiments pose unique challenges to statistical inference. Many models in particle physics, astrophysics and cosmology fall into one or both of these categories. These issues are often sidestepped with statistically unsound ad hoc methods, involving intersection of parameter intervals estimated by multiple experiments, and random or grid sampling of model parameters. Whilst these methods are easy to apply, they exhibit pathologies even in low-dimensional parameter spaces, and quickly become problematic to use and interpret in higher dimensions. In this article we give clear guidance for going beyond these procedures, suggesting where possible simple methods for performing statistically sound inference, and recommendations of readily available software tools and standards that can assist in doing so. Our aim is to provide any physicists lacking comprehensive statistical training with recommendations for reaching correct scientific conclusions, with only a modest increase in analysis burden. Our examples can be reproduced with the code publicly available at Zenodo.
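The contrast the abstract draws between interval intersection and sound inference can be made concrete with two Gaussian measurements of the same parameter. The central values below are invented for illustration; the sound combination multiplies the likelihoods (inverse-variance weighting), while naive 1σ-interval intersection can collapse to a point or become empty even for statistically compatible inputs:

```python
import numpy as np

# Two hypothetical measurements of the same parameter (illustrative values).
mu = np.array([1.0, 2.0])      # central values from two experiments
sigma = np.array([0.5, 0.5])   # their standard deviations

# Statistically sound combination: the product of Gaussian likelihoods is
# again Gaussian, with an inverse-variance weighted mean and reduced width.
w = 1.0 / sigma**2
mu_comb = np.sum(w * mu) / np.sum(w)
sigma_comb = 1.0 / np.sqrt(np.sum(w))
print(f"combined: {mu_comb:.2f} +/- {sigma_comb:.2f}")

# Ad hoc interval intersection: [0.5, 1.5] with [1.5, 2.5] collapses to a
# single point, and slightly more discrepant (yet still compatible at well
# under 2 sigma) inputs would yield an empty "allowed region".
lo = (mu - sigma).max()
hi = (mu + sigma).min()
print(f"intersection: [{lo:.2f}, {hi:.2f}]")
```

The likelihood product yields 1.50 ± 0.35, a sensible estimate with calibrated coverage, whereas the intersection degenerates exactly when the measurements mildly disagree, which is one of the pathologies the article warns about.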
Kogler, R., Nachman, B., Schmidt, A., Asquith, L., Winkels, E., Campanelli, M., et al. (2019). Jet substructure at the Large Hadron Collider. Rev. Mod. Phys., 91(4), 045003–44pp.
Abstract: Jet substructure has emerged to play a central role at the Large Hadron Collider, where it has provided numerous innovative ways to search for new physics and to probe the standard model, particularly in extreme regions of phase space. This review focuses on the development and use of state-of-the-art jet substructure techniques by the ATLAS and CMS experiments.
de Azcarraga, J. A. (2022). The new Spanish educational legislation: why public education will not improve. Rev. Esp. Pedagog., 80(281), 111–129.
Abstract: This paper provides some reasons that explain, in the view of the author, why the present eagerness of the Spanish Educational Authorities to reform all levels of education, from primary school to the universities, will not improve the quality of the Spanish educational system.
ANTARES Collaboration (Albert, A., et al.), Barrios-Marti, J., Coleiro, A., Colomer, M., Hernandez-Rey, J. J., Illuminati, G., et al. (2019). The search for high-energy neutrinos coincident with fast radio bursts with the ANTARES neutrino telescope. Mon. Not. Roy. Astron. Soc., 482(1), 184–193.
Abstract: In the past decade, a new class of bright transient radio sources with millisecond duration has been discovered. The origin of these so-called fast radio bursts (FRBs) is still a mystery, despite the growing observational efforts made by various multiwavelength and multimessenger facilities. To date, many models have been proposed to explain FRBs, but neither the progenitors nor the radiative and the particle acceleration processes at work have been clearly identified. In this paper, we assess whether hadronic processes may occur in the vicinity of the FRB source. If they do, FRBs may contribute to the high-energy cosmic-ray and neutrino fluxes. A search for these hadronic signatures was carried out using the ANTARES neutrino telescope. The analysis consists of looking for high-energy neutrinos, in the TeV-PeV regime, that are spatially and temporally coincident with the detected FRBs. Most of the FRBs discovered in the period 2013-2017 were in the field of view of the ANTARES detector, which is sensitive mostly to events originating from the Southern hemisphere. From this period, 12 FRBs were selected and no coincident neutrino candidate was observed. Upper limits on the per-burst neutrino fluence were derived using a power-law spectrum, dN/dE_ν ∝ E_ν^(−γ), for the incoming neutrino flux, assuming spectral indices γ = 1.0, 2.0, 2.5. Finally, the neutrino energy was constrained by computing the total energy radiated in neutrinos, assuming different distances for the FRBs. Constraints on the neutrino fluence and on the energy released were derived from the associated null results.
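The per-burst fluence limit above rests on integrating the assumed power-law spectrum dN/dE_ν ∝ E_ν^(−γ) over the search band. A minimal sketch of that integral for the three quoted spectral indices, with an arbitrary normalization phi0 standing in for the quantity the null result actually constrains:

```python
import numpy as np

# Energy fluence of a power-law neutrino spectrum dN/dE = phi0 * E**(-gamma),
# integrated over an illustrative TeV-PeV band. phi0 and the band edges are
# placeholders; the real analysis folds in detector response per burst.
E_MIN, E_MAX = 1e3, 1e6   # GeV (1 TeV to 1 PeV)

def fluence(phi0, gamma, e_min=E_MIN, e_max=E_MAX):
    """Integral of E * dN/dE between e_min and e_max (in GeV per unit area)."""
    if np.isclose(gamma, 2.0):
        # gamma = 2 is the logarithmic special case of the power-law integral.
        return phi0 * np.log(e_max / e_min)
    return phi0 * (e_max**(2 - gamma) - e_min**(2 - gamma)) / (2 - gamma)

for gamma in (1.0, 2.0, 2.5):
    print(f"gamma = {gamma}: fluence = {fluence(1.0, gamma):.3e} * phi0")
```

The steep dependence on γ is why the limits are quoted per spectral index: a hard γ = 1 spectrum concentrates fluence at the highest energies, while γ = 2.5 is dominated by the low-energy end of the band.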
Vagnozzi, S., Visinelli, L., Mena, O., & Mota, D. F. (2020). Do we have any hope of detecting scattering between dark energy and baryons through cosmology? Mon. Not. Roy. Astron. Soc., 493(1), 1139–1152.
Abstract: We consider the possibility that dark energy and baryons might scatter off each other. The type of interaction we consider leads to a pure momentum exchange, and does not affect the background evolution of the expansion history. We parametrize this interaction in an effective way at the level of Boltzmann equations. We compute the effect of dark energy-baryon scattering on cosmological observables, focusing on the cosmic microwave background (CMB) temperature anisotropy power spectrum and the matter power spectrum. Surprisingly, we find that even huge dark energy-baryon cross-sections σ_xb ∼ O(b), which are generically excluded by non-cosmological probes such as collider searches or precision gravity tests, only leave an insignificant imprint on the observables considered. In the case of the CMB temperature power spectrum, the only imprint consists in a sub-per cent enhancement or depletion of power (depending on whether the dark energy equation of state lies above or below −1) at very low multipoles, which is thus swamped by cosmic variance. These effects are explained in terms of differences in how gravitational potentials decay in the presence of dark energy-baryon scattering, which ultimately lead to an increase or decrease in the late-time integrated Sachs-Wolfe power. Even smaller related effects are imprinted on the matter power spectrum. The imprints on the CMB are not expected to be degenerate with the effects due to altering the dark energy sound speed. We conclude that, while strongly appealing, the prospects for a direct detection of dark energy through cosmology do not seem feasible when considering realistic dark energy-baryon cross-sections. As a caveat, our results hold to linear order in perturbation theory.