Kim, J. S., Reuter, J., Rolbiecki, K., & Ruiz de Austri, R. (2016). A resonance without resonance: Scrutinizing the diphoton excess at 750 GeV. Phys. Lett. B, 755, 403–408.
Abstract: Motivated by the recent diphoton excesses reported by both the ATLAS and CMS collaborations, we suggest that a new heavy spinless particle is produced in gluon fusion at the LHC and decays into a pair of lighter pseudoscalars, which then decay to photons. The new resonances could arise from a new strongly interacting sector and couple to Standard Model gauge bosons only via the corresponding Wess-Zumino-Witten anomaly. We present a detailed recast of the newest 13 TeV data from ATLAS and CMS, together with the 8 TeV data, to assess the consistency of the parameter space for such resonances.
|
MoEDAL Collaboration (Acharya, B., et al.), Bernabeu, J., Garcia, C., Mamuzic, J., Mitsou, V. A., Ruiz de Austri, R., et al. (2018). Search for magnetic monopoles with the MoEDAL forward trapping detector in 2.11 fb^-1 of 13 TeV proton-proton collisions at the LHC. Phys. Lett. B, 782, 510–516.
Abstract: We update our previous search for trapped magnetic monopoles in LHC Run 2, using nearly six times more integrated luminosity and including additional models for the interpretation of the data. The MoEDAL forward trapping detector, comprising 222 kg of aluminium samples, was exposed to 2.11 fb^-1 of 13 TeV proton-proton collisions near the LHCb interaction point; the samples were then analysed by searching for induced persistent currents after passage through a superconducting magnetometer. Magnetic charges equal to the Dirac charge or above are excluded in all samples. The results are interpreted in Drell-Yan production models for monopoles with spins 0, 1/2 and 1: in addition to standard point-like couplings, we also consider couplings with momentum-dependent form factors. The search provides the best current laboratory constraints for monopoles with magnetic charges ranging from two to five times the Dirac charge.
|
Bertone, G., Deisenroth, M. P., Kim, J. S., Liem, S., Ruiz de Austri, R., & Welling, M. (2019). Accelerating the BSM interpretation of LHC data with machine learning. Phys. Dark Universe, 24, 100293–5pp.
Abstract: The interpretation of Large Hadron Collider (LHC) data in the framework of Beyond the Standard Model (BSM) theories is hampered by the need to run computationally expensive event generators and detector simulators. Performing statistically convergent scans of high-dimensional BSM theories is consequently challenging, and in practice unfeasible for very high-dimensional BSM theories. We present here a new machine learning method that accelerates the interpretation of LHC data, by learning the relationship between BSM theory parameters and data. As a proof-of-concept, we demonstrate that this technique accurately predicts natural SUSY signal events in two signal regions at the High Luminosity LHC, up to four orders of magnitude faster than standard techniques. The new approach makes it possible to rapidly and accurately reconstruct the theory parameters of complex BSM theories, should an excess in the data be discovered at the LHC.
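The core idea above, training a fast surrogate that maps BSM theory parameters to signal-region yields so the expensive generator/simulator chain is only run to build training data, can be sketched minimally as follows. This is an illustration under stated assumptions: the "expensive simulator" is a cheap analytic toy, and ordinary least squares on polynomial features stands in for the deep neural networks used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def expensive_simulator(theta):
    """Toy stand-in for the generator + detector chain: signal yield
    as a smooth function of two hypothetical theory parameters."""
    m, g = theta
    return 50.0 * g**2 * np.exp(-m / 500.0)

def features(thetas):
    """Quadratic polynomial features of the 2D parameter point."""
    m, g = thetas[:, 0], thetas[:, 1]
    return np.column_stack([np.ones_like(m), m, g, m * g, m**2, g**2])

# Build a training set by running the "simulator" on sampled parameter points.
thetas = np.column_stack([rng.uniform(100, 1000, 200),
                          rng.uniform(0.1, 2.0, 200)])
yields = np.array([expensive_simulator(t) for t in thetas])

# Fit the surrogate by least squares.
coef, *_ = np.linalg.lstsq(features(thetas), yields, rcond=None)

def surrogate(theta):
    """Near-instant prediction of the yield for new parameter points."""
    return features(np.atleast_2d(theta)) @ coef

# Goodness of fit on the training set.
preds = features(thetas) @ coef
r2 = 1 - np.sum((yields - preds)**2) / np.sum((yields - yields.mean())**2)

approx = float(surrogate(np.array([400.0, 1.0]))[0])
exact = expensive_simulator(np.array([400.0, 1.0]))
```

Once trained, each surrogate evaluation is a single small matrix product, which is what enables the orders-of-magnitude speed-up over rerunning the event generator per parameter point.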
|
Kim, J. S., Lopez-Fogliani, D. E., Perez, A. D., & Ruiz de Austri, R. (2022). The new (g-2)_mu and right-handed sneutrino dark matter. Nucl. Phys. B, 974, 115637–23pp.
Abstract: In this paper we investigate the (g-2)_mu discrepancy in the context of the R-parity conserving next-to-minimal supersymmetric Standard Model plus right-handed neutrino superfields. The model is able to reproduce neutrino physics data and includes the interesting possibility of having the right-handed sneutrino as the lightest supersymmetric particle and a viable dark matter candidate. Since right-handed sneutrinos are singlets, no new contributions to Delta a_mu arise with respect to the MSSM and NMSSM. However, the possibility of having the right-handed sneutrino as the lightest supersymmetric particle opens new ways to escape Large Hadron Collider and direct detection constraints. In particular, we find that dark matter masses within 10 GeV <~ m(snu_R) <~ 600 GeV are fully compatible with current experimental constraints. Remarkably, not only spectra with light sleptons are viable: we also obtain solutions with m(smuon) >~ 600 GeV over the entire dark matter mass range, which could be probed by new (g-2)_mu data in the near future. In addition, dark matter direct detection experiments will be able to explore a sizable portion of the allowed parameter space with m(snu_R) < 300 GeV, while indirect detection experiments will be able to probe a much smaller fraction within 200 GeV <~ m(snu_R) <~ 350 GeV.
|
MoEDAL Collaboration (Acharya, B., et al.), Mitsou, V. A., Papavassiliou, J., Ruiz de Austri, R., Santra, A., Vento, V., et al. (2022). Search for magnetic monopoles produced via the Schwinger mechanism. Nature, 602(7895), 63–67.
Abstract: Electrically charged particles can be created by the decay of strong enough electric fields, a phenomenon known as the Schwinger mechanism(1). By electromagnetic duality, a sufficiently strong magnetic field would similarly produce magnetic monopoles, if they exist(2). Magnetic monopoles are hypothetical fundamental particles that are predicted by several theories beyond the standard model(3-7) but have never been experimentally detected. Searching for magnetic monopoles produced via the Schwinger mechanism has not yet been attempted, but it is advantageous because the production rate can be calculated with semi-classical techniques, without recourse to perturbation theory, and because production should be enhanced by the monopoles' finite size(8,9) and strong coupling to photons(2,10). Here we present a search for magnetic monopole production by the Schwinger mechanism in Pb-Pb heavy-ion collisions at the Large Hadron Collider, which produce the strongest known magnetic fields in the current Universe(11). It was conducted by the MoEDAL experiment, whose trapping detectors were exposed to 0.235 nb^-1, corresponding to approximately 1.8 x 10^9 Pb-Pb collisions, with 5.02-teraelectronvolt center-of-mass energy per collision in November 2018. A superconducting quantum interference device (SQUID) magnetometer scanned the trapping detectors of MoEDAL for the presence of magnetic charge, which would induce a persistent current in the SQUID. Magnetic monopoles with integer Dirac charges of 1, 2 and 3 and masses up to 75 gigaelectronvolts per speed of light squared were excluded by the analysis at the 95% confidence level. This provides a lower mass limit for finite-size magnetic monopoles from a collider search and greatly extends previous mass bounds.
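For orientation, the weak-field Schwinger rate for electron-positron pair creation, and its naive electromagnetic dual for monopoles, can be written schematically as below. This is background, not taken from the paper; the monopole case receives large strong-coupling and finite-size corrections (the Dirac quantization condition makes the magnetic coupling large), which is precisely why the semi-classical, nonperturbative calculation cited in the abstract is needed.

```latex
% Schematic only. Weak-field Schwinger pair-production rate per unit
% volume for charge e, mass m in a constant electric field E (natural units):
\Gamma_{e^+e^-} \;\sim\; \frac{(eE)^2}{4\pi^3}\,
  \exp\!\left(-\frac{\pi m^2}{eE}\right)
% Duality substitution (e \to g, E \to B) for monopoles of magnetic
% charge g in a magnetic field B:
\Gamma_{\mathrm{mon}} \;\sim\;
  \exp\!\left(-\frac{\pi m^2}{gB}\right)
  \times \left(\text{strong-coupling and finite-size corrections}\right)
```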
|
Otten, S., Caron, S., de Swart, W., van Beekveld, M., Hendriks, L., van Leeuwen, C., et al. (2021). Event generation and statistical sampling for physics with deep generative models and a density information buffer. Nat. Commun., 12(1), 2985–16pp.
Abstract: Simulating nature, and in particular processes in particle physics, requires expensive computations that can take much longer than scientists can afford. Here, we explore ways towards a solution of this problem by investigating recent advances in generative modeling, and present a study of the generation of events from a physical process with deep generative models. The simulation of physical processes requires not only the production of physical events, but also that these events occur with the correct frequencies. We investigate the feasibility of learning both the event generation and the frequency of occurrence with several generative machine learning models, in order to produce events like Monte Carlo generators do. We study three processes: a simple two-body decay, the process e+e- -> Z -> l+l-, and pp -> ttbar including the decay of the top quarks and a simulation of the detector response. By buffering density information of encoded Monte Carlo events, given the encoder of a variational autoencoder, we are able to construct a prior for the sampling of new events from the decoder that yields distributions in very good agreement with real Monte Carlo events, generated several orders of magnitude faster. Applications of this work include generic density estimation and sampling, targeted event generation via a principal component analysis of encoded ground-truth data, anomaly detection, and more efficient importance sampling, e.g. for the phase-space integration of matrix elements in quantum field theories. In summary, the authors report buffered-density variational autoencoders for the generation of physical events; the method is computationally less expensive than traditional methods and, beyond accelerating data generation, can help to steer the generation and to detect anomalies.
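The density-buffer idea, encode ground-truth events, keep the latent codes as an empirical density, then sample smeared latents and decode them, can be sketched in a few lines. This is a minimal illustration under stated assumptions: a fixed invertible linear map (whitening) stands in for the trained variational autoencoder, and kernel-density-style Gaussian smearing of buffered latents stands in for the learned prior.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "Monte Carlo events": 2D correlated features.
cov = np.array([[1.0, 0.8], [0.8, 1.0]])
events = rng.multivariate_normal([0.0, 0.0], cov, size=5000)

# Toy encoder/decoder: whitening and its inverse (stands in for the VAE).
L = np.linalg.cholesky(cov)
encode = lambda x: x @ np.linalg.inv(L).T
decode = lambda z: z @ L.T

# 1) Encode the events and buffer the latent codes.
latent_buffer = encode(events)

# 2) Sample new latents from the buffer with KDE-style Gaussian smearing.
bandwidth = 0.1
idx = rng.integers(0, len(latent_buffer), size=5000)
new_latents = latent_buffer[idx] + bandwidth * rng.standard_normal((5000, 2))

# 3) Decode into new events; they should reproduce the original correlations.
generated = decode(new_latents)
corr_true = np.corrcoef(events.T)[0, 1]
corr_gen = np.corrcoef(generated.T)[0, 1]
```

The point of the buffer is that sampling from it reuses where the encoder actually placed real events, instead of trusting the nominal latent prior, which is what keeps the generated frequencies close to the Monte Carlo ones.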
|
Cabrera, M. E., Casas, J. A., & Ruiz de Austri, R. (2010). MSSM forecast for the LHC. J. High Energy Phys., 05(5), 043–48pp.
Abstract: We perform a forecast of the MSSM with universal soft terms (CMSSM) for the LHC, based on an improved Bayesian analysis. We do not incorporate ad hoc measures of fine-tuning to penalize unnatural possibilities: such a penalization arises from the Bayesian analysis itself when the experimental value of M_Z is considered. This allows us to scan the whole parameter space, allowing arbitrarily large soft terms. Still, the low-energy region is statistically favoured (even before including dark matter or g-2 constraints). Contrary to other studies, the results are almost unaffected by changing the upper limits taken for the soft terms. The results are also remarkably stable when using flat or logarithmic priors, a fact that arises from the larger statistical weight of the low-energy region in both cases. We then incorporate all the important experimental constraints into the analysis, obtaining a map of the probability density of the MSSM parameter space, i.e. the forecast of the MSSM. Since not all the experimental information is equally robust, we perform separate analyses depending on the group of observables used. When only the most robust ones are used, the favoured region of the parameter space contains a significant portion outside the LHC reach. This effect is reinforced if the Higgs mass is not close to its present experimental limit, and persists when dark matter constraints are included. Only when the g-2 constraint (based on e+e- data) is considered is the preferred region (for mu > 0) well inside the LHC scope. We also perform a Bayesian comparison of the positive- and negative-mu possibilities.
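The prior-robustness claim above can be illustrated with a one-line toy posterior. This is purely illustrative, not the paper's MSSM likelihood: when the likelihood itself strongly favours the low-scale region, flat and logarithmic priors on a hypothetical soft-mass parameter concentrate the posterior in the same place.

```python
import numpy as np

m = np.linspace(0.05, 10.0, 2000)   # hypothetical "soft mass" scale in TeV
like = np.exp(-m / 0.5)             # toy likelihood favouring low scales

def posterior_mass_below(prior, cut=1.0):
    """Fraction of (normalized) posterior mass below `cut`, on the grid."""
    post = like * prior
    post = post / post.sum()
    return float(post[m < cut].sum())

flat_frac = posterior_mass_below(np.ones_like(m))  # flat prior on m
log_frac = posterior_mass_below(1.0 / m)           # logarithmic prior on m
```

Both fractions come out large; the log prior merely pushes mass slightly further toward low scales, mirroring the stability under prior choice reported in the abstract.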
|
Bridges, M., Cranmer, K., Feroz, F., Hobson, M., Ruiz de Austri, R., & Trotta, R. (2011). A coverage study of the CMSSM based on ATLAS sensitivity using fast neural networks techniques. J. High Energy Phys., 03(3), 012–23pp.
Abstract: We assess the coverage properties of confidence and credible intervals on the CMSSM parameter space inferred from a Bayesian posterior and from the profile likelihood, based on an ATLAS sensitivity study. In order to make these calculations feasible, we introduce a new method based on neural networks to approximate the mapping between CMSSM parameters and weak-scale particle masses. Our method reduces the computational effort needed to sample the CMSSM parameter space by a factor of ~10^4 with respect to conventional techniques. We find that both the Bayesian posterior and the profile likelihood intervals can significantly over-cover, and trace the origin of this effect to physical boundaries in the parameter space. Finally, we point out that the effects intrinsic to the statistical procedure are conflated with simplifications to the likelihood functions from the experiments themselves.
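A coverage study, and the boundary-induced over-coverage the paper reports, can be demonstrated with a toy Monte Carlo: repeatedly simulate data from a known truth, build a nominal 68% interval each time, and count how often it contains the truth. All numbers here are illustrative, not from the paper; the physical boundary is modelled by a parameter constrained to be non-negative.

```python
import numpy as np

rng = np.random.default_rng(2)

def coverage(mu_true, n_rep=2000, n_obs=10, sigma=1.0):
    """Fraction of repetitions whose nominal 68% interval covers mu_true."""
    hits = 0
    for _ in range(n_rep):
        data = rng.normal(mu_true, sigma, n_obs)
        mu_hat = max(data.mean(), 0.0)     # MLE clipped at the boundary mu >= 0
        half = sigma / np.sqrt(n_obs)      # nominal 68% half-width
        if mu_hat - half <= mu_true <= mu_hat + half:
            hits += 1
    return hits / n_rep

far = coverage(mu_true=3.0)    # truth far from the boundary: ~nominal coverage
near = coverage(mu_true=0.0)   # truth at the boundary: over-coverage
```

Far from the boundary the empirical coverage sits near the nominal 68%; at the boundary the clipped estimate can only err upward, so the interval covers the truth more often than advertised, the same qualitative effect the paper identifies.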
|
Casas, J. A., Moreno, J. M., Rius, N., Ruiz de Austri, R., & Zaldivar, B. (2011). Fair scans of the seesaw. Consequences for predictions on LFV processes. J. High Energy Phys., 03(3), 034–22pp.
Abstract: We give a straightforward procedure to scan the seesaw parameter space, using the common “R-parametrization”, in a complete way. This includes a very simple rule to incorporate the perturbativity requirement as a condition on the entries of the R-matrix. As a relevant application, we show that the somewhat widespread belief that BR(mu -> e gamma) in supersymmetric seesaw models depends strongly on the value of theta_13 is an “optical effect” produced by incomplete scans, and does not hold after a careful analytical and numerical study. When the complete scan is done, BR(mu -> e gamma) becomes very insensitive to theta_13. This holds even if the right-handed neutrino masses are kept constant or under control (as is required for successful leptogenesis). In most cases the values of BR(mu -> e gamma) are larger than the experimental upper bound. Including (unflavoured) leptogenesis does not introduce any further dependence on theta_13, although it decreases the typical value of BR(mu -> e gamma).
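The completeness issue above hinges on how one samples the complex orthogonal R-matrix (R R^T = 1) of the Casas-Ibarra-style parametrization. A minimal sketch of a "fair" draw is to write R = exp(A) with A complex antisymmetric and scan A directly; the size of the entries is then what a perturbativity requirement would bound. The specific cut below is an illustrative placeholder, not the paper's condition.

```python
import numpy as np

rng = np.random.default_rng(3)

def expm_taylor(A, terms=40):
    """Matrix exponential by Taylor series (adequate for small 3x3 matrices)."""
    result = np.eye(A.shape[0], dtype=complex)
    term = np.eye(A.shape[0], dtype=complex)
    for n in range(1, terms):
        term = term @ A / n
        result = result + term
    return result

def random_complex_orthogonal(n=3, scale=0.5):
    """Draw R = exp(A) with A complex antisymmetric, so that R R^T = 1."""
    M = scale * (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)))
    A = M - M.T                      # complex antisymmetric part
    return expm_taylor(A)

R = random_complex_orthogonal()
orthogonality_defect = float(np.max(np.abs(R @ R.T - np.eye(3))))
passes_perturbativity = bool(np.max(np.abs(R)) < 10.0)  # placeholder cut
```

Because A is antisymmetric, exp(A)^T = exp(-A) is automatically the inverse of exp(A), so every draw lies exactly on the complex orthogonal group rather than on some incomplete angular slice of it.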
|
Feroz, F., Cranmer, K., Hobson, M., Ruiz de Austri, R., & Trotta, R. (2011). Challenges of profile likelihood evaluation in multi-dimensional SUSY scans. J. High Energy Phys., 06(6), 042–23pp.
Abstract: Statistical inference of the fundamental parameters of supersymmetric theories is a challenging and active endeavor. Several sophisticated algorithms have been employed to this end. While Markov-Chain Monte Carlo (MCMC) and nested sampling techniques are geared towards Bayesian inference, they have also been used to estimate frequentist confidence intervals based on the profile likelihood ratio. We investigate the performance and appropriate configuration of MULTINEST, a nested sampling based algorithm, when used for profile likelihood-based analyses, both on toy models and on the parameter space of the Constrained MSSM. We find that while the standard configuration previously used in the literature is appropriate for an accurate reconstruction of the Bayesian posterior, the profile likelihood is poorly approximated. We identify a more appropriate MULTINEST configuration for profile likelihood analyses, which gives an excellent exploration of the profile likelihood (albeit at a larger computational cost), including the identification of the global maximum likelihood value. We conclude that with the appropriate configuration MULTINEST is a suitable tool for profile likelihood studies, indicating that previous claims to the contrary are not well founded.
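The difficulty the paper probes is easy to state in code: given scan samples of parameters with their log-likelihoods, the profile over one parameter keeps the *maximum* likelihood in each bin (unlike the Bayesian marginal, which integrates), so its accuracy depends on the sampler having visited the narrow high-likelihood slice in every bin. A minimal sketch on a toy correlated Gaussian, with brute-force uniform samples standing in for a nested-sampling run:

```python
import numpy as np

rng = np.random.default_rng(4)

def loglike(t1, t2, rho=0.8):
    """Toy correlated 2D Gaussian log-likelihood (maximum 0 at the origin)."""
    return -0.5 * (t1**2 + t2**2 - 2 * rho * t1 * t2) / (1 - rho**2)

# Pretend these are the samples delivered by a scan of the parameter space.
t1 = rng.uniform(-4, 4, 200_000)
t2 = rng.uniform(-4, 4, 200_000)
ll = loglike(t1, t2)

# Profile over t1: maximum log-likelihood within each t1 bin.
bins = np.linspace(-4, 4, 41)
idx = np.digitize(t1, bins) - 1
profile = np.full(40, -np.inf)
np.maximum.at(profile, idx, ll)

# For this Gaussian, maximizing over t2 analytically gives -0.5 * t1^2.
centers = 0.5 * (bins[:-1] + bins[1:])
exact = -0.5 * centers**2
max_dev = float(np.max(np.abs(profile - exact)))
```

With dense sampling the binned maxima track the analytic profile; with the sparser, posterior-weighted samples a standard nested-sampling configuration produces, the tails of the profile are undersampled, which is the failure mode the paper diagnoses and reconfigures MULTINEST to avoid.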
|