Caron, S., Eckner, C., Hendriks, L., Johannesson, G., Ruiz de Austri, R., & Zaharijas, G. (2023). Mind the gap: the discrepancy between simulation and reality drives interpretations of the Galactic Center Excess. J. Cosmol. Astropart. Phys., 06(6), 013–56pp.
Abstract: The Galactic Center Excess (GCE) in GeV gamma rays has been debated for over a decade, with the possibility that it might be due to dark matter annihilation or undetected point sources such as millisecond pulsars (MSPs). This study investigates how the gamma-ray emission model (γEM) used in Galactic center analyses affects the interpretation of the GCE's nature. To address this issue, we construct an ultra-fast and powerful inference pipeline based on convolutional Deep Ensemble Networks. We explore the two main competing hypotheses for the GCE using a set of γEMs with increasing parametric freedom. We calculate the fractional contribution (f_src) of a dim population of MSPs to the total luminosity of the GCE and analyze its dependence on the complexity of the γEM. For the simplest γEM, we obtain f_src = 0.10 ± 0.07, while the most complex model yields f_src = 0.79 ± 0.24. In conclusion, we find that the statement about the nature of the GCE (dark matter or not) strongly depends on the assumed γEM. The quoted results for f_src do not account for the additional uncertainty arising from the fact that the observed gamma-ray sky is out-of-distribution concerning the investigated γEM iterations. We quantify the reality gap between our γEMs using deep-learning-based One-Class Deep Support Vector Data Description networks, revealing that all employed γEMs have gaps to reality. Our study casts doubt on the validity of previous conclusions regarding the GCE and dark matter, and underscores the urgent need to account for the reality gap and consider previously overlooked "out of domain" uncertainties in future interpretations.
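As an illustration of the deep-ensemble inference idea described above, the following is a minimal sketch (not the authors' pipeline): several independently initialised CNN regressors are trained on simulated gamma-ray count maps labelled with the MSP fraction, and the ensemble mean and spread over their predictions give the f_src estimate and its uncertainty. All class names, layer sizes, map shapes, and training settings are illustrative assumptions.

```python
# Minimal sketch of a convolutional deep-ensemble regressor for f_src.
# Shapes, architectures and hyperparameters are illustrative assumptions,
# not the configuration used in the paper.
import torch
import torch.nn as nn

class SkyMapRegressor(nn.Module):
    """Small CNN mapping a single-channel sky map to a fraction in [0, 1]."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
            nn.Flatten(), nn.Linear(32, 1), nn.Sigmoid(),
        )

    def forward(self, x):
        return self.net(x).squeeze(-1)

def train_ensemble(maps, f_src_labels, n_members=5, epochs=10):
    """Train n_members independent regressors on (maps, f_src_labels)."""
    members = []
    for _ in range(n_members):
        model = SkyMapRegressor()
        opt = torch.optim.Adam(model.parameters(), lr=1e-3)
        for _ in range(epochs):
            opt.zero_grad()
            loss = nn.functional.mse_loss(model(maps), f_src_labels)
            loss.backward()
            opt.step()
        members.append(model)
    return members

def predict_f_src(members, sky_map):
    """Ensemble mean and standard deviation as point estimate and uncertainty."""
    with torch.no_grad():
        preds = torch.stack([m(sky_map) for m in members])
    return preds.mean(dim=0), preds.std(dim=0)

# Toy usage: 64 simulated 32x32 maps with known MSP fractions.
maps = torch.rand(64, 1, 32, 32)
labels = torch.rand(64)
ensemble = train_ensemble(maps, labels)
mean, spread = predict_f_src(ensemble, torch.rand(1, 1, 32, 32))
```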
|
Achterberg, A., Amoroso, S., Caron, S., Hendriks, L., Ruiz de Austri, R., & Weniger, C. (2015). A description of the Galactic Center excess in the Minimal Supersymmetric Standard Model. J. Cosmol. Astropart. Phys., 08(8), 006–27pp.
Abstract: Observations with the Fermi Large Area Telescope (LAT) indicate an excess in gamma rays originating from the center of our Galaxy. A possible explanation for this excess is the annihilation of Dark Matter particles. We have investigated the annihilation of neutralinos as Dark Matter candidates within the phenomenological Minimal Supersymmetric Standard Model (pMSSM). An iterative particle filter approach was used to search for solutions within the pMSSM. We found solutions that are consistent with astroparticle physics and collider experiments, and provide a fit to the energy spectrum of the excess. The neutralino is a Bino/Higgsino or Bino/Wino/Higgsino mixture with a mass in the range 84–92 GeV or 87–97 GeV annihilating into W bosons. A third solution is found for a neutralino of mass 174–187 GeV annihilating into top quarks. The best solutions yield a Dark Matter relic density 0.06 < Ωh² < 0.13. These pMSSM solutions make clear forecasts for LHC, direct and indirect DM detection experiments. If the pMSSM explanation of the excess seen by Fermi-LAT is correct, a DM signal might be discovered soon.
|
Strege, C., Bertone, G., Besjes, G. J., Caron, S., Ruiz de Austri, R., Strubig, A., et al. (2014). Profile likelihood maps of a 15-dimensional MSSM. J. High Energy Phys., 09(9), 081–59pp.
Abstract: We present statistically convergent profile likelihood maps obtained via global fits of a phenomenological Minimal Supersymmetric Standard Model with 15 free parameters (the MSSM-15), based on over 250M points. We derive constraints on the model parameters from direct detection limits on dark matter, the Planck relic density measurement and data from accelerator searches. We provide a detailed analysis of the rich phenomenology of this model, and determine the SUSY mass spectrum and dark matter properties that are preferred by current experimental constraints. We evaluate the impact of the measurement of the anomalous magnetic moment of the muon (g − 2) on our results, and provide an analysis of scenarios in which the lightest neutralino is a subdominant component of the dark matter. The MSSM-15 parameters are relatively weakly constrained by current data sets, with the exception of the parameters related to dark matter phenomenology (M₁, M₂, μ), which are restricted to the sub-TeV regime, mainly due to the relic density constraint. The mass of the lightest neutralino is found to be < 1.5 TeV at 99% C.L., but can extend up to 3 TeV when excluding the g − 2 constraint from the analysis. Low-mass bino-like neutralinos are strongly favoured, with spin-independent scattering cross-sections extending to very small values, ∼10⁻²⁰ pb. ATLAS SUSY null searches strongly impact on this mass range, and thus rule out a region of parameter space that is outside the reach of any current or future direct detection experiment. The best-fit point obtained after inclusion of all data corresponds to a squark mass of 2.3 TeV, a gluino mass of 2.1 TeV and a 130 GeV neutralino with a spin-independent cross-section of 2.4 × 10⁻¹⁰ pb, which is within the reach of future multi-ton scale direct detection experiments and of the upcoming LHC run at increased centre-of-mass energy.
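To make the profiling procedure behind such maps concrete, the following is a minimal sketch (not the paper's global-fit code): for each bin of two parameters of interest, the likelihood is maximised ("profiled") over all remaining parameters among the sampled points, and the map shows the profile likelihood ratio. The toy likelihood, parameter ranges, and binning below are illustrative assumptions, not the MSSM-15 likelihood.

```python
# Minimal sketch of building a 2D profile likelihood map from a parameter scan.
# The toy likelihood and parameter names are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Toy "scan": sampled points in (neutralino mass, log10 cross-section, nuisance)
# with an associated log-likelihood value.
n_points = 100_000
m_chi = rng.uniform(0, 3000, n_points)        # neutralino mass [GeV]
log_sigma = rng.uniform(-20, -6, n_points)    # log10 spin-independent cross-section [pb]
nuisance = rng.normal(0, 1, n_points)
loglike = -0.5 * (((m_chi - 130) / 300) ** 2
                  + ((log_sigma + 9.6) / 1.5) ** 2
                  + nuisance ** 2)

# Profile over everything except (m_chi, log_sigma):
# keep the maximum log-likelihood in each 2D bin.
n_bins = 60
ix = np.clip((m_chi / 3000 * n_bins).astype(int), 0, n_bins - 1)
iy = np.clip(((log_sigma + 20) / 14 * n_bins).astype(int), 0, n_bins - 1)
profile = np.full((n_bins, n_bins), -np.inf)
np.maximum.at(profile, (ix, iy), loglike)

# Profile likelihood ratio map, -2 ln(L / L_max), as shown in 2D figures.
delta_chi2 = -2.0 * (profile - loglike.max())
```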
|
van Beekveld, M., Caron, S., Hendriks, L., Jackson, P., Leinweber, A., Otten, S., et al. (2021). Combining outlier analysis algorithms to identify new physics at the LHC. J. High Energy Phys., 09(9), 024–33pp.
Abstract: The lack of evidence for new physics at the Large Hadron Collider so far has prompted the development of model-independent search techniques. In this study, we compare the anomaly scores of a variety of anomaly detection techniques: an isolation forest, a Gaussian mixture model, a static autoencoder, and a beta-variational autoencoder (VAE), where we define the reconstruction loss of the latter as a weighted combination of regression and classification terms. We apply these algorithms to the 4-vectors of simulated LHC data, but also investigate the performance when the non-VAE algorithms are applied to the latent space variables created by the VAE. In addition, we assess the performance when the anomaly scores of these algorithms are combined in various ways. Using supersymmetric benchmark points, we find that the logical AND combination of the anomaly scores yielded from algorithms trained in the latent space of the VAE is the most effective discriminator of all methods tested.
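As a concrete illustration of the score-combination idea described above, the following is a minimal sketch (not the paper's pipeline): two outlier detectors are run on the same latent representation, and an event is flagged anomalous only if both agree (logical AND). Here random vectors stand in for the VAE latent variables, and the detectors, quantile thresholds, and event counts are illustrative assumptions.

```python
# Minimal sketch of AND-combining two anomaly detectors on a latent space.
# Random vectors stand in for VAE latents; thresholds are illustrative.
import numpy as np
from sklearn.ensemble import IsolationForest
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
latent_background = rng.normal(0, 1, size=(5000, 8))      # "Standard Model" events
latent_candidates = rng.normal(0.5, 1.5, size=(200, 8))   # events to be classified

# Detector 1: isolation forest anomaly score (higher = more anomalous).
iso = IsolationForest(random_state=0).fit(latent_background)
iso_score = -iso.score_samples(latent_candidates)

# Detector 2: Gaussian mixture model, score = negative log-likelihood.
gmm = GaussianMixture(n_components=4, random_state=0).fit(latent_background)
gmm_score = -gmm.score_samples(latent_candidates)

# Flag the top 5% of each score (calibrated on background), then AND the flags.
iso_cut = np.quantile(-iso.score_samples(latent_background), 0.95)
gmm_cut = np.quantile(-gmm.score_samples(latent_background), 0.95)
anomalous = (iso_score > iso_cut) & (gmm_score > gmm_cut)
```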
|