Achterberg, A., Amoroso, S., Caron, S., Hendriks, L., Ruiz de Austri, R., & Weniger, C. (2015). A description of the Galactic Center excess in the Minimal Supersymmetric Standard Model. J. Cosmol. Astropart. Phys., 08(8), 006–27pp.
Abstract: Observations with the Fermi Large Area Telescope (LAT) indicate an excess in gamma rays originating from the center of our Galaxy. A possible explanation for this excess is the annihilation of Dark Matter particles. We have investigated the annihilation of neutralinos as Dark Matter candidates within the phenomenological Minimal Supersymmetric Standard Model (pMSSM). An iterative particle filter approach was used to search for solutions within the pMSSM. We found solutions that are consistent with astroparticle physics and collider experiments, and provide a fit to the energy spectrum of the excess. The neutralino is a Bino/Higgsino or Bino/Wino/Higgsino mixture with a mass in the range 84-92 GeV or 87-97 GeV annihilating into W bosons. A third solution is found for a neutralino of mass 174-187 GeV annihilating into top quarks. The best solutions yield a Dark Matter relic density 0.06 < Omega h^2 < 0.13. These pMSSM solutions make clear forecasts for the LHC and for direct and indirect DM detection experiments. If the pMSSM explanation of the excess seen by Fermi-LAT is correct, a DM signal might be discovered soon.
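The iterative particle filter approach mentioned in this abstract can be illustrated with a toy sketch: a population of parameter points is repeatedly reweighted by a likelihood, resampled, and jittered until it concentrates in the best-fit region. The one-dimensional Gaussian "likelihood" peaked at 90 GeV below is a hypothetical stand-in for the real pMSSM/GCE fit, not the paper's actual likelihood or parameter space.

```python
import math
import random

random.seed(0)

def log_like(mass):
    # hypothetical Gaussian "fit quality" peaked at 90 GeV; a stand-in
    # for the real multi-dimensional pMSSM likelihood
    return -0.5 * ((mass - 90.0) / 5.0) ** 2

# broad starting population over a toy neutralino mass range (GeV)
particles = [random.uniform(10.0, 500.0) for _ in range(2000)]

for _ in range(10):
    weights = [math.exp(log_like(m)) for m in particles]
    # resample proportionally to the weights, then jitter the survivors
    particles = [m + random.gauss(0.0, 2.0)
                 for m in random.choices(particles, weights=weights, k=len(particles))]

mean_mass = sum(particles) / len(particles)
```

After a few iterations the population collapses onto the high-likelihood region near 90 GeV; the real scan works the same way but over the full pMSSM parameter space with experimental constraints folded into the weights.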
Achterberg, A., van Beekveld, M., Caron, S., Gomez-Vargas, G. A., Hendriks, L., & Ruiz de Austri, R. (2017). Implications of the Fermi-LAT Pass 8 Galactic Center excess on supersymmetric dark matter. J. Cosmol. Astropart. Phys., 12(12), 040–23pp.
Abstract: The Fermi Collaboration has recently updated their analysis of gamma rays from the center of the Galaxy. They reconfirm the presence of an unexplained emission feature which is most prominent in the region of 1-10 GeV, known as the Galactic Center GeV excess (GCE). Although the GCE is now firmly detected, an interpretation of this emission as a signal of self-annihilating dark matter (DM) particles is not unambiguously possible due to systematic effects in the gamma-ray modeling estimated in the Galactic Plane. In this paper we build a covariance matrix, collecting the different systematic uncertainties investigated in the Fermi Collaboration's paper that affect the GCE spectrum. We show that models where part of the GCE is due to annihilating DM are still consistent with the new data. We also re-evaluate the parameter space regions of the minimal supersymmetric Standard Model (MSSM) that can contribute dominantly to the GCE via neutralino DM annihilation. All recent constraints from DM direct detection experiments such as PICO, LUX, PandaX and XENON1T, limits on the annihilation cross section from dwarf spheroidal galaxies, and the Large Hadron Collider limits are considered in this analysis. Due to a slight shift in the energy spectrum of the GCE with respect to the previous Fermi analysis, and the recent limits from direct detection experiments, we find a slightly shifted parameter region of the MSSM, compared to our previous analysis, that is consistent with the GCE. Neutralinos with a mass between 85 and 220 GeV can describe the excess via annihilation into a pair of W bosons or top quarks. Remarkably, there are models with low fine-tuning among the regions that we have found. The complete set of solutions will be probed by upcoming direct detection experiments and with dedicated searches in the upcoming data of the Large Hadron Collider.
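The covariance-matrix treatment of correlated systematics described here amounts to building C_ij from statistical and systematic components and evaluating chi^2 = r^T C^-1 r against a model. A minimal two-bin toy, with all numbers invented for illustration (the real matrix covers the full GCE spectrum and several systematic components):

```python
# toy 2-bin spectrum: residuals between data and a DM model (arbitrary units)
residual = [0.8, -0.5]

# statistical error per bin
stat = [0.4, 0.3]

# one fully correlated systematic component (e.g. a diffuse-model variation),
# expressed as a shift per bin; it contributes the outer product s_i * s_j
syst = [0.5, 0.4]

# C_ij = stat_i^2 * delta_ij + syst_i * syst_j
C = [[stat[0] ** 2 + syst[0] * syst[0], syst[0] * syst[1]],
     [syst[1] * syst[0], stat[1] ** 2 + syst[1] * syst[1]]]

# invert the 2x2 covariance explicitly
det = C[0][0] * C[1][1] - C[0][1] * C[1][0]
Cinv = [[C[1][1] / det, -C[0][1] / det],
        [-C[1][0] / det, C[0][0] / det]]

# chi^2 = r^T C^-1 r; off-diagonal terms encode the correlated uncertainty
chi2 = sum(residual[i] * Cinv[i][j] * residual[j]
           for i in range(2) for j in range(2))
```

With more bins one would solve the linear system numerically instead of writing the inverse by hand, but the χ² definition is the same.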
Caron, S., Gomez-Vargas, G. A., Hendriks, L., & Ruiz de Austri, R. (2018). Analyzing gamma rays of the Galactic Center with deep learning. J. Cosmol. Astropart. Phys., 05(5), 058–24pp.
Abstract: We present the application of convolutional neural networks to a particular problem in gamma-ray astronomy. Explicitly, we use this method to investigate the origin of an excess emission of GeV gamma rays in the direction of the Galactic Center, reported by several groups by analyzing Fermi-LAT data. Interpretations of this excess include gamma rays created by the annihilation of dark matter particles and gamma rays originating from a collection of unresolved point sources, such as millisecond pulsars. We train and test convolutional neural networks with simulated Fermi-LAT images based on point and diffuse emission models of the Galactic Center tuned to measured gamma-ray data. Our new method allows precise measurements of the contribution and properties of an unresolved population of gamma-ray point sources in the interstellar diffuse emission model. The current model predicts the fraction of unresolved point sources with an error of up to 10%, and this is expected to decrease with future work.
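The convolutional networks used here are, at their core, stacks of learned convolution filters with nonlinearities and pooling applied to count maps. A pure-Python toy of those building blocks, with a tiny hand-written 2x2 filter in place of learned weights (the real models are far larger and are trained on simulated Fermi-LAT images):

```python
def conv2d(image, kernel):
    # valid (no-padding) 2D convolution of a nested-list image with a kernel
    kh, kw = len(kernel), len(kernel[0])
    return [[sum(image[i + a][j + b] * kernel[a][b]
                 for a in range(kh) for b in range(kw))
             for j in range(len(image[0]) - kw + 1)]
            for i in range(len(image) - kh + 1)]

def relu(feature_map):
    # elementwise nonlinearity
    return [[max(0.0, v) for v in row] for row in feature_map]

def global_avg(feature_map):
    # global average pooling: one summary number per feature map
    vals = [v for row in feature_map for v in row]
    return sum(vals) / len(vals)

# a tiny toy "count map" with a few bright pixels (mock point sources)
counts = [[1, 0, 2],
          [0, 3, 0],
          [4, 0, 5]]

# a hand-written filter; in a CNN these weights are learned from data
kernel = [[1, 0],
          [0, 1]]

feature = relu(conv2d(counts, kernel))
pooled = global_avg(feature)
```

A real network stacks many such filter banks and ends in a regression head that outputs, e.g., the point-source fraction of the emission.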
Otten, S., Caron, S., de Swart, W., van Beekveld, M., Hendriks, L., van Leeuwen, C., et al. (2021). Event generation and statistical sampling for physics with deep generative models and a density information buffer. Nat. Commun., 12(1), 2985–16pp.
Abstract: Simulating nature, and in particular processes in particle physics, requires expensive computations that can take much longer than scientists can afford. Here, we explore ways to address this problem by investigating recent advances in generative modeling and present a study of the generation of events from a physical process with deep generative models. The simulation of physical processes requires not only the production of physical events, but also ensuring that these events occur with the correct frequencies. We investigate the feasibility of learning the event generation and the frequency of occurrence with several generative machine learning models to produce events like Monte Carlo generators. We study three processes: a simple two-body decay, the process e(+)e(-) -> Z -> l(+)l(-), and pp -> t t-bar including the decay of the top quarks and a simulation of the detector response. By buffering the density information of encoded Monte Carlo events, obtained with the encoder of a Variational Autoencoder, we are able to construct a prior for the sampling of new events from the decoder that yields distributions in very good agreement with real Monte Carlo events, generated several orders of magnitude faster. Applications of this work include generic density estimation and sampling, targeted event generation via a principal component analysis of encoded ground truth data, anomaly detection, and more efficient importance sampling, e.g., for the phase-space integration of matrix elements in quantum field theories. Here, the authors report buffered-density variational autoencoders for the generation of physical events. This method is computationally less expensive than traditional methods and, beyond accelerating the data generation process, can help to steer the generation and to detect anomalies.
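The density-information buffer can be summarized as: encode the Monte Carlo training events, store their latent codes, and generate new events by decoding jittered draws from that buffer rather than from a unit-normal prior. A toy sketch in which an invertible affine map stands in for the trained encoder/decoder pair (in the paper these are learned neural networks, and events are high-dimensional 4-vectors rather than scalars):

```python
import random

random.seed(1)

# toy stand-ins for a trained VAE encoder/decoder: an affine map and its
# inverse; the real ones are neural networks learned from data
def encode(x):
    return (x - 10.0) / 2.0

def decode(z):
    return 2.0 * z + 10.0

# "Monte Carlo events": scalar samples from a toy target distribution
train = [random.gauss(10.0, 2.0) for _ in range(5000)]

# density information buffer: latent codes of the encoded training events
buffer = [encode(x) for x in train]

def generate(n, sigma=0.1):
    # draw a buffered code, jitter it slightly, decode; sampling from the
    # buffer keeps the generated density close to the training density
    return [decode(random.choice(buffer) + random.gauss(0.0, sigma))
            for _ in range(n)]

events = generate(2000)
mean = sum(events) / len(events)
```

Because the prior follows the buffered codes instead of an assumed unit normal, the decoded sample reproduces the training distribution, which is the key point of the buffered-density construction.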
van Beekveld, M., Caron, S., Hendriks, L., Jackson, P., Leinweber, A., Otten, S., et al. (2021). Combining outlier analysis algorithms to identify new physics at the LHC. J. High Energy Phys., 09(9), 024–33pp.
Abstract: The lack of evidence for new physics at the Large Hadron Collider so far has prompted the development of model-independent search techniques. In this study, we compare the anomaly scores of a variety of anomaly detection techniques: an isolation forest, a Gaussian mixture model, a static autoencoder, and a beta-variational autoencoder (VAE), where we define the reconstruction loss of the latter as a weighted combination of regression and classification terms. We apply these algorithms to the 4-vectors of simulated LHC data, but also investigate the performance when the non-VAE algorithms are applied to the latent space variables created by the VAE. In addition, we assess the performance when the anomaly scores of these algorithms are combined in various ways. Using supersymmetric benchmark points, we find that the logical AND combination of the anomaly scores yielded by algorithms trained in the latent space of the VAE is the most effective discriminator of all methods tested.
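The logical AND combination can be sketched as thresholding each algorithm's anomaly score at a fixed quantile and flagging only events that exceed every cut, which suppresses the uncorrelated false positives of the individual detectors. A toy with two invented score lists and a planted anomalous population (real scores would come from the trained isolation forest, mixture model, and autoencoders):

```python
import random

random.seed(2)

# toy anomaly scores from two detectors for 1000 events (higher = more
# anomalous); the last 10 events are planted "signal" that both detectors
# score highly, the rest is uniform background
scores_a = [random.random() for _ in range(990)] + [2.0 + random.random() for _ in range(10)]
scores_b = [random.random() for _ in range(990)] + [2.0 + random.random() for _ in range(10)]

def flag(scores, quantile=0.95):
    # mark the events above the given score quantile as outliers
    cut = sorted(scores)[round(quantile * len(scores))]
    return [s > cut for s in scores]

# logical AND of the per-algorithm outlier flags: an event counts as
# anomalous only if every algorithm agrees
combined = [a and b for a, b in zip(flag(scores_a), flag(scores_b))]
n_flagged = sum(combined)
```

Each detector alone flags about 5% of all events, but the AND keeps the planted signal while discarding most background flags, since uncorrelated background rarely passes both cuts.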