Panes, B., Eckner, C., Hendriks, L., Caron, S., Dijkstra, K., Johannesson, G., et al. (2021). Identification of point sources in gamma rays using U-shaped convolutional neural networks and a data challenge. Astron. Astrophys., 656, A62–18pp.
Abstract: Context. At GeV energies, the sky is dominated by the interstellar emission from the Galaxy. With limited statistics and spatial resolution, accurately separating point sources is therefore challenging. Aims. Here we present the first application of deep-learning-based algorithms to automatically detect and classify point sources from gamma-ray data. For concreteness we refer to this approach as AutoSourceID. Methods. To detect point sources, we utilized U-shaped convolutional networks for image segmentation and k-means for source clustering and localization. We also explored the Centroid-Net algorithm, which is designed to find and count objects. Using two algorithms allows for a cross-check of the results, while a combination of their results can be used to improve performance. The training data are based on 9.5 years of exposure from the Fermi Large Area Telescope (Fermi-LAT), and we used source properties of active galactic nuclei (AGNs) and pulsars (PSRs) from the fourth Fermi-LAT source catalog in addition to several models of background interstellar emission. The results of the localization algorithm are fed into a classification neural network that is trained to separate the three general source classes (AGNs, PSRs, and FAKE sources). Results. We compared our localization algorithms qualitatively with traditional methods and find them to have similar detection thresholds. We also demonstrate the robustness of our source localization algorithms to modifications in the interstellar emission models, which presents a clear advantage over traditional methods. The classification network is able to discriminate between the three classes with a typical accuracy of ∼70%, as long as balanced data sets are used in classification training. We published our training data sets and analysis scripts online and invite the community to join the data challenge aimed at improving the localization and classification of gamma-ray point sources.
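The localization stage described in this abstract (segmentation followed by k-means clustering of the segmented pixels) can be sketched in a few lines. This is an illustrative pure-Python sketch, not the authors' published pipeline; the function names and the choice of squared-Euclidean distance are assumptions.

```python
import random

def kmeans(points, k, n_iter=50, seed=0):
    """Minimal k-means for 2-D points given as (y, x) tuples."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)  # initialize from distinct data points
    for _ in range(n_iter):
        clusters = [[] for _ in range(k)]
        for p in points:  # assign each point to its nearest centroid
            nearest = min(range(k), key=lambda j: (p[0] - centroids[j][0]) ** 2 + (p[1] - centroids[j][1]) ** 2)
            clusters[nearest].append(p)
        for j, members in enumerate(clusters):  # move centroids to cluster means
            if members:
                centroids[j] = (sum(p[0] for p in members) / len(members),
                                sum(p[1] for p in members) / len(members))
    return centroids

def localize_sources(mask, k):
    """Cluster the 'on' pixels of a binary segmentation mask (e.g. a
    thresholded U-Net output) into k groups; the cluster centroids serve
    as estimated point-source positions."""
    pixels = [(y, x) for y, row in enumerate(mask) for x, v in enumerate(row) if v]
    return kmeans(pixels, k)
```

In the paper's setting the mask would come from the U-shaped network applied to a Fermi-LAT count map, and the number of clusters would itself have to be estimated rather than given.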
Achterberg, A., Amoroso, S., Caron, S., Hendriks, L., Ruiz de Austri, R., & Weniger, C. (2015). A description of the Galactic Center excess in the Minimal Supersymmetric Standard Model. J. Cosmol. Astropart. Phys., 08(8), 006–27pp.
Abstract: Observations with the Fermi Large Area Telescope (LAT) indicate an excess in gamma rays originating from the center of our Galaxy. A possible explanation for this excess is the annihilation of Dark Matter particles. We have investigated the annihilation of neutralinos as Dark Matter candidates within the phenomenological Minimal Supersymmetric Standard Model (pMSSM). An iterative particle filter approach was used to search for solutions within the pMSSM. We found solutions that are consistent with astroparticle physics and collider experiments, and provide a fit to the energy spectrum of the excess. The neutralino is a Bino/Higgsino or Bino/Wino/Higgsino mixture with a mass in the range 84-92 GeV or 87-97 GeV annihilating into W bosons. A third solution is found for a neutralino of mass 174-187 GeV annihilating into top quarks. The best solutions yield a Dark Matter relic density 0.06 < Ωh² < 0.13. These pMSSM solutions make clear forecasts for the LHC and for direct and indirect DM detection experiments. If the pMSSM explanation of the excess seen by Fermi-LAT is correct, a DM signal might be discovered soon.
Achterberg, A., van Beekveld, M., Caron, S., Gomez-Vargas, G. A., Hendriks, L., & Ruiz de Austri, R. (2017). Implications of the Fermi-LAT Pass 8 Galactic Center excess on supersymmetric dark matter. J. Cosmol. Astropart. Phys., 12(12), 040–23pp.
Abstract: The Fermi Collaboration has recently updated their analysis of gamma rays from the center of the Galaxy. They reconfirm the presence of an unexplained emission feature which is most prominent in the region of 1-10 GeV, known as the Galactic Center GeV excess (GCE). Although the GCE is now firmly detected, an interpretation of this emission as a signal of self-annihilating dark matter (DM) particles is not unambiguously possible due to systematic effects in the gamma-ray modeling estimated in the Galactic plane. In this paper we build a covariance matrix, collecting the different systematic uncertainties investigated in the Fermi Collaboration's paper that affect the GCE spectrum. We show that models where part of the GCE is due to annihilating DM are still consistent with the new data. We also re-evaluate the parameter space regions of the minimal supersymmetric Standard Model (MSSM) that can contribute dominantly to the GCE via neutralino DM annihilation. All recent constraints from DM direct detection experiments such as PICO, LUX, PandaX and Xenon1T, limits on the annihilation cross section from dwarf spheroidal galaxies, and the Large Hadron Collider limits are considered in this analysis. Due to a slight shift in the energy spectrum of the GC excess with respect to the previous Fermi analysis, and the recent limits from direct detection experiments, we find a slightly shifted parameter region of the MSSM, compared to our previous analysis, that is consistent with the GCE. Neutralinos with a mass between 85 and 220 GeV can describe the excess via annihilation into a pair of W bosons or top quarks. Remarkably, there are models with low fine-tuning among the regions that we have found. The complete set of solutions will be probed by upcoming direct detection experiments and with dedicated searches in the upcoming data of the Large Hadron Collider.
Caron, S., Gomez-Vargas, G. A., Hendriks, L., & Ruiz de Austri, R. (2018). Analyzing gamma rays of the Galactic Center with deep learning. J. Cosmol. Astropart. Phys., 05(5), 058–24pp.
Abstract: We present the application of convolutional neural networks to a particular problem in gamma-ray astronomy. Explicitly, we use this method to investigate the origin of an excess emission of GeV gamma rays in the direction of the Galactic Center, reported by several groups analyzing Fermi-LAT data. Interpretations of this excess include gamma rays created by the annihilation of dark matter particles and gamma rays originating from a collection of unresolved point sources, such as millisecond pulsars. We train and test convolutional neural networks with simulated Fermi-LAT images based on point and diffuse emission models of the Galactic Center tuned to measured gamma-ray data. Our new method allows precise measurements of the contribution and properties of an unresolved population of gamma-ray point sources in the interstellar diffuse emission model. The current model predicts the fraction of unresolved point sources with an error of up to 10%, and this is expected to decrease with future work.
Keywords: gamma ray experiments; dark matter simulations
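The regression target in this work is the fraction of emission contributed by unresolved point sources in a simulated sky map. How such a label could be built from the simulation components is sketched below; the map representation and function name are illustrative assumptions, not the paper's code.

```python
def point_source_fraction(point_source_map, diffuse_map):
    """Regression label for a simulated image: the fraction of the total
    flux that comes from the (unresolved) point-source component."""
    ps_flux = sum(sum(row) for row in point_source_map)
    total_flux = ps_flux + sum(sum(row) for row in diffuse_map)
    return ps_flux / total_flux
```

A network trained on (image, fraction) pairs built this way can then be applied to the measured map to estimate the point-source contribution.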
Caron, S., Eckner, C., Hendriks, L., Johannesson, G., Ruiz de Austri, R., & Zaharijas, G. (2023). Mind the gap: the discrepancy between simulation and reality drives interpretations of the Galactic Center Excess. J. Cosmol. Astropart. Phys., 06(6), 013–56pp.
Abstract: The Galactic Center Excess (GCE) in GeV gamma rays has been debated for over a decade, with the possibility that it might be due to dark matter annihilation or undetected point sources such as millisecond pulsars (MSPs). This study investigates how the gamma-ray emission model (γEM) used in Galactic center analyses affects the interpretation of the GCE's nature. To address this issue, we construct an ultra-fast and powerful inference pipeline based on convolutional Deep Ensemble Networks. We explore the two main competing hypotheses for the GCE using a set of γEMs with increasing parametric freedom. We calculate the fractional contribution (fsrc) of a dim population of MSPs to the total luminosity of the GCE and analyze its dependence on the complexity of the γEM. For the simplest γEM, we obtain fsrc = 0.10 ± 0.07, while the most complex model yields fsrc = 0.79 ± 0.24. In conclusion, we find that the statement about the nature of the GCE (dark matter or not) strongly depends on the assumed γEM. The quoted results for fsrc do not account for the additional uncertainty arising from the fact that the observed gamma-ray sky is out-of-distribution with respect to the investigated γEM iterations. We quantify the reality gap between our γEMs and the data using deep-learning-based One-Class Deep Support Vector Data Description networks, revealing that all employed γEMs have gaps to reality. Our study casts doubt on the validity of previous conclusions regarding the GCE and dark matter, and underscores the urgent need to account for the reality gap and consider previously overlooked "out of domain" uncertainties in future interpretations.
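A deep ensemble turns the predictions of several independently trained networks into a central value with an uncertainty band, which is the form the quoted results (e.g. 0.10 ± 0.07) take. A minimal sketch of that aggregation step, with stand-in member models rather than the paper's trained convolutional networks:

```python
import statistics

def ensemble_predict(members, x):
    """Run every trained ensemble member on the same input and report the
    mean prediction together with the member-to-member spread, which serves
    as an uncertainty estimate."""
    predictions = [member(x) for member in members]
    return statistics.mean(predictions), statistics.stdev(predictions)
```

In practice each member would be a convolutional network mapping a gamma-ray sky map to fsrc; a large spread between members signals that the inference is poorly constrained.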
van Beekveld, M., Caron, S., Hendriks, L., Jackson, P., Leinweber, A., Otten, S., et al. (2021). Combining outlier analysis algorithms to identify new physics at the LHC. J. High Energy Phys., 09(9), 024–33pp.
Abstract: The lack of evidence for new physics at the Large Hadron Collider so far has prompted the development of model-independent search techniques. In this study, we compare the anomaly scores of a variety of anomaly detection techniques: an isolation forest, a Gaussian mixture model, a static autoencoder, and a beta-variational autoencoder (VAE), where we define the reconstruction loss of the latter as a weighted combination of regression and classification terms. We apply these algorithms to the 4-vectors of simulated LHC data, but also investigate the performance when the non-VAE algorithms are applied to the latent space variables created by the VAE. In addition, we assess the performance when the anomaly scores of these algorithms are combined in various ways. Using supersymmetric benchmark points, we find that the logical AND combination of the anomaly scores yielded from algorithms trained in the latent space of the VAE is the most effective discriminator of all methods tested.
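The logical-AND combination of anomaly scores can be sketched as thresholding each algorithm's score on a background sample and flagging only events that every algorithm deems anomalous. The quantile-based threshold and the function names below are illustrative assumptions, not the paper's exact procedure.

```python
def fit_thresholds(background_scores, quantile=0.99):
    """Per-algorithm threshold: the given quantile of that algorithm's
    anomaly scores on a background (Standard Model) sample."""
    cuts = []
    for scores in background_scores:
        ordered = sorted(scores)
        cuts.append(ordered[int(quantile * (len(ordered) - 1))])
    return cuts

def and_combine(test_scores, cuts):
    """Logical AND: an event is flagged only if *every* algorithm's score
    exceeds its own threshold."""
    n_events = len(test_scores[0])
    return [all(scores[i] > cut for scores, cut in zip(test_scores, cuts))
            for i in range(n_events)]
```

Requiring unanimity trades some signal efficiency for a much lower background rate, which is why the AND combination can out-discriminate any single algorithm.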
Otten, S., Caron, S., de Swart, W., van Beekveld, M., Hendriks, L., van Leeuwen, C., et al. (2021). Event generation and statistical sampling for physics with deep generative models and a density information buffer. Nat. Commun., 12(1), 2985–16pp.
Abstract: Simulating nature, and in particular processes in particle physics, requires expensive computations that sometimes would take much longer than scientists can afford. Here, we explore ways to address this problem by investigating recent advances in generative modeling and present a study of the generation of events from a physical process with deep generative models. The simulation of physical processes requires not only the production of physical events, but also that these events occur with the correct frequencies. We investigate the feasibility of learning the event generation and the frequency of occurrence with several generative machine learning models to produce events like Monte Carlo generators do. We study three processes: a simple two-body decay, the processes e+e- -> Z -> l+l- and pp -> ttbar including the decay of the top quarks, and a simulation of the detector response. By buffering the density information of Monte Carlo events encoded by the encoder of a variational autoencoder, we are able to construct a prior for the sampling of new events from the decoder that yields distributions in very good agreement with real Monte Carlo events, generated several orders of magnitude faster. Applications of this work include generic density estimation and sampling, targeted event generation via a principal component analysis of encoded ground-truth data, anomaly detection, and more efficient importance sampling, e.g., for the phase space integration of matrix elements in quantum field theories. Here, the authors report buffered-density variational autoencoders for the generation of physical events. This method is computationally less expensive than traditional methods and, beyond accelerating data generation, can help steer the generation and detect anomalies.
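The density-buffer idea can be sketched independently of any particular VAE: store the encoder's per-event posterior parameters for the training set, then draw latent codes from that buffered mixture instead of the standard normal prior before decoding. Everything below (function names, the diagonal-Gaussian posterior form) is an illustrative assumption, not the authors' implementation.

```python
import random

def build_latent_buffer(encoded_means, encoded_stds):
    """Buffer the encoder outputs (mu, sigma) for every training event."""
    return list(zip(encoded_means, encoded_stds))

def sample_from_buffer(buffer, n, seed=0):
    """Draw latent codes from the buffered density rather than from N(0, 1):
    pick a stored event at random and sample from its encoder posterior."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        mu, sigma = rng.choice(buffer)
        samples.append([rng.gauss(m, s) for m, s in zip(mu, sigma)])
    return samples
```

The decoder would then map these latent samples back to physical events; because the buffered mixture tracks where the training events actually live in latent space, the decoded events follow the Monte Carlo distributions more faithfully than samples from the generic prior.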