|
Pierre Auger Collaboration (Aab, A., et al.), & Pastor, S. (2014). A search for point sources of EeV photons. Astrophys. J., 789(2), 160 (12pp).
Abstract: Measurements of air showers made using the hybrid technique developed with the fluorescence and surface detectors of the Pierre Auger Observatory allow a sensitive search for point sources of EeV photons anywhere in the exposed sky. A multivariate analysis reduces the background of hadronic cosmic rays. The search is sensitive to a declination band from -85 degrees to +20 degrees, in an energy range from 10^17.3 eV to 10^18.5 eV. No photon point source has been detected. An upper limit on the photon flux has been derived for every direction. The resulting mean energy flux limit, assuming a photon spectral index of -2, is 0.06 eV cm^-2 s^-1, and no celestial direction exceeds 0.25 eV cm^-2 s^-1. These upper limits constrain scenarios in which EeV cosmic-ray protons are emitted by non-transient sources in the Galaxy.
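For a spectral index of -2, an energy-flux limit of this kind follows from a particle-flux limit by an analytic integral over the energy range. A minimal sketch of that standard conversion (the specific particle-flux value below is illustrative, not taken from the paper):

```python
import math

def energy_flux_from_particle_flux(f_particles, e1_eV, e2_eV):
    """Convert a particle-flux limit (photons cm^-2 s^-1) integrated over
    [e1, e2] into an energy flux (eV cm^-2 s^-1), assuming dN/dE = k E^-2.

    For dN/dE = k E^-2:
      particle flux = k (1/e1 - 1/e2)
      energy flux   = k ln(e2/e1)
    """
    k = f_particles / (1.0 / e1_eV - 1.0 / e2_eV)
    return k * math.log(e2_eV / e1_eV)

# The paper's energy range, with a made-up particle-flux limit:
e1, e2 = 10**17.3, 10**18.5
print(energy_flux_from_particle_flux(1e-17, e1, e2))
```

The conversion is linear in the particle flux, so the same factor applies to every direction's limit.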
|
|
|
Johannesson, G., Ruiz de Austri, R., Vincent, A. C., Moskalenko, I. V., Orlando, E., Porter, T. A., et al. (2016). Bayesian analysis of cosmic-ray propagation: evidence against homogeneous diffusion. Astrophys. J., 824(1), 16 (19pp).
Abstract: We present the results of the most complete scan of the parameter space for cosmic ray (CR) injection and propagation. We perform a Bayesian search of the main GALPROP parameters, using the MultiNest nested sampling algorithm, augmented by the BAMBI neural network machine-learning package. This is the first study to separate out low-mass isotopes (p, p̄, and He) from the usual light elements (Be, B, C, N, and O). We find that the propagation parameters that best fit the p, p̄, and He data are significantly different from those that fit the light elements, including the B/C and ¹⁰Be/⁹Be secondary-to-primary ratios normally used to calibrate propagation parameters. This suggests that each set of species is probing a very different interstellar medium, and that the standard approach of calibrating propagation parameters using B/C can lead to incorrect results. We present posterior distributions and best-fit parameters for propagation of both sets of nuclei, as well as for the injection abundances of elements from H to Si. The input GALDEF files with these new parameters will be included in an upcoming public GALPROP update.
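The nested sampling algorithm family that MultiNest implements can be illustrated with a deliberately simplified toy: one mean parameter, a Gaussian likelihood, and naive rejection sampling from a uniform prior (practical only in this low-dimensional sketch; nothing here reproduces the GALPROP fit):

```python
import math
import random

def log_like(mu, data, sigma=1.0):
    """Gaussian log-likelihood for a single mean parameter mu."""
    return -0.5 * sum((d - mu) ** 2 for d in data) / sigma ** 2

def logaddexp(a, b):
    if a == -math.inf:
        return b
    m = max(a, b)
    return m + math.log(math.exp(a - m) + math.exp(b - m))

def nested_sampling(data, n_live=100, n_iter=400, prior=(-10.0, 10.0), seed=0):
    """Toy 1-D nested sampler: estimates the log-evidence by repeatedly
    replacing the lowest-likelihood live point with a prior draw above
    the current likelihood threshold."""
    rng = random.Random(seed)
    lo, hi = prior
    live = [rng.uniform(lo, hi) for _ in range(n_live)]
    logl = [log_like(mu, data) for mu in live]
    log_z = -math.inf
    log_shrink = math.log(1.0 - math.exp(-1.0 / n_live))
    for i in range(n_iter):
        worst = min(range(n_live), key=lambda j: logl[j])
        log_lmin = logl[worst]
        # prior-volume weight w_i = X_{i-1} - X_i with X_i = exp(-i / n_live)
        log_w = -i / n_live + log_shrink
        log_z = logaddexp(log_z, log_lmin + log_w)
        # replace the worst live point by rejection sampling above the threshold
        while True:
            mu = rng.uniform(lo, hi)
            l = log_like(mu, data)
            if l > log_lmin:
                live[worst], logl[worst] = mu, l
                break
    return log_z, live

rng = random.Random(1)
data = [rng.gauss(2.0, 1.0) for _ in range(20)]
log_z, live = nested_sampling(data)
print(round(sum(live) / len(live), 2))  # live points cluster near the true mean of 2
```

Real samplers replace the rejection step with more efficient proposals (MultiNest uses ellipsoidal decomposition of the live points), which is what makes high-dimensional scans like the one in this paper feasible.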
|
|
|
Villanueva-Domingo, P., Gnedin, N. Y., & Mena, O. (2018). Warm Dark Matter and Cosmic Reionization. Astrophys. J., 852(2), 139 (7pp).
Abstract: In models with dark matter made of particles with keV masses, such as a sterile neutrino, small-scale density perturbations are suppressed, delaying the period at which the lowest mass galaxies are formed and therefore shifting reionization to later epochs. In this study, focusing on Warm Dark Matter (WDM) with masses close to its present lower bound, i.e., around 3 keV, we derive constraints from galaxy luminosity functions, the ionization history, and the Gunn-Peterson effect. We show that even if the star formation efficiency in the simulations is adjusted to match the observed UV galaxy luminosity functions in both CDM and WDM models, the full distribution of Gunn-Peterson optical depth retains the strong signature of delayed reionization in the WDM model. However, until the star formation and stellar feedback model used in modern galaxy formation simulations is constrained better, any conclusions on the nature of dark matter derived from reionization observables remain model-dependent.
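The suppression of small-scale power in WDM models is commonly encoded by a fitting formula for the transfer function relative to CDM. A sketch using a Viel et al. (2005)-style parameterization (coefficients quoted from memory; treat them as illustrative, not as the paper's inputs):

```python
def wdm_transfer(k, m_wdm_keV, omega_wdm=0.25, h=0.7):
    """WDM-to-CDM transfer function fit, T(k) = [1 + (alpha k)^(2 mu)]^(-5/mu),
    with mu = 1.12. k is in h/Mpc; alpha comes out in Mpc/h. The coefficients
    follow the commonly used thermal-relic fitting formula."""
    mu = 1.12
    alpha = (0.049 * m_wdm_keV ** -1.11
             * (omega_wdm / 0.25) ** 0.11
             * (h / 0.7) ** 1.22)  # Mpc/h
    return (1.0 + (alpha * k) ** (2.0 * mu)) ** (-5.0 / mu)

# Power suppression T(k)^2 for a 3 keV thermal relic, near the bound
# discussed in the abstract:
for k in (1.0, 10.0, 50.0):
    print(k, round(wdm_transfer(k, 3.0) ** 2, 4))
```

The suppression is negligible on large scales and severe at high k, which is why the observable effect is a delay in the formation of the lowest-mass galaxies.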
|
|
|
Ortega, P. G., Torres-Espallardo, I., Cerutti, F., Ferrari, A., Gillam, J. E., Lacasta, C., et al. (2015). Noise evaluation of Compton camera imaging for proton therapy. Phys. Med. Biol., 60(5), 1845–1863.
Abstract: Compton cameras have emerged as an alternative for real-time dose monitoring in Particle Therapy (PT), based on the detection of prompt gammas. As a consequence of the Compton scattering process, the gamma origin point can be restricted to the surface of a cone (the Compton cone). Through image reconstruction techniques, the distribution of the gamma emitters can be estimated, using cone-surface backprojections of the Compton cones through the image space, along with more sophisticated statistical methods to improve the image quality. To calculate the Compton cone required for image reconstruction, either two interactions, the last being photoelectric absorption, or three scatter interactions are needed. Because of the high energy of the photons in PT, the first option might not be adequate, as the photon is in general not fully absorbed; the second option, however, is less efficient. This motivates spectral reconstruction, in which the incoming energy is treated as a variable in the reconstruction inverse problem. Along with prompt gammas, secondary neutrons and scattered photons, which are not strongly correlated with the dose map, can also reach the imaging detector and produce false events. These events deteriorate the image quality. In addition, high-intensity beams can produce particle accumulation in the camera, which leads to an increase in random coincidences, i.e., events that combine measurements from different incoming particles. The noise scenario is expected to differ depending on whether double or triple events are used, and consequently the reconstructed images can be affected differently by spurious data. The aim of the present work is to study the effect of false events on the reconstructed image, evaluating their impact on the determination of the beam particle ranges. A simulation study that includes misidentified events (neutrons and random coincidences) in the final image of a Compton telescope for PT monitoring is presented.
The complete chain of detection, from the beam particle entering a phantom to the event classification, is simulated using FLUKA. The range is then estimated from the reconstructed image obtained with two- and three-event algorithms based on Maximum Likelihood Expectation Maximization (MLEM). The neutron background and the random coincidences due to a therapeutic-like time structure are analyzed for mono-energetic proton beams. The time structure of the beam, which affects the rate of particles entering the detector, is included in the simulations.
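For a two-interaction event in which the second interaction is a full photoelectric absorption, the Compton cone's opening angle follows from standard Compton kinematics. A minimal sketch (the event energies below are illustrative; real PT events often lack full absorption, which is what motivates the spectral reconstruction discussed above):

```python
import math

M_E = 0.511  # electron rest energy, MeV

def compton_cone_angle(e_dep_scatter, e_dep_absorb):
    """Opening half-angle (radians) of the Compton cone for a scatter followed
    by full absorption, energies in MeV:
    cos(theta) = 1 - m_e c^2 * (1/E' - 1/E), with E = E1 + E2 and E' = E2."""
    e_in = e_dep_scatter + e_dep_absorb
    e_out = e_dep_absorb
    cos_theta = 1.0 - M_E * (1.0 / e_out - 1.0 / e_in)
    if not -1.0 <= cos_theta <= 1.0:
        raise ValueError("energies inconsistent with Compton kinematics")
    return math.acos(cos_theta)

# A 4 MeV prompt gamma depositing 1 MeV in the scatterer:
theta = compton_cone_angle(1.0, 3.0)
print(round(math.degrees(theta), 1))
```

The cone axis is the line joining the two interaction points; the reconstruction then backprojects the cone surface into the image space.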
|
|
|
Kasieczka, G., et al., & Sanz, V. (2021). The LHC Olympics 2020: a community challenge for anomaly detection in high energy physics. Rep. Prog. Phys., 84(12), 124201 (64pp).
Abstract: A new paradigm for data-driven, model-agnostic new-physics searches at colliders is emerging that aims to leverage recent breakthroughs in anomaly detection and machine learning. To develop and benchmark new anomaly detection methods within this framework, it is essential to have standard datasets. To this end, we have created the LHC Olympics 2020, a community challenge accompanied by a set of simulated collider events. Participants in these Olympics developed their methods using an R&D dataset and then tested them on black boxes: datasets that may or may not contain an unknown anomaly. The methods made use of modern machine learning tools and were based on unsupervised learning (autoencoders, generative adversarial networks, normalizing flows), weakly supervised learning, and semi-supervised learning. This paper reviews the LHC Olympics 2020 challenge, including an overview of the competition, a description of the methods deployed, lessons learned from the experience, and implications for data analyses with future datasets as well as future colliders.
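The reconstruction-error anomaly score underlying autoencoder-based searches can be illustrated with a linear toy model, where PCA gives the optimal autoencoder in closed form (synthetic data; this is not the challenge datasets or any participant's method):

```python
import numpy as np

rng = np.random.default_rng(0)

# "Background" events live near a 2-D plane inside a 10-D feature space;
# "signal" events do not. The features stand in for jet observables.
n_bkg, n_dim, n_latent = 2000, 10, 2
basis = rng.normal(size=(n_latent, n_dim))
background = (rng.normal(size=(n_bkg, n_latent)) @ basis
              + 0.05 * rng.normal(size=(n_bkg, n_dim)))
signal = rng.normal(size=(50, n_dim))  # off the background manifold

# A linear autoencoder with MSE loss learns the top principal components,
# so PCA via SVD gives its optimal encoder/decoder in closed form.
mean = background.mean(axis=0)
_, _, vt = np.linalg.svd(background - mean, full_matrices=False)
decoder = vt[:n_latent]  # (n_latent, n_dim)

def anomaly_score(x):
    """Reconstruction error: large when x is far from the learned manifold."""
    z = (x - mean) @ decoder.T   # encode
    x_hat = z @ decoder + mean   # decode
    return np.mean((x - x_hat) ** 2, axis=-1)

print(anomaly_score(background).mean() < anomaly_score(signal).mean())  # True
```

Because the score is trained only on (mostly background) data, no signal model is assumed, which is the model-agnostic property the challenge was designed to stress-test; the nonlinear autoencoders used by participants generalize this idea to curved manifolds.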
|
|