Ortega, P. G., Torres-Espallardo, I., Cerutti, F., Ferrari, A., Gillam, J. E., Lacasta, C., et al. (2015). Noise evaluation of Compton camera imaging for proton therapy. Phys. Med. Biol., 60(5), 1845–1863.
Abstract: Compton cameras have emerged as an alternative for real-time dose monitoring in Particle Therapy (PT), based on the detection of prompt gammas. As a consequence of the Compton scattering process, the gamma origin point can be restricted to the surface of a cone (Compton cone). Through image reconstruction techniques, the distribution of the gamma emitters can be estimated using cone-surface backprojections of the Compton cones through the image space, along with more sophisticated statistical methods to improve the image quality. To calculate the Compton cone required for image reconstruction, either two interactions, the last being photoelectric absorption, or three scatter interactions are needed. Because of the high energy of the photons in PT, the first option might not be adequate, as the photon is in general not absorbed; however, the second option is less efficient. That is the reason to resort to spectral reconstructions, where the incoming energy is treated as a variable in the reconstruction inverse problem. Together with the prompt gammas, secondary neutrons and scattered photons, which are not strongly correlated with the dose map, can also reach the imaging detector and produce false events. These events deteriorate the image quality. In addition, high-intensity beams can produce particle accumulation in the camera, which leads to an increase of random coincidences, i.e. events that combine measurements from different incoming particles. The noise scenario is expected to differ depending on whether double or triple events are used, and consequently the reconstructed images can be affected differently by spurious data. The aim of the present work is to study the effect of false events on the reconstructed image, evaluating their impact on the determination of the beam particle range. A simulation study that includes misidentified events (neutrons and random coincidences) in the final image of a Compton telescope for PT monitoring is presented. The complete chain of detection, from the beam particle entering a phantom to the event classification, is simulated using FLUKA. The range is then estimated from the reconstructed image obtained with a two- and three-event algorithm based on Maximum Likelihood Expectation Maximization. The neutron background and the random coincidences due to a therapeutic-like time structure are analyzed for mono-energetic proton beams. The time structure of the beam, which affects the rate of particles entering the detector, is included in the simulations.
Keywords: proton therapy; Compton camera; Monte Carlo methods; FLUKA; prompt gamma; range verification; MLEM
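For context, the abstract's Maximum Likelihood Expectation Maximization step can be sketched as a generic list-mode MLEM update. The NumPy code below is an illustrative, minimal implementation under the assumption that a system matrix (e.g. from a Compton-cone model) is already available; it is not the reconstruction code used in the paper.

```python
# Generic list-mode MLEM sketch: t[i, j] is the (assumed given) probability
# that a photon emitted in voxel j produces measured event i.
import numpy as np

def mlem(t, n_iter=50):
    """t: (n_events, n_voxels) system matrix; returns voxel intensities."""
    sensitivity = t.sum(axis=0)                 # s_j = sum_i t_ij
    lam = np.ones(t.shape[1])                   # uniform initial image
    for _ in range(n_iter):
        expected = t @ lam                      # expected counts per event
        ratio = 1.0 / np.clip(expected, 1e-12, None)
        lam *= (t.T @ ratio) / np.clip(sensitivity, 1e-12, None)
    return lam
```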
|
ANTARES Collaboration (Bhandari, S. et al.), Barrios-Marti, J., Coleiro, A., Hernandez-Rey, J. J., Illuminati, G., Tönnis, C., et al. (2018). The SUrvey for Pulsars and Extragalactic Radio Bursts – II. New FRB discoveries and their follow-up. Mon. Not. Roy. Astron. Soc., 475(2), 1427–1446.
Abstract: We report the discovery of four Fast Radio Bursts (FRBs) in the ongoing SUrvey for Pulsars and Extragalactic Radio Bursts at the Parkes Radio Telescope: FRBs 150610, 151206, 151230 and 160102. Our real-time discoveries have enabled us to conduct extensive, rapid multimessenger follow-up at 12 major facilities sensitive to radio, optical, X-ray and gamma-ray photons and to neutrinos, on time-scales ranging from an hour to a few months post-burst. No counterparts to the FRBs were found and we provide upper limits on afterglow luminosities. None of the FRBs were seen to repeat. Formal fits to all FRBs show hints of scattering, while their intrinsic widths are unresolved in time. FRB 151206 is at low Galactic latitude, FRB 151230 shows a sharp spectral cut-off, and FRB 160102 has the highest dispersion measure (DM = 2596.1 +/- 0.3 pc cm^-3) detected to date. Three of the FRBs have high dispersion measures (DM > 1500 pc cm^-3), favouring a scenario where the DM is dominated by contributions from the intergalactic medium. The slope of the Parkes FRB source-counts distribution for fluences > 2 Jy ms is alpha = -2.2 (+0.6, -1.2), still consistent with a Euclidean distribution (alpha = -3/2). We also find an all-sky rate of 1.7 (+1.5, -0.9) x 10^3 FRBs/(4 pi sr)/day above ~2 Jy ms, and there is currently no strong evidence for a latitude-dependent FRB sky rate.
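The Euclidean benchmark slope alpha = -3/2 quoted above follows from a standard source-count argument (a textbook result, not a derivation taken from the paper):

```latex
% Sources of fixed energy release E, uniformly distributed in Euclidean
% space with density n, are detectable above fluence F out to
% d_max = (E / 4\pi F)^{1/2}, so the cumulative count scales as
\begin{align}
  N(>F) \;=\; \tfrac{4}{3}\pi\, n\, d_{\max}^{3} \;\propto\; F^{-3/2},
\end{align}
% i.e. the Euclidean slope \alpha = -3/2 against which the measured
% \alpha = -2.2\,(+0.6,-1.2) is compared.
```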
|
Gammaldi, V., Zaldivar, B., Sanchez-Conde, M. A., & Coronado-Blazquez, J. (2023). A search for dark matter among Fermi-LAT unidentified sources with systematic features in machine learning. Mon. Not. Roy. Astron. Soc., 520(1), 1348–1361.
Abstract: Around one-third of the point-like sources in the Fermi-LAT catalogues remain as unidentified sources (unIDs) today. Indeed, these unIDs lack a clear, univocal association with a known astrophysical source. If dark matter (DM) is composed of weakly interacting massive particles (WIMPs), there is the exciting possibility that some of these unIDs may actually be DM sources, emitting gamma-rays from WIMP annihilation. We propose a new approach to solve the standard machine learning (ML) binary classification problem of disentangling prospective DM sources (simulated data) from astrophysical sources (observed data) among the unIDs of the 4FGL Fermi-LAT catalogue. We artificially build two systematic features for the DM data which are originally inherent to observed data: the detection significance and the uncertainty on the spectral curvature. We do so by sampling from the observed population of unIDs, assuming that the DM distributions would, if any, follow the latter. We consider different ML models: Logistic Regression, Neural Network (NN), Naive Bayes and Gaussian Process, out of which the best, in terms of classification accuracy, is the NN, achieving around 93.3 +/- 0.7 per cent performance. Other ML evaluation parameters, such as the True Negative and True Positive rates, are discussed in our work. Applying the NN to the unIDs sample, we find that the degeneracy between some astrophysical and DM sources can be partially broken within this methodology. Nonetheless, we conclude that there are no DM source candidates among the pool of 4FGL Fermi-LAT unIDs.
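As a rough illustration of the binary classification setup described in the abstract, the sketch below trains a small neural network on two features (detection significance and spectral-curvature uncertainty). The toy data, feature distributions and network size are assumptions made purely for illustration, not the authors' pipeline.

```python
# Minimal DM-vs-astrophysical classification sketch with scikit-learn.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
# Toy stand-ins for the two classes: columns are (significance, curvature error).
X_astro = rng.normal(loc=[20.0, 0.10], scale=[10.0, 0.05], size=(500, 2))
X_dm    = rng.normal(loc=[15.0, 0.15], scale=[8.0, 0.07], size=(500, 2))
X = np.vstack([X_astro, X_dm])
y = np.concatenate([np.zeros(500), np.ones(500)])   # 1 = DM-like

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(32, 16),
                                  max_iter=2000, random_state=0))
clf.fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))
```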
|
Norena, J., Verde, L., Jimenez, R., Pena-Garay, C., & Gomez, C. (2012). Cancelling out systematic uncertainties. Mon. Not. Roy. Astron. Soc., 419(2), 1040–1050.
Abstract: We present a method to minimize, or even cancel out, the effect of nuisance parameters on a measurement. Our approach is general and can be applied to any experiment or observation where systematic errors are a concern, e.g. are larger than the statistical errors. We compare it with marginalization, the standard Bayesian technique for dealing with nuisance parameters, and show how our method improves on it by avoiding biases. We illustrate the method with several examples taken from astrophysics and cosmology: baryonic acoustic oscillations (BAOs), cosmic clocks, Type Ia supernova (SNIa) luminosity distances, neutrino oscillations and dark matter detection. By applying the method we not only recover some known results but also find some interesting new ones. For BAO experiments we show how to combine radial and angular BAO measurements in order to completely eliminate the dependence on the sound horizon at radiation drag. In the case of exploiting SNIa as standard candles, we show how the uncertainty introduced in the luminosity distance by a second parameter, modelled as a metallicity dependence, can be eliminated or greatly reduced. When using cosmic clocks to measure the expansion rate of the Universe, we demonstrate how a particular combination of observables nearly removes the dependence of the differential-age determination on the galaxy metallicity, thus removing the age-metallicity degeneracy in stellar populations. We hope that these findings will be useful in future surveys to obtain robust constraints on the dark energy equation of state.
Keywords: methods: statistical; cosmology: theory
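The BAO example in the abstract can be made explicit with the standard relations for the radial and transverse BAO scales (these equations are the usual definitions, not an extract from the paper):

```latex
% Both observed BAO scales carry the sound horizon at radiation drag r_d:
\begin{align}
  \Delta z_{\mathrm{BAO}} = \frac{r_d\, H(z)}{c},
  \qquad
  \theta_{\mathrm{BAO}} = \frac{r_d}{(1+z)\, D_A(z)},
\end{align}
% so the combination
\begin{align}
  \frac{\Delta z_{\mathrm{BAO}}}{\theta_{\mathrm{BAO}}}
    = \frac{(1+z)\, D_A(z)\, H(z)}{c}
\end{align}
% no longer depends on r_d: the nuisance parameter cancels in the ratio.
```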
|
Double Chooz Collaboration (Abrahao, T. et al.), & Novella, P. (2018). Novel event classification based on spectral analysis of scintillation waveforms in Double Chooz. J. Instrum., 13, P01031–26pp.
Abstract: Liquid scintillators are a common choice for neutrino physics experiments, but their capability to perform background rejection by scintillation pulse-shape discrimination is generally limited in large detectors. This paper describes a novel approach to pulse-shape-based event classification developed in the context of the Double Chooz reactor antineutrino experiment. Unlike previous implementations, this method uses the Fourier power spectra of the scintillation pulse shapes to obtain event-wise information. A classification variable built from spectral information was able to achieve unprecedented performance, despite the lack of optimization at the detector design level. Several examples of event classification are provided, ranging from differentiation between the detector volumes and efficient rejection of instrumental light noise, to some sensitivity to the particle type, such as stopping muons, ortho-positronium formation, and alpha particles as well as electrons and positrons. In combination with other techniques, the method is expected to allow for a versatile and more efficient background rejection in the future, especially if detector optimization is taken into account at the design level.
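A minimal sketch of the kind of spectral pulse-shape variable the abstract describes: compute the Fourier power spectrum of a digitized waveform and take the fraction of power above some frequency. The sampling period and band split below are arbitrary illustrative choices, not the Double Chooz definition.

```python
# Illustrative spectral pulse-shape variable from a digitized waveform.
import numpy as np

def spectral_variable(waveform, sampling_ns=2.0, split_mhz=50.0):
    """Fraction of Fourier power above `split_mhz` for one waveform."""
    power = np.abs(np.fft.rfft(waveform)) ** 2
    freqs_mhz = np.fft.rfftfreq(len(waveform), d=sampling_ns * 1e-9) / 1e6
    high = power[freqs_mhz >= split_mhz].sum()
    return high / power.sum()
```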
|
Yepes, H. (2012). The ANTARES neutrino detector instrumentation. J. Instrum., 7, C01022–9pp.
Abstract: ANTARES is currently the largest fully operational neutrino telescope in the Northern hemisphere. Located in the Mediterranean Sea, it consists of a 3D array of 885 photomultiplier tubes (PMTs) arranged in 12 detection lines (25 storeys each), able to detect the Cherenkov light induced by upgoing relativistic muons produced in the interactions of high-energy cosmic neutrinos with the detector surroundings. Among its physics goals, the search for astrophysical neutrino sources and the indirect detection of dark matter particles coming from the Sun are of particular interest. To reach these goals, good accuracy in track reconstruction is mandatory, so several calibration systems for timing and positioning have been developed. In this contribution we present the design of the detector, the calibration systems and associated equipment, and their impact on track reconstruction performance.
|
ATLAS Tile Calorimeter Community (Abdallah, J. et al.), Castillo Gimenez, V., Costelo, J., Ferrer, A., Fullana, E., Gonzalez, V., et al. (2013). The optical instrumentation of the ATLAS Tile Calorimeter. J. Instrum., 8, P01005–21pp.
Abstract: The Tile Calorimeter, covering the central region of the ATLAS experiment up to pseudorapidities of +/-1.7, is a sampling device built with scintillating tiles alternating with iron plates. The light is collected in wavelength-shifting (WLS) fibers and read out with photomultipliers. In the characteristic geometry of this calorimeter the tiles lie in planes perpendicular to the beams, resulting in a very simple and modular mechanical and optical layout. This paper focuses on the procedures applied in the optical instrumentation of the calorimeter, which involved the assembly of about 460,000 scintillator tiles and 550,000 WLS fibers. The outcome is a hadronic calorimeter that meets the ATLAS performance requirements, as shown in this paper.
|
NEXT Collaboration (Renner, J. et al.), Benlloch-Rodriguez, J., Botas, A., Ferrario, P., Gomez-Cadenas, J. J., Alvarez, V., et al. (2017). Background rejection in NEXT using deep neural networks. J. Instrum., 12, T01004–21pp.
Abstract: We investigate the potential of deep learning techniques to reject background events in searches for neutrinoless double beta decay with high-pressure xenon time projection chambers capable of detailed track reconstruction. The differences in the topological signatures of background and signal events can be learned by deep neural networks via training over many thousands of events. These networks can then be used to classify further events as signal or background, providing an additional background rejection factor at an acceptable loss of efficiency. The networks trained in this study performed better, by a factor of 1.2 to 1.6, than previous methods based on the same topological signatures, and there is potential for further improvement.
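To illustrate the kind of classifier the abstract refers to, the sketch below defines a small convolutional network that maps a 2D projection of a voxelized track to a signal probability. The input size, architecture and layer choices are illustrative assumptions, not the network used in the paper.

```python
# Minimal CNN sketch for signal/background classification of track images.
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    keras.Input(shape=(50, 50, 1)),           # one 2D projection of the track
    layers.Conv2D(16, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(1, activation="sigmoid"),    # P(signal)
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
# model.fit(x_train, y_train, validation_split=0.2, epochs=20)  # with labelled events
```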
|
Mendez, V., Amoros, G., Garcia, F., & Salt, J. (2010). Emergent algorithms for replica location and selection in data grid. Futur. Gener. Comp. Syst., 26(7), 934–946.
Abstract: Grid infrastructures for e-Science projects are growing in scale, and improvements in data Grid replication algorithms may be critical for many of them. This paper presents a decentralized replica optimization service, together with a general Emergent Artificial Intelligence (EAI) formulation of the problem. Our aim is to set up a theoretical framework for emergent heuristics in Grid environments. We then describe two EAI approaches to replica optimization, the Particle Swarm Optimization based PSO-Grid Multiswarm Federation and the Ant Colony Optimization based ACO-Grid Asynchronous Colonies Optimization algorithms, with some examples. We also present extended results showing the best performance and scalability features for the PSO-Grid Multiswarm Federation.
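For readers unfamiliar with the underlying heuristic, the following is a plain particle swarm optimization sketch minimizing a generic cost function (e.g. a replica-access cost over candidate placements encoded as real vectors). It shows standard PSO only, not the PSO-Grid Multiswarm Federation algorithm of the paper, and the cost function is a toy example.

```python
# Plain PSO over real-valued vectors; cost() is any function to minimize.
import numpy as np

def pso(cost, dim, n_particles=30, n_iter=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(-1.0, 1.0, size=(n_particles, dim))   # positions
    v = np.zeros_like(x)                                   # velocities
    pbest, pbest_cost = x.copy(), np.array([cost(p) for p in x])
    gbest = pbest[pbest_cost.argmin()].copy()
    for _ in range(n_iter):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        costs = np.array([cost(p) for p in x])
        improved = costs < pbest_cost
        pbest[improved], pbest_cost[improved] = x[improved], costs[improved]
        gbest = pbest[pbest_cost.argmin()].copy()
    return gbest, pbest_cost.min()

# Example: minimize a toy quadratic "transfer cost".
best, best_cost = pso(lambda p: np.sum((p - 0.3) ** 2), dim=5)
```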
|
Rivard, M. J., Granero, D., Perez-Calatayud, J., & Ballester, F. (2010). Influence of photon energy spectra from brachytherapy sources on Monte Carlo simulations of kerma and dose rates in water and air. Med. Phys., 37(2), 869–876.
Abstract: Methods: For Ir-192, I-125, and Pd-103, the authors considered from two to five published spectra. Spherical sources approximating common brachytherapy sources were assessed. Kerma and dose results from GEANT4, MCNP5, and PENELOPE-2008 were compared for water and air. The dosimetric influence of Ir-192, I-125, and Pd-103 spectral choice was determined. Results: For the spectra considered, there were no statistically significant differences between kerma or dose results based on Monte Carlo code choice when using the same spectrum. Water-kerma differences of about 2%, 2%, and 0.7% were observed due to spectrum choice for Ir-192, I-125, and Pd-103, respectively (independent of radial distance), when accounting for photon yield per Bq. Similar differences were observed for air-kerma rate. However, their ratio (as used in the dose-rate constant) did not significantly change when the various photon spectra were selected because the differences compensated each other when dividing dose rate by air-kerma strength. Conclusions: Given the standardization of radionuclide data available from the National Nuclear Data Center (NNDC) and the rigorous infrastructure for performing and maintaining the data set evaluations, NNDC spectra are suggested for brachytherapy simulations in medical physics applications.
Keywords: biomedical materials; brachytherapy; dosimetry; iodine; iridium; Monte Carlo methods; palladium; radioisotopes
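The compensation noted in the conclusions can be written out explicitly using the standard TG-43 definition of the dose-rate constant (an illustrative argument, not a calculation from the paper):

```latex
% With \dot{D}(r_0,\theta_0) the dose rate at the reference point and S_K the
% air-kerma strength, a change of photon spectrum that shifts both quantities
% by nearly the same factor (1 + \epsilon), e.g. the ~2% differences quoted
% above, leaves their ratio essentially unchanged:
\begin{align}
  \Lambda = \frac{\dot{D}(r_0,\theta_0)}{S_K}
  \;\longrightarrow\;
  \frac{(1+\epsilon)\,\dot{D}(r_0,\theta_0)}{(1+\epsilon)\,S_K} = \Lambda .
\end{align}
```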
|