Author: Ortega, P.G.; Torres-Espallardo, I.; Cerutti, F.; Ferrari, A.; Gillam, J.E.; Lacasta, C.; Llosa, G.; Oliver, J.F.; Sala, P.R.; Solevi, P.; Rafecas, M.
Title: Noise evaluation of Compton camera imaging for proton therapy  Type: Journal Article
Year: 2015  Publication: Physics in Medicine and Biology  Abbreviated Journal: Phys. Med. Biol.
Volume: 60  Issue: 5  Pages: 1845-1863
Keywords: proton therapy; Compton camera; Monte Carlo methods; FLUKA; prompt gamma; range verification; MLEM
Abstract: Compton cameras have emerged as an alternative for real-time dose monitoring in particle therapy (PT), based on the detection of prompt gammas. As a consequence of the Compton scattering process, the gamma origin point can be restricted to the surface of a cone (Compton cone). Through image reconstruction techniques, the distribution of the gamma emitters can be estimated, using cone-surface backprojections of the Compton cones through the image space, along with more sophisticated statistical methods to improve the image quality. To calculate the Compton cone required for image reconstruction, either two interactions, the last being photoelectric absorption, or three scatter interactions are needed. Because of the high energy of the photons in PT, the first option might not be adequate, as the photon is in general not absorbed; the second option, however, is less efficient. That is the reason to resort to spectral reconstructions, where the incoming energy is treated as a variable in the reconstruction inverse problem. Together with prompt gammas, secondary neutrons and scattered photons, which are not strongly correlated with the dose map, can also reach the imaging detector and produce false events. These events deteriorate the image quality. In addition, high-intensity beams can produce particle accumulation in the camera, which leads to an increase in random coincidences, i.e. events that gather measurements from different incoming particles. The noise scenario is expected to differ depending on whether double or triple events are used, and consequently the reconstructed images can be affected differently by spurious data. The aim of the present work is to study the effect of false events on the reconstructed image, evaluating their impact on the determination of the beam particle ranges. A simulation study that includes misidentified events (neutrons and random coincidences) in the final image of a Compton telescope for PT monitoring is presented. The complete chain of detection, from the beam particle entering a phantom to the event classification, is simulated using FLUKA. The range is then estimated from the reconstructed image obtained with a two- and three-event algorithm based on Maximum Likelihood Expectation Maximization. The neutron background and the random coincidences due to a therapeutic-like time structure are analyzed for mono-energetic proton beams. The time structure of the beam, which affects the rate of particles entering the detector, is included in the simulations. (A minimal sketch of the cone computation and the MLEM update follows this record.)
Address: [Ortega, P. G.; Cerutti, F.; Ferrari, A.] CERN European Org Nucl Res, CH-1217 Meyrin, Switzerland. Email: pgarciao@cern.ch
Corporate Author:  Thesis:
Publisher: IOP Publishing Ltd  Place of Publication:  Editor:
Language: English  Summary Language:  Original Title:
Series Editor:  Series Title:  Abbreviated Series Title:
Series Volume:  Series Issue:  Edition:
ISSN: 0031-9155  ISBN:  Medium:
Area:  Expedition:  Conference:
Notes: WOS:000349530700009  Approved: no
Is ISI: yes  International Collaboration: yes
Call Number: IFIC @ pastor @  Serial: 2115
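The record above describes reconstructing prompt-gamma emission points from Compton cones and iterating a Maximum Likelihood Expectation Maximization (MLEM) image estimate. The Python sketch below is not the authors' code: it only illustrates, under simplified assumptions (full absorption in the second interaction, m_e c^2 = 0.511 MeV, a dense generic system matrix), how a two-interaction Compton cone angle and one MLEM update could be computed; all function names and numbers are illustrative.

import numpy as np

M_E_C2 = 0.511  # electron rest energy in MeV

def compton_cone_angle(e1, e2):
    # Opening angle (rad) of the Compton cone for a two-interaction event:
    # the photon Compton-scatters once (depositing e1) and is then fully
    # absorbed (depositing e2), so the initial photon energy is e1 + e2.
    cos_theta = 1.0 - M_E_C2 * (1.0 / e2 - 1.0 / (e1 + e2))
    if not -1.0 <= cos_theta <= 1.0:
        return None  # kinematically inconsistent: escape, neutron hit or random coincidence
    return np.arccos(cos_theta)

def mlem_update(lam, A, y, eps=1e-12):
    # One MLEM step: lam is the current image estimate (n_voxels,),
    # A maps image to expected counts (n_events, n_voxels), e.g. cone-surface
    # backprojection weights, and y holds the measured counts per event bin.
    sensitivity = A.sum(axis=0) + eps
    expected = A @ lam + eps
    return lam / sensitivity * (A.T @ (y / expected))

if __name__ == "__main__":
    # Toy numbers: a 4.44 MeV prompt gamma depositing 1.2 MeV in the scatterer
    # and 3.24 MeV in the absorber.
    theta = compton_cone_angle(e1=1.2, e2=3.24)
    print(f"cone half-angle: {np.degrees(theta):.1f} deg")

    # Toy MLEM run on random data, just to show the shape of the iteration.
    rng = np.random.default_rng(0)
    A = rng.random((200, 50))
    y = rng.poisson(A @ rng.random(50)).astype(float)
    lam = np.ones(50)
    for _ in range(20):
        lam = mlem_update(lam, A, y)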
 

 
Author: Kasieczka, G. et al.; Sanz, V.
Title: The LHC Olympics 2020: a community challenge for anomaly detection in high energy physics  Type: Journal Article
Year: 2021  Publication: Reports on Progress in Physics  Abbreviated Journal: Rep. Prog. Phys.
Volume: 84  Issue: 12  Pages: 124201 (64pp)
Keywords: anomaly detection; machine learning; unsupervised learning; weakly supervised learning; semisupervised learning; beyond the standard model; model-agnostic methods
Abstract: A new paradigm for data-driven, model-agnostic new physics searches at colliders is emerging, aiming to leverage recent breakthroughs in anomaly detection and machine learning. In order to develop and benchmark new anomaly detection methods within this framework, it is essential to have standard datasets. To this end, we have created the LHC Olympics 2020, a community challenge accompanied by a set of simulated collider events. Participants in these Olympics have developed their methods using an R&D dataset and then tested them on black boxes: datasets with an unknown anomaly (or not). Methods made use of modern machine learning tools and were based on unsupervised learning (autoencoders, generative adversarial networks, normalizing flows), weakly supervised learning, and semi-supervised learning. This paper reviews the LHC Olympics 2020 challenge, including an overview of the competition, a description of the methods deployed in the competition, lessons learned from the experience, and implications for data analyses with future datasets as well as future colliders. (A minimal autoencoder-based anomaly-scoring sketch follows this record.)
Address: [Kasieczka, Gregor] Univ Hamburg, Inst Expt Phys, Hamburg, Germany. Email: gregor.kasieczka@uni-hamburg.de
Corporate Author:  Thesis:
Publisher: IOP Publishing Ltd  Place of Publication:  Editor:
Language: English  Summary Language:  Original Title:
Series Editor:  Series Title:  Abbreviated Series Title:
Series Volume:  Series Issue:  Edition:
ISSN: 0034-4885  ISBN:  Medium:
Area:  Expedition:  Conference:
Notes: WOS:000727698500001  Approved: no
Is ISI: yes  International Collaboration: yes
Call Number: IFIC @ pastor @  Serial: 5039
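The abstract above lists autoencoders among the unsupervised methods used in the challenge. The PyTorch sketch below is not taken from the paper or from any LHC Olympics submission; it is a generic illustration, with placeholder names and random stand-in features, of how the reconstruction error of a small dense autoencoder can serve as an anomaly score.

import torch
from torch import nn

class DenseAutoencoder(nn.Module):
    # Small dense autoencoder: events that reconstruct poorly (high MSE)
    # are flagged as anomaly candidates.
    def __init__(self, n_features, latent_dim=4):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(n_features, 64), nn.ReLU(),
            nn.Linear(64, latent_dim),
        )
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 64), nn.ReLU(),
            nn.Linear(64, n_features),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

def train_and_score(x_train, x_test, epochs=50, lr=1e-3):
    # Train on (mostly background) events, return a per-event MSE score.
    model = DenseAutoencoder(x_train.shape[1])
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(x_train), x_train)
        loss.backward()
        opt.step()
    with torch.no_grad():
        return ((model(x_test) - x_test) ** 2).mean(dim=1)

if __name__ == "__main__":
    # Random stand-ins for per-event features (e.g. jet masses, n-subjettiness ratios).
    x_train = torch.randn(5000, 8)
    x_test = torch.randn(1000, 8)
    scores = train_and_score(x_train, x_test)
    threshold = torch.quantile(scores, 0.99)  # keep the 1% most anomalous events
    print(f"flagged {(scores > threshold).sum().item()} candidate events")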