Llosa, G., Trovato, M., Barrio, J., Etxebeste, A., Muñoz, E., Lacasta, C., et al. (2016). First Images of a Three-Layer Compton Telescope Prototype for Treatment Monitoring in Hadron Therapy. Front. Oncol., 6, 14–6pp.
Abstract: A Compton telescope for dose monitoring in hadron therapy is under development at IFIC. The system consists of three layers of LaBr3 crystals coupled to silicon photomultiplier arrays. Na-22 sources have been successfully imaged, reconstructing the data with an ML-EM code. Calibration and temperature stabilization are necessary for prototype operation at low coincidence rates. A spatial resolution of 7.8 mm FWHM has been obtained in the first imaging tests.
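The event reconstruction in a Compton telescope rests on the Compton kinematics relating the energies deposited in the layers to the scattering angle, which defines the cone used by ML-EM. A minimal sketch of that relation (not the authors' reconstruction code; energies in keV, full absorption of the photon assumed):

```python
import math

ME_C2 = 511.0  # electron rest energy in keV

def compton_cone_angle(e1_kev, e2_kev):
    """Opening angle (radians) of the Compton cone for a two-interaction
    event: e1 deposited in the scatterer layer, e2 absorbed downstream.
    With full absorption the initial photon energy is e0 = e1 + e2, and
    cos(theta) = 1 - me*c^2 * (1/e2 - 1/e0)."""
    e0 = e1_kev + e2_kev
    cos_theta = 1.0 - ME_C2 * (1.0 / e2_kev - 1.0 / e0)
    if not -1.0 <= cos_theta <= 1.0:
        raise ValueError("kinematically forbidden energy pair")
    return math.acos(cos_theta)

# e.g. a 511 keV photon depositing 170 keV in the first layer
angle = compton_cone_angle(170.0, 341.0)  # roughly 60 degrees
```

Events whose energies violate the kinematic bound are rejected, which is one reason calibration matters for image quality.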
Oliver, J. F., & Rafecas, M. (2016). Modelling Random Coincidences in Positron Emission Tomography by Using Singles and Prompts: A Comparison Study. PLoS ONE, 11(9), e0162096–22pp.
Abstract: Random coincidences degrade the image in Positron Emission Tomography (PET). To compensate for their degradation effects, the rate of random coincidences should be estimated. Under certain circumstances, current estimation methods fail to provide accurate results. We propose a novel method, “Singles-Prompts” (SP), that includes the information conveyed by prompt coincidences and models the pile-up. The SP method has the same structure as the well-known “Singles Rate” (SR) approach; hence, SP can straightforwardly replace SR. In this work, the SP method has been extensively assessed and compared to two conventional methods, SR and the delayed window (DW) method, in a preclinical PET scenario using Monte-Carlo simulations. SP offers accurate estimates of the randoms rates, while SR and DW tend to overestimate them (by ~10% and ~5%, respectively). With pile-up, the SP method is more robust than SR (but less than DW). At the image level, the contrast is overestimated in SR-corrected images, +16%, while SP produces the correct value. Spill-over is slightly reduced using SP instead of SR. The DW image values are similar to those of SP except for low-statistics scenarios, where DW behaves as if randoms were not compensated for; in particular, the contrast is reduced, -16%. In general, the better estimates of SP translate into better image quality.
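The "Singles Rate" baseline that SP is compared against is the textbook estimate R_ij = 2·τ·S_i·S_j for the random-coincidence rate between a detector pair. A minimal sketch of that formula (the function name and units are illustrative; the SP model itself is more involved and is described in the paper):

```python
def randoms_rate_sr(singles_i, singles_j, tau):
    """Singles-rate (SR) estimate of the random-coincidence rate between
    detectors i and j: R_ij = 2 * tau * S_i * S_j, with the singles
    rates S in counts/s and tau the coincidence window width in seconds."""
    return 2.0 * tau * singles_i * singles_j

# e.g. two detectors at 1e5 counts/s each, 5 ns coincidence window
rate = randoms_rate_sr(1e5, 1e5, 5e-9)  # 100.0 randoms/s
```

The quadratic dependence on the singles rates is what makes the estimate sensitive to pile-up at high count rates, the regime where SP is reported to be more robust.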
Oliver, J. F., Fuster-Garcia, E., Cabello, J., Tortajada, S., & Rafecas, M. (2013). Application of Artificial Neural Network for Reducing Random Coincidences in PET. IEEE Trans. Nucl. Sci., 60(5), 3399–3409.
Abstract: Positron Emission Tomography (PET) is based on the detection in coincidence of the two photons created in a positron annihilation. In conventional PET, this coincidence identification is usually carried out through a coincidence electronic unit. An accidental coincidence occurs when two photons arising from different annihilations are classified as a coincidence. Accidental coincidences are one of the main sources of image degradation in PET. Some novel systems allow coincidences to be selected post-acquisition in software, or in real time through a digital coincidence engine in an FPGA. These approaches provide the user with extra flexibility in the sorting process and allow the application of alternative coincidence sorting procedures. In this work a novel sorting procedure based on Artificial Neural Network (ANN) techniques has been developed. It has been compared to a conventional coincidence sorting algorithm based on a time coincidence window. The data have been obtained from Monte-Carlo simulations; a small-animal PET scanner has been modelled to this end. The efficiency (the ratio of correct identifications) can be selected for both methods: in one case by changing the actual value of the coincidence window used, and in the other by changing a threshold at the output of the neural network. At matched efficiencies, the ANN-based method always produces a sorted output with a smaller random fraction. In addition, two differential trends are found: the conventional method presents a maximum achievable efficiency, while the ANN-based method is able to increase the efficiency up to unity, the ideal value, at the cost of increasing the random fraction. Images reconstructed using ANN-sorted data (no compensation for randoms) present better contrast, and those image features which are most affected by randoms are enhanced. For the image quality phantom used in the paper, the ANN method decreases the spill-over ratio by 18%.
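The conventional baseline the ANN is compared against can be sketched as a time-window sorter over timestamped singles: exactly two events inside the window form a prompt pair, and larger groups are discarded as multiple coincidences. A simplified illustration, not the paper's implementation (timestamps in ns, input assumed time-ordered; the multiples-handling policy is one of several possible choices):

```python
def sort_coincidences(singles, window_ns):
    """Sort a time-ordered list of (time_ns, channel) singles into
    coincidence pairs using a fixed time window. Exactly two events
    within the window of the first event form a pair; groups of three
    or more are rejected as multiple coincidences."""
    pairs = []
    i, n = 0, len(singles)
    while i < n:
        j = i + 1
        # gather every event within the window opened by event i
        while j < n and singles[j][0] - singles[i][0] <= window_ns:
            j += 1
        if j - i == 2:                      # exactly two events: a prompt pair
            pairs.append((singles[i], singles[i + 1]))
        i = j if j > i + 1 else i + 1       # skip the whole group
    return pairs

events = [(0, 'a'), (3, 'b'), (100, 'c'), (200, 'd'), (202, 'e')]
prompts = sort_coincidences(events, 5)      # two pairs; (100, 'c') is unpaired
```

Widening `window_ns` raises the efficiency but also the random fraction, which is the trade-off the ANN threshold replaces.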
Tetrault, M. A., Oliver, J. F., Bergeron, M., Lecomte, R., & Fontaine, R. (2010). Real Time Coincidence Detection Engine for High Count Rate Timestamp Based PET. IEEE Trans. Nucl. Sci., 57(1), 117–124.
Abstract: Coincidence engines follow two main implementation flows: timestamp based systems and AND-gate based systems. The latter have been more widespread in recent years because of their lower cost and high efficiency. However, they are highly dependent on the selected electronic components, have limited flexibility once assembled, and are customized to fit a specific scanner's geometry. Timestamp based systems are gathering more attention lately, especially with high channel count fully digital systems. These new systems must however cope with high singles count rates. One option is to record every detected event and postpone coincidence detection offline. For daily use systems, a real time engine is preferable because it dramatically reduces data volume and hence image preprocessing time and raw data management. This paper presents the timestamp based coincidence engine for the LabPET(TM), a small animal PET scanner with up to 4608 individual readout avalanche photodiode channels. The engine can handle up to 100 million single events per second and has extensive flexibility because it resides in programmable logic devices. It can be adapted for any detector geometry or channel count, can be ported to newer, faster programmable devices and can have extra modules added to take advantage of scanner-specific features. Finally, the user can select between full processing mode for imaging protocols and minimum processing mode to study different approaches for coincidence detection with offline software.
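The core idea of a timestamp-based engine is to merge the per-channel, time-ordered singles streams and test neighbours in the merged stream against the coincidence window. A toy software sketch of that idea (the LabPET engine itself runs in programmable logic; the pairing policy here is deliberately simplified):

```python
import heapq

def coincidence_stream(channel_streams, window_ns):
    """Merge per-channel, time-ordered singles streams (iterables of
    (time_ns, channel)) and yield prompt pairs on the fly: each event is
    compared to its immediate predecessor in the merged stream, and a
    single is consumed by at most one pair."""
    prev = None
    for event in heapq.merge(*channel_streams):
        if prev is not None and event[0] - prev[0] <= window_ns \
                and event[1] != prev[1]:
            yield (prev, event)
            prev = None                 # predecessor consumed by this pair
        else:
            prev = event

ch0 = [(0, 0), (100, 0)]
ch1 = [(3, 1), (250, 1)]
prompts = list(coincidence_stream([ch0, ch1], 5))
```

Because the merge works incrementally on sorted streams, only a small buffer is needed, which is what makes a real-time implementation attractive compared with storing every single for offline sorting.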
Muñoz, E., Barrio, J., Bemmerer, D., Etxebeste, A., Fiedler, F., Hueso-Gonzalez, F., et al. (2018). Tests of MACACO Compton telescope with 4.44 MeV gamma rays. J. Instrum., 13, P05007–13pp.
Abstract: Hadron therapy offers the possibility of delivering a large amount of radiation dose to tumors with minimal absorption by the surrounding healthy tissue. In order to fully exploit the advantages of this technique, the use of real-time beam monitoring devices becomes mandatory. Compton imaging devices can be employed to map the distribution of prompt gamma emission during the treatment and thus assess its correct delivery. The Compton telescope prototype developed at IFIC-Valencia for this purpose is made of three layers of LaBr3 crystals coupled to silicon photomultipliers. The system has been tested in a 4.44 MeV gamma field at the 3 MV Tandetron accelerator at HZDR, Dresden. Images of the target with the system in three different positions separated by 10 mm were successfully reconstructed. This indicates the ability of MACACO to image prompt gamma rays emitted at such energies.
Keywords: Compton imaging; Instrumentation for hadron therapy; Gamma detectors (scintillators, CZT, HPGe, HgI etc); Photon detectors for UV, visible and IR photons (solid state) (PIN diodes, APDs, Si-PMTs, G-APDs, CCDs, EBCCDs, EMCCDs etc)