Bouhova-Thacker, E., Kostyukhin, V., Koffas, T., Liebig, W., Limper, M., Piacquadio, G. N., et al. (2010). Expected Performance of Vertex Reconstruction in the ATLAS Experiment at the LHC. IEEE Trans. Nucl. Sci., 57(2), 760–767.
Abstract: In the harsh environment of the Large Hadron Collider at CERN (design luminosity of 10^34 cm^-2 s^-1), efficient reconstruction of vertices is crucial for many physics analyses. This paper describes the expected performance of the vertex reconstruction used in the ATLAS experiment. The algorithms for the reconstruction of primary and secondary vertices, as well as for finding photon conversions and vertex reconstruction in jets, are described. The implementation of the vertex algorithms, which follows a modular design based on object-oriented C++, is presented. A user-friendly concept allows event reconstruction and physics analyses to compare and optimize their choice among different vertex reconstruction strategies. The performance of the implemented algorithms has been studied on a variety of Monte Carlo samples, and results are presented.
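As background to the vertex-fitting problem this abstract describes: the simplest vertex fit finds the point minimizing the summed squared distance to a set of tracks, which has a closed-form solution when tracks are modeled as straight lines. The sketch below is a generic illustration (function and variable names are mine, not the ATLAS software's):

```python
import numpy as np

def fit_vertex(points, directions):
    """Least-squares vertex: the point minimizing the summed squared
    perpendicular distance to a set of straight-line tracks.

    points      -- (N, 3) array, one reference point per track
    directions  -- (N, 3) array, track direction vectors (need not be unit)
    """
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for p, d in zip(points, directions):
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)   # projector orthogonal to the track
        A += P
        b += P @ p
    return np.linalg.solve(A, b)         # solve the 3x3 normal equations

# Three tracks passing through the true vertex (1, 2, 3):
true_v = np.array([1.0, 2.0, 3.0])
dirs = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 1.0], [1.0, 1.0, 0.0]])
pts = true_v + 5.0 * dirs                # reference points along each track
v = fit_vertex(pts, dirs)
print(np.round(v, 6))                    # → [1. 2. 3.]
```

Real vertex fitters additionally weight each track by its covariance and handle curved trajectories, but the normal-equation structure is the same.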
Nguyen, C. V., Gillam, J. E., Brown, J. M. C., Martin, D. V., Nikulin, D. A., & Dimmock, M. R. (2011). Towards Optimal Collimator Design for the PEDRO Hybrid Imaging System. IEEE Trans. Nucl. Sci., 58(3), 639–650.
Abstract: The Pixelated Emission Detector for RadiOisotopes (PEDRO) is a hybrid imaging system designed for the measurement of single photon emission from small animal models. The proof-of-principle device consists of a Compton-camera situated behind a mechanical collimator and is intended to provide optimal detection characteristics over a broad spectral range, from 30 to 511 keV. An automated routine has been developed for the optimization of large-area slits in the outer regions of a collimator which has a central region allocated for pinholes. The optimization was tested with a GEANT4 model of the experimental prototype. The data were blurred with the expected position and energy resolution parameters and a Bayesian interaction ordering algorithm was applied. Images were reconstructed using cone back-projection. The results show that the optimization technique allows the large-area slits to both sample fully and extend the primary field of view (FoV) determined by the pinholes. The slits were found to provide truncation of the back-projected cones of response and also an increase in the success rate of the interaction ordering algorithm. These factors resulted in an increase in the contrast and signal-to-noise ratio of the reconstructed image estimates. Of the two configurations tested, the cylindrical geometry outperformed the square geometry, primarily because of a decrease in artifacts. This was due to isotropic modulation of the cone surfaces, which can be achieved with a circular shape. The cylindrical geometry also provided increased sampling of the FoV owing to better positioning of the slits. The use of the cylindrical collimator and application of the transmission function in the reconstruction was found to improve the resolution of the system by a factor of 20, as compared to the uncollimated Compton camera. Although this system is designed for small animal imaging, the technique can be applied to any application of single photon imaging.
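Cone back-projection, the reconstruction method named above, is a standard Compton-camera step: each event constrains the source to a cone whose apex is the first interaction, whose axis points back along the scattered photon, and whose half-angle follows from the Compton formula cos θ = 1 − mₑc²(1/(E₀−E₁) − 1/E₀). A minimal 2-D sketch (the grid, tolerance, and event values are illustrative assumptions, not the PEDRO implementation):

```python
import numpy as np

ME_C2 = 511.0  # electron rest energy, keV

def compton_cone_angle(e0, e1):
    """Opening half-angle of the Compton cone for a photon of initial
    energy e0 (keV) that deposits e1 at the first interaction."""
    cos_t = 1.0 - ME_C2 * (1.0 / (e0 - e1) - 1.0 / e0)
    return np.arccos(np.clip(cos_t, -1.0, 1.0))

def backproject_cone(img, xs, ys, z_plane, r1, r2, e0, e1, tol=0.03):
    """Add one event's cone-of-response to a 2-D image on the plane
    z = z_plane. Pixels whose direction from the cone apex lies within
    `tol` radians of the cone surface are incremented."""
    theta = compton_cone_angle(e0, e1)
    axis = (r1 - r2) / np.linalg.norm(r1 - r2)   # back along scatter direction
    X, Y = np.meshgrid(xs, ys, indexing="ij")
    V = np.stack([X - r1[0], Y - r1[1], np.full_like(X, z_plane - r1[2])], -1)
    V /= np.linalg.norm(V, axis=-1, keepdims=True)
    ang = np.arccos(np.clip(V @ axis, -1.0, 1.0))
    img[np.abs(ang - theta) < tol] += 1.0

# One event whose cone passes through a source at (sx, 0, 0) on z = 0:
e0, e1 = 140.5, 10.0                       # keV: incident, first deposit
theta = compton_cone_angle(e0, e1)
r1 = np.array([0.0, 0.0, 10.0])            # first (scatter) interaction
r2 = np.array([0.0, 0.0, 12.0])            # second (absorption) interaction
sx = 10.0 * np.tan(theta)                  # source offset consistent with theta
xs = ys = np.linspace(-15.0, 15.0, 121)
img = np.zeros((121, 121))
backproject_cone(img, xs, ys, 0.0, r1, r2, e0, e1)
```

Summing many such cones over events concentrates intensity at the true source positions; the slit truncation discussed in the abstract limits where each cone is allowed to deposit counts.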
Miñano, M. (2011). Radiation Hard Silicon Strips Detectors for the SLHC. IEEE Trans. Nucl. Sci., 58(3), 1135–1140.
Abstract: While the Large Hadron Collider (LHC) began taking data in 2009, scenarios for a machine upgrade to achieve a much higher luminosity are being developed. In the current planning, the luminosity of the LHC at CERN is foreseen to increase around 2018. As radiation damage scales with integrated luminosity, the particle physics experiments will need to be equipped with a new generation of radiation-hard detectors. This article reports on the status of the R&D projects on radiation-hard silicon strip detectors for particle physics, linked to the super-LHC (sLHC) upgrade of the ATLAS microstrip detector. The primary focus of this report is on measuring the radiation hardness of the silicon materials and the detectors under study. This involves designing silicon detectors, irradiating them to sLHC radiation levels, and studying their performance as particle detectors. The most promising silicon detector for the different radiation levels in the different regions of the ATLAS microstrip detector is presented. Important challenges related to engineering layout, powering, cooling, and reading out a very large strip detector are presented. Ideas on possible schemes for the layout and support mechanics are shown.
Dimmock, M. R., Nikulin, D. A., Gillam, J. E., & Nguyen, C. V. (2012). An OpenCL Implementation of Pinhole Image Reconstruction. IEEE Trans. Nucl. Sci., 59(4), 1738–1749.
Abstract: A C++/OpenCL software platform for emission image reconstruction of data from pinhole cameras has been developed. The software incorporates a new, accurate but computationally costly, probability distribution function for operating on list-mode data from detector stacks. The platform architecture is more general than that of previous works, supporting advanced models such as arbitrary probability distributions, collimation geometries, and detector stack geometries. The software was implemented such that all performance-critical operations occur on OpenCL devices, generally GPUs. The performance of the software is tested on several commodity CPU and GPU devices.
ATLAS Collaboration (Aad, G., et al.), Cabrera Urban, S., Castillo Gimenez, V., Costa, M. J., Fassi, F., Ferrer, A., et al. (2012). A Particle Consistent with the Higgs Boson Observed with the ATLAS Detector at the Large Hadron Collider. Science, 338(6114), 1576–1582.
Abstract: Nearly 50 years ago, theoretical physicists proposed that a field permeates the universe and gives energy to the vacuum. This field was required to explain why some, but not all, fundamental particles have mass. Numerous precision measurements during recent decades have provided indirect support for the existence of this field, but one crucial prediction of this theory has remained unconfirmed despite 30 years of experimental searches: the existence of a massive particle, the standard model Higgs boson. The ATLAS experiment at the Large Hadron Collider at CERN has now observed the production of a new particle with a mass of 126 giga-electron volts and decay signatures consistent with those expected for the Higgs particle. This result is strong support for the standard model of particle physics, including the presence of this vacuum field. The existence and properties of the newly discovered particle may also have consequences beyond the standard model itself.
DEPFET Collaboration (Alonso, O., et al.), Boronat, M., Esperante-Pereira, D., Fuster, J., Garcia, I. G., Lacasta, C., et al. (2013). DEPFET Active Pixel Detectors for a Future Linear e+e- Collider. IEEE Trans. Nucl. Sci., 60(2), 1457–1465.
Abstract: The DEPFET collaboration develops highly granular, ultra-transparent active pixel detectors for high-performance vertex reconstruction at future collider experiments. The characterization of detector prototypes has proven that the key principle, the integration of a first amplification stage in a detector-grade sensor material, can provide a comfortable signal-to-noise ratio of over 40 for a sensor thickness of 50-75 μm. ASICs have been designed and produced to operate a DEPFET pixel detector with the required read-out speed. A complete detector concept is being developed, including solutions for mechanical support, cooling, and services. In this paper, the status of the DEPFET R&D project is reviewed in the light of the requirements of the vertex detector at a future linear e+e- collider.
Perez, A., & Romanelli, A. (2013). Spatially Dependent Decoherence and Anomalous Diffusion of Quantum Walks. J. Comput. Theor. Nanosci., 10(7), 1591–1595.
Abstract: We analyze the long time behavior of a discrete time quantum walk subject to decoherence with a strong spatial dependence, acting on one half of the lattice. We show that, except for limiting cases on the decoherence parameter, the quantum walk at late times behaves sub-ballistically, meaning that the characteristic features of the quantum walk are not completely spoiled. Contrary to expectations, the asymptotic behavior is non-Markovian, and depends on the amount of decoherence. This feature can be clearly shown on the long time value of the Generalized Chiral Distribution (GCD).
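The ballistic spreading whose partial survival this paper establishes is easy to reproduce for the fully coherent case. Below is a minimal simulation of the standard Hadamard walk on the line (my own illustrative code; the paper's half-lattice decoherence model is not included): the standard deviation grows linearly in time, σ ≈ 0.54 t asymptotically, versus σ = √t for the classical random walk.

```python
import numpy as np

def hadamard_walk(steps):
    """Coherent discrete-time quantum walk on the line with a Hadamard
    coin. Returns the position probability distribution after `steps`."""
    n = 2 * steps + 1                            # positions -steps..steps
    psi = np.zeros((n, 2), dtype=complex)        # amplitudes (position, coin)
    psi[steps] = [1 / np.sqrt(2), 1j / np.sqrt(2)]  # symmetric initial coin
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    for _ in range(steps):
        psi = psi @ H.T                          # coin flip at every site
        new = np.zeros_like(psi)
        new[:-1, 0] = psi[1:, 0]                 # coin 0 moves left
        new[1:, 1] = psi[:-1, 1]                 # coin 1 moves right
        psi = new
    return (np.abs(psi) ** 2).sum(axis=1)

steps = 100
p = hadamard_walk(steps)
x = np.arange(-steps, steps + 1)
sigma = np.sqrt((p * x**2).sum() - ((p * x).sum()) ** 2)
print(sigma)   # ~0.54 * steps (ballistic); classical walk gives sqrt(steps)
```

The decoherence studied in the paper would interleave measurement-like noise on half of the lattice between steps, degrading but, as the authors show, not destroying this spreading.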
Hinarejos, M., Bañuls, M. C., & Perez, A. (2013). A Study of Wigner Functions for Discrete-Time Quantum Walks. J. Comput. Theor. Nanosci., 10(7), 1626–1633.
Abstract: We perform a systematic study of the discrete time Quantum Walk in one dimension using Wigner functions, which are generalized to include the chirality (or coin) degree of freedom. In particular, we analyze the evolution of the negative volume in phase space, as a function of time, for different initial states. This negativity can be used to quantify the degree of departure of the system from a classical state. We also relate this quantity to the entanglement between the coin and walker subspaces.
Brown, J. M. C., Gillam, J. E., Paganin, D. M., & Dimmock, M. R. (2013). Laplacian Erosion: An Image Deblurring Technique for Multi-Plane Gamma-Cameras. IEEE Trans. Nucl. Sci., 60(5), 3333–3342.
Abstract: Laplacian Erosion, an image deblurring technique for multi-plane Gamma-cameras, has been developed and tested for planar imaging using a GEANT4 Monte Carlo model of the Pixelated Emission Detector for RadiOisotopes (PEDRO) as a test platform. A contrast phantom and a Derenzo-like phantom, both composed of I-125, were employed to investigate the dependence of the performance of Laplacian Erosion on the detection-plane offset and the pinhole geometry. Three different pinhole geometries were tested. It was found that, for the test system, the performance of Laplacian Erosion was inversely proportional to the detection-plane offset and directly proportional to the pinhole diameter. All tested pinhole geometries saw a reduction in the level of image blurring associated with the pinhole geometry; however, this reduction came at the cost of signal-to-noise ratio in the image. The application of Laplacian Erosion was shown to reduce the level of image blurring associated with pinhole geometry and to improve recovered image quality in multi-plane Gamma-cameras for the targeted radiotracer I-125.
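Laplacian Erosion itself is the authors' technique and its details are in the paper. Its classical relative, Laplacian sharpening, deblurs by subtracting a scaled discrete Laplacian, boosting the high frequencies that blurring suppresses. The sketch below is this generic stand-in, not the paper's algorithm:

```python
import numpy as np

def laplacian(img):
    """5-point discrete Laplacian with edge clamping."""
    p = np.pad(img, 1, mode="edge")
    return (p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:]
            - 4.0 * img)

def laplacian_sharpen(img, k=0.5):
    """Classical deblurring step: f_sharp = f - k * Laplacian(f)."""
    return img - k * laplacian(img)

# A blurred step edge (values 0, 1/3, 2/3, 1) regains contrast:
profile = np.concatenate([np.zeros(9), [1 / 3, 2 / 3], np.ones(9)])
blurred = np.tile(profile, (20, 1))
sharpened = laplacian_sharpen(blurred)
```

As the abstract notes for Laplacian Erosion, the trade-off is generic to Laplacian-based deblurring: the same operation that steepens edges also amplifies pixel-scale noise, costing signal-to-noise ratio.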
Cabello, J., Torres-Espallardo, I., Gillam, J. E., & Rafecas, M. (2013). PET Reconstruction From Truncated Projections Using Total-Variation Regularization for Hadron Therapy Monitoring. IEEE Trans. Nucl. Sci., 60(5), 3364–3372.
Abstract: Hadron therapy exploits the properties of ion beams to treat tumors by maximizing the dose released to the target and sparing healthy tissue. With hadron beams, the dose distribution shows a relatively low entrance dose which rises sharply at the end of the range, producing the characteristic Bragg peak that drops quickly thereafter. Knowing where the delivered dose profile ends (the location of the Bragg peak) is of critical importance, both to avoid damaging surrounding healthy tissue and to prevent underdosage of the target. During hadron therapy, short-lived β+-emitters are produced along the beam path, their distribution being correlated with the delivered dose. Following positron annihilation, two photons are emitted, which can be detected using a positron emission tomography (PET) scanner. The low yield of emitters, their short half-life, and the wash-out from the target region make the use of PET, even only a few minutes after hadron irradiation, a challenging application. In-beam PET is a potential candidate for estimating the distribution of β+-emitters during or immediately after irradiation, at the cost of truncation effects and degraded image quality due to the partial rings required of the PET scanner. Time-of-flight (ToF) information can potentially be used to compensate for truncation effects and to enhance image contrast. However, the highly demanding timing performance required in ToF-PET makes this option costly. Alternatively, the use of maximum-a-posteriori expectation-maximization (MAP-EM), including total variation (TV) in the cost function, produces images with low noise while preserving spatial resolution.
In this paper, we compare data reconstructed with maximum-likelihood expectation-maximization (ML-EM) and with MAP-EM using TV as a prior, and the impact of including ToF information, for data acquired with a complete and a partial-ring PET scanner from simulated hadron beams interacting with a polymethyl methacrylate (PMMA) target. The results show that MAP-EM, in the absence of ToF information, produces lower-noise images that are closer to the simulated β+ distributions than those obtained with ML-EM and ToF information of the order of 200-600 ps. The investigation is extended to the combination of MAP-EM and ToF information, to study the performance limit when both approaches are used together.
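ML-EM, the baseline algorithm in this comparison, is the standard multiplicative update x ← x / (Aᵀ1) ⊙ Aᵀ(y / Ax). A toy sketch on a random system matrix follows (the matrix is a stand-in for real PET geometry and is my own illustrative choice; one common MAP-EM variant, the one-step-late scheme, would add the gradient of the TV penalty to the sensitivity denominator):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy system: 30 detector bins viewing 20 image voxels through a random
# non-negative system matrix A (a stand-in for a real PET geometry).
A = rng.random((30, 20))
x_true = rng.random(20) * 10.0
y = A @ x_true                       # noiseless "measured" counts

def mlem(A, y, iters=500):
    """Plain ML-EM: x <- x / (A^T 1) * A^T (y / (A x))."""
    x = np.ones(A.shape[1])          # uniform, strictly positive start
    sens = A.sum(axis=0)             # sensitivity image A^T 1
    for _ in range(iters):
        x *= (A.T @ (y / (A @ x))) / sens
    return x

x_hat = mlem(A, y)
```

The multiplicative form keeps the estimate non-negative automatically, which is why EM-type updates dominate emission tomography; the TV prior discussed in the paper trades a small bias for strong noise suppression at comparable resolution.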