Ballester, F., Tedgren, A. C., Granero, D., Haworth, A., Mourtada, F., Fonseca, G. P., et al. (2015). A generic high-dose rate Ir-192 brachytherapy source for evaluation of model-based dose calculations beyond the TG-43 formalism. Med. Phys., 42(6), 3048–3062.
Abstract: Purpose: In order to facilitate a smooth transition for brachytherapy dose calculations from the American Association of Physicists in Medicine (AAPM) Task Group No. 43 (TG-43) formalism to model-based dose calculation algorithms (MBDCAs), treatment planning systems (TPSs) using an MBDCA require a set of well-defined test case plans characterized by Monte Carlo (MC) methods. This also permits direct dose comparison to TG-43 reference data. Such test case plans should be made available for use in the software commissioning process performed by clinical end users. To this end, a hypothetical, generic high-dose rate (HDR) Ir-192 source and a virtual water phantom were designed, which can be imported into a TPS. Methods: A hypothetical, generic HDR Ir-192 source was designed based on commercially available sources, as well as a virtual, cubic water phantom that can be imported into any TPS in DICOM format. The dose distribution of the generic Ir-192 source when placed at the center of the cubic phantom, and away from the center under altered scatter conditions, was evaluated using two commercial MBDCAs [Oncentra® Brachy with advanced collapsed-cone engine (ACE) and BrachyVision Acuros™]. Dose comparisons were performed using state-of-the-art MC codes for radiation transport, including ALGEBRA, BrachyDose, GEANT4, MCNP5, MCNP6, and PENELOPE2008. The methodologies adhered to recommendations in the AAPM TG-229 report on high-energy brachytherapy source dosimetry. TG-43 dosimetry parameters, an along-away dose-rate table, and primary and scatter separated (PSS) data were obtained. The virtual water phantom of 201³ voxels (1 mm sides) was used to evaluate the calculated dose distributions. Two test case plans involving a single position of the generic HDR Ir-192 source in this phantom were prepared: (i) source centered in the phantom and (ii) source displaced 7 cm laterally from the center. Datasets were independently produced by different investigators.
MC results were then compared against doses calculated using TG-43 and MBDCA methods. Results: TG-43 and PSS datasets were generated for the generic source, the PSS data being intended for use with the ACE algorithm. The dose-rate constant values obtained from seven MC simulations, performed independently using different codes, were in excellent agreement, yielding an average of 1.1109 ± 0.0004 cGy/(h U) (k = 1, Type A uncertainty). MC-calculated dose-rate distributions for the two plans were also found to be in excellent agreement, with differences within Type A uncertainties. Differences between commercial MBDCA and MC results were test, position, and calculation parameter dependent. On average, however, these differences were within 1% for Acuros and 2% for ACE at clinically relevant distances. Conclusions: A hypothetical, generic HDR Ir-192 source was designed and implemented in two commercially available TPSs employing different MBDCAs. Reference dose distributions for this source were benchmarked and used for the evaluation of MBDCA calculations employing a virtual, cubic water phantom in the form of a CT DICOM image series. The implementation of a generic source of identical design in all TPSs using MBDCAs is an important step toward supporting univocal commissioning procedures and direct comparisons between TPSs.
Agaras, M. N., Fiorini, L., et al. (2023). Laser calibration of the ATLAS Tile Calorimeter during LHC Run 2. J. Instrum., 18(6), P06023–35pp.
Abstract: This article reports the laser calibration of the hadronic Tile Calorimeter of the ATLAS experiment in the LHC Run 2 data campaign. The upgraded Laser II calibration system is described. The system was commissioned during the first LHC Long Shutdown, exhibiting a stability better than 0.8% for the laser light monitoring. The methods employed to derive the detector calibration factors with data from the laser calibration runs are also detailed. These made it possible to correct for the response fluctuations of the 9852 photomultiplier tubes of the Tile Calorimeter with a total uncertainty of 0.5% plus a luminosity-dependent subdominant term. Finally, we report the regular monitoring and performance studies using laser events in both standalone runs and during proton collisions. These studies include channel timing and quality inspection, and photomultiplier linearity and response dependence on anode current.
Ortega, P. G., Torres-Espallardo, I., Cerutti, F., Ferrari, A., Gillam, J. E., Lacasta, C., et al. (2015). Noise evaluation of Compton camera imaging for proton therapy. Phys. Med. Biol., 60(5), 1845–1863.
Abstract: Compton cameras have emerged as an alternative for real-time dose monitoring techniques for particle therapy (PT), based on the detection of prompt gammas. As a consequence of the Compton scattering process, the gamma origin point can be restricted to the surface of a cone (Compton cone). Through image reconstruction techniques, the distribution of the gamma emitters can be estimated using cone-surface backprojections of the Compton cones through the image space, along with more sophisticated statistical methods to improve the image quality. To calculate the Compton cone required for image reconstruction, either two interactions, the last being photoelectric absorption, or three scatter interactions are needed. Because of the high energy of the photons in PT, the first option might not be adequate, as the photon is in general not absorbed; the second option, however, is less efficient. This is the reason to resort to spectral reconstructions, where the incoming energy is treated as a variable in the reconstruction inverse problem. Jointly with prompt gammas, secondary neutrons and scattered photons, which are not strongly correlated with the dose map, can also reach the imaging detector and produce false events. These events deteriorate the image quality. In addition, high-intensity beams can produce particle accumulation in the camera, which leads to an increase of random coincidences, i.e., events that gather measurements from different incoming particles. The noise scenario is expected to differ depending on whether double or triple events are used, and consequently the reconstructed images can be affected differently by spurious data. The aim of the present work is to study the effect of false events on the reconstructed image, evaluating their impact on the determination of the beam particle ranges. A simulation study that includes misidentified events (neutrons and random coincidences) in the final image of a Compton telescope for PT monitoring is presented.
The complete chain of detection, from the beam particle entering a phantom to the event classification, is simulated using FLUKA. The particle range is later estimated from the reconstructed image obtained with two- and three-event algorithms based on Maximum Likelihood Expectation Maximization. The neutron background and the random coincidences due to a therapeutic-like time structure are analyzed for mono-energetic proton beams. The time structure of the beam, which affects the rate of particles entering the detector, is included in the simulations.
LHCb Collaboration (Aaij, R., et al.), Jashal, B. K., Martinez-Vidal, F., Oyanguren, A., Remon Alepuz, C., & Ruiz Vidal, J. (2022). Centrality determination in heavy-ion collisions with the LHCb detector. J. Instrum., 17(5), P05009–31pp.
Abstract: The centrality of heavy-ion collisions is directly related to the medium created in these interactions. A procedure to determine the centrality of collisions with the LHCb detector is implemented for lead-lead collisions at √s_NN = 5 TeV and lead-neon fixed-target collisions at √s_NN = 69 GeV. The energy deposits in the electromagnetic calorimeter are used to define the centrality classes. The correspondence between the number of participants and the centrality for the lead-lead collisions is in good agreement with that found in other experiments, and the centrality measurements for the lead-neon collisions presented here are the first performed in fixed-target collisions at the LHC.
Cabrera, M. E., Casas, J. A., Mitsou, V. A., Ruiz de Austri, R., & Terron, J. (2012). Histogram comparison tools for the search of new physics at LHC. Application to the CMSSM. J. High Energy Phys., 04(4), 133–27pp.
Abstract: We propose a rigorous and effective way to compare experimental and theoretical histograms, incorporating the different sources of statistical and systematic uncertainty. This is a useful tool to extract as much information as possible from the comparison of experimental data with theoretical simulations, optimizing the chances of identifying New Physics at the LHC. We illustrate this by showing how a search in the CMSSM parameter space, using Bayesian techniques, can effectively find the correct values of the CMSSM parameters by comparing histograms of events with multijets + missing transverse momentum displayed in the effective-mass variable. The procedure is in fact very efficient at identifying the true supersymmetric model, in case supersymmetry is really there and accessible at the LHC.