LHCb Collaboration (Aaij, R., et al.), Jaimes Elles, S. J., Jashal, B. K., Martinez-Vidal, F., Oyanguren, A., Rebollo De Miguel, M., et al. (2023). Evidence for Modification of b Quark Hadronization in High-Multiplicity pp Collisions at √s = 13 TeV. Phys. Rev. Lett., 131(6), 061901–11pp.
Abstract: The production rate of B_s^0 mesons relative to B^0 mesons is measured by the LHCb experiment in pp collisions at a center-of-mass energy √s = 13 TeV over the forward rapidity interval 2 < y < 4.5 as a function of the charged-particle multiplicity measured in the event. Evidence at the 3.4σ level is found for an increase of the ratio of B_s^0 to B^0 cross sections with multiplicity at transverse momenta below 6 GeV/c, with no significant multiplicity dependence at higher transverse momentum. Comparison with data from e+e− collisions implies that the density of the hadronic medium may affect the production rates of B mesons. This is qualitatively consistent with the emergence of quark coalescence as an additional hadronization mechanism in high-multiplicity collisions.
|
Du, M. L., Filin, A., Baru, V., Dong, X. K., Epelbaum, E., Guo, F. K., et al. (2023). Role of Left-Hand Cut Contributions on Pole Extractions from Lattice Data: Case Study for Tcc(3875)+. Phys. Rev. Lett., 131(13), 131903–7pp.
Abstract: We discuss recent lattice data for the T_cc(3875)^+ state to stress, for the first time, a potentially strong impact of left-hand cuts from the one-pion exchange on the pole extraction for near-threshold exotic states. In particular, if the left-hand cut is located close to the two-particle threshold, which happens naturally in the DD* system for pion masses exceeding the physical value, the effective-range expansion is valid only in a very limited energy range up to the cut and as such is of little use for reliably extracting the poles. An accurate extraction of the pole locations then requires the one-pion exchange to be implemented explicitly in the scattering amplitudes. Our findings are general and potentially relevant for a wide class of near-threshold hadronic states.
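The constraint described in this abstract can be sketched with the standard static one-pion-exchange kinematics (a schematic summary with symbols as commonly defined, not equations taken from the paper itself). For S-wave DD* scattering with relative momentum k, the effective-range expansion and the position of the one-pion-exchange left-hand branch point read:

```latex
% Effective-range expansion (ERE), valid only below the nearest non-analyticity:
\[
  k\cot\delta(k) \;=\; -\frac{1}{a} \;+\; \frac{r}{2}\,k^{2} \;+\; \dots
\]
% Static one-pion-exchange potential between D and D*, with the energy
% transfer m_{D^*} - m_D absorbed into an effective pion mass:
\[
  V_{\mathrm{OPE}}(\vec q\,) \;\propto\; \frac{1}{\vec q^{\,2} + \mu^{2}},
  \qquad
  \mu^{2} \;=\; m_{\pi}^{2} - \left(m_{D^{*}} - m_{D}\right)^{2},
\]
% Its partial-wave projection develops a left-hand branch point at
\[
  k^{2}_{\mathrm{lhc}} \;=\; -\,\frac{\mu^{2}}{4},
\]
% so the ERE can converge only in the window |k^2| < \mu^2/4.
```

At the physical point μ² is slightly negative (m_D* − m_D marginally exceeds m_π, so the pion can go on shell); for heavier-than-physical lattice pions μ² turns small and positive, the left-hand cut sits just below the DD* threshold, and the ERE convergence window shrinks accordingly, which is the situation the abstract describes.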
|
Borja-Lloret, M., Barrientos, L., Bernabeu, J., Lacasta, C., Muñoz, E., Ros, A., et al. (2023). Influence of the background in Compton camera images for proton therapy treatment monitoring. Phys. Med. Biol., 68(14), 144001–16pp.
Abstract: Objective. Background events are one of the most relevant contributions to image degradation in Compton camera imaging for hadron therapy treatment monitoring. A study of the background and its contribution to image degradation is important to define future strategies for background reduction in the system. Approach. In this simulation study, the percentages of different kinds of events and their contributions to the reconstructed image in a two-layer Compton camera have been evaluated. To this end, GATE v8.2 simulations of a proton beam impinging on a PMMA phantom have been carried out for different proton beam energies and beam intensities. Main results. For a simulated Compton camera made of lanthanum(III) bromide monolithic crystals, coincidences caused by neutrons arriving from the phantom are the most common type of background produced by secondary radiation in the Compton camera, causing between 13% and 33% of the detected coincidences, depending on the beam energy. Results also show that random coincidences are a significant cause of image degradation at high beam intensities, and their influence on the reconstructed images is studied for time coincidence windows from 500 ps to 100 ns. Significance. Results indicate the timing capabilities required to retrieve the fall-off position with good precision. Still, the noise observed in the image even when no randoms are considered motivates further background-rejection methods.
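The trade-off between the coincidence window and the random-coincidence rate studied in this paper can be illustrated with the textbook accidental-coincidence estimate R_rand ≈ 2τ·R1·R2 for a two-detector system. A minimal sketch; the singles rates below are hypothetical and purely illustrative, not values from the paper:

```python
def random_coincidence_rate(tau_s, singles_rate_1_hz, singles_rate_2_hz):
    """Standard accidental-coincidence estimate for two detectors with
    uncorrelated singles rates R1, R2 and coincidence window tau:
    R_rand ~ 2 * tau * R1 * R2 (valid when tau * R << 1)."""
    return 2.0 * tau_s * singles_rate_1_hz * singles_rate_2_hz

# Hypothetical singles rate of 1 MHz per layer, for illustration only.
r1 = r2 = 1.0e6  # Hz
for tau in (500e-12, 10e-9, 100e-9):  # windows from 500 ps to 100 ns
    rate = random_coincidence_rate(tau, r1, r2)
    print(f"tau = {tau:.1e} s -> random rate = {rate:.3g} Hz")
```

The linear scaling with τ is why the paper scans the window from 500 ps up to 100 ns: at fixed beam intensity, a 200× wider window admits 200× more randoms.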
|
Brzezinski, K., et al. (2023). Detection of range shifts in proton beam therapy using the J-PET scanner: a patient simulation study. Phys. Med. Biol., 68(14), 145016–17pp.
Abstract: Objective. The Jagiellonian positron emission tomography (J-PET) technology, based on plastic scintillators, has been proposed as a cost-effective tool for detecting range deviations during proton therapy. This study investigates the feasibility of using J-PET for range monitoring by means of a detailed Monte Carlo simulation study of 95 patients who underwent proton therapy at the Cyclotron Centre Bronowice (CCB) in Krakow, Poland. Approach. Discrepancies between prescribed and delivered treatments were artificially introduced in the simulations by means of shifts in patient positioning and in the Hounsfield-unit-to-relative-proton-stopping-power calibration curve. A dual-layer, cylindrical J-PET geometry was simulated in an in-room monitoring scenario and a triple-layer, dual-head geometry in an in-beam protocol. The distribution of range shifts in reconstructed PET activity was visualized in the beam's eye view. Linear prediction models were constructed from all patients in the cohort, using the mean shift in reconstructed PET activity as a predictor of the mean proton range deviation. Main results. Maps of deviations in the range of reconstructed PET distributions showed agreement with those of deviations in dose range in most patients. The linear prediction model showed a good fit, with coefficient of determination r^2 = 0.84 (in-room) and 0.75 (in-beam). Residual standard error was below 1 mm: 0.33 mm (in-room) and 0.23 mm (in-beam). Significance. The precision of the proposed prediction models shows the sensitivity of the proposed J-PET scanners to shifts in proton range for a wide range of clinical treatment plans. Furthermore, it motivates the use of such models as a tool for predicting proton range deviations and opens up new prospects for investigations into the use of intra-treatment PET images for predicting clinical metrics that aid in the assessment of the quality of delivered treatment.
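The quality metrics quoted in this abstract (coefficient of determination r^2 and residual standard error of a one-predictor linear model) can be reproduced with ordinary least squares. A minimal sketch; the per-patient values below are hypothetical, not data from the study:

```python
def fit_line(x, y):
    """Ordinary least squares fit of y ~ a + b*x; returns (a, b)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b

def r_squared(x, y, a, b):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    my = sum(y) / len(y)
    ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return 1.0 - ss_res / ss_tot

def residual_std_error(x, y, a, b):
    """Residual standard error with n - 2 degrees of freedom
    (two fitted parameters: intercept and slope)."""
    ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    return (ss_res / (len(x) - 2)) ** 0.5

# Hypothetical cohort: mean PET-activity shift vs. true range deviation (mm).
pet_shift = [0.0, 1.1, 1.9, 3.2, 4.1, 5.0]
range_dev = [0.1, 1.0, 2.1, 2.9, 4.2, 4.8]
a, b = fit_line(pet_shift, range_dev)
print(f"r^2 = {r_squared(pet_shift, range_dev, a, b):.3f}, "
      f"RSE = {residual_std_error(pet_shift, range_dev, a, b):.3f} mm")
```

An r^2 near 1 and an RSE well below 1 mm, as reported in the paper, indicate that the PET-activity shift tracks the dose range deviation tightly across the cohort.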
|
|
Gammaldi, V., Zaldivar, B., Sanchez-Conde, M. A., & Coronado-Blazquez, J. (2023). A search for dark matter among Fermi-LAT unidentified sources with systematic features in machine learning. Mon. Not. Roy. Astron. Soc., 520(1), 1348–1361.
Abstract: Around one-third of the point-like sources in the Fermi-LAT catalogues remain as unidentified sources (unIDs) today. Indeed, these unIDs lack a clear, univocal association with a known astrophysical source. If dark matter (DM) is composed of weakly interacting massive particles (WIMPs), there is the exciting possibility that some of these unIDs may actually be DM sources, emitting gamma rays from WIMP annihilation. We propose a new approach to solve the standard, machine learning (ML) binary classification problem of disentangling prospective DM sources (simulated data) from astrophysical sources (observed data) among the unIDs of the 4FGL Fermi-LAT catalogue. We artificially build two systematic features for the DM data which are originally inherent to observed data: the detection significance and the uncertainty on the spectral curvature. We do so by sampling from the observed population of unIDs, assuming that the DM distributions would, if any, follow the latter. We consider different ML models: Logistic Regression, Neural Network (NN), Naive Bayes, and Gaussian Process, out of which the best, in terms of classification accuracy, is the NN, achieving around 93.3 ± 0.7 per cent performance. Other ML evaluation parameters, such as the True Negative and True Positive rates, are discussed in our work. Applying the NN to the unIDs sample, we find that the degeneracy between some astrophysical and DM sources can be partially solved within this methodology. Nonetheless, we conclude that there are no DM source candidates among the pool of 4FGL Fermi-LAT unIDs.
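The evaluation metrics this abstract mentions (classification accuracy, True Positive rate, True Negative rate) follow directly from the confusion matrix of a binary classifier. A minimal sketch; the labels and predictions below are hypothetical, not outputs of the paper's neural network:

```python
def binary_rates(y_true, y_pred):
    """Accuracy, true positive rate (TPR) and true negative rate (TNR)
    for a binary classifier (1 = prospective DM source, 0 = astrophysical)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    accuracy = (tp + tn) / len(y_true)
    tpr = tp / (tp + fn) if tp + fn else float("nan")
    tnr = tn / (tn + fp) if tn + fp else float("nan")
    return accuracy, tpr, tnr

# Hypothetical labels and predictions, for illustration only.
y_true = [1, 1, 1, 0, 0, 0, 0, 1]
y_pred = [1, 1, 0, 0, 0, 1, 0, 1]
acc, tpr, tnr = binary_rates(y_true, y_pred)
print(f"accuracy = {acc:.2f}, TPR = {tpr:.2f}, TNR = {tnr:.2f}")
```

Reporting TPR and TNR alongside accuracy matters here because the two classes (simulated DM vs. observed astrophysical sources) need not be balanced, and a high accuracy alone could mask a poor detection rate for the rarer class.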
|