XENON Collaboration (Aprile, E. et al.), & Orrigo, S. E. A. (2015). Exclusion of leptophilic dark matter models using XENON100 electronic recoil data. Science, 349(6250), 851–854.
Abstract: Laboratory experiments searching for galactic dark matter particles scattering off nuclei have so far not been able to establish a discovery. We use data from the XENON100 experiment to search for dark matter interacting with electrons. With no evidence for a signal above the low background of our experiment, we exclude a variety of representative dark matter models that would induce electronic recoils. For axial-vector couplings to electrons, we exclude cross sections above 6 × 10⁻³⁵ cm² for particle masses of mχ = 2 GeV/c². Independent of the dark matter halo, we exclude leptophilic models as an explanation for the long-standing DAMA/LIBRA signal, such as couplings to electrons through axial-vector interactions at a 4.4σ confidence level, mirror dark matter at 3.6σ, and luminous dark matter at 4.6σ.
|
Barrientos, D., Gonzalez, V., Bellato, M., Gadea, A., Bazzacco, D., Blasco, J. M., et al. (2013). Multiple Register Synchronization With a High-Speed Serial Link Using the Aurora Protocol. IEEE Trans. Nucl. Sci., 60(5), 3521–3525.
Abstract: In this work, the development and characterization of a multiple synchronous register interface communicating over a high-speed serial link using the Aurora protocol is presented. A detailed description of the development process, the characterization methods, and the hardware test benches is also included. This interface will implement the slow control buses of the digitizer cards for the second generation of electronics for the Advanced GAmma Tracking Array (AGATA).
|
Barrientos, D., Bellato, M., Bazzacco, D., Bortolato, D., Cocconi, P., Gadea, A., et al. (2015). Performance of the Fully Digital FPGA-Based Front-End Electronics for the GALILEO Array. IEEE Trans. Nucl. Sci., 62(6), 3134–3139.
Abstract: In this work we present the architecture and results of a fully digital Front-End Electronics (FEE) readout system developed for the GALILEO array. The FEE system, developed in collaboration with the Advanced Gamma Tracking Array (AGATA) collaboration, is composed of three main blocks: preamplifiers, digitizers and preprocessing electronics. The slow control system contains a custom Linux driver, a dynamic library and a server implementing network services. This work presents the first results of the digital FEE system coupled with a GALILEO germanium detector, which has demonstrated the capability to achieve an energy resolution of 1.53% at an energy of 1.33 MeV, similar to that obtained with a conventional analog system. While maintaining good energy resolution, the digital electronics will make it possible to instrument the full GALILEO array with a versatile system offering high integration, low power consumption and low cost.
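For orientation, the quoted relative resolution translates into an absolute peak width of roughly 20 keV; the short calculation below is not taken from the paper, it is just standard arithmetic on the quoted numbers, assuming the 1332.5 keV 60Co line as the reference near 1.33 MeV.

# Convert the quoted relative energy resolution into an absolute FWHM.
peak_energy_kev = 1332.5        # assumed 60Co reference line near 1.33 MeV
relative_fwhm = 0.0153          # 1.53% resolution quoted for the digital FEE
fwhm_kev = relative_fwhm * peak_energy_kev
print(f"FWHM at {peak_energy_kev} keV: {fwhm_kev:.1f} keV")   # about 20.4 keV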
|
Lloret, E., Fernandez, A., Trbojevich, R., Arnau, J., & Picouet, P. A. (2016). Relevance of nanocomposite packaging on the stability of vacuum-packed dry cured ham. Meat Sci., 118, 8–14.
Abstract: In this study, the effects of a novel high-barrier multilayer polyamide film containing dispersed nanoclays (PAN) on the stability of vacuum-packed dry-cured ham were investigated during 90 days of refrigerated storage, in comparison with a non-modified multilayer polyamide (PA) and a commercial high-barrier film. Characteristic bands of the mineral in FT-IR spectra confirmed the presence of nanoclays in PAN, enhancing the oxygen transmission barrier properties and UV protection. Packaging in PAN films did not cause significant changes in colour or lipid oxidation during prolonged storage of vacuum-packed dry-cured ham. Larger oxygen transmission rates in PA films caused changes in CIE b* during refrigerated storage. Ham quality was not affected by light exposure during the 90 days, and only curing had a significant benefit on colour and TBARS, with cured samples being more stable during storage in all the packages used. Packaging of dry-cured ham in PAN was equivalent to commercial high-barrier films.
|
Folgado, M. G., & Sanz, V. (2022). Exploring the political pulse of a country using data science tools. J. Comput. Soc. Sci., 5, 987–1000.
Abstract: In this paper we illustrate the use of Data Science techniques to analyse complex human communication. In particular, we consider tweets from leaders of political parties as a dynamical proxy to political programmes and ideas. We also study the temporal evolution of their contents as a reaction to specific events. We analyse levels of positive and negative sentiment in the tweets using new tools adapted to social media. We also train a Fully-Connected Neural Network (FCNN) to recognise the political affiliation of a tweet. The FCNN is able to predict the origin of the tweet with a precision in the range of 71-75%, and the political leaning (left or right) with a precision of around 90%. This study is meant to be viewed as an example of how to use Twitter data and different types of Data Science tools for a political analysis.
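A minimal sketch of the kind of text classifier described above: a bag-of-words representation feeding a small fully connected network. The vectorizer, network size, labels and example tweets are illustrative assumptions, not the authors' actual pipeline or data.

# Hypothetical sketch: classify short texts by political leaning with a small
# fully connected network on TF-IDF features (not the paper's code).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline

# Toy training data; real inputs would be party leaders' tweets and labels.
tweets = ["lower taxes and free markets", "public healthcare for everyone",
          "secure the borders now", "invest in green jobs and equality"]
labels = ["right", "left", "right", "left"]

model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),          # bag-of-words features
    MLPClassifier(hidden_layer_sizes=(64, 32),    # small fully connected net
                  max_iter=500, random_state=0),
)
model.fit(tweets, labels)
print(model.predict(["cut spending and deregulate"]))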
|
Boronat, M., Marinas, C., Frey, A., Garcia, I., Schwenker, B., Vos, M., et al. (2015). Physical Limitations to the Spatial Resolution of Solid-State Detectors. IEEE Trans. Nucl. Sci., 62(1), 381–386.
Abstract: In this paper we explore the effect of delta-ray emission and fluctuations in the signal deposition on the detection of charged particles in silicon-based detectors. We show that these two effects ultimately limit the resolution that can be achieved by interpolation of the signal in finely segmented position-sensitive solid-state devices.
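The resolution limit discussed here concerns position estimates obtained by interpolating the charge shared among neighbouring strips or pixels. A toy Monte Carlo like the one below, with an assumed pitch, charge-sharing width and fluctuation model that are not the paper's simulation, shows how signal fluctuations degrade a simple centre-of-gravity estimate.

# Toy illustration: centre-of-gravity position estimate on a strip detector,
# with Gaussian charge sharing and per-strip signal fluctuations (assumed model).
import numpy as np

rng = np.random.default_rng(0)
pitch, sharing_sigma, noise_frac = 50.0, 20.0, 0.2   # microns; assumptions
strips = np.arange(-2, 3) * pitch                    # 5 strips around the hit

residuals = []
for _ in range(10000):
    true_x = rng.uniform(-pitch / 2, pitch / 2)
    # Mean charge fraction per strip from Gaussian sharing, plus fluctuations.
    q = np.exp(-0.5 * ((strips - true_x) / sharing_sigma) ** 2)
    q *= rng.normal(1.0, noise_frac, size=q.size)    # signal fluctuations
    q = np.clip(q, 0, None)
    est_x = np.sum(strips * q) / np.sum(q)           # centre of gravity
    residuals.append(est_x - true_x)

print(f"centroid resolution ~ {np.std(residuals):.1f} um")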
|
Bouhova-Thacker, E., Kostyukhin, V., Koffas, T., Liebig, W., Limper, M., Piacquadio, G. N., et al. (2010). Expected Performance of Vertex Reconstruction in the ATLAS Experiment at the LHC. IEEE Trans. Nucl. Sci., 57(2), 760–767.
Abstract: In the harsh environment of the Large Hadron Collider at CERN (design luminosity of 10³⁴ cm⁻² s⁻¹) efficient reconstruction of vertices is crucial for many physics analyses. Described in this paper is the expected performance of the vertex reconstruction used in the ATLAS experiment. The algorithms for the reconstruction of primary and secondary vertices as well as for finding photon conversions and vertex reconstruction in jets are described. The implementation of vertex algorithms, which follows a very modular design based on object-oriented C++, is presented. A user-friendly concept allows event reconstruction and physics analyses to compare and optimize their choice among different vertex reconstruction strategies. The performance of the implemented algorithms has been studied on a variety of Monte Carlo samples and results are presented.
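As a rough illustration of the outlier-robust fitting strategies commonly used for primary vertex reconstruction (a generic sketch under assumed inputs, not the ATLAS implementation), one can fit the vertex position along the beam axis by iteratively reweighting the tracks' longitudinal impact points:

# Generic sketch of an iteratively reweighted 1D vertex fit along the beam (z)
# axis; track z0 values and resolutions below are toy inputs, not ATLAS data.
import numpy as np

def fit_vertex_z(z0, sigma, n_iter=10, chi2_cut=9.0):
    """Weighted mean of track z0, down-weighting outliers (e.g. pile-up tracks)."""
    z_v = np.median(z0)                                   # robust starting point
    for _ in range(n_iter):
        chi2 = ((z0 - z_v) / sigma) ** 2
        w = np.where(chi2 < chi2_cut, 1.0 / sigma**2, 0.0)  # hard outlier cut
        if w.sum() == 0:
            break
        z_v = np.sum(w * z0) / np.sum(w)
    return z_v

rng = np.random.default_rng(1)
z0 = np.r_[rng.normal(12.0, 0.05, 20),    # tracks from the vertex at z = 12 mm
           rng.normal(40.0, 0.05, 5)]     # tracks from another interaction
sigma = np.full(z0.size, 0.05)
print(f"fitted z ~ {fit_vertex_z(z0, sigma):.2f} mm")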
|
Cabello, J., Torres-Espallardo, I., Gillam, J. E., & Rafecas, M. (2013). PET Reconstruction From Truncated Projections Using Total-Variation Regularization for Hadron Therapy Monitoring. IEEE Trans. Nucl. Sci., 60(5), 3364–3372.
Abstract: Hadron therapy exploits the properties of ion beams to treat tumors by maximizing the dose released to the target and sparing healthy tissue. With hadron beams, the dose distribution shows a relatively low entrance dose which rises sharply at the end of the range, providing the characteristic Bragg peak that drops quickly thereafter. Knowing where the delivered dose profile ends, i.e. the location of the Bragg peak, is of critical importance in order not to damage surrounding healthy tissue and to avoid underdosage of the target. During hadron therapy, short-lived β⁺-emitters are produced along the beam path, their distribution being correlated with the delivered dose. Following positron annihilation, two photons are emitted, which can be detected using a positron emission tomography (PET) scanner. The low yield of emitters, their short half-life, and the wash-out from the target region make the use of PET, even only a few minutes after hadron irradiation, a challenging application. In-beam PET represents a potential candidate to estimate the distribution of β⁺-emitters during or immediately after irradiation, at the cost of truncation effects and degraded image quality due to the partial-ring geometry required of the PET scanner. Time-of-flight (ToF) information can potentially be used to compensate for truncation effects and to enhance image contrast. However, the highly demanding timing performance required in ToF-PET makes this option costly. Alternatively, the use of maximum-a-posteriori expectation-maximization (MAP-EM), including total variation (TV) in the cost function, produces images with low noise while preserving spatial resolution. In this paper, we compare data reconstructed with maximum-likelihood expectation-maximization (ML-EM) and MAP-EM using TV as prior, and the impact of including ToF information, from data acquired with a complete and a partial-ring PET scanner, of simulated hadron beams interacting with a polymethyl methacrylate (PMMA) target. The results show that MAP-EM, in the absence of ToF information, produces lower-noise images that agree more closely with the simulated β⁺ distributions than ML-EM with ToF information on the order of 200-600 ps. The investigation is extended to the combination of MAP-EM and ToF information to study the limit of performance using both approaches.
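A minimal 1D sketch of the two reconstruction approaches being compared, ML-EM and a one-step-late MAP-EM variant with a TV penalty; the random system matrix, box-shaped emitter profile, penalty weight and one-step-late formulation are illustrative assumptions, not the paper's exact algorithm.

# Toy 1D emission reconstruction: ML-EM versus one-step-late MAP-EM with a TV
# prior. Purely illustrative, not the paper's code.
import numpy as np

rng = np.random.default_rng(0)
n_pix, n_bins = 32, 64
A = rng.uniform(0.0, 1.0, (n_bins, n_pix))      # assumed system matrix
x_true = np.zeros(n_pix); x_true[10:20] = 5.0   # toy beta+ activity profile
y = rng.poisson(A @ x_true).astype(float)       # Poisson projection data

def tv_subgradient(x):
    """Subgradient of sum_j |x_{j+1} - x_j| with respect to each pixel."""
    d = np.sign(np.diff(x))
    return np.concatenate(([0.0], d)) - np.concatenate((d, [0.0]))

def reconstruct(y, A, beta=0.0, n_iter=200, eps=1e-9):
    """ML-EM for beta=0; one-step-late MAP-EM with a TV prior for beta>0."""
    x = np.ones(A.shape[1])
    sens = A.sum(axis=0)                         # sensitivity image, A^T 1
    for _ in range(n_iter):
        ratio = y / (A @ x + eps)                # measured / estimated projections
        x *= (A.T @ ratio) / (sens + beta * tv_subgradient(x) + eps)
    return x

x_mlem = reconstruct(y, A)                       # plain ML-EM
x_map = reconstruct(y, A, beta=2.0)              # smoother MAP-EM/TV estimate
print(np.round(x_map, 1))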
|
LHCb Collaboration (Aaij, R. et al.), Henry, L., Jashal, B. K., Martinez-Vidal, F., Oyanguren, A., Remon Alepuz, C., et al. (2021). Evidence of a J/ψΛ structure and observation of excited Ξ⁻ states in the Ξb⁻ → J/ψΛK⁻ decay. Sci. Bull., 66(13), 1278–1287.
Abstract: First evidence of a structure in the J/ψΛ invariant mass distribution is obtained from an amplitude analysis of Ξb⁻ → J/ψΛK⁻ decays. The observed structure is consistent with being due to a charmonium pentaquark with strangeness, with a significance of 3.1σ including systematic uncertainties and the look-elsewhere effect. Its mass and width are determined to be 4458.8 ± 2.9 +4.7/-1.1 MeV and 17.3 ± 6.5 +8.0/-5.7 MeV, respectively, where the quoted uncertainties are statistical and systematic. The structure is also consistent with being due to two resonances. In addition, the narrow excited Ξ⁻ states, Ξ(1690)⁻ and Ξ(1820)⁻, are seen for the first time in a Ξb⁻ decay, and their masses and widths are measured with improved precision. The analysis is performed using pp collision data corresponding to a total integrated luminosity of 9 fb⁻¹, collected with the LHCb experiment at centre-of-mass energies of 7, 8 and 13 TeV.
|
Tetrault, M. A., Oliver, J. F., Bergeron, M., Lecomte, R., & Fontaine, R. (2010). Real Time Coincidence Detection Engine for High Count Rate Timestamp Based PET. IEEE Trans. Nucl. Sci., 57(1), 117–124.
Abstract: Coincidence engines follow two main implementation approaches: timestamp-based systems and AND-gate-based systems. The latter have been more widespread in recent years because of their lower cost and high efficiency. However, they are highly dependent on the selected electronic components, they have limited flexibility once assembled, and they are customized to fit a specific scanner's geometry. Timestamp-based systems are gathering more attention lately, especially with high-channel-count fully digital systems. These new systems must however cope with high singles count rates. One option is to record every detected event and postpone coincidence detection offline. For daily-use systems, a real-time engine is preferable because it dramatically reduces data volume and hence image preprocessing time and raw data management. This paper presents the timestamp-based coincidence engine for the LabPET™, a small-animal PET scanner with up to 4608 individual-readout avalanche photodiode channels. The engine can handle up to 100 million single events per second and has extensive flexibility because it resides in programmable logic devices. It can be adapted to any detector geometry or channel count, can be ported to newer, faster programmable devices, and can have extra modules added to take advantage of scanner-specific features. Finally, the user can select between a full processing mode for imaging protocols and a minimum processing mode to study different approaches to coincidence detection with offline software.
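A timestamp-based coincidence search of the kind described above can be illustrated with a single pass over time-sorted singles, pairing events whose timestamps fall within a coincidence window. The window width and event fields below are assumptions for the sketch, not the LabPET firmware logic.

# Illustrative sketch of timestamp-based coincidence sorting, not the LabPET
# implementation: scan time-ordered singles and pair events within a window.

def find_coincidences(singles, window_ns=10.0):
    """singles: list of (timestamp_ns, channel) tuples, not necessarily sorted."""
    singles = sorted(singles)                    # time-order the stream
    pairs = []
    i = 0
    while i < len(singles) - 1:
        t0, ch0 = singles[i]
        t1, ch1 = singles[i + 1]
        if t1 - t0 <= window_ns and ch0 != ch1:  # prompt coincidence, two channels
            pairs.append((singles[i], singles[i + 1]))
            i += 2                               # consume both singles
        else:
            i += 1                               # drop the earlier single
    return pairs

events = [(12.0, 3), (15.0, 41), (80.0, 7), (200.0, 10), (204.0, 22)]
print(find_coincidences(events))
# -> [((12.0, 3), (15.0, 41)), ((200.0, 10), (204.0, 22))]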
|