Ahlburg, P., et al., & Marinas, C. (2020). EUDAQ – a data acquisition software framework for common beam telescopes. J. Instrum., 15(1), P01038 (30pp).
Abstract: EUDAQ is a generic data acquisition software developed for use in conjunction with common beam telescopes at charged particle beam lines. Providing high-precision reference tracks for performance studies of new sensors, beam telescopes are essential for the research and development towards future detectors for high-energy physics. As beam time is a highly limited resource, EUDAQ has been designed with reliability and ease-of-use in mind. It enables flexible integration of different independent devices under test via their specific data acquisition systems into a top-level framework. EUDAQ controls all components globally, handles the data flow centrally and synchronises and records the data streams. Over the past decade, EUDAQ has been deployed as part of a wide range of successful test beam campaigns and detector development applications.
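As a schematic illustration of the central data-flow pattern described in the abstract — several independent device DAQs feeding one collector that merges the streams — the following toy Python sketch uses threads and a queue. All class and detector names are invented; the real EUDAQ is a C++ framework with run control, logging and much more.

```python
import queue
import threading

class Producer(threading.Thread):
    """Toy stand-in for one device-specific DAQ pushing event fragments."""
    def __init__(self, name, n_events, out_q):
        super().__init__()
        self.name, self.n_events, self.out_q = name, n_events, out_q

    def run(self):
        for i in range(self.n_events):
            self.out_q.put({"producer": self.name, "event": i})

def collect(out_q, n_producers, n_events):
    """Central collector: merge the streams by grouping fragments per event."""
    merged = {}
    for _ in range(n_producers * n_events):
        frag = out_q.get()
        merged.setdefault(frag["event"], []).append(frag["producer"])
    return merged

q = queue.Queue()
producers = [Producer(f"det{i}", 5, q) for i in range(3)]
for p in producers:
    p.start()
for p in producers:
    p.join()
events = collect(q, n_producers=3, n_events=5)

# Every event number should carry a fragment from every device.
assert len(events) == 5
assert all(sorted(v) == ["det0", "det1", "det2"] for v in events.values())
```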
|
Double Chooz Collaboration (Abrahao, T., et al.), & Novella, P. (2018). Novel event classification based on spectral analysis of scintillation waveforms in Double Chooz. J. Instrum., 13, P01031 (26pp).
Abstract: Liquid scintillators are a common choice for neutrino physics experiments, but their capability to perform background rejection by scintillation pulse shape discrimination is generally limited in large detectors. This paper describes a novel approach to pulse-shape-based event classification developed in the context of the Double Chooz reactor antineutrino experiment. Unlike previous implementations, this method uses the Fourier power spectra of the scintillation pulse shapes to obtain event-wise information. A classification variable built from spectral information was able to achieve an unprecedented performance, despite the lack of optimization at the detector design level. Several examples of event classification are provided, ranging from differentiation between the detector volumes and efficient rejection of instrumental light noise to some sensitivity to the particle type, such as stopping muons, ortho-positronium formation, alpha particles, as well as electrons and positrons. In combination with other techniques, the method is expected to allow for a versatile and more efficient background rejection in the future, especially if detector optimization is taken into account at the design level.
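As an illustrative sketch of the spectral idea — not the Double Chooz implementation — a classification variable can be built from the normalised Fourier power spectrum of a digitised pulse. The pulse shapes, sampling and `split_bin` threshold below are all invented: a fast, spike-like pulse carries more high-frequency power than a slow scintillation-like decay, so the high-frequency power fraction separates the two.

```python
import numpy as np

def power_spectrum(pulse):
    """Normalised Fourier power spectrum of a digitised pulse."""
    power = np.abs(np.fft.rfft(pulse)) ** 2
    return power / power.sum()

def spectral_variable(pulse, split_bin=10):
    """Toy classification variable: fraction of spectral power above split_bin."""
    return power_spectrum(pulse)[split_bin:].sum()

# Two toy pulse shapes on a 256-sample window: a slow scintillation-like
# decay and a fast, spike-like instrumental pulse (time constants invented).
t = np.arange(256)
slow = np.exp(-t / 30.0)   # slow decay -> power concentrated at low frequency
fast = np.exp(-t / 2.0)    # sharp spike -> broader spectrum

assert spectral_variable(fast) > spectral_variable(slow)
```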
|
Yepes, H. (2012). The ANTARES neutrino detector instrumentation. J. Instrum., 7, C01022 (9pp).
Abstract: ANTARES is currently the largest fully operational neutrino telescope in the Northern Hemisphere. Located in the Mediterranean Sea, it consists of a 3D array of 885 photomultiplier tubes (PMTs) arranged in 12 detection lines (25 storeys each), able to detect the Cherenkov light induced by upgoing relativistic muons produced in the interaction of high-energy cosmic neutrinos with the detector surroundings. Among its physics goals, the search for astrophysical neutrino sources and the indirect detection of dark matter particles coming from the Sun are of particular interest. To reach these goals, good accuracy in track reconstruction is mandatory, so several calibration systems for timing and positioning have been developed. In this contribution we present the design of the detector, the calibration systems, the associated equipment and their performance in track reconstruction.
|
ATLAS Collaboration (Aad, G., et al.), Amoros, G., Cabrera Urban, S., Castillo Gimenez, V., Costa, M. J., Ferrer, A., et al. (2012). A study of the material in the ATLAS inner detector using secondary hadronic interactions. J. Instrum., 7, P01013 (40pp).
Abstract: The ATLAS inner detector is used to reconstruct secondary vertices due to hadronic interactions of primary collision products, thus probing the location and amount of material in the inner region of ATLAS. Data collected in 7 TeV pp collisions at the LHC, with a minimum bias trigger, are used for comparisons with simulated events. The reconstructed secondary vertices have spatial resolutions ranging from ∼200 μm to 1 mm. The overall material description in the simulation is validated to within an experimental uncertainty of about 7%. This will lead to a better understanding of the reconstruction of various objects such as tracks, leptons, jets, and missing transverse momentum.
|
ATLAS Collaboration (Aad, G., et al.), Amos, K. R., Aparisi Pozo, J. A., Bailey, A. J., Cabrera Urban, S., Cardillo, F., et al. (2022). Operation and performance of the ATLAS semiconductor tracker in LHC Run 2. J. Instrum., 17(1), P01013 (56pp).
Abstract: The semiconductor tracker (SCT) is one of the tracking systems for charged particles in the ATLAS detector. It consists of 4088 silicon strip sensor modules. During Run 2 (2015-2018) the Large Hadron Collider delivered an integrated luminosity of 156 fb⁻¹ to the ATLAS experiment at a centre-of-mass proton-proton collision energy of 13 TeV. The instantaneous luminosity and pile-up conditions were far in excess of those assumed in the original design of the SCT detector. Due to improvements to the data acquisition system, the SCT operated stably throughout Run 2. It was available for 99.9% of the integrated luminosity and achieved a data-quality efficiency of 99.85%. Detailed studies have been made of the leakage current in SCT modules and the evolution of the full depletion voltage, which are used to study the impact of radiation damage to the modules.
|
Poley, L., Stolzenberg, U., Schwenker, B., Frey, A., Gottlicher, P., Marinas, C., et al. (2021). Mapping the material distribution of a complex structure in an electron beam. J. Instrum., 16(1), P01010 (33pp).
Abstract: The simulation and analysis of High Energy Physics experiments require a realistic simulation of the detector material and its distribution. The challenge is to describe all active and passive parts of large scale detectors like ATLAS in terms of their size, position and material composition. The common method for estimating the radiation length by weighing individual components, adding up their contributions and averaging the resulting material distribution over extended structures provides a good general estimate, but can deviate significantly from the material actually present. A method has been developed to assess the material distribution of an object under investigation with high spatial resolution, using the reconstructed scattering angles and hit positions of high energy electron tracks traversing the object. The study presented here shows measurements for an extended structure with a highly inhomogeneous material distribution. The structure under investigation is an End-of-Substructure-card prototype designed for the ATLAS Inner Tracker strip tracker – a PCB populated with components of a large range of material budgets and sizes. The measurements presented here summarise the requirements on data samples and reconstructed electron tracks for reliable image reconstruction of large scale, inhomogeneous samples, the choice of pixel size relative to the size of the features under investigation, and a bremsstrahlung correction for high material densities and thicknesses.
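The paper's image-reconstruction chain is far more involved, but the underlying relation between the measured RMS scattering angle and the material budget x/X₀ is the standard Highland approximation. The sketch below inverts it by fixed-point iteration for a single measurement; the 0.5% X₀ sample and 4 GeV beam momentum are invented round-trip inputs, not values from the paper.

```python
import math

def highland_theta0(x_over_X0, p_MeV, beta=1.0):
    """Highland approximation for the RMS projected scattering angle (rad)
    of a unit-charge particle with momentum p_MeV traversing x/X0 of material."""
    return (13.6 / (beta * p_MeV)) * math.sqrt(x_over_X0) * (
        1.0 + 0.038 * math.log(x_over_X0))

def material_budget(theta0_meas, p_MeV, beta=1.0, iterations=20):
    """Invert the Highland formula by fixed-point iteration to estimate x/X0."""
    x = (theta0_meas * beta * p_MeV / 13.6) ** 2   # zeroth-order guess
    for _ in range(iterations):
        corr = 1.0 + 0.038 * math.log(x)
        x = (theta0_meas * beta * p_MeV / (13.6 * corr)) ** 2
    return x

# Round trip: a 0.5% X0 sample in a 4 GeV electron beam (invented numbers).
theta = highland_theta0(0.005, 4000.0)
assert abs(material_budget(theta, 4000.0) - 0.005) < 1e-6
```

The logarithmic correction term makes the forward formula non-invertible in closed form, hence the iteration; it converges quickly because the correction varies slowly with x/X₀.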
|
Pierre Auger Collaboration (Abreu, P., et al.), & Pastor, S. (2011). The Pierre Auger Observatory scaler mode for the study of solar activity modulation of galactic cosmic rays. J. Instrum., 6, P01003 (16pp).
Abstract: Since data-taking began in January 2004, the Pierre Auger Observatory has been recording the count rates of low energy secondary cosmic ray particles for the self-calibration of the ground detectors of its surface detector array. After correcting for atmospheric effects, modulations of galactic cosmic rays due to solar activity and transient events are observed. Temporal variations related to the activity of the heliosphere can be determined with high accuracy due to the high total count rates. In this study, the available data are presented together with an analysis focused on the observation of Forbush decreases, where a strong correlation with neutron monitor data is found.
|
Marco-Hernandez, R. (2011). Development of a beam test telescope based on the Alibava readout system. J. Instrum., 6, C01002 (7pp).
Abstract: A telescope for beam tests has been developed as a result of a collaboration among the University of Liverpool, the Centro Nacional de Microelectronica (CNM) of Barcelona and the Instituto de Fisica Corpuscular (IFIC) of Valencia. This system is intended to carry out both analogue charge collection and spatial resolution measurements with different types of microstrip or pixel silicon detectors in a beam test environment. The telescope has four combined XY-measurement and trigger planes (XYT boards) and can accommodate up to twelve devices under test (DUT boards). The DUT board uses two Beetle ASICs for the readout of chilled silicon detectors; it can operate in a self-triggering mode, features a temperature sensor and can be mounted on a rotary stage. A Peltier element is used for cooling the DUT. Each XYT board measures the track space points using two silicon strip detectors connected to two Beetle ASICs, and it can also trigger on the particle tracks in the beam test. The board includes a CPLD which allows the trigger signal to be synchronised to a common clock, delayed and put in coincidence with other XYT boards. An Alibava mother board is used to read out and control each XYT/DUT board from a common trigger signal and a common clock signal; it has an on-board TDC to time-stamp each trigger. The data collected by each Alibava board are sent to a master board over a local data/address bus following a custom digital protocol. The master board distributes the trigger, clock and reset signals, and it merges the data streams from up to sixteen Alibava boards. It also provides a test channel for testing an XYT or DUT board in a standard mode. This board is implemented with a Xilinx development board and a custom patch board, and it is connected to the DAQ software via 100 Mbit Ethernet.
Track-based alignment software has also been developed for the data obtained with the DAQ software.
|
Folgado, M. G., & Sanz, V. (2022). Exploring the political pulse of a country using data science tools. J. Comput. Soc. Sci., 5, 987–1000.
Abstract: In this paper we illustrate the use of Data Science techniques to analyse complex human communication. In particular, we consider tweets from leaders of political parties as a dynamical proxy to political programmes and ideas. We also study the temporal evolution of their contents as a reaction to specific events. We analyse levels of positive and negative sentiment in the tweets using new tools adapted to social media. We also train a Fully-Connected Neural Network (FCNN) to recognise the political affiliation of a tweet. The FCNN is able to predict the origin of the tweet with a precision in the range of 71-75%, and the political leaning (left or right) with a precision of around 90%. This study is meant to be viewed as an example of how to use Twitter data and different types of Data Science tools for a political analysis.
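As a toy illustration of the classification task only — not the authors' FCNN, their features or their Twitter corpus — the sketch below trains a minimal NumPy fully connected network (one tanh hidden layer, sigmoid output, full-batch gradient descent) on invented bag-of-words data, where two "parties" favour different halves of a tiny vocabulary.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_tweets(n, party):
    """Invented bag-of-words counts: each party favours half the vocabulary."""
    X = rng.poisson(0.3, size=(n, 6)).astype(float)
    cols = slice(0, 3) if party == 0 else slice(3, 6)
    X[:, cols] += rng.poisson(2.0, size=(n, 3))
    return X

X = np.vstack([make_tweets(200, 0), make_tweets(200, 1)])
y = np.array([0] * 200 + [1] * 200)

# One hidden layer of 8 tanh units, sigmoid output for binary affiliation.
W1 = rng.normal(0.0, 0.5, (6, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.5, (8, 1)); b2 = np.zeros(1)

def forward(X):
    h = np.tanh(X @ W1 + b1)
    p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2).ravel()))
    return h, p

lr = 0.1
for _ in range(1000):                       # full-batch gradient descent
    h, p = forward(X)
    grad = (p - y)[:, None] / len(y)        # d(cross-entropy)/d(logit)
    dh = (grad @ W2.T) * (1.0 - h ** 2)     # backprop through tanh
    W2 -= lr * (h.T @ grad); b2 -= lr * grad.sum(0)
    W1 -= lr * (X.T @ dh);   b1 -= lr * dh.sum(0)

_, p = forward(X)
accuracy = ((p > 0.5).astype(int) == y).mean()
assert accuracy > 0.85
```

On this easily separable toy data the network reaches high training accuracy; the 71-75% and ~90% figures quoted in the abstract refer to the much harder real-tweet task.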
|
Mendez, V., Amoros, G., Garcia, F., & Salt, J. (2010). Emergent algorithms for replica location and selection in data grid. Future Gener. Comput. Syst., 26(7), 934–946.
Abstract: Grid infrastructures for e-Science projects are growing in scale. Improvements in data Grid replication algorithms may be critical in many of these infrastructures. This paper presents a decentralized replica optimization service, providing a general Emergent Artificial Intelligence (EAI) algorithm for the problem definition. Our aim is to set up a theoretical framework for emergent heuristics in Grid environments. Further, we describe two EAI approaches, the Particle Swarm Optimization PSO-Grid Multiswarm Federation and the Ant Colony Optimization ACO-Grid Asynchronous Colonies Optimization replica optimization algorithms, with some examples. We also present extended results, with the best performance and scalability features achieved by PSO-Grid Multiswarm Federation.
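The PSO-Grid Multiswarm Federation algorithm itself is considerably more elaborate; the sketch below only illustrates canonical particle swarm optimization (inertia plus cognitive and social terms) applied to an invented replica-selection cost. Each particle's position is a weight vector used to rank candidate storage sites by latency and load, and its cost is the "true" transfer time of the site that ranking would pick; all site numbers are made up.

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented replica-selection problem: normalised latency and load per site.
latency = np.array([0.2, 0.8, 0.5, 0.1])
load = np.array([0.90, 0.10, 0.50, 0.95])
true_time = latency / (1.0 - load)          # toy "true" transfer time

def cost(w):
    """Transfer time of the site selected by ranking with weights w."""
    scores = w[0] * latency + w[1] * load
    return true_time[np.argmin(scores)]

n_particles, dim = 12, 2
pos = rng.uniform(0.0, 1.0, (n_particles, dim))
vel = np.zeros((n_particles, dim))
pbest = pos.copy()
pcost = np.array([cost(p) for p in pos])
gbest = pbest[pcost.argmin()].copy()

for _ in range(40):
    r1, r2 = rng.uniform(size=(2, n_particles, dim))
    # Canonical PSO update: inertia + pull towards personal and global bests.
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = pos + vel
    c = np.array([cost(p) for p in pos])
    improved = c < pcost
    pbest[improved] = pos[improved]
    pcost[improved] = c[improved]
    gbest = pbest[pcost.argmin()].copy()

# The swarm should discover weights that pick the genuinely fastest site.
assert cost(gbest) == true_time.min()
```

The real algorithms in the paper coordinate multiple swarms (or ant colonies) across distributed Grid sites; this single-swarm toy only shows the optimization primitive they build on.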
|