|
Yamamoto, H. (2021). The International Linear Collider Project-Its Physics and Status. Symmetry-Basel, 13(4), 674–15pp.
Abstract: The discovery of the Higgs particle has ushered in a new era of particle physics. Even though the list of members of the standard theory of particle physics is now complete, the shortcomings of the theory have become ever more acute. It is generally considered that the best solution to these problems is an electron-positron collider that can study the Higgs particle with high precision and high sensitivity; namely, a Higgs factory. Among the few candidates for a Higgs factory, the International Linear Collider (ILC) is currently the most advanced in its program. In this article, we review the physics and the project status of the ILC, including its energy expandability.
|
|
|
Hirn, J., Garcia, J. E., Montesinos-Navarro, A., Sanchez-Martin, R., Sanz, V., & Verdu, M. (2022). A deep Generative Artificial Intelligence system to predict species coexistence patterns. Methods Ecol. Evol., 13, 1052–1061.
Abstract: Predicting coexistence patterns is a current challenge to understand diversity maintenance, especially in rich communities where these patterns' complexity is magnified through indirect interactions that prevent their approximation with classical experimental approaches. We explore cutting-edge Machine Learning techniques called Generative Artificial Intelligence (GenAI) to predict species coexistence patterns in vegetation patches, training generative adversarial networks (GAN) and variational AutoEncoders (VAE) that are then used to unravel some of the mechanisms behind community assemblage. The GAN accurately reproduces real patches' species composition and plant species' affinity to different soil types, and the VAE also reaches a high level of accuracy, above 99%. Using the artificially generated patches, we found that high-order interactions tend to suppress the positive effects of low-order interactions. Finally, by reconstructing successional trajectories, we could identify the pioneer species with the largest potential to generate a high diversity of distinct patches in terms of species composition. Understanding the complexity of species coexistence patterns in diverse ecological communities requires new approaches beyond heuristic rules. Generative Artificial Intelligence can be a powerful tool to this end, as it allows us to overcome the inherent dimensionality of this challenge.
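A toy illustration of the kind of evaluation described above (this is not the paper's code; the patch encoding and the near-perfect reconstruction are assumptions made for the example): patches are binary presence/absence vectors over a species pool, and a generative model is judged by how closely its output matches real compositions.

```python
import numpy as np

# Toy illustration (not the paper's code): patches encoded as binary
# presence/absence vectors over a fixed species pool.
rng = np.random.default_rng(0)

n_species = 10
real_patches = rng.integers(0, 2, size=(5, n_species))

# Stand-in for model output: a copy of the real patches with one flipped
# entry, mimicking a near-perfect reconstruction like the VAE's >99%.
generated = real_patches.copy()
generated[0, 0] ^= 1

# Per-entry reconstruction accuracy across all patches and species.
accuracy = (generated == real_patches).mean()
print(f"reconstruction accuracy: {accuracy:.3f}")
```

With one wrong entry out of 50, the per-entry accuracy is 0.98; the paper's reported figure corresponds to this kind of composition-level agreement.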
|
|
|
Poley, L., Blue, A., Bloch, I., Buttar, C., Fadeyev, V., Fernandez-Tejero, J., et al. (2019). Mapping the depleted area of silicon diodes using a micro-focused X-ray beam. J. Instrum., 14, P03024–14pp.
Abstract: For the Phase-II Upgrade of the ATLAS detector at CERN, the current ATLAS Inner Detector will be replaced with the ATLAS Inner Tracker (ITk). The ITk will be an all-silicon detector, consisting of a pixel tracker and a strip tracker. Sensors for the ITk strip tracker are required to have a low leakage current up to bias voltages of 500 V to maintain low noise and power dissipation. In order to minimise sensor leakage currents, particularly in the high-radiation environment inside the ATLAS detector, sensors are foreseen to be operated at low temperatures and to be manufactured from wafers with a high bulk resistivity of several kΩ·cm. Simulations showed the electric field inside sensors with high bulk resistivity to extend towards the sensor edge, which could lead to increased surface currents for narrow dicing edges. In order to map the electric field inside biased silicon sensors with high bulk resistivity, three diodes from ATLAS silicon strip sensor prototype wafers were studied with a monochromatic, micro-focused X-ray beam at the Diamond Light Source (Didcot, U.K.). For all devices under investigation, the electric field inside the diode was mapped and its dependence on the applied bias voltage was studied.
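The relation between bias voltage, bulk resistivity and depleted depth that underlies the abstract can be sketched with the textbook estimate w = sqrt(2·ε·ρ·μ·V) (valid below full depletion); the resistivity and mobility values below are illustrative, not taken from the paper.

```python
import math

# Textbook estimate (not from the paper) of the depleted depth of a
# silicon diode vs bias voltage, valid below full depletion.
EPS_SI = 11.7 * 8.854e-12      # permittivity of silicon [F/m]
MU_E = 0.135                   # electron mobility [m^2/(V*s)]

def depleted_depth_m(bias_v: float, resistivity_ohm_m: float) -> float:
    """Depleted depth [m] for an n-type bulk diode below full depletion."""
    return math.sqrt(2 * EPS_SI * resistivity_ohm_m * MU_E * bias_v)

# A few kOhm*cm bulk (here 4 kOhm*cm = 40 Ohm*m) depletes deeply
# already at modest bias voltages:
for v in (10, 50, 300):
    print(f"{v:4d} V -> {depleted_depth_m(v, 40.0) * 1e6:6.1f} um")
```

This is why high-resistivity wafers deplete (and their field extends) so far at the operating voltages discussed in the paper.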
|
|
|
ATLAS Collaboration (Aad, G., et al.), Alvarez Piqueras, D., Aparisi Pozo, J. A., Bailey, A. J., Barranco Navarro, L., Cabrera Urban, S., et al. (2019). Resolution of the ATLAS muon spectrometer monitored drift tubes in LHC Run 2. J. Instrum., 14, P09011–35pp.
Abstract: The momentum measurement capability of the ATLAS muon spectrometer relies fundamentally on the intrinsic single-hit spatial resolution of the monitored drift tube precision tracking chambers. Optimal resolution is achieved with a dedicated calibration program that addresses the specific operating conditions of the 354 000 high-pressure drift tubes in the spectrometer. The calibrations consist of a set of timing offsets and drift-time-to-drift-distance transfer relations, and result in chamber resolution functions. This paper describes novel algorithms to obtain precision calibrations from data collected by ATLAS in LHC Run 2 and from a gas monitoring chamber, deployed in a dedicated gas facility. The algorithm output consists of a pair of correction constants per chamber which are applied to baseline calibrations, and determined to be valid for the entire ATLAS Run 2. The final single-hit spatial resolution, averaged over 1172 monitored drift tube chambers, is 81.7 ± 2.2 μm.
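The "pair of correction constants per chamber applied to baseline calibrations" can be sketched as follows; the functional form of the baseline r(t) relation and the constant names are assumptions for illustration, not the paper's actual calibration code.

```python
# Illustrative sketch (names and functional form are assumptions, not the
# paper's calibration code): each chamber gets a drift-time offset and a
# scale factor applied on top of a baseline r(t) relation.

def baseline_r_of_t(t_ns: float) -> float:
    """Toy monotone baseline relation: drift radius [mm] vs drift time [ns]."""
    max_radius_mm, max_drift_ns = 14.6, 700.0
    t = min(max(t_ns, 0.0), max_drift_ns)
    return max_radius_mm * (t / max_drift_ns) ** 0.5

def corrected_radius(t_ns: float, t0_offset_ns: float, scale: float) -> float:
    """Apply the per-chamber pair of correction constants to the baseline."""
    return scale * baseline_r_of_t(t_ns - t0_offset_ns)

# Example: a chamber with a 5 ns timing offset and a 1% scale correction.
print(round(corrected_radius(355.0, 5.0, 1.01), 3))
```

The point of the two-constant scheme is that a full recalibration of every chamber is replaced by a cheap per-chamber adjustment of a shared baseline.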
|
|
|
ATLAS Collaboration (Aad, G., et al.), Alvarez Piqueras, D., Aparisi Pozo, J. A., Bailey, A. J., Cabrera Urban, S., Castillo, F. L., et al. (2019). Electron and photon performance measurements with the ATLAS detector using the 2015-2017 LHC proton-proton collision data. J. Instrum., 14, P12006–69pp.
Abstract: This paper describes the reconstruction of electrons and photons with the ATLAS detector, employed for measurements and searches exploiting the complete LHC Run 2 dataset. An improved energy clustering algorithm is introduced, and its implications for the measurement and identification of prompt electrons and photons are discussed in detail. Corrections and calibrations that affect performance, including energy calibration, identification and isolation efficiencies, and the measurement of the charge of reconstructed electron candidates are determined using up to 81 fb⁻¹ of proton-proton collision data collected at √s = 13 TeV between 2015 and 2017.
|
|
|
Navarro-Salas, J., & Pla, S. (2022). Particle Creation and the Schwinger Model. Symmetry-Basel, 14(11), 2435–9pp.
Abstract: We study the particle creation process in the Schwinger model coupled with an external classical source. One can approach the problem by taking advantage of the fact that the full quantized model is solvable and equivalent to a (massive) gauge field with a non-local effective action. Alternatively, one can also face the problem by following the standard semiclassical route. This means quantizing the massless Dirac field and considering the electromagnetic field as a classical background. We evaluate the energy created by a generic, homogeneous, and time-dependent source. The results match exactly in both approaches. This proves in a very direct and economical way the validity of the semiclassical approach for the (massless) Schwinger model, in agreement with a previous analysis based on the linear response equation. Our discussion suggests that a similar analysis for the massive Schwinger model could be used as a non-trivial laboratory to confront a fully quantized solvable model with its semiclassical approximation, therefore mimicking the long-standing confrontation of quantum gravity with quantum field theory in curved spacetime.
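The solvability that the abstract leans on is standard textbook background on the Schwinger model (not a result quoted from the paper) and can be stated compactly:

```latex
% Textbook background on the Schwinger model (not the paper's results):
% integrating out the massless Dirac fermion in two-dimensional QED gives
% the photon a mass through the exact vacuum polarization
\Pi^{\mu\nu}(q) = \frac{e^2}{\pi}\left(g^{\mu\nu} - \frac{q^\mu q^\nu}{q^2}\right),
% so the fully quantized theory is equivalent to a free massive boson with
m_\gamma^2 = \frac{e^2}{\pi}.
```

It is this exact equivalence to a massive gauge field with a non-local effective action that lets the authors compare the full quantum result with the semiclassical one without approximation.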
|
|
|
ATLAS Collaboration (Aad, G., et al.), Aparisi Pozo, J. A., Bailey, A. J., Barranco Navarro, L., Cabrera Urban, S., Castillo, F. L., et al. (2020). ATLAS data quality operations and performance for 2015-2018 data-taking. J. Instrum., 15(4), P04003–43pp.
Abstract: The ATLAS detector at the Large Hadron Collider reads out particle collision data from over 100 million electronic channels at a rate of approximately 100 kHz, with a recording rate for physics events of approximately 1 kHz. Before being certified for physics analysis at computer centres worldwide, the data must be scrutinised to ensure they are clean from any hardware or software related issues that may compromise their integrity. Prompt identification of these issues permits fast action to investigate, correct and potentially prevent future such problems that could render the data unusable. This is achieved through the monitoring of detector-level quantities and reconstructed collision event characteristics at key stages of the data processing chain. This paper presents the monitoring and assessment procedures in place at ATLAS during 2015-2018 data-taking. Through the continuous improvement of operational procedures, ATLAS achieved a high data quality efficiency, with 95.6% of the recorded proton-proton collision data collected at √s = 13 TeV certified for physics analysis.
|
|
|
KM3NeT Collaboration (Aiello, S., et al.), Alves Garre, S., Calvo, D., Carretero, V., Colomer, M., Corredoira, I., et al. (2020). Event reconstruction for KM3NeT/ORCA using convolutional neural networks. J. Instrum., 15(10), P10005–39pp.
Abstract: The KM3NeT research infrastructure is currently under construction at two locations in the Mediterranean Sea. The KM3NeT/ORCA water-Cherenkov neutrino detector off the French coast will instrument several megatons of seawater with photosensors. Its main objective is the determination of the neutrino mass ordering. This work aims at demonstrating the general applicability of deep convolutional neural networks to neutrino telescopes, using simulated datasets for the KM3NeT/ORCA detector as an example. To this end, the networks are employed to achieve reconstruction and classification tasks that constitute an alternative to the analysis pipeline presented for KM3NeT/ORCA in the KM3NeT Letter of Intent. They are used to infer event reconstruction estimates for the energy, the direction, and the interaction point of incident neutrinos. The spatial distribution of Cherenkov light generated by charged particles induced in neutrino interactions is classified as shower- or track-like, and the main background processes associated with the detection of atmospheric neutrinos are recognized. Performance comparisons to machine-learning classification and maximum-likelihood reconstruction algorithms previously developed for KM3NeT/ORCA are provided. It is shown that this application of deep convolutional neural networks to simulated datasets for a large-volume neutrino telescope yields competitive reconstruction results and performance improvements with respect to classical approaches.
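The building block of the shower/track classification described above is the convolution itself; the following minimal sketch (not KM3NeT's actual network, and with a made-up hit pattern) shows how an oriented filter responds strongly to an elongated, track-like light pattern.

```python
import numpy as np

# Minimal illustration of the CNN building block (not KM3NeT's network):
# a 2D convolution scans a detector "image" of photon hits; elongated
# (track-like) vs compact (shower-like) patterns excite oriented filters
# differently.

def conv2d_valid(image: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    """Plain 'valid' 2D cross-correlation, the core op of a conv layer."""
    kh, kw = kernel.shape
    oh, ow = image.shape[0] - kh + 1, image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# A horizontal "track" of hits and a 1x3 horizontal-line filter.
image = np.zeros((5, 5))
image[2, :] = 1.0
kernel = np.array([[1.0, 1.0, 1.0]])

response = conv2d_valid(image, kernel)
print(response.max())  # strongest response along the track row
```

A real network stacks many such filters with learned weights and nonlinearities; the principle of pattern-selective responses is the same.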
|
|
|
DUNE Collaboration (Abi, B., et al.), Antonova, M., Barenboim, G., Cervera-Villanueva, A., De Romeri, V., Fernandez Menendez, P., et al. (2020). First results on ProtoDUNE-SP liquid argon time projection chamber performance from a beam test at the CERN Neutrino Platform. J. Instrum., 15(12), P12004–100pp.
Abstract: The ProtoDUNE-SP detector is a single-phase liquid argon time projection chamber with an active volume of 7.2 × 6.1 × 7.0 m³. It is installed at the CERN Neutrino Platform in a specially constructed beam line that delivers charged pions, kaons, protons, muons and electrons with momenta in the range 0.3 GeV/c to 7 GeV/c. Beam line instrumentation provides accurate momentum measurements and particle identification. The ProtoDUNE-SP detector is a prototype for the first far detector module of the Deep Underground Neutrino Experiment, and it incorporates full-size components as designed for that module. This paper describes the beam line, the time projection chamber, the photon detectors, the cosmic-ray tagger, the signal processing and particle reconstruction. It presents the first results on ProtoDUNE-SP's performance, including noise and gain measurements, dE/dx calibration for muons, protons, pions and electrons, drift electron lifetime measurements, and photon detector noise, signal sensitivity and time resolution measurements. The measured values meet or exceed the specifications for the DUNE far detector, in several cases by large margins. ProtoDUNE-SP's successful operation starting in 2018 and its production of large samples of high-quality data demonstrate the effectiveness of the single-phase far detector design.
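The drift electron lifetime measurement mentioned above rests on a simple exponential attenuation model; the numbers below are made up for illustration, not ProtoDUNE-SP measurements.

```python
import math

# Illustrative sketch (made-up numbers, not ProtoDUNE-SP data): free
# electrons drifting through liquid argon are captured by impurities, so
# the collected charge falls as Q(t) = Q0 * exp(-t / tau). Comparing the
# charge at two drift times yields the electron lifetime tau.

def lifetime_ms(q_near: float, q_far: float, dt_ms: float) -> float:
    """Electron lifetime from charges measured dt_ms of drift apart."""
    return dt_ms / math.log(q_near / q_far)

# Example: 2.25 ms of extra drift attenuates the signal from 100 to 80 (a.u.).
tau = lifetime_ms(100.0, 80.0, 2.25)
print(f"tau = {tau:.2f} ms")
```

A long lifetime relative to the maximum drift time is what makes the dE/dx calibration across the full drift volume possible.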
|
|
|
Ahlburg, P., et al., & Marinas, C. (2020). EUDAQ – a data acquisition software framework for common beam telescopes. J. Instrum., 15(1), P01038–30pp.
Abstract: EUDAQ is a generic data acquisition software developed for use in conjunction with common beam telescopes at charged particle beam lines. Providing high-precision reference tracks for performance studies of new sensors, beam telescopes are essential for the research and development towards future detectors for high-energy physics. As beam time is a highly limited resource, EUDAQ has been designed with reliability and ease-of-use in mind. It enables flexible integration of different independent devices under test via their specific data acquisition systems into a top-level framework. EUDAQ controls all components globally, handles the data flow centrally and synchronises and records the data streams. Over the past decade, EUDAQ has been deployed as part of a wide range of successful test beam campaigns and detector development applications.
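The central-control and stream-merging pattern the abstract describes can be sketched as below; the class and function names are invented for illustration and are not EUDAQ's actual API.

```python
import queue
import threading

# Minimal sketch of the central-collector pattern EUDAQ implements (names
# are invented, not EUDAQ's API): independent producers push event
# fragments into a shared queue, and one collector merges them into
# synchronized records keyed by event number.

class Producer(threading.Thread):
    def __init__(self, name: str, out_q: queue.Queue, n_events: int):
        super().__init__()
        self.src, self.out_q, self.n_events = name, out_q, n_events

    def run(self):
        for ev in range(self.n_events):
            # Each fragment carries its source and event number for merging.
            self.out_q.put((ev, self.src, f"data-{self.src}-{ev}"))

def collect(out_q: queue.Queue, n_producers: int, n_events: int) -> dict:
    """Merge fragments from all producers into per-event records."""
    events: dict = {ev: {} for ev in range(n_events)}
    for _ in range(n_producers * n_events):
        ev, src, payload = out_q.get()
        events[ev][src] = payload
    return events

q: queue.Queue = queue.Queue()
producers = [Producer(name, q, 3) for name in ("telescope", "dut")]
for p in producers:
    p.start()
for p in producers:
    p.join()

merged = collect(q, n_producers=2, n_events=3)
print(sorted(merged[0]))  # each event holds one fragment per producer
```

Keeping devices under test behind their own producers is what gives the framework its flexibility: a new detector only has to emit fragments, while run control and synchronization stay central.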
|
|