Aliaga, R. J., & Guirao, A. J. (2019). On the preserved extremal structure of Lipschitz-free spaces. Studia Math., 245(1), 1–14.
Abstract: We characterize the preserved extreme points of the unit ball of Lipschitz-free spaces F(X) in terms of simple geometric conditions on the underlying metric space (X, d). Namely, the preserved extreme points are the elementary molecules corresponding to pairs of points p, q in X such that the triangle inequality d(p, q) ≤ d(p, r) + d(q, r) is uniformly strict for r away from p, q. For compact X, this condition reduces to the triangle inequality being strict. As a consequence, we give an affirmative answer to a conjecture of N. Weaver that compact spaces are concave if and only if they have no triple of metrically aligned points, and we show that all extreme points are preserved for several classes of compact metric spaces X, including Hölder and countable compacta.
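The geometric condition in the abstract can be written out explicitly. The following formulation, using the standard notation m_{pq} for the elementary molecule of a pair of points, is one natural reading of "uniformly strict for r away from p, q"; the exact quantifiers used in the paper may differ:

```latex
% Elementary molecule of a pair p \neq q in X, where \delta_x \in \mathcal{F}(X)
% denotes the evaluation functional at x:
\[
  m_{pq} \;=\; \frac{\delta_p - \delta_q}{d(p,q)} .
\]
% Uniform strictness of the triangle inequality away from p and q:
\[
  \forall\, \varepsilon > 0 \;\; \exists\, \delta > 0 : \quad
  \min\{ d(p,r),\, d(q,r) \} \geq \varepsilon
  \;\Longrightarrow\;
  d(p,r) + d(q,r) \;\geq\; d(p,q) + \delta .
\]
```

For compact X this uniform version collapses to plain strictness, d(p, r) + d(q, r) > d(p, q) for all r distinct from p and q, which is the condition quoted in the abstract.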
Conde, D., Castillo, F. L., Escobar, C., García, C., Garcia Navarro, J. E., Sanz, V., et al. (2023). Forecasting Geomagnetic Storm Disturbances and Their Uncertainties Using Deep Learning. Space Weather, 21(11), e2023SW003474–27pp.
Abstract: Severe space weather produced by disturbed conditions on the Sun results in harmful effects both for humans in space and on high-latitude flights, and for technological systems such as spacecraft or communications. In addition, geomagnetically induced currents (GICs) flowing in long ground-based conductors, such as power networks, potentially threaten critical infrastructures on Earth. The first step in developing an alarm system against GICs is to forecast them. This is a challenging task given the highly non-linear dependence of the magnetosphere's response on these perturbations. In the last few years, modern machine-learning models have been shown to be very good at predicting magnetic activity indices. However, such complex models are on the one hand difficult to tune, and on the other hand known to carry potentially large prediction uncertainties, which are generally difficult to estimate. In this work we aim to predict the SYM-H index characterizing geomagnetic storms multiple hours ahead, using public interplanetary magnetic field (IMF) data from the Sun-Earth L1 Lagrange point together with SYM-H data. We implement a type of machine-learning model called a long short-term memory (LSTM) network. Our goal is to estimate the prediction uncertainties of a deep-learning model in the context of forecasting the SYM-H index; these uncertainties will be essential for setting reliable alarm thresholds. The resulting uncertainties turn out to be sizable at the critical stages of geomagnetic storms. Our methodology also includes an efficient optimization of important hyper-parameters of the LSTM network, as well as robustness tests.
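The forecasting setup described here (windows of past IMF and SYM-H samples in, a SYM-H value several hours ahead out, with an uncertainty band taken from an ensemble of stochastic forward passes) can be sketched in a few lines. The window length, horizon, and the way the stochastic passes are supplied below are illustrative assumptions, not the paper's actual hyper-parameters or architecture:

```python
import numpy as np

def make_windows(features, target, lookback=120, horizon=2):
    """Build (input, label) pairs for multi-step-ahead forecasting.

    features: (T, F) array, e.g. IMF components plus past SYM-H
    target:   (T,)  array of SYM-H values
    Each sample uses `lookback` past steps to predict the target
    `horizon` steps ahead.
    """
    X, y = [], []
    for t in range(lookback, len(target) - horizon):
        X.append(features[t - lookback:t])
        y.append(target[t + horizon])
    return np.asarray(X), np.asarray(y)

def ensemble_uncertainty(predictions):
    """predictions: (n_passes, N) forecasts from stochastic forward
    passes (e.g. a network evaluated with dropout kept active at
    inference time). Returns the mean forecast and a 1-sigma band."""
    preds = np.asarray(predictions, dtype=float)
    return preds.mean(axis=0), preds.std(axis=0)
```

In practice the stochastic passes would come from the trained LSTM itself; here they are simply stacked arrays, which is enough to show how a per-timestep uncertainty band is obtained.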
SCiMMA and SNEWS Collaborations (Baxter, A. L., et al.), & Colomer, M. (2022). Collaborative experience between scientific software projects using Agile Scrum development. Softw.-Pract. Exp., 52, 2077–2096.
Abstract: Developing sustainable software for the scientific community requires expertise in software engineering and domain science. This can be challenging due to the unique needs of scientific software, the insufficient resources for software engineering practices in the scientific community, and the complexity of developing for evolving scientific contexts. While open-source software can partially address these concerns, it can introduce complicating dependencies and delay development. These issues can be reduced if scientists and software developers collaborate. We present a case study wherein scientists from the SuperNova Early Warning System collaborated with software developers from the Scalable Cyberinfrastructure for Multi-Messenger Astrophysics project. The collaboration addressed the difficulties of open-source software development, but presented additional risks to each team. For the scientists, there was a concern of relying on external systems and lacking control in the development process. For the developers, there was a risk in supporting a user group while maintaining core development. These issues were mitigated by creating a second Agile Scrum framework in parallel with the developers' ongoing Agile Scrum process. This Agile collaboration promoted communication, ensured that the scientists had an active role in development, and allowed the developers to evaluate and implement the scientists' software requirements. The collaboration provided benefits for each group: the scientists accelerated their development by using an existing platform, and the developers used the scientists' use case to improve their systems. This case study suggests that scientists and software developers can avoid scientific computing issues by collaborating, and that Agile Scrum methods can address emergent concerns.
Esteve, R., Toledo, J. F., Herrero, V., Simon, A., Monrabal, F., Alvarez, V., et al. (2021). The Event Detection System in the NEXT-White Detector. Sensors, 21(2), 673–18pp.
Abstract: This article describes the event detection system of the NEXT-White detector, a 5 kg high-pressure xenon TPC with electroluminescent amplification, located in the Laboratorio Subterráneo de Canfranc (LSC), Spain. The detector is based on a plane of photomultipliers (PMTs) for energy measurements and a silicon photomultiplier (SiPM) tracking plane for offline topological event filtering. The event detection system, based on the SRS-ATCA data acquisition system developed in the framework of the CERN RD51 collaboration, has been designed to detect multiple events based on online PMT signal energy measurements and a coincidence-detection algorithm. Implemented on an FPGA, the system has been running and evolving successfully throughout NEXT-White operation. The event detection system brings some relevant new functionalities to the field. A distributed double event processor has been implemented to detect two different types of events simultaneously, thus allowing simultaneous calibration and physics runs. This feature provides constant monitoring of the detector conditions, which is especially relevant to the lifetime and geometrical-map computations needed to correct high-energy physics events. Other features, such as primary-scintillation event rejection and a double buffer associated with the type of event being searched for, help reduce unnecessary data throughput, thus minimizing dead time and improving trigger efficiency.
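The coincidence-detection idea mentioned in the abstract can be illustrated with a toy software analogue: fire whenever enough channels exceed a threshold on the same clock tick. The abstract does not describe the FPGA logic at this level of detail, so the discriminator threshold and multiplicity requirement below are illustrative assumptions:

```python
import numpy as np

def coincidence_trigger(samples, threshold, min_channels):
    """Return the clock ticks on which at least `min_channels`
    PMT channels exceed `threshold` simultaneously.

    samples: (n_channels, n_ticks) array of digitized waveforms.
    """
    over = np.asarray(samples) > threshold  # per-channel discriminator output
    counts = over.sum(axis=0)               # channels over threshold, per tick
    return np.flatnonzero(counts >= min_channels)
```

A real online implementation would additionally apply a coincidence window wider than one tick and veto logic (e.g. primary-scintillation rejection); this sketch only shows the core multiplicity condition.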
Real, D., Calvo, D., Diaz, A., Salesa Greus, F., & Sanchez Losa, A. (2022). A Narrow Optical Pulse Emitter Based on LED: NOPELED. Sensors, 22(19), 7683–15pp.
Abstract: Light sources emitting short pulses are needed in many particle-physics experiments using optical sensors, as they can replicate the light produced by the particles being detected and also serve as an important calibration and test element. This work presents NOPELED, an LED-based light source emitting short optical pulses with typical rise times of less than 3 ns and a Full Width at Half Maximum (FWHM) below 7 ns. The emission wavelength depends on the LED model used. Several LED models have been characterized in the range from 405 to 532 nm, although NOPELED can work with LEDs emitting wavelengths outside that region. While the wavelength is fixed for a given LED model, the intensity and the frequency of the optical pulse can be controlled. NOPELED is low-cost, simple to operate, and remotely controllable, making it appropriate for physics experiments needing in-place light sources, such as astrophysical neutrino detectors using photomultipliers or positron emission tomography devices using scintillation counters, as well as for applications beyond physics needing short light pulses, such as protein fluorescence or chemodetection of heavy metals.
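As a rough companion to the quoted rise-time and FWHM figures, both metrics can be extracted from a digitized pulse waveform with linear interpolation between samples. This is a generic offline analysis sketch, not the measurement procedure used in the paper:

```python
import numpy as np

def pulse_metrics(t, v):
    """Estimate the 10-90% rise time and the FWHM of a single
    optical pulse from a sampled, baseline-subtracted waveform.

    t: sample times (e.g. in ns), v: amplitudes. Assumes one clean
    pulse with monotonic rising and falling edges.
    """
    t = np.asarray(t, dtype=float)
    v = np.asarray(v, dtype=float)
    peak = v.max()
    ipk = int(v.argmax())
    rise_v, rise_t = v[:ipk + 1], t[:ipk + 1]        # rising edge
    fall_v, fall_t = v[ipk:][::-1], t[ipk:][::-1]    # falling edge, reversed so it is increasing
    t10 = np.interp(0.1 * peak, rise_v, rise_t)      # 10% crossing time
    t90 = np.interp(0.9 * peak, rise_v, rise_t)      # 90% crossing time
    half_up = np.interp(0.5 * peak, rise_v, rise_t)
    half_down = np.interp(0.5 * peak, fall_v, fall_t)
    return t90 - t10, half_down - half_up            # (rise time, FWHM)
```

For a symmetric triangular pulse rising over 10 ns and falling over 10 ns, this yields an 8 ns 10-90% rise time and a 10 ns FWHM, as expected.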