Mendez, V., Amoros, G., Garcia, F., & Salt, J. (2010). Emergent algorithms for replica location and selection in data grid. Future Gener. Comput. Syst., 26(7), 934–946.
Abstract: Grid infrastructures for e-Science projects are growing in scale. Improvements in data Grid replication algorithms may be critical in many of these infrastructures. This paper presents a decentralized replica optimization service, providing a general Emergent Artificial Intelligence (EAI) algorithm for the problem definition. Our aim is to set up a theoretical framework for emergent heuristics in Grid environments. We further describe two EAI approaches, the Particle Swarm Optimization (PSO-Grid) Multiswarm Federation and the Ant Colony Optimization (ACO-Grid) Asynchronous Colonies Optimization replica optimization algorithms, with some examples. We also present extended results showing the best performance and scalability features for PSO-Grid Multiswarm Federation.
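The replica selection described above builds on standard particle swarm optimization. A minimal, generic PSO sketch, assuming a 1-D "transfer cost" surface with invented bounds and coefficients (this is not the authors' PSO-Grid Multiswarm Federation algorithm, only the base technique it extends):

```python
import random

random.seed(0)  # deterministic run for illustration

def pso_minimize(cost, lo, hi, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    """Generic 1-D particle swarm minimization over the interval [lo, hi]."""
    pos = [random.uniform(lo, hi) for _ in range(n_particles)]
    vel = [0.0] * n_particles
    pbest = pos[:]                        # each particle's best-seen position
    pbest_val = [cost(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            r1, r2 = random.random(), random.random()
            # velocity update: inertia + pull toward personal and global bests
            vel[i] = (w * vel[i]
                      + c1 * r1 * (pbest[i] - pos[i])
                      + c2 * r2 * (gbest - pos[i]))
            pos[i] = min(max(pos[i] + vel[i], lo), hi)   # clamp to bounds
            v = cost(pos[i])
            if v < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i], v
                if v < gbest_val:
                    gbest, gbest_val = pos[i], v
    return gbest, gbest_val

# Hypothetical replica "transfer cost" surface with its minimum at x = 3.
best_x, best_cost = pso_minimize(lambda x: (x - 3.0) ** 2, 0.0, 10.0)
```

In a Grid setting the cost function would instead score candidate replica sites (latency, load, bandwidth); the swarm mechanics are unchanged.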
HADES Collaboration (Agakishiev, G., et al.), Diaz, J., & Gil, A. (2010). Origin of the low-mass electron pair excess in light nucleus-nucleus collisions. Phys. Lett. B, 690(2), 118–122.
Abstract: We report measurements of electron pair production in elementary p + p and d + p reactions at 1.25 GeV/u with the HADES spectrometer. For the first time, the electron pairs were reconstructed for n + p reactions by detecting the proton spectator from the deuteron breakup. We find that the yield of electron pairs with invariant mass M(e+e-) > 0.15 GeV/c^2 is about an order of magnitude larger in n + p reactions than in p + p. A comparison to model calculations demonstrates that the production mechanism is not yet sufficiently described. The electron pair spectra measured in C + C reactions are compatible with a superposition of elementary n + p and p + p collisions, leaving little room for additional electron pair sources in such light collision systems.
Albertus, C., Hernandez, E., & Nieves, J. (2010). Hyperfine mixing in electromagnetic decay of doubly heavy bc baryons. Phys. Lett. B, 690(3), 265–271.
Abstract: We investigate the role of hyperfine mixing in the electromagnetic decay of ground state doubly heavy bc baryons. As in the case of a previous calculation on b -> c semileptonic decays of doubly heavy baryons, we find large corrections to the electromagnetic decay widths due to this mixing. Contrary to the weak case just mentioned, we find here that one cannot use electromagnetic width relations obtained in the infinite heavy quark mass limit to experimentally extract information on the admixtures in a model independent way.
Jimenez, R., Kitching, T., Pena-Garay, C., & Verde, L. (2010). Can we measure the neutrino mass hierarchy in the sky? J. Cosmol. Astropart. Phys., 05(5), 035 (14 pp).
Abstract: Cosmological probes are steadily reducing the total neutrino mass window, with constraints on the degenerate neutrino-mass region as their most significant outcome. In this work we explore the discovery potential of cosmological probes for constraining the neutrino hierarchy, and point out some subtleties that could yield spurious claims of detection. This has important implications for the next generation of double beta decay experiments, which will be able to achieve a positive signal in the case of a degenerate or inverted hierarchy of Majorana neutrinos. We find that cosmological experiments covering nearly the whole sky could in principle distinguish the neutrino hierarchy by yielding 'substantial' evidence for one scenario over the other, via precise measurements of the shape of the matter power spectrum from large scale structure and weak gravitational lensing.
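The 'substantial' wording above refers to Jeffreys' scale for grading Bayes factors between two models (here, the two hierarchies). A coarsened sketch of that grading, using Jeffreys' log10 bands (the example Bayes factors are invented for illustration):

```python
import math

def jeffreys_grade(bayes_factor):
    """Grade a Bayes factor B = Z(model 1) / Z(model 2) on a coarsened Jeffreys scale."""
    x = abs(math.log10(bayes_factor))   # |log10 B|: symmetric in either direction
    if x < 0.5:
        return "barely worth mentioning"
    if x < 1.0:
        return "substantial"
    if x < 2.0:
        return "strong"
    return "decisive"

# e.g. a Bayes factor of 5 between the two hierarchies would count as
# 'substantial' evidence, while 200 would be 'decisive'.
grade_mild = jeffreys_grade(5.0)
grade_overwhelming = jeffreys_grade(200.0)
```

Jeffreys' original scale splits the 1.5–2.0 band further ("very strong"); the version above merges it into "strong" for brevity.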
Cabrera, M. E., Casas, J. A., & Ruiz de Austri, R. (2010). MSSM forecast for the LHC. J. High Energy Phys., 05(5), 043 (48 pp).
Abstract: We perform a forecast of the MSSM with universal soft terms (CMSSM) for the LHC, based on an improved Bayesian analysis. We do not incorporate ad hoc measures of fine-tuning to penalize unnatural possibilities: such penalization arises from the Bayesian analysis itself when the experimental value of M_Z is considered. This allows us to scan the whole parameter space, permitting arbitrarily large soft terms. Still, the low-energy region is statistically favoured (even before including dark matter or g-2 constraints). Contrary to other studies, the results are almost unaffected by changing the upper limits taken for the soft terms. The results are also remarkably stable under the choice of flat or logarithmic priors, a fact that arises from the larger statistical weight of the low-energy region in both cases. We then incorporate all the important experimental constraints into the analysis, obtaining a map of the probability density of the MSSM parameter space, i.e. the forecast of the MSSM. Since not all the experimental information is equally robust, we perform separate analyses depending on the group of observables used. When only the most robust ones are used, the favoured region of the parameter space contains a significant portion outside the LHC reach. This effect is reinforced if the Higgs mass is not close to its present experimental limit, and it persists when dark matter constraints are included. Only when the g-2 constraint (based on e+e- data) is considered is the preferred region (for μ > 0) well inside the LHC scope. We also perform a Bayesian comparison of the positive- and negative-μ possibilities.
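The Bayesian comparison of the two signs of μ proceeds through the ratio of evidences Z = ∫ L(θ) π(θ) dθ. A toy Monte Carlo sketch of such a comparison, assuming a single parameter with a flat prior and invented Gaussian likelihoods (nothing here is the paper's actual likelihood, prior, or parameter space):

```python
import math
import random

random.seed(1)  # fixed seed so the toy estimate is reproducible

def evidence(loglike, prior_draw, n=20000):
    """Monte Carlo estimate of Z = integral of L(theta) over the prior,
    computed as the average likelihood over draws from the prior."""
    return sum(math.exp(loglike(prior_draw())) for _ in range(n)) / n

# Toy 1-D stand-ins for the two signs of mu (all numbers invented): the
# "mu > 0" likelihood peaks inside the prior range, the "mu < 0" one at
# its edge, so half of the latter's likelihood mass is cut off.
def loglike_pos(t):
    return -0.5 * ((t - 1.0) / 0.5) ** 2

def loglike_neg(t):
    return -0.5 * ((t - 3.0) / 0.5) ** 2

def prior_draw():
    return random.uniform(0.0, 3.0)   # flat prior on [0, 3]

bayes_factor = evidence(loglike_pos, prior_draw) / evidence(loglike_neg, prior_draw)
# Roughly 2 here, since half the "mu < 0" likelihood lies outside the prior support.
```

The same evidence-ratio machinery, with nested sampling replacing the naive prior average, is what realistic CMSSM scans use for this kind of model comparison.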