
arXiv journal 2022.0207

薄欣怿
2023-12-01

White Dwarfs as Physics Laboratories: Lights and Shadows
https://arxiv.org/pdf/2202.02052.pdf

abstract: The evolution of white dwarfs is essentially a gravothermal process of cooling in which the basic ingredients for predicting their evolution are well identified, although not always well understood. There are two independent ways to test the cooling rate. One is the luminosity function of the white dwarf population, and another is the secular drift of the period of pulsation of those individuals that experience variations. Both scenarios are sensitive to the cooling or heating time scales, for which reason the inclusion of any additional source or sink of energy will modify these properties and will allow us to set bounds on these perturbations. These studies also require complete and statistically significant samples, for which current large data surveys are providing an unprecedented wealth of information. In this paper we review how these techniques are applied to several cases, such as a secular drift of the Newton gravitational constant, neutrino magnetic moments, axions, and weakly interacting massive particles (WIMPs).
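The core of the argument can be written schematically (my own notation, not taken from the paper): an extra energy sink speeds up the cooling, and the pulsation period drift tracks the cooling rate, so a measured drift bounds the extra sink.

```latex
% Illustrative notation, not from the paper.
% Energy balance of a cooling white dwarf with an extra sink L_x
% (e.g. axion emission or an anomalous neutrino magnetic moment):
\[
  L_\gamma + L_\nu + L_x = -\frac{dE_{\mathrm{th}}}{dt}
\]
% The pulsation period drift of a variable white dwarf roughly tracks the cooling,
\[
  \frac{\dot{\Pi}}{\Pi} \simeq -a\,\frac{\dot{T}}{T} + b\,\frac{\dot{R}}{R},
  \qquad a, b = \mathcal{O}(1),
\]
% so an extra sink makes \dot{T} more negative and \dot{\Pi} larger than the
% standard prediction; the measured drift (or the luminosity function of the
% whole population) then constrains L_x.
```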

LIE GROUPS AND LIE ALGEBRAS
https://arxiv.org/pdf/2201.09397.pdf

A long but nice and detailed review / set of lecture notes on Lie groups and Lie algebras. The references in it are also very interesting.

Machine Learning Symmetry
https://arxiv.org/pdf/2201.09345.pdf

abstract: We review recent work in machine learning aspects of conformal field theory and Lie algebra representation theory using neural networks.
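As a toy illustration of what learning a symmetry with gradient descent can look like (my own setup, not the paper's code): ask an optimizer for a 2x2 generator whose exponential flow preserves the Euclidean norm; the expected answer is the antisymmetric so(2) rotation generator.

```python
import torch

# Toy sketch, not the paper's code: find a 2x2 generator G whose flow
# exp(theta * G) preserves |x|^2.  The answer should be (a multiple of)
# the antisymmetric so(2) rotation generator.

torch.manual_seed(0)
G = torch.randn(2, 2, requires_grad=True)      # candidate Lie-algebra generator
opt = torch.optim.Adam([G], lr=1e-2)

x = torch.randn(512, 2)                        # sample points in the plane
thetas = torch.linspace(0.1, 1.0, 8)

for step in range(1500):
    opt.zero_grad()
    loss = 0.0
    for theta in thetas:
        R = torch.matrix_exp(theta * G)        # group element exp(theta G)
        y = x @ R.T
        # penalise any change of |x|^2 along the flow
        loss = loss + ((y.pow(2).sum(1) - x.pow(2).sum(1)) ** 2).mean()
    # keep G away from the trivial solution G = 0
    loss = loss + ((G ** 2).sum() - 1.0) ** 2
    loss.backward()
    opt.step()

print(G.detach())   # approximately proportional to [[0, -1], [1, 0]]
```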

A new generation of simultaneous fits to LHC data using deep learning
https://arxiv.org/pdf/2201.07240.pdf

abstract: We present a new methodology that is able to yield a simultaneous determination of the Parton Distribution Functions (PDFs) of the proton alongside any set of parameters that determine the theory predictions, whether within the Standard Model (SM) or beyond it. The SIMUnet methodology is based on an extension of the NNPDF4.0 neural network architecture, which allows the addition of an extra layer to simultaneously determine PDFs alongside an arbitrary number of such parameters. We illustrate its capabilities by simultaneously fitting PDFs with a subset of Wilson coefficients within the Standard Model Effective Field Theory framework and show how the methodology extends naturally to larger subsets of Wilson coefficients and to other SM precision parameters, such as the strong coupling constant or the heavy quark masses.
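A minimal sketch of the simultaneous-fit idea (the network, kernels, and numbers below are toy assumptions, not the SIMUnet code): the Wilson coefficients sit in an extra trainable layer and correct the theory prediction linearly, so one gradient-descent loop updates the PDFs and the coefficients together.

```python
import torch
import torch.nn as nn

# Toy sketch, not the SIMUnet architecture: a small network plays the role of
# the PDF, and an extra trainable layer of parameters c plays the role of the
# Wilson coefficients, correcting the theory prediction linearly,
#     T(f, c) = T_SM(f) * (1 + sum_i c_i * K_i).

class ToySimultaneousFit(nn.Module):
    def __init__(self, n_wilson=2):
        super().__init__()
        self.pdf = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 1))
        self.c = nn.Parameter(torch.zeros(n_wilson))   # the "extra layer"

    def forward(self, x, K):
        f = self.pdf(x)                  # toy PDF values on a grid of x
        t_sm = f.sum()                   # stand-in for an SM convolution
        return t_sm * (1.0 + (self.c * K).sum())

model = ToySimultaneousFit()
x = torch.rand(100, 1)                   # toy x grid
K = torch.tensor([0.05, -0.02])          # toy EFT correction kernels K_i
data = torch.tensor(42.0)                # toy "measured" observable

opt = torch.optim.Adam(model.parameters(), lr=1e-2)
for step in range(500):
    opt.zero_grad()
    loss = (model(x, K) - data) ** 2     # chi^2-like loss for one data point
    loss.backward()
    opt.step()

print(model.c.detach())                  # fitted toy "Wilson coefficients"
```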

A data-based parametrization of parton distribution functions
https://arxiv.org/pdf/2111.02954.pdf

abstract: Since the first determination of a structure function many decades ago, all methodologies used to determine structure functions or parton distribution functions (PDFs) have employed a common prefactor as part of the parametrization. The NNPDF collaboration pioneered the use of neural networks to overcome the inherent bias of constraining the space of solutions with a fixed functional form, while still keeping the same common prefactor as a preprocessing factor. Over the years, various increasingly sophisticated techniques have been introduced to counter the effect of the prefactor on the PDF determination. In this paper we present a methodology to remove the prefactor entirely, thereby significantly simplifying the methodology without loss of efficiency, finding good agreement with previous results.
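For concreteness, a minimal sketch of the two parametrizations side by side (the tiny networks and exponent conventions are illustrative, not NNPDF code): one with a common x^alpha (1-x)^beta prefactor, one purely data-based as advocated in the paper.

```python
import torch
import torch.nn as nn

# Toy sketch of the two parametrizations discussed in the abstract
# (tiny networks and exponent conventions are illustrative, not NNPDF code).

class PrefactorPDF(nn.Module):
    """Traditional form: f(x) = x**alpha * (1 - x)**beta * NN(x),
    with the prefactor steering the small- and large-x behaviour."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(1, 16), nn.Tanh(), nn.Linear(16, 1))
        self.alpha = nn.Parameter(torch.tensor(-0.5))   # small-x exponent
        self.beta = nn.Parameter(torch.tensor(3.0))     # large-x exponent

    def forward(self, x):
        return x ** self.alpha * (1.0 - x) ** self.beta * self.net(x)

class BarePDF(nn.Module):
    """Prefactor-free form: f(x) = NN(x); the asymptotic behaviour is
    left entirely for the network to learn from data."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(1, 16), nn.Tanh(), nn.Linear(16, 1))

    def forward(self, x):
        return self.net(x)

x = torch.linspace(1e-3, 0.99, 50).unsqueeze(1)
print(PrefactorPDF()(x).shape, BarePDF()(x).shape)   # both (50, 1)
```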

Correlation and Combination of Sets of Parton Distributions
https://arxiv.org/pdf/2110.08274.pdf

abstract: We study the correlation between different sets of parton distributions (PDFs). Specifically, viewing different PDF sets as distinct determinations, generally correlated, of the same underlying physical quantity, we examine the extent to which the correlation between them is due to the underlying data. We do this both for pairs of PDF sets determined using a given fixed methodology and for sets determined using different methodologies. We show that correlations have a sizable component that is not due to the underlying data, because the data do not determine the PDFs uniquely. We show that the data-driven correlations can be used to assess the efficiency of methodologies used for PDF determination. We also show that the use of data-driven correlations for the combination of different PDFs into a joint set can lead to inconsistent results, and thus that the statistical combination used in constructing the widely used PDF4LHC15 PDF set remains the most reliable method.
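A back-of-the-envelope version of the data-driven correlation (toy numbers, not the paper's procedure): two replica ensembles that share a common, data-induced component are correlated replica by replica at each value of x.

```python
import numpy as np

# Toy sketch of a data-driven correlation between two PDF determinations:
# replicas of the two sets share a common, data-induced component, and the
# correlation at each x is estimated replica by replica.  Numbers are made up;
# real replicas would be read from the LHAPDF grids of the two sets.

rng = np.random.default_rng(0)
n_rep = 100
x_grid = np.linspace(1e-3, 0.9, 20)

common = rng.normal(size=(n_rep, x_grid.size))         # shared, data-driven part
set_a = common + 0.3 * rng.normal(size=common.shape)   # set A: common + own noise
set_b = common + 0.3 * rng.normal(size=common.shape)   # set B: common + own noise

corr = np.array([
    np.corrcoef(set_a[:, i], set_b[:, i])[0, 1]        # correlation at each x
    for i in range(x_grid.size)
])
print(corr.round(2))
```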
