A combination is presented of all inclusive deep inelastic cross sections previously published by the H1 and ZEUS collaborations at HERA for neutral and charged current e±p scattering at zero beam polarisation. The data were taken at proton beam energies of 920, 820, 575 and 460 GeV and an electron beam energy of 27.5 GeV. The data correspond to an integrated luminosity of about 1 fb⁻¹ and span six orders of magnitude in negative four-momentum-transfer squared, Q², and Bjorken x. The correlations of the systematic uncertainties were evaluated and taken into account in the combination. The combined cross sections were used as input to QCD analyses at leading order, next-to-leading order and next-to-next-to-leading order, providing a new set of parton distribution functions, called HERAPDF2.0. In addition to the experimental uncertainties, model and parameterisation uncertainties were assessed for these parton distribution functions. Variants of HERAPDF2.0 with an alternative gluon parameterisation, HERAPDF2.0AG, and using fixed-flavour-number schemes, HERAPDF2.0FF, are presented. The analysis was extended by including HERA data on charm and jet production, resulting in the variant HERAPDF2.0Jets. The inclusion of jet-production cross sections made possible a simultaneous determination of these parton distributions and the strong coupling constant, resulting in α_s(M_Z²) = 0.1183 ± 0.0009 (exp) ± 0.0005 (model/parameterisation) ± 0.0012 (hadronisation) +0.0037 −0.0030 (scale). An extraction of xF₃^(γZ) and results on electroweak unification and scaling violations are also presented.
ROOT is an object-oriented C++ framework conceived in the high-energy physics (HEP) community, designed for storing and analyzing petabytes of data in an efficient way. Any instance of a C++ class can be stored into a ROOT file in a machine-independent compressed binary format. In ROOT the TTree object container is optimized for statistical data analysis over very large data sets by using vertical data storage techniques. These containers can span a large number of files on local disks, the web, or a number of different shared file systems. In order to analyze these data, the user can choose from a wide set of mathematical and statistical functions, including linear algebra classes, numerical algorithms such as integration and minimization, and various methods for performing regression analysis (fitting). In particular, the RooFit package allows the user to perform complex data modeling and fitting, while the RooStats library provides abstractions and implementations for advanced statistical tools. Multivariate classification methods based on machine learning techniques are available via the TMVA package. Central to these analysis tools are the histogram classes, which provide binning of one- and multi-dimensional data. Results can be saved in high-quality graphical formats like PostScript and PDF or in bitmap formats like JPG or GIF. Results can also be stored as ROOT macros that allow the graphics to be fully recreated and reworked. Users typically create their analysis macros step by step, making use of the interactive C++ interpreter CINT, while running over small data samples. Once the development is finished, they can run these macros at full compiled speed over large data sets, using on-the-fly compilation, or by creating a stand-alone batch program. Finally, if processing farms are available, the user can reduce the execution time of intrinsically parallel tasks — e.g.
data mining in HEP — by using PROOF, which will take care of optimally distributing the work over the available resources in a transparent way.

Program summary:
Program title: ROOT
Catalogue identifier: AEFA_v1_0
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: LGPL
No. of lines in distributed program: 3 044 581
No. of bytes in distributed program: 36 325 133
Distribution format: tar.gz
Programming language: C++
Computer: Intel i386, Intel x86-64, Motorola PPC, Sun Sparc, HP PA-RISC
Operating system: GNU/Linux, Windows XP/Vista, Mac OS X, FreeBSD, OpenBSD, Solaris, HP-UX, AIX
Has the code been vectorised or parallelized?: Yes
Classification: 4, 9, 11.9, 14
Nature of problem: Storage, analysis and visualization of scientific data
Solution method: Object store, wide range of analysis algorithms and visualization methods
Additional comments: For an up-to-date author list see: and
Running time: Depending on the data size and complexity of analysis algorithms
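The histogram classes mentioned above are, at their core, fixed-width binning of data with under/overflow bookkeeping. A minimal sketch of that idea in plain Python (illustrative only — this is not ROOT's actual TH1 API):

```python
def fill_histogram(values, nbins, lo, hi):
    """Bin values into nbins equal-width bins on [lo, hi).

    Entries outside the range go into underflow/overflow counters,
    mirroring the convention used by ROOT-style histograms.
    """
    width = (hi - lo) / nbins
    counts = [0] * nbins
    underflow = overflow = 0
    for v in values:
        if v < lo:
            underflow += 1
        elif v >= hi:
            overflow += 1
        else:
            counts[int((v - lo) / width)] += 1
    return counts, underflow, overflow

# 0.1 -> bin 0; 0.5, 0.5 -> bin 1; 2.7 -> overflow; -1.0 -> underflow
counts, under, over = fill_histogram([0.1, 0.5, 0.5, 2.7, -1.0],
                                     nbins=4, lo=0.0, hi=2.0)
```

In ROOT itself the same bookkeeping is done by the TH1 family, with the added machinery of errors per bin, axis labels, and direct drawing to the graphics formats listed above.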
The resonant structure of the doubly Cabibbo-suppressed decay D⁺ → K⁻K⁺K⁺ is studied for the first time. The measurement is based on a sample of pp-collision data, collected at a centre-of-mass energy of 8 TeV with the LHCb detector and corresponding to an integrated luminosity of 2 fb⁻¹. The amplitude analysis of this decay is performed with the isobar model and a phenomenological model based on an effective chiral Lagrangian. In both models the S-wave component in the K⁻K⁺ system is dominant, with a small contribution of the ϕ(1020) meson and a negligible contribution from tensor resonances. The K⁺K⁻ scattering amplitudes for the considered combinations of spin (0,1) and isospin (0,1) of the two-body system are obtained from the Dalitz plot fit with the phenomenological decay amplitude.
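In the isobar model referenced above, the decay amplitude is written as a coherent sum of two-body resonance terms, each commonly modeled with a relativistic Breit–Wigner lineshape. A schematic sketch of that structure (with an illustrative ϕ(1020)-like resonance and toy coefficients, not the paper's actual fit model):

```python
def breit_wigner(s, m0, gamma0):
    """Relativistic Breit-Wigner propagator 1 / (m0^2 - s - i*m0*Gamma0),
    with a constant width (no barrier factors or mass-dependent width)."""
    return 1.0 / complex(m0 * m0 - s, -m0 * gamma0)

def isobar_amplitude(s, resonances):
    """Coherent sum  A(s) = sum_k c_k * BW_k(s)  over isobar terms;
    each complex coefficient c_k would be fitted to data in practice."""
    return sum(c * breit_wigner(s, m0, g) for (c, m0, g) in resonances)

# Toy example: one phi(1020)-like term (m0 = 1.02 GeV, Gamma = 0.004 GeV)
phi = [(complex(1.0, 0.0), 1.02, 0.004)]
peak = abs(isobar_amplitude(1.02 ** 2, phi))  # at the resonance mass
off = abs(isobar_amplitude(1.2 ** 2, phi))    # away from the resonance
```

The modulus of the amplitude is sharply peaked at the resonance mass; in a full Dalitz-plot fit the interference between such terms (and a parameterized S-wave) is what the data constrain.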
High-precision analyses of supersymmetry parameters aim at reconstructing the fundamental supersymmetric theory and its breaking mechanism. A well-defined theoretical framework is needed when higher-order corrections are included. We propose such a scheme, the Supersymmetry Parameter Analysis (SPA), based on a consistent set of conventions and input parameters. A repository of computer programs is provided which connects parameters in different schemes and relates the Lagrangian parameters to physical observables at LHC and high-energy e⁺e⁻ linear collider experiments, i.e., masses, mixings, decay widths and production cross sections of supersymmetric particles. In addition, programs for calculating high-precision low-energy observables, the density of cold dark matter (CDM) in the universe, as well as the cross sections for CDM search experiments are included. The SPA scheme still requires extended efforts on both the theoretical and experimental side before data can be evaluated in the future at the level of the desired precision. We take an initial step towards testing the SPA scheme by applying the techniques involved to a specific supersymmetry reference point.
We describe the development of a new toolkit for data analysis. The analysis package is based on Bayes' Theorem, and is realized with the use of Markov Chain Monte Carlo. This gives access to the full posterior probability distribution. Parameter estimation, limit setting and uncertainty propagation are implemented in a straightforward manner.
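The approach described — sampling the full posterior with Markov Chain Monte Carlo and reading parameter estimates off the samples — can be illustrated with a minimal random-walk Metropolis sampler for the mean of Gaussian data under a flat prior. This is a generic sketch of the technique, not the toolkit's actual interface:

```python
import math
import random
import statistics

def log_posterior(mu, data, sigma=1.0):
    """Flat prior + Gaussian likelihood: log p(mu | data) up to a constant."""
    return -0.5 * sum(((x - mu) / sigma) ** 2 for x in data)

def metropolis(data, n_steps=20000, step=0.5, seed=42):
    """Random-walk Metropolis sampling of the posterior of mu."""
    rng = random.Random(seed)
    mu = 0.0
    logp = log_posterior(mu, data)
    samples = []
    for _ in range(n_steps):
        proposal = mu + rng.gauss(0.0, step)
        logp_new = log_posterior(proposal, data)
        # Accept with probability min(1, p_new / p_old)
        if math.log(rng.random()) < logp_new - logp:
            mu, logp = proposal, logp_new
        samples.append(mu)
    return samples[n_steps // 2:]  # discard the first half as burn-in

data = [1.2, 0.8, 1.1, 0.9, 1.3, 0.7, 1.0, 1.0]
post = metropolis(data)
estimate = statistics.mean(post)  # posterior mean; close to the data mean here
```

Because the chain returns samples of the posterior itself, limit setting and uncertainty propagation reduce to operations on the sample (quantiles, pushing samples through derived quantities), which is exactly what makes the MCMC formulation straightforward.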
There are many indirect and direct experimental indications that the new particle H discovered by the ATLAS and CMS Collaborations has spin zero and (mostly) positive parity, and that its couplings to other particles are correlated with their masses. To a high degree of confidence, it is a Higgs boson, and here we examine the extent to which its couplings resemble those of the single Higgs boson of the Standard Model. Our global analysis of its couplings to fermions and massive bosons determines that they have the same relative sign as in the Standard Model. We also show directly that these couplings are highly consistent with a dependence on particle masses that is linear to within a few per cent, and scaled by the conventional electroweak symmetry-breaking scale to within 10%. We also give constraints on loop-induced couplings, on the total Higgs decay width, and on possible invisible decays of the Higgs boson under various assumptions.
A combination of three LHCb measurements of the CKM angle γ is presented. The decays B± → DK± and B± → Dπ± are used, where D denotes an admixture of D0 and D̄0 mesons, decaying into K+K−, π+π−, K±π∓, K±π∓π+π−, K0S π+π−, or K0S K+K− final states. All measurements use a dataset corresponding to the same integrated luminosity. Combining results from B± → DK± decays alone, a best-fit value of γ is found and confidence intervals are set. The best-fit value of γ found from a combination of results from B± → Dπ± decays alone is also given; its confidence intervals are set, with no constraint obtained at the higher confidence level. The combination of results from B± → DK± and B± → Dπ± decays gives a best-fit value of γ with corresponding confidence intervals. All values are expressed modulo 180° and are obtained taking into account the effect of D0–D̄0 mixing.
We present a software framework for statistical data analysis, called HistFitter, that has been used extensively by the ATLAS Collaboration to analyze large datasets originating from proton–proton collisions at the Large Hadron Collider at CERN. Since 2012 HistFitter has been the standard statistical tool in searches for supersymmetric particles performed by ATLAS. HistFitter is a programmable and flexible framework to build, book-keep, fit, interpret and present results of data models of nearly arbitrary complexity. Starting from an object-oriented configuration, defined by users, the framework builds probability density functions that are automatically fit to data and interpreted with statistical tests. Internally HistFitter uses the statistics packages RooStats and HistFactory. A key innovation of HistFitter is its design, which is rooted in the analysis strategies of particle physics: the concepts of control, signal and validation regions are woven into its fabric, and these are progressively treated with statistically rigorous built-in methods. Being capable of working with multiple models at once that describe the data, HistFitter introduces an additional level of abstraction that allows for easy bookkeeping, manipulation and testing of large collections of signal hypotheses. Finally, HistFitter provides a collection of tools to present results in publication-quality style through a simple command-line interface.
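The control-region/signal-region logic that such frameworks build on can be illustrated with a toy counting experiment (a generic sketch, not HistFitter's API): the background normalization b is constrained by a control region, and the signal strength mu is then determined in the signal region by minimizing a joint Poisson likelihood — here with a crude grid scan where a real tool would use a proper minimizer.

```python
import math

def nll(mu, b, n_sr, n_cr, s=10.0, tau=2.0):
    """Negative log-likelihood of a two-region counting experiment:
    signal region expects mu*s + b events, control region expects tau*b
    (the CR, assumed signal-free, constrains the background level b)."""
    def poisson_nll(n, lam):
        return lam - n * math.log(lam) + math.lgamma(n + 1)
    return poisson_nll(n_sr, mu * s + b) + poisson_nll(n_cr, tau * b)

def fit(n_sr, n_cr):
    """Joint fit of (mu, b) by grid scan: mu in [0, 2] step 0.01,
    b in [0.1, 50] step 0.1."""
    return min(
        ((mu / 100.0, b / 10.0)
         for mu in range(0, 201)
         for b in range(1, 501)),
        key=lambda p: nll(p[0], p[1], n_sr, n_cr),
    )

# Observed counts: SR = 30, CR = 40 -> b_hat = 40/tau = 20,
# mu_hat = (30 - 20)/s = 1.0, since both regions can be matched exactly.
mu_hat, b_hat = fit(n_sr=30, n_cr=40)
```

Validation regions, in this picture, are additional counting regions where the fitted model is compared to data without being used in the fit — the progressive treatment the abstract refers to.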
We perform a global analysis of the constraints on a possible Higgs-like particle with mass ∼ 125 GeV that are provided by the ATLAS, CDF, CMS and D0 experiments. We combine the available constraints on possible deviations from the Standard Model Higgs couplings to massive vector bosons and to fermions, considering also the possibilities of non-standard loop-induced couplings to photon and gluon pairs. We analyze the combined constraints on pseudo-dilaton scenarios and on some other scenarios in which the possible new particle is identified as a pseudo-Nambu-Goldstone boson in a composite electroweak symmetry-breaking sector.