
Showing papers on "Probability density function published in 2017"


Journal ArticleDOI
TL;DR: Results show that LIF and the new method proposed in this research are very efficient when dealing with nonlinear performance functions, small failure probabilities, complicated limit states and high-dimensional engineering problems.

268 citations


Journal ArticleDOI
TL;DR: In this paper, the full probability density function (PDF) of inflationary curvature perturbations was calculated, even in the presence of large quantum backreaction, using the stochastic-δ-N formalism.
Abstract: We calculate the full probability density function (PDF) of inflationary curvature perturbations, even in the presence of large quantum backreaction. Making use of the stochastic-δ N formalism, two complementary methods are developed, one based on solving an ordinary differential equation for the characteristic function of the PDF, and the other based on solving a heat equation for the PDF directly. In the classical limit where quantum diffusion is small, we develop an expansion scheme that not only recovers the standard Gaussian PDF at leading order, but also allows us to calculate the first non-Gaussian corrections to the usual result. In the opposite limit where quantum diffusion is large, we find that the PDF is given by an elliptic theta function, which is fully characterised by the ratio between the squared width and height (in Planck mass units) of the region where stochastic effects dominate. We then apply these results to the calculation of the mass fraction of primordial black holes from inflation, and show that no more than ~ 1 e-fold can be spent in regions of the potential dominated by quantum diffusion. We explain how this requirement constrains inflationary potentials with two examples.

152 citations
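
As a reminder of how the two approaches named in the abstract above relate (a generic sketch, not the paper's specific equations), the PDF follows from its characteristic function by Fourier inversion, while the PDF itself obeys a Fokker-Planck (heat-type) equation with a classical drift term and a quantum diffusion term:

```latex
% Generic relations assumed here, not the paper's exact formulation.
% Fourier inversion linking the PDF of the e-fold number N to its
% characteristic function:
\[
  P(\mathcal{N}) \;=\; \frac{1}{2\pi}\int_{-\infty}^{\infty}
    e^{-it\mathcal{N}}\,\chi(t)\,\mathrm{d}t ,
  \qquad
  \chi(t) \;\equiv\; \bigl\langle e^{it\mathcal{N}}\bigr\rangle .
\]
% Standard single-field stochastic-inflation Fokker--Planck equation,
% with slow-roll drift and quantum diffusion of amplitude H/2\pi per e-fold:
\[
  \frac{\partial P}{\partial \mathcal{N}}
  \;=\;
  \frac{\partial}{\partial\phi}\!\left[\frac{V'(\phi)}{3H^{2}}\,P\right]
  \;+\;
  \frac{\partial^{2}}{\partial\phi^{2}}\!\left[\frac{H^{2}}{8\pi^{2}}\,P\right].
\]
```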


Journal ArticleDOI
TL;DR: A probability density forecasting method based on Copula theory is proposed to capture the dependence between electrical load and real-time price, and the simulation results show that the proposed method has great potential for power load forecasting when an appropriate kernel function is selected for the KSVQR model.

127 citations


Journal ArticleDOI
TL;DR: In this paper, a method to forecast the probability distribution function (PDF) of the generated power of PV systems based on the higher order Markov chain (HMC) is presented.
Abstract: This paper presents a method to forecast the probability distribution function (PDF) of the generated power of PV systems based on the higher order Markov chain (HMC). Since the output power of the PV system is highly influenced by ambient temperature and solar irradiance, they are used as important features to classify different operating conditions of the PV system. The classification procedure is carried out by applying the pattern discovery method on the historical data of the mentioned variables. An HMC is developed based on the categorized historical data of PV power in each operating point. The 15-min ahead PDF of the PV output power is forecasted through the Gaussian mixture method (GMM) by combining several distribution functions and by using the coefficients defined based on parameters of the HMC-based model. In order to verify the proposed method, the genetic algorithm is applied to minimize a well-defined objective function to achieve the optimal GMM coefficients. Numerical tests using real data demonstrate that the forecast results follow the real probability distribution of the PV power well under different weather conditions.

120 citations
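
To make the Gaussian-mixture step above concrete, here is a minimal sketch, assuming synthetic historical PV power samples for one operating state; names such as power_samples and the choice of three components are illustrative, not the authors' settings.

```python
# Minimal sketch: fit a Gaussian mixture to historical PV power for one
# operating state and evaluate the resulting forecast PDF on a grid.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Hypothetical historical 15-min PV power samples (kW) for one weather state.
power_samples = np.concatenate([rng.normal(12, 2, 400), rng.normal(20, 3, 600)])

gmm = GaussianMixture(n_components=3, random_state=0)
gmm.fit(power_samples.reshape(-1, 1))

grid = np.linspace(0, 30, 301).reshape(-1, 1)
pdf = np.exp(gmm.score_samples(grid))     # score_samples returns log-density
print("PDF integrates to ~1:", np.trapz(pdf, grid.ravel()))
```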


Journal ArticleDOI
TL;DR: In this article, the new Riccati-Bernoulli sub-ODE method is used to construct exact traveling wave solutions for the nonlinear Schrodinger equation.
Abstract: This work deals with the construction of the exact traveling wave solutions for the nonlinear Schrodinger equation by the new Riccati-Bernoulli Sub-ODE method. Additionally, we apply this method to study the random solutions by finding the probability distribution function when the coefficient in our problem is a random variable. The traveling wave solutions of many physically and mathematically important equations are expressed in terms of hyperbolic, trigonometric and rational functions. We discuss our method in both the deterministic case and the random case, by studying the beta distribution for the random input.

119 citations
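
The random-input part of the abstract can be illustrated with the standard change-of-variables formula: if a coefficient K is beta distributed and a solution feature depends on it through a monotone map g, the PDF of g(K) follows directly. The map g below is a hypothetical stand-in, not the paper's traveling-wave solution.

```python
# Sketch: PDF of Y = g(K) for a beta-distributed coefficient K ~ Beta(a, b),
# via the change-of-variables formula, cross-checked with Monte Carlo.
import numpy as np
from scipy import stats

a, b = 2.0, 5.0
g = lambda k: np.sqrt(1.0 + k)        # hypothetical monotone map of the coefficient
g_inv = lambda y: y**2 - 1.0
dg_inv = lambda y: 2.0 * y

y = np.linspace(1.0, np.sqrt(2.0), 402)[1:-1]
pdf_exact = stats.beta(a, b).pdf(g_inv(y)) * np.abs(dg_inv(y))

samples = g(stats.beta(a, b).rvs(size=200_000, random_state=1))
hist, edges = np.histogram(samples, bins=60, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
print("max |analytic - Monte Carlo| PDF error:",
      np.max(np.abs(np.interp(centers, y, pdf_exact) - hist)).round(3))
```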


Journal ArticleDOI
TL;DR: In this article, the authors present and discuss various one-dimensional linear Fokker-Planck type equations that have been recently considered in connection with the study of interacting multi-agent systems.
Abstract: We present and discuss various one-dimensional linear Fokker–Planck-type equations that have been recently considered in connection with the study of interacting multi-agent systems. In general, these Fokker–Planck equations describe the evolution in time of some probability density of the population of agents, typically the distribution of the personal wealth or of the personal opinion, and are mostly obtained by linear or bilinear kinetic models of Boltzmann type via some limit procedure. The main feature of these equations is the presence of variable diffusion, drift coefficients and boundaries, which introduce new challenging mathematical problems in the study of their long-time behavior.

115 citations
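
For orientation, the prototypical equation in this class is the generic 1-D linear Fokker-Planck equation below, with drift B and diffusion D left unspecified rather than the paper's particular kinetic-limit coefficients:

```latex
% Generic 1-D linear Fokker--Planck equation for the agent density f(x,t),
% with variable diffusion D(x) >= 0 and drift B(x), posed on an interval
% with suitable (e.g. no-flux) boundary conditions:
\[
  \frac{\partial f}{\partial t}(x,t)
  \;=\;
  \frac{\partial^{2}}{\partial x^{2}}\bigl[D(x)\,f(x,t)\bigr]
  \;+\;
  \frac{\partial}{\partial x}\bigl[B(x)\,f(x,t)\bigr],
  \qquad x \in (a,b),\; t > 0 .
\]
```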


Journal ArticleDOI
TL;DR: In this paper, an exactly solvable model of random walk in random environment is introduced, called the Beta RWRE, which performs nearest neighbour jumps with transition probabilities drawn according to the Beta distribution.
Abstract: We introduce an exactly-solvable model of random walk in random environment that we call the Beta RWRE. This is a random walk in $$\mathbb {Z}$$ which performs nearest neighbour jumps with transition probabilities drawn according to the Beta distribution. We also describe a related directed polymer model, which is a limit of the q-Hahn interacting particle system. Using a Fredholm determinant representation for the quenched probability distribution function of the walker’s position, we are able to prove second order cube-root scale corrections to the large deviation principle satisfied by the walker’s position, with convergence to the Tracy–Widom distribution. We also show that this limit theorem can be interpreted in terms of the maximum of strongly correlated random variables: the positions of independent walkers in the same environment. The zero-temperature counterpart of the Beta RWRE can be studied in a parallel way. We also prove a Tracy–Widom limit theorem for this model.

109 citations
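
A minimal simulation sketch of such an environment (illustrative parameters, not the paper's scaling regime):

```python
# Sketch: random walk in Z with site-dependent right-jump probabilities drawn
# once from a Beta(alpha, beta) distribution (a "Beta RWRE" style environment).
import numpy as np

rng = np.random.default_rng(42)
alpha, beta, n_steps, n_walkers, half_width = 2.0, 2.0, 2000, 5000, 4000

# One fixed environment: P(jump right | at site x) = p_right[x], drawn i.i.d. Beta.
p_right = rng.beta(alpha, beta, size=2 * half_width + 1)

positions = np.zeros(n_walkers, dtype=int)
for _ in range(n_steps):
    u = rng.random(n_walkers)
    step = np.where(u < p_right[positions + half_width], 1, -1)
    positions += step

print("mean position:", positions.mean(), " std:", positions.std())
```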


Journal ArticleDOI
TL;DR: Novel accurate closed-form expressions for the cumulative distribution function, the probability density function, and the moment generating function (MGF) of mixed millimeter-wave radio-frequency/free-space optics systems are derived in terms of Meijer's G functions and validated by Monte-Carlo simulations.
Abstract: This paper studies the performance of mixed millimeter-wave radio-frequency (mmWave RF), free-space optics (FSO) systems in a highly scalable and cost-effective solution for fifth-generation (5G) mobile backhaul networks. The mmWave RF and FSO fading channels are, respectively, modeled by the Rician and the generalized Malaga ( $\mathcal {M}$ ) distributions. The effect of pointing errors due to the misalignment between the transmitter and the receiver in the FSO link is also included. Novel accurate closed-form expressions for the cumulative distribution function, the probability density function, and the moment generating function (MGF) in terms of Meijer's G functions are derived. Capitalizing on these new results, we analytically derive precise closed-form expressions for various performance metrics of the proposed system, including the outage probability, the average bit error rate (ABER), and the average capacity. Additionally, new asymptotic results are provided for the outage probability, the MGF, and the ABER in terms of simple elementary functions by applying the asymptotic expansion of the Meijer's G function at high signal-to-noise ratios (SNRs). Numerical results further validate the mathematical analysis by Monte-Carlo simulations.

109 citations
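
The Monte-Carlo validation mentioned above can be sketched for the RF hop alone: the outage probability of a Rician-faded link estimated by simulation. The Malaga-distributed FSO hop and pointing errors are omitted, and all parameter values are illustrative.

```python
# Sketch: Monte-Carlo outage probability of a Rician fading link,
# P_out = Pr{ SNR < threshold }, for a few average SNR values.
import numpy as np

rng = np.random.default_rng(7)
K = 3.0                      # Rician K-factor (LOS-to-scatter power ratio)
gamma_th_db = 5.0            # SNR outage threshold in dB
n = 1_000_000

# Rician fading amplitude with unit mean-square value.
s = np.sqrt(K / (K + 1.0))                  # LOS component
sigma = np.sqrt(1.0 / (2.0 * (K + 1.0)))    # per-dimension scatter std
h = np.abs(s + sigma * (rng.standard_normal(n) + 1j * rng.standard_normal(n)))

for avg_snr_db in (5, 10, 15, 20):
    snr = 10 ** (avg_snr_db / 10) * h**2
    p_out = np.mean(snr < 10 ** (gamma_th_db / 10))
    print(f"avg SNR {avg_snr_db:2d} dB -> outage ~ {p_out:.4f}")
```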


Journal ArticleDOI
TL;DR: This paper provides a novel method for handling non-Gaussian random variables in wind farm decision making, formulated as a chance-constrained economic dispatch problem that can be solved as a deterministic linear convex optimization with a globally optimal solution.
Abstract: Extending traditional deterministic economic dispatch to incorporate significant stochastic wind power is an important but challenging task in today's power system decision making. In this paper, this issue is formulated as a chance-constrained economic dispatch (CCED) problem. Usually, in the presence of non-Gaussian correlated random variables, both the objective function and constraints are difficult to handle. To address this issue, this paper provides a novel method dealing with non-Gaussian random variables. First, the Gaussian mixture model is adopted to represent the joint probability density function of power output for multiple wind farms. Then, analytical formulae are derived that can be used for fast computation of partial derivatives of the objective function and transformation of chance constraints into linear ones. Thereafter, the CCED can be solved as a deterministic linear convex optimization with a global optimal solution. The effectiveness and efficiency of the proposed methodology are validated via a case study with a modified IEEE 39-bus system.

108 citations
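
One ingredient of that transformation can be sketched under assumptions: with wind power modeled by a univariate Gaussian mixture, a chance constraint of the form Pr{W <= w} >= 1 - eps reduces to the deterministic bound w >= F^{-1}(1 - eps) via the GMM's CDF. The mixture parameters below are made up, not derived from wind-farm data.

```python
# Sketch: epsilon-quantile of a univariate Gaussian mixture (wind power model),
# used to turn a chance constraint Pr{W <= w} >= 1 - eps into the deterministic
# bound w >= F^{-1}(1 - eps).
import numpy as np
from scipy import stats, optimize

weights = np.array([0.5, 0.3, 0.2])        # illustrative GMM parameters
means   = np.array([30.0, 55.0, 80.0])     # MW
stds    = np.array([8.0, 10.0, 6.0])

def gmm_cdf(w):
    return np.sum(weights * stats.norm.cdf(w, loc=means, scale=stds))

eps = 0.05
quantile = optimize.brentq(lambda w: gmm_cdf(w) - (1 - eps), -200.0, 400.0)
print(f"deterministic bound: w >= {quantile:.1f} MW ensures Pr{{W <= w}} >= {1 - eps}")
```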


Journal ArticleDOI
TL;DR: This work aims to develop a method to derive probabilistic photometric redshifts directly from multi-band imaging data, rendering pre-classification of objects and feature extraction obsolete, and shows that the proposed method is able to predict redshift PDFs independently of the type of source, for example galaxies, quasars or stars.
Abstract: The need to analyze the available large synoptic multi-band surveys drives the development of new data-analysis methods. Photometric redshift estimation is one field of application where such new methods have improved the results substantially. Up to now, the vast majority of applied redshift estimation methods have utilized photometric features. We aim to develop a method to derive probabilistic photometric redshifts directly from multi-band imaging data, rendering pre-classification of objects and feature extraction obsolete. A modified version of a deep convolutional network was combined with a mixture density network. The estimates are expressed as Gaussian mixture models representing the probability density functions (PDFs) in redshift space. In addition to the traditional scores, the continuous ranked probability score (CRPS) and the probability integral transform (PIT) were applied as performance criteria. We adopted a feature-based random forest and a plain mixture density network to compare performances on experiments with data from SDSS (DR9). We show that the proposed method is able to predict redshift PDFs independently of the type of source, for example galaxies, quasars or stars. The prediction performance is better than that of both presented reference methods and is comparable to results from the literature. The presented method is extremely general and allows us to solve any kind of probabilistic regression problem based on imaging data, for example estimating the metallicity or star formation rate of galaxies. This kind of methodology is tremendously important for the next generation of surveys.

107 citations
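
Two of the probabilistic scores named above are easy to evaluate for a Gaussian-mixture photo-z PDF; the sketch below computes the PIT value in closed form and estimates the CRPS by Monte Carlo. The mixture parameters and the true redshift are made up.

```python
# Sketch: CRPS (Monte-Carlo estimate) and PIT for one object whose photo-z PDF
# is a Gaussian mixture; mixture parameters and the true redshift are made up.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
w  = np.array([0.7, 0.3])          # mixture weights
mu = np.array([0.45, 0.60])        # component means (redshift)
sd = np.array([0.03, 0.05])
z_true = 0.48

# PIT: value of the predictive CDF at the true redshift.
pit = np.sum(w * stats.norm.cdf(z_true, loc=mu, scale=sd))

# CRPS via the identity CRPS = E|Z - z_true| - 0.5 * E|Z - Z'|.
comp1 = rng.choice(len(w), size=200_000, p=w)
comp2 = rng.choice(len(w), size=200_000, p=w)
z1 = rng.normal(mu[comp1], sd[comp1])
z2 = rng.normal(mu[comp2], sd[comp2])
crps = np.mean(np.abs(z1 - z_true)) - 0.5 * np.mean(np.abs(z1 - z2))

print(f"PIT = {pit:.3f}, CRPS ~ {crps:.4f}")
```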


Journal ArticleDOI
01 Dec 2017
TL;DR: A statistical anomaly detection approach based on a Gaussian mixture model is proposed for detecting false data injection; load data that change significantly over a day are clustered and detected much more easily than small changes in the loads, and the detector can be trained regularly based on the updated load profile.
Abstract: One of the most addressed attacks in power networks is false data injection (FDI), which affects monitoring, fault detection, and state estimation integrity by tampering with measurement data. To detect such a devastating attack, the authors propose a statistical anomaly detection approach based on a Gaussian mixture model, while some appropriate machine learning approaches are also evaluated for detecting FDI. It should be noted that a finite mixture model is a convex combination of probability density functions; combining the properties of several probability functions makes mixture models capable of approximating any arbitrary distribution. Simulation results confirm the superior performance of the proposed method over conventional bad data detection (BDD) tests and the other learning approaches studied in this article. It should be noted that data which change significantly over a day can be highly clustered and are therefore detected much more easily than small changes in the loads. So, without loss of generality, in the simulations it is assumed that the power demand follows a uniform distribution in a small range. However, the detector can be trained regularly based on the updated load profile.
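
A minimal sketch of the detection idea, under assumptions: a Gaussian mixture is fitted to normal measurement vectors and test vectors with unusually low log-likelihood are flagged. The toy measurements, the number of components and the threshold are all illustrative, not the authors' setup.

```python
# Sketch: Gaussian-mixture anomaly detector. Fit on "normal" measurement
# vectors, then flag test vectors whose log-likelihood falls below a
# percentile threshold of the training scores.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
normal_train = rng.normal(1.0, 0.02, size=(5000, 10))      # toy measurements
normal_test  = rng.normal(1.0, 0.02, size=(500, 10))
attacked     = normal_test.copy()
attacked[:, 3] += 0.15                                      # injected bias

gmm = GaussianMixture(n_components=4, covariance_type="full", random_state=0)
gmm.fit(normal_train)

threshold = np.percentile(gmm.score_samples(normal_train), 1.0)  # ~1% FPR target
print("false alarms  :", np.mean(gmm.score_samples(normal_test) < threshold))
print("detection rate:", np.mean(gmm.score_samples(attacked) < threshold))
```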

Journal ArticleDOI
TL;DR: In this article, a new class of continuous distributions with two extra shape parameters named the generalized odd log-logistic family of distributions was proposed, which can be expressed as a linear combination of exponentiated densities based on the same baseline distribution.
Abstract: We propose a new class of continuous distributions with two extra shape parameters named the generalized odd log-logistic family of distributions. The proposed family contains as special cases the proportional reversed hazard rate and odd log-logistic classes. Its density function can be expressed as a linear combination of exponentiated densities based on the same baseline distribution. Some of its mathematical properties including ordinary moments, quantile and generating functions, two entropy measures and order statistics are obtained. We derive a power series for the quantile function. We discuss the method of maximum likelihood to estimate the model parameters. We study the behaviour of the estimators by means of Monte Carlo simulations. We introduce the log-odd log-logistic Weibull regression model with censored data based on the odd log-logistic-Weibull distribution. The importance of the new family is illustrated using three real data sets. These applications indicate that this family can...
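
For context, the one-parameter odd log-logistic-G construction that the family above generalizes maps a baseline CDF G (with density g) as shown below; the paper's two-parameter form is not reproduced here.

```latex
% One-parameter odd log-logistic-G construction (baseline CDF G, shape alpha > 0);
% the generalized family of the paper adds a second shape parameter.
\[
  F(x;\alpha) \;=\; \frac{G(x)^{\alpha}}{G(x)^{\alpha} + \bigl[1-G(x)\bigr]^{\alpha}},
  \qquad
  f(x;\alpha) \;=\; \frac{\alpha\, g(x)\, G(x)^{\alpha-1}\bigl[1-G(x)\bigr]^{\alpha-1}}
  {\bigl\{G(x)^{\alpha} + \bigl[1-G(x)\bigr]^{\alpha}\bigr\}^{2}} .
\]
```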

Journal ArticleDOI
TL;DR: In this article, the odd Lindley-G family was proposed as a new generator of continuous distributions with one extra positive parameter, which can be expressed as a linear combination of exponentiated densities based on the same baseline distribution and various structural properties of the new family were derived including explicit expressions for the quantile function, ordinary and incomplete moments, generating function, Renyi entropy, reliability, order statistics and their moments and k upper record values.
Abstract: We propose a new generator of continuous distributions with one extra positive parameter called the odd Lindley-G family. Some special cases are presented. The new density function can be expressed as a linear combination of exponentiated densities based on the same baseline distribution. Various structural properties of the new family, which hold for any baseline model, are derived including explicit expressions for the quantile function, ordinary and incomplete moments, generating function, Renyi entropy, reliability, order statistics and their moments and k upper record values. We discuss estimation of the model parameters by maximum likelihood and provide an application to a real data set.

Journal ArticleDOI
TL;DR: In this article, a sparse polynomial chaos expansion is employed to perform a probabilistic analysis of the tunnel face stability in the spatially random soils, and the results show that the spatial variability has an important influence on the probability density function as well as the failure probability, but it has a negligible impact on the Sobol index.
Abstract: The sparse polynomial chaos expansion is employed to perform a probabilistic analysis of the tunnel face stability in spatially random soils. A shield tunnel under compressed air is considered, which implies that the applied pressure is uniformly distributed on the tunnel face. Two sets of failure mechanisms in the context of the limit analysis theory with respect to the frictional and the purely cohesive soils are used to calculate the required face pressure. In the case of the frictional soils, the cohesion and the friction angle are modeled as two anisotropic cross-correlated lognormal random fields; for the purely cohesive soils, the cohesion and the unit weight are modeled as two anisotropic independent lognormal random fields. The influences of the spatial variability and of the cross-correlation between the cohesion and the friction angle on the probability density function of the required face pressure, on the sensitivity index and on the failure probability are discussed. The obtained results show that the spatial variability has an important influence on the probability density function as well as the failure probability, but it has a negligible impact on the Sobol index.
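
One ingredient above that is easy to illustrate is drawing cross-correlated lognormal soil parameters; the sketch below uses plain random variables rather than full spatial random fields, and the means, COVs and correlation value are illustrative.

```python
# Sketch: draw cross-correlated lognormal cohesion c and friction angle phi
# by correlating the underlying Gaussian variables with a Cholesky factor.
import numpy as np

rng = np.random.default_rng(5)
mean_c, cov_c = 20.0, 0.25        # cohesion: mean 20 kPa, COV 25% (illustrative)
mean_phi, cov_phi = 30.0, 0.10    # friction angle: mean 30 deg, COV 10%
rho = -0.5                        # cross-correlation of the underlying Gaussians

def lognorm_params(mean, cov):
    sigma = np.sqrt(np.log(1.0 + cov**2))
    mu = np.log(mean) - 0.5 * sigma**2
    return mu, sigma

mu_c, s_c = lognorm_params(mean_c, cov_c)
mu_p, s_p = lognorm_params(mean_phi, cov_phi)

L = np.linalg.cholesky(np.array([[1.0, rho], [rho, 1.0]]))
z = rng.standard_normal((100_000, 2)) @ L.T
c   = np.exp(mu_c + s_c * z[:, 0])
phi = np.exp(mu_p + s_p * z[:, 1])
print("sample means:", c.mean().round(2), phi.mean().round(2),
      "sample correlation:", np.corrcoef(c, phi)[0, 1].round(2))
```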

Journal ArticleDOI
TL;DR: A way to infer distributions of any performance indicator computed from the confusion matrix, based on a Bayesian approach in which the unknown parameters of the multinomial probability function themselves are assumed to be generated from a random vector.
Abstract: We propose a way to infer distributions of any performance indicator computed from the confusion matrix. This allows us to evaluate the variability of an indicator and to assess the importance of an observed difference between two performance indicators. We will assume that the values in a confusion matrix are observations coming from a multinomial distribution. Our method is based on a Bayesian approach in which the unknown parameters of the multinomial probability function themselves are assumed to be generated from a random vector. We will show that these unknown parameters follow a Dirichlet distribution. Thanks to the Bayesian approach, we also benefit from an elegant way of injecting prior knowledge into the distributions. Experiments are done on real and synthetic data sets and assess our method’s ability to construct accurate distributions.
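
A compact sketch of the idea, under stated assumptions: treat the confusion-matrix counts as multinomial, place a uniform Dirichlet prior on the cell probabilities (the paper allows more general prior knowledge), sample the Dirichlet posterior, and read off the induced distribution of an indicator such as accuracy.

```python
# Sketch: posterior distribution of accuracy from a 2x2 confusion matrix
# via Dirichlet sampling (uniform Dirichlet(1,1,1,1) prior assumed).
import numpy as np

rng = np.random.default_rng(11)
confusion = np.array([[85, 15],     # rows: true class, cols: predicted class
                      [10, 90]])
counts = confusion.ravel()

posterior = rng.dirichlet(counts + 1.0, size=50_000)   # sampled cell probabilities
accuracy = posterior[:, 0] + posterior[:, 3]           # p(correct class 0) + p(correct class 1)

lo, hi = np.percentile(accuracy, [2.5, 97.5])
print(f"posterior mean accuracy {accuracy.mean():.3f}, 95% interval [{lo:.3f}, {hi:.3f}]")
```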

Journal ArticleDOI
TL;DR: The ability of radial basis function artificial neural networks for nonlinear mapping is exploited to solve the nonlinear equation set of power-flow analysis with an acceptable, and in some cases exact, level of accuracy, while the speed of the algorithm is improved.

Journal ArticleDOI
TL;DR: This article investigated the historical developments that gave rise to the Bayes factor for testing a point null hypothesis against a composite alternative and found that the conceptual innovation to assign prior mass to a general law is due to a series of three articles by Dorothy Wrinch and Sir Harold Jeffreys (1919, 1921, 1923a).
Abstract: This article brings attention to some historical developments that gave rise to the Bayes factor for testing a point null hypothesis against a composite alternative. In line with current thinking, we find that the conceptual innovation—to assign prior mass to a general law—is due to a series of three articles by Dorothy Wrinch and Sir Harold Jeffreys (1919, 1921, 1923a). However, our historical investigation also suggests that in 1932, J. B. S. Haldane made an important contribution to the development of the Bayes factor by proposing the use of a mixture prior comprising a point mass and a continuous probability density. Jeffreys was aware of Haldane’s work and it may have inspired him to pursue a more concrete statistical implementation for his conceptual ideas. It thus appears that Haldane may have played a much bigger role in the statistical development of the Bayes factor than has hitherto been assumed.

Journal ArticleDOI
TL;DR: In this paper, application of the two-parameter Birnbaum-Saunders (BS) distribution is introduced and reviewed for characterizing the wind speed and wind power density distributions, and the suitability of the BS distribution is evaluated against nine previously used one-component distributions.
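
SciPy exposes the Birnbaum-Saunders distribution as fatiguelife, so a fitting sketch is straightforward; the wind-speed data below are synthetic stand-ins, not the study's measured series.

```python
# Sketch: fit a Birnbaum-Saunders (scipy "fatiguelife") distribution to wind
# speeds and evaluate its PDF; data here are synthetic Weibull draws.
import numpy as np
from scipy import stats

wind = stats.weibull_min(c=2.0, scale=7.0).rvs(size=5000, random_state=2)

shape, loc, scale = stats.fatiguelife.fit(wind, floc=0.0)   # fix location at 0
print(f"fitted BS: shape={shape:.3f}, scale={scale:.3f}")

v = np.linspace(0.1, 25, 250)
pdf = stats.fatiguelife.pdf(v, shape, loc=0.0, scale=scale)
print("PDF integrates to ~1:", np.trapz(pdf, v).round(3))
```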

Journal ArticleDOI
TL;DR: The typicality, which is derived directly from data and stated in a discrete form, is considered a fundamental quantity in pattern analysis, in contrast to the traditional approach where a continuous pdf is assumed a priori and estimated from data afterward.
Abstract: In this paper, we propose an approach to data analysis, which is based entirely on the empirical observations of discrete data samples and the relative proximity of these points in the data space. At the core of the proposed new approach is the typicality—an empirically derived quantity that resembles probability. This nonparametric measure is a normalized form of the square centrality (centrality is a measure of closeness used in graph theory). It is also closely linked to the cumulative proximity and eccentricity (a measure of the tail of the distributions that is very useful for anomaly detection and analysis of extreme values). In this paper, we introduce and study two types of typicality, namely its local and global versions. The local typicality resembles the well-known probability density function (pdf), probability mass function, and fuzzy set membership but differs from all of them. The global typicality, on the other hand, resembles well-known histograms but also differs from them. A distinctive feature of the proposed new approach, empirical data analysis (EDA), is that it is not limited by restrictive impractical prior assumptions about the data generation model as the traditional probability theory and statistical learning approaches are. Moreover, it does not require an explicit and binary assumption of either randomness or determinism of the empirically observed data, their independence, or even their number (it can be as low as a couple of data samples). The typicality is considered a fundamental quantity in pattern analysis, derived directly from data and stated in a discrete form, in contrast to the traditional approach where a continuous pdf is assumed a priori and estimated from data afterward. The typicality introduced in this paper is free from the paradoxes of the pdf. Typicality is objectivist while the fuzzy sets and the belief-based branch of the probability theory are subjectivist. The local typicality is expressed in a closed analytical form and can be calculated recursively and, thus, computationally very efficiently. The other nonparametric ensemble properties of the data introduced and studied in this paper, namely, the square centrality, cumulative proximity, and eccentricity, can also be updated recursively for various types of distance metrics. Finally, a new type of classifier called naive typicality-based EDA class is introduced, which is based on the newly introduced global typicality. This is only one of a wide range of possible applications of EDA, including but not limited to anomaly detection, clustering, classification, prediction, control, and rare-event analysis, which will be the subject of further research.
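
As one plausible reading of the quantities named above (these are not necessarily the authors' exact formulas), cumulative proximity, eccentricity and a normalized typicality can be sketched as follows.

```python
# Sketch (one plausible reading, NOT the paper's exact formulas): cumulative
# proximity, eccentricity, and a typicality normalized so it sums to one.
import numpy as np

rng = np.random.default_rng(4)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(6, 1, (5, 2))])  # 5 outliers

# Pairwise squared Euclidean distances.
d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)

cum_proximity = d2.sum(axis=1)                        # sum of squared distances
eccentricity = 2.0 * cum_proximity / cum_proximity.sum()
typicality = (1.0 / cum_proximity) / np.sum(1.0 / cum_proximity)

print("most typical point index  :", int(np.argmax(typicality)))
print("most eccentric point index:", int(np.argmax(eccentricity)))
```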

Journal ArticleDOI
TL;DR: In this article, a novel computational approach, namely the extended unified interval stochastic sampling (X-UISS) method, is proposed to calculate the statistical characteristics (i.e., mean and standard deviation) of the extreme bounds of the concerned responses (e.g., displacement and stress) of engineering structures involving hybrid spatially dependent uncertainties.

Journal ArticleDOI
TL;DR: In this article, a triangulation of laser-leaf intersection points recorded by the LiDAR scan is used to obtain a probability density function for leaf orientation from triangle normal vectors.
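
The geometric core of that pipeline can be sketched with made-up triangles: each triangle's unit normal gives a leaf inclination angle, and the collection of angles yields an empirical orientation PDF.

```python
# Sketch: leaf inclination angles from triangle normals (cross product of two
# edge vectors); a histogram of the angles approximates the orientation PDF.
import numpy as np

rng = np.random.default_rng(9)
# Hypothetical triangles: (n_triangles, 3 vertices, xyz).
tri = rng.random((1000, 3, 3))

e1 = tri[:, 1] - tri[:, 0]
e2 = tri[:, 2] - tri[:, 0]
normals = np.cross(e1, e2)
normals /= np.linalg.norm(normals, axis=1, keepdims=True)

# Inclination from horizontal leaf: angle between the normal and the z-axis.
incl = np.degrees(np.arccos(np.clip(np.abs(normals[:, 2]), 0.0, 1.0)))
pdf, edges = np.histogram(incl, bins=18, range=(0, 90), density=True)
print("mean leaf inclination (deg):", incl.mean().round(1))
```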

Journal ArticleDOI
TL;DR: The results show that the proposed model significantly outperforms both reference models in terms of all evaluation metrics for all locations when the forecast horizon is greater than 5 min, and shows superior performance in predicting DNI ramps.

Journal ArticleDOI
TL;DR: In this article, the authors present METAPHOR (Machine-learning Estimation Tool for Accurate PHOtometric Redshifts), a method designed to provide a reliable PDF of the error distribution for empirical techniques.
Abstract: A variety of fundamental astrophysical science topics require the determination of very accurate photometric redshifts (photo-z). A plethora of methods has been developed, based either on template model fitting or on empirical explorations of the photometric parameter space. Machine-learning-based techniques are not explicitly dependent on physical priors and are able to produce accurate photo-z estimations within the photometric ranges derived from the spectroscopic training set. These estimates, however, are not easy to characterize in terms of a photo-z probability density function (PDF), due to the fact that the analytical relation mapping the photometric parameters onto the redshift space is virtually unknown. We present METAPHOR (Machine-learning Estimation Tool for Accurate PHOtometric Redshifts), a method designed to provide a reliable PDF of the error distribution for empirical techniques. The method is implemented as a modular workflow, whose internal engine for photo-z estimation makes use of the MLPQNA neural network (Multi Layer Perceptron with Quasi Newton learning rule), with the possibility of easily replacing the specific machine-learning model chosen to predict photo-z. We present a summary of results on SDSS-DR9 galaxy data, also used to perform a direct comparison with PDFs obtained by the Le Phare spectral energy distribution template fitting. We show that METAPHOR is capable of estimating the precision and reliability of photometric redshifts obtained with three different self-adaptive techniques, i.e. MLPQNA, Random Forest and the standard K-Nearest Neighbors models.

Journal ArticleDOI
01 May 2017
TL;DR: In this paper, the effect of uncertainty in the allocation of photovoltaic (PV) generation, solar irradiance, and its impact on the power flow in a distribution network is investigated.
Abstract: This paper investigates the effect of uncertainty in the allocation of photovoltaic (PV) generation, solar irradiance, and its impact on the power flow in a distribution network. The solar irradiance available in the National Renewable Energy Laboratory Resource Data Center is clustered into two states, high and low irradiance, defined by a threshold. The uncertainty is modeled based on a non-Gaussian distribution, obtained using kernel density estimation. This estimation yields the probability density function and cumulative distribution function of the solar irradiance. Moreover, the load demand, wind speed, and generator location are modeled according to Gaussian, Weibull, and discrete uniform distribution functions, respectively. As a part of the probabilistic power flow, the backward/forward sweep method is used to solve each scenario of the Monte Carlo simulation. The proposed framework is applied to the 33-node test system considering three different test cases. The first case considers deployment of PV systems in three microgrids of the electric grid, and the other two test cases analyze different levels of penetration of randomly allocated PV and wind power systems. The results indicate potential reverse power flow through certain branches of the grid and show that the renewables have a major impact on the system.
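
The uncertainty-modeling step can be sketched as follows: a kernel density estimate of irradiance gives a nonparametric PDF/CDF from which Monte-Carlo scenarios are drawn. The irradiance data and PV conversion model are illustrative, and the backward/forward sweep solver itself is omitted.

```python
# Sketch: kernel density estimate of solar irradiance and scenario sampling
# for the Monte-Carlo loop of a probabilistic power flow (the backward/forward
# sweep solver itself is omitted; data and the PV model are illustrative).
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
irradiance_obs = np.clip(rng.normal(750, 120, 2000), 0, None)   # synthetic W/m^2

kde = stats.gaussian_kde(irradiance_obs)
grid = np.linspace(0, 1200, 600)
cdf = np.cumsum(kde(grid)) * (grid[1] - grid[0])

scenarios = kde.resample(500, seed=1).ravel()        # Monte-Carlo irradiance draws
pv_power_kw = 0.18 * 50.0 * scenarios / 1000.0       # toy PV model: eta * area * G
# Each pv_power_kw[i] would feed one backward/forward sweep power-flow run here.

print("P(G <= 900 W/m^2) from KDE:", np.interp(900, grid, cdf).round(3))
print("scenario PV power (kW): mean", pv_power_kw.mean().round(2),
      "std", pv_power_kw.std().round(2))
```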

Journal ArticleDOI
TL;DR: In this paper, the authors used tracer particles to track the motion of substances in water, as it flows through transparent, 3D synthetic sandstones, and demonstrated that these particle velocity characteristics can be explained and modeled as a continuous time random walk that is both Markovian and mean reverting toward the stationary state.
Abstract: We study the evolution of velocity in time, which fundamentally controls the way dissolved substances are transported and spread in porous media. Experiments are conducted that use tracer particles to track the motion of substances in water, as it flows through transparent, 3-D synthetic sandstones. Particle velocities along streamlines are found to be intermittent and strongly correlated, while their probability density functions are lognormal and nonstationary. We demonstrate that these particle velocity characteristics can be explained and modeled as a continuous time random walk that is both Markovian and mean reverting toward the stationary state. Our model accurately captures the fine-scale velocity fluctuations observed in each tested sandstone, as well as their respective dispersion regime progression from initially ballistic, to superdiffusive, and finally Fickian. Model parameterization is based on the correlation length and mean and standard deviation of the velocity distribution, thus linking pore-scale attributes with macroscale transport behavior for both short and long time scales.
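
A toy version of that velocity model, under assumptions: because the velocity PDFs are lognormal, a mean-reverting (Ornstein-Uhlenbeck-type) update is applied to log-velocity, giving correlated velocities that relax toward a lognormal stationary state. Parameters are illustrative, not fitted to the sandstone data.

```python
# Sketch: mean-reverting random walk on log-velocity, giving correlated,
# intermittent velocities with an (approximately) lognormal stationary PDF.
import numpy as np

rng = np.random.default_rng(8)
n_particles, n_steps = 20_000, 500
theta, mu, sigma = 0.05, 0.0, 1.0     # reversion rate, stationary mean/std of ln v

logv = rng.normal(mu, sigma, n_particles)        # start in the stationary state
x = np.zeros(n_particles)
for _ in range(n_steps):
    logv += theta * (mu - logv) + sigma * np.sqrt(2 * theta) * rng.standard_normal(n_particles)
    x += np.exp(logv)                            # advance particles by v * dt (dt = 1)

print("ln v sample mean/std:", logv.mean().round(2), logv.std().round(2))
print("plume spread (std of x):", x.std().round(1))
```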

Journal ArticleDOI
TL;DR: In this paper, a probabilistic model was developed to select representative and realistic track irregularity sets from large amounts of data with higher efficiency and accuracy, which can be used to reveal the interaction mechanisms between the moving vehicles and the guiding tracks.

Journal ArticleDOI
TL;DR: The main idea of the proposed approach is that the DFA problem is modeled as a nonlinear function of a set of probability distribution functions, and a linear feedback iteration scheme is then proposed to solve the nonlinear function, leading to a group judgment or decision.
Abstract: In group decision making, it is inevitable that the individual decision maker's subjectivity is involved, which causes difficulty in reaching a group decision. One of the difficulties is to aggregate a small set of expert opinions with the individual subjectivity or uncertainty modeled with probability theory. This difficult problem is called probability distribution function aggregation (DFA). This paper presents a simple and efficient approach to the DFA problem. The main idea of the proposed approach is that the DFA problem is modeled as a nonlinear function of a set of probability distribution functions, and then a linear feedback iteration scheme is proposed to solve the nonlinear function, leading to a group judgment or decision. Illustration of this new approach is given by a well-known DFA example which was solved with the Delphi method. The DFA problem is a part of the group decision problem; therefore, the proposed algorithm is also useful for the decision making problem in general. Another contribution of this paper is a proposed notation for systematically representing the imprecise group decision problem, with the classification of imprecise information into three classes, namely incomplete information, vague information, and uncertain information. The particular DFA problem dealt with in this paper is then characterized with this general notation.

Journal ArticleDOI
TL;DR: In this article, the authors derived PDFs of the galaxy and projected matter density distributions via the Counts in Cells (CiC) method and used them to test whether the underlying density contrast is well described by a lognormal distribution for the galaxies, the convergence and their joint PDF.
Abstract: It is well known that the probability distribution function (PDF) of galaxy density contrast is approximately lognormal; whether the PDF of mass fluctuations derived from weak lensing convergence ($\kappa_{\rm WL}$) is lognormal is less well established. We derive PDFs of the galaxy and projected matter density distributions via the Counts in Cells (CiC) method. We use maps of galaxies and weak lensing convergence produced from the Dark Energy Survey (DES) Science Verification data over 139 deg$^2$. We test whether the underlying density contrast is well described by a lognormal distribution for the galaxies, the convergence and their joint PDF. We confirm that the galaxy density contrast distribution is well modeled by a lognormal PDF convolved with Poisson noise at angular scales from 10' to 40' (corresponding to physical scales of 3-10 Mpc). We note that as $\kappa_{\rm WL}$ is a weighted sum of the mass fluctuations along the line of sight, its PDF is expected to be only approximately lognormal. We find that the $\kappa_{\rm WL}$ distribution is well modeled by a lognormal PDF convolved with Gaussian shape noise at scales between 10' and 20', with a best-fit $\chi^2$/DOF of 1.11 compared to 1.84 for a Gaussian model, corresponding to p-values 0.35 and 0.07 respectively, at a scale of 10'. Above 20' a simple Gaussian model is sufficient. The joint PDF is also reasonably fitted by a bivariate lognormal. As a consistency check we compare the variances derived from the lognormal modelling with those directly measured via CiC. Our methods are validated against maps from the MICE Grand Challenge N-body simulation.
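
The galaxy-side model quoted above (lognormal density contrast convolved with Poisson shot noise) can be written down numerically; here is a hedged sketch with made-up cell parameters.

```python
# Sketch: counts-in-cells PMF for a lognormal density contrast convolved with
# Poisson noise, P(N) = int Poisson(N | Nbar*(1+delta)) p_LN(delta) d(delta).
import numpy as np
from scipy import stats

nbar = 20.0          # mean galaxy count per cell (illustrative)
sigma_g = 0.5        # std of the underlying Gaussian field g, with delta = e^g - 1

# Enforce <delta> = 0 by centering the Gaussian at -sigma^2/2.
mu_g = -0.5 * sigma_g**2
g = np.linspace(mu_g - 5 * sigma_g, mu_g + 5 * sigma_g, 4001)
w = stats.norm.pdf(g, loc=mu_g, scale=sigma_g)
lam = nbar * np.exp(g)                     # local mean count, Nbar*(1+delta)

N = np.arange(0, 201)
pmf = np.trapz(stats.poisson.pmf(N[:, None], lam[None, :]) * w[None, :], g, axis=1)
print("PMF sums to ~1:", pmf.sum().round(4), " mean count ~", (N * pmf).sum().round(2))
```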

Journal ArticleDOI
TL;DR: In this paper, the Laguerre and Jacobi versions of the Muttalib-Borodin ensembles, named for the form of their one-body interaction terms, are realised as the eigenvalue PDF of certain random matrices with Gaussian entries.
Abstract: Muttalib–Borodin ensembles are characterised by the pair interaction term in the eigenvalue probability density function being of the form $\prod_{1 \le j < k \le N} |x_k - x_j|\,|x_k^{\theta} - x_j^{\theta}|$. We study the Laguerre and Jacobi versions of this model — so named by the form of the one-body interaction terms — and show that for θ ∈ ℤ+ they can be realised as the eigenvalue PDF of certain random matrices with Gaussian entries. For general θ > 0, realisations in terms of the eigenvalue PDF of ensembles involving triangular matrices are given. In the Laguerre case this is a recent result due to Cheliotis, although our derivation is different. We make use of a generalisation of a double contour integral formula for the correlation functions contained in a paper by Adler, van Moerbeke and Wang to analyse the global density (which we also analyse by studying characteristic polynomials), and the hard edge scaled correlation functions. For the global density, functional equations for the corresponding resolvents are obtained; solving these gives the moments in terms of Fuss–Catalan numbers (Laguerre case — a known result) and particular binomial coefficients (Jacobi case). For θ ∈ ℤ+ the Laguerre and Jacobi cases are closely related to the squared singular values for products of θ standard Gaussian random matrices, and truncations of unitary matrices, respectively. At the hard edge the double contour integral formulas provide a double contour integral form of the scaled correlation kernel obtained by Borodin in terms of Wright's Bessel function.
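
The Laguerre-case moment statement can be checked numerically: for θ a positive integer, the n-th moment of the squared singular values of a (suitably normalized) product of θ standard Gaussian matrices approaches the Fuss-Catalan number FC_θ(n) = C((θ+1)n, n)/(θn+1). A small Monte-Carlo sketch:

```python
# Sketch: check that moments of squared singular values of a product of
# theta Gaussian matrices approach the Fuss-Catalan numbers
# FC_theta(n) = C((theta+1)*n, n) / (theta*n + 1).
import numpy as np
from math import comb

rng = np.random.default_rng(12)
theta, N = 2, 400

P = np.eye(N)
for _ in range(theta):
    P = P @ (rng.standard_normal((N, N)) / np.sqrt(N))   # normalized Gaussian factor

sq_sv = np.linalg.svd(P, compute_uv=False) ** 2
for n in (1, 2, 3):
    empirical = np.mean(sq_sv ** n)
    fuss_catalan = comb((theta + 1) * n, n) / (theta * n + 1)
    print(f"n={n}: empirical {empirical:.3f}  vs  FC_theta(n) = {fuss_catalan}")
```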