
Showing papers on "Cumulative distribution function published in 2010"


Journal ArticleDOI
TL;DR: It is shown analytically that the maximal rate achievable with error probability ε is closely approximated by C − √(V/n) Q⁻¹(ε), where C is the capacity, V is a characteristic of the channel referred to as channel dispersion, and Q is the complementary Gaussian cumulative distribution function.
Abstract: This paper investigates the maximal channel coding rate achievable at a given blocklength and error probability. For general classes of channels new achievability and converse bounds are given, which are tighter than existing bounds for wide ranges of parameters of interest, and lead to tight approximations of the maximal achievable rate for blocklengths n as short as 100. It is also shown analytically that the maximal rate achievable with error probability ε is closely approximated by C − √(V/n) Q⁻¹(ε), where C is the capacity, V is a characteristic of the channel referred to as channel dispersion, and Q is the complementary Gaussian cumulative distribution function.
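
The normal approximation above is straightforward to evaluate numerically. Below is a minimal sketch for a binary symmetric channel, using the standard expressions for its capacity and dispersion; the choice of channel and the values of p, n, and ε are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from scipy.stats import norm

def bsc_normal_approximation(p, n, eps):
    """Approximate maximal rate (bits/use) of a BSC(p) at blocklength n, error eps."""
    h2 = -p * np.log2(p) - (1 - p) * np.log2(1 - p)     # binary entropy
    C = 1 - h2                                          # capacity
    V = p * (1 - p) * np.log2((1 - p) / p) ** 2         # channel dispersion
    return C - np.sqrt(V / n) * norm.isf(eps)           # Q^{-1}(eps) = norm.isf(eps)

print(bsc_normal_approximation(p=0.11, n=1000, eps=1e-3))
```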

3,242 citations


Journal ArticleDOI
TL;DR: In this paper, a new analytical approach for the derivation of fragility curves for masonry buildings is proposed, based on nonlinear stochastic analyses of building prototypes, where the mechanical properties of the prototypes are considered as random variables, assumed to vary within appropriate ranges of values.

218 citations


Journal ArticleDOI
TL;DR: This article discusses the deficiencies of the available methods and proposes a Fuzzy Monte Carlo Simulation (FMCS) framework for risk analysis of construction projects, in which a fuzzy cumulative distribution function is constructed as a novel way to represent uncertainty.
Abstract: Monte Carlo simulation has been used extensively for addressing probabilistic uncertainty in range estimating for construction projects. However, subjective and linguistically expressed information results in added non-probabilistic uncertainty in construction management. Fuzzy logic has been used successfully for representing such uncertainties in construction projects. In practice, an approach that can handle both random and fuzzy uncertainties in a risk assessment model is necessary. This article discusses the deficiencies of the available methods and proposes a Fuzzy Monte Carlo Simulation (FMCS) framework for risk analysis of construction projects. In this framework, a fuzzy cumulative distribution function is constructed as a novel way to represent uncertainty. To verify the feasibility of the FMCS framework and demonstrate its main features, the authors have developed a special purpose simulation template for cost range estimating. This template is employed to estimate the cost of a highway overpass project.
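
As a rough illustration of how a fuzzy CDF band can be produced by combining Monte Carlo sampling with alpha-cuts, here is a hedged sketch; the cost items, their distributions, and the triangular fuzzy number are invented placeholders, not the template or data from the article.

```python
import numpy as np

rng = np.random.default_rng(0)

def total_cost(fuzzy_item_value, n=20_000):
    """Monte Carlo samples of total cost: two probabilistic items plus one fuzzy item."""
    earthwork = rng.normal(1.0e6, 1.0e5, n)
    structure = rng.triangular(2.0e6, 2.4e6, 3.0e6, n)
    return earthwork + structure + fuzzy_item_value

def fuzzy_cdf_band(alpha, low=3.0e5, mode=4.0e5, high=6.0e5):
    """At membership level alpha, return sorted cost samples for the alpha-cut bounds."""
    lo = low + alpha * (mode - low)          # lower bound of the triangular alpha-cut
    hi = high - alpha * (high - mode)        # upper bound of the triangular alpha-cut
    return np.sort(total_cost(lo)), np.sort(total_cost(hi))

lower, upper = fuzzy_cdf_band(alpha=0.5)
k = int(0.8 * len(lower))                    # e.g. the 80th-percentile cost range
print(lower[k], upper[k])
```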

179 citations


Journal ArticleDOI
TL;DR: In this paper, the authors present a closed-form expression for the Poisson-binomial probability density function (pdf), which describes the number of successes in N independent trials, when the individual probabilities of success vary across trials.
Abstract: The Poisson-binomial probability density function (pdf) describes the number of successes in N independent trials, when the individual probabilities of success vary across trials. Its use is pervasive in applications, such as fault tolerance, signal detection, target tracking, object classification/identification, multi-sensor data fusion, system management, and performance characterization, among others. We present a closed-form expression for this pdf, and we discuss several of its advantages regarding computing speed and implementation and in simplifying analysis, with examples of the latter including the computation of moments and the development of new trigonometric identities for the binomial coefficient and the binomial cumulative distribution function (cdf). Finally, we also pose and address the inverse Poisson-binomial problem; that is, given such a pdf, how to find (within a permutation) the probabilities of success of the individual trials.
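
One standard way to compute the Poisson-binomial pmf exactly is through the discrete Fourier transform of its characteristic function; the sketch below follows that route and is offered as an illustration, not necessarily the paper's exact closed-form expression.

```python
import numpy as np

def poisson_binomial_pmf(p):
    """pmf of the number of successes in N independent trials with probabilities p."""
    p = np.asarray(p, dtype=float)
    L = len(p) + 1
    w = np.exp(2j * np.pi * np.arange(L) / L)            # points on the unit circle
    # characteristic function evaluated at each point
    cf = np.prod(1.0 - p[None, :] + p[None, :] * w[:, None], axis=1)
    pmf = np.real(np.fft.fft(cf)) / L                    # inverse transform
    return np.clip(pmf, 0.0, 1.0)

pmf = poisson_binomial_pmf([0.1, 0.4, 0.75])
print(pmf, pmf.sum())                                    # probabilities sum to 1
```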

179 citations


Journal ArticleDOI
TL;DR: In this paper, the authors investigated nonparametric estimation of some functionals of the conditional distribution of a scalar response variable Y given a random variable X taking values in a semi-metric space.

144 citations


Journal ArticleDOI
TL;DR: This paper derives novel closed-form expressions for the moment-generating function, the probability density function, and the cumulative distribution function of the product of rational powers of statistically independent squared gamma-gamma random variables for the end-to-end signal-to-noise ratio for multihop free-space optical wireless systems.
Abstract: In this paper, a study on the end-to-end performance of multihop free-space optical wireless systems over turbulence-induced fading channels, modeled by the gamma-gamma distribution, is presented. Our analysis is carried out for systems employing amplify-and-forward channel-state-information-assisted or fixed-gain relays. To assess the statistical properties of the end-to-end signal-to-noise ratio for both considered systems, we derive novel closed-form expressions for the moment-generating function, the probability density function, and the cumulative distribution function of the product of rational powers of statistically independent squared gamma-gamma random variables. These statistical results are then applied to studying the outage probability and the average bit error probability of binary modulation schemes. Also, for the case of channel-state-information-assisted relays, an accurate asymptotic performance analysis at high SNR values is presented. Numerical examples compare analytical and simulation results, verifying the correctness of the proposed mathematical analysis.
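
A quick way to sanity-check such statistics is Monte Carlo. The sketch below estimates the empirical CDF of a product of squared gamma-gamma variates for a two-hop link; the per-hop turbulence parameters and threshold are assumed values, and the code does not reproduce the paper's closed-form expressions.

```python
import numpy as np

rng = np.random.default_rng(1)

def gamma_gamma(alpha, beta, n):
    """Unit-mean gamma-gamma samples: product of two independent gamma variates."""
    return rng.gamma(alpha, 1.0 / alpha, n) * rng.gamma(beta, 1.0 / beta, n)

def product_cdf(threshold, hops=((4.2, 1.4), (4.0, 1.9)), n=200_000):
    """Empirical CDF of the product of squared gamma-gamma variates over the hops."""
    prod = np.ones(n)
    for a, b in hops:
        prod *= gamma_gamma(a, b, n) ** 2
    return np.mean(prod <= threshold)

print(product_cdf(threshold=0.1))
```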

138 citations


Journal ArticleDOI
TL;DR: In this article, a moment-independent importance measure of the basic random variable is proposed, and its properties are analyzed and verified based on this work, the importance measure is compared with that on the distribution density of the response by use of the probability density evolution method, which can efficiently avoid the difficulty in solving the importance measures.
Abstract: To analyze the effect of basic variable on failure probability in reliability analysis, a moment-independent importance measure of the basic random variable is proposed, and its properties are analyzed and verified Based on this work, the importance measure of the basic variable on the failure probability is compared with that on the distribution density of the response By use of the probability density evolution method, a solution is established to solve two importance measures, which can efficiently avoid the difficulty in solving the importance measures Some numerical examples and engineering examples are used to demonstrate the proposed importance measure on the failure probability and that on the distribution density of the response The results show that the proposed importance measure can effectively describe the effect of the basic variable on the failure probability from the distribution density of the basic variable Additionally, the results show that the established solution on the probability density evolution is efficient for the importance measures

119 citations


Journal ArticleDOI
TL;DR: In this paper, a significantly improved probability distribution for the H-test for periodicity in X-ray and γ-ray arrival times, which is already extensively used by the γ-ray pulsar community, is presented.
Abstract: Aims. To provide a significantly improved probability distribution for the H-test for periodicity in X-ray and γ-ray arrival times, which is already extensively used by the γ-ray pulsar community. Also, to obtain an analytical probability distribution for stacked test statistics in the case of a search for pulsed emission from an ensemble of pulsars where the significance per pulsar is relatively low, making individual detections insignificant on their own. This information is timely given the recent rapid discovery of new pulsars with the Fermi-LAT γ-ray telescope. Methods. Approximately 10^14 realisations of the H-statistic (H) for random (white) noise are calculated from a random number generator for which the repetition cycle is >> 10^14. From these numbers the probability distribution P(>H) is calculated. Results. The distribution of H is found to be exponential with parameter λ = 0.4, so that the cumulative probability distribution is P(>H) = exp(−λH). If we stack independent values for H, the sum of K such values would follow the Erlang-K distribution with parameter λ, for which the cumulative probability distribution is also a simple analytical expression. Conclusions. Searches for weak pulsars with unknown pulse profile shapes in the Fermi-LAT, Agile or other X-ray databases should benefit from the H-test, since it is known to be powerful against a broad range of pulse profiles, which introduces only a single statistical trial if only the H-test is used. The new probability distribution presented here favours the detection of weaker pulsars in terms of an improved sensitivity relative to the previously known distribution.
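
The quoted distributions are simple to use in practice. The following sketch evaluates the single-test tail probability P(>H) = exp(−0.4 H) and the Erlang-K tail for a stack of K independent H values; the numeric H values are placeholders.

```python
import numpy as np
from scipy.stats import erlang

LAMBDA = 0.4                                   # exponential parameter quoted above

def h_test_pvalue(H):
    """Tail probability P(>H) for a single H-test under the null hypothesis."""
    return np.exp(-LAMBDA * H)

def stacked_h_pvalue(H_sum, K):
    """Tail probability for the sum of K independent H values (Erlang-K)."""
    return erlang.sf(H_sum, a=K, scale=1.0 / LAMBDA)

print(h_test_pvalue(25.0))                     # a single candidate pulsar
print(stacked_h_pvalue(120.0, K=10))           # a stack of 10 weak pulsars
```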

118 citations


Journal ArticleDOI
TL;DR: In this paper, the authors developed the notion of density when functional data are considered in the space determined by the eigenfunctions of principal component analysis, which leads to a transparent and meaningful surrogate for density defined in terms of the average value of the logarithms of the densities of the distributions of principal components for a given dimension.
Abstract: The notion of probability density for a random function is not as straightforward as in finite-dimensional cases. While a probability density function generally does not exist for functional data, we show that it is possible to develop the notion of density when functional data are considered in the space determined by the eigenfunctions of principal component analysis. This leads to a transparent and meaningful surrogate for density defined in terms of the average value of the logarithms of the densities of the distributions of principal components for a given dimension. This density approximation is estimable readily from data. It accurately represents, in a monotone way, key features of small-ball approximations to density. Our results on estimators of the densities of principal component scores are also of independent interest; they reveal interesting shape differences that have not previously been considered. The statistical implications of these results and properties are identified and discussed, and practical ramifications are illustrated in numerical work.

117 citations


Journal ArticleDOI
TL;DR: In this paper, the mean-centered first-order second-moment (FOSM) method is employed to perform probabilistic pushover analysis (POA) of structural and/or soil-structure systems.
Abstract: In this paper, the mean-centered first-order second-moment (FOSM) method is employed to perform probabilistic push-over analysis (POA) of structural and/or soil-structure systems. Approximations of first and second statistical moments (FSSMs) of engineering demand parameters (EDPs) of structural and/or geotechnical systems with random material parameters are computed based on finite-element (FE) response and response sensitivity analysis (RSA) results. The FE RSA is performed accurately and efficiently by using the direct differentiation method (DDM) and is employed to evaluate the relative importance (RI) of the various modeling material parameters in influencing the variability of the EDPs. The proposed approximate methodology is illustrated through probabilistic POA results for nonlinear inelastic FE models of: (1) a three-story reinforced-concrete (RC) frame building and (2) a soil-foundation-structure interaction system consisting of a RC frame structure founded on layered soil. FSSMs of EDPs computed through the FOSM method are compared with the corresponding accurate estimates obtained via Monte Carlo simulation. Results obtained from "exact" (or "local") and "averaged" (or "global") response sensitivities are also compared. The RI of the material parameters describing the systems is studied in both the deterministic and probabilistic sense, and presented in the form of tornado diagrams. Effects of statistical correlation between material parameters are also considered and analyzed by the FOSM method. A simple approximation of the probability density function and cumulative distribution function of EDPs due to a single random parameter at a time (while all the other parameters are fixed to their mean values) is also proposed. Conclusions are drawn on both the appropriateness of using local RSA for simplified probabilistic POA and on the application limits of the FOSM method. It is observed that the FOSM method combined with the DDM provides accurate estimates of FSSMs of EDPs for low-to-moderate level of inelastic structural or system behavior and useful qualitative information on the RI ranking of material parameters on the structural or system response for high level of inelastic behavior.
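
To make the FOSM step concrete, here is a minimal sketch in which a generic response function stands in for the finite-element demand parameter and the sensitivities come from finite differences rather than the direct differentiation method used in the paper; the toy response and input statistics are assumptions.

```python
import numpy as np

def fosm_moments(g, mean, cov, h=1e-5):
    """Mean-centered FOSM: first-order estimates of E[g(X)] and Var[g(X)]."""
    mean = np.asarray(mean, dtype=float)
    grad = np.empty_like(mean)
    for i in range(len(mean)):
        step = np.zeros_like(mean)
        step[i] = h * max(1.0, abs(mean[i]))
        grad[i] = (g(mean + step) - g(mean - step)) / (2 * step[i])   # sensitivity
    return g(mean), grad @ cov @ grad            # (mean estimate, variance estimate)

# toy engineering demand parameter: drift ~ load / stiffness (assumed form)
g = lambda x: x[0] / x[1]
m, v = fosm_moments(g, mean=[100.0, 50.0], cov=np.diag([10.0**2, 5.0**2]))
print(m, np.sqrt(v))
```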

99 citations


Journal ArticleDOI
TL;DR: A performance analysis is presented for amplify-and-forward (AF) cooperative relay networks employing transmit antenna diversity with orthogonal space-time block codes (OSTBCs), where multiple antennas are equipped at the transmitter.
Abstract: A performance analysis is presented for amplify-and-forward (AF) cooperative relay networks employing transmit antenna diversity with orthogonal space-time block codes (OSTBCs), where multiple antennas are equipped at the transmitter. We develop a symbol-error-rate (SER) and outage performance analysis for OSTBC transmissions with and without cooperative diversity over flat Rayleigh fading channels. We first derive exact probability density functions (pdf's) and cumulative distribution functions (cdf's) for the system SNR without direct transmission with an arbitrary number of transmit antennas and then present the exact closed-form SER and outage probability expressions. Next, we derive the moment-generating function (MGF) for the overall system SNR with direct transmission and present the exact SER and outage probability with joint transmit antenna diversity and cooperative diversity. The theoretical analysis is validated by simulations, which indicate an exact match between them. The results also show how the transmit antenna diversity and the cooperative diversity affect the overall system performance.

Proceedings ArticleDOI
09 Nov 2010
TL;DR: The most general composite fading distribution is introduced to model the envelope and the power of the received signal in fading channels such as millimeter wave (60 GHz or above) fading channels and free-space optical channels, which is termed the extended generalized-K (EGK) composite fading distribution.
Abstract: In this paper, we introduce the most general composite fading distribution to model the envelope and the power of the received signal in such fading channels as millimeter wave (60 GHz or above) fading channels and free-space optical channels, which we term the extended generalized-K (EGK) composite fading distribution. We obtain the second-order statistics of the received signal envelope characterized by the EGK composite fading distribution. Expressions for the probability density function, cumulative distribution function, level crossing rate and average fade duration, moments, amount of fading and average capacity are derived. Numerical and computer simulation examples validate the accuracy of the presented mathematical analysis.

Journal ArticleDOI
TL;DR: In this paper, the authors propose a general framework for robust small-area estimation, based on representing a small-area estimator as a functional of a predictor of the small-area cumulative distribution function of the characteristic of interest.
Abstract: Small-area estimation techniques have typically relied on plug-in estimation based on models containing random area effects. More recently, regression M-quantiles have been suggested for this purpose, thus avoiding conventional Gaussian assumptions, as well as problems associated with the specification of random effects. However, the plug-in M-quantile estimator for the small-area mean can be shown to be the expected value of this mean with respect to a generally biased estimator of the small-area cumulative distribution function of the characteristic of interest. To correct this problem, we propose a general framework for robust small-area estimation, based on representing a small-area estimator as a functional of a predictor of this small-area cumulative distribution function. Key advantages of this framework are that it naturally leads to integrated estimation of small-area means and quantiles and is not restricted to M-quantile models. We also discuss mean squared error estimation for the resulting estimators, and demonstrate the advantages of our approach through model-based and design-based simulations, with the latter using economic data collected in an Australian farm survey.

Journal ArticleDOI
TL;DR: In this article, a technique for quantifying and removing biases from ensemble forecasts of hydrometeorological and hydrologic variables is described, which makes no a priori assumptions about the distributional form of variables, which is often unknown or difficult to model parametrically.
Abstract: This paper describes a technique for quantifying and removing biases from ensemble forecasts of hydrometeorological and hydrologic variables. The technique makes no a priori assumptions about the distributional form of the variables, which is often unknown or difficult to model parametrically. The aim is to estimate the conditional cumulative distribution function (ccdf) of the observed variable given a (possibly biased) real-time ensemble forecast. This ccdf represents the “true” probability distribution of the forecast variable, subject to sampling uncertainties. In the absence of a known distributional form, the ccdf should be estimated nonparametrically. It is noted that the probability of exceeding a threshold of the observed variable, such as flood stage, is equivalent to the expectation of an indicator variable defined for that threshold. The ccdf is then modeled through a linear combination of the indicator variables of the forecast ensemble members. The technique is based on Bayesian opt...

Journal ArticleDOI
TL;DR: This paper considers orthogonal space-time block coded transmission for a multiple-input multiple-output (MIMO) channel with non-coherent amplify-and-forward (AF) relaying in Rayleigh fading, and shows a reciprocity relationship between the number of antennas at the relay and at the destination.
Abstract: This paper considers orthogonal space-time block coded transmission for a multiple-input multiple-output channel (MIMO) with non-coherent amplify-and-forward (AF) relaying in Rayleigh fading. We first characterize the statistical properties of the instantaneous signal-to-noise ratio (SNR) at the destination, by deriving new exact closed form expressions for the moment generating function, cumulants, and first and second moments. These results show a reciprocity relationship between the number of antennas at the relay and destination. The probability density function and cumulative distribution function of the SNR are also derived for certain system configurations, and for various asymptotic regimes. We then investigate the system performance by presenting new analytical expressions for the symbol error rate, outage probability, amount of fading, as well as the diversity order and array gain. Our results indicate that the proposed scheme can achieve the maximum diversity order of the non-coherent AF MIMO relay channel.

Journal ArticleDOI
Tian Pau Chang
TL;DR: In this paper, the frequency distributions of global radiations are investigated using four kinds of probability density functions, i.e. the Weibull function, logistic function, normal function and lognormal function.
Abstract: The amount of daily irradiation received in a particular area is one of the most important meteorological parameters for many application fields. In this paper, the frequency distributions of global radiation are investigated using four kinds of probability density functions, i.e. the Weibull function, logistic function, normal function and lognormal function. The radiation observed at six meteorological stations in Taiwan is selected as sample data to be analyzed. To evaluate the performance of the probability functions, both the Kolmogorov-Smirnov test and the root mean square error are considered as judgment criteria. The results show that all four probability functions are applicable for stations where weather conditions are relatively steady throughout the year, as in Taichung and Tainan, while for stations revealing a more dispersive distribution, as in Hualien and Taitung, the lognormal function describes the frequency distribution considerably better than the other three functions. On the whole, the lognormal function performs best, followed by the normal function; the Weibull function, widely used in other fields, does not seem appropriate in this case.
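
The model-ranking procedure described above (fit each candidate distribution, then score it with the Kolmogorov-Smirnov statistic and the root mean square error against the empirical CDF) can be sketched as follows; the synthetic data merely stand in for the Taiwanese station records.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
data = rng.lognormal(mean=2.4, sigma=0.5, size=365)      # stand-in daily irradiation

candidates = {
    "weibull":   stats.weibull_min,
    "logistic":  stats.logistic,
    "normal":    stats.norm,
    "lognormal": stats.lognorm,
}

x = np.sort(data)
ecdf = np.arange(1, len(x) + 1) / len(x)
for name, dist in candidates.items():
    params = dist.fit(data)
    ks = stats.kstest(data, dist.cdf, args=params).statistic
    rmse = np.sqrt(np.mean((dist.cdf(x, *params) - ecdf) ** 2))
    print(f"{name:9s} KS={ks:.3f} RMSE={rmse:.3f}")
```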

Journal ArticleDOI
TL;DR: In this article, a new importance measure that characterizes the influence of the entire input distribution on the entire output distribution was proposed, which represents the expected deviation of the cumulative distribution function (CDF) of the model output that would be obtained when one input parameter of interest were known.
Abstract: Uncertainty is an integral part of risk assessment of complex engineering systems, such as nuclear power plants and spacecraft. The aim of sensitivity analysis is to identify the contribution of the uncertainty in model inputs to the uncertainty in the model output. In this study, a new importance measure that characterizes the influence of the entire input distribution on the entire output distribution was proposed. It represents the expected deviation of the cumulative distribution function (CDF) of the model output that would be obtained if one input parameter of interest were known. The applicability of this importance measure was tested with two models, a nonlinear nonmonotonic mathematical model and a risk model. In addition, a comparison of this new importance measure with several other importance measures was carried out and the differences between these measures were explained.
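
A brute-force Monte Carlo version of such a CDF-based importance measure can be written in a few lines: compare the unconditional output CDF with the CDF obtained when one input is fixed, and average the deviation over the fixed values. The test function (Ishigami) and input ranges below are illustrative assumptions, not the models used in the study.

```python
import numpy as np

rng = np.random.default_rng(3)

# Ishigami test function as a stand-in model (assumption, not the paper's models)
model = lambda x1, x2, x3: np.sin(x1) + 7 * np.sin(x2) ** 2 + 0.1 * x3**4 * np.sin(x1)

def cdf_importance(ix, n_outer=200, n_inner=5000, n_grid=200):
    """Expected deviation of the output CDF when input `ix` is fixed (known)."""
    X = rng.uniform(-np.pi, np.pi, (n_inner, 3))
    y0 = np.sort(model(*X.T))                            # unconditional output sample
    grid = np.linspace(y0[0], y0[-1], n_grid)
    F0 = np.searchsorted(y0, grid) / n_inner             # unconditional CDF
    devs = []
    for _ in range(n_outer):
        Xc = rng.uniform(-np.pi, np.pi, (n_inner, 3))
        Xc[:, ix] = rng.uniform(-np.pi, np.pi)           # fix one input at a known value
        Fc = np.searchsorted(np.sort(model(*Xc.T)), grid) / n_inner
        devs.append(np.mean(np.abs(Fc - F0)))            # deviation between the CDFs
    return float(np.mean(devs))

print([round(cdf_importance(i), 4) for i in range(3)])
```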

Journal ArticleDOI
TL;DR: In this article, the performance of antenna array processing in distributed multiple access networks without power control is studied, where the positions of nodes are determined by a Poisson point process and the desired and interfering signals are subject to both path-loss and independent Rayleigh fading.
Abstract: This paper studies the performance of antenna array processing in distributed multiple access networks without power control. The positions of nodes are determined by a Poisson point process. Desired and interfering signals are subject to both path-loss (with an exponent greater than 2) and to independent Rayleigh fading. Using these assumptions, we derive the exact closed form expression for the cumulative distribution function of the output signal-to-interference-plus-noise ratio when optimum combining is applied. This results in a pertinent measure of the network performance in terms of the outage probability, which in turn provides insights into the network capacity gain that could be achieved with antenna array processing. We present and discuss examples of applications, as well as some numerical results.

Journal ArticleDOI
TL;DR: Applications to three practical case studies showed that while the maximum likelihood method can effectively handle low replicate measurements, the density function distance methods are more robust in estimating the parameters with consistently higher accuracy, even for systems showing multimodality.
Abstract: The importance of stochasticity in cellular processes involving low numbers of molecules has resulted in the development of stochastic models such as the chemical master equation. As in other modelling frameworks, the accompanying rate constants are important for end-applications like analyzing system properties (e.g. robustness) or predicting the effects of genetic perturbations. Prior knowledge of kinetic constants is usually limited and the model identification routine typically includes parameter estimation from experimental data. Although the subject of parameter estimation is well-established for deterministic models, it is not yet routine for the chemical master equation. In addition, recent advances in measurement technology have made the quantification of genetic substrates possible to single molecular levels. Thus, the purpose of this work is to develop practical and effective methods for estimating kinetic model parameters in the chemical master equation and other stochastic models from single cell and cell population experimental data. Three parameter estimation methods are proposed based on the maximum likelihood and density function distance, including probability and cumulative density functions. Since stochastic models such as chemical master equations are typically solved using a Monte Carlo approach in which only a finite number of Monte Carlo realizations are computationally practical, specific considerations are given to account for the effect of finite sampling in the histogram binning of the state density functions. Applications to three practical case studies showed that while the maximum likelihood method can effectively handle low replicate measurements, the density function distance methods, particularly the cumulative density function distance estimation, are more robust in estimating the parameters with consistently higher accuracy, even for systems showing multimodality. The parameter estimation methodologies described in this work provide an effective and practical approach for estimating kinetic parameters of stochastic systems from either sparse or dense cell population data. Nevertheless, similar to kinetic parameter estimation in other modelling frameworks, not all parameters can be estimated accurately, which is a common problem arising from the lack of complete parameter identifiability from the available data.

Proceedings ArticleDOI
01 Sep 2010
TL;DR: This work develops general models for multi-cell signal-to-noise-plus-interference ratio (SINR) based on homogeneous Poisson point processes and derives the coverage probability, which is one minus the outage probability.
Abstract: Cellular networks are usually modeled by placing the base stations according to a regular geometry such as a grid, with the mobile users scattered around the network either as a Poisson point process (i.e. uniform distribution) or deterministically. These models have been used extensively for cellular design and analysis but suffer from being both highly idealized and not very tractable. Thus, complex simulations are used to evaluate key metrics such as coverage probability for a specified target rate (equivalently, the outage probability) or average/sum rate. We develop general models for multi-cell signal-to-noise-plus-interference ratio (SINR) based on homogeneous Poisson point processes and derive the coverage probability, which is one minus the outage probability. Under very general assumptions, the resulting expressions for the SINR cumulative distribution function involve quickly computable integrals, and in some important special cases of practical interest these integrals can be simplified to common integrals (e.g., the Q-function) or even to exact and quite simple closed-form expressions. We compare our coverage predictions to the standard grid model and an actual base station deployment. We observe that the proposed model is pessimistic (a lower bound on coverage) whereas the grid model is optimistic. In addition to being more tractable, the proposed model may better capture the increasingly opportunistic and dense placement of base stations in urban cellular networks with highly variable coverage radii.
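
Before using the analytical expressions, the coverage probability under this model can be checked by direct simulation of the Poisson point process; the sketch below does so for nearest-base-station association with Rayleigh fading, with the density, path-loss exponent, noise power, and simulation window chosen arbitrarily for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)

def coverage_probability(sinr_db, density=1e-6, alpha=4.0, noise=1e-14,
                         radius=10_000.0, trials=2000):
    """P(SINR > threshold) for a user at the origin served by its nearest BS."""
    thr = 10 ** (sinr_db / 10)
    covered = 0
    for _ in range(trials):
        n_bs = rng.poisson(density * np.pi * radius**2)   # PPP in a disc
        if n_bs == 0:
            continue
        r = radius * np.sqrt(rng.uniform(size=n_bs))      # distances to the user
        h = rng.exponential(size=n_bs)                     # Rayleigh fading powers
        rx = h * r ** (-alpha)                             # unit transmit power
        k = np.argmin(r)                                   # nearest-BS association
        sinr = rx[k] / (rx.sum() - rx[k] + noise)
        covered += sinr > thr
    return covered / trials

print(coverage_probability(sinr_db=0.0))
```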

Proceedings ArticleDOI
01 Dec 2010
TL;DR: This paper shows that the symbol error rate (SER) of a flat fading communications system can be expressed in closed form by modeling the demodulator outputs as random variables (RVs) that have a complex ratio distribution, i.e., the distribution of the ratio of two correlated complex Gaussian RVs.
Abstract: Communications systems rarely have perfect channel state information (PCSI) when demodulating received symbols. This paper shows that the symbol error rate (SER) of a flat fading communications system can be expressed in closed form by modeling the demodulator outputs as random variables (RVs) that have a complex ratio distribution, which is the distribution of the ratio of two correlated complex Gaussian RVs. To complete the analysis, the complex ratio probability density function (PDF) and cumulative distribution function (CDF) are both derived. Finally, using several scenarios based on M-QAM signaling, the SER performance of imperfect channel state information (ICSI) systems is analyzed.
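
For intuition about the complex ratio distribution, the sketch below draws correlated circular complex Gaussian pairs and estimates the CDF of the magnitude of their ratio empirically; the correlation value and unit variances are assumptions, and no SER expression from the paper is reproduced.

```python
import numpy as np

rng = np.random.default_rng(5)

def complex_ratio_samples(rho=0.9, n=100_000):
    """Samples of Z = X / Y with X, Y correlated circular complex Gaussians."""
    def cgauss(n):
        return (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)
    w1, w2 = cgauss(n), cgauss(n)
    x = w1
    y = rho * w1 + np.sqrt(1 - rho**2) * w2   # correlation rho with x
    return x / y

z = np.abs(complex_ratio_samples())
for t in (0.5, 1.0, 2.0):
    print(f"P(|Z| <= {t}) ~ {np.mean(z <= t):.3f}")
```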

Journal ArticleDOI
TL;DR: In this article, it was shown that if a probability density function is geometrically concave (convex), then the corresponding cumulative distribution function and the survival function are concave too, under some assumptions.

Journal ArticleDOI
TL;DR: The performance of dual-hop Decode-and-Forward relaying with relay selection (RS) is analyzed over Nakagami-m fading channels, and results show that, although relaying is always beneficial in terms of ASEP, in terms of OP it should be disabled whenever the direct link is strong.
Abstract: The performance of dual-hop Decode-and-Forward relaying with relay selection (RS) is analyzed over Nakagami-m fading channels. Assuming that the direct source-to-destination link is active, closed-form expressions for the moment generating and the cumulative distribution functions of a RS-based cooperation scheme that utilizes maximal-ratio diversity at the destination are derived. These expressions are used to obtain the outage probability (OP) and average symbol error probability (ASEP) of this pure RS scheme as well as of a rate-selective one that utilizes RS only when it provides higher achievable rate than that of the direct transmission. Numerically evaluated results, verified by computer simulations, show that, although, in terms of ASEP, relaying is always beneficial, in terms of OP, it should be disabled whenever the direct link is strong.

Journal ArticleDOI
TL;DR: In this paper, a unified reliability analysis framework is proposed to deal with both random and interval variables in multidisciplinary systems, which is an extension of an existing unified uncertainty analysis framework for single-disciplinary problems.
Abstract: Tremendous efforts have been devoted to developing efficient approaches to reliability analysis for multidisciplinary systems. Most of the approaches are only capable of dealing with random variables modeled by probability distributions. Both random and interval variables, however, may exist in multidisciplinary systems. Their propagation through coupled subsystems makes reliability analysis computationally expensive. In this work, a unified reliability analysis framework is proposed to deal with both random and interval variables in multidisciplinary systems. The framework is an extension of an existing unified uncertainty analysis framework for single-disciplinary problems. The new framework involves probabilistic analysis and interval analysis. Both probabilistic analysis and interval analysis are decoupled from each other and are performed sequentially. The first order reliability method is used for probabilistic analysis. Three supporting algorithms are developed. The effectiveness of the algorithms is demonstrated with a mathematical example and an engineering application.

Journal ArticleDOI
TL;DR: This letter studies the performance of two promising relay selection techniques, namely selection cooperation (SC) and opportunistic relaying (OPR), under a clustered fixed-gain relay setting, and shows that, irrespective of the metric analyzed, OPR always yields higher performance than SC.
Abstract: In this letter, we study the performance of two promising relay selection techniques, namely selection cooperation (SC) and opportunistic relaying (OPR), under a clustered fixed-gain relay setting. Such a relay configuration finds applicability in practical ad-hoc and sensor networks, although it considers independent identically distributed channels among the links of each hop. Assuming Rayleigh fading and that all nodes are single-antenna devices, a comparative analysis between the selection strategies is performed in terms of the outage probability and average bit error rate. With this aim, closed-form expressions for the probability density function, cumulative distribution function, and moment generating function of the end-to-end signal-to-noise ratio (SNR) are derived. Our theoretical results are validated by means of Monte Carlo simulations, and show that, irrespective of the metric analyzed, OPR always yields higher performance than SC. Such conclusions differ from recent results reported in the open literature for decode-and-forward relays.

Journal ArticleDOI
TL;DR: In this paper, the bin contents of a weighted histogram are considered as a sum of random variables with a random number of terms, and generalizations of the classical chi-square test for comparing weighted histograms are proposed.
Abstract: Weighted histograms in Monte Carlo simulations are often used for the estimation of probability density functions. They are obtained as a result of random experiments with random events that have weights. In this paper, the bin contents of a weighted histogram are considered as a sum of random variables with a random number of terms. Generalizations of the classical chi-square test for comparing weighted histograms are proposed. Numerical examples illustrate an application of the tests for the histograms with different statistics of events and different weighted functions. The proposed tests can be used for the comparison of experimental data histograms with simulated data histograms as well as for the two simulated data histograms.
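
A simplified, variance-based version of such a comparison (not the exact generalized test derived in the paper) estimates each weighted bin's variance by the sum of squared weights and forms a chi-square-like statistic:

```python
import numpy as np
from scipy.stats import chi2

def weighted_hist_chi2(x_w, w, x_u, edges):
    """Chi-square-style comparison of a weighted and an unweighted histogram."""
    W, _ = np.histogram(x_w, bins=edges, weights=w)        # weighted bin contents
    V, _ = np.histogram(x_w, bins=edges, weights=w**2)     # their estimated variances
    U, _ = np.histogram(x_u, bins=edges)                   # unweighted bin contents
    scale = U.sum() / W.sum()                              # match total normalizations
    stat = np.sum((scale * W - U) ** 2 / (scale**2 * V + U))
    dof = len(edges) - 2                                   # bins minus one fitted scale
    return stat, chi2.sf(stat, dof)

rng = np.random.default_rng(6)
xs, ws = rng.exponential(1.0, 5000), 0.5 + rng.uniform(size=5000)
xu = rng.exponential(1.0, 8000)
print(weighted_hist_chi2(xs, ws, xu, edges=np.linspace(0.0, 5.0, 11)))
```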

Proceedings ArticleDOI
13 Jun 2010
TL;DR: In this article, the asymptotic behavior of the polarization process for polar codes when the blocklength tends to infinity was studied and its dependence on the rate of transmission was shown.
Abstract: We consider the asymptotic behavior of the polarization process for polar codes when the blocklength tends to infinity. In particular, we study the asymptotics of the cumulative distribution ℙ(Z_n ≤ z), where Z_n = Z(W_n) is the Bhattacharyya process, and its dependence on the rate of transmission R. We show that for a BMS channel W, this limiting behavior can be characterized in terms of Q(x), the probability that a standard normal random variable exceeds x. As a result, if we denote by ℙ^SC_e(n, R) the probability of error of a polar code of blocklength N = 2^n and rate R under successive cancellation decoding, we characterize how log(−log ℙ^SC_e(n, R)) scales with n. We also prove that the same result holds for the block error probability under MAP decoding, i.e., for log(−log ℙ^MAP_e(n, R)).

Journal ArticleDOI
TL;DR: In this article, new analytical results are presented for the cumulative distribution function (CDF) of Rician shadowed random variables, which find applicability in the performance analysis of land-mobile satellite (LMS) communications.
Abstract: New analytical results are presented for the cumulative distribution function (CDF) of Rician shadowed random variables. In particular, these results find applicability in the performance analysis of land-mobile satellite (LMS) communications.

Journal ArticleDOI
TL;DR: In this paper, a nonintrusive polynomial chaos formulation is used to evaluate the variability in the performance of a generic modular core compression system for a three-spool modern gas turbine engine subject to uncertain operating conditions with a defined probability density function.
Abstract: The design of a gas turbine, or one of its constituent modules, is generally approached with some specific operating condition in mind (its design point). Unfortunately, engine components seldom exactly meet their specifications and do not operate at just one condition, but over a range of power settings. This simplification can then lead to a product that exhibits performance worse than nominal in real-world conditions. The integration of some consideration of robustness as an active part of the design process can yield products that are less sensitive to the noise factors commonly found in real-world environments. To become routinely used as a design tool, minimization of the time required for robustness analysis is paramount. In this study, a nonintrusive polynomial chaos formulation is used to evaluate the variability in the performance of a generic modular-core compression system for a three-spool modern gas turbine engine subject to uncertain operating conditions with a defined probability density function. The standard orthogonal polynomials from the Askey scheme are replaced by a set of orthonormal polynomials calculated relative to the specific probability density function, improving the convergence of the method.
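
For a single Gaussian uncertain input, a non-intrusive polynomial chaos expansion reduces to projecting the response onto Hermite polynomials by Gauss-Hermite quadrature. The sketch below does this for a toy performance function; the function, its coefficients, and the operating-condition statistics are assumptions, and the custom orthonormal polynomials of the paper are not reproduced.

```python
import math
import numpy as np
from numpy.polynomial.hermite_e import hermegauss, hermeval

def pce_moments(perf, mu, sigma, order=4, n_quad=12):
    """Mean/variance of perf(X), X ~ N(mu, sigma^2), via Hermite chaos."""
    nodes, weights = hermegauss(n_quad)          # probabilists' Hermite rule
    weights = weights / np.sqrt(2 * np.pi)       # normalize to the N(0,1) measure
    vals = perf(mu + sigma * nodes)
    coeffs = []
    for k in range(order + 1):
        Hk = hermeval(nodes, [0] * k + [1])      # He_k evaluated at the nodes
        coeffs.append(np.sum(weights * vals * Hk) / math.factorial(k))
    mean = coeffs[0]
    var = sum(math.factorial(k) * c**2 for k, c in enumerate(coeffs[1:], start=1))
    return mean, var

# toy "performance vs. inlet temperature" curve (pure assumption)
perf = lambda t: 0.88 - 0.0002 * (t - 288.15) ** 2
print(pce_moments(perf, mu=288.15, sigma=5.0))
```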

Journal ArticleDOI
TL;DR: Wang et al. as discussed by the authors investigated the cumulative probability density function (PDF) and multiscaling properties of the returns in the Chinese stock market and found that the distribution has power-law tails at shorter microscopic timescales or lags.
Abstract: We investigate the cumulative probability density function (PDF) and the multiscaling properties of the returns in the Chinese stock market. By using returns data adjusted for thin trading, we find that the distribution has power-law tails at shorter microscopic timescales or lags. However, the distribution follows an exponential law for longer timescales. Furthermore, we investigate the long-range correlation and multifractality of the returns in the Chinese stock market by the DFA and MFDFA methods. We find that all the scaling exponents obtained by the DFA method are between 0.5 and 1, which indicates long-range power-law correlations in the Chinese stock market. Moreover, we find, by the MFDFA method, that the generalized Hurst exponents h(q) are not constant, which shows multifractality in the Chinese stock market. We also find that the correlation of the Shenzhen stock market is stronger than that of the Shanghai stock market.
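
For readers wanting to reproduce the first step of such an analysis, a compact (monofractal) DFA sketch is shown below; the simulated white-noise series merely stands in for the adjusted return data, and the MFDFA generalization is not included.

```python
import numpy as np

def dfa_exponent(x, scales=(16, 32, 64, 128, 256)):
    """DFA-1 scaling exponent: slope of log F(s) versus log s."""
    y = np.cumsum(x - np.mean(x))                         # integrated profile
    F = []
    for s in scales:
        n_seg = len(y) // s
        rms = []
        for i in range(n_seg):
            seg = y[i * s:(i + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, 1), t)  # detrend each window
            rms.append(np.mean((seg - trend) ** 2))
        F.append(np.sqrt(np.mean(rms)))
    slope, _ = np.polyfit(np.log(scales), np.log(F), 1)
    return slope

rng = np.random.default_rng(7)
print(dfa_exponent(rng.standard_normal(4096)))            # ~0.5 for white noise
```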