
Showing papers on "Cumulative distribution function published in 1999"


Reference BookDOI
27 Dec 1999
TL;DR: Introduction Summarizing Data Probability Functions of Random Variables Discrete Probability Distributions Continuous Probability Distributions Standard Normal Distribution Estimation Confidence Intervals Hypothesis Testing Regression Analysis Analysis of Variance Experimental Design Nonparametric Statistics Quality Control and Risk Analysis.
Abstract: Introduction Summarizing Data Probability Functions of Random Variables Discrete Probability Distributions Continuous Probability Distributions Standard Normal Distribution Estimation Confidence Intervals Hypothesis Testing Regression Analysis Analysis of Variance Experimental Design Nonparametric Statistics Quality Control and Risk Analysis General Linear Models Miscellaneous Topics Special Functions

353 citations


Posted Content
TL;DR: In this paper, the authors provide cumulative distribution functions, densities, and finite sample critical values for the single-equation error correction statistic for testing cointegration, and provide a convenient way for calculating finite-sample critical values at standard levels; and a computer program can be used to calculate both critical values and p-values.
Abstract: This paper provides cumulative distribution functions, densities, and finite sample critical values for the single-equation error correction statistic for testing cointegration. Graphs and response surfaces summarize extensive Monte Carlo simulations and highlight simple dependencies of the statistic's quantiles on the number of variables in the error correction model, the choice of deterministic components, and the estimation sample size. The response surfaces provide a convenient way for calculating finite sample critical values at standard levels; and a computer program, freely available over the Internet, can be used to calculate both critical values and p-values. Three empirical examples illustrate these tools.
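As a rough illustration of the critical-value and p-value machinery described above (not the authors' response surfaces or their program), the hypothetical helper below reads empirical critical values and a left-tail p-value off an array of simulated null statistics; the normal draws in the usage lines are placeholders, not the error correction statistic's actual null distribution.

```python
import numpy as np

def critical_values_and_pvalue(null_stats, observed, levels=(0.01, 0.05, 0.10)):
    """Empirical critical values and left-tail p-value from simulated null statistics.

    Cointegration-type tests reject for large negative statistics, so the
    relevant quantiles and the p-value are taken from the left tail.
    """
    null_stats = np.sort(np.asarray(null_stats, dtype=float))
    crit = {lvl: np.quantile(null_stats, lvl) for lvl in levels}
    p_value = np.mean(null_stats <= observed)
    return crit, p_value

# Toy usage with placeholder simulated statistics.
rng = np.random.default_rng(0)
sim = rng.normal(loc=-1.9, scale=0.9, size=100_000)   # stand-in for Monte Carlo draws
print(critical_values_and_pvalue(sim, observed=-3.4))
```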

219 citations


Journal ArticleDOI
TL;DR: In this article, the authors present a method for estimating the model Λ(Y) = min(β′X + U, C), where Y is a scalar, Λ is an unknown increasing function, X is a vector of explanatory variables, β is a vector of unknown parameters, U has unknown cumulative distribution function F, and C is a censoring threshold.

181 citations


Posted Content
06 Dec 1999
TL;DR: Nine of the most important estimators known for the two-point correlation function are compared using a predetermined, rigorous criterion and a clear recommendation has emerged: the Landy & Szalay estimator is preferred in comparison with the other indicators examined, with a performance almost indistinguishable from the Hamilton estimator.
Abstract: Nine of the most important estimators known for the two-point correlation function are compared using a predetermined, rigorous criterion. The indicators were extracted from over 500 subsamples of the Virgo Hubble Volume simulation cluster catalog. The "real" correlation function was determined from the full survey in a 3000 Mpc/h periodic cube. The estimators were ranked by the cumulative probability of returning a value within a certain tolerance of the real correlation function. This criterion takes into account bias and variance, and it is independent of the possibly non-Gaussian nature of the error statistics. As a result, a clear recommendation has emerged for astrophysical applications: the Landy & Szalay (1993) estimator, in its original or grid version (Szapudi & Szalay 1998), is preferred over the other indicators examined, with a performance almost indistinguishable from the Hamilton (1993) estimator.
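For readers who want to experiment with the recommended estimator, here is a minimal Python sketch of the Landy & Szalay formula ξ(r) = (DD − 2DR + RR)/RR with normalized pair counts. It assumes open boundaries and a user-supplied random comparison catalog, and ignores the periodic-box and grid refinements discussed in the paper.

```python
import numpy as np
from scipy.spatial import cKDTree

def landy_szalay(data, randoms, r_edges):
    """Landy & Szalay estimator xi(r) = (DD - 2DR + RR) / RR per radial bin.

    data, randoms: (N, 3) position arrays; r_edges: increasing bin edges.
    """
    nd, nr = len(data), len(randoms)
    td, tr = cKDTree(data), cKDTree(randoms)

    # Cumulative pair counts up to each edge; subtract self-pairs and halve
    # the auto-counts, then difference to get per-bin counts.
    dd = np.diff((td.count_neighbors(td, r_edges) - nd) / 2.0)
    rr = np.diff((tr.count_neighbors(tr, r_edges) - nr) / 2.0)
    dr = np.diff(td.count_neighbors(tr, r_edges).astype(float))

    dd_n = dd / (nd * (nd - 1) / 2.0)   # normalized pair counts
    rr_n = rr / (nr * (nr - 1) / 2.0)
    dr_n = dr / (nd * nr)
    return (dd_n - 2.0 * dr_n + rr_n) / rr_n

# Toy usage: a Poisson point set in a unit cube should give xi(r) close to 0.
rng = np.random.default_rng(1)
xi = landy_szalay(rng.random((2000, 3)), rng.random((4000, 3)),
                  r_edges=np.linspace(0.01, 0.2, 11))
print(xi)
```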

172 citations


Journal ArticleDOI
TL;DR: It is shown that a possibility measure is a coherent upper probability if and only if it is normal, and it is proved that when either the upper or the lower cumulative distribution function of a random quantity is specified, possibility measures very naturally emerge as the corresponding natural extensions.

132 citations


Journal ArticleDOI
TL;DR: Expressions for the outage probability and average error probability performances of a dual selective diversity system with correlated slow Rayleigh fading are presented either in closed form or in terms of a single integral with finite limits and an integrand composed of elementary functions.
Abstract: Using a simple finite integral representation for the bivariate Rayleigh (1889) cumulative distribution function previously discovered by the authors, we present expressions for the outage probability and average error probability performances of a dual selective diversity system with correlated slow Rayleigh fading either in closed form (in particular for binary differential phase-shift keying) or in terms of a single integral with finite limits and an integrand composed of elementary (exponential and trigonometric) functions. Because of their simple form, these expressions readily allow numerical evaluation for cases of practical interest. The results are also extended to the case of slow Nakagami-m fading using an alternate representation of the generalized Marcum (1950) Q-function.
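The paper's closed-form and single-integral expressions are not reproduced here; instead, the small Monte Carlo sketch below estimates the same outage probability for dual selection diversity over correlated slow Rayleigh branches, which can serve as a numerical cross-check of such formulas. The mixing construction and the parameter names (rho as approximately the branch power correlation, gamma_bar as the average branch SNR) are choices of this sketch, not the paper's notation.

```python
import numpy as np

def outage_prob_dual_selection(gamma_bar_db, gamma_th_db, rho, n=200_000, seed=0):
    """Monte Carlo outage probability of dual selection diversity over
    correlated slow Rayleigh fading branches (power correlation ~ rho)."""
    rng = np.random.default_rng(seed)
    # Correlated complex Gaussian branch gains via a Cholesky-style mixing.
    g1 = (rng.normal(size=n) + 1j * rng.normal(size=n)) / np.sqrt(2)
    g2 = (rng.normal(size=n) + 1j * rng.normal(size=n)) / np.sqrt(2)
    h1 = g1
    h2 = np.sqrt(rho) * g1 + np.sqrt(1 - rho) * g2
    gamma_bar = 10 ** (gamma_bar_db / 10)
    snr1 = gamma_bar * np.abs(h1) ** 2
    snr2 = gamma_bar * np.abs(h2) ** 2
    gamma_th = 10 ** (gamma_th_db / 10)
    # Selection combining picks the stronger branch; outage if even it is below threshold.
    return np.mean(np.maximum(snr1, snr2) < gamma_th)

print(outage_prob_dual_selection(gamma_bar_db=10.0, gamma_th_db=5.0, rho=0.5))
```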

127 citations


Journal ArticleDOI
TL;DR: In this paper, two methods, bootstrap simulation and a likelihood-based method, are applied to three datasets including a synthetic sample of 19 values from a Lognormal distribution, a sample of nine values obtained from measurements of the PCB concentration in leafy produce, and five values for the partitioning of chromium in the flue gas desulfurization system of coal-fired power plants.
Abstract: Variability arises due to differences in the value of a quantity among different members of a population. Uncertainty arises due to lack of knowledge regarding the true value of a quantity for a given member of a population. We describe and evaluate two methods for quantifying both variability and uncertainty. These methods, bootstrap simulation and a likelihood-based method, are applied to three datasets. The datasets include a synthetic sample of 19 values from a Lognormal distribution, a sample of nine values obtained from measurements of the PCB concentration in leafy produce, and a sample of five values for the partitioning of chromium in the flue gas desulfurization system of coal-fired power plants. For each of these datasets, we employ the two methods to characterize uncertainty in the arithmetic mean and standard deviation, cumulative distribution functions based upon fitted parametric distributions, the 95th percentile of variability, and the 63rd percentile of uncertainty for the 81st percentile of variability. The latter is intended to show that it is possible to describe any point within the uncertain frequency distribution by specifying an uncertainty percentile and a variability percentile. Using the bootstrap method, we compare results based upon use of the method of matching moments and the method of maximum likelihood for fitting distributions to data. Our results indicate that with only 5–19 data points as in the datasets we have evaluated, there is substantial uncertainty based upon random sampling error. Both the bootstrap and likelihood-based approaches yield comparable uncertainty estimates in most cases.
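A minimal sketch of the bootstrap side of this analysis, assuming a lognormal fitted by maximum likelihood as in the synthetic example: each resample is refitted, and the spread of the fitted mean, standard deviation, and a chosen variability percentile across resamples summarizes uncertainty. Function and parameter names are illustrative; taking, say, the 63rd percentile across resamples of the fitted 81st percentile reproduces the "uncertainty percentile of a variability percentile" idea.

```python
import numpy as np
from scipy import stats

def bootstrap_uncertainty(data, b=2000, var_pct=95, unc_pct=(2.5, 50, 97.5), seed=0):
    """Bootstrap the uncertainty in lognormal-based statistics of variability."""
    rng = np.random.default_rng(seed)
    data = np.asarray(data, dtype=float)
    means, sds, pcts = [], [], []
    for _ in range(b):
        x = rng.choice(data, size=len(data), replace=True)
        shape, loc, scale = stats.lognorm.fit(x, floc=0)  # MLE with location fixed at 0
        dist = stats.lognorm(shape, loc=0, scale=scale)
        means.append(dist.mean())
        sds.append(dist.std())
        pcts.append(dist.ppf(var_pct / 100))
    return {name: np.percentile(vals, unc_pct)
            for name, vals in [("mean", means), ("std", sds),
                               (f"p{var_pct} of variability", pcts)]}

# Toy usage on a synthetic lognormal sample of 19 values (stand-in data).
sample = stats.lognorm(0.5, scale=np.exp(1.0)).rvs(19, random_state=1)
print(bootstrap_uncertainty(sample))
```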

110 citations


Journal ArticleDOI
TL;DR: In this paper, a spatial subsampling method for predicting an SCDF based on observations made on a hexagonal grid, similar to the one used in the Environmental Monitoring and Assessment Program of the U.S. Environmental Protection Agency, is presented.
Abstract: The spatial cumulative distribution function (SCDF) is a random function that provides a statistical summary of a random field over a spatial domain of interest. In this article we develop a spatial subsampling method for predicting an SCDF based on observations made on a hexagonal grid, similar to the one used in the Environmental Monitoring and Assessment Program of the U.S. Environmental Protection Agency. We show that under quite general conditions, the proposed subsampling method provides accurate data-based approximations to the sampling distributions of various functionals of the SCDF predictor. In particular, it produces estimators of different population characteristics, such as the quantiles and weighted mean integrated squared errors of the empirical predictor. As an illustration, we apply the subsampling method to construct large-sample prediction bands for the SCDF of an ecological index for foliage condition of red maple trees in the state of Maine.
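The subsampling machinery itself is beyond a short sketch, but the empirical SCDF predictor it builds on is simple: the fraction of sampled sites whose observed value falls at or below each threshold. The sketch below, with made-up gamma-distributed site values standing in for grid observations, is only meant to fix that definition.

```python
import numpy as np

def empirical_scdf(values, thresholds):
    """Empirical spatial CDF predictor: the fraction of sites whose observed
    value is at or below each threshold."""
    values = np.asarray(values, dtype=float)
    return np.array([np.mean(values <= t) for t in thresholds])

# Toy usage: values observed at grid sites (stand-in data).
rng = np.random.default_rng(2)
obs = rng.gamma(shape=2.0, scale=1.5, size=300)
print(empirical_scdf(obs, thresholds=np.linspace(0, 10, 6)))
```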

100 citations


Journal ArticleDOI
TL;DR: In this article, the distribution of the distance between words in a random sequence of letters is studied in view of applications in genome sequence analysis, and the exact probability distribution and cumulative distribution function of the distances between two successive occurrences of a given word and between the nth and the (n + m)th occurrences are given under three models of generation of the letters: i.i.d. with the same probability for each letter, i.i.d. with different probabilities, and a Markov process.
Abstract: The study of the distribution of the distance between words in a random sequence of letters is interesting in view of application in genome sequence analysis. In this paper we give the exact distribution probability and cumulative distribution function of the distances between two successive occurrences of a given word and between the nth and the (n + m)th occurrences under three models of generation of the letters: i.i.d. with the same probability for each letter, i.i.d. with different probabilities and Markov process. The generating function and the first two moments are also given. The point of studying the distances instead of the counting process is that we get some knowledge not only about the frequency of a word but also about its longitudinal distribution in the sequence.
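A quick simulation sketch (not the paper's exact formulas) that builds the empirical cumulative distribution function of the distance between successive occurrences of a word under the simplest model, i.i.d. equiprobable letters; the word "ACG" and the four-letter alphabet are arbitrary choices.

```python
import numpy as np

def inter_occurrence_distances(seq, word):
    """Distances between successive occurrences of `word` in a letter sequence
    (overlapping occurrences allowed, distances between start positions)."""
    starts = [i for i in range(len(seq) - len(word) + 1) if seq[i:i + len(word)] == word]
    return np.diff(starts)

# Monte Carlo check against the i.i.d. equiprobable-letter model.
rng = np.random.default_rng(3)
seq = "".join(rng.choice(list("ACGT"), size=200_000))
d = inter_occurrence_distances(seq, "ACG")
xs = np.arange(1, 101)
cdf = np.array([np.mean(d <= x) for x in xs])   # empirical CDF of the distance
print(cdf[:10])
```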

96 citations


Journal ArticleDOI
Sheng Yue1
TL;DR: In this paper, a procedure for using the bivariate normal distribution to describe the joint distributions of correlated flood peaks and volumes as well as correlated flood volumes and durations is presented.
Abstract: This article provides a procedure for using the bivariate normal distribution to describe the joint distributions of correlated flood peaks and volumes as well as correlated flood volumes and durations. The Box-Cox transformation (power-transformation) method is used to normalize original marginal distributions of flood peaks, volumes, and durations regardless of the original forms of these distributions. The power-transformation parameter is estimated using the maximum likelihood method. The joint cumulative distribution function, the conditional cumulative distribution function, and the associated return periods can be readily obtained based on the bivariate normal distribution. The method is tested and validated using observed flood data from the Batiscan River basin in the province of Quebec, Canada. The resulting theoretical distributions show a good fit to observed ones.
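A minimal sketch of the procedure as described: Box-Cox transform each margin with a maximum-likelihood λ, standardize, estimate the correlation, and evaluate the bivariate normal joint CDF. The synthetic "peak" and "volume" series in the usage lines are placeholders for observed flood data, and the function name is illustrative.

```python
import numpy as np
from scipy import stats

def joint_cdf_boxcox_bivariate_normal(x, y, x0, y0):
    """Joint CDF P(X <= x0, Y <= y0) after Box-Cox normalizing each margin
    and fitting a bivariate normal to the transformed, standardized data."""
    tx, lam_x = stats.boxcox(x)          # lambda estimated by maximum likelihood
    ty, lam_y = stats.boxcox(y)
    zx = (stats.boxcox(np.array([x0]), lmbda=lam_x)[0] - tx.mean()) / tx.std(ddof=1)
    zy = (stats.boxcox(np.array([y0]), lmbda=lam_y)[0] - ty.mean()) / ty.std(ddof=1)
    rho = np.corrcoef(tx, ty)[0, 1]
    mvn = stats.multivariate_normal(mean=[0.0, 0.0], cov=[[1.0, rho], [rho, 1.0]])
    return mvn.cdf([zx, zy])

# Toy usage with synthetic positively skewed "peak" and "volume" data.
rng = np.random.default_rng(4)
base = rng.normal(size=200)
peaks = np.exp(0.8 * base + rng.normal(scale=0.4, size=200) + 3)
vols = np.exp(0.7 * base + rng.normal(scale=0.5, size=200) + 5)
print(joint_cdf_boxcox_bivariate_normal(peaks, vols, np.median(peaks), np.median(vols)))
```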

93 citations


Journal ArticleDOI
TL;DR: This work considers the estimation of the intensity and survival functions for a continuous time progressive three-state semi-Markov model with intermittently observed data and obtains smooth estimates of the intensity and survival functions.
Abstract: Summary. We consider the estimation of the intensity and survival functions for a continuous time progressive three-state semi-Markov model with intermittently observed data. The estimator of the intensity function is defined nonparametrically as the maximum of a penalized likelihood. We thus obtain smooth estimates of the intensity and survival functions. This approach can accommodate complex observation schemes such as truncation and interval censoring. The method is illustrated with a study of hemophiliacs infected by HIV. The intensity functions and the cumulative distribution functions for the time to infection and for the time to AIDS are estimated. Covariates can easily be incorporated into the model.

Journal ArticleDOI
E. Villier1
TL;DR: Although limited to equipower interferers, this analysis is a convenient way of assessing the performance of optimum combining in some typical situations and comparing it with that of maximal-ratio combining, especially in the region of reasonably high BERs which are of practical interest.
Abstract: This paper is a performance analysis of optimum combining in the presence of multiple equal power interferers and noise when the number of interferers is less than the number of antenna elements. Desired signal and interferers are subject to flat Rayleigh fading, and the propagation channels are independent. An approximate expression of the probability density function of the output signal-to-interference-plus-noise ratio (SINR) is derived analytically. It is then applied to obtain the cumulative distribution function of the SINR, and the bit-error rate (BER) of some binary modulations, including coherent binary phase-shift keying. In the case of a single interferer, an exact analysis is performed to prove the validity of the approximation. In the case of multiple interferers, the accuracy of the approximation is assessed through simulations. Although limited to equipower interferers, this analysis is a convenient way of assessing the performance of optimum combining in some typical situations and comparing it with that of maximal-ratio combining. The final results are remarkably simple and provide a useful complement to previous analyses, especially in the region of reasonably high BERs which are of practical interest.

Journal ArticleDOI
TL;DR: In this paper, the authors presented an approximate analytical cumulative distribution function of the output signal-to-noise-plus-interference ratio (SINR) of an adaptive antenna operating in multipath environments with multiple interferers and correlated fadings.
Abstract: This paper presents an approximate analytical cumulative distribution function of the output signal-to-noise-plus-interference ratio (SINR) of an adaptive antenna operating in multipath environments with multiple interferers and correlated fadings. Previously, approximate analytical results were only available for the case of one interferer and independent fadings between antenna branches, whereas in other cases Monte Carlo simulations had to be used with many limitations including excessive computer time and inaccurate results for small probability levels. The distribution, expressed in terms of the mean eigenvalues of the system, is accurate in most cases investigated even though it is based on an approximation to the characteristic function of the output SINR. As a result, a closed-form expression of bit error rate (BER) for coherent phase-shift keying (PSK) has been derived based on this approximation.

Book
01 Jan 1999
TL;DR: A reliability textbook covering basic probability concepts, common lifetime models, model selection and fitting, repairable systems, system reliability, models for functions of random variables, maintenance strategies, life testing and inference, advanced models, and useful mathematical techniques.
Abstract: BASIC CONCEPTS Introduction Events and Probability Rules of Probability Dependent Events Random Variables and Probability Distributions The Reliability Function The Hazard Function Expectation COMMON LIFETIME MODELS Introduction The Poisson Process The Weibull Distribution The Gumbel Distribution The Normal and Lognormal Distributions The Gamma Distribution The Logistic and Log Logistic Distributions The Pareto Distribution Order Statistic and Extreme Value Distributions MODEL SELECTION Introduction Non-Parametric Estimation of R(t) and h(t) Censoring Kaplan-Meier Estimator Graphical Methods Straight Line Fitting Weibull Plotting Normal Plotting Other Model Family Plots Comparison of Distributions MODEL FITTING Parameter Estimation The Variance of Estimators Confidence Interval Estimates Maximum Likelihood Estimating Quantities Estimation Methods Using Sample Moments General Probability Plots Goodness of Fit Pearson's Chi-squared Test Kolmogorov-Smirnov Test Tests for Normality A-squared and W-squared Tests Stabilized Probability Plots Censored Data REPAIRABLE SYSTEMS Introduction Graphical Methods Testing for Trend Repair Time Maintainability and Availability Introduction to Renewal Theory Laplace Transforms The Renewal Function Alternating Renewal Processes The Distribution of N(t) SYSTEM RELIABILITY Systems and System Logic Tie and Cut Sets Probability Bounds Fault Trees Failure Over Time Redundancy Quorum or m-out-of-n Systems Analysis of Systems Using State Spaces Mean Time to Fail (MTTF) Considerations Due to "Switching" Common Cause Failures MODELS FOR FUNCTIONS OF RANDOM VARIABLES Combinations and Transformations of Random Variables Expectations of Functions of Random Variables Approximations for E[g(x)] and V[g(x)] Distribution of a Function of Random Variables Probabilistic Engineering Design Stress and Strength Distributions Interference Theory and Reliability Computations Normally Distributed Stress and Strength Safety Factors and Reliability Graphical Approach for Empirically Determined Distributions MAINTENANCE STRATEGIES Maintained Systems Availability Markovian Systems Mean Time between Failures (MTBF) Age Replacement Scheduled Maintenance Systems with Failure Detection/Fail Safe Devices Down-Time Distributions LIFE TESTING AND INFERENCE Life Test Plans Prediction of Time on Test Inference for the Exponential Distribution The Effect of Data Rounding Parametric Reliability Bounds Likelihood-Based Methods The Likelihood Ratio Test Binomial Experiments Non-Parametric Estimation and Confidence Intervals for R(t) Estimating System Reliability from Subsystem Test Data Accelerated Testing ADVANCED MODELS Covariates Proportional Hazards Models Accelerated Life Models Mixture Models Competing Risks Dependent Failures Load-Sharing Systems Bayesian Reliability Case Studies USEFUL MATHEMATICAL TECHNIQUES Partial Fractions Series Taylor Expansions Newton-Raphson Iteration Numerical Integration Matrix Algebra The Principle of Least Squares

Journal ArticleDOI
TL;DR: This paper investigates in detail the level-crossing rate (LCR), average duration of fades (ADF), and cumulative distribution function (CDF) of such classes of simulation models.
Abstract: The concept of Rice's sum of sinusoids is often applied to the design of deterministic simulation models for Rice fading channels. This paper investigates in detail the level-crossing rate (LCR), average duration of fades (ADF), and cumulative distribution function (CDF) of such classes of simulation models. Exact and simple approximate formulas are deduced for these statistical quantities. Several numerical results for the derived expressions are presented by using different procedures for the design of the parameters of the deterministic simulation model. Moreover, comparisons with the corresponding simulation results, obtained by evaluating the deterministic simulation model's output data, are also given.
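A small simulation sketch in the same spirit: a sum-of-sinusoids fading waveform for the Rayleigh (zero line-of-sight) special case of the Rice model, using one common frequency design rather than the specific parameter procedures compared in the paper, from which the empirical CDF at a level and the level-crossing rate can be read off.

```python
import numpy as np

def rayleigh_sum_of_sinusoids(n_sin, f_max, fs, duration, seed=0):
    """Deterministic sum-of-sinusoids fading waveform: each quadrature
    component is a sum of n_sin sinusoids with random phases."""
    rng = np.random.default_rng(seed)
    t = np.arange(0, duration, 1 / fs)

    def component():
        n = np.arange(1, n_sin + 1)
        f = f_max * np.sin(np.pi / (2 * n_sin) * (n - 0.5))   # one common frequency design
        theta = rng.uniform(0, 2 * np.pi, n_sin)
        return np.sqrt(2 / n_sin) * np.sum(
            np.cos(2 * np.pi * np.outer(t, f) + theta), axis=1)

    return t, np.abs(component() + 1j * component()) / np.sqrt(2)

t, env = rayleigh_sum_of_sinusoids(n_sin=20, f_max=91.0, fs=20_000, duration=2.0)
r = 0.5 * np.sqrt(np.mean(env ** 2))          # threshold at half the RMS level
cdf_at_r = np.mean(env <= r)                  # empirical CDF value at the threshold
up_crossings = np.sum((env[:-1] < r) & (env[1:] >= r))
lcr = up_crossings / t[-1]                    # level-crossing rate in crossings/second
print(cdf_at_r, lcr)
```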

Journal ArticleDOI
TL;DR: In this article, a computer program was developed to estimate the parameters of the two-parameter Weibull distribution describing a given data set and to generate random numbers following a given Weibbull distribution.
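The paper's program is not available here; the snippet below is a stand-in using SciPy's Weibull routines to estimate the two parameters by maximum likelihood (location fixed at zero) and to generate random numbers from the fitted distribution.

```python
import numpy as np
from scipy import stats

# Synthetic data standing in for the user's dataset.
rng = np.random.default_rng(5)
data = stats.weibull_min(c=1.8, scale=4.0).rvs(size=200, random_state=rng)

# Two-parameter Weibull fit: shape and scale, with the location fixed at zero.
shape, loc, scale = stats.weibull_min.fit(data, floc=0)
print(f"shape={shape:.3f}, scale={scale:.3f}")

# Random numbers following the fitted Weibull distribution.
samples = stats.weibull_min(c=shape, scale=scale).rvs(size=10, random_state=rng)
print(samples)
```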

Journal ArticleDOI
TL;DR: In this article, the authors considered the empirical predictor of a spatial cumulative distribution function F^∞ based on a finite set of observations from a region in ℝ^d under a uniform sampling design.
Abstract: A spatial cumulative distribution function F^∞ (say) is a random distribution function that provides a statistical summary of a random field over a given region. This paper considers the empirical predictor of F^∞ based on a finite set of observations from a region in ℝ^d under a uniform sampling design. A functional central limit theorem is proved for the predictor as a random element of the space D[−∞, ∞]. A striking feature of the result is that the rate of convergence of the predictor to the predictand F^∞ depends on the location of the data-sites specified by the sampling design. A precise description of the dependence is given. Furthermore, a subsampling method is proposed for integral-based functionals of random fields, which is then used to construct large sample prediction bands for F^∞.

Book
24 Jun 1999
TL;DR: A textbook on probability, randomness, random signals, and systems; each chapter includes a summary, additional examples, problems, computer exercises, and self-test problems with solutions.
Abstract: INTRODUCTION Randomness, Random Signals and Systems Probability and Random Processes Typical Engineering Applications Why Study Probability and Random Signals Key Features of the Book Rules for the Presentation BASIC CONCEPTS IN PROBABILITY Basics of Set Theory Fundamental Concepts in Probability Conditional Probability Independent Events Total Probability Theorem and Bayes' Rule Combined Experiments and Bernoulli Trials THE RANDOM VARIABLE Concept of Random Variable Cumulative Distribution Function Probability Density Function Uniform Distribution Gaussian Distribution and Central Limit Theorem Some Other Commonly Used Distributions Expectation and Moments Functions of a Random Variable Conditional Distributions Generation of Random Numbers Determination of Distribution from Data Characteristic Functions MULTIPLE RANDOM VARIABLES Joint Distribution Functions Joint Density Function Independence of Random Variables Expectation and Moments Relation between Two Random Variables Uniform Distributions Jointly Gaussian Random Variables Functions of Random Variables Conditional Distributions INTRODUCTION TO STATISTICS Introduction Sample Mean and Sample Variance Empirical Distributions Statistical Inference Parameter Estimation Hypothesis Testing Linear Regression and Curve Fitting RANDOM PROCESSES Concept of Random Process Characterization of Random Processes Classification of Random Processes Correlation Functions Properties of Autocorrelation Functions Sample Mean and Sample Correlation Functions Relationship between Two Random Processes Properties of Crosscorrelation Functions Gaussian Random Processes POWER SPECTRAL DENSITY Concept of Power Spectral Density Properties of Power Spectral Density White Noise Power Spectrum Estimation Cross-Power Spectrum Power Spectrum in Laplace Domain Some Facts about Fourier Transforms LINEAR SYSTEMS WITH RANDOM INPUTS Deterministic Linear Systems Time-Domain Analysis Frequency-Domain Analysis Summary of Input-Output Relationships Linear Systems with White Noise Input Equivalent Noise Bandwidth Output of a Linear System as a Gaussian Process OPTIMAL LINEAR SYSTEMS Introduction Signal-to-Noise Ratio The Matched Filter The Wiener Filter Plus each chapter includes a summary, additional examples, problems, computer exercises, and self-test problems with solutions.

Journal ArticleDOI
TL;DR: In this article, a set of sufficient conditions under which the probability at an exponential rate as n → ∞ where the rate possibly depends on ϵ, δ and f and [a, b] is a finite or an infinite interval is studied.
Abstract: Let {X n ;n ≥1} be a sequence of stationary associated random variables having a common marginal density function f (x). Let , be a sequence of Borel-measurable functions defined on R 2. Let be the empirical density function. Here we study a set of sufficient conditions under which the probability at an exponential rate as n → ∞ where the rate possibly depends on ϵ, δ and f and [a, b] is a finite or an infinite interval.

Journal ArticleDOI
TL;DR: In this paper, a heuristic algorithm is presented for the evaluation of the parameters of additive-Weibull distributions having more than two elementary functions, which is a necessary condition when testing complex insulating structures with different partial discharge (PD) sources.
Abstract: In this paper, a heuristic algorithm is presented for the evaluation of the parameters of additive-Weibull distributions having more than two elementary functions, which is a necessary condition when testing complex insulating structures with different partial discharge (PD) sources. The algorithm has, in principle, no limit on the number of component Weibull functions. Furthermore, it should be emphasized that it does not require a very close initial guess of the parameters for solution convergence, unlike other algorithms used at present. In order to check the validity of the proposed algorithm in finding the elementary components of a mixed Weibull function, many PD experimental tests have been performed on some lumped capacitance specimens in order to compare the experimental cumulative probability of pulse amplitude distributions (PAD) with the ones obtained from the Weibull analysis. To this aim, the results of the Cramér-von Mises (CVM) test performed on the experimental results are reported. Some considerations are also given for the case of testing commercial HV components having complex PAD.
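The authors' heuristic is not reproduced here; as a rough stand-in, the sketch below fits a two-component additive-Weibull CDF to empirical CDF points by bounded least squares, with data generated from the additive (competing-risk) model itself. The function names, plotting positions, and initial guess are all choices of this sketch, and the paper's method handles more than two components without a close starting point.

```python
import numpy as np
from scipy import optimize

def additive_weibull_cdf(x, a1, b1, a2, b2):
    """Additive Weibull CDF: F(x) = 1 - exp(-(x/a1)**b1 - (x/a2)**b2)."""
    return 1.0 - np.exp(-(x / a1) ** b1 - (x / a2) ** b2)

def fit_additive_weibull(data, p0):
    """Least-squares fit of the additive Weibull CDF to the empirical CDF
    (median-rank plotting positions); p0 is a rough initial guess."""
    x = np.sort(np.asarray(data, dtype=float))
    f_emp = (np.arange(1, len(x) + 1) - 0.3) / (len(x) + 0.4)
    popt, _ = optimize.curve_fit(additive_weibull_cdf, x, f_emp, p0=p0,
                                 bounds=(1e-6, np.inf))
    return popt

# Toy usage: the minimum of two Weibull variables follows the additive model.
rng = np.random.default_rng(6)
data = np.minimum(2.0 * rng.weibull(0.8, 1000), 10.0 * rng.weibull(4.0, 1000))
print(fit_additive_weibull(data, p0=[2.0, 1.0, 10.0, 3.0]))
```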

Journal ArticleDOI
TL;DR: In this paper, the joint asymptotic distributions of the estimators of a multivariate regression function and its derivatives are established for a regression model with long-range dependent errors.

Journal ArticleDOI
TL;DR: In this paper, a family of random functions with a characterization of its finite dimensional law is described for the purpose of numerically studying Sahelian storm rainfields, and two solutions to bypass the problems that arise when fitting its functional parameters are provided, according to the regularity properties of the marginal cumulative distribution function.
Abstract: For the purpose of numerically studying Sahelian storm rainfields, a family of random functions is described with a characterization of its finite dimensional law. Some problems appearing when fitting its functional parameters are put forward and two solutions to bypass those problems are provided, according to the regularity properties of the marginal cumulative distribution function. An illustration of this method is implemented on a set of Sahelian rainfields of event accumulation displaying a strong spatial intermittency.

Journal ArticleDOI
A.E. Tercan1
TL;DR: In this article, the orthogonal transformed indicator approach to conditional cumulative distribution functions is based on the decomposition of the indicator variogram matrix as a matrix product, and five decomposition algorithms are considered: spectral, Cholesky, symmetric, Cholesky-spectral, and simultaneous decompositions.
Abstract: The orthogonal transformed indicator approach to conditional cumulative distribution functions is based on the decomposition of the indicator variogram matrix as a matrix product. This paper explores the manner in which the decomposition algorithm affects the conditional cumulative distribution function as estimated by orthogonal transformed indicator kriging. Five decomposition algorithms are considered: spectral, Cholesky, symmetric, Cholesky-spectral, and simultaneous decompositions. Impact of the algorithms on spatial orthogonality and order relations problems is examined and their performances together with indicator kriging are compared using a real dataset.
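To make the role of the decomposition concrete, the sketch below factors a small symmetric positive-definite matrix B as W Wᵀ in three of the ways named above (Cholesky, spectral, and symmetric square root). The matrix values are made up, and this is not the paper's kriging code; it only illustrates that several valid factorizations exist, which is why the choice of algorithm can matter.

```python
import numpy as np

def decompositions(B):
    """Three ways to factor a symmetric positive-definite matrix B as W @ W.T,
    as used to orthogonalize indicator variables (illustrative only)."""
    # Cholesky: lower-triangular factor.
    chol = np.linalg.cholesky(B)
    # Spectral: eigenvectors times the square roots of the eigenvalues.
    vals, vecs = np.linalg.eigh(B)
    spectral = vecs @ np.diag(np.sqrt(vals))
    # Symmetric (principal) square root: Q sqrt(L) Q^T.
    symmetric = vecs @ np.diag(np.sqrt(vals)) @ vecs.T
    return chol, spectral, symmetric

# Toy indicator covariance matrix (stand-in values).
B = np.array([[0.20, 0.12, 0.05],
              [0.12, 0.24, 0.10],
              [0.05, 0.10, 0.18]])
for name, W in zip(("Cholesky", "spectral", "symmetric"), decompositions(B)):
    print(name, np.allclose(W @ W.T, B))
```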


Journal ArticleDOI
TL;DR: In this paper, an inequality of Ostrowski's type is given for a random variable whose probability density function is in Lp(a, b), p > 1, in terms of the cumulative distribution function and expectation.
Abstract: An inequality of Ostrowski's type for a random variable whose probability density function is in Lp(a, b), p > 1, in terms of the cumulative distribution function and expectation is given. An application for a Beta random variable is also given.

Journal ArticleDOI
TL;DR: It is shown how these methods can be automatically extended from constraints setting parameters to constant values to constraints positing equality of parameters, and how the approximation to the non-centrality parameter differs between the methods.
Abstract: This paper describes two asymptotic methods for sample size and power calculation for hypothesis testing. Both methods assume that the distribution of the likelihood ratio is approximately distributed as a central chi-squared distribution under the null hypothesis and as a non-central chi-squared under the alternative hypothesis. The approximation to the non-centrality parameter differs between the methods. It is shown how these methods can be automatically extended from constraints setting parameters to constant values to constraints positing equality of parameters. Two very simple examples are presented; one demonstrates that the information method can produce arbitrarily incorrect results. Four more comprehensive examples are then discussed. In addition to demonstrating the wide range of applicability of these methods, the examples illustrate techniques that may be used in cases in which there is insufficient initial information available to perform a realistic calculation. The availability of a computer implementation of these methods in S-plus is announced, as are routines for computing the cumulative distribution function of the non-central chi-squared and its inverse.
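A minimal sketch of the common non-central chi-squared power calculation that both methods rely on, assuming the non-centrality parameter grows linearly with sample size; the per-observation non-centrality value in the usage lines is illustrative, and this is not the announced S-plus implementation.

```python
from scipy import stats

def lr_test_power(ncp, df, alpha=0.05):
    """Power of a likelihood-ratio test whose statistic is ~ chi2(df) under H0
    and ~ noncentral chi2(df, ncp) under the alternative."""
    crit = stats.chi2.ppf(1 - alpha, df)
    return stats.ncx2.sf(crit, df, ncp)

def sample_size_for_power(ncp_per_obs, df, target=0.80, alpha=0.05):
    """Smallest n such that power >= target, assuming ncp = n * ncp_per_obs."""
    n = 1
    while lr_test_power(n * ncp_per_obs, df, alpha) < target:
        n += 1
    return n

print(lr_test_power(ncp=7.85, df=1))            # roughly 0.80 for a 1-df test
print(sample_size_for_power(ncp_per_obs=0.05, df=1))
```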

Journal ArticleDOI
TL;DR: In this paper, the authors extended the results of Hayter (1990) to the scale parameter case under a location-scale probability model and proposed a test of the null hypothesis H0: θ1 = … = θk against the simple ordered alternative H1: θ1 ≦ … ≦ θk, with at least one strict inequality, using the data Xi,j, i = 1,…,k; j = 1,…,ni.
Abstract: In an earlier paper the authors (1997) extended the results of Hayter (1990) to the two-parameter exponential probability model. This paper addresses the extension to the scale parameter case under a location-scale probability model. Consider k (k ≧ 3) treatments or competing firms such that an observation from the ith treatment or firm follows a distribution with cumulative distribution function (cdf) Fi(x) = F[(x − μi)/θi], where F(·) is any absolutely continuous cdf, i = 1,…,k. We propose a test of the null hypothesis H0: θ1 = … = θk against the simple ordered alternative H1: θ1 ≦ … ≦ θk, with at least one strict inequality, using the data Xi,j, i = 1,…,k; j = 1,…,ni. Two methods to compute the critical points of the proposed test have been demonstrated by taking k two-parameter exponential distributions. The test procedure also allows us to construct simultaneous one-sided confidence intervals (SOCIs) for the ordered pairwise ratios θj/θi, 1 ≦ i < j ≦ k.

Journal ArticleDOI
TL;DR: In this paper, a new model is established for predicting the cumulative distribution function of the fatigue life of structural elements during the crack propagation stage, which is treated as a cumulative damage process following the probabilistic approach of Bogdanoff and Kozin.

Journal Article
TL;DR: A simple fuzzy-probabilistic method for time series analysis is presented, using the new concepts of transition and cumulative probability procedures for deciding among the alternative consequent fuzzy sets prior to defuzzification.
Abstract: Classical time series analysis requires many assumptions, such as the normality of data, linearity in the autocorrelation coefficient, and statistical parameter estimation. It is almost impossible to find all these assumptions applicable in stochastic time series generation or simulation. This paper provides a simple fuzzy-probabilistic method for time series analysis. The basis of the methodology is to construct the fuzzy rule base domain from the available daily maximum temperature records at Kandilli observatory in Istanbul. The new concepts of transition and cumulative probability procedures are employed for taking a decision among the alternative consequent fuzzy sets prior to defuzzification.

Journal ArticleDOI
TL;DR: In this article, the response behavior of a suspended cable to random Gaussian excitation was studied using computational simulations, including the interaction between the first two in-plane and first two out-of-plane modes.