
Showing papers on "Probability distribution published in 1992"


Book
01 Jan 1992
TL;DR: In this book, the authors present a review of the basic ideas of linear regression models, including the two-variable model, the dummy variable model, and the multiple regression model.
Abstract: Chapter 1: The Nature and Scope of Econometrics Part I: The Linear Regression Model Chapter 2: Basic Ideas of Linear Regression Chapter 3: The Two-Variable Model: Hypothesis Testing Chapter 4: Multiple Regression: Estimation and Hypothesis Testing Chapter 5: Functional Forms of Regression Models Chapter 6: Dummy Variable Regression Models Part II: Regression Analysis in Practice Chapter 7: Model Selection: Criteria and Tests Chapter 8: Multicollinearity: What Happens if Explanatory Variables are Correlated? Chapter 9: Heteroscedasticity: What Happens if the Error Variance is Nonconstant? Chapter 10: What Happens if Error Terms are Correlated? Part III: Advanced Topics in Econometrics Chapter 11: Simultaneous Equation Models Chapter 12: Selected Topics in Single-Equation Regression Models Appendices Introduction: Basics of Probability and Statistics Appendix A: Review of Statistics: Probability and Probability Distributions Appendix B: Characteristics of Probability Distributions Appendix C: Some Important Probability Distributions Appendix D: Statistical Inference: Estimation and Hypothesis Testing Appendix E: Statistical Tables Appendix F: Computer Output of EViews, Minitab, Excel, and STATA
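As a concrete reference point for the two-variable model covered in Chapters 2 and 3 (this is the standard textbook formulation, not an excerpt from this book), the model and its ordinary least-squares estimators are

$$ Y_i = \beta_1 + \beta_2 X_i + u_i, \qquad \hat{\beta}_2 = \frac{\sum_i (X_i - \bar{X})(Y_i - \bar{Y})}{\sum_i (X_i - \bar{X})^2}, \qquad \hat{\beta}_1 = \bar{Y} - \hat{\beta}_2 \bar{X}, $$

where $u_i$ is the stochastic error term; hypothesis tests on $\hat{\beta}_1$ and $\hat{\beta}_2$ then use their estimated standard errors together with the t distribution.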

1,043 citations


Journal ArticleDOI
TL;DR: The authors apply flexible constraints, in the form of a probabilistic deformable model, to the problem of segmenting natural 2-D objects whose diversity and irregularity of shape make them poorly represented in terms of fixed features or form.
Abstract: Segmentation using boundary finding is enhanced both by considering the boundary as a whole and by using model-based global shape information. The authors apply flexible constraints, in the form of a probabilistic deformable model, to the problem of segmenting natural 2-D objects whose diversity and irregularity of shape make them poorly represented in terms of fixed features or form. The parametric model is based on the elliptic Fourier decomposition of the boundary. Probability distributions on the parameters of the representation bias the model to a particular overall shape while allowing for deformations. Boundary finding is formulated as an optimization problem using a maximum a posteriori objective function. Results of the method applied to real and synthetic images are presented, including an evaluation of the dependence of the method on prior information and image quality.
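Schematically, the maximum a posteriori formulation referred to above can be written (generic Bayesian notation chosen here, not quoted from the paper) as

$$ \hat{p} = \arg\max_{p} \left[ \ln \Pr(p) + \ln \Pr(I \mid p) \right], $$

where $p$ is the parameter vector of the elliptic Fourier boundary representation, $\Pr(p)$ is the probabilistic shape prior, and $\Pr(I \mid p)$ measures agreement of the implied boundary with the image data $I$.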

888 citations


Journal ArticleDOI
TL;DR: Partial table of contents: Maximum-Entropy Probability Distributions: Principles, Formalism and Techniques.
Abstract: Partial table of contents: Maximum-Entropy Probability Distributions: Principles, Formalism and Techniques. Maximum-Entropy Discrete Univariate Probability Distributions. Maximum-Entropy Discrete Multivariate Probability Distributions. Maximum-Entropy Continuous Multivariate Probability Distributions. Maximum-Entropy Distributions in Statistical Mechanics. Minimum Discrepancy Measures. Concavity (Convexity) of Maximum-Entropy (Minimum Information) Functions. Equivalence of Maximum-Entropy Principle and Gauss's Principle of Density Estimation. Maximum-Entropy Principle and Contingency Tables. Maximum-Entropy Principle and Statistics. Maximum-Entropy Models in Regional and Urban Planning. Maximum-Entropy Models in Marketing and Elections. Maximum-Entropy Spectral Analysis. Maximum-Entropy Image Reconstruction. Maximum-Entropy Principle in Operations Research. References. Author Index. Subject Index.
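As a worked illustration of the maximum-entropy formalism (a standard example, not an excerpt from this book): among all densities on $[0, \infty)$ with mean $\mu$, the entropy $-\int p(x)\ln p(x)\,dx$ is maximized by the exponential density,

$$ \max_{p \ge 0} \; -\int_0^{\infty} p(x)\ln p(x)\,dx \quad \text{s.t.} \quad \int_0^{\infty} p(x)\,dx = 1, \;\; \int_0^{\infty} x\,p(x)\,dx = \mu \;\;\Longrightarrow\;\; p(x) = \frac{1}{\mu}\,e^{-x/\mu}. $$

Fixing the first two moments on the whole real line yields the normal distribution in the same way, and fixing only a bounded support yields the uniform distribution.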

720 citations


Journal ArticleDOI
TL;DR: A unified framework for the analysis of a class of random allocation processes that include the birthday paradox, the coupon collector problem, least-recently-used caching in memory management systems under the independent reference model and the move-to-front heuristic of self-organizing search is introduced.

478 citations


Journal ArticleDOI
TL;DR: The sensitivity of these gravitational-wave detectors to sources of gravitational radiation is examined by considering the process by which data are analyzed in a noisy detector.
Abstract: The optimum design, construction, and use of the Laser Interferometer Gravitational Wave Observatory (LIGO), the French-Italian Gravitational Wave Observatory (VIRGO), or the Laser Gravitational Wave Observatory (LAGOS) gravitational radiation detectors depends upon accurate calculations of their sensitivity to different sources of radiation. Here I examine how to determine the sensitivity of these instruments to sources of gravitational radiation by considering the process by which data are analyzed in a noisy detector. The problem of detection (is a signal present in the output of the detector?) is separated from that of measurement (what are the parameters that characterize the signal in the detector output?). By constructing the probability that the detector output is consistent with the presence of a signal, I show how to quantify the uncertainty that the output contains a signal and is not simply noise. Proceeding further, I construct the probability distribution that the parametrization $\mu$ that characterizes the signal has a certain value. From the distribution and its mode I determine volumes $V(P)$ in parameter space such that $\mu \in V(P)$ with probability $P$ [owing to the random nature of the detector noise, the volumes $V(P)$ are always different, even for identical signals in the detector output], thus quantifying the uncertainty in the estimation of the signal parametrization. These techniques are suitable for analyzing the output of a noisy detector. If we are designing a detector, or determining the suitability of an existing detector for observing a new source, then we do not have detector output to analyze but are interested in the "most likely" response of the detector to a signal. I exploit the techniques just described to determine the "most likely" volumes $V(P)$ for detector output that would result in a parameter probability distribution with given mode. Finally, as an example, I apply these techniques to determine the anticipated sensitivity of the LIGO and LAGOS detectors to the gravitational radiation from a perturbed Kerr black hole.
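In schematic form (standard Gaussian-noise matched-filtering notation, assumed here rather than quoted from the paper), the parameter probability distribution in question is the posterior

$$ P(\mu \mid g) \propto P(\mu)\, \exp\!\left[ \langle g, h(\mu) \rangle - \tfrac{1}{2} \langle h(\mu), h(\mu) \rangle \right], $$

where $g$ is the detector output, $h(\mu)$ is the signal predicted for parameters $\mu$, and $\langle\cdot,\cdot\rangle$ is the noise-weighted inner product set by the detector's noise spectral density; the volumes $V(P)$ are then the smallest regions of parameter space containing posterior probability $P$.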

464 citations


01 Jan 1992
TL;DR: In this work, the authors cover uncertainties in measurements, probability distributions, error analysis, estimates of means and errors, Monte Carlo techniques, dependent and independent variables, least-squares fits to polynomials and to arbitrary functions, fitting of composite peaks, and direct application of the maximum likelihood method.
Abstract: Uncertainties in measurements, probability distributions, error analysis, estimates of means and errors, Monte Carlo techniques, dependent and independent variables, least-squares fit to a polynomial, least-squares fit to an arbitrary function, fitting composite peaks, direct application of the maximum likelihood. Appendices: numerical methods, matrices, graphs and tables, histograms and graphs, computer routines in Pascal.
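A minimal Python sketch of a weighted least-squares polynomial fit with a Monte Carlo estimate of the parameter uncertainties, standing in for the book's Pascal routines; the data, uncertainties, and polynomial degree below are invented for illustration.

    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic measurements y(x) with known Gaussian uncertainties sigma.
    x = np.linspace(0.0, 10.0, 21)
    sigma = 0.3 * np.ones_like(x)
    y = 1.5 + 0.8 * x - 0.05 * x**2 + rng.normal(0.0, sigma)

    # Weighted least-squares fit of a quadratic; np.polyfit takes weights w = 1/sigma.
    coeffs = np.polyfit(x, y, deg=2, w=1.0 / sigma)

    # Monte Carlo error estimate: refit many synthetic data sets generated from the
    # fitted model plus measurement noise, and look at the spread of the coefficients.
    y_fit = np.polyval(coeffs, x)
    refits = np.array([
        np.polyfit(x, y_fit + rng.normal(0.0, sigma), deg=2, w=1.0 / sigma)
        for _ in range(2000)
    ])
    print("fitted coefficients:", coeffs)
    print("Monte Carlo standard deviations:", refits.std(axis=0))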

399 citations


Book
06 Aug 1992
TL;DR: In this monograph, the authors provide a systematic account of the theory of generalized Gamma convolutions and related classes of probability distributions and densities, and several well-known probability distributions are treated in the accompanying examples.
Abstract: The aim of this monograph is to provide a systematic account of the theory of generalized Gamma convolutions and related classes of probability distributions and densities. Several well-known probability distributions are treated in the accompanying examples.

349 citations


Journal ArticleDOI
TL;DR: Product partition models as discussed by the authors assume that observations in different components of a random partition of the data are independent, and thus provide a convenient machinery for allowing the data to weight the partitions likely to hold; and inference about particular future observations may then be made by conditioning on the partition and then averaging over all partitions.
Abstract: Product partition models assume that observations in different components of a random partition of the data are independent. If the probability distribution of random partitions is in a certain product form prior to making the observations, it is also in product form given the observations. The product model thus provides a convenient machinery for allowing the data to weight the partitions likely to hold; and inference about particular future observations may then be made by first conditioning on the partition and then averaging over all partitions. These models apply with special computational simplicity to change point problems, where the partitions divide the sequence of observations into components within which different regimes hold. We show, with appropriate selection of prior product models, that the observations can eventually determine approximately the true partition.
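In the product form referred to above (generic notation, following the usual presentation of product partition models), the prior and posterior over partitions $\rho = \{S_1, \dots, S_b\}$ of the data are

$$ \Pr(\rho) \propto \prod_{i=1}^{b} c(S_i), \qquad \Pr(\rho \mid y) \propto \prod_{i=1}^{b} c(S_i)\, f(y_{S_i}), $$

where $c(S)$ is a prior cohesion for block $S$ and $f(y_S)$ is the marginal density of the observations in that block; since both expressions are products over blocks, the product form is preserved after observing the data.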

307 citations


Book
01 Jun 1992
TL;DR: Topics include the design of mechanical components and systems, Monte Carlo simulation, reliability-based optimum design, strength-based reliability and interference theory, reliability testing, time-dependent reliability of components and systems, and failure modes, event tree and fault tree analysis.
Abstract: Design of mechanical components and systems, Monte Carlo simulation, reliability-based optimum design, strength-based reliability and interference theory, reliability testing, time-dependent reliability of components and systems, failure modes, event tree and fault tree analysis, quality control and reliability, modeling of geometry, material strength and loads, structural reliability, weakest link and fail-safe systems, maintainability and availability, extremal distributions, random variables and probability distributions, functions of random variables, basic probability theory.
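A minimal Python sketch of strength-based (stress-strength) reliability estimated by Monte Carlo simulation, one of the topics listed above; the distributions and parameter values are invented for illustration.

    import numpy as np

    rng = np.random.default_rng(42)
    n = 1_000_000

    # Assumed lognormal component strength and applied stress (illustrative values, MPa).
    strength = rng.lognormal(mean=np.log(500.0), sigma=0.08, size=n)
    stress = rng.lognormal(mean=np.log(380.0), sigma=0.15, size=n)

    # Reliability = P(strength > stress); the failure probability is its complement.
    reliability = np.mean(strength > stress)
    print(f"estimated reliability R = {reliability:.5f}")
    print(f"estimated failure probability = {1.0 - reliability:.2e}")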

267 citations


Journal ArticleDOI
TL;DR: It is found that the frequency hopping schemes are inherently superior and their performance is not dependent on the synchronization of the hopping times for the different users.
Abstract: Results on the modeling of interference in a radio communication network and performance measures for the link as a function of distance are presented. It is assumed that a transmitter-receiver pair in a radio network is affected by a set of interferers, using the same modulation and power, whose positions are modeled as a Poisson field in the plane. Assuming a $1/r^{\gamma}$ propagation power loss law, the probability distributions for the noise at the receiver are found to be the stable distributions. Results are given for the probability of symbol error and link capacity as a function of the distance between the transmitter and receiver for direct sequence and frequency hopping spread spectrum schemes. It is found that the frequency hopping schemes are inherently superior and their performance is not dependent on the synchronization of the hopping times for the different users.
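A small simulation sketch of the interference model described above: interferers placed as a planar Poisson field, each contributing power $r^{-\gamma}$ at a receiver at the origin. The density, exponent, and guard radius are invented, and the sketch only illustrates the heavy-tailed behaviour numerically; it does not reproduce the paper's analytical stable-distribution results.

    import numpy as np

    rng = np.random.default_rng(1)

    def aggregate_interference(density, gamma, radius, trials):
        """Total interference power at the origin from a Poisson field of interferers."""
        totals = np.empty(trials)
        area = np.pi * radius**2
        for t in range(trials):
            k = rng.poisson(density * area)             # number of interferers in the disc
            r = radius * np.sqrt(rng.uniform(size=k))   # uniform points in a disc: r = R*sqrt(U)
            r = r[r > 1e-3]                             # keep interferers away from the origin
            totals[t] = np.sum(r ** (-gamma))           # 1/r^gamma power-loss law
        return totals

    samples = aggregate_interference(density=0.01, gamma=4.0, radius=100.0, trials=20_000)
    # A heavy upper tail (mean dominated by the largest observations) is the qualitative
    # signature of the stable distributions found analytically in the paper.
    print("median:", np.median(samples), "mean:", samples.mean(), "max:", samples.max())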

251 citations


Journal ArticleDOI
Ronald R. Yager
TL;DR: A unifying view for constructing specificity measures is described, the relationship between the specificity of a distribution and that of its negation is examined, and the case where the base set is continuous is considered.

Journal ArticleDOI
TL;DR: This paper demonstrates a new methodology for extended uncertainty analyses in public health risk assessments using Monte Carlo techniques that provides a quantitative way to estimate the probability distributions for exposure and health risks within the validity of the model used.
Abstract: Most public health risk assessments assume and combine a series of average, conservative, and worst-case values to derive a conservative point estimate of risk. This procedure has major limitations. This paper demonstrates a new methodology for extended uncertainty analyses in public health risk assessments using Monte Carlo techniques. The extended method begins, as do some conventional methods, with the preparation of a spreadsheet to estimate exposure and risk. This method, however, continues by modeling key inputs as random variables described by probability density functions (PDFs). Overall, the technique provides a quantitative way to estimate the probability distributions for exposure and health risks within the validity of the model used. As an example, this paper presents a simplified case study for children playing in soils contaminated with benzene and benzo(a)pyrene (BaP).
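A minimal sketch of the kind of Monte Carlo exposure calculation described above, written in Python; the input distributions, parameter values, and slope factor are placeholders invented for illustration, not values taken from the paper.

    import numpy as np

    rng = np.random.default_rng(7)
    n = 100_000

    # Key inputs modeled as random variables described by PDFs (illustrative choices).
    conc = rng.lognormal(mean=np.log(2.0), sigma=0.8, size=n)    # soil concentration, mg/kg
    ingestion = rng.triangular(50.0, 100.0, 200.0, size=n)       # soil ingestion rate, mg/day
    exposure_days = rng.uniform(100.0, 350.0, size=n)            # exposure days per year
    body_weight = rng.normal(20.0, 3.0, size=n)                  # child body weight, kg

    # Average daily dose (mg/kg-day), assuming a 6-year exposure averaged over 70 years.
    dose = conc * ingestion * 1e-6 * exposure_days * 6.0 / (body_weight * 365.0 * 70.0)

    slope_factor = 7.3            # placeholder cancer slope factor, (mg/kg-day)^-1
    risk = dose * slope_factor    # incremental lifetime cancer risk

    # Report the distribution of risk rather than a single conservative point estimate.
    for p in (50, 90, 95, 99):
        print(f"{p}th percentile risk: {np.percentile(risk, p):.2e}")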

Journal ArticleDOI
TL;DR: The authors examined the performance of mixed-effects logistic regression analysis when a main component of the model, the mixture distribution, is misspecified, and showed that estimates of model parameters, including the effects of covariates, typically are asymptotically biased, i.e. inconsistent.
Abstract: Mixed-effects logistic models are often used to analyze binary response data which have been gathered in clusters or groups. Responses are assumed to follow a logistic model within clusters, with an intercept which varies across clusters according to a specified probability distribution G. In this paper we examine the performance of mixed-effects logistic regression analysis when a main component of the model, the mixture distribution, is misspecified. We show that, when the mixture distribution is misspecified, estimates of model parameters, including the effects of covariates, typically are asymptotically biased, i.e. inconsistent. However, we present some approximations which suggest that the magnitude of the bias in the estimated covariate effects is typically small. These findings are corroborated by a set of simulations which also suggest that valid variance estimates of estimated covariate effects can be obtained when the mixture distribution is misspecified.
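For reference, the model under study can be written (in generic notation, not quoted from the paper) as

$$ \operatorname{logit} \Pr(Y_{ij} = 1 \mid b_i) = x_{ij}^{\top}\beta + b_i, \qquad b_i \sim G, $$

so each cluster's likelihood contribution integrates the within-cluster logistic terms over the mixing distribution $G$; the question examined is what happens to the maximum likelihood estimates when the $G$ assumed for estimation differs from the true one.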

Book
29 Jun 1992
TL;DR: Topics include preliminaries and basic mathematical notions, local computations with probabilities on graphical structures and influence diagrams, and a probabilistic analysis of compositional systems.
Abstract: PRELIMINARIES: BASIC MATHEMATICAL NOTIONS. PROBABILITY. Basic Notions. Independence and Conditional Probability: Events. Independence and Conditional Probability: Two Random Variates. Independence and Conditional Probabilities: Random Fields. Log-Linear Representations of Probability Distributions of Random Fields. Appendix: Where Does the Probability Come From? GRAPHS AND PROBABILITY. Graphs. From Hierarchical Log-Linear Models to Graphical Representation of Probability Distributions of Random Fields. Markov Properties. Decomposability and Collapsibility. Decomposability and Approximation. Appendix: Some Additional Facts Concerning Graphs. DECISION MAKING UNDER UNCERTAINTY. Decision Task. Decision Under Ignorance. Maximum Entropy Principle. Minimax Principle. LOCAL COMPUTATIONS WITH PROBABILITIES ON GRAPHICAL STRUCTURES AND INFLUENCE DIAGRAMS. Causal Graphs and Conditional Probability Tables. Local Representation of Probabilities. Local Computations: Inference Engine. Local Computations: Some Technicalities. Shachter's Method. KNOWLEDGE INTEGRATION METHODS. Completeness of Input Knowledge. Optimal Decision. Lagrange Multipliers Method. Iterative Proportional Fitting Procedure. D.S.S. Approximations. Studeny's Method. AN INTRODUCTION TO COMPOSITIONAL SYSTEMS. Basic Definitions and Assumptions on Compositional Systems. Some Properties of Combining Functions. Backward Chaining. Three-Valued Systems. The Most Modest Runs. Additional Information on Propositional Logic. COMPOSITIONAL SYSTEMS: AN ALGEBRAIC ANALYSIS. Compositional Systems and Ordered Abelian Groups. Comparative Properties of Compositional Systems. Finitely Generated Ordered Abelian Groups. Where Are Weights of Rules From? A PROBABILISTIC ANALYSIS OF COMPOSITIONAL SYSTEMS. Uncertainty and Probability in Classical Systems. Compositional Systems and Log-Linear Representation. Compositional Systems and Graphical Models: The Method of Guarded Use. THE DEMPSTER-SHAFER THEORY OF EVIDENCE AND ITS USE IN EXPERT SYSTEMS. An Introduction to Dempster-Shafer Theory. Dempster-Shafer Theory and Local Computations. Belief Functions and Compositional Systems. ESTIMATION OF PROBABILITIES AND STRUCTURES. Estimation of Probabilities. Estimation of Structures. References.

Journal ArticleDOI
01 Jan 1992-Genetica
TL;DR: A simple, completely general, and computationally efficient procedure for calculating probability distributions arising from fluctuation analysis and the formula for this procedure when cells in a colony have only grown for a finite number of generations after initial seeding are reported.
Abstract: Fluctuation analysis, which is often used to demonstrate random mutagenesis in cell lines (and to estimate mutation rates), is based on the properties of a probability distribution known as the Luria-Delbruck distribution (and its generalizations). The two main new results reported in this paper are (i) a simple, completely general, and computationally efficient procedure for calculating probability distributions arising from fluctuation analysis and (ii) the formula for this procedure when cells in a colony have only grown for a finite number of generations after initial seeding. It is also shown that the procedure reduces to one that was developed earlier when an infinite number of generations is assumed. The derivation of the generating function of the distribution is also clarified. The results obtained should also be useful to experimentalists when only a relatively short time elapses between seeding and harvesting cultures for fluctuation analysis.
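A sketch of the earlier, infinite-generation recursion to which the paper's general procedure is said to reduce: the compound-Poisson recursion for the classic Luria-Delbruck distribution with clone-size probabilities $q_j = 1/(j(j+1))$ and expected number of mutations $m$. The paper's finite-generation formula is not reproduced here.

    import math

    def luria_delbruck_pmf(m, n_max):
        """P(0..n_max mutants) for the classic Luria-Delbruck distribution."""
        p = [math.exp(-m)]                          # p_0 = exp(-m)
        for n in range(1, n_max + 1):
            # p_n = (m / n) * sum_{k=0}^{n-1} p_k / (n - k + 1)
            s = sum(p[k] / (n - k + 1) for k in range(n))
            p.append(m * s / n)
        return p

    probs = luria_delbruck_pmf(m=2.0, n_max=50)
    print("P(0 mutants) =", probs[0])
    print("P(5 mutants) =", probs[5])
    print("mass in 0..50 =", sum(probs))            # the remainder lies in the long upper tail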

Journal ArticleDOI
TL;DR: A condition for approximate decoherence is proposed, and it is argued that most probability sum rules will be satisfied to approximately the same degree; an inequality bounding the size of the off-diagonal terms of the decoherence functional is also derived.
Abstract: We study a formulation of quantum mechanics in which the central notion is that of a quantum-mechanical history---a sequence of events at a succession of times. The primary aim is to identify sets of ``decoherent'' (or ``consistent'') histories for the system. These are quantum-mechanical histories suffering negligible interference with each other, and, therefore, to which probabilities may be assigned. These histories may be found for a given system using the so-called decoherence functional. When the decoherence functional is exactly diagonal, probabilities may be assigned to the histories, and all probability sum rules are satisfied exactly. We propose a condition for approximate decoherence, and argue that it implies that most probability sum rules will be satisfied to approximately the same degree. We also derive an inequality bounding the size of the off-diagonal terms of the decoherence functional. We calculate the decoherence functional for some simple one-dimensional systems, with a variety of initial states. For these systems, we explore the extent to which decoherence is produced using two different types of coarse graining. The first type of coarse graining involves imprecise specification of the particle's position. The second involves coupling the particle to a thermal bath of harmonic oscillators and ignoring the details of the bath (the Caldeira-Leggett model). We argue that both types of coarse graining are necessary in general. We explicitly exhibit the degree of decoherence as a function of the temperature of the bath, and of the width to within which the particle's position is specified. We study the diagonal elements of the decoherence functional, representing the probabilities for the possible histories of the system. To the extent that the histories decohere, we show that the probability distributions are peaked about the classical histories of the system, with the distribution of their initial positions and momenta given by a smeared version of the Wigner function. We discuss this result in connection with earlier uses of the Wigner function in this context. We find that there is a certain amount of tension between the demands of decoherence and peaking about classical paths.
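For orientation (standard decoherent-histories notation, assumed here rather than quoted from the paper), the decoherence functional for a pair of histories $\alpha, \alpha'$ built from projections $P_{\alpha_k}(t_k)$ acting on an initial state $\rho$ is

$$ D(\alpha', \alpha) = \operatorname{Tr}\!\left[ P_{\alpha'_n}(t_n) \cdots P_{\alpha'_1}(t_1)\, \rho\, P_{\alpha_1}(t_1) \cdots P_{\alpha_n}(t_n) \right], $$

and when the off-diagonal terms are negligible the diagonal elements $p(\alpha) = D(\alpha, \alpha)$ can consistently be interpreted as probabilities satisfying the sum rules.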

Journal ArticleDOI
TL;DR: In this paper, a model for the probability distribution of the rainflow stress range is presented, based on a mixed-distribution Weibull model whose parameters can be evaluated from only two spectral properties, namely the irregularity factor $I$ and a bandwidth parameter $\beta_{0.75}$.

Journal ArticleDOI
P. Balaban, J. Salz
TL;DR: The probability distributions of the data rates that can be supported by optimum receiver structures as well as the distribution of the Shannon capacity are studied and the dependences among the important system parameters are exhibited graphically for several illustrative examples including QPSK.
Abstract: For Pt.I, see ibid., vol.40, no.5, p.885-94 (1992). The probability distributions of the data rates that can be supported by optimum receiver structures as well as the distribution of the Shannon capacity are studied. The dependences among the important system parameters are exhibited graphically for several illustrative examples including QPSK. At outage probabilities >

Book
01 Jun 1992
TL;DR: In this book, the authors cover reliability mathematics, reliability data analysis, reliability prediction from stress-strength models, system reliability modelling with two-state and three-state models, reliability evaluation of flow networks, maintainability analysis, and system analysis through fault trees.
Abstract: 1. Reliability Engineering: An overview. Historical development. Reliability: A birth-to-death problem. Reliability: An interdisciplinary effort. Reliability education and research. Problems of developing countries. Reliability prediction and analysis. Problems in prediction and analysis. Challenges for future. Scope of the book. 2. Reliability Mathematics. Classical set theory. Boolean algebra. Sample space. Definitions of probability. Basic properties of probability. Independent events. Conditional probability. Multiplication theorem. Total probability theorem. Bayes' theorem. Random variables. Probability distributions. Cumulative distributions. Mathematical expectation. Variance. Covariance and correlation. Moments. Moment generating functions. Probability distributions. Joint probability distributions. Distributions of several random variables. Some useful limit theorems. Estimation theory. Laplace transform. Markov processes. Random number generation. 3. Reliability Data Analysis and Management. The reliability function. Mean time to failure. Variance. The bathtub curve. Linear hazard models. Other hazard models. Analysis of failure data. Probability graph papers. Illustrations. Hazard function plots. Selection of a distribution. Statistical estimation of failure data. Interval estimates. Reliability data management. 4. Reliability Prediction from Stress-Strength Models. Stresses due to internal and external environments. Physics of failures. Reliability from stress-strength distributions. Reliability from similar stress-strength distributions. Reliability from dissimilar stress-strength distributions. Graphical approach. Time dependent stress-strength models. Environmental factors. Environmental testing. Test specifications. Stress derating. Estimation of part failure rate. 5. System Reliability Modelling. System modelling. Assumptions for modelling. Two state modelling. Three-state models. 6. Reliability Evaluation Techniques. Non path sets or cut sets approaches. Tie set and cut set approaches. Reliability evaluation of flow networks. Path sets/cut sets enumeration. 7. Maintainability Analysis. Measures of system performance. State space approach. Network approach. Conditional probability approach. Three state systems. Preventive maintenance. Condition-based maintenance. 8. System Analysis Through Fault Trees. Important definitions. Event oriented analysis. Fault tree definitions and symbols. Structure function and coherence. Fault tree construction. Fault tree simplification. Fault tree evaluation. Importance measures of events. Measures of importance in multistate systems. Modularization in fault trees. Common cause/dependent failure analysis. Automatic synthesis of fault trees. Computer codes for fault tree analysis. References. Appendices. Subject index.
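As a concrete instance of reliability prediction from stress-strength models (the standard normal-normal case, not quoted from this book): for independent strength $S \sim N(\mu_S, \sigma_S^2)$ and stress $L \sim N(\mu_L, \sigma_L^2)$,

$$ R = \Pr(S > L) = \Phi\!\left( \frac{\mu_S - \mu_L}{\sqrt{\sigma_S^2 + \sigma_L^2}} \right), $$

where $\Phi$ is the standard normal distribution function; dissimilar stress-strength distributions generally call for numerical integration or Monte Carlo simulation instead.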

Journal ArticleDOI
J. Biernat, J. Jarnicki, K. Kaplon, A. Kuras, George J. Anders
TL;DR: In this article, a new approach to estimate life distributions at nominal conditions from the results of accelerated life testing of electrical insulating materials is introduced, where nonlinear optimization techniques are applied in conjunction with linear regression analysis.
Abstract: The authors introduced a new approach to estimate life distributions at nominal conditions from the results of accelerated life testing of electrical insulating materials. A very general family of probability distributions is introduced, and a best fit member of this family is used to represent life data at each stress level. Nonlinear optimization techniques are applied in conjunction with linear regression analysis. In any accelerated life testing study important questions pertain to the minimal and maximal stress levels to be applied. A method of determination of the minimal stress level as well as the suitable number of tests based on reliability considerations is presented. A numerical example based on test data and a user-friendly computer program are presented.

Book ChapterDOI
06 Apr 1992
TL;DR: Markov chain simulation has emerged as a powerful algorithmic paradigm whose chief application, as discussed in this paper, is the random sampling of combinatorial structures from a specified probability distribution; such sampling lies at the heart of efficient probabilistic algorithms for a wide variety of problems, such as approximating the size of combinatorially defined sets, estimating the expectation of certain operators in statistical physics, and combinatorial optimisation by stochastic search.
Abstract: In recent years, Markov chain simulation has emerged as a powerful algorithmic paradigm. Its chief application is to the random sampling of combinatorial structures from a specified probability distribution. Such a sampling procedure lies at the heart of efficient probabilistic algorithms for a wide variety of problems, such as approximating the size of combinatorially defined sets, estimating the expectation of certain operators in statistical physics, and combinatorial optimisation by stochastic search.
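A minimal Python example of the paradigm: a single-site Markov chain (Glauber dynamics) whose stationary distribution is uniform over the independent sets of a graph. The small example graph and the fixed run length are invented; how long the chain must run for the samples to be nearly uniform is exactly the mixing-time question this line of work addresses.

    import random

    random.seed(0)

    # A small example graph on 6 vertices, given by adjacency sets.
    graph = {0: {1, 2}, 1: {0, 3}, 2: {0, 3}, 3: {1, 2, 4}, 4: {3, 5}, 5: {4}}

    def glauber_independent_set(graph, steps):
        """Run a reversible single-site chain over the independent sets of `graph`."""
        in_set = {v: False for v in graph}
        for _ in range(steps):
            v = random.choice(list(graph))
            if in_set[v]:
                # Remove v with probability 1/2, matching the probability of an add move,
                # so transition probabilities are symmetric and the uniform law is stationary.
                if random.random() < 0.5:
                    in_set[v] = False
            else:
                # Add v with probability 1/2, but only if no neighbour is already in the set.
                if not any(in_set[u] for u in graph[v]) and random.random() < 0.5:
                    in_set[v] = True
        return {v for v, occupied in in_set.items() if occupied}

    print("sampled independent set:", glauber_independent_set(graph, steps=10_000))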

Journal ArticleDOI
Peter C. Fishburn
TL;DR: The problem of characterizing all binary probability systems on a finite set that are induced by probability distributions over the family of linear orders of the set is studied in this paper, with a focus on systems of inequalities that define the facets of the space of all induced binary probabilities.


Book
21 Dec 1992
TL;DR: This book reviews the Poisson and exponential distributions and the M/M/1/∞/FIFO system, and discusses the design of simulation experiments, distributed simulation, and deadlock resolution in distributed simulations.
Abstract: BASIC CONCEPTS AND TERMINOLOGY. Concept of a System. System Methodology. Advantages and Disadvantages of Simulation. Simulation Terminology. PROBABILITY CONCEPTS IN SIMULATION. Probability. Set Theory, Compound Events. Conditional Probability, Independent Events. Discrete Distributions. Continuous Distributions. RANDOM VARIABLES AND PROBABILITY DISTRIBUTIONS. Functions of a Random Variable. Moments. Generating Functions. Multivariate Distributions. DETAILED ANALYSIS OF COMMON PROBABILITY DISTRIBUTIONS. Bernoulli Distribution. Binomial Distribution. Geometric Distribution. Poisson Distribution. Uniform Distribution. Normal Distribution. Exponential Distribution. Chi-Square Distribution. Student's t-Distribution. F-Distribution. STATISTICS AND RANDOM SAMPLES. Descriptive Statistics and Frequency Diagrams. Statistics and Sampling Distributions. Method of Least Squares. Estimation. Confidence Interval Estimates. STATISTICAL TESTS. Tests of Hypotheses. Student's t-Test. The F-Test. The Chi-Square Goodness-of-Fit Test. The Kolmogorov-Smirnov Test. GENERATION OF RANDOM NUMBERS. Pseudorandom Numbers. Algorithms for Generating Pseudorandom Numbers. Testing and Validating Pseudorandom Sequences. Generation of Nonuniform Variates. DISCRETE SYSTEM SIMULATION. Simulation Terminology. Time Management Methods. Object Generation. Queue Management and List Processing. Collecting and Recording Simulation Data. MODEL VALIDATION. Evaluation of the Simulation Model. Validation Description. Sampling Methods. THE DESIGN OF SIMULATION EXPERIMENTS. Completely Randomized Design. Randomized Complete Block Design. Factorial Design. Network Simulation Model Performance Analysis. ESTIMATION OF MODEL PARAMETERS. Optimization of Response Surfaces. Heuristic Search. Complete Enumeration. Random Search. Steepest Ascent (Descent). Coordinate or Single-Variable Search. Parallel Tangent Search. Conjugate Direction Search. OUTPUT ANALYSIS. Analysis of Simulation Results. Estimation and Confidence Limits. Initial Conditions and Inputs. Simulation Model Run Length. Variance Reduction. LANGUAGES FOR DISCRETE SYSTEM SIMULATION. Language Characteristics. Use of Multipurpose Languages. Simulation Languages. DISTRIBUTED SIMULATION. The System Simulation Problem. Decomposition of a Simulation. Synchronization of Distributed Model Components. Deadlock Resolution in Distributed Simulations. QUEUEING THEORY AND SIMULATION. Review of the Poisson and Exponential Distributions. The M/M/1/∞/FIFO System. Summary Measures for the M/M/1/∞/FIFO System. The M/M/1/K/FIFO System. The M/M/C/∞/FIFO System. Priority Queueing Systems. APPENDIX TABLES. Normal Distribution Function. Student's t-Distribution Function. Chi-Square Distribution Function. F-Distribution Function. Poisson Distribution Function. Critical Values for the Kolmogorov-Smirnov Test. INDEX.
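For reference, the summary measures for the M/M/1/∞/FIFO system covered in the queueing chapter are the standard ones: with arrival rate $\lambda$, service rate $\mu$, and traffic intensity $\rho = \lambda/\mu < 1$,

$$ P_n = (1-\rho)\rho^{\,n}, \qquad L = \frac{\rho}{1-\rho}, \qquad L_q = \frac{\rho^2}{1-\rho}, \qquad W = \frac{1}{\mu - \lambda}, \qquad W_q = \frac{\rho}{\mu - \lambda}, $$

giving the steady-state queue-length distribution, the mean numbers in system and in queue, and the mean times in system and in queue.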

Journal ArticleDOI
TL;DR: This paper offers some conclusions on how many and which parameters should be estimated, and which distributions should be used, when modelling project networks under uncertainty.
Abstract: When modelling project networks under uncertainty, there are a variety of probability distributions from which to choose for activity-durations. From experience of risk analysis in practice, this paper offers some conclusions on how many and which parameters should be estimated, and which distributions should be used.

Book ChapterDOI
01 Jan 1992
TL;DR: An improved version of a self-organizing network model, originally proposed at ICANN-91 and since then applied to various problems, is described; the improvements are the generalization of the model to arbitrary dimension and the introduction of a local estimate of the probability density.
Abstract: In this paper an improved version of a self-organizing network model is described which was proposed at ICANN-91 [3] and since then has been applied to various problems [1,2,5]. The improvements presented here are the generalization of the model to arbitrary dimension and the introduction of a local estimate of the probability density. The latter leads to a very clear distinction between necessary and superfluous neurons with respect to modeling a given probability distribution. This makes it possible to automatically generate network structures that are nearly optimally suited for the distribution at hand.

Journal ArticleDOI
TL;DR: In this article, an abstract combinatoric result and a concrete geometric result are given which guarantee the existence of a depth $d + 1$ identity or inequality for the indicator function of a union of a finite collection of events, that is, an expression which is a linear combination of indicator functions of at most $(d+ 1)$fold intersections of the events.
Abstract: Improvements to the classical inclusion-exclusion identity are developed. There are two main results: an abstract combinatoric result and a concrete geometric result. In the abstract result conditions are given which guarantee the existence of a depth $d + 1$ identity or inequality for the indicator function of a union of a finite collection of events, that is, an expression which is a linear combination of indicator functions of at most $(d + 1)$-fold intersections of the events. Such an identity or inequality can be integrated with respect to any probability measure to yield a probability identity or inequality. Connections are given to previous work on Bonferroni-type inequalities. The concrete result says that there is a depth $d + 1$ identity for the union of finitely many balls in $d$-dimensional Euclidean space. With a single correction term this result also holds in the $d$-dimensional sphere. These results form the basis for a discrete theory of tubes, which up to now has been continuous in nature. The spherical result is used to give a simulation method for finding critical probabilities for multiple-comparisons procedures, and a computer program implementing the method is described. Numerical results are presented which demonstrate that in the tails of the distribution probability estimates based on the method tend to exhibit less variability than estimates based on naive simulation.
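For context, the classical inclusion-exclusion identity being improved reads, at the indicator-function level,

$$ \mathbf{1}\!\left\{\bigcup_{i=1}^{n} A_i\right\} = \sum_{i} \mathbf{1}_{A_i} - \sum_{i<j} \mathbf{1}_{A_i \cap A_j} + \cdots + (-1)^{n+1}\, \mathbf{1}_{A_1 \cap \cdots \cap A_n}, $$

and truncating the right-hand side after an odd (even) depth gives an upper (lower) bound, the Bonferroni inequalities. The depth-$(d+1)$ results here replace the full expansion by a linear combination of indicators of at most $(d+1)$-fold intersections, which integrates against any probability measure to give the corresponding identity or inequality.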

Journal ArticleDOI
TL;DR: An improved first-order reliability approach is presented wherein the linearization point varies to match the output level whose exceedance probability is sought; this circumvents some of the problems of central-value linearization while retaining much of its simplicity.

Journal ArticleDOI
TL;DR: In this article, the problem of designing mechanical systems or components under uncertainty is considered, and the basic idea is to ensure quality control at the design stage by minimizing sensitivity of the response to uncertain variables by proper selection of design variables.
Abstract: The problem of designing mechanical systems or components under uncertainty is considered. The basic idea is to ensure quality control at the design stage by minimizing sensitivity of the response to uncertain variables by proper selection of design variables. The formulation does not involve probability distributions. It is proved, however, that when the response is linear in the uncertain variable, reduction in sensitivity implies a lower probability of failure. The proof is generalized to the non-linear case under certain restrictions. In one example, the design of a three-bar truss is considered. The length of one of the bars is considered to be the uncertain variable while cross-sectional areas are the design variables. The sensitivity of the x-displacement is minimized. The constrained optimization problem is solved using a nonlinear programming code. A criterion which can help identify some of the problems where robustness in design is critical is discussed.

Book ChapterDOI
01 Jan 1992
TL;DR: In the applications of equilibrium theory, one occasionally encounters situations where consumption sets are naturally compact as mentioned in this paper, where consumption is restricted to a given budget set and the equilibrating variables are ration-coupons prices.
Abstract: In the applications of equilibrium theory one occasionally encounters situations where consumption sets are naturally compact. Two examples are: (i) fix-price equilibria where consumption is restricted to a given budget set and the equilibrating variables are ration-coupon prices (as in Dreze and Muller, 1980); (ii) models where the choice variables are probability distributions on a fixed number of indivisible objects (as in Hylland and Zeckhauser, 1979).