
Showing papers on "Mathematical statistics" published in 1992


Book
01 Jul 1992
TL;DR: This book on order statistics covers basic distribution theory, discrete order statistics, order statistics from specific distributions, moment relations, bounds and approximations, characterizations, the use of order statistics in statistical inference, asymptotic theory, and record values.
Abstract: Basic Distribution Theory. Discrete Order Statistics. Order Statistics from Some Specific Distributions. Moment Relations, Bounds, and Approximations. Characterizations Using Order Statistics. Order Statistics in Statistical Inference. Asymptotic Theory. Record Values. Bibliography. Indexes.

1,605 citations


Journal ArticleDOI
TL;DR: A textbook for a one-semester graduate course for students specializing in mathematical statistics or in multivariate analysis, and a reference for theoretical as well as applied statisticians, that confines its discussion to quadratic forms and second degree polynomials in real normal random vectors and matrices.
Abstract: Textbook for a one-semester graduate course for students specializing in mathematical statistics or in multivariate analysis, or reference for theoretical as well as applied statisticians; confines its discussion to quadratic forms and second degree polynomials in real normal random vectors and matrices.

577 citations





Journal ArticleDOI
TL;DR: A review of Order Statistics and Inference: Estimation Methods by N. Balakrishnan and A. C. Cohen (Academic Press, San Diego, 1991).
Abstract: 1. Order Statistics and Inference—Estimation Methods. By N. Balakrishnan and A. C. Cohen. ISBN 012 076948 4. Academic Press, San Diego, 1991. xx + 312 pp. $79.95.

389 citations


Journal ArticleDOI
TL;DR: It is not proved that the introduction of additive noise to the training vectors always improves network generalization, but the analysis suggests mathematically justified rules for choosing the characteristics of noise if additive noise is used in training.
Abstract: The possibility of improving the generalization capability of a neural network by introducing additive noise to the training samples is discussed. The network considered is a feedforward layered neural network trained with the back-propagation algorithm. Back-propagation training is viewed as nonlinear least-squares regression and the additive noise is interpreted as generating a kernel estimate of the probability density that describes the training vector distribution. Two specific application types are considered: pattern classifier networks and estimation of a nonstochastic mapping from data corrupted by measurement errors. It is not proved that the introduction of additive noise to the training vectors always improves network generalization. However, the analysis suggests mathematically justified rules for choosing the characteristics of noise if additive noise is used in training. Results of mathematical statistics are used to establish various asymptotic consistency results for the proposed method. Numerical simulations support the applicability of the training method.
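
The kernel-estimate interpretation suggests a simple recipe: jitter the training inputs with zero-mean Gaussian noise whose standard deviation plays the role of the kernel bandwidth. A minimal numpy sketch of that recipe (the function name, the bandwidth value, and the toy data are illustrative, not taken from the paper):

```python
import numpy as np

def noisy_batch(X, sigma, rng):
    """Return a copy of the training inputs X with i.i.d. Gaussian noise added.
    Presenting freshly jittered copies each epoch to a least-squares learner
    amounts, on average, to fitting a Gaussian-kernel (Parzen) smoothed version
    of the empirical input density, with bandwidth sigma."""
    return X + rng.normal(scale=sigma, size=X.shape)

rng = np.random.default_rng(0)
X_train = rng.uniform(-1.0, 1.0, size=(100, 2))   # hypothetical training inputs
for epoch in range(10):
    X_jittered = noisy_batch(X_train, sigma=0.1, rng=rng)
    # ... pass X_jittered, with the original targets, to the back-propagation step
```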

374 citations



Journal ArticleDOI
TL;DR: Naive Set Theory.

235 citations


Book
01 Jan 1992
TL;DR: This book sets out statistical methods that can be used in the preparation, execution, evaluation and interpretation of experiments of a random nature, and contains detailed sections on breakdown statistics of typical electrical insulating arrangements.
Abstract: This book sets out statistical methods that can be used in the preparation, execution, evaluation and interpretation of experiments of a random nature. It also includes the assessment of test methods used in high-voltage engineering from a statistical standpoint, and contains detailed sections on breakdown statistics of typical electrical insulating arrangements. Separate special areas of mathematical statistics - such as statistical trial planning, questions of reliability, and stochastic processes - are mentioned briefly. The extensive bibliography points the way to more advanced work. Emphasis is placed on easy comprehension, clarity, visual representation and practical relevance, and each process is explained using at least one example. The book is written from the engineer's point of view: mathematical deduction is dispensed with, while mathematical logic and terminological accuracy are ensured. This book is directed both at the practising engineer and at the student of electrical engineering at the stages of study involving independent creative experimental activity. Physicists and mathematicians encountering problems of application will also find the book invaluable.

220 citations


Book
28 Oct 1992
TL;DR: A theory of optimal sampling is developed in order to prove the various properties of the procedures, which turn out to be optimal in a Bayesian sense as well as for problems with side conditions (e.g., specified bounds on error probabilities or expected sampling costs).
Abstract: This volume is concerned with statistical procedures where the data are collected in sequentially designed groups. The basic premise here is that the expected total sample size is not always the appropriate criterion for evaluating statistical procedures, especially for nonlinear sampling costs (e.g., additive fixed costs) and in clinical trials. In fact, this criterion seems to have been a hindrance to the practical use of Wald's sequential probability ratio test (SPRT) despite its well-known optimum properties. This volume systematically develops decision procedures which retain the possibility of early stopping and remove some of the disadvantages of one-at-a-time sampling. In particular, for generalizations of the SPRT algorithms, methods for computing characteristics (such as operating characteristics or power functions, expected sampling costs, etc.) are developed and implemented. The procedures turn out to be optimal in a Bayesian sense as well as for problems with side conditions (e.g., specified bounds on error probabilities or expected sampling costs). A theory of optimal sampling is developed in order to prove the various properties of the procedures.
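
For orientation, the classical one-at-a-time SPRT that the volume generalizes can be sketched as follows for Bernoulli observations. This is the textbook version with Wald's approximate boundaries, not the book's group-sequential algorithms; parameter names are illustrative.

```python
import numpy as np

def sprt_bernoulli(xs, p0, p1, alpha=0.05, beta=0.10):
    """Wald's sequential probability ratio test of H0: p = p0 vs H1: p = p1
    on a stream of 0/1 observations xs (one-at-a-time sampling).
    Returns the decision and the number of samples used."""
    upper = np.log((1 - beta) / alpha)   # cross above: accept H1
    lower = np.log(beta / (1 - alpha))   # cross below: accept H0
    llr = 0.0
    for n, x in enumerate(xs, start=1):
        llr += x * np.log(p1 / p0) + (1 - x) * np.log((1 - p1) / (1 - p0))
        if llr >= upper:
            return "accept H1", n
        if llr <= lower:
            return "accept H0", n
    return "undecided", n

rng = np.random.default_rng(0)
decision, n_used = sprt_bernoulli(rng.random(200) < 0.65, p0=0.5, p1=0.7)
```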

Book ChapterDOI
01 Jan 1992
TL;DR: A physical process (a change of a certain physical system) is called stochastically determined if, knowing a state X_0 of the system at a certain moment of time t_0, we also know the probability distribution for all the states X of this system at the moments t > t_0.
Abstract: A physical process (a change of a certain physical system) is called stochastically determined if, knowing a state X_0 of the system at a certain moment of time t_0, we also know the probability distribution for all the states X of this system at the moments t > t_0.
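
In symbols, one standard way to state this condition (ordinary Markov transition-probability notation, not quoted from the chapter) is:

```latex
% Stochastic determinism: the state X_0 at time t_0 fixes the distribution
% of every later state; P(t_0, X_0; t, A) is the transition probability.
\Pr\bigl( X(t) \in A \mid X(t_0) = X_0 \bigr) \;=\; P(t_0, X_0;\, t, A),
\qquad t > t_0 .
```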

Journal ArticleDOI
TL;DR: In this paper, the authors introduce a new class of probability models which are referred to as distributions of fractional order statistics, and consider the potential efficacies of various member distributions within the class for hydrologic data analysis.
Abstract: A critical issue in parametric methods of frequency analysis, regardless of the phenomenon being modeled, is that of selection of a form of probability distribution to be applied. When one is interested in continuous distributions there exists little theoretical guidance, other than perhaps that provided by the central limit theorem or the (asymptotic) results of extreme value theory, upon which one may base a choice. This paper, in a very general way, introduces a whole new class of probability models which are referred to as distributions of fractional order statistics. The potential efficacies of various member distributions within the class for hydrologic data analysis are also rationalized in a very intuitive way. Considered in some detail is an application of the theory of fractional order statistics to generalize the Gaussian distribution. Monte Carlo results comparing the performance of the generalized distribution with other common hydrologic models are also set forth.
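
The general construction behind distributions of fractional order statistics is to pass a Beta variate with possibly non-integer parameters, the "fractional order statistic" of a uniform sample, through a parent quantile function. A hedged Python sketch of that construction with a Gaussian parent (the function name and parameter values are illustrative and not necessarily the parameterization used in the paper):

```python
import numpy as np
from scipy import stats

def fractional_order_stat_sample(F_inv, a, b, size, rng):
    """Draw from a distribution of fractional order statistics: take
    U ~ Beta(a, b), a fractional (possibly non-integer) order statistic of a
    uniform sample, and map it through a parent quantile function F_inv.
    With F_inv equal to the standard normal quantile, this generalizes the
    Gaussian distribution."""
    u = rng.beta(a, b, size=size)
    return F_inv(u)

rng = np.random.default_rng(0)
x = fractional_order_stat_sample(stats.norm.ppf, a=2.5, b=1.5, size=1000, rng=rng)
```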

Journal ArticleDOI
TL;DR: This work describes a multivariate generalization of the Hodges and Lehmann estimator of a location shift that can be obtained via the multivariate U statistic with the Mann-Whitney-Wilcoxon kernel and describes large-sample group sequential interval estimators and tests based on an aggregate estimate of the location shift combined over all of the repeated measures.
Abstract: Many studies involve the collection of multivariate observations, such as repeated measures, on two groups of subjects who are recruited over time, i.e., with staggered entry of subjects. Various marginal distribution-free multivariate methods have been proposed for the analyses of such multivariate observations where some measures may be missing at random. Using the multivariate U statistic of Wei and Johnson (1985, Biometrika 72, 359-364), we describe the group sequential analysis of such a study where the multivariate observations are observed sequentially--both within and among subjects. We describe a multivariate generalization of the Hodges and Lehmann (1963, Annals of Mathematical Statistics 34, 598-611) estimator of a location shift that can be obtained via the multivariate U statistic with the Mann-Whitney-Wilcoxon kernel. We then describe large-sample group sequential interval estimators and tests based on an aggregate estimate of the location shift combined over all of the repeated measures. We also describe how the same steps could be employed to perform a group sequential analysis based on any one of the variety of marginal multivariate methods that have been proposed. These methods are applied to a real-life example.
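
As a reminder of the building block, the two-sample Hodges-Lehmann shift estimator obtained from the Mann-Whitney-Wilcoxon kernel is the median of all pairwise differences. A naive coordinate-wise sketch for repeated measures follows; the paper's group-sequential machinery and missing-data handling are not shown, and the function names are illustrative.

```python
import numpy as np

def hodges_lehmann_shift(x, y):
    """Hodges-Lehmann estimate of the location shift between two samples:
    the median of all pairwise differences y_j - x_i (the Mann-Whitney-
    Wilcoxon kernel)."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    diffs = y[:, None] - x[None, :]          # all pairwise differences
    return np.median(diffs)

def marginal_hl_shift(X, Y):
    """Apply the estimator column by column to two groups of multivariate
    (repeated-measures) observations, one row per subject."""
    return np.array([hodges_lehmann_shift(X[:, k], Y[:, k])
                     for k in range(X.shape[1])])
```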

Book
30 Nov 1992
TL;DR: This book covers the design and analysis of randomized clinical trials and the accompanying sample size program, with derivations of the large-sample distributions of the logrank and Kaplan-Meier statistics, applications of the logrank test when survival is exponential, trial planning under proportional hazards, and a review of mathematical statistics.
Abstract: HOW TO USE THE SAMPLE SIZE PROGRAM. Identification of Parameters. DESIGN AND ANALYSIS OF RANDOMIZED CLINICAL TRIALS. Formulation of the Therapeutic Question. One-Sided vs. Two-Sided Question. Design of the Clinical Trial. Statistical Considerations. Conduct of the Trial. Analysis and Reporting of the Trial. Critical Elements. Binomial Comparison. Kaplan-Meier Comparison (Large Sample). Logrank Test. DERIVATION OF THE STATISTICAL RESULTS. Derivation of the Large Sample Distribution of the Logrank Statistic. Derivation of the Large Sample Distribution of the Kaplan-Meier Statistic. Difference between Kaplan-Meier Curves. Exponential Survival. Applications of the Logrank Test When Survival is Exponential. Exponential Survival with a Poisson Accrual Process. Extension to the Two-Sample Problem. Exponential Survival with "Up-Front" Accrual. Proportional Hazard Models and the Exponential Distribution. Consideration in Planning a Trial Under Proportional Hazards: Putting It All Together. Losses to Follow-Up and Sample Size Adjustment. Interpretation of the Program. Multi-Treatment Trials. Stratified Logrank Test. Intuitive Justification Why the Logrank Test and Kaplan-Meier Estimation for Actual Accrual Process Behave in the Limit in the Same Way as the Fixed Binomial Assumption. Alternate Standard Error for the Kaplan-Meier Estimator. Connection between Kaplan-Meier and Binomial. FIGURES. APPENDIX I: A Review of Mathematical Statistics. Expected Value of a Function of a Random Vector. Special Distributions.
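
For the logrank comparison under proportional hazards, the kind of large-sample calculation such a sample size program performs can be illustrated with Schoenfeld's required-events formula for equal allocation. This is a standard approximation, not necessarily the exact derivation used in the book.

```python
import math
from scipy.stats import norm

def logrank_events(hazard_ratio, alpha=0.05, power=0.80):
    """Approximate number of events for a two-arm logrank test with equal
    allocation under proportional hazards (Schoenfeld's formula):
        d = 4 * (z_{1-alpha/2} + z_{power})**2 / (ln HR)**2 ."""
    z_a = norm.ppf(1 - alpha / 2)
    z_b = norm.ppf(power)
    return math.ceil(4 * (z_a + z_b) ** 2 / math.log(hazard_ratio) ** 2)

print(logrank_events(hazard_ratio=0.67))   # about 196 events for HR = 0.67
```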




Book
30 Apr 1992
TL;DR: This book introduces the concepts of probability and statistics, covering the theory of probability, sequential methods, set theory, tables of cumulative distributions, and the sufficiency of procedures.
Abstract: Concepts of probability. Theory of probability. Concepts of statistics. Sequential methods. Set theory. Tables of cumulative distribution. Sufficiency of procedures.

Book ChapterDOI
01 Jan 1992
TL;DR: The basic concepts and results of probability and stochastic processes needed later in the book are reviewed here and PC-Exercises, based on pseudo-random number generators, are used extensively to help the reader to develop an intuitive understanding of the material.
Abstract: The basic concepts and results of probability and stochastic processes needed later in the book are reviewed here. The emphasis is descriptive and PC-Exercises (PC = Personal Computer), based on pseudo-random number generators introduced in Section 3, are used extensively to help the reader to develop an intuitive understanding of the material. Statistical tests are discussed briefly in the final section.
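
A pseudo-random number generator of the kind such PC-Exercises rely on can be sketched as a linear congruential generator. The constants below are the common Numerical Recipes choice, not necessarily the generator introduced in Section 3.

```python
def lcg(seed, a=1664525, c=1013904223, m=2**32):
    """Minimal linear congruential generator x_{n+1} = (a*x_n + c) mod m,
    returned as pseudo-uniform variates on [0, 1)."""
    x = seed
    while True:
        x = (a * x + c) % m
        yield x / m

gen = lcg(seed=12345)
sample = [next(gen) for _ in range(5)]   # five pseudo-uniform numbers
```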

Journal ArticleDOI
TL;DR: A geostochastic system called FASPF was developed by the U.S. Geological Survey for their 1989 assessment of undiscovered petroleum resources in the United States, using a field-size geological model and an analytic probabilistic methodology, which resulted in a probabilistic methodology for play analysis, subplay analysis, economic analysis, and aggregation analysis.
Abstract: A geostochastic system called FASPF was developed by the U.S. Geological Survey for their 1989 assessment of undiscovered petroleum resources in the United States. FASPF is a fast appraisal system for petroleum play analysis using a field-size geological model and an analytic probabilistic methodology. The geological model is a particular type of probability model whereby the volumes of oil and gas accumulations are modeled as statistical distributions in the form of probability histograms, and the risk structure is bilevel (play and accumulation) in terms of conditional probability. The probabilistic methodology is an analytic method derived from probability theory rather than Monte Carlo simulation. The resource estimates of crude oil and natural gas are calculated and expressed in terms of probability distributions. The probabilistic methodology developed by the author is explained.
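
The bilevel risk structure amounts to conditioning a field-size probability histogram on play-level and accumulation-level success probabilities. A much simplified sketch of that conditioning step follows; FASPF's analytic play aggregation is considerably richer, and the function name, interface, and numbers are illustrative only.

```python
import numpy as np

def risked_resource_distribution(sizes, probs, p_play, p_accum):
    """Fold bilevel risk into a conditional field-size histogram:
    with probability 1 - p_play*p_accum no accumulation exists (resource 0);
    otherwise the size follows the conditional histogram (sizes, probs).
    Returns the unconditional histogram as (values, weights)."""
    p_success = p_play * p_accum
    values = np.concatenate(([0.0], np.asarray(sizes, dtype=float)))
    weights = np.concatenate(([1.0 - p_success], p_success * np.asarray(probs)))
    return values, weights

sizes = np.array([1.0, 5.0, 25.0])   # hypothetical accumulation sizes
probs = np.array([0.6, 0.3, 0.1])    # conditional size histogram
vals, wts = risked_resource_distribution(sizes, probs, p_play=0.5, p_accum=0.4)
```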

Journal ArticleDOI
TL;DR: In this article, the principal hypotheses implicit in experiments are enumerated: the principle of reproducibility ("the past will be repeated in the future"); the principle of reasonable sufficiency; and the statistical principle ("better to predict something rather than nothing").
Abstract: The informal aspects, arising in the interpretation of physical experiments, of the theory of probability and mathematical statistics are discussed. The conditions that verifying experiments must satisfy are presented and the role of heuristic (extralogical) assertions is analyzed using the example of mathematical expectation. The principal hypotheses implicit in experiments are enumerated: the principle of reproducibility ("the past will be repeated in the future"); the principle of reasonable sufficiency; and, the statistical principle ("better to predict something rather than nothing"). Considerable attention is devoted to Fisher and multisample confidence intervals. It is noted that Fisher confidence intervals are inconsistent. The arguments for introducing contrivances into practical calculations of probabilities are enumerated: incompleteness of any system of hypotheses; subjective estimates of probabilities; adjoining of statistical ensembles; nonstationarity and instability; rare phenomena; and, the use of classical probabilities and the law of large numbers. It is concluded that the relative frequency of appearance (empirical probability) is a "normal" physical quantity in the sense that it admits physical measurement. Its "abnormality" is manifested in the fact that it is burdened, more than other physical quantities, with conventions and hypotheses which must be specially checked (verified).
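
The claim that empirical probability is a measurable physical quantity rests on the law of large numbers: the relative frequency of an event stabilizes as the experiment is repeated. A toy simulation of that stabilization (all numbers are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
p_true = 0.3
trials = rng.random(100_000) < p_true            # repeated random experiment
running_freq = np.cumsum(trials) / np.arange(1, trials.size + 1)
# By the law of large numbers the relative frequency ("empirical probability")
# settles near p_true; checking this stabilization is what verification means here.
print(running_freq[[99, 999, 99_999]])           # roughly 0.3 after many trials
```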

Proceedings ArticleDOI
07 Jun 1992
TL;DR: In this article, the authors extend the theory of search dynamics for stochastic learning algorithms, address the time evolution of the weight-space probability density and the distribution of convergence times, with particular attention given to escape from local optima.
Abstract: The authors extend the theory of search dynamics for stochastic learning algorithms, developing a theoretical framework that describes the time evolution of the weight-space probability density and the distribution of convergence times, with particular attention given to escape from local optima. The primary results are exact predictions of the statistical distribution of convergence times for simple backpropagation and competitive learning problems.
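
The quantity being predicted can be pictured with a toy Monte Carlo experiment: sample the number of steps a noisy gradient search needs to reach a neighbourhood of the optimum, then inspect the resulting histogram. This is only an empirical stand-in for the paper's exact theoretical predictions; all names and constants are illustrative.

```python
import numpy as np

def convergence_time(rng, lr=0.1, noise=0.1, tol=0.05, w0=2.0, max_steps=10_000):
    """Steps a noisy gradient search on the 1-D quadratic loss L(w) = w**2 / 2
    needs before |w| < tol; a toy stand-in for stochastic learning dynamics."""
    w = w0
    for step in range(1, max_steps + 1):
        w -= lr * w + lr * noise * rng.normal()   # gradient step plus noise
        if abs(w) < tol:
            return step
    return max_steps

rng = np.random.default_rng(0)
times = [convergence_time(rng) for _ in range(1_000)]   # empirical distribution
```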

Book
01 Dec 1992
TL;DR: "Foundations of Experimental Data Analysis" presents the most important procedures used in the analysis of sets of experimental data and describes the most efficient methods for estimating parameters.
Abstract: "Foundations of Experimental Data Analysis" presents the most important procedures used in the analysis of sets of experimental data. It also describes the most efficient methods for estimating parameters. Fundamental concepts in probability theory and mathematical statistics are provided for background information in the first chapter. The book's second chapter presents a survey of algorithms from which factographic data on measured objects are acquired. Six fundamental linear models of measurement describing all known types of linear or linearized relations between directly observable parameters and determined ones are studied in their simple and mixed versions. The third chapter is devoted to the problems of analyzing measured data.
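
For the simplest of these linear measurement models, parameter estimation reduces to least squares on a design matrix built from the directly observed quantities. A minimal sketch under that assumption (the function name and toy data are illustrative):

```python
import numpy as np

def fit_linear_measurement_model(A, y):
    """Least-squares estimate for a linear model of measurement y = A b + e:
    y are the directly observed quantities, b the parameters to be determined.
    Returns the estimate of b and the residual vector."""
    b_hat, *_ = np.linalg.lstsq(A, y, rcond=None)
    return b_hat, y - A @ b_hat

# Illustrative: observations of a straight line contaminated by measurement error.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 20)
A = np.column_stack([np.ones_like(t), t])      # design matrix [1, t]
y = A @ np.array([2.0, -0.5]) + rng.normal(scale=0.05, size=t.size)
b_hat, resid = fit_linear_measurement_model(A, y)
```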


Journal ArticleDOI
TL;DR: An introductory statistics text covering descriptive statistics, probability and probability distributions, statistical inference, and regression analysis, including the simple linear regression model and multiple regression.
Abstract: Preface - Acknowledgements - PART 1 DESCRIPTIVE STATISTICS - Introduction - Data Presentation: Qualitative Data - Data Presentation: Quantitative Data - Measures of Location - Measures of Dispersion - PART 2 PROBABILITY AND PROBABILITY DISTRIBUTIONS - An Introduction to Probability - Discrete Probability Distributions - Continuous Probability Distributions - PART 3 STATISTICAL INFERENCE - Introduction to Statistical Inference - Sampling Distribution of Sample Statistics - Estimation - Hypothesis Testing - Statistical Inference with Two Populations - PART 4 REGRESSION ANALYSIS - The Simple Linear Regression Model - Multiple Regression - Statistical Tables - Bibliography - Index


Journal ArticleDOI
TL;DR: Chang and Rao as mentioned in this paper obtained a Berry-Esseen bound for the Kaplan-Meier estimator of the survival distribution in the general random censorship model; this paper complements their theorem with an analogous result for the ACL-estimator in the proportional hazards model of random censorship, and uses these rates of convergence to assess the accuracy of normal approximations for estimators of quantiles of the survival and residual survival time distributions.
Abstract: In a recent paper, Chang and Rao obtained a Berry-Esseen bound for the Kaplan-Meier estimator of the survival distribution in the general random censorship model. We complement their theorem by an analogous result for the ACL-estimator of the survival distribution in the proportional hazards model of random censorship. These rate of convergence results are used to obtain the accuracy of the normal approximations for estimators of quantiles of the survival time distribution in both of the censoring models. As a further application, we show how the U-statistic representations, implicitly contained in the proofs of the Berry-Esseen theorems for the estimators of the survival distribution, can be used to provide rate of convergence results for quantile estimators of the residual survival time distribution.
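
For reference, the Kaplan-Meier product-limit estimator at the centre of these Berry-Esseen results can be computed as in the sketch below; the ACL estimator and the bounds themselves are not reproduced, and the toy data are illustrative.

```python
import numpy as np

def kaplan_meier(times, events):
    """Kaplan-Meier (product-limit) estimate of the survival function from
    right-censored data: events[i] is 1 for an observed death at times[i]
    and 0 for a censored observation.  Returns distinct death times and S(t)."""
    order = np.argsort(times)
    times, events = np.asarray(times)[order], np.asarray(events)[order]
    death_times = np.unique(times[events == 1])
    at_risk = np.array([(times >= t).sum() for t in death_times])
    deaths = np.array([((times == t) & (events == 1)).sum() for t in death_times])
    surv = np.cumprod(1.0 - deaths / at_risk)
    return death_times, surv

t, s = kaplan_meier(times=[2, 3, 3, 5, 8, 8], events=[1, 0, 1, 1, 0, 1])
```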

Journal ArticleDOI
TL;DR: In this article, the authors prove consistency of the bootstrap estimator for the distribution function of a U-statistic whose kernel contains an estimated parameter, and apply it to large classes of statistics.
Abstract: We prove consistency of the bootstrap estimator for the distribution function of a U-statistic whose kernel contains an estimated parameter. The result easily applies to large classes of statistics studied previously in testing and estimation situations: U-statistics based on trimmed samples or on location aligned samples.
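
The object being bootstrapped can be illustrated with a toy degree-two U-statistic whose kernel depends on an estimated location parameter. The kernel and the choice of estimator below are purely illustrative and are not the classes (trimmed or location-aligned samples) treated in the paper.

```python
import numpy as np
from itertools import combinations

def u_statistic(x, theta_hat):
    """Degree-2 U-statistic whose kernel contains an estimated parameter:
    here, illustratively, h(x1, x2; t) = (x1 - t)*(x2 - t) with t = theta_hat."""
    return np.mean([(a - theta_hat) * (b - theta_hat)
                    for a, b in combinations(x, 2)])

def bootstrap_u_distribution(x, n_boot=500, rng=np.random.default_rng(0)):
    """Bootstrap the centred, scaled U-statistic, re-estimating the parameter
    (here the sample mean) inside every resample."""
    x = np.asarray(x, dtype=float)
    stats = []
    for _ in range(n_boot):
        xb = rng.choice(x, size=x.size, replace=True)
        stats.append(u_statistic(xb, xb.mean()))
    return np.sqrt(x.size) * (np.array(stats) - u_statistic(x, x.mean()))

boot_dist = bootstrap_u_distribution(np.random.default_rng(1).normal(size=40))
```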

Proceedings ArticleDOI
13 Sep 1992
TL;DR: An approach to reconfigurable flight control, which utilizes continuous system identification of the aircraft's stability and control derivatives, is proposed, and it is shown that even during periods of low excitation, the short period parameter estimates are improved.
Abstract: An approach to reconfigurable flight control, which utilizes continuous system identification of the aircraft's stability and control derivatives, is proposed. The basic approach is to include prior knowledge of the derivatives in the linear regression formulation. This is done through the use of ridge regression and mixed estimation from the mathematical statistics literature. The experimental part of this work is based on a linearized longitudinal fourth-order F-16 model and includes appropriate sensor (measurement) noise. The experiment shows that even during periods of low excitation, the short period parameter estimates are improved.
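
The core idea of folding prior derivative values into the regression can be sketched as pseudo-observation augmentation, which reduces to ridge regression shrinking toward the prior mean. This is a generic illustration of mixed estimation, not the paper's F-16 implementation; all names are illustrative.

```python
import numpy as np

def mixed_estimate(X, y, prior_mean, prior_weight):
    """Theil-Goldberger style mixed estimation: prior knowledge of the
    stability and control derivatives enters as extra pseudo-observations
    (prior_mean) with weight prior_weight, which also ridge-regularizes the
    least-squares problem during periods of low excitation."""
    k = X.shape[1]
    X_aug = np.vstack([X, np.sqrt(prior_weight) * np.eye(k)])
    y_aug = np.concatenate([y, np.sqrt(prior_weight) * np.asarray(prior_mean)])
    beta, *_ = np.linalg.lstsq(X_aug, y_aug, rcond=None)
    return beta
# X would hold measured states and controls, y a measured acceleration, and
# prior_mean the a priori (e.g. wind-tunnel) derivative values.
```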