Author

Richard A. Johnson

Other affiliations: Harvard University, Yale University, BioBricks Foundation
Bio: Richard A. Johnson is an academic researcher from the University of Wisconsin-Madison. The author has contributed to research in topics including Population and Estimator, has an h-index of 32, and has co-authored 162 publications receiving 5,252 citations. Previous affiliations of Richard A. Johnson include Harvard University and Yale University.


Papers
Journal ArticleDOI
TL;DR: In this paper, a new power transformation family is introduced that is well defined on the whole real line and is appropriate for reducing skewness and approximating normality. It has properties similar to those of the Box-Cox transformation for positive variables, and its large-sample properties are investigated in the context of a single random sample.
Abstract: We introduce a new power transformation family which is well defined on the whole real line and which is appropriate for reducing skewness and approximating normality. It has properties similar to those of the Box-Cox transformation for positive variables. The large-sample properties of the transformation are investigated in the context of a single random sample.
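
The family described here matches the Yeo-Johnson form; as a minimal sketch (treating that form as an assumption, since the entry does not write the formula out), the snippet below applies the transformation and picks an exponent by a crude skewness criterion. The grid search is an illustrative device, not the estimation procedure studied in the paper.

```python
import numpy as np

def power_transform(y, lam):
    """Power transformation defined on the whole real line (Yeo-Johnson form,
    assumed here for illustration). Behaves like Box-Cox for non-negative y."""
    y = np.asarray(y, dtype=float)
    out = np.empty_like(y)
    pos = y >= 0
    if lam != 0:
        out[pos] = ((y[pos] + 1.0) ** lam - 1.0) / lam
    else:
        out[pos] = np.log1p(y[pos])
    if lam != 2:
        out[~pos] = -(((1.0 - y[~pos]) ** (2.0 - lam) - 1.0) / (2.0 - lam))
    else:
        out[~pos] = -np.log1p(-y[~pos])
    return out

def sample_skewness(z):
    z = (z - z.mean()) / z.std()
    return float(np.mean(z ** 3))

# Illustrative use: pick the exponent that best symmetrizes a skewed sample
# containing negative values (a crude grid search, purely for demonstration).
rng = np.random.default_rng(0)
x = rng.gamma(2.0, 2.0, size=500) - 1.0
grid = np.linspace(-2.0, 2.0, 81)
best = min(grid, key=lambda lam: abs(sample_skewness(power_transform(x, lam))))
print("exponent minimizing |skewness|:", round(float(best), 2))
```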

1,047 citations

Book
01 Jan 1985
TL;DR: In this book, the authors cover the organization and description of data, descriptive study of bivariate data, probability and probability distributions, the binomial and normal distributions and their use in testing hypotheses, sampling distributions, inference from large and small samples, comparison of two treatments, simple and multiple linear regression, analysis of categorical data, analysis of variance (ANOVA), and nonparametric statistics.
Abstract: Contents: Organization and Description of Data; Descriptive Study of Bivariate Data; Probability; Probability Distributions; Binomial Distribution and its Application in Testing Hypotheses; The Normal Distribution; Variation in Repeated Samples: Sampling Distributions; Drawing Inferences from Large Samples; Small-Sample Inferences for Normal Populations; Comparing Two Treatments; Regression Analysis I (Simple Linear Regression); Regression Analysis II (Multiple Linear Regression and Other Topics); Analysis of Categorical Data; Analysis of Variance (ANOVA); Non-Parametric Statistics.

371 citations

Journal ArticleDOI
TL;DR: In this article, the authors employ the theory of weak convergence of cumulative sums to the Wiener process to obtain large-sample theory for cusum tests and study the effect of serial correlation on the performance of the one-sided cusum test.
Abstract: We employ the theory of weak convergence of cumulative sums to the Wiener process to obtain large-sample theory for cusum tests. These results provide a theoretical basis for studying the effects of serial correlation on the performance of the one-sided cusum test proposed by Page (1955). Particular attention is placed on the first-order autoregressive and first-order moving average models. In order to treat the sequential version of the test, we employ the same Wiener process approximation. This enables us to study the effect of correlation not only on the average run length but, more importantly, on the run length distribution itself. These theoretical distributions are shown to compare quite favorably with the true distribution on the basis of a Monte Carlo study using normal observations. The results on the changes in the shape of the run length distributions show that more than the average run length should be considered. Our primary conclusion is that the cusum test is not robust with respect to dep...
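
As a rough illustration of the setting studied here, the sketch below runs Page's one-sided cusum on simulated observations, both independent and first-order autoregressive, and estimates the run length distribution by Monte Carlo. The reference value k, threshold h, autoregressive coefficient, and number of replications are arbitrary illustrative choices, not values from the paper.

```python
import numpy as np

def cusum_run_length(phi=0.0, k=0.5, h=5.0, max_n=20000, rng=None):
    """Run length of the one-sided cusum S_t = max(0, S_{t-1} + x_t - k)
    for one AR(1) series x_t = phi*x_{t-1} + e_t with standard normal e_t."""
    rng = np.random.default_rng() if rng is None else rng
    s, x = 0.0, 0.0
    for t in range(1, max_n + 1):
        x = phi * x + rng.standard_normal()
        s = max(0.0, s + x - k)
        if s > h:
            return t
    return max_n  # censored: no signal within max_n observations

rng = np.random.default_rng(1)
runs_iid = [cusum_run_length(phi=0.0, rng=rng) for _ in range(1000)]
runs_ar1 = [cusum_run_length(phi=0.3, rng=rng) for _ in range(1000)]
print("in-control ARL, independent obs.:", np.mean(runs_iid))
print("in-control ARL, AR(1) phi=0.3  :", np.mean(runs_ar1))
print("run length 5th/95th pct, AR(1) :", np.percentile(runs_ar1, [5, 95]))
```

Comparing the two average run lengths, and the percentiles of the run length distribution, gives a rough feel for the sensitivity to serial correlation that the paper treats analytically.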

283 citations

Journal ArticleDOI
TL;DR: Using the fluorescent antibody technique, it is demonstrated that fibrin deposition is a prominent and consistent feature of both allergic contact dermatitis and classic delayed hypersensitivity skin reactions in man.
Abstract: The expression of delayed-type hypersensitivity in animals has been inhibited by a variety of anticoagulants, but direct evidence for activation of clotting in the evolution of these reactions has been lacking. Using the fluorescent antibody technique, we here demonstrate that fibrin (Fib) deposition is a prominent and consistent feature of both allergic contact dermatitis and classic delayed hypersensitivity skin reactions in man. Fib was detected in 55 of 58 delayed reactions studied at the peak of their intensity. The characteristic distribution of Fib—principally in the intervascular portions of the reticular dermis with sparing of vessels and their associated cuffs of mononuclear cells—is unusual and quite different from that described in antibody-mediated lesions in animals or man. Fib was found in vessel walls in only 2 of 94 biopsies studied. With a single exception, deposition of immunoglobulins and complement was not observed. The pathogenesis and significance of Fib deposition in these reactions are not yet clear. Fib is ultimately derived from circulating fibrinogen, and its accumulation provides additional evidence for locally increased vascular permeability in delayed hypersensitivity. Polymerization of extravascular fibrinogen could be triggered nonspecifically by dermal elements (e.g., collagen) or by a product of sensitized lymphocytes. The appearance of Fib early in the development of these reactions (4–8 h after epicutaneous testing with DNCB) and inhibition studies with anticoagulants together suggest that clotting may have a role in their pathogenesis, possibly by the release of bioactive peptides from fibrinogen/fibrin or by contributing to the induration characteristic of delayed hypersensitivity.

203 citations


Cited by
More filters
Journal ArticleDOI
22 Dec 1993-JAMA
TL;DR: The SAPS II, based on a large international sample of patients, provides an estimate of the risk of death without having to specify a primary diagnosis, and is a starting point for future evaluation of the efficiency of intensive care units.
Abstract: Objective. —To develop and validate a new Simplified Acute Physiology Score, the SAPS II, from a large sample of surgical and medical patients, and to provide a method to convert the score to a probability of hospital mortality. Design and Setting. —The SAPS II and the probability of hospital mortality were developed and validated using data from consecutive admissions to 137 adult medical and/or surgical intensive care units in 12 countries. Patients. —The 13 152 patients were randomly divided into developmental (65%) and validation (35%) samples. Patients younger than 18 years, burn patients, coronary care patients, and cardiac surgery patients were excluded. Outcome Measure. —Vital status at hospital discharge. Results. —The SAPS II includes only 17 variables: 12 physiology variables, age, type of admission (scheduled surgical, unscheduled surgical, or medical), and three underlying disease variables (acquired immunodeficiency syndrome, metastatic cancer, and hematologic malignancy). Goodness-of-fit tests indicated that the model performed well in the developmental sample and validated well in an independent sample of patients (P=.883 and P=.104 in the developmental and validation samples, respectively). The area under the receiver operating characteristic curve was 0.88 in the developmental sample and 0.86 in the validation sample. Conclusion. —The SAPS II, based on a large international sample of patients, provides an estimate of the risk of death without having to specify a primary diagnosis. This is a starting point for future evaluation of the efficiency of intensive care units. (JAMA. 1993;270:2957-2963)
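
The conversion from score to mortality probability that the abstract mentions is a logistic regression on the SAPS II score. The sketch below shows the general shape of such a conversion; the coefficients b0, b1, b2 and the logarithmic term are illustrative placeholders following common descriptions of SAPS II, not the published estimates, which should be taken from the JAMA article itself.

```python
import math

def mortality_probability(score, b0=-7.76, b1=0.074, b2=1.0):
    """Convert a severity score to a predicted probability of hospital
    mortality via a logistic model. Coefficients are illustrative
    placeholders, not the values published for SAPS II."""
    logit = b0 + b1 * score + b2 * math.log(score + 1.0)
    return 1.0 / (1.0 + math.exp(-logit))

for s in (20, 40, 60):
    print(f"score {s:3d} -> predicted hospital mortality {mortality_probability(s):.2f}")
```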

5,836 citations

Journal ArticleDOI
TL;DR: A review of P. Billingsley's monograph Convergence of Probability Measures (Wiley, 1968), a standard reference on the weak convergence of probability measures.
Abstract: Convergence of Probability Measures. By P. Billingsley. Chichester, Sussex, Wiley, 1968. xii, 253 p. 9 1/4 in. 117s.

5,689 citations

Journal ArticleDOI
TL;DR: The basic ideas of PCA are introduced, what it can and cannot do is discussed, and variants of the technique tailored to different data types and structures are described.
Abstract: Large datasets are increasingly common and are often difficult to interpret. Principal component analysis (PCA) is a technique for reducing the dimensionality of such datasets, increasing interpretability but at the same time minimizing information loss. It does so by creating new uncorrelated variables that successively maximize variance. Finding such new variables, the principal components, reduces to solving an eigenvalue/eigenvector problem, and the new variables are defined by the dataset at hand, not a priori, hence making PCA an adaptive data analysis technique. It is adaptive in another sense too, since variants of the technique have been developed that are tailored to various different data types and structures. This article will begin by introducing the basic ideas of PCA, discussing what it can and cannot do. It will then describe some variants of PCA and their application.
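
A minimal sketch of the eigenvalue/eigenvector route the abstract describes: center the data, form the covariance matrix, and take the leading eigenvectors as the principal components. The synthetic dataset and its dimensions are invented for illustration.

```python
import numpy as np

def pca(X, n_components):
    """Principal component analysis via eigendecomposition of the sample
    covariance matrix of the centered data."""
    Xc = X - X.mean(axis=0)                    # center each variable
    cov = np.cov(Xc, rowvar=False)             # sample covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)     # symmetric eigenproblem
    order = np.argsort(eigvals)[::-1]          # sort by decreasing variance
    components = eigvecs[:, order[:n_components]]
    scores = Xc @ components                   # new, uncorrelated variables
    explained = eigvals[order[:n_components]] / eigvals.sum()
    return scores, components, explained

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5)) @ rng.normal(size=(5, 5))   # correlated columns
scores, components, explained = pca(X, n_components=2)
print("fraction of variance explained by first two components:", explained.round(3))
```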

4,289 citations

Journal ArticleDOI
TL;DR: A unified framework is described for the design and performance analysis of algorithms for solving change detection problems, and links with the analytical redundancy approach to fault detection in linear systems are established.
Abstract: This book is downloadable from http://www.irisa.fr/sisthem/kniga/. Many monitoring problems can be stated as the problem of detecting a change in the parameters of a static or dynamic stochastic system. The main goal of this book is to describe a unified framework for the design and the performance analysis of the algorithms for solving these change detection problems. The book also contains the key mathematical background necessary for this purpose. Finally, links with the analytical redundancy approach to fault detection in linear systems are established. We call an abrupt change any change in the parameters of the system that occurs either instantaneously or at least very fast with respect to the sampling period of the measurements. Abrupt changes by no means refer to changes with large magnitude; on the contrary, in most applications the main problem is to detect small changes. Moreover, in some applications, the early warning of small - and not necessarily fast - changes is of crucial interest in order to avoid the economic or even catastrophic consequences that can result from an accumulation of such small changes. For example, small faults arising in the sensors of a navigation system can result, through the underlying integration, in serious errors in the estimated position of the plane. Another example is the early warning of small deviations from the normal operating conditions of an industrial process. The early detection of slight changes in the state of the process makes it possible to plan more adequately the periods during which the process should be inspected and possibly repaired, and thus to reduce operating costs.
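
As a toy illustration of the mean-shift problem described above, and not an algorithm taken verbatim from the book, the sketch below scans every candidate change time with a Gaussian likelihood-ratio-style contrast to locate a small abrupt change in the mean of a simulated sequence.

```python
import numpy as np

def estimate_change_point(x):
    """Return the candidate change time maximizing the between-segment mean
    contrast (equivalent, for known unit variance, to the Gaussian GLR
    statistic for a single mean shift at an unknown time)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    best_t, best_stat = None, -np.inf
    for t in range(1, n):
        m1, m2 = x[:t].mean(), x[t:].mean()
        stat = t * (n - t) / n * (m1 - m2) ** 2
        if stat > best_stat:
            best_t, best_stat = t, stat
    return best_t, best_stat

# Simulated data: a small (0.4 sigma) shift in the mean after 300 observations.
rng = np.random.default_rng(2)
x = np.concatenate([rng.normal(0.0, 1.0, 300), rng.normal(0.4, 1.0, 200)])
t_hat, stat = estimate_change_point(x)
print("estimated change time:", t_hat, " statistic:", round(stat, 2))
```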

3,830 citations