Author

William W. Cooper

Bio: William W. Cooper is an academic researcher from the University of Texas at Austin. His research focuses on data envelopment analysis and linear programming. He has an h-index of 79 and has co-authored 254 publications receiving 76,641 citations. Previous affiliations of William W. Cooper include Harvard University and Carnegie Mellon University.


Papers
Book ChapterDOI
01 Jan 1994
TL;DR: Data Envelopment Analysis (DEA) is presented as a body of concepts and methodologies that have now been incorporated in a collection of models, each with its own interpretive possibilities.
Abstract: Data Envelopment Analysis (DEA) is a body of concepts and methodologies that have now been incorporated in a collection of models with accompanying interpretive possibilities as follows:
1. The CCR ratio model (1978) (i) yields an objective evaluation of overall efficiency and (ii) identifies the sources and estimates the amounts of the thus-identified inefficiencies.
2. The BCC model (1984) distinguishes between technical and scale inefficiencies by (i) estimating pure technical efficiency at the given scale of operation and (ii) identifying whether increasing, decreasing, or constant returns-to-scale possibilities are present for further exploitation.
3. The Multiplicative models (Charnes et al., 1982, 1983) provide (i) a log-linear envelopment or (ii) a piecewise Cobb-Douglas interpretation of the production process (by reduction to the antecedent 1981 additive model of Charnes, Cooper, and Seiford).
4. The Additive model (as better rendered in Charnes et al., 1985) and the extended Additive model (Charnes et al., 1987) (i) relate DEA to the earlier Charnes-Cooper (1959) inefficiency analysis and in the process (ii) relate the efficiency results to the economic concept of Pareto optimality as interpreted in the still earlier work of T. Koopmans (1949) in the volume that published the proceedings of the first conference on linear programming.
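For reference, a standard later rendering of the CCR ratio model and its linear-programming equivalent (a sketch in generic notation, not quoted from the chapter): for the DMU under evaluation, indexed $o$, with observed inputs $x_{io}$ and outputs $y_{ro}$,

\[
\max_{u,v}\; h_o=\frac{\sum_{r} u_r y_{ro}}{\sum_{i} v_i x_{io}}
\quad\text{s.t.}\quad
\frac{\sum_{r} u_r y_{rj}}{\sum_{i} v_i x_{ij}}\le 1\ \ \forall j,
\qquad u_r,\,v_i\ \ge\ \varepsilon>0,
\]

which the Charnes-Cooper transformation (normalizing $\sum_i v_i x_{io}=1$) turns into the linear program

\[
\max_{\mu,\nu}\; \sum_{r}\mu_r y_{ro}
\quad\text{s.t.}\quad
\sum_{i}\nu_i x_{io}=1,
\qquad
\sum_{r}\mu_r y_{rj}-\sum_{i}\nu_i x_{ij}\le 0\ \ \forall j,
\qquad \mu_r,\,\nu_i\ \ge\ \varepsilon.
\]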

41 citations

Journal ArticleDOI
TL;DR: In this paper, a deterministic equivalent for the original two-stage problem is presented, in which all random variables in the originally defined objective are replaced by their corresponding expected values and the remaining constraints contain no random terms.
Abstract: In linear programming under uncertainty the two-stage problem is handled by assuming that one chooses a first set of constrained decision variables; this is followed by observations of certain random variables after which another set of decisions must be made to adjust for any constraint violations. The objective is to optimize an expected value functional defined relative to the indicated choices. This paper shows how such problems may always be replaced with either constrained generalized medians or hypermedians in which all random elements appear only in the functional. The resulting problem is called a deterministic equivalent for the original problem since (a) the originally defined objective replaces all random variables by corresponding expected values and (b) the remaining constraints do not contain any random terms. Significant classes of cases are singled out and special attention is devoted to the structure of the constraint matrices for these purposes. Numerical examples are supplied and relat...
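To fix ideas, a generic statement of the two-stage problem and of a scenario-based deterministic equivalent (a standard modern sketch; the paper itself proceeds via constrained generalized medians and hypermedians rather than this exact construction):

\[
\min_{x\ge 0,\;Ax=b}\; c^{\mathsf T}x+\mathbb{E}_{\omega}\!\left[\min_{y\ge 0}\left\{q^{\mathsf T}y:\;W y=h(\omega)-T(\omega)\,x\right\}\right],
\]

which, when $\omega$ ranges over finitely many scenarios $s=1,\dots,S$ with probabilities $p_s$, becomes the ordinary linear program

\[
\min_{x,\,y_1,\dots,y_S}\; c^{\mathsf T}x+\sum_{s=1}^{S}p_s\,q^{\mathsf T}y_s
\quad\text{s.t.}\quad
Ax=b,\qquad T_s x+W y_s=h_s\ \ (s=1,\dots,S),\qquad x,\;y_s\ge 0,
\]

in which no random terms remain in the constraints.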

40 citations

Journal ArticleDOI
TL;DR: In this paper, the authors develop a model to aid Coast Guard managers in formulating policies for planning the types of equipment required to contain major pollution incidents. The model is elaborated in terms of three primary stages of response: offloading, containment, and removal.

40 citations

Journal ArticleDOI
TL;DR: In this article, a neural network artificial intelligence model was developed in cooperation with the Texas Department of Insurance as part of an early warning system for predicting insurer insolvency. The model was applied to all Texas domestic property and casualty insurance companies that became insolvent between 1987 and 1990.
Abstract: This paper presents a neural network artificial intelligence model developed in cooperation with the Texas Department of Insurance as part of an early warning system for predicting insurer insolvency. A feed-forward back-propagation methodology is utilised to compute an estimate of insurer propensity towards insolvency. The model is then applied to all Texas domestic property and casualty insurance companies which became insolvent between 1987 and 1990, with the goal of predicting insolvency three years ahead of time. The results show high predictability and generalisability for the purpose of insolvency prediction, suggesting that neural networks may be a useful technique for this and other purposes.
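By way of illustration, a minimal sketch of a feed-forward, back-propagation classifier of the kind described, built with scikit-learn on synthetic data; the features, network size, and data below are assumptions for illustration only, not the model developed with the Texas Department of Insurance:

```python
# Minimal sketch of a feed-forward back-propagation insolvency classifier.
# The paper's exact architecture, inputs, and training data are not reproduced
# here; the feature construction and sizes below are illustrative assumptions.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier  # trained via back-propagated gradients
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

# Hypothetical financial ratios (e.g., loss ratio, leverage, liquidity) for 500 insurers.
X = rng.normal(size=(500, 3))
# Synthetic label: 1 = became insolvent within three years, 0 = remained solvent.
y = (X[:, 0] + 0.5 * X[:, 1] - X[:, 2] + rng.normal(scale=0.5, size=500) > 1.0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

model = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0),
)
model.fit(X_train, y_train)

# Propensity-toward-insolvency scores, analogous to an early-warning output.
propensity = model.predict_proba(X_test)[:, 1]
print("held-out accuracy:", model.score(X_test, y_test))
```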

38 citations

Journal ArticleDOI
TL;DR: In this paper, the authors apply Data Envelopment Analysis (DEA) to determine relative efficiencies between internet dot com companies that produce only physical products and those that produce only digital products.
Abstract: This paper applies Data Envelopment Analysis to determine relative efficiencies between internet dot com companies that produce only physical products and those that produce only digital products. To allow for the fact that the latter are relatively inexperienced, a distinction is made between long- and short-run efficiencies and inefficiencies; no statistically significant difference is found in the short run, while digital product companies are significantly more efficient in the long run. A new way of distinguishing between long- and short-run performances is utilized that avoids the need for identifying the time periods associated with long-run vs. short-run efficiencies and inefficiencies. In place of “time,” this paper utilizes differences in the “properties” that economic theory associates with long- and short-run performances.
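For concreteness, a minimal sketch of the input-oriented, constant-returns-to-scale (CCR) envelopment model that underlies such DEA efficiency comparisons, solved with SciPy on illustrative data (not the dot com sample used in the paper):

```python
# Minimal sketch of the input-oriented CCR (constant returns to scale) DEA model,
# solved as a linear program with SciPy. The data below are illustrative only.
import numpy as np
from scipy.optimize import linprog

# Rows = DMUs (e.g., firms); columns = inputs or outputs.
X = np.array([[2.0, 3.0], [4.0, 1.0], [3.0, 3.0], [5.0, 2.0]])   # inputs
Y = np.array([[1.0], [1.0], [2.0], [1.5]])                        # outputs
n, m = X.shape          # number of DMUs, number of inputs
s = Y.shape[1]          # number of outputs

def ccr_efficiency(o):
    """Efficiency theta for DMU o: min theta s.t. X'lam <= theta*x_o, Y'lam >= y_o, lam >= 0."""
    c = np.r_[1.0, np.zeros(n)]                 # variables: [theta, lam_1, ..., lam_n]
    A_in = np.c_[-X[o], X.T]                    # sum_j lam_j x_ij - theta x_io <= 0
    A_out = np.c_[np.zeros(s), -Y.T]            # -sum_j lam_j y_rj <= -y_ro
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[o]]
    bounds = [(None, None)] + [(0, None)] * n   # theta free, lambdas nonnegative
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[0]

for o in range(n):
    print(f"DMU {o}: efficiency = {ccr_efficiency(o):.3f}")
```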

38 citations


Cited by
Journal ArticleDOI
TL;DR: A nonlinear (nonconvex) programming model provides a new definition of efficiency for evaluating the activities of not-for-profit entities participating in public programs, together with methods for objectively determining weights by reference to observational data on the multiple outputs and multiple inputs that characterize such programs.

25,433 citations

Journal ArticleDOI
TL;DR: The CCR ratio form introduced by Charnes, Cooper and Rhodes as part of their Data Envelopment Analysis approach comprehends both technical and scale inefficiencies via the optimal value of the ratio form, obtained directly from the data without requiring a priori specification of weights or explicit delineation of assumed functional forms relating inputs to outputs.
Abstract: In management contexts, mathematical programming is usually used to evaluate a collection of possible alternative courses of action en route to selecting one which is best. In this capacity, mathematical programming serves as a planning aid to management. Data Envelopment Analysis reverses this role and employs mathematical programming to obtain ex post facto evaluations of the relative efficiency of management accomplishments, however they may have been planned or executed. Mathematical programming is thereby extended for use as a tool for control and evaluation of past accomplishments as well as a tool to aid in planning future activities. The CCR ratio form introduced by Charnes, Cooper and Rhodes, as part of their Data Envelopment Analysis approach, comprehends both technical and scale inefficiencies via the optimal value of the ratio form, as obtained directly from the data without requiring a priori specification of weights and/or explicit delineation of assumed functional forms of relations between inputs and outputs. A separation into technical and scale efficiencies is accomplished by the methods developed in this paper without altering the latter conditions for use of DEA directly on observational data. Technical inefficiencies are identified with failures to achieve best possible output levels and/or usage of excessive amounts of inputs. Methods for identifying and correcting the magnitudes of these inefficiencies, as supplied in prior work, are illustrated. In the present paper, a new separate variable is introduced which makes it possible to determine whether operations were conducted in regions of increasing, constant or decreasing returns to scale in multiple input and multiple output situations. The results are discussed and related not only to classical single output economics but also to more modern versions of economics which are identified with "contestable market theories."
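The "new separate variable" referred to above corresponds, in later textbook renderings of the BCC model, to a free variable $u_0$ in the multiplier form (a sketch in generic notation, not the paper's own):

\[
\max_{u,v,u_0}\;\sum_{r}u_r y_{ro}-u_0
\quad\text{s.t.}\quad
\sum_{i}v_i x_{io}=1,
\qquad
\sum_{r}u_r y_{rj}-\sum_{i}v_i x_{ij}-u_0\le 0\ \ \forall j,
\qquad u_r,\,v_i\ge 0,\ u_0\ \text{free},
\]

with the usual reading that, for a frontier DMU, $u_0^{*}<0$ in every optimal solution indicates increasing returns to scale, $u_0^{*}=0$ in some optimal solution indicates constant returns, and $u_0^{*}>0$ in every optimal solution indicates decreasing returns (sign conventions vary across expositions).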

14,941 citations

Book
31 Jul 1985
TL;DR: The book updates the research agenda with chapters on possibility theory, fuzzy logic and approximate reasoning, expert systems, fuzzy control, fuzzy data analysis, decision making and fuzzy set models in operations research.
Abstract: Fuzzy Set Theory - And Its Applications, Third Edition is a textbook for courses in fuzzy set theory. It can also be used as an introduction to the subject. The character of a textbook is balanced with the dynamic nature of the research in the field by including many useful references to develop a deeper understanding among interested readers. The book updates the research agenda (which has witnessed profound and startling advances since its inception some 30 years ago) with chapters on possibility theory, fuzzy logic and approximate reasoning, expert systems, fuzzy control, fuzzy data analysis, decision making and fuzzy set models in operations research. All chapters have been updated. Exercises are included.

7,877 citations

Journal ArticleDOI
01 May 1981
TL;DR: This work discusses detecting influential observations and outliers, detecting and assessing collinearity, and the corresponding applications and remedies.
Abstract: 1. Introduction and Overview. 2. Detecting Influential Observations and Outliers. 3. Detecting and Assessing Collinearity. 4. Applications and Remedies. 5. Research Issues and Directions for Extensions. Bibliography. Author Index. Subject Index.
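As a brief illustration of the kinds of diagnostics covered (leverage, Cook's distance, collinearity), a sketch using statsmodels on synthetic data; it is not material from the book itself:

```python
# Illustrative sketch of influence and collinearity diagnostics of the kind the
# book covers (leverage, Cook's distance, condition number); synthetic data only.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 50
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.05, size=n)    # nearly collinear with x1
y = 2.0 + x1 - 0.5 * x2 + rng.normal(scale=0.3, size=n)
y[0] += 5.0                                  # plant one outlier

X = sm.add_constant(np.column_stack([x1, x2]))
fit = sm.OLS(y, X).fit()

influence = fit.get_influence()
leverage = influence.hat_matrix_diag         # diagonal of the hat matrix
cooks_d = influence.cooks_distance[0]        # Cook's distance per observation

print("max leverage:", leverage.max())
print("most influential observation:", cooks_d.argmax(), "Cook's D =", cooks_d.max())
print("condition number of X:", np.linalg.cond(X))
```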

4,948 citations

Book
30 Nov 1999
TL;DR: This book covers the basic CCR model and production correspondence, alternative DEA models, returns to scale, and models with restricted multipliers, together with the treatment of discretionary, non-discretionary and categorical variables, allocation models, and data variations.
Abstract: List of Tables. List of Figures. Preface. 1. General Discussion. 2. The Basic CCR Model. 3. The CCR Model and Production Correspondence. 4. Alternative DEA Models. 5. Returns to Scale. 6. Models with Restricted Multipliers. 7. Discretionary, Non-Discretionary and Categorical Variables. 8. Allocation Models. 9. Data Variations. Appendices. Index.

4,395 citations