scispace - formally typeset
Author

William W. Cooper

Bio: William W. Cooper is an academic researcher at the University of Texas at Austin. He has contributed to research in the topics of Data envelopment analysis and Linear programming, has an h-index of 79, and has co-authored 254 publications receiving 76,641 citations. Previous affiliations of William W. Cooper include Harvard University and Carnegie Mellon University.


Papers
Journal ArticleDOI
TL;DR: In this article, the authors critique the articles by Dmitruk & Koshevoy and by Bol by showing how to solve the examples they erected to show the non-existence of functions for evaluating performance efficiencies in DEA.
Abstract: We here critique the articles by Dmitruk & Koshevoy (1991, J Econ Theory 55:121–144) and by Bol (1986, J Econ Theory 38:380–385) by showing how to solve the examples they erected to show the non-existence of functions for evaluating performance efficiencies in DEA. We also show that functions satisfying these criteria—and other important criteria as well—were already available prior to the publications of D&K and by Bol and have since been greatly extended to increase the power and scope of DEA.

8 citations

Journal ArticleDOI
TL;DR: It is concluded that the “imposed problem ignorance” of past complexity research is deleterious to research progress on “computability” or “efficiency of computation.”
Abstract: Through key examples and constructs, exact and approximate, complexity, computability, and solution of linear programming systems are reexamined in the light of Khachian's new notion of (approximate) solution. Algorithms, basic theorems, and alternate representations are reviewed. It is shown that the Klee-Minty example has never been exponential for (exact) adjacent extreme point algorithms and that the Balinski-Gomory (exact) algorithm continues to be polynomial in cases where (approximate) ellipsoidal "centered-cutoff" algorithms (Levin, Shor, Khachian, Gacs-Lovasz) are exponential. By "model approximation," both the Klee-Minty and the new J. Clausen examples are shown to be trivial (explicitly solvable) interval programming problems. A new notion of computable (approximate) solution is proposed together with an a priori regularization for linear programming systems. New polyhedral "constraint contraction" algorithms are proposed for approximate solution and the relevance of interval programming for good starts or exact solution is brought forth. It is concluded from all this that the "imposed problem ignorance" of past complexity research is deleterious to research progress on "computability" or "efficiency of computation."
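The Klee-Minty example discussed in the abstract is easy to state concretely. The sketch below is illustrative only (it is not code from the paper): it builds the standard d-dimensional Klee-Minty cube and solves it with SciPy's `linprog`; the known optimum is x = (0, ..., 0, 5^d) with objective value 5^d.

```python
import numpy as np
from scipy.optimize import linprog

def klee_minty(d):
    """Standard d-dimensional Klee-Minty cube:
    maximize  sum_{j=1}^{d} 2^(d-j) x_j
    subject to  x_i + sum_{j<i} 2^(i-j+1) x_j <= 5^i,  x >= 0.
    """
    # linprog minimizes, so negate the objective coefficients
    c = -np.array([2.0 ** (d - j) for j in range(1, d + 1)])
    A = np.zeros((d, d))
    for i in range(d):                 # constraint i+1 (0-based row i)
        A[i, i] = 1.0
        for j in range(i):
            A[i, j] = 2.0 ** (i - j + 1)
    b = np.array([5.0 ** i for i in range(1, d + 1)])
    return c, A, b

c, A, b = klee_minty(6)
res = linprog(c, A_ub=A, b_ub=b, method="highs")
print(-res.fun)  # 15625.0 = 5^6, attained at x = (0, ..., 0, 5^6)
```

This cube is the standard worst case for Dantzig-rule simplex pivoting; the abstract's point is that for exact adjacent-extreme-point algorithms the example is not actually exponential, and that under "model approximation" it collapses into a trivially solvable interval programming problem.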

7 citations

01 Jan 1969
TL;DR: In this paper, the duality theory for semi-infinite programming is extended to fields with properties of non-Archimedean order, and the ideas of regularization are generalized to include powers of the relative infinites in terms of the indeterminates.
Abstract: Aspects of the duality theory for semi-infinite programming are extended to fields with properties of non-Archimedean order. Emphasis is on nonstandard semi-infinite programming problems in Hilbert's Field. The ideas of regularization are generalized to include powers of the relative infinites in terms of the indeterminates.

7 citations

Journal ArticleDOI
TL;DR: In this paper, the authors discuss a long-range oil field development problem, one which has aspects that are common to many attempts at long-term facility planning, such as the use of model types and prototypes.
Abstract: Management planning problems are often so huge and unwieldy that it is necessary to attack them with a variety of devices. One strategy involves the use of model types and prototypes. This is the topic which will be discussed in this paper in the context of a long-range oil field development problem, one which has aspects that are common to many attempts at long-range facilities planning. The emphasis will be on concepts, however, rather than the details of the actual problem, and even these concepts will be dealt with in broad fashion. Further de-

7 citations


Cited by
Journal ArticleDOI
TL;DR: A nonlinear (nonconvex) programming model provides a new definition of efficiency for use in evaluating activities of not-for-profit entities participating in public programs and methods for objectively determining weights by reference to the observational data for the multiple outputs and multiple inputs that characterize such programs.

25,433 citations

Journal ArticleDOI
TL;DR: The CCR ratio form introduced by Charnes, Cooper and Rhodes, as part of their Data Envelopment Analysis approach, comprehends both technical and scale inefficiencies via the optimal value of the ratio form, as obtained directly from the data without requiring a priori specification of weights and/or explicit delineation of assumed functional forms of relations between inputs and outputs.
Abstract: In management contexts, mathematical programming is usually used to evaluate a collection of possible alternative courses of action en route to selecting one which is best. In this capacity, mathematical programming serves as a planning aid to management. Data Envelopment Analysis reverses this role and employs mathematical programming to obtain ex post facto evaluations of the relative efficiency of management accomplishments, however they may have been planned or executed. Mathematical programming is thereby extended for use as a tool for control and evaluation of past accomplishments as well as a tool to aid in planning future activities. The CCR ratio form introduced by Charnes, Cooper and Rhodes, as part of their Data Envelopment Analysis approach, comprehends both technical and scale inefficiencies via the optimal value of the ratio form, as obtained directly from the data without requiring a priori specification of weights and/or explicit delineation of assumed functional forms of relations between inputs and outputs. A separation into technical and scale efficiencies is accomplished by the methods developed in this paper without altering the latter conditions for use of DEA directly on observational data. Technical inefficiencies are identified with failures to achieve best possible output levels and/or usage of excessive amounts of inputs. Methods for identifying and correcting the magnitudes of these inefficiencies, as supplied in prior work, are illustrated. In the present paper, a new separate variable is introduced which makes it possible to determine whether operations were conducted in regions of increasing, constant or decreasing returns to scale in multiple input and multiple output situations. The results are discussed and related not only to classical single output economics but also to more modern versions of economics which are identified with "contestable market theories."
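The CCR ratio form described in the abstract lends itself to a compact sketch. The fractional program max u·y₀ / v·x₀ subject to u·y_k / v·x_k ≤ 1 is linearized in the usual Charnes-Cooper way by fixing v·x₀ = 1 and maximizing u·y₀. The data below are invented purely for illustration, and the LP is solved with SciPy rather than anything from the paper.

```python
import numpy as np
from scipy.optimize import linprog

# Toy data: 4 decision-making units (DMUs), 2 inputs, 1 output -- illustrative only
X = np.array([[2.0, 3.0], [4.0, 2.0], [4.0, 6.0], [5.0, 7.0]])  # one row of inputs per DMU
Y = np.array([[1.0], [1.0], [1.0], [1.0]])                      # one row of outputs per DMU

def ccr_efficiency(k, X, Y):
    """Input-oriented CCR efficiency of DMU k via the linearized ratio form:
    max u.Y[k]  s.t.  v.X[k] = 1,  u.Y[j] - v.X[j] <= 0 for all j,  u, v >= 0.
    """
    n, m = X.shape
    s = Y.shape[1]
    c = np.concatenate([-Y[k], np.zeros(m)])      # variables are [u (s), v (m)]; maximize u.Y[k]
    A_ub = np.hstack([Y, -X])                     # u.Y[j] - v.X[j] <= 0 for every DMU j
    b_ub = np.zeros(n)
    A_eq = np.concatenate([np.zeros(s), X[k]]).reshape(1, -1)   # normalization v.X[k] = 1
    b_eq = np.array([1.0])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, method="highs")
    return -res.fun

# DMUs 0 and 1 lie on the efficient frontier (score 1); DMU 2 uses exactly
# twice DMU 0's inputs for the same output, so its score is 0.5.
for k in range(len(X)):
    print(f"DMU {k}: efficiency = {ccr_efficiency(k, X, Y):.3f}")
```

The weights u and v are determined by the data themselves, which is the point the TL;DR makes: no a priori weight specification or assumed functional form is required.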

14,941 citations

Book
31 Jul 1985
TL;DR: The book updates the research agenda with chapters on possibility theory, fuzzy logic and approximate reasoning, expert systems, fuzzy control, fuzzy data analysis, decision making and fuzzy set models in operations research.
Abstract: Fuzzy Set Theory - And Its Applications, Third Edition is a textbook for courses in fuzzy set theory. It can also be used as an introduction to the subject. The character of a textbook is balanced with the dynamic nature of the research in the field by including many useful references to develop a deeper understanding among interested readers. The book updates the research agenda (which has witnessed profound and startling advances since its inception some 30 years ago) with chapters on possibility theory, fuzzy logic and approximate reasoning, expert systems, fuzzy control, fuzzy data analysis, decision making and fuzzy set models in operations research. All chapters have been updated. Exercises are included.
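To give a flavor of the book's starting point: Zadeh's original min/max operators, on which the possibility-theory and approximate-reasoning chapters build, fit in a few lines. The membership functions below are invented for illustration and are not taken from the book.

```python
def triangular(a, b, c):
    """Triangular membership function: 0 outside (a, c), peaking at 1 when x = b."""
    def mu(x):
        if x <= a or x >= c:
            return 0.0
        if x <= b:
            return (x - a) / (b - a)
        return (c - x) / (c - b)
    return mu

# Hypothetical fuzzy sets over a temperature axis
cold = triangular(-10.0, 0.0, 15.0)
warm = triangular(10.0, 20.0, 30.0)

def fuzzy_and(mu1, mu2):   # intersection: pointwise minimum of memberships
    return lambda x: min(mu1(x), mu2(x))

def fuzzy_or(mu1, mu2):    # union: pointwise maximum of memberships
    return lambda x: max(mu1(x), mu2(x))

mild = fuzzy_and(cold, warm)   # "somewhat cold and somewhat warm"
print(mild(12.0))  # min(cold(12), warm(12)) = min(0.2, 0.2) = 0.2
```

Membership grades vary continuously in [0, 1] rather than being crisp 0/1 values, which is the single generalization from which the book's later chapters (fuzzy control, fuzzy data analysis, decision making) develop.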

7,877 citations

Journal ArticleDOI
01 May 1981
TL;DR: This book discusses methods for detecting influential observations and outliers, detecting and assessing collinearity, and applications and remedies for these problems.
Abstract: 1. Introduction and Overview. 2. Detecting Influential Observations and Outliers. 3. Detecting and Assessing Collinearity. 4. Applications and Remedies. 5. Research Issues and Directions for Extensions. Bibliography. Author Index. Subject Index.
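Chapters 2 and 3 of the outline above correspond to two computations that are easy to sketch: hat-matrix leverage for flagging influential observations, and the condition number of the column-scaled design matrix for assessing collinearity. The data below are synthetic and the rule-of-thumb thresholds are only commonly cited guides, not exact prescriptions.

```python
import numpy as np

# Synthetic regression design with one nearly collinear pair of columns
rng = np.random.default_rng(0)
n = 20
x1 = rng.normal(size=n)
x2 = x1 + 1e-3 * rng.normal(size=n)     # x2 is nearly collinear with x1
X = np.column_stack([np.ones(n), x1, x2])

# Influential observations: diagonal of the hat matrix H = X (X'X)^{-1} X'.
# Observations with leverage well above p/n deserve scrutiny.
H = X @ np.linalg.solve(X.T @ X, X.T)
leverage = np.diag(H)

# Collinearity: condition number of the column-scaled design matrix.
Xs = X / np.linalg.norm(X, axis=0)
cond = np.linalg.cond(Xs)

p = X.shape[1]
print("max leverage:", leverage.max(), " rule-of-thumb 2p/n =", 2 * p / n)
print("condition number:", cond)   # values above ~30 are commonly read as harmful collinearity
```

The leverages sum to p (the number of columns of X), so their average is p/n; the near-duplicate column drives the condition number far above the usual threshold.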

4,948 citations

Book
30 Nov 1999
TL;DR: In this book, the authors develop the basic CCR model and alternative DEA models, and treat returns to scale, models with restricted multipliers, and discretionary, non-discretionary and categorical variables.
Abstract: List of Tables. List of Figures. Preface. 1. General Discussion. 2. The Basic CCR Model. 3. The CCR Model and Production Correspondence. 4. Alternative DEA Models. 5. Returns to Scale. 6. Models with Restricted Multipliers. 7. Discretionary, Non-Discretionary and Categorical Variables. 8. Allocation Models. 9. Data Variations. Appendices. Index.

4,395 citations