Author

William W. Cooper

Bio: William W. Cooper is an academic researcher from the University of Texas at Austin. The author has contributed to research in topics: Data envelopment analysis & Linear programming. The author has an h-index of 79 and has co-authored 254 publications receiving 76,641 citations. Previous affiliations of William W. Cooper include Harvard University & Carnegie Mellon University.


Papers
Journal ArticleDOI
TL;DR: In this article, chance constrained programming is used to separate risk from return behavior and to evaluate their relative strengths as sources of the negative risk-return relations observed in mutual fund data, which are found to lie more in the returns than in the risks.
Abstract: Chance constrained programming concepts are used to formalize risk and return relations which are then modeled for use in an empirical study of mutual fund behavior during the period 1984 through 1988. The publicly announced strategies of individual funds are used to form ex ante risk classifications which are employed in examining ex post performance. Negative relations between risk and return held in every year of the period studied. The bearing of these negative risk-return findings on the Bowman paradox, as studied in the strategic management literature, is thus extended from the industrial firms studied by Bowman (and others) and shown to be present even in these investment oriented mutual funds in each of the years of the great bull market from 1984 through 1988. Finally, our use of chance constrained programming enables us to separate risk from return behavior and to evaluate their relative strengths as sources of these negative relations, which are found to lie more in the returns than in the risks.
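For readers new to the technique, a generic chance-constrained portfolio formulation of the kind this literature builds on can be sketched as follows (an illustration, not the authors' exact model; r is the random return vector, R_0 a target return, and α a required probability level):

```latex
\max_{x}\; \mathbb{E}\!\left[r^{\top}x\right]
\quad \text{subject to} \quad
\Pr\!\left(r^{\top}x \ge R_{0}\right) \ge \alpha,
\qquad \sum_{j} x_{j} = 1, \quad x \ge 0 .
```

Under a normality assumption the chance constraint has the deterministic equivalent \mu^{\top}x - \Phi^{-1}(\alpha)\,\sqrt{x^{\top}\Sigma x} \ge R_{0} (for α ≥ 1/2), which is what allows the risk level α and the return target R_0 to be separated so that their relative strengths can be evaluated independently.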

49 citations

Journal ArticleDOI
01 Dec 1994-Top
TL;DR: The use of DEA has been rapidly expanding and has been accompanied by developments which have enhanced its power and enlarged its utility for additional applications, as discussed by the authors, including simulation studies comparing DEA with competing forms of statistical regression.
Abstract: Rapidly expanding uses of DEA have been accompanied by developments which have enhanced its power and enlarged its utility for additional applications. Developments covered in the present paper include simulation studies comparing DEA with competing forms of statistical regression. Other studies covered show how these two approaches can be combined in complementary fashion. Another part of this paper deals with chance constrained programming formulations which incorporate probabilistic elements into DEA. Included also are discussions of statistical characterizations, with accompanying tests of statistical significance, for DEA efficiency evaluations. This paper concludes with uses of DEA in "discovery processes": processes that need strengthening (and encouragement) in contemporary social science and management science research. Suggestions are made for additional research on further developments which extend the uses of DEA to provide new approaches in economics (including econometrics), management and psychology, and an Appendix introduces new or recently developed efficiency measures for use in DEA.
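To make the DEA-versus-regression comparison concrete, here is a minimal simulation sketch in the spirit of (but not taken from) the studies the paper surveys. It assumes numpy and scipy are available; the data-generating process and all names are illustrative:

```python
# Minimal sketch (illustrative, not from the paper): simulate one-input,
# one-output producers, score them with an input-oriented CCR DEA model,
# and compare with a log-linear OLS "average practice" fit.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n = 50
x = rng.uniform(1.0, 10.0, n)        # input levels (hypothetical data)
u = rng.uniform(0.5, 1.0, n)         # true efficiencies (unknown in practice)
y = u * 3.0 * x**0.7                 # output = efficiency * frontier output

def ccr_input_score(k, x, y):
    """Input-oriented CCR (CRS) envelopment LP for DMU k:
    min theta  s.t.  sum_j lam_j * x_j <= theta * x_k,
                     sum_j lam_j * y_j >= y_k,  lam >= 0."""
    n = len(x)
    c = np.r_[1.0, np.zeros(n)]                    # variables: (theta, lam)
    A_ub = np.vstack([np.r_[-x[k], x],             # sum lam*x - theta*x_k <= 0
                      np.r_[0.0, -y]])             # -sum lam*y <= -y_k
    b_ub = np.array([0.0, -y[k]])
    bounds = [(None, None)] + [(0.0, None)] * n    # theta free, lam >= 0
    return linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds).fun

dea = np.array([ccr_input_score(k, x, y) for k in range(n)])

# Log-linear OLS fit: log y = a + b*log x + e; residual ratio as "efficiency"
b, a = np.polyfit(np.log(x), np.log(y), 1)
ols = y / np.exp(a + b * np.log(x))

print("corr(DEA, true):", np.corrcoef(dea, u)[0, 1])
print("corr(OLS, true):", np.corrcoef(ols, u)[0, 1])
```

DEA envelops the data from above (best practice), while the regression line runs through the middle (average practice); simulation studies of the kind surveyed compare how well each recovers the true efficiencies.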

48 citations

Journal ArticleDOI
TL;DR: In this paper, an enhanced Russell graph measure (ERM) is proposed for data envelopment analysis (DEA), which utilizes a ratio measure in place of the standard formulations, and the resulting model is in the form of a fractional program.
Abstract: In aggregation for data envelopment analysis (DEA), a jointly determined aggregate measure of output and input efficiency is desired that is consistent with the individual decision making unit measures. An impasse has been reached in the current state of the literature, however, where only separate measures of input and output efficiency have resulted from attempts to aggregate technical efficiency with the radial measure models commonly employed in DEA. The latter measures are “incomplete” in that they omit the non-zero input and output slacks, and thus fail to account for all inefficiencies that the model can identify. The Russell measure eliminates the latter deficiency but is difficult to solve in standard formulations. A new approach has become available, however, which utilizes a ratio measure in place of the standard formulations. Referred to as an enhanced Russell graph measure (ERM), the resulting model is in the form of a fractional program. Hence, it can be transformed into an ordinary linear programming structure that can generate an optimal solution for the corresponding ERM model. As shown in this paper, an aggregate ERM can then be formed with all the properties considered to be desirable in an aggregate measure—including jointly determined input and output efficiency measures that represent separate estimates of input and output efficiency. Much of this paper is concerned with technical efficiency in both individual and system-wide efficiency measures. Weighting systems are introduced that extend to efficiency-based measures of cost, revenue, and profit, as well as derivatives such as rates of return over cost. The penultimate section shows how the solution to one model also generates optimal solutions to models with other objectives that include rates of return over cost and total profit. This is accomplished in the form of efficiency-adjusted versions of these commonly used measures of performance.
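For orientation, the ERM can be stated (a sketch in common DEA notation, with θ_i contracting the inputs and φ_r expanding the outputs of the DMU "o" under evaluation) as the fractional program

```latex
\min_{\theta,\varphi,\lambda}\;
\frac{\frac{1}{m}\sum_{i=1}^{m}\theta_{i}}{\frac{1}{s}\sum_{r=1}^{s}\varphi_{r}}
\quad \text{s.t.} \quad
\sum_{j}\lambda_{j}x_{ij} \le \theta_{i}x_{io}\;\;\forall i, \qquad
\sum_{j}\lambda_{j}y_{rj} \ge \varphi_{r}y_{ro}\;\;\forall r, \qquad
\theta_{i} \le 1,\;\; \varphi_{r} \ge 1,\;\; \lambda_{j} \ge 0 .
```

Because the objective is a ratio of linear forms, a Charnes-Cooper type change of variables (scale every variable by t = 1/(\frac{1}{s}\sum_{r}\varphi_{r}) and normalize that denominator to one) turns the fractional program into the ordinary linear programming structure the abstract refers to.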

48 citations

Journal ArticleDOI
TL;DR: In this article, generalizations of the warehousing model are discussed.
Abstract: (1955). Generalizations of the Warehousing Model. Journal of the Operational Research Society: Vol. 6, No. 4, pp. 131-172.

48 citations

Book ChapterDOI
TL;DR: This chapter presents some of the recently developed analytical methods for studying the sensitivity of DEA results to variations in the data, focusing on the stability of the classification of DMUs (decision making units) into efficient and inefficient performers.
Abstract: This chapter presents some of the recently developed analytical methods for studying the sensitivity of DEA results to variations in the data. The focus is on the stability of classification of DMUs (decision making units) into efficient and inefficient performers. Early work on this topic concentrated on developing algorithms for conducting such analyses after it was noted that standard approaches for conducting sensitivity analyses in linear programming could not be used in DEA. However, recent work has bypassed the need for such algorithms. It has also evolved from the early work that was confined to studying data variations in one input or output for one DMU. The newer methods described in this chapter make it possible to analyze the sensitivity of results when all data are varied simultaneously for all DMUs.
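To give the flavor of these simultaneous-variation analyses (a sketch in notation of my own, not the chapter's exact statement), one common worst-case scheme perturbs the tested DMU adversely while every other DMU is perturbed favorably:

```latex
\hat{x}_{io} = (1+\delta)\,x_{io}, \qquad
\hat{x}_{ij} = (1-\delta)\,x_{ij}\;\;(j \ne o), \qquad \delta \ge 0,
```

with analogous proportional changes for the outputs; a stability radius is then the largest δ for which DMU_o retains its efficient (or inefficient) classification.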

47 citations


Cited by
Journal ArticleDOI
TL;DR: A nonlinear (nonconvex) programming model provides a new definition of efficiency for use in evaluating the activities of not-for-profit entities participating in public programs, together with methods for objectively determining weights by reference to the observational data on the multiple outputs and multiple inputs that characterize such programs.
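This TL;DR corresponds to the original CCR efficiency definition, which can be sketched in now-standard notation (the strictly positive lower bound ε on the weights follows later statements of the model) as the ratio form

```latex
h_{o} = \max_{u,v}\;
\frac{\sum_{r} u_{r} y_{ro}}{\sum_{i} v_{i} x_{io}}
\quad \text{s.t.} \quad
\frac{\sum_{r} u_{r} y_{rj}}{\sum_{i} v_{i} x_{ij}} \le 1 \;\;(j = 1,\dots,n),
\qquad u_{r},\, v_{i} \ge \varepsilon > 0,
```

so the weights u, v are determined by the program itself from the observed outputs y and inputs x, putting each evaluated unit in the best light consistent with no unit scoring above one.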

25,433 citations

Journal ArticleDOI
TL;DR: The CCR ratio form introduced by Charnes, Cooper and Rhodes, as part of their Data Envelopment Analysis approach, comprehends both technical and scale inefficiencies via the optimal value of the ratio form, as obtained directly from the data without requiring a priori specification of weights and/or explicit delineation of assumed functional forms of relations between inputs and outputs.
Abstract: In management contexts, mathematical programming is usually used to evaluate a collection of possible alternative courses of action en route to selecting one which is best. In this capacity, mathematical programming serves as a planning aid to management. Data Envelopment Analysis reverses this role and employs mathematical programming to obtain ex post facto evaluations of the relative efficiency of management accomplishments, however they may have been planned or executed. Mathematical programming is thereby extended for use as a tool for control and evaluation of past accomplishments as well as a tool to aid in planning future activities. The CCR ratio form introduced by Charnes, Cooper and Rhodes, as part of their Data Envelopment Analysis approach, comprehends both technical and scale inefficiencies via the optimal value of the ratio form, as obtained directly from the data without requiring a priori specification of weights and/or explicit delineation of assumed functional forms of relations between inputs and outputs. A separation into technical and scale efficiencies is accomplished by the methods developed in this paper without altering the latter conditions for use of DEA directly on observational data. Technical inefficiencies are identified with failures to achieve best possible output levels and/or usage of excessive amounts of inputs. Methods for identifying and correcting the magnitudes of these inefficiencies, as supplied in prior work, are illustrated. In the present paper, a new separate variable is introduced which makes it possible to determine whether operations were conducted in regions of increasing, constant or decreasing returns to scale in multiple input and multiple output situations. The results are discussed and related not only to classical single output economics but also to more modern versions of economics which are identified with "contestable market theories."
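Two standard steps behind this abstract may help the reader; both are sketched here in textbook form rather than quoted from the paper. First, the Charnes-Cooper transformation (set t = 1/\sum_{i} v_{i}x_{io} and scale \mu = t\,u, \nu = t\,v) turns the ratio form into a linear program. Second, the new separate variable the abstract mentions enters, in the usual multiplier statement of the resulting BCC model, as a free variable u_0:

```latex
\max_{\mu,\nu,u_{0}}\; \sum_{r}\mu_{r}y_{ro} - u_{0}
\quad \text{s.t.} \quad
\sum_{i}\nu_{i}x_{io} = 1, \qquad
\sum_{r}\mu_{r}y_{rj} - u_{0} \le \sum_{i}\nu_{i}x_{ij}\;\;\forall j, \qquad
\mu,\nu \ge 0,\; u_{0}\ \text{free}.
```

In one common sign convention, u_0^* < 0 in every optimal solution indicates increasing returns to scale at the evaluated DMU, u_0^* = 0 in some optimal solution constant returns, and u_0^* > 0 in every optimal solution decreasing returns.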

14,941 citations

Book
31 Jul 1985
TL;DR: The book updates the research agenda with chapters on possibility theory, fuzzy logic and approximate reasoning, expert systems, fuzzy control, fuzzy data analysis, decision making and fuzzy set models in operations research.
Abstract: Fuzzy Set Theory - And Its Applications, Third Edition is a textbook for courses in fuzzy set theory. It can also be used as an introduction to the subject. The character of a textbook is balanced with the dynamic nature of the research in the field by including many useful references to develop a deeper understanding among interested readers. The book updates the research agenda (which has witnessed profound and startling advances since its inception some 30 years ago) with chapters on possibility theory, fuzzy logic and approximate reasoning, expert systems, fuzzy control, fuzzy data analysis, decision making and fuzzy set models in operations research. All chapters have been updated. Exercises are included.

7,877 citations

Journal ArticleDOI
01 May 1981
TL;DR: This work discusses detecting influential observations and outliers, detecting and assessing collinearity, and applications and remedies for both.
Abstract: 1. Introduction and Overview. 2. Detecting Influential Observations and Outliers. 3. Detecting and Assessing Collinearity. 4. Applications and Remedies. 5. Research Issues and Directions for Extensions. Bibliography. Author Index. Subject Index.

4,948 citations

Book
30 Nov 1999
TL;DR: In this book, the basic CCR model, alternative DEA models, returns to scale, models with restricted multipliers, and discretionary, non-discretionary and categorical variables are discussed.
Abstract: List of Tables. List of Figures. Preface. 1. General Discussion. 2. The Basic CCR Model. 3. The CCR Model and Production Correspondence. 4. Alternative DEA Models. 5. Returns to Scale. 6. Models with Restricted Multipliers. 7. Discretionary, Non-Discretionary and Categorical Variables. 8. Allocation Models. 9. Data Variations. Appendices. Index.

4,395 citations