Author
Chiang Kao
Other affiliations: Oregon State University
Bio: Chiang Kao is an academic researcher from National Cheng Kung University. The author has contributed to research in the topics of data envelopment analysis and fuzzy numbers, has an h-index of 42, and has co-authored 167 publications receiving 7,834 citations. Previous affiliations of Chiang Kao include Oregon State University.
Papers
TL;DR: The relational model developed in this paper is more reliable in measuring the efficiencies and consequently is capable of identifying the causes of inefficiency more accurately.
Abstract: The efficiency of decision processes which can be divided into two stages has been measured for the whole process as well as for each stage independently by using the conventional data envelopment analysis (DEA) methodology in order to identify the causes of inefficiency. This paper modifies the conventional DEA model by taking into account the series relationship of the two sub-processes within the whole process. Under this framework, the efficiency of the whole process can be decomposed into the product of the efficiencies of the two sub-processes. In addition to this sound mathematical property, the case of Taiwanese non-life insurance companies shows that some unusual results which have appeared in the independent model do not exist in the relational model. In other words, the relational model developed in this paper is more reliable in measuring the efficiencies and consequently is capable of identifying the causes of inefficiency more accurately. Based on the structure of the model, the idea of efficiency decomposition can be extended to systems composed of multiple stages connected in series.
934 citations
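The conventional, black-box efficiency that the relational model refines can be illustrated with a standard input-oriented CCR model in multiplier form. The sketch below is not the paper's relational formulation: the three-DMU data and the helper name `ccr_efficiency` are made up, and `scipy.optimize.linprog` is assumed to be available.

```python
# Illustrative sketch: input-oriented CCR efficiency (multiplier form),
# the conventional "black-box" measure that the relational model decomposes.
# Data and names are hypothetical, not taken from the paper.
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Efficiency of DMU o, given inputs X (n x m) and outputs Y (n x s)."""
    n, m = X.shape
    s = Y.shape[1]
    # Variables: [u_1..u_s, v_1..v_m]; maximize u.y_o, i.e. minimize -u.y_o
    c = np.concatenate([-Y[o], np.zeros(m)])
    # For every DMU j: u.y_j - v.x_j <= 0
    A_ub = np.hstack([Y, -X])
    b_ub = np.zeros(n)
    # Normalization: v.x_o = 1
    A_eq = np.concatenate([np.zeros(s), X[o]]).reshape(1, -1)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=(0, None))
    return -res.fun

X = np.array([[2.0], [4.0], [4.0]])  # one input
Y = np.array([[2.0], [2.0], [4.0]])  # one output
scores = [ccr_efficiency(X, Y, o) for o in range(3)]  # ~ [1.0, 0.5, 1.0]
```

In the paper's relational model, one such score per sub-process would be linked through shared multipliers so that the whole-process efficiency equals the product of the stage efficiencies.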
TL;DR: A procedure to measure the efficiencies of DMUs with fuzzy observations is developed by applying the α-cut approach; by extending DEA to the fuzzy environment, the approach is made more powerful for applications.
Abstract: The existing data envelopment analysis (DEA) models for measuring the relative efficiencies of a set of decision making units (DMUs) using various inputs to produce various outputs are limited to crisp data. To deal with imprecise data, the notion of fuzziness has been introduced. This paper develops a procedure to measure the efficiencies of DMUs with fuzzy observations. The basic idea is to transform a fuzzy DEA model to a family of conventional crisp DEA models by applying the α-cut approach. A pair of parametric programs is formulated to describe that family of crisp DEA models, via which the membership functions of the efficiency measures are derived. Since the efficiency measures are expressed by membership functions rather than by crisp values, more information is provided for management. By extending to the fuzzy environment, the DEA approach is made more powerful for applications.
422 citations
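The α-cut idea underlying the procedure can be shown on a single observation. A minimal sketch, assuming triangular membership functions; the function name and numbers are illustrative, not from the paper:

```python
# Illustrative sketch: the alpha-cut of a triangular fuzzy number (a, b, c),
# i.e. the crisp interval fed to each member of the family of crisp DEA models.
def alpha_cut(a, b, c, alpha):
    """Return the interval (lower, upper) at membership level alpha."""
    return (a + alpha * (b - a), c - alpha * (c - b))

# A fuzzy input "about 4", modelled as the triangular number (3, 4, 6):
lo0, hi0 = alpha_cut(3, 4, 6, 0.0)  # (3.0, 6.0): the full support
lo1, hi1 = alpha_cut(3, 4, 6, 1.0)  # (4.0, 4.0): the modal value
```

Solving the DEA model with each DMU's data pushed to the favourable or unfavourable end of such intervals yields the pair of parametric programs that bound the membership function of the efficiency measure.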
TL;DR: This paper builds a relational network DEA model, taking into account the interrelationship of the processes within the system, to measure the efficiency of the system and those of the processes at the same time; the system efficiency is decomposed into the product of the efficiencies of the stages in series, and the inefficiency slack of each stage into the sum of the inefficiency slacks of its component processes connected in parallel.
Abstract: Traditional studies in data envelopment analysis (DEA) view systems as a whole when measuring the efficiency, ignoring the operation of individual processes within a system. This paper builds a relational network DEA model, taking into account the interrelationship of the processes within the system, to measure the efficiency of the system and those of the processes at the same time. The system efficiency thus measured more properly represents the aggregate performance of the component processes. By introducing dummy processes, the original network system can be transformed into a series system where each stage in the series is of a parallel structure. Based on these series and parallel structures, the efficiency of the system is decomposed into the product of the efficiencies of the stages in the series and the inefficiency slack of each stage into the sum of the inefficiency slacks of its component processes connected in parallel. With efficiency decomposition, the process which causes the inefficient operation of the system can be identified for future improvement. An example of the non-life insurance industry in Taiwan illustrates the whole idea.
378 citations
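The decomposition property described above can be checked numerically. A toy sketch with hypothetical stage efficiencies (the numbers are not from the paper):

```python
# Illustrative sketch: in a series system, the relational model's system
# efficiency equals the product of the stage efficiencies (hypothetical values).
import math

stage_efficiencies = [0.9, 0.8, 0.75]  # three stages in series
system_efficiency = math.prod(stage_efficiencies)  # ~ 0.54
# The least efficient stage (here the third) is the first target for improvement.
worst_stage = min(range(len(stage_efficiencies)),
                  key=lambda i: stage_efficiencies[i])
```

This is the sense in which efficiency decomposition points at the process causing the inefficient operation of the system.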
TL;DR: This paper reviews studies on network DEA by examining the models used and the structures of the network system of the problem being studied, and highlights some directions for future studies from the methodological point of view.
Abstract: Network data envelopment analysis (DEA) concerns using the DEA technique to measure the relative efficiency of a system, taking into account its internal structure. The results are more meaningful and informative than those obtained from the conventional black-box approach, where the operations of the component processes are ignored. This paper reviews studies on network DEA by examining the models used and the structures of the network system of the problem being studied. This review highlights some directions for future studies from the methodological point of view, and is inspirational for exploring new areas of application from the empirical point of view.
339 citations
01 Feb 2010
TL;DR: A network DEA model is discussed which distributes the system inefficiency to its component processes and is applied to assess the impact of information technology (IT) on firm performance in the banking industry.
Abstract: A recent development in DEA (data envelopment analysis) examines the internal structure of a system so that more information regarding the sources that cause inefficiency can be obtained. This paper discusses a network DEA model which distributes the system inefficiency to its component processes. The model is applied to assess the impact of information technology (IT) on firm performance in the banking industry. The results show that the impact of IT on firm performance operates indirectly through fund collection. The impact increases when the IT budget is shared with the profit generation process.
233 citations
Cited by
01 May 1975
TL;DR: Fundamentals of Queueing Theory, Fourth Edition provides a comprehensive outline of simple and more advanced queueing models, with a self-contained presentation of key concepts and formulae in each chapter.
Abstract: Praise for the Third Edition: "This is one of the best books available. Its excellent organizational structure allows quick reference to specific models and its clear presentation . . . solidifies the understanding of the concepts being presented." (IIE Transactions on Operations Engineering) Thoroughly revised and expanded to reflect the latest developments in the field, Fundamentals of Queueing Theory, Fourth Edition continues to present the basic statistical principles that are necessary to analyze the probabilistic nature of queues. Rather than presenting a narrow focus on the subject, this update illustrates the wide-reaching, fundamental concepts in queueing theory and its applications to diverse areas such as computer science, engineering, business, and operations research. This update takes a numerical approach to understanding and making probable estimations relating to queues, with a comprehensive outline of simple and more advanced queueing models. Newly featured topics of the Fourth Edition include: retrial queues; approximations for queueing networks; numerical inversion of transforms; and determining the appropriate number of servers to balance quality and cost of service. Each chapter provides a self-contained presentation of key concepts and formulae, allowing readers to work with each section independently, while a summary table at the end of the book outlines the types of queues that have been discussed and their results. In addition, two new appendices have been added, discussing transforms and generating functions as well as the fundamentals of differential and difference equations. New examples are now included along with problems that incorporate QtsPlus software, which is freely available via the book's related Web site. With its accessible style and wealth of real-world examples, Fundamentals of Queueing Theory, Fourth Edition is an ideal book for courses on queueing theory at the upper-undergraduate and graduate levels.
It is also a valuable resource for researchers and practitioners who analyze congestion in the fields of telecommunications, transportation, aviation, and management science.
2,562 citations
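As a flavour of the simple models the book outlines, the steady-state M/M/1 formulas make a compact example. A minimal sketch with illustrative rates (not taken from the book):

```python
# Illustrative sketch: steady-state M/M/1 results, with a Little's law check.
lam, mu = 4.0, 5.0              # arrival and service rates; lam < mu required
rho = lam / mu                  # server utilization, here 0.8
L = rho / (1 - rho)             # mean number in system
W = 1 / (mu - lam)              # mean time in system
assert abs(L - lam * W) < 1e-9  # Little's law: L = lam * W
```

With these rates the server is busy 80% of the time, yet the average customer already spends a full service-time-plus-wait of W = 1 in the system, the kind of nonlinearity the book's models quantify.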
01 Jan 1994
TL;DR: This chapter imagines a decision maker (or a group of experts) trying to establish or examine fair procedures for combining opinions about alternatives related to different points of view.
Abstract: In this chapter, we imagine a decision maker (or a group of experts) trying to establish or examine fair procedures to combine opinions about alternatives related to different points of view.
1,237 citations
TL;DR: A state-of-the-art literature survey is conducted to taxonomize the research on TOPSIS applications and methodologies and suggests a framework for future attempts in this area for academic researchers and practitioners.
Abstract: Multi-Criteria Decision Aid (MCDA) or Multi-Criteria Decision Making (MCDM) methods have received much attention from researchers and practitioners in evaluating, assessing and ranking alternatives across diverse industries. Among numerous MCDA/MCDM methods developed to solve real-world decision problems, the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS) continues to work satisfactorily across different application areas. In this paper, we conduct a state-of-the-art literature survey to taxonomize the research on TOPSIS applications and methodologies. The classification scheme for this review contains 266 scholarly papers from 103 journals since the year 2000, separated into nine application areas: (1) Supply Chain Management and Logistics, (2) Design, Engineering and Manufacturing Systems, (3) Business and Marketing Management, (4) Health, Safety and Environment Management, (5) Human Resources Management, (6) Energy Management, (7) Chemical Engineering, (8) Water Resources Management and (9) Other topics. Scholarly papers in the TOPSIS discipline are further interpreted based on (1) publication year, (2) publication journal, (3) authors' nationality and (4) other methods combined or compared with TOPSIS. We end our review paper with recommendations for future research in TOPSIS decision-making that is both forward-looking and practically oriented. This paper provides useful insights into the TOPSIS method and suggests a framework for future attempts in this area for academic researchers and practitioners.
1,162 citations
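The TOPSIS steps the survey refers to (normalize, weight, measure distances to the ideal and anti-ideal solutions, rank by relative closeness) fit in a few lines. A minimal sketch on made-up data; the decision matrix, weights, and criterion types below are illustrative assumptions, not from the survey:

```python
# Illustrative sketch of TOPSIS: rank alternatives by relative closeness
# to the ideal solution. Data and weights are hypothetical.
import numpy as np

def topsis(D, w, benefit):
    """D: (alternatives x criteria) matrix; w: weights; benefit: True where larger is better."""
    R = D / np.linalg.norm(D, axis=0)          # vector normalization
    V = R * w                                  # weighted normalized matrix
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.linalg.norm(V - ideal, axis=1)  # distance to ideal solution
    d_neg = np.linalg.norm(V - anti, axis=1)   # distance to anti-ideal solution
    return d_neg / (d_pos + d_neg)             # closeness: higher is better

D = np.array([[250.0, 16.0], [200.0, 12.0], [300.0, 20.0]])  # e.g. cost, quality
w = np.array([0.5, 0.5])
benefit = np.array([False, True])  # cost is minimized, quality is maximized
scores = topsis(D, w, benefit)     # the third alternative ranks first here
```

Each closeness score lies in [0, 1], with 1 meaning coincidence with the ideal solution; ranking alternatives by this score is the final TOPSIS step.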
TL;DR: A comprehensive review of the work done, from 1968 to 2005, in the application of statistical and intelligent techniques to solve the bankruptcy prediction problem faced by banks and firms is presented.
Abstract: This paper presents a comprehensive review of the work done, during 1968–2005, in the application of statistical and intelligent techniques to solve the bankruptcy prediction problem faced by banks and firms. The review is categorized by taking the type of technique applied to solve this problem as an important dimension. Accordingly, the papers are grouped into the following families of techniques: (i) statistical techniques, (ii) neural networks, (iii) case-based reasoning, (iv) decision trees, (v) operational research, (vi) evolutionary approaches, (vii) rough set based techniques, (viii) other techniques subsuming fuzzy logic, support vector machines and isotonic separation, and (ix) soft computing subsuming seamless hybridization of all the above-mentioned techniques. Of particular significance is that for each paper, the review highlights the source of the data sets, the financial ratios used, the country of origin, the time line of the study and the comparative performance of techniques in terms of prediction accuracy wherever available. The review also lists some important directions for future research.
897 citations
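The oldest family in the review, plain statistical techniques, can be illustrated with a logistic model on two synthetic "financial ratios". Everything below is fabricated for illustration; no real bankruptcy data or ratios from the surveyed papers are used:

```python
# Illustrative sketch: logistic regression (a "statistical technique") fitted
# by plain gradient descent on synthetic standardized ratios.
import numpy as np

rng = np.random.default_rng(0)
n = 200
X = rng.normal(size=(n, 2))                # e.g. standardized liquidity, leverage
y = (X[:, 0] - X[:, 1] > 0).astype(float)  # synthetic "bankrupt" label

w = np.zeros(2)
for _ in range(500):                       # gradient descent on the log-loss
    p = 1 / (1 + np.exp(-X @ w))
    w -= 0.1 * (X.T @ (p - y)) / n

p_hat = 1 / (1 + np.exp(-X @ w))
accuracy = float(((p_hat > 0.5) == (y > 0.5)).mean())  # high on separable data
```

The later families in the review (neural networks, decision trees, support vector machines, and their hybrids) replace this linear score with more flexible discriminants fitted to the same kind of ratio data.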