
Showing papers by "Kaoru Tone published in 2015"


Book ChapterDOI
TL;DR: In this chapter, the authors make an exhaustive critical review of various possible estimation methods of scale economies in a non-parametric data envelopment analysis approach, and discuss the strengths and weaknesses of each of these estimation methods.
Abstract: This contribution attempts an exhaustive critical review of various possible estimation methods of scale economies in a non-parametric data envelopment analysis approach. Three types of technology structure are found to be adopted for such estimation exercises: piecewise linear, piecewise log-linear, and FDH. These technology structures are built up either in input-output space or in cost-output space. The strengths and weaknesses of each of these estimation methods are discussed. Which method to use in any empirical application depends on an examination of several issues: (1) whether factor inputs are indivisible, (2) whether price data are available and, if available, whether they are well measured with certainty, and (3) whether non-convexities are present in the underlying production technology.

23 citations


Journal ArticleDOI
TL;DR: The proposed Intertemporal DEA models are applied to the performance evaluation of high-tech Integrated Circuit design companies in Taiwan to demonstrate their advantages over other DEA models that ignore intertemporal efficiency.
Abstract: It has been well recognized that to thoroughly evaluate a firm’s performance, the evaluator must assess not only its past and present records but also future potential. However, to the best of our knowledge, there are no data envelopment analysis (DEA)-type models proposed in the literature that simultaneously take past, present and, especially, future performance indicators into account. Hence, this research aims at developing a new type of DEA model referred to as Intertemporal DEA models that can be used to fully measure a firm’s efficiency by explicitly considering its key inputs and outputs involving the past-present-future time span. In this research, the proposed Intertemporal DEA models are applied to the performance evaluation of high-tech Integrated Circuit design companies in Taiwan to demonstrate their advantages over other DEA models that ignore intertemporal efficiency.

17 citations


Journal ArticleDOI
TL;DR: A scale-dependent data set is defined by which the scale elasticity of each decision-making unit can be found, and this model is applied to a data set of Japanese universities’ research activities.
Abstract: In data envelopment analysis, we are often puzzled by the large difference between the constant returns-to-scale and variable returns-to-scale scores, and by the convexity assumption on the production set despite the S-shaped curves often observed in real data sets. In this paper, we propose a solution to these problems. Initially, we evaluate the constant returns-to-scale and variable returns-to-scale scores for all decision-making units by conventional methods, obtaining the scale efficiency of each decision-making unit. Using the scale efficiency, we decompose the constant returns-to-scale slacks of each decision-making unit into scale-independent and scale-dependent parts. We then eliminate the scale-dependent slacks from the data set, obtaining a scale-independent data set. Next, we classify the decision-making units into several clusters, depending either on their degree of scale efficiency or on some other predetermined characteristics. We evaluate the slacks of the scale-independent decision-making units within the same cluster using the constant returns-to-scale model, obtaining the in-cluster slacks. Summing the scale-dependent and in-cluster slacks, we define the total slacks for each decision-making unit. We then evaluate the efficiency score of each decision-making unit and project it onto the efficient frontiers, which are no longer guaranteed to be convex and are usually non-convex. Finally, we define the scale-dependent data set, from which the scale elasticity of each decision-making unit can be found. We apply this model to a data set of Japanese universities' research activities.
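The opening step of this procedure, computing constant and variable returns-to-scale radial scores and taking their ratio as the scale efficiency, can be sketched with the standard input-oriented envelopment LPs (the CCR and BCC models). The sketch below uses `scipy.optimize.linprog` on made-up single-input, single-output data; it illustrates only this first step, not the paper's full slack decomposition.

```python
import numpy as np
from scipy.optimize import linprog

def dea_efficiency(X, Y, o, vrs=False):
    """Input-oriented radial efficiency of DMU `o` (envelopment form).

    X: (n, m) input matrix, Y: (n, s) output matrix for n DMUs.
    With vrs=True the convexity constraint sum(lambda) = 1 is added
    (BCC model); otherwise the CRS (CCR) model is solved.
    """
    n, m = X.shape
    s = Y.shape[1]
    # Decision variables: [theta, lambda_1, ..., lambda_n]
    c = np.zeros(n + 1)
    c[0] = 1.0                                   # minimise theta
    A_ub = np.zeros((m + s, n + 1))
    b_ub = np.zeros(m + s)
    # Inputs: sum_j lambda_j x_ij <= theta * x_io
    A_ub[:m, 0] = -X[o]
    A_ub[:m, 1:] = X.T
    # Outputs: sum_j lambda_j y_rj >= y_ro
    A_ub[m:, 1:] = -Y.T
    b_ub[m:] = -Y[o]
    A_eq, b_eq = None, None
    if vrs:
        A_eq = np.zeros((1, n + 1))
        A_eq[0, 1:] = 1.0
        b_eq = np.array([1.0])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(None, None)] + [(0, None)] * n, method="highs")
    return res.x[0]

# Made-up data: three DMUs, one input, one output.
X = np.array([[2.0], [4.0], [8.0]])
Y = np.array([[1.0], [3.0], [4.0]])
for o in range(len(X)):
    crs = dea_efficiency(X, Y, o)
    vrs = dea_efficiency(X, Y, o, vrs=True)
    print(o, crs, vrs, crs / vrs)     # scale efficiency = CRS / VRS
```

The scale efficiency printed in the last column is the quantity the paper uses to split the CRS slacks into scale-independent and scale-dependent parts.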

6 citations


Posted Content
TL;DR: In this paper, a closed convex set containing no line is expressed as the direct sum of the convex hull of its extreme points and the conical hull of its extreme directions.
Abstract: In order to express a polyhedron as the (Minkowski) sum of a polytope and a polyhedral cone, Motzkin (1936) made a transition from the polyhedron to a polyhedral cone. Based on his excellent idea, we represent a set by a characteristic cone. Using this representation, we then reach four main results: (i) expressing a closed convex set containing no line as the direct sum of the convex hull of its extreme points and the conical hull of its extreme directions, (ii) establishing a convex programming (CP) based framework for determining a maximal element (an element with the maximum number of positive components) of a convex set, (iii) developing a linear programming problem for finding a relative interior point of a polyhedron, and (iv) proposing two procedures for the identification of a strictly complementary solution in linear programming.
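Result (iii) is a linear program for finding a relative interior point of a polyhedron. As a simplified stand-in for the paper's formulation (not the authors' exact LP), the sketch below maximises the minimum slack of the system Ax ≤ b, with an auxiliary variable t capped at 1 to keep the LP bounded; a positive optimal t certifies that x lies strictly inside every inequality, i.e. in the interior of a full-dimensional polyhedron.

```python
import numpy as np
from scipy.optimize import linprog

def interior_point(A, b):
    """Point of {x : A x <= b} maximising the minimum slack.

    Variables are (x, t); we maximise t subject to A x + t*1 <= b
    and t <= 1 (the cap keeps the LP bounded). Returns (x, t):
    if t > 0, every inequality holds strictly at x.
    """
    m, n = A.shape
    c = np.zeros(n + 1)
    c[-1] = -1.0                                  # maximise t
    A_ub = np.hstack([A, np.ones((m, 1))])        # A x + t <= b
    res = linprog(c, A_ub=A_ub, b_ub=b,
                  bounds=[(None, None)] * n + [(None, 1.0)],
                  method="highs")
    return res.x[:n], res.x[n]

# Unit square 0 <= x1, x2 <= 1 written as A x <= b.
A = np.array([[1.0, 0.0], [-1.0, 0.0], [0.0, 1.0], [0.0, -1.0]])
b = np.array([1.0, 0.0, 1.0, 0.0])
x, t = interior_point(A, b)
```

For the unit square this returns the centre (0.5, 0.5) with minimum slack 0.5.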

5 citations


Posted Content
TL;DR: Mehdiloozad et al. propose a two-stage linear programming-based approach that identifies the two potential sources of origin of multiple supporting hyperplanes in the non-radial DEA setting.
Abstract: In measuring returns to scale in data envelopment analysis (DEA), the occurrence of multiple supporting hyperplanes has been perceived as a crucial issue. To deal with this effectively in the weight-restrictions (WR) framework, we first precisely identify the two potential sources of its origin in the non-radial DEA setting. If the firm under evaluation P is WR-efficient, the non-full-dimensionality of its corresponding P-face (a face of minimum dimension that contains P) is the unique source of origin (problem Type I). Otherwise, the occurrence of multiple WR-projections or, correspondingly, multiple P-faces becomes an additional source of origin (problem Type II). To the best of our knowledge, while problem Type I has been correctly addressed in the literature, the simultaneous occurrence of problems of Types I and II has not been dealt with effectively. Motivated by this, we first show that problem Type II can be circumvented by using a P-face containing all the P-faces. Based on this finding, we then devise a two-stage linear-programming-based procedure by extending a recently developed methodology by [Mehdiloozad, M., Mirdehghan, S. M., Sahoo, B. K., & Roshdi, I. (2015). On the identification of the global reference set in data envelopment analysis. European Journal of Operational Research, 245, 779-788]. Our proposed method inherits all the advantages of that method and is computationally efficient. Its practical applicability is demonstrated through a real-world data set of 80 Iranian secondary schools.

4 citations


Posted Content
TL;DR: In this article, a new scheme for finding the nearest point on the efficient frontiers of the production possibility set is presented, which requires a limited number of additional linear program solutions for each inefficient DMU.
Abstract: The slacks-based measure (SBM) (Tone (2001), Pastor et al. (1999)) has been widely utilized as a representative non-radial DEA model. In Tone (2010), I developed four variants of the SBM model whose main concern is searching for the nearest point on the efficient frontiers of the production possibility set. However, in the worst case, a massive enumeration of the facets of the polyhedron associated with the production possibility set is required. In this paper, I present a new scheme for this purpose that requires only a limited number of additional linear program solutions for each inefficient DMU. Although the point thus obtained is not always the nearest point, it is acceptable for practical purposes and in terms of computational load.
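The baseline SBM model that these variants extend is the fractional program of Tone (2001), which becomes a linear program after the Charnes-Cooper transformation: minimise t − (1/m)Σ Sᵢ⁻/xᵢₒ subject to t + (1/q)Σ Sᵣ⁺/yᵣₒ = 1, XᵀΛ + S⁻ = t·xₒ and YᵀΛ − S⁺ = t·yₒ. The sketch below implements that standard linearisation on made-up data; the paper's nearest-point scheme itself is not reproduced here.

```python
import numpy as np
from scipy.optimize import linprog

def sbm_efficiency(X, Y, o):
    """SBM score of DMU `o` via the Charnes-Cooper linearisation.

    X: (n, m) inputs, Y: (n, q) outputs. Decision variables are
    [t, Lambda (n), S_minus (m), S_plus (q)], all nonnegative;
    the optimal objective value is the SBM score rho in (0, 1].
    """
    n, m = X.shape
    q = Y.shape[1]
    nv = 1 + n + m + q
    c = np.zeros(nv)
    c[0] = 1.0
    c[1 + n:1 + n + m] = -1.0 / (m * X[o])       # -(1/m) sum S_i^- / x_io
    A_eq = np.zeros((1 + m + q, nv))
    b_eq = np.zeros(1 + m + q)
    # Normalisation: t + (1/q) sum S_r^+ / y_ro = 1
    A_eq[0, 0] = 1.0
    A_eq[0, 1 + n + m:] = 1.0 / (q * Y[o])
    b_eq[0] = 1.0
    # Inputs: X^T Lambda + S^- = t * x_o
    A_eq[1:1 + m, 0] = -X[o]
    A_eq[1:1 + m, 1:1 + n] = X.T
    A_eq[1:1 + m, 1 + n:1 + n + m] = np.eye(m)
    # Outputs: Y^T Lambda - S^+ = t * y_o
    A_eq[1 + m:, 0] = -Y[o]
    A_eq[1 + m:, 1:1 + n] = Y.T
    A_eq[1 + m:, 1 + n + m:] = -np.eye(q)
    res = linprog(c, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * nv, method="highs")
    return res.fun

# Made-up data: three DMUs, two inputs, one output.
X = np.array([[2.0, 3.0], [4.0, 1.0], [5.0, 5.0]])
Y = np.array([[1.0], [1.0], [1.0]])
rho = [sbm_efficiency(X, Y, o) for o in range(3)]
```

SBM-efficient DMUs (no input or output slacks at the optimum) score exactly 1; dominated units, such as the third DMU here, score strictly below 1.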

1 citation