Journal ArticleDOI

Accentuating the rank positions in an agreement index with reference to a consensus order

01 Nov 2015 - International Transactions in Operational Research (John Wiley & Sons, Ltd) - Vol. 22, Iss. 6, pp. 969-995
TL;DR: An index is proposed that measures the agreement level between an individual opinion and a collective opinion when both are expressed by rankings of a set of alternatives; it is an interesting weighted version of the well-known Kendall rank correlation index.
About: This article was published in International Transactions in Operational Research on 2015-11-01 and has received 10 citations so far. The article focuses on the topic: Index (economics).
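Per the TL;DR above, the proposed index is a position-weighted variant of Kendall's rank correlation. The paper's exact weighting scheme is not reproduced here; the minimal Python sketch below only illustrates the general idea of a Kendall-type agreement index in which pair disagreements near the top of the consensus order count more, with the weight function 1/(1+r) chosen purely for illustration.

from itertools import combinations

def weighted_kendall_agreement(individual, consensus, weight=lambda r: 1.0 / (1 + r)):
    """Position-weighted Kendall-type agreement between two linear orders.

    Both arguments map each alternative to its rank position (0 = best).
    Every pair of alternatives contributes +w if the two rankings order it
    the same way and -w otherwise, where w depends on the pair's best
    position in the consensus ranking, so disagreements near the top of the
    consensus order are accentuated. The result lies in [-1, 1], like
    ordinary Kendall's tau."""
    num, den = 0.0, 0.0
    for a, b in combinations(list(consensus), 2):
        w = weight(min(consensus[a], consensus[b]))
        same = (individual[a] < individual[b]) == (consensus[a] < consensus[b])
        num += w if same else -w
        den += w
    return num / den

# A single disagreement at the bottom of the order barely lowers the index.
consensus = {"A": 0, "B": 1, "C": 2, "D": 3}
individual = {"A": 0, "B": 1, "C": 3, "D": 2}
print(weighted_kendall_agreement(individual, consensus))  # about 0.85

With a constant weight function (lambda r: 1.0) the same code reduces to the ordinary, unweighted Kendall index.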
Citations
Journal ArticleDOI
TL;DR: In this paper, Evaluation and Decision Models: A Critical Perspective is reviewed; the authors do not, however, discuss the evaluation of decision models in terms of their performance in the field of operational research.
Abstract: (2002). Evaluation and Decision Models: a Critical Perspective. Journal of the Operational Research Society: Vol. 53, No. 7, pp. 809-809.

91 citations

Journal ArticleDOI
TL;DR: The system is designed to provide advice to the decision-makers to increase group consensus level while maintaining the individual consistency of each decision-maker and is based on the use of fuzzy outranking relations to model individual and group preferences.
Abstract: In this paper, we present an implemented, web-based multicriteria group decision support system for solving multicriteria ranking problems by a collaborative group of decision-makers in sequential or parallel coordination mode and in a distributed and asynchronous environment. This system employs an order-based consensus model for collaborative groups that moves from consistency to consensus. The system is based on consensus measures and it has been designed to provide advice to the decision-makers to increase group consensus level while maintaining the individual consistency of each decision-maker. It is based on the use of fuzzy outranking relations to model individual and group preferences. For the exploitation of the models—formulated as a multiobjective optimization problem and solved with a multiobjective evolutionary algorithm—the system generates advice on how decision-makers should change their preferences to reach a ranking of alternatives with a high degree of consistency and consensus.
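The system described above is built on consensus measures over the decision-makers' preferences. Its actual measures are defined on fuzzy outranking relations, which are not reproduced here; the sketch below only illustrates one common way to express a group consensus level as the average pairwise Kendall agreement between the decision-makers' rank vectors, rescaled to [0, 1].

from itertools import combinations
from scipy.stats import kendalltau

def group_consensus_level(rank_vectors):
    """Average pairwise Kendall agreement across all decision-makers.

    rank_vectors[i][k] is the rank position decision-maker i assigns to
    alternative k. Kendall's tau is rescaled from [-1, 1] to [0, 1], so a
    value of 1 means every pair of decision-makers agrees on every pair of
    alternatives."""
    pairs = list(combinations(range(len(rank_vectors)), 2))
    total = 0.0
    for i, j in pairs:
        tau, _ = kendalltau(rank_vectors[i], rank_vectors[j])
        total += (tau + 1) / 2
    return total / len(pairs)

# Three decision-makers ranking four alternatives (0 = best).
print(group_consensus_level([[0, 1, 2, 3], [0, 2, 1, 3], [1, 0, 2, 3]]))  # about 0.78

A scalar level of this kind is what a consensus threshold can be applied to; the system in the article works with fuzzy outranking relations rather than crisp rank vectors.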

29 citations

Book ChapterDOI
29 Mar 2015
TL;DR: This paper shows how the inferring-parameters model may be used to help decision makers with different interests iteratively reach an agreement on how to rank the cities, reflecting the preferences at the individual level and at the collective level.
Abstract: This work is based on a disaggregation approach for the ELECTRE III method for group decision-making. We provide a procedure in which the group is supported in modifying the parameters of outranking methods in an iterative and interactive process. In this work, we provide an application of the procedure by evaluating eight municipal districts for a water company to invest in projects of water supply. An inferring-parameters model performed by NSGA-II obtains marginal information from the decision makers with more disagreement, which supports the stage of parameter modification in correspondence with the preferences of the whole group (collective ranking). This paper shows how the inferring-parameters model may be used to help decision makers with different interests iteratively reach an agreement on how to rank the cities, reflecting the preferences at the individual level and at the collective level.

9 citations

Journal ArticleDOI
TL;DR: An original proposal for modeling multicriteria situations where multiple evaluators take part in the evaluation process; it allows each evaluator to have its own set of criteria and avoids the usual inconsistency of adopting pre-processing compensatory methods as input to non-compensatory algorithms.
Abstract: Highlights: This paper describes an original proposal for modeling multicriteria problems that take into account more than one evaluator. It allows each evaluator to have its own set of criteria. It also avoids the inconsistency of feeding compensatory techniques into non-compensatory algorithms. Goal: This paper describes an original proposal for modeling multicriteria situations where multiple evaluators take part in the evaluation process. This proposal allows each evaluator to have its own set of criteria, including their weights, and also avoids the usual inconsistency of adopting pre-processing compensatory methods as input to non-compensatory algorithms. Design / Methodology / Approach: In order to better describe how ELECTRE ME works, a multicriteria, multiple-evaluator situation is modeled with ELECTRE TRI ME (as we have called the ELECTRE TRI variation that incorporates the principles of multiple evaluators). Results: ELECTRE ME was able to avoid the inconsistency of adopting contradictory mechanisms of aggregating preferences while modeling multicriteria, multiple-evaluator problems (referred to here as MCDA-ME). Limitations: Although the proposal focuses on situations with multiple evaluators, there is no restriction on its application in situations where there is only one decision maker. Practical implications: Another important feature of ELECTRE ME is that it allows each evaluator to consider its own set of criteria and its own evaluation scale. Originality / Value: ELECTRE ME avoids the contradictory approach of using compensatory algorithms (such as the weighted mean) as input to non-compensatory outranking methods. Although the non-compensatory principle is at the heart of the ELECTRE methods, we have not found a previous proposal with the attributes shown in this study: to incorporate outranking concepts in situations where more than one evaluator is present and, by extension, to allow each evaluator to have its own set of criteria.
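ELECTRE ME's own formulas are not given in this abstract, so the sketch below is only an illustration of the non-compensatory principle it appeals to: a classic ELECTRE-style concordance index computed separately for each evaluator, with each evaluator keeping its own hypothetical criteria, weights, and scale rather than being merged through a compensatory pre-aggregation such as a weighted mean.

def concordance(a_scores, b_scores, weights):
    """ELECTRE-style concordance index c(a, b): the share of criterion
    weight supporting 'a is at least as good as b'. Only the direction of
    each comparison counts, never the size of the gap, which is the
    non-compensatory principle the abstract refers to."""
    supporting = sum(w for crit, w in weights.items() if a_scores[crit] >= b_scores[crit])
    return supporting / sum(weights.values())

# Hypothetical data: each evaluator has its own criteria, weights, and scale.
evaluators = {
    "E1": {"weights": {"quality": 0.5, "service": 0.5},
           "scores": {"a1": {"quality": 6, "service": 7},
                      "a2": {"quality": 8, "service": 5}}},
    "E2": {"weights": {"quality": 0.7, "delivery": 0.3},
           "scores": {"a1": {"quality": 60, "delivery": 90},
                      "a2": {"quality": 70, "delivery": 40}}},
}

for name, ev in evaluators.items():
    c = concordance(ev["scores"]["a1"], ev["scores"]["a2"], ev["weights"])
    print(f"{name}: c(a1, a2) = {c:.2f}")  # E1: 0.50, E2: 0.30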

8 citations

References
Book ChapterDOI
01 Jan 1985
TL;DR: Analytic Hierarchy Process (AHP) as mentioned in this paper is a systematic procedure for representing the elements of any problem hierarchically, which organizes the basic rationality by breaking down a problem into its smaller constituent parts and then guides decision makers through a series of pairwise comparison judgments to express the relative strength or intensity of impact of the elements in the hierarchy.
Abstract: This chapter provides an overview of Analytic Hierarchy Process (AHP), which is a systematic procedure for representing the elements of any problem hierarchically. It organizes the basic rationality by breaking down a problem into its smaller constituent parts and then guides decision makers through a series of pair-wise comparison judgments to express the relative strength or intensity of impact of the elements in the hierarchy. These judgments are then translated to numbers. The AHP includes procedures and principles used to synthesize the many judgments to derive priorities among criteria and subsequently for alternative solutions. It is useful to note that the numbers thus obtained are ratio scale estimates and correspond to so-called hard numbers. Problem solving is a process of setting priorities in steps. One step decides on the most important elements of a problem, another on how best to repair, replace, test, and evaluate the elements, and another on how to implement the solution and measure performance.
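The pairwise-comparison-to-priorities step this abstract describes is usually carried out with the principal-eigenvector method. The matrix below is a hypothetical set of 1-9 judgments, not taken from the chapter; the sketch simply shows the standard derivation of a normalised priority vector with NumPy.

import numpy as np

# Hypothetical reciprocal pairwise comparison matrix on the 1-9 scale for
# three criteria: entry (i, j) says how strongly criterion i dominates j,
# and A[j, i] = 1 / A[i, j].
A = np.array([
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
])

# AHP priorities: the principal (largest-eigenvalue) eigenvector of A,
# normalised so the priorities sum to 1.
eigenvalues, eigenvectors = np.linalg.eig(A)
principal = np.abs(eigenvectors[:, eigenvalues.real.argmax()].real)
priorities = principal / principal.sum()
print(priorities.round(3))  # roughly [0.648, 0.230, 0.122]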

16,547 citations

Journal ArticleDOI
TL;DR: A method of scaling ratios using the principal eigenvector of a positive pairwise comparison matrix is investigated, showing that λmax = n is a necessary and sufficient condition for consistency.
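The λmax = n condition in the TL;DR is what Saaty's consistency index builds on: CI = (λmax − n)/(n − 1) is zero exactly for a consistent matrix and grows with the inconsistency of the judgments. A small self-contained check, using hypothetical matrices:

import numpy as np

def consistency_index(A):
    """Saaty's CI = (lambda_max - n) / (n - 1) for a positive reciprocal
    pairwise comparison matrix. CI is 0 exactly when lambda_max = n,
    i.e. when the judgments are perfectly consistent."""
    n = A.shape[0]
    lambda_max = np.linalg.eigvals(A).real.max()
    return (lambda_max - n) / (n - 1)

# Perfectly consistent judgments (exact ratios of the weights 4:2:1).
consistent = np.array([[1, 2, 4], [1 / 2, 1, 2], [1 / 4, 1 / 2, 1]], dtype=float)
# Slightly perturbed judgments: lambda_max now exceeds n, so CI > 0.
perturbed = np.array([[1, 3, 5], [1 / 3, 1, 2], [1 / 5, 1 / 2, 1]], dtype=float)
print(consistency_index(consistent), consistency_index(perturbed))  # ~0 and a small positive value

In practice CI is divided by Saaty's random index for the same n to obtain the consistency ratio used as an acceptance threshold.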

8,117 citations


Additional excerpts

  • ...In approaches that are related to multicriteria decision analysis, we found that, in the late 1970s, Saaty (1977, 1980) developed the analytic hierarchy process, which became an important approach to multicriteria decision making....

    [...]

Book
05 Sep 2011
TL;DR: The present article is a commencement at attempting to remedy this deficiency of scientific correlation, and the meaning and working of the various formulæ have been explained sufficiently, it is hoped, to render them readily usable even by those whose knowledge of mathematics is elementary.
Abstract: All knowledge—beyond that of bare isolated occurrence—deals with uniformities. Of the latter, some few have a claim to be considered absolute, such as mathematical implications and mechanical laws. But the vast majority are only partial; medicine does not teach that smallpox is inevitably escaped by vaccination, but that it is so generally; biology has not shown that all animals require organic food, but that nearly all do so; in daily life, a dark sky is no proof that it will rain, but merely a warning; even in morality, the sole categorical imperative alleged by Kant was the sinfulness of telling a lie, and few thinkers since have admitted so much as this to be valid universally. In psychology, more perhaps than in any other science, it is hard to find absolutely inflexible coincidences; occasionally, indeed, there appear uniformities sufficiently regular to be practically treated as laws, but infinitely the greater part of the observations hitherto recorded concern only more or less pronounced tendencies of one event or attribute to accompany another. Under these circumstances, one might well have expected that the evidential evaluation and precise mensuration of tendencies had long been the subject of exhaustive investigation and now formed one of the earliest sections in a beginner’s psychological course. Instead, we find only a general naı̈ve ignorance that there is anything about it requiring to be learnt. One after another, laborious series of experiments are executed and published with the purpose of demonstrating some connection between two events, wherein the otherwise learned psychologist reveals that his art of proving and measuring correspondence has not advanced beyond that of lay persons. The consequence has been that the significance of the experiments is not at all rightly understood, nor have any definite facts been elicited that may be either confirmed or refuted. The present article is a commencement at attempting to remedy this deficiency of scientific correlation. With this view, it will be strictly confined to the needs of practical workers, and all theoretical mathematical demonstrations will be omitted; it may, however, be said that the relations stated have already received a large amount of empirical verification. Great thanks are due from me to Professor Haussdorff and to Dr. G. Lipps, each of whom have supplied a useful theorem in polynomial probability; the former has also very kindly given valuable advice concerning the proof of the important formulæ for elimination of ‘‘systematic deviations.’’ At the same time, and for the same reason, the meaning and working of the various formulæ have been explained sufficiently, it is hoped, to render them readily usable even by those whose knowledge of mathematics is elementary. The fundamental procedure is accompanied by simple imaginary examples, while the more advanced parts are illustrated by cases that have actually occurred in my personal experience. For more abundant and positive exemplification, the reader is requested to refer to the under cited research, which is entirely built upon the principles and mathematical relations here laid down. In conclusion, the general value of the methodics recommended is emphasized by a brief criticism of the best correlational work hitherto made public, and also the important question is discussed as to the number of ‘‘cases’’ required for an experimental series.
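The "various formulæ" the abstract alludes to include the rank correlation coefficient now commonly named after Spearman. As a reminder (not a quotation from the text), for two rankings of n items without ties it reduces to rho = 1 - 6 * sum(d_i^2) / (n * (n^2 - 1)), where d_i is the difference between the two rank positions of item i:

def spearman_rho(rank_x, rank_y):
    """Spearman's rank correlation for two untied rankings:
    rho = 1 - 6 * sum(d_i ** 2) / (n * (n ** 2 - 1))."""
    n = len(rank_x)
    d_squared = sum((rx - ry) ** 2 for rx, ry in zip(rank_x, rank_y))
    return 1 - 6 * d_squared / (n * (n ** 2 - 1))

# Two judges ranking five items (1 = best) in mostly similar orders.
print(spearman_rho([1, 2, 3, 4, 5], [2, 1, 3, 5, 4]))  # 0.8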

3,687 citations

Journal ArticleDOI
01 Jul 1952
TL;DR: Originally published in 1951, Social Choice and Individual Values introduced "Arrow's Theorem" and founded the field of social choice theory in economics and political science; a new edition, with a foreword by Nobel laureate Eric Maskin, reintroduces Arrow's seminal book to a new generation of students and researchers.
Abstract: Originally published in 1951, Social Choice and Individual Values introduced "Arrow's Impossibility Theorem" and founded the field of social choice theory in economics and political science. This new edition, including a new foreword by Nobel laureate Eric Maskin, reintroduces Arrow's seminal book to a new generation of students and researchers. "Far beyond a classic, this small book unleashed the ongoing explosion of interest in social choice and voting theory. A half-century later, the book remains full of profound insight: its central message, 'Arrow's Theorem,' has changed the way we think."-Donald G. Saari, author of Decisions and Elections: Explaining the Unexpected

1,380 citations


"Accentuating the rank positions in ..." refers background or methods in this paper

  • ...Although ordinal rankings play an important role in voting and elections, an added challenge is Arrow’s fundamental “impossibility” theorem (Arrow, 1963), stating that no voting scheme can guarantee five natural fairness properties: universal domain, transitivity, unanimity, independence with respect to irrelevant alternatives (referred to in Kemeny and Snell’s model as rank reversal), and nondictatorship....

    [...]

  • ..., 2008), this happens because, according to Arrow’s impossibility theorem (Arrow, 1963), there is no aggregate ranking that satisfies simultaneously several necessary fair representation conditions, where the concept of “fair” eludes precise and formal definition, which accounts for the numerous alternative models and interpretations that exist for group decision making. In the aggregation of ordinal preferences to form a consensus, Cook (2006) examines the main issue of consensus among ordinal rankings of a set of alternatives and the various ad hoc procedures developed over time. The problem of deriving a “consensus ranking” from preferences provided in pairwise formats was first examined by Kemeny and Snell (1962). They studied the group ranking problem with ordinal preferences only, and proposed an axiomatic approach for dealing with ordinal preferences. Their model seeks an optimal group ranking that minimizes the number of reversed preferences. This model, however, has an important drawback: It is computationally prohibitive to solve (NP-hard; Hochbaum and Levin, 2006). Although ordinal rankings play an important role in voting and elections, an added challenge is Arrow’s fundamental “impossibility” theorem (Arrow, 1963), stating that no voting scheme can guarantee five natural fairness properties: universal domain, transitivity, unanimity, independence with respect to irrelevant alternatives (referred to in Kemeny and Snell’s model as rank reversal), and nondictatorship. García-Lapresta and Pérez-Román (2011) use the Kemeny metric to measure distance between orders and introduce a class of consensus measures based on the distances among individual weak orders....

    [...]

  • ...Arrow and Raynaud (1986) considered the problem in which rankings provided, for example, by a group of evaluators must be combined into a common group ranking....

    [...]
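The excerpts above lean on Kemeny and Snell's model, in which the consensus ranking minimises the total number of pairwise preference reversals with respect to the individual rankings, and they note that finding it is NP-hard. The brute-force sketch below is therefore only workable for a handful of alternatives; it illustrates the definition, not the algorithms used in practice.

from itertools import combinations, permutations

def kemeny_distance(r1, r2):
    """Number of pairs of alternatives the two rankings order differently
    (each ranking maps an alternative to its rank position)."""
    return sum(1 for a, b in combinations(r1, 2)
               if (r1[a] < r1[b]) != (r2[a] < r2[b]))

def kemeny_consensus(rankings):
    """Exhaustive Kemeny-Snell consensus: the linear order minimising the
    total distance to the individual rankings. The problem is NP-hard in
    general, so enumeration is only feasible for very few alternatives."""
    alternatives = list(rankings[0])
    best, best_cost = None, float("inf")
    for order in permutations(alternatives):
        candidate = {alt: pos for pos, alt in enumerate(order)}
        cost = sum(kemeny_distance(candidate, r) for r in rankings)
        if cost < best_cost:
            best, best_cost = order, cost
    return best, best_cost

rankings = [{"A": 0, "B": 1, "C": 2},   # A > B > C
            {"A": 0, "C": 1, "B": 2},   # A > C > B
            {"B": 0, "A": 1, "C": 2}]   # B > A > C
print(kemeny_consensus(rankings))       # ('A', 'B', 'C') with total distance 2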

Reference EntryDOI
15 Jul 2005
TL;DR: The Analytic Hierarchy Process (AHP) as discussed by the authors is a theory of relative measurement of intangible criteria, where a scale of priorities is derived from pairwise comparison measurements only after the elements to be measured are known.
Abstract: The Analytic Hierarchy Process (AHP) is a theory of relative measurement of intangible criteria. With this approach to relative measurement, a scale of priorities is derived from pairwise comparison measurements only after the elements to be measured are known. The ability to do pairwise comparisons is our biological heritage and we need it to cope with a world where everything is relative and constantly changing, and thus there are no fixed standards to measure things on. In traditional measurement, one has a scale that one applies to measure any element that comes along that has the property the scale is for, and the elements are measured one by one, not by comparing them with each other. In the AHP, paired comparisons are made with judgments using numerical values taken from the AHP absolute fundamental scale of 1 to 9. A scale of relative values is derived from all these paired comparisons and it also belongs to an absolute scale that is invariant under the identity transformation like the system of real numbers. The AHP is useful for making multicriteria decisions involving benefits, opportunities, costs, and risks. The ideas are developed in stages and illustrated with examples of real-life decisions. The subject is transparent, and it is easy to understand why it is done the way it is along the lines discussed here. The AHP has a generalization to dependence and feedback, the Analytic Network Process (ANP), which is not discussed here. Keywords: analytic hierarchy process; decision making; prioritization; benefits; costs; complexity

946 citations