
Showing papers in "Social Choice and Welfare in 2008"


Journal ArticleDOI
TL;DR: It is demonstrated that winner selection in two prominent proportional representation voting systems is a computationally intractable problem—implying that these systems are impractical when the assembly is large—whereas in settings where the size of the assembly is constant, the problem can be solved in polynomial time.
Abstract: We demonstrate that winner selection in two prominent proportional representation voting systems is a computationally intractable problem—implying that these systems are impractical when the assembly is large. On a different note, in settings where the size of the assembly is constant, we show that the problem can be solved in polynomial time.

220 citations
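
The two systems are not named in the abstract, so the sketch below assumes a Chamberlin–Courant-style misrepresentation score purely for illustration. It makes the dichotomy concrete: enumerating all committees of size k is polynomial in the number of candidates when k is held constant, but the number of committees explodes when k grows with the assembly.

```python
from itertools import combinations

def best_committee(prefs, k):
    """Brute-force winner selection under an assumed Chamberlin-Courant-style
    score: each voter is represented by her best-ranked committee member, and
    the committee minimizing total misrepresentation (the sum of those ranks)
    wins.  With m candidates there are C(m, k) committees to check --
    polynomial for fixed k, superpolynomial when k grows with m."""
    candidates = range(len(prefs[0]))

    def misrepresentation(committee):
        return sum(min(voter.index(c) for c in committee) for voter in prefs)

    return min(combinations(candidates, k), key=misrepresentation)

# Three voters rank four candidates (best first).
prefs = [[0, 1, 2, 3], [1, 0, 3, 2], [2, 3, 0, 1]]
print(best_committee(prefs, 2))  # -> (0, 2) for this profile
```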


Journal ArticleDOI
TL;DR: Recently, some algorithms for computing the exact number of integer solutions in a system of linear constraints have been proposed in the social choice literature by Huang and Chua (Soc Choice Welfare 17:143–155 2000) and by Gehrlein (Soc Choice Welfare 19:503–512 2002; Rev Econ Des 9:317–336 2006), as discussed by the authors.
Abstract: In voting theory, analyzing the frequency of an event (e.g. a voting paradox), under some specific but widely used assumptions, is equivalent to computing the exact number of integer solutions in a system of linear constraints. Recently, some algorithms for computing this number have been proposed in social choice literature by Huang and Chua (Soc Choice Welfare 17:143–155 2000) and by Gehrlein (Soc Choice Welfare 19:503–512 2002; Rev Econ Des 9:317–336 2006). The purpose of this paper is threefold. Firstly, we want to do justice to Eugene Ehrhart, who, more than forty years ago, discovered the theoretical foundations of the above mentioned algorithms. Secondly, we present some efficient algorithms that have been recently developed by computer scientists, independently from voting theorists. Thirdly, we illustrate the use of these algorithms by providing some original results in voting theory.

109 citations
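
A toy instance of the counting problem, with the enumeration done by brute force (the surveyed algorithms instead compute the Ehrhart polynomial symbolically; the simplex below is our own example, not one from the paper):

```python
def lattice_points(t):
    """Count nonnegative integer solutions of x + y + z = t, a toy system of
    linear constraints of the kind that arises in voting probability
    calculations.  Ehrhart's theorem says this count is a polynomial
    (in general, a quasi-polynomial) in the dilation parameter t."""
    return sum(1 for x in range(t + 1) for y in range(t + 1 - x))  # z is then fixed

for t in range(1, 8):
    # The Ehrhart polynomial of this simplex is (t + 1)(t + 2) / 2.
    assert lattice_points(t) == (t + 1) * (t + 2) // 2
```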


Journal ArticleDOI
TL;DR: This work examines the incentives of an interest group to provide verifiable policy-relevant information to a political decision-maker and to exert political pressure on her and identifies several factors that induce risk proclivity.
Abstract: We examine the incentives of an interest group to provide verifiable policy-relevant information to a political decision-maker and to exert political pressure on her. In our view information provision is a risky attempt to affect the politician’s beliefs about the desirability of the lobby’s objective. The circumstances under which political pressure can be applied specify the lobby’s valuation of different beliefs of the politician and, thus, her attitude toward risk. We identify several factors that induce risk proclivity (and thus information provision), which allows us to explain the stylized fact that lobbies engage both in information provision and political pressure. Moreover, our approach gives a novel explanation for the fact that interest groups often try to provide information credibly. We finally study the extent to which this preference for credibility is robust and identify some instances in which lobbies may prefer to strategically withhold information.

79 citations


Journal ArticleDOI
TL;DR: It is shown that for conclusion- and premise-based aggregation rules to be mutually consistent, the aggregation must always be “oligarchic”, that is: unanimous within a subset of agents, and typically even be dictatorial.
Abstract: Generalizing the celebrated “discursive dilemma”, we analyze judgement aggregation problems in which a group of agents independently votes on a set of complex propositions (the “conclusions”) and on a set of “premises” by which the conclusions are truth-functionally determined. We show that for conclusion- and premise-based aggregation rules to be mutually consistent, the aggregation must always be “oligarchic”, that is: unanimous within a subset of agents, and typically even be dictatorial. We characterize exactly when consistent non-dictatorial (or anonymous) aggregation rules exist, allowing for arbitrary conclusions and arbitrary interdependencies among premises.

78 citations


Journal ArticleDOI
TL;DR: All of the six screening rules associated with the rule of k names that are used in reality violate stability if the voters do not act strategically, but it is shown that there are screening rules which satisfy stability.
Abstract: Barbera and Coelho (WP 264, CREA-Barcelona Economics, 2007) documented six screening rules associated with the rule of k names that are used by different institutions around the world. Here, we study whether these screening rules satisfy stability. A set is said to be a weak Condorcet set à la Gehrlein (Math Soc Sci 10:199–209) if no candidate in this set can be defeated by any candidate from outside the set on the basis of simple majority rule. We say that a screening rule is stable if it always selects a weak Condorcet set whenever such a set exists. We show that all of the six procedures which are used in reality do violate stability if the voters do not act strategically. We then show that there are screening rules which satisfy stability. Finally, we provide two results that can explain the widespread use of unstable screening rules.

74 citations


Journal ArticleDOI
Hans Gersbach1, Volker Hahn1
TL;DR: The authors examine whether the publication of the individual voting records of central-bank council members is socially beneficial when the public is unsure about the efficiency of central bankers and central bankers are angling for re-appointment.
Abstract: We examine whether the publication of the individual voting records of central-bank council members is socially beneficial when the public is unsure about the efficiency of central bankers and central bankers are angling for re-appointment. We show that publication is initially harmful since it creates a conflict between socially desirable and individually optimal behavior for somewhat less efficient central bankers. However, after re-appointment, losses will be lower when voting records are published since the government can distinguish highly efficient from less efficient central bankers more easily and can make central bankers individually accountable. In our model, the negative effects of voting transparency dominate, and expected overall losses are always larger when voting records are published.

60 citations


Journal ArticleDOI
TL;DR: It is shown that under the Impartial Culture assumption the probability that the Tideman winner is the Dodgson winner converges to 1 as the number of voters increases, but it is also shown that this convergence is not exponentially fast.
Abstract: It is known that Dodgson’s rule is computationally very demanding. Tideman (Soc Choice Welf 4:185–206, 1987) suggested an approximation to it but did not investigate how often his approximation selects the Dodgson winner. We show that under the Impartial Culture assumption the probability that the Tideman winner is the Dodgson winner converges to 1 as the number of voters increases. However, we show that this convergence is not exponentially fast. We suggest another approximation—we call it Dodgson Quick—for which this convergence is exponentially fast. We also show that the Simpson and Dodgson rules are asymptotically different.

58 citations
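
For concreteness, a small sketch of the two scores, using the definitions standard in this literature (the input format and normalizations are assumptions of the sketch): Tideman's score sums each rival's positive majority margin over a candidate, while Dodgson Quick replaces each margin by the number of voters who would actually have to switch, ceil(margin/2). The candidate with the lowest score wins.

```python
from math import ceil

def tideman_and_dq(pairwise):
    """pairwise[d][c] = number of voters preferring d to c (assumed format).
    Returns (Tideman scores, Dodgson Quick scores); lower is better."""
    m = len(pairwise)
    tideman, dq = [], []
    for c in range(m):
        margins = [max(0, pairwise[d][c] - pairwise[c][d])
                   for d in range(m) if d != c]
        tideman.append(sum(margins))                  # Tideman's approximation
        dq.append(sum(ceil(x / 2) for x in margins))  # Dodgson Quick
    return tideman, dq

# Three candidates, pairwise tallies among five voters.
print(tideman_and_dq([[0, 3, 2], [2, 0, 4], [3, 1, 0]]))  # ([1, 1, 3], [1, 1, 2])
```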


Journal ArticleDOI
TL;DR: The results indicate that subjects punish even when they cannot alter the current distribution of payoffs, and despite its cost, punishment progressively improves welfare in association with a decrease in the aggregate level of inequality over time.
Abstract: This paper reports the results of an experiment that investigates the relationships between inequality and punishment. In particular, we analyze how inter-personal comparisons affect altruistic punishment behavior. In addition, we examine how punishment affects inequality over time. We compare two treatments of a two-stage public good game, one in which costly punishment reduces the immediate payoff inequality between the punisher and the target, and one in which it does not affect the current level of inequality. Our results indicate that subjects punish even when they cannot alter the current distribution of payoffs. We find, however, that in both treatments the intensity of punishment increases with the level of inequality. Finally, despite its cost, we show that punishment improves welfare in association with a decrease in the level of inequality over time.

49 citations


Journal ArticleDOI
TL;DR: It is shown that acyclic sets arising from this construction are distributive sublattices of the weak Bruhat order, and Fishburn’s alternating scheme is shown to be a special case of the Abello/Chameni-Nembua acyclic sets.
Abstract: We describe Abello’s acyclic sets of linear orders [SIAM J Discr Math 4(1):1–16, 1991] as the permutations visited by commuting equivalence classes of maximal reduced decompositions. This allows us to strengthen Abello’s structural result: we show that acyclic sets arising from this construction are distributive sublattices of the weak Bruhat order. This, in turn, shows that Abello’s acyclic sets are, in fact, the same as Chameni-Nembua’s distributive covering sublattices (S.T.D.C.s). Fishburn’s alternating scheme is shown to be a special case of the Abello/Chameni-Nembua acyclic sets. Any acyclic set that arises in this way can be represented by an arrangement of pseudolines, and we use this representation to derive a simple closed form for the cardinality of the alternating scheme. The higher Bruhat orders prove to be a natural mathematical framework for this approach to the acyclic sets problem.

49 citations


Journal ArticleDOI
TL;DR: This work focuses on two principles of distributional egalitarianism along the lines of the Pigou–Dalton transfer principle and the Lorenz domination principle, and shows that there exists no social evaluation relation satisfying one of these egalitarian principles and the weakened continuity and rationality axioms, even in the absence of the Pareto principle.
Abstract: There exists a utilitarian tradition à la Sidgwick of treating equal generations equally. Diamond showed that there exists no social evaluation ordering over infinite utility streams in the presence of the Pareto principle, the Sidgwick principle, and continuity. Instead of requiring the Sidgwick principle of procedural fairness, we focus on two principles of distributional egalitarianism along the lines of the Pigou–Dalton transfer principle and the Lorenz domination principle, and show that there exists no social evaluation relation satisfying one of these egalitarian principles and the weakened continuity and rationality axioms, even in the absence of the Pareto principle.

48 citations
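
For reference, the transfer principle invoked here has a standard formal statement (a textbook finite-dimensional version; the paper's adaptation to infinite utility streams may differ in detail):

```latex
% Pigou-Dalton transfer principle: a rank-preserving transfer of
% \varepsilon from a better-off individual j to a worse-off individual i
% yields a distribution that is socially at least as good.
\[
  x_i < x_j, \quad 0 < \varepsilon \le \tfrac{x_j - x_i}{2}, \quad
  y_i = x_i + \varepsilon, \quad y_j = x_j - \varepsilon, \quad
  y_k = x_k \ (k \ne i, j)
  \;\Longrightarrow\; y \succsim x.
\]
```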


Journal ArticleDOI
TL;DR: It is shown that substantial time discounting can arise from the planner’s taste for catastrophe avoidance, even if the probability of the world ending is infinitesimally small.
Abstract: A social welfare function treating all generations equally is derived from a set of axioms that allow for preferences for catastrophe avoidance or risk equity. Implications for the case where there is a risk of world extinction are studied. We show that substantial time discounting can arise from the planner’s taste for catastrophe avoidance, even if the probability of the world ending is infinitesimally small.

Journal ArticleDOI
TL;DR: This work defines two families of rules to adjudicate conflicting claims and identifies the subfamily of consistent rules that are obtained by exchanging, for each claims problem, how well agents with relatively larger claims are treated as compared to agents with relatively smaller claims.
Abstract: We define two families of rules to adjudicate conflicting claims. The first family contains the constrained equal awards, constrained equal losses, Talmud, and minimal overlap rules. The second family, which also contains the first two of these rules, is obtained from the first family by exchanging, for each claims problem, how well agents with relatively larger claims are treated as compared to agents with relatively smaller claims. In each case, we identify the subfamily of consistent rules.
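
The two rules anchoring the first family admit very short implementations. A minimal sketch under their standard definitions (the bisection and the duality construction are implementation choices, not the paper's):

```python
def cea(claims, estate, tol=1e-9):
    """Constrained equal awards: everyone receives the same amount lam,
    capped by her claim, where lam solves sum(min(c, lam)) = estate
    (found here by bisection)."""
    lo, hi = 0.0, max(claims)
    while hi - lo > tol:
        lam = (lo + hi) / 2
        if sum(min(c, lam) for c in claims) < estate:
            lo = lam
        else:
            hi = lam
    return [min(c, lo) for c in claims]

def cel(claims, estate):
    """Constrained equal losses: equal losses relative to claims, truncated
    at zero -- the dual of CEA, obtained by rationing the total loss."""
    losses = cea(claims, sum(claims) - estate)
    return [c - l for c, l in zip(claims, losses)]

print(cea([100, 200, 300], 300))  # approximately [100, 100, 100]
print(cel([100, 200, 300], 300))  # approximately [0, 100, 200]
```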

Journal ArticleDOI
TL;DR: This paper proposes an axiomatic model that connects group beliefs to the beliefs of the group members, and derives group beliefs in a simple multiplicative form if people’s information is independent and in a more complex form if information overlaps arbitrarily.
Abstract: If a group is modelled as a single Bayesian agent, what should its beliefs be? I propose an axiomatic model that connects group beliefs to beliefs of the group members. The group members may have different information, different prior beliefs and even different domains (algebras) within which they hold beliefs, accounting for differences in awareness and conceptualisation. As is shown, group beliefs can incorporate all information spread across individuals without individuals having to explicitly communicate their information (that may be too complex or personal to describe, or not describable in principle in the language). The group beliefs derived here take a simple multiplicative form if people’s information is independent (and a more complex form if information overlaps arbitrarily). This form contrasts with familiar linear or geometric opinion pooling and the (Pareto) requirement of respecting unanimous beliefs.
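
The "simple multiplicative form" can be displayed schematically. The following is an illustration of multiplicative pooling relative to a shared prior, not necessarily the paper's exact formula:

```latex
% Schematic multiplicative pooling: relative to a common prior \pi, each
% member's likelihood ratio enters once, so independent private information
% is aggregated without double-counting.
\[
  P_{\mathrm{group}}(A) \;\propto\; \pi(A) \prod_{i=1}^{n} \frac{P_i(A)}{\pi(A)}.
\]
```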

Journal ArticleDOI
TL;DR: This paper proposes a partial equality-of-opportunity ordering based on the inequality-of-opportunity curve, a mechanism that gives preference to those who are worse off in terms of opportunity, and provides a complete ordering that depends on a sensitivity parameter representing the degree of priority in the equality-of-opportunity policy.
Abstract: This paper proposes a partial equality-of-opportunity ordering based on the inequality-of-opportunity curve, a mechanism that gives preference to those who are worse off in terms of opportunity. Moreover, it provides a complete ordering that depends on a sensitivity parameter representing the degree of priority in the equality-of-opportunity policy. The Moreno-Ternero approach is obtained as a particular case. This proposal is applied to a set of 12 countries to compare their degree of equality of opportunity. Results show the relevance for economic policy of observing inequality of opportunity over tranches. Denmark dominates, in terms of post-tax income, all other economies in our sample.

Journal ArticleDOI
TL;DR: A numerical scheme for computing the Banzhaf swing probability when votes are neither equiprobable nor independent and a modified square-root rule for two-tier voting systems that takes into account both the homogeneity and the size of constituencies is provided.
Abstract: This paper discusses a numerical scheme for computing the Banzhaf swing probability when votes are neither equiprobable nor independent. Examples indicate a substantial bias in the Banzhaf measure of voting power if neither assumption is met. The analytical part derives the exact magnitude of the bias due to the common probability of an affirmative vote deviating from one half and due to common correlation in unweighted simple-majority games. The former bias is polynomial, the latter is linear. A modified square-root rule for two-tier voting systems that takes into account both the homogeneity and the size of constituencies is also provided.
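
A Monte Carlo sketch makes the bias concrete. The common-shock correlation device below is an assumption chosen for illustration; the paper's numerical scheme is analytical rather than simulation-based.

```python
import random

def swing_probability(weights, quota, voter, p=0.5, rho=0.0, trials=200_000):
    """Estimate the probability that `voter` is pivotal in a weighted voting
    game when each vote is 'yes' with probability p and votes are positively
    correlated through a simple common shock (an illustrative assumption).
    With p = 1/2 and rho = 0 this is the usual Banzhaf swing probability."""
    swings = 0
    for _ in range(trials):
        shock = random.random()
        votes = [(shock if random.random() < rho else random.random()) < p
                 for _ in weights]
        others = sum(w for i, (w, v) in enumerate(zip(weights, votes))
                     if v and i != voter)
        # Pivotal iff the voter's 'yes' turns a losing coalition into a winner.
        swings += (others < quota <= others + weights[voter])
    return swings / trials

print(swing_probability([4, 3, 2, 1], quota=6, voter=0, p=0.6, rho=0.3))
```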

Journal ArticleDOI
TL;DR: This work identifies a problem that generalizes Sen’s ‘liberal paradox’: the assignment of rights to two or more individuals or subgroups is inconsistent with the unanimity principle, under which unanimously accepted propositions are collectively accepted.
Abstract: In the emerging literature on judgment aggregation over logically connected propositions, expert rights or liberal rights have not been investigated yet. A group making collective judgments may assign individual members or subgroups with expert knowledge on, or particularly affected by, certain propositions the right to determine the collective judgment on those propositions. We identify a problem that generalizes Sen’s ‘liberal paradox’. Under plausible conditions, the assignment of rights to two or more individuals or subgroups is inconsistent with the unanimity principle, whereby unanimously accepted propositions are collectively accepted. The inconsistency can be avoided if individual judgments or rights satisfy special conditions.

Journal ArticleDOI
TL;DR: A series of fairness notions of decreasing restrictiveness that are based on Rawls’ maximin equity criterion and impose welfare lower bounds are considered and it is shown that the corresponding mechanisms generate the smallest deficit for each economy among all k-fair Groves mechanisms.
Abstract: We study allocation problems in which a costly task is to be assigned and money transfers are used to achieve fairness among agents. We consider a series of fairness notions (k-fairness for \({k \in \{1,\dots,n\}}\) where n is the number of agents) of decreasing restrictiveness that are based on Rawls’ maximin equity criterion and impose welfare lower bounds. These fairness notions were introduced by Porter et al. (J Econ Theory 118:209–228, 2004) who also introduced two classes of Groves mechanisms that are 1-fair and 3-fair, respectively, and generate deficits that are bounded above. We show that these classes are the largest such classes of Groves mechanisms. We generalize these mechanisms for each \({k \in \{2,\dots,n\}}\) and show that the corresponding mechanisms generate the smallest deficit for each economy among all k-fair Groves mechanisms.

Journal ArticleDOI
TL;DR: This paper discusses and characterizes a distance function on the set of quasi choice functions that extends Kemeny’s use of the symmetric difference distance to set functions and hence to a more general model of choice.
Abstract: This paper discusses and characterizes a distance function on the set of quasi choice functions. The derived distance function is in the spirit of the widely used Kemeny metric on binary relations but extends Kemeny’s use of the symmetric difference distance to set functions and hence to a more general model of choice.
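
The construction is easy to state concretely. A hedged sketch, representing a quasi choice function simply as a map from menus to chosen subsets (the paper's exact domain conditions are omitted):

```python
def choice_distance(f, g):
    """Symmetric-difference distance between two (quasi) choice functions in
    the spirit of the Kemeny metric: sum, over menus S in the common domain,
    of |f(S) symmetric-difference g(S)| (assumed form of the metric)."""
    return sum(len(f[S] ^ g[S]) for S in f)

f = {frozenset('ab'): {'a'}, frozenset('abc'): {'a', 'b'}}
g = {frozenset('ab'): {'b'}, frozenset('abc'): {'a'}}
print(choice_distance(f, g))  # |{a,b}| + |{b}| = 3
```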

Journal ArticleDOI
Mark Fey1
TL;DR: This work considers the size of several of these tournament solutions in tournaments with a large but finite number of alternatives, and finds that with probability approaching one, the top cycle set, the uncovered set, and the Banks set are equal to the entire set of alternatives in a randomly chosen large tournament.
Abstract: A tournament can be viewed as a majority preference relation without ties on a set of alternatives. In this way, voting rules based on majority comparisons are equivalent to methods of choosing from a tournament. We consider the size of several of these tournament solutions in tournaments with a large but finite number of alternatives. Our main result is that with probability approaching one, the top cycle set, the uncovered set, and the Banks set are equal to the entire set of alternatives in a randomly chosen large tournament. That is to say, each of these tournament solutions almost never rules out any of the alternatives under consideration. We also discuss some implications and limitations of this result.
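
The flavor of the main result is easy to reproduce by simulation. Below, the top cycle is computed by reachability (a standard characterization); the uncovered and Banks sets are omitted for brevity.

```python
import random

def top_cycle(beats):
    """Top cycle of a tournament: the alternatives from which every other
    alternative is reachable along majority-preference edges.
    beats[i][j] == True means alternative i beats alternative j."""
    n = len(beats)
    reach = [{i} for i in range(n)]
    changed = True
    while changed:  # transitive closure by fixed point
        changed = False
        for i in range(n):
            for j in list(reach[i]):
                for k in range(n):
                    if beats[j][k] and k not in reach[i]:
                        reach[i].add(k)
                        changed = True
    return [i for i in range(n) if len(reach[i]) == n]

# A uniformly random tournament on 50 alternatives.
n = 50
beats = [[False] * n for _ in range(n)]
for i in range(n):
    for j in range(i + 1, n):
        w = random.random() < 0.5
        beats[i][j], beats[j][i] = w, not w
print(len(top_cycle(beats)))  # almost always 50, matching the asymptotic result
```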

Journal ArticleDOI
TL;DR: An analysis of Amartya Sen’s capability approach shows that some functionings are not only the result of capabilities, but also their prerequisite, resulting in a mutual dependency between capabilities and functionings.
Abstract: Amartya Sen’s capability approach has recently been widely discussed as a theoretical basis for making resource allocation decisions in health care. The purpose of this paper is to analyze the relationship between capabilities and functionings in the capability approach. The paper shows that some functionings are not only the result of capabilities, but also their prerequisite. That is, there is a dual role of some functionings as both ends and instruments, resulting in a mutual dependency between capabilities and functionings. Functionings may be a direct requirement for capabilities, but also an indirect one because they ensure the absence of mental disorders or negative thoughts, both of which are relevant constraints on freedom. This has important implications. It supports a policy that ensures for everyone an initial endowment of (1) mental and physical health, (2) education, and (3) other functionings with a direct or indirect impact on capabilities.

Journal ArticleDOI
TL;DR: It is proved that in pure exchange economies with two agents and a finite number of goods, a social choice function is strategy-proof and Pareto-efficient if and only if it is dictatorial.
Abstract: In this paper we prove that in pure exchange economies with two agents and a finite number of goods, a social choice function on the Cobb-Douglas preference domain is strategy-proof and Pareto-efficient if and only if it is dictatorial. Indeed, we establish substantially stronger results: the impossibilities hold on arbitrarily small one-dimensional intervals of Cobb-Douglas preferences.

Journal ArticleDOI
Fuhito Kojima1
TL;DR: This work investigates strategic behavior of students under the Boston mechanism when schools may have complex priority structures and shows that any outcome of a Nash equilibrium is a stable matching when the school priorities are substitutable.
Abstract: The Boston mechanism is a centralized student assignment mechanism used in many school districts in the US. We investigate strategic behavior of students under the Boston mechanism when schools may have complex priority structures. We show that a stable matching is supported as an outcome of a Nash equilibrium under a general environment. We further show that any outcome of a Nash equilibrium is a stable matching when the school priorities are substitutable.
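
A minimal sketch of the mechanism itself, following its standard description (the strict priority orders and the data format are assumptions of the sketch):

```python
def boston(prefs, priorities, capacity):
    """Boston mechanism, standard description: in round r, every unassigned
    student applies to the r-th school on her list; each school permanently
    admits applicants in priority order up to its remaining capacity."""
    assigned = {}
    remaining = dict(capacity)
    for r in range(max(len(p) for p in prefs)):
        for school in remaining:
            applicants = [s for s in range(len(prefs))
                          if s not in assigned
                          and r < len(prefs[s]) and prefs[s][r] == school]
            applicants.sort(key=priorities[school].index)
            admitted = applicants[:remaining[school]]
            for s in admitted:
                assigned[s] = school
            remaining[school] -= len(admitted)
    return assigned

# Two seats at school 'A', one at 'B'; three students.
print(boston([['A', 'B'], ['A', 'B'], ['B', 'A']],
             {'A': [0, 1, 2], 'B': [2, 1, 0]},
             {'A': 2, 'B': 1}))  # {0: 'A', 1: 'A', 2: 'B'}
```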

Journal ArticleDOI
TL;DR: In this paper, a subfamily of standards of comparison (monotonic standards) is used to construct scheduling methods for claims problems with indivisible goods, that is, problems in which a certain number of indivisible units of a homogeneous good has to be distributed among a group of agents when this amount is not enough to satisfy the agents' demands.
Abstract: In this work we deal with rationing problems, in particular with claims problems with indivisible goods, that is, problems in which a certain amount of indivisible units (of a homogeneous good) has to be distributed among a group of agents when this amount is not enough to satisfy agents’ demands. We use a subfamily of standards of comparison (monotonic standards) to construct scheduling methods to solve this type of problem. The rules constructed in this way can be interpreted as discrete versions of the constrained equal awards and constrained equal losses rules for the case in which the good is perfectly divisible. They not only enjoy similar properties, but also have stronger relations with the cea and cel rules in terms of expected values and the size of indivisibility.
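
The discrete constrained-equal-awards idea can be sketched in a few lines (the tie-breaking order is our assumption; the paper derives such methods from monotonic standards of comparison):

```python
def discrete_cea(claims, units):
    """Hand out indivisible units one at a time, always to an agent whose
    current award is lowest among those still below their claim -- a discrete
    analogue of the constrained equal awards rule (ties broken by index)."""
    award = [0] * len(claims)
    for _ in range(units):
        eligible = [i for i, c in enumerate(claims) if award[i] < c]
        if not eligible:
            break  # all claims satisfied
        recipient = min(eligible, key=lambda j: award[j])
        award[recipient] += 1
    return award

print(discrete_cea([1, 3, 5], 6))  # [1, 3, 2] under this tie-breaking
```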

Journal ArticleDOI
TL;DR: Following an interpretation of bankruptcy problems in terms of TU games, it is shown that the Minimal Overlap Value is the unique solution for bankruptcy games which satisfies Anonymity and Core Transition Responsiveness.
Abstract: This paper provides an analysis of the Minimal Overlap Rule, a solution for bankruptcy problems introduced by O’Neill (1982). We point out that this rule can be understood as a composition of Ibn Ezra’s proposal and the recommendation given by the Constrained Equal Loss Rule. Following an interpretation of bankruptcy problems in terms of TU games, we show that the Minimal Overlap Value is the unique solution for bankruptcy games which satisfies Anonymity and Core Transition Responsiveness.

Journal ArticleDOI
TL;DR: In this article, the authors considered one-to-one, one-sided matching problems in which agents can either be matched as pairs or remain single, and they introduced a so-called bi-choice graph for each pair of stable matchings.
Abstract: We consider one-to-one, one-sided matching (roommate) problems in which agents can either be matched as pairs or remain single. We introduce a so-called bi-choice graph for each pair of stable matchings and characterize its structure. Exploiting this structure we obtain as a corollary the “lone wolf” theorem and a decomposability result. The latter result together with transitivity of blocking leads to an elementary proof of the so-called stable median matching theorem, showing how the often incompatible concepts of stability (represented by the political economist Adam Smith) and fairness (represented by the political philosopher John Rawls) can be reconciled for roommate problems. Finally, we extend our results to two-sided matching problems.

Journal ArticleDOI
Shin Sato1
TL;DR: It is proved that there is no social choice correspondence satisfying anonymity, neutrality, a range condition, and either of the concepts of strategy-proofness.
Abstract: We introduce two new concepts of strategy-proofness for social choice correspondences based on the theory of preferences over sets of alternatives under complete uncertainty. One is based on Pattanaik and Peleg (Soc Choice Welf 1:113–122, 1984) and the other is based on Bossert et al. (Econ Theory 16:295–312, 2000). We prove that there is no social choice correspondence satisfying anonymity, neutrality, a range condition, and either of our concepts of strategy-proofness.

Journal ArticleDOI
TL;DR: Scale invariance is an invariance condition that applies when all amounts are multiplied by a constant (without a change of units); this paper analyzes the consequences of several such invariance conditions.
Abstract: A frequent motivation for the use of scale invariance in the bankruptcy literature is that it imposes that the outcome of a bankruptcy problem does not depend on the units of measurement. We show that this interpretation is not correct. Scale invariance is an invariance condition that applies when all amounts are multiplied by a constant (without change of units). With this interpretation in mind, it is natural to consider other invariance conditions, for example one that applies when all amounts are increased by the same constant. In this paper, we analyze the consequences of several invariance conditions.
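
The literal content of scale invariance, as the abstract describes it, is a homogeneity property that can be checked directly; the proportional rule below is only a stand-in example:

```python
def proportional(claims, estate):
    """Stand-in division rule: awards proportional to claims."""
    total = sum(claims)
    return [estate * c / total for c in claims]

def is_scale_invariant(rule, claims, estate, lam=2.5):
    """Scale invariance as stated in the abstract: multiplying every amount
    (all claims and the estate) by the same constant lam rescales the awards
    by lam -- note that no units of measurement are involved."""
    scaled = rule([lam * c for c in claims], lam * estate)
    rescaled = [lam * a for a in rule(claims, estate)]
    return all(abs(x - y) < 1e-9 for x, y in zip(scaled, rescaled))

print(is_scale_invariant(proportional, [100.0, 200.0, 300.0], 240.0))  # True
```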

Journal ArticleDOI
TL;DR: This paper proposes axiomatic foundations for individual fairness-motivated preferences that cover most of the models developed to rationalise observed behaviour in experiments and proposes a simple functional form in which the weight on each person’s payoff depends on a reference vector of how much each person deserves.
Abstract: Much work in social choice theory takes individual preferences as uninvestigated inputs into aggregation functions designed to reflect considerations of fairness. Advances in experimental and behavioural economics show that fairness can also be an important motivation in the preferences of individuals themselves. A proper characterisation of how fairness concerns enter such preferences can enrich the informational basis of many social choice exercises. This paper proposes axiomatic foundations for individual fairness-motivated preferences that cover most of the models developed to rationalise observed behaviour in experiments. These models fall into two classes: Outcome-based models, which see preferences as defined only over distributive outcomes, and context-dependent models, which allow rankings over distributive outcomes to change systematically with non-outcome factors. I accommodate outcome-based and context-sensitive fairness concerns by modelling fairness-motivated preferences as a reference-dependent preference structure. I first present a set of axioms and two theorems that generate commonly used outcome-based models as special cases. I then generalise the axiomatic basis to allow for reference-dependence, and derive a simple functional form in which the weight on each person’s payoff depends on a reference vector of how much each person deserves.

Journal ArticleDOI
TL;DR: It is shown that the wage of a manager is always at least as high as the wage of its subordinates but never exceeds the sum of the wages of its direct subordinates; for CES production functions, this implies that wage differences are maximal for linear production functions and minimal for Cobb–Douglas production functions.
Abstract: In this paper, we present a cooperative model of a hierarchically structured firm to study wage differences between different levels in such a firm. We consider a class of wage functions that are based on marginal contributions to production. It turns out that the wage of a manager is always at least as high as the wage of its subordinates. On the other hand, the wage of a manager never exceeds the sum of the wages of its direct subordinates. These bounds are sharp in the sense that we can characterize for which production processes they are reached. For the class of constant elasticity of substitution (CES) production functions this implies that the wage differences are maximal for linear production functions, and they are minimal for Cobb–Douglas production functions.
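
The two bounds stated in the abstract can be written compactly (the notation is ours):

```latex
% For a manager m with set of direct subordinates S(m): m earns at least as
% much as any of her subordinates, and never more than her direct
% subordinates combined.
\[
  w_i \;\le\; w_m \quad \text{for every subordinate } i \text{ of } m,
  \qquad
  w_m \;\le\; \sum_{i \in S(m)} w_i .
\]
```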

Journal ArticleDOI
TL;DR: In this paper, an extension of Harsanyi's Impartial Observer Theorem based on the representation of ignorance as the set of all possible probability distributions over individuals is proposed.
Abstract: We propose an extension of Harsanyi’s Impartial Observer Theorem based on the representation of ignorance as the set of all possible probability distributions over individuals. We obtain a characterization of the observer’s preferences that, under our most restrictive conditions, is a convex combination of Harsanyi’s utilitarian and Rawls’ egalitarian criteria. This representation is ethically meaningful, in the sense that individuals’ utilities are cardinally measurable and fully comparable. This allows us to conclude that the impartiality requirement cannot be used to decide between Rawls’ and Harsanyi’s positions.
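
The characterization admits a compact schematic display (our notation; the abstract describes the criterion as a convex combination of Harsanyi's utilitarian and Rawls' egalitarian criteria):

```latex
% Observer's evaluation of a utility profile u = (u_1, ..., u_n):
% a convex combination, for some \alpha in [0, 1], of the average-utilitarian
% term and the maximin (egalitarian) term.
\[
  W(u) \;=\; \alpha \cdot \frac{1}{n} \sum_{i=1}^{n} u_i
  \;+\; (1 - \alpha) \cdot \min_{1 \le i \le n} u_i .
\]
```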