Showing papers in "Synthese in 2004"


Book ChapterDOI
01 Mar 2004-Synthese
TL;DR: This work constructs logical languages which allow one to represent a variety of possible types of changes affecting the information states of agents in a multi-agent setting, formalizing these changes through a notion of epistemic program: a Kripke model of ‘actions’.
Abstract: We construct logical languages which allow one to represent a variety of possible types of changes affecting the information states of agents in a multi-agent setting. We formalize these changes by defining a notion of epistemic program. The languages are two-sorted sets that contain not only sentences but also actions or programs. This is as in dynamic logic, and indeed our languages are not significantly more complicated than dynamic logics. But the semantics is more complicated. In general, the semantics of an epistemic program is what we call a program model. This is a Kripke model of ‘actions’, representing the agents’ uncertainty about the current action in a similar way that Kripke models of ‘states’ are commonly used in epistemic logic to represent the agents’ uncertainty about the current state of the system. Program models induce changes affecting agents’ information, which we represent as changes of the state model, called epistemic updates. Formally, an update consists of two operations: the first is called the update map, and it takes every state model to another state model, called the updated model; the second gives, for each input state model, a transition relation between the states of that model and the states of the updated model.
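
As a concrete illustration of the update construction described above, the following Python sketch builds the restricted product of a state model with an action model; the data structures and the product_update function are my own illustrative choices, not the authors' formalism.

# Minimal sketch of an epistemic update as a restricted product of a state
# model with an action ("program") model. Names and data structures here are
# illustrative assumptions, not the authors' notation.
def product_update(state_model, action_model):
    worlds, rel, val = state_model          # rel: agent -> set of (w, v) pairs
    actions, arel, pre = action_model       # pre: action -> set of worlds where it can occur
    new_worlds = [(w, a) for w in worlds for a in actions if w in pre[a]]
    new_rel = {
        i: {((w, a), (v, b))
            for (w, a) in new_worlds for (v, b) in new_worlds
            if (w, v) in rel[i] and (a, b) in arel[i]}
        for i in rel
    }
    new_val = {(w, a): val[w] for (w, a) in new_worlds}
    transition = {w: [(w, a) for a in actions if w in pre[a]] for w in worlds}
    return (new_worlds, new_rel, new_val), transition

# Example: two worlds (p true / p false), agent 'a' cannot tell them apart;
# a public announcement of p keeps only the p-world and makes it known.
state = (['w1', 'w2'],
         {'a': {('w1', 'w1'), ('w1', 'w2'), ('w2', 'w1'), ('w2', 'w2')}},
         {'w1': {'p'}, 'w2': set()})
announce_p = (['ann'], {'a': {('ann', 'ann')}}, {'ann': {'w1'}})
updated, trans = product_update(state, announce_p)
print(updated[0])   # [('w1', 'ann')] -- only the p-world survives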

354 citations


Journal ArticleDOI
01 May 2004-Synthese
TL;DR: The two theorems, the Condorcet paradox and Arrow's impossibility theorem, are compared, and it is suggested that, while the framework of preference aggregation can be mapped into the context of judgment aggregation, there exists no obvious reverse mapping.
Abstract: The ``doctrinal paradox'' or ``discursive dilemma'' shows that propositionwise majority voting over the judgments held by multiple individuals on some interconnected propositions can lead to inconsistent collective judgments on these propositions. List and Pettit (2002) have proved that this paradox illustrates a more general impossibility theorem showing that there exists no aggregation procedure that generally produces consistent collective judgments and satisfies certain minimal conditions. Although the paradox and the theorem concern the aggregation of judgments rather than preferences, they invite comparison with two established results on the aggregation of preferences: the Condorcet paradox and Arrow's impossibility theorem. We may ask whether the new impossibility theorem is a special case of Arrow's theorem, or whether there are interesting disanalogies between the two results. In this paper, we compare the two theorems, and show that they are not straightforward corollaries of each other. We further suggest that, while the framework of preference aggregation can be mapped into the framework of judgment aggregation, there exists no obvious reverse mapping. Finally, we address one particular minimal condition that is used in both theorems – an independence condition – and suggest that this condition points towards a unifying property underlying both impossibility results.
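
A minimal worked example of the underlying paradox may help; the three-judge profile below is the standard textbook illustration of propositionwise majority voting on p, q and their conjunction, not data from the paper.

# Propositionwise majority voting on p, q and their conjunction.
# The profile below is the standard three-judge illustration of the
# doctrinal paradox; it is an assumed example, not taken from the paper.
judges = [
    {'p': True,  'q': True,  'p_and_q': True},   # judge 1
    {'p': True,  'q': False, 'p_and_q': False},  # judge 2
    {'p': False, 'q': True,  'p_and_q': False},  # judge 3
]

def majority(prop):
    return sum(j[prop] for j in judges) > len(judges) / 2

collective = {prop: majority(prop) for prop in ('p', 'q', 'p_and_q')}
print(collective)
# {'p': True, 'q': True, 'p_and_q': False}: each individual judgment set is
# consistent, but the propositionwise majority judgments are not.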

196 citations


Journal ArticleDOI
01 Feb 2004-Synthese
TL;DR: Drawing on recent historical and philosophical work, I articulate eight operationally accessible and distinct senses of objectivity, arguing that none of the eight senses is strictly reducible to the others, giving objectivity its irreducible complexity.
Abstract: The terms ``objectivity'' and ``objective'' are among the most used yet ill-defined terms in the philosophy of science and epistemology. Common to all the various usages is the rhetorical force of ``I endorse this and you should too'', or to put it more mildly, that one should trust the outcome of the objectivity-producing process. The persuasive endorsement and call to trust provide some conceptual coherence to objectivity, but the reference to objectivity is hopefully not merely an attempt at persuasive endorsement. What, in addition to epistemological endorsement, does objectivity carry with it? Drawing on recent historical and philosophical work, I articulate eight operationally accessible and distinct senses of objectivity. While there are links among these senses, providing cohesion to the concept, I argue that none of the eight senses is strictly reducible to the others, giving objectivity its irreducible complexity.

153 citations


Journal ArticleDOI
01 Aug 2004-Synthese
TL;DR: McCulloch and Pitts's contributions included (i) a formalism whose refinement and generalization led to the notion of finite automata (an important formalism in computability theory), (ii) a technique that inspired the idea of logic design (a fundamental part of modern computer design), and (iii) the first use of computation to address the mind–body problem.
Abstract: Despite its significance in neuroscience and computation, McCulloch and Pitts's celebrated 1943 paper has received little historical and philosophical attention. In 1943 there already existed a lively community of biophysicists doing mathematical work on neural networks. What was novel in McCulloch and Pitts's paper was their use of logic and computation to understand neural, and thus mental, activity. McCulloch and Pitts's contributions included (i) a formalism whose refinement and generalization led to the notion of finite automata (an important formalism in computability theory), (ii) a technique that inspired the notion of logic design (a fundamental part of modern computer design), (iii) the first use of computation to address the mind-body problem, and (iv) the first modern computational theory of mind and brain.
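
For readers unfamiliar with the 1943 formalism, a McCulloch-Pitts unit is a binary threshold neuron. The sketch below is only a generic illustration (the weights and thresholds are chosen by hand, not taken from the paper) of how such units realize Boolean gates, the observation behind contribution (ii).

# A McCulloch-Pitts unit: binary inputs, fixed weights, threshold activation.
# Illustrative sketch only; weights and thresholds below are chosen by hand.
def mp_unit(inputs, weights, threshold):
    return int(sum(i * w for i, w in zip(inputs, weights)) >= threshold)

AND = lambda x, y: mp_unit([x, y], [1, 1], 2)
OR  = lambda x, y: mp_unit([x, y], [1, 1], 1)
NOT = lambda x:    mp_unit([x],    [-1],   0)

for x in (0, 1):
    for y in (0, 1):
        print(x, y, AND(x, y), OR(x, y), NOT(x))
# Networks of such units correspond to finite automata, the link to
# computability theory noted in contribution (i).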

120 citations


Book ChapterDOI
01 Mar 2004-Synthese
TL;DR: The focus is on models: alternating transition systems, multi-player game models (alias concurrent game structures) and coalition effectivity models turn out to be intimately related, while alternating epistemic transition systems share much of their philosophical and formal apparatus.
Abstract: We draw parallels between several closely related logics that combine – in different proportions – elements of game theory, computation tree logics, and epistemic logics to reason about agents and their abilities. These are: the coalition game logics CL and ECL introduced by Pauly in 2000, the alternating-time temporal logic ATL developed by Alur, Henzinger and Kupferman between 1997 and 2002, and the alternating-time temporal epistemic logic ATEL by van der Hoek and Wooldridge (2002). In particular, we establish some subsumption and equivalence results for their semantics, as well as interpretation of the alternating-time temporal epistemic logic into ATL. The focus in this paper is on models: alternating transition systems, multi-player game models (alias concurrent game structures) and coalition effectivity models turn out to be intimately related, while alternating epistemic transition systems share much of their philosophical and formal apparatus. Our approach is constructive: we present ways to transform between different types of models and languages.
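
To make the model-theoretic objects concrete, here is a small Python sketch, under an assumed encoding rather than the paper's constructions, of a two-agent concurrent game structure and the one-step effectivity it induces: the sets of outcomes a coalition can force regardless of what the remaining agents do.

from itertools import product

# A toy concurrent game structure: states, per-agent moves, and a transition
# function from (state, joint move) to successor. The encoding is an
# illustrative assumption, not one of the transformations defined in the paper.
states = ['s0', 's1', 's2']
moves = {'1': ['a', 'b'], '2': ['a', 'b']}
delta = {('s0', ('a', 'a')): 's1', ('s0', ('a', 'b')): 's1',
         ('s0', ('b', 'a')): 's2', ('s0', ('b', 'b')): 's0'}

def forceable(state, coalition):
    """Sets of successors the coalition can guarantee by fixing its own moves."""
    others = [ag for ag in moves if ag not in coalition]
    agents = sorted(moves)  # fixed order of agents in joint moves
    result = []
    for own in product(*(moves[ag] for ag in coalition)):
        outcomes = set()
        for rest in product(*(moves[ag] for ag in others)):
            choice = dict(zip(coalition, own)) | dict(zip(others, rest))
            joint = tuple(choice[ag] for ag in agents)
            outcomes.add(delta[(state, joint)])
        result.append(outcomes)
    return result

print(forceable('s0', ['1']))
# Playing 'a', agent 1 forces the outcome into {'s1'}; playing 'b' it can only
# guarantee {'s0', 's2'}. This family of sets is the effectivity function at s0.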

103 citations


Journal ArticleDOI
01 May 2004-Synthese
TL;DR: This paper offers a comparison between two decision rules for use when uncertainty is depicted by a non-trivial, convex set of probability functions Γ, different from the canonical Bayesian decision theory of expected utility.

Abstract: This paper offers a comparison between two decision rules for use when uncertainty is depicted by a non-trivial, convex set of probability functions Γ. This setting for uncertainty is different from the canonical Bayesian decision theory of expected utility, which uses a singleton set, just one probability function, to represent a decision maker's uncertainty. Justifications for using a non-trivial set of probabilities to depict uncertainty date back at least a half century (Good 1952), and a foreshadowing of that idea can be found even in Keynes (1921), where he allows that not all hypotheses may be comparable by qualitative probability in accord with, e.g., the situation where the respective intervals of probabilities for two events merely overlap with no further (joint) constraints, so that neither of the two events is more, or less, or equally probable compared with the other. Here, I will avail myself of the following simplifying assumption: Throughout, I will avoid the complexities that ensue when the decision maker's values for outcomes also are indeterminate and, in parallel with her or his uncertainty, are then depicted by a set of (cardinal) utilities. That is, for this discussion, I will contrast two decision rules when the decision maker's uncertainties, but not her/his values, are indeterminate. The more familiar decision rule of the pair under discussion, Γ-Maximin, requires that the decision maker rank an option by its lower expected value, taken with respect to the convex set of probabilities Γ, and then choose an option whose lower expected value is maximum. This decision rule (as simplified by the two assumptions, above) was given a representation in terms of a binary preference relation over Anscombe–Aumann horse lotteries (Gilboa and Schmeidler 1989), has been discussed
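
A small numerical illustration of the Γ-Maximin rule described above; the probability set and utilities below are invented for illustration and do not come from the paper.

# Gamma-Maximin with an assumed (finitely generated) convex set of
# probabilities: rank each option by its lower expected utility over the
# extreme points of the set, then pick an option with maximal lower value.
# The probability set and utilities below are invented for illustration.
extreme_points = [
    {'rain': 0.2, 'shine': 0.8},
    {'rain': 0.6, 'shine': 0.4},
]
options = {
    'umbrella': {'rain': 5, 'shine': 3},
    'no_umbrella': {'rain': 0, 'shine': 10},
}

def lower_expectation(utilities):
    return min(sum(p[s] * utilities[s] for s in p) for p in extreme_points)

lower = {o: lower_expectation(u) for o, u in options.items()}
best = max(lower, key=lower.get)
print(lower, '->', best)
# {'umbrella': 3.4, 'no_umbrella': 4.0} -> no_umbrella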

83 citations


Journal ArticleDOI
01 Apr 2004-Synthese
TL;DR: A diachronic Dutch Book argument in favor of 1/3 is offered in the Sleeping Beauty problem, where Beauty is uncertain whether the outcome of a certain coin toss was heads or tails.
Abstract: In the Sleeping Beauty problem, Beauty is uncertain whether the outcome of a certain coin toss was heads or tails. One argument suggests that her degree of belief in heads should be 1/3, while a second suggests that it should be 1/2. Prima facie, the argument for 1/2 appears to be stronger. I offer a diachronic Dutch Book argument in favor of 1/3. Even for those who are not routinely persuaded by diachronic Dutch Book arguments, this one has some important morals.
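
The paper's argument is a diachronic Dutch Book; as a complementary, much cruder illustration (my own, not the paper's), one can simulate the protocol and count awakenings, which shows why 1/3 is the break-even betting rate on heads per awakening.

import random

# Crude frequency illustration (not the paper's Dutch Book argument):
# on heads Beauty is woken once, on tails twice; among awakenings,
# the long-run fraction at which the coin shows heads approaches 1/3.
random.seed(0)
heads_awakenings = tails_awakenings = 0
for _ in range(100_000):
    if random.random() < 0.5:       # heads
        heads_awakenings += 1
    else:                           # tails: Monday and Tuesday awakenings
        tails_awakenings += 2
total = heads_awakenings + tails_awakenings
print(heads_awakenings / total)     # approximately 1/3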

72 citations


Book ChapterDOI
01 Sep 2004-Synthese
TL;DR: In this paper, the authors use Bayesian networks to model the independence and competence assumptions of Condorcet's classical jury model, and show that the probability of a correct majority decision converges to certainty as the jury size increases, a seemingly unrealistic result.
Abstract: Under the independence and competence assumptions of Condorcet’s classical jury model, the probability of a correct majority decision converges to certainty as the jury size increases, a seemingly unrealistic result. Using Bayesian networks, we argue that the model’s independence assumption requires that the state of the world (guilty or not guilty) is the latest common cause of all jurors’ votes. But often – arguably in all courtroom cases and in many expert panels – the latest such common cause is a shared ‘body of evidence’ observed by the jurors. In the corresponding Bayesian network, the votes are direct descendants not of the state of the world, but of the body of evidence, which in turn is a direct descendant of the state of the world. We develop a model of jury decisions based on this Bayesian network. Our model permits the possibility of misleading evidence, even for a maximally competent observer, which cannot easily be accommodated in the classical model. We prove that (i) the probability of a correct majority verdict converges to the probability that the body of evidence is not misleading, a value typically below 1; (ii) depending on the required threshold of ‘no reasonable doubt’, it may be impossible, even in an arbitrarily large jury, to establish guilt of a defendant ‘beyond any reasonable doubt’.
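
A simulation sketch of the contrast between the two Bayesian networks, with all parameter values (juror competence, evidence reliability, jury size) being assumptions chosen only to illustrate result (i), not figures from the paper.

import random

# Assumed parameters: the competence and evidence-reliability values below
# are illustrative, not taken from the paper.
random.seed(1)
N_TRIALS, JURY, COMPETENCE, EVIDENCE_RELIABILITY = 20_000, 101, 0.6, 0.9

def majority_correct(shared_evidence):
    guilty = random.random() < 0.5                      # state of the world
    if shared_evidence:
        target = (guilty if random.random() < EVIDENCE_RELIABILITY
                  else not guilty)                      # evidence may mislead everyone
    else:
        target = guilty                                 # jurors track the state directly
    votes = sum((target if random.random() < COMPETENCE else not target)
                for _ in range(JURY))
    return (votes > JURY / 2) == guilty

for shared in (False, True):
    rate = sum(majority_correct(shared) for _ in range(N_TRIALS)) / N_TRIALS
    print('shared evidence' if shared else 'classical', round(rate, 3))
# With 101 jurors the classical model is already near certainty, while the
# shared-evidence model stays capped near the 0.9 reliability of the evidence.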

64 citations


Journal ArticleDOI
01 Oct 2004-Synthese
TL;DR: It is argued that a weaker version of the form of realism under consideration – namely, epistemic structural realism – is more plausible than the ontic structural realism argued for by Steven French and James Ladyman.
Abstract: In the last decade, structural realism has been presented as the most promising strategy for developing a defensible realist view of science. Nevertheless, controversy still continues in relation to the exact meaning of the proposed structuralism. The stronger version of structural realism, the so-called ontic structural realism, has been argued for on the basis of some ideas related to quantum mechanics. In this paper, I will first outline these arguments, mainly developed by Steven French and James Ladyman, then challenge them, putting a particular emphasis on a metaphysical principle (the Principle of the Identity of the Indiscernibles) which, even though it is crucial for the whole argument, hasn't been, in my opinion, clearly stated and examined yet. My overall view will be that a weaker version of the form of realism we are considering is more plausible – namely, epistemic structural realism.

59 citations


Journal ArticleDOI
01 Oct 2004-Synthese
TL;DR: The coordination problem is proved to be insolvable by showing that it is equivalent to the ''coordinated attack'' problem, which is demonstrably insolvable in epistemic logic.
Abstract: The tripartite account of propositional, fallibilist knowledge that p as justified true belief can become adequate only if it can solve the Gettier Problem. However, the latter can be solved only if the problem of a successful coordination of the resources (at least truth and justification) necessary and sufficient to deliver propositional, fallibilist knowledge that p can be solved. In this paper, the coordination problem is proved to be insolvable by showing that it is equivalent to the ''coordinated attack'' problem, which is demonstrably insolvable in epistemic logic. It follows that the tripartite account is not merely inadequate as it stands, as proved by Gettier-type counterexamples, but demonstrably irreparable in principle, so that efforts to improve it can never succeed.

47 citations


Journal ArticleDOI
Haim Gaifman
01 May 2004-Synthese
TL;DR: A rigorous way of assigning probabilities to statements in pure arithmetic and a philosophical discussion that highlights the shifting contextual character of subjective probabilities and beliefs are sketched.
Abstract: There are three sections in this paper. The first is a philosophical discussion of the general problem of reasoning under limited deductive capacity. The second sketches a rigorous way of assigning probabilities to statements in pure arithmetic; motivated by the preceding discussion, it can nonetheless be read separately. The third is a philosophical discussion that highlights the shifting contextual character of subjective probabilities and beliefs.

Journal ArticleDOI
Carol Rovane
01 May 2004-Synthese
TL;DR: This paper is intended as a tribute to a philosopher who saw clearly from early on that human size is not basic to agency and, among the very small number of philosophers who saw this, is perhaps the only one who had right the general direction of reasons why they are not.
Abstract: Isaac Levi and I agree that there can be group agents formed out of two or more human beings.1 I’ve argued elsewhere that if we properly understand the detailed philosophical grounds for the possibility of such group agents, those very grounds offer up another possibility in quite the opposite direction, the possibility of two or more multiple agents within the same human being.2 In this paper I want to address some of the natural reservations that are likely to arise about both of these claims, by first articulating as sympathetically as I can some of the reasons why it might seem, nevertheless, that human beings are basic agents – basic in a sense that does not carry over to the group and multiple cases and, indeed, basic in a sense that makes them the only bona fide cases of agency. In order to explain why these reasons are unconvincing, I will have to provide a somewhat more detailed account than Levi himself has given of how and why group agents are possible, and to take up as well what makes for the possibility of multiple agents, which he does not discuss. But this paper is intended as a tribute to a philosopher who saw clearly from early on that human size is not basic to agency and, among the very small number of philosophers who saw this, is perhaps the only one who had right the general direction of reasons why they are not. It’s a real pleasure to have this chance to acknowledge not just his influence but the solidarity he showed during my efforts to work out what is a manifestly unorthodox and unpopular philosophical view. Levi’s central philosophical project is to give a normative account of rational choice. It is in the context of this project that he has important things to say about agency. However, he keeps his metaphysical assumptions about agents to the bare minimum that are required for his normative purposes, leaving any further metaphysical questions about their nature untouched. This is not merely because he lacks a taste for metaphysics, which he does. It is a matter of policy. In his view, philosophers should, as far as possible, leave substantive questions about the nature of things to science. Accordingly, he never introduces a metaphysical assumption unless he is absolutely driven to it by independent philosophical considerations.

Journal ArticleDOI
01 May 2004-Synthese
TL;DR: A series of observations is presented on rational decision making with incompletely resolved internal dissensions, on the nature and use of incomplete valuational orderings, and on the ways and means of extending their reach.
Abstract: The subtitle of Isaac Levi’s book, Hard Choices, explains the nature of the problem that he addresses in that classic work: ‘Decision Making under Unresolved Conflict’.1 We have learned greatly from Levi’s analyses of why, despite our best efforts, the valuational conflicts that we face may not always be fully resolved when the point of decision making comes, and how we may nevertheless use systematic reasoning to decide what one should sensibly do despite the presence of unsettled conflicts. Indeed, through a variety of contributions stretching over several decades, Isaac Levi has powerfully illuminated the challenges of decision making in the presence of imperfect information, conflicting evidence, divergent values, discordant commitments, and other sources of internal dissension.2 I seize the wonderful occasion of celebrating Isaac Levi’s work and accomplishments by presenting a series of observations on rational decision making with incompletely resolved internal dissensions. In that context, I comment also on the nature and use of incomplete valuational orderings and the ways and means of extending their reach. I pursue these issues in the form of addressing a series of questions. Sometimes I draw on Levi’s work, and at other times, I comment on differences that we may still have. As will be obvious, even when we disagree, my understanding of these issues is strongly influenced by Levi’s thinking.

Journal ArticleDOI
01 Mar 2004-Synthese
TL;DR: An argument is presented to make the case that Mach believed in mind-independent elements from the 1870s on, while other aspects of his thought evolved over time; the paper concentrates on Mach's ontology, as it bears on the economy of thought, not his epistemology per se.
Abstract: A full appreciation for Ernst Mach's doctrine of the economy of thought must take account of his direct realism about particulars (elements) and his anti-realism about space-time laws as economical constructions. After a review of thought economy, its critics and some contemporary forms, the paper turns to the philosophical roots of Mach's doctrine. Mach claimed that the simplest, most parsimonious theories economized memory and effort by using abstract concepts and laws instead of attending to the details of each individual event or experiment. For Mach, the individual case never truly repeated in all of its uniqueness, nor was all of the individual detail of a physical element adequately captured in abstract laws and schemata, however necessary these were for the pursuit of science. As can be shown from specific passages, some already published, some not, Mach's elements included physical qualia in nature similar to Russell's unsensed sensibilia, which existed even where there were no conscious observers. An argument will be presented to make the case that Mach believed in the mind-independent elements from the 1870s on, while other aspects of his thought evolved over time; I have thus dated the references to reflect this historical progression. I concentrate on Mach's ontology, as it bears on economy of thought, not his epistemology per se, which might well have been restricted to observable elements/sensations. After his own conversion to neutral monism, in the 1920s, Bertrand Russell echoed Mach's call for a `future science' capable of handling the `intrinsic character' of qualitative data directly without the excessive abstraction of physics.

Journal ArticleDOI
01 Jul 2004-Synthese
TL;DR: It is concluded that those versions of the conservative thesis that survive critical scrutiny fail to live up to the aspirations of the thesis as a substantive canon of rationality, and that to the extent that principles of conservatism are epistemically promising, they are not plausible.
Abstract: According to the thesis of epistemic conservatism it would be unreasonable to change one's beliefs in the absence of any good reasons. Although it is claimed that epistemic conservatism has informed and resolved a number of positions and problems in epistemology, it is difficult to identify a single representative view of the thesis. This has resulted in advancing a series of disparate and largely unconnected arguments to establish conservatism. In this paper, I begin by casting doubt on the claim of widespread and genuine applications of the conservative policy. I then distinguish between three main varieties of epistemic conservatism, namely, differential, perseverance and generation conservatism. Having evaluated various arguments that have been offered or may be considered on behalf of the conservative thesis, I close by concluding that those versions of the thesis that survive critical scrutiny fail to live up to the aspirations of the thesis as a substantive canon of rationality: to the extent that principles of conservatism are epistemically promising, they are not plausible, while to the extent that they are plausible, they are not of much epistemic interest.

Book ChapterDOI
01 Mar 2004-Synthese
TL;DR: The main goal of the paper is to show that, in terms of evolutionary game theory, the emergence and self-sustaining force of conventional meaning and of some conversational interpretation strategies can be motivated on the basis of weaker and, perhaps, more plausible assumptions.
Abstract: In this paper we study language use and language organisation by making use of Lewisean signalling games. Standard game theoretical approaches are contrasted with evolutionary ones to analyze conventional meaning and conversational interpretation strategies. It is argued that analyzing successful communication in terms of standard game theory requires agents to be very rational and fully informed. The main goal of the paper is to show that in terms of evolutionary game theory we can motivate the emergence and self-sustaining force of (i) conventional meaning and (ii) some conversational interpretation strategies in terms of weaker and, perhaps, more plausible assumptions.
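
A minimal sketch of how a signalling convention can emerge from low-rationality dynamics. It uses simple urn-style reinforcement in a two-state Lewis signalling game as a stand-in for the evolutionary dynamics discussed in the paper; all parameter choices are my own.

import random

# Two states, two signals, two acts; sender and receiver learn by simple
# reinforcement (urn weights). Parameters are illustrative assumptions.
random.seed(2)
sender = {s: {'m0': 1.0, 'm1': 1.0} for s in ('t0', 't1')}     # state -> signal weights
receiver = {m: {'a0': 1.0, 'a1': 1.0} for m in ('m0', 'm1')}    # signal -> act weights

def draw(weights):
    r = random.uniform(0, sum(weights.values()))
    for key, w in weights.items():
        r -= w
        if r <= 0:
            return key
    return key

for _ in range(20_000):
    state = random.choice(('t0', 't1'))
    signal = draw(sender[state])
    act = draw(receiver[signal])
    if act == 'a' + state[1]:            # success iff the act matches the state
        sender[state][signal] += 1       # reinforce the successful choices
        receiver[signal][act] += 1

print({s: max(sender[s], key=sender[s].get) for s in sender})
# Typically converges to a signalling system, e.g. {'t0': 'm0', 't1': 'm1'},
# in which the signals carry conventional meaning.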

Journal ArticleDOI
01 Apr 2004-Synthese
TL;DR: The main purpose of this article is to show the great significance of space-time geometry in predetermining the laws which are supposed to govern the behaviour of matter, and to support the thesis that matter itself can be built from geometry, in the sense that particles of matter as well as the other forces of nature emerge in the same way that gravity emerges from geometry.
Abstract: The physicist's conception of space-time underwent two major upheavals thanks to the general theory of relativity and quantum mechanics. Both theories play a fundamental role in describing the same natural world, although at different scales. However, the inconsistency between them emerged clearly as the limitation of twentieth-century physics, so a more complete description of nature must encompass general relativity and quantum mechanics as well. The problem is a theorists' problem par excellence. Experiments provide little guidance, and the inconsistency mentioned above is an important problem which clearly illustrates the intermingling of philosophical, mathematical, and physical thought. In fact, in order to unify general relativity with quantum field theory, it seems necessary to invent a new mathematical framework which will generalise Riemannian geometry and therefore our present conception of space and space-time. Contemporary developments in theoretical physics suggest that another revolution may be in progress, through which a new kind of geometry may enter physics, and space-time itself can be reinterpreted as an approximate, derived concept. The main purpose of this article is to show the great significance of space-time geometry in predetermining the laws which are supposed to govern the behaviour of matter, and further to support the thesis that matter itself can be built from geometry, in the sense that particles of matter as well as the other forces of nature emerge in the same way that gravity emerges from geometry. Scientific research is not a process of steady accumulation of absolute truths, which has culminated in present theories, but rather a much more dynamic kind of process in which there are no final theoretical concepts valid in unlimited domains. (David Bohm) The more we understand about the physical world, and deeper we probe into the laws of nature, the more we are driven into the world of mathematics and of mathematical concepts. (Roger Penrose)

Journal ArticleDOI
Michela Massimi
01 Jun 2004-Synthese
TL;DR: It is argued that demonstrative induction can deal with the problem of the underdetermination of theory by evidence; the historical case study of spectroscopy in the early 1920s, where the choice among different theories was apparently underdetermined by spectroscopic evidence concerning the alkali doublets and their anomalous Zeeman effect, is cast within this framework.
Abstract: In this paper I argue that demonstrative induction can deal with the problem of underdetermination of theory by evidence. I present the historical case study of spectroscopy in the early 1920s, where the choice among different theories was apparently underdetermined by spectroscopic evidence concerning the alkali doublets and their anomalous Zeeman effect. By casting this historical episode within the methodological framework of demonstrative induction, the local underdetermination among Bohr's, Heisenberg's, and Pauli's rival theories is resolved in favour of Pauli's theory of the electron's spin.

Journal ArticleDOI
01 Aug 2004-Synthese
TL;DR: It is argued that TMS with its ability to draw causal inferences on function and its neural representations is a valuable neurophysiological tool for investigating the causal basis of neuronal functions and can provide substantive insight into the modern interdisciplinary and (anti)reductionist neurophilosophical debates concerning the relationships between brain functions and mental abilities.
Abstract: Transcranial magnetic stimulation (TMS) is a method capable of transiently modulating neural excitability. Depending on the stimulation parameters information processing in the brain can be either enhanced or disrupted. This way the contribution of different brain areas involved in mental processes can be studied, allowing a functional decomposition of cognitive behavior both in the temporal and spatial domain, hence providing a functional resolution of brain/mind processes. The aim of the present paper is to argue that TMS with its ability to draw causal inferences on function and its neural representations is a valuable neurophysiological tool for investigating the causal basis of neuronal functions and can provide substantive insight into the modern interdisciplinary and (anti)reductionist neurophilosophical debates concerning the relationships between brain functions and mental abilities. Thus, TMS can serve as a heuristic method for resolving causal issues in an arena where only correlative tools have traditionally been available.

Journal ArticleDOI
01 Jan 2004-Synthese
TL;DR: It is argued that one particular objection that has been raised on numerous occasions – concerning the randomness of the sample on which the inductive extrapolation is based – is misguided.
Abstract: In 1947 Donald Cary Williams claimed in The Ground of Induction to have solved the Humean problem of induction, by means of an adaptation of reasoning first advanced by Bernoulli in 1713. Later on David Stove defended and improved upon Williams’ argument in The Rationality of Induction (1986). We call this proposed solution of induction the ‘Williams-Stove sampling thesis’. There has been no lack of objections raised to the sampling thesis, and it has not been widely accepted. In our opinion, though, none of these objections has the slightest force, and, moreover, the sampling thesis is undoubtedly true. What we will argue in this paper is that one particular objection that has been raised on numerous occasions is misguided. This concerns the randomness of the sample on which the inductive extrapolation is based.
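
The combinatorial core of the Williams-Stove argument is that the overwhelming majority of large samples nearly match the population in composition. A short Python check of this fact for an assumed finite population (the particular numbers are mine and serve only to illustrate the proportional syllogism):

from math import comb

# Assumed finite population: 10,000 individuals, 60% of which have property F.
# Count the fraction of all samples of size n whose sample frequency of F is
# within 0.05 of the population frequency (a hypergeometric tail computation).
N, K, n, tol = 10_000, 6_000, 300, 0.05

total = comb(N, n)
matching = sum(comb(K, k) * comb(N - K, n - k)
               for k in range(n + 1)
               if abs(k / n - K / N) <= tol)
print(matching / total)   # roughly 0.93: most size-300 samples are representative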

Journal ArticleDOI
01 Oct 2004-Synthese
TL;DR: According to a plausible and influential account of perceptual knowledge, the truth-makers of beliefs that constitute perceptual knowledge must feature in the causal explanation of how the authors acquire those beliefs, but this account runs into difficulties when it tries to accommodate time perception.
Abstract: According to a plausible and influential account of perceptual knowledge, the truth-makers of beliefs that constitute perceptual knowledge must feature in the causal explanation of how we acquire those beliefs. However, this account runs into difficulties when it tries to accommodate time perception – specifically perception of order and duration – since the features we are apparently tracking in such perception are (it is argued) not causal. The central aim of the paper is to solve this epistemological puzzle. Two strategies are examined. The first strategy locates the causal truth-makers within the psychological mechanism underlying time perception, thus treating facts about time order and duration as mind-dependent. This strategy, however, is problematic. The second strategy modifies the causal account of perceptual knowledge to include a non-causal component in the explanation of belief-acquisition, namely chronometric explanation. Applying this much more satisfactory approach to perceptual knowledge of time, we can preserve the mind-independence of order and duration, but not that of time's flow.

Journal ArticleDOI
01 Mar 2004-Synthese
TL;DR: A generalization of Ehrenfest's urn model is suggested to treat a wide class of stochastic processes – homogeneous Markov chains – describing the changes of microscopic objects.
Abstract: A generalization of Ehrenfest's urn model is suggested. This will allow us to treat a wide class of stochastic processes describing the changes of microscopic objects. These processes are homogeneous Markov chains. The generalization proposed is presented as an abstract conditional (relative) probability theory. The probability axioms of such a theory and some simple additional conditions yield both transition probabilities and equilibrium distributions. The resulting theory, interpreted in terms of particles and single-particle states, leads to the usual formulae of quantum and classical statistical mechanics; in terms of chromosomes and allelic types, it allows the deduction of many genetical models including the Ewens sampling formula; in terms of agents' strategies, it gives a justification of the ``herd behaviour'' typical of a population of heterogeneous economic agents.
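
For orientation, the classical model being generalized: N balls in two urns, and at each step a uniformly chosen ball switches urns, giving a homogeneous Markov chain whose equilibrium distribution is binomial. The simulation below is a generic illustration, not the paper's generalized construction.

import random
from collections import Counter
from math import comb

# Classical Ehrenfest urn: at each step a uniformly chosen ball changes urn.
# Generic illustration of the homogeneous Markov chain and its equilibrium.
random.seed(3)
N, STEPS = 10, 200_000
k = 0                                   # number of balls currently in urn A
counts = Counter()
for _ in range(STEPS):
    if random.random() < k / N:         # the chosen ball was in urn A
        k -= 1
    else:
        k += 1
    counts[k] += 1

for j in range(N + 1):
    empirical = counts[j] / STEPS
    binomial = comb(N, j) / 2 ** N      # stationary distribution Bin(N, 1/2)
    print(j, round(empirical, 3), round(binomial, 3))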

Journal ArticleDOI
01 May 2004-Synthese
TL;DR: It is shown in this paper that at least one of the alleged cognitive fallacies, known as the conjunctive fallacy or the conjunction effect, has been misdiagnosed by the theorists of cognitive fallacies.
Abstract: One of the major current developments in cognitive psychology is what is usually referred to as the theory of cognitive fallacies originated by Amos Tversky and Daniel Kahneman. The purported repercussions of their theory extend beyond psychology, however. A flavor of how seriously the fad of cognitive fallacies has been taken is perhaps conveyed by a quote from Piatelli-Palmerini (1994, xiii) who predicted “that sooner or later, Amos Tversky and Daniel Kahneman will win the Nobel Prize for economics”. His prediction was fulfilled in 2002. The theory of cognitive fallacies is not merely a matter of bare facts of psychology. The phenomena (certain kinds of spontaneous cognitive judgments) that are the evidential basis of the theory obtain their theoretical interests mainly from the fact that they are interpreted as representing fallacious, that is, irrational judgments on the part of the subject in question. Such an interpretation presupposes that we can independently establish what it means for a probability judgment to be rational. In the case of typical cognitive fallacies studied in the recent literature, this rationality is supposed to have been established by our usual probability calculus in its Bayesian use. The fame of the cognitive fallacies notwithstanding, I will show in this paper that at least one of them has been misdiagnosed by the theorists of cognitive fallacies. In reality, there need not be anything fallacious or otherwise irrational about the judgments that are supposed to exhibit this “fallacy”. I will also comment on how this alleged fallacy throws light on the way probabilistic concepts should and should not be applied to human knowledge-seeking (truth-seeking) activities. The so-called fallacy I will discuss is known as the conjunctive fallacy or the conjunction effect. It is best introduced by means of an example. I follow here the formulation of Piatelli-Palmerini (1974, 65–67, abbreviated). Consider the following information that one is supposed to have received:
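
For reference, the Bayesian norm against which the alleged fallacy is measured is simply that a conjunction can never be more probable than either conjunct. A tiny Python check of that rule (my own illustration, independent of the paper's diagnosis):

import random

# The probabilistic norm behind the alleged fallacy is the conjunction rule:
# P(A and B) <= P(A) for every probability assignment. Quick random check.
random.seed(4)
for _ in range(5):
    cuts = sorted(random.random() for _ in range(3))
    cells = [cuts[0], cuts[1] - cuts[0], cuts[2] - cuts[1], 1 - cuts[2]]
    p_a = cells[0] + cells[1]          # A occupies the first two cells
    p_a_and_b = cells[0]               # A-and-B occupies the first cell
    print(p_a_and_b <= p_a)            # always True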

Journal ArticleDOI
01 Aug 2004-Synthese
TL;DR: Whether one of the key methods in cognitive neuroscience – the contrastive analysis – suffers from any serious confounding when applied to the field of consciousness studies and if there are any systematic difficulties when studying consciousness with this method that make the results untrustworthy is discussed.
Abstract: Several authors within psychology, neuroscience and philosophy take for granted that standard empirical research techniques are applicable when studying consciousness. In this article, it is discussed whether one of the key methods in cognitive neuroscience – the contrastive analysis – suffers from any serious confounding when applied to the field of consciousness studies; that is to say, whether there are any systematic difficulties when studying consciousness with this method that make the results untrustworthy. Through an analysis of theoretical arguments in favour of using contrastive analysis, combined with analyses of empirical findings, I conclude by arguing for three factors that currently confound research using contrastive analysis. These are (1) unconscious processes, (2) introspective reports, and (3) attention.

Journal ArticleDOI
01 Jan 2004-Synthese
TL;DR: There is a triangular relation between the fields of Multi-Agent Systems, Reinforcement Learning and Evolutionary Game Theory; it is illustrated how these new insights can contribute to a better understanding of learning in MAS and to new, improved learning algorithms.
Abstract: In this paper we revise Reinforcement Learning and adaptiveness in Multi-Agent Systems from an Evolutionary Game Theoretic perspective. More precisely we show there is a triangular relation between the fields of Multi-Agent Systems, Reinforcement Learning and Evolutionary Game Theory. We illustrate how these new insights can contribute to a better understanding of learning in MAS and to new improved learning algorithms. All three fields are introduced in a self-contained manner. Each relation is discussed in detail with the necessary background information to understand it, along with major references to relevant work.
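
One concrete corner of the triangle is the known link between simple reinforcement (Cross learning) and the replicator dynamics. The sketch below lets two Cross learners play a symmetric 2x2 coordination game; the payoffs and learning rate are my own illustrative assumptions, not taken from the paper.

import random

# Two Cross-learning agents in a symmetric 2x2 coordination game. In the
# small-step limit this learning process tracks the replicator dynamics,
# one concrete corner of the MAS / RL / EGT triangle. Payoffs and the
# learning rate are illustrative assumptions.
random.seed(5)
payoff = [[1.0, 0.0], [0.0, 1.0]]        # coordination game, payoffs in [0, 1]
probs = [[0.6, 0.4], [0.4, 0.6]]         # each agent's mixed strategy
ALPHA = 0.01

def sample(p):
    return 0 if random.random() < p[0] else 1

for _ in range(50_000):
    acts = [sample(probs[0]), sample(probs[1])]
    for agent in (0, 1):
        reward = payoff[acts[agent]][acts[1 - agent]]
        for a in (0, 1):                 # Cross update pushes toward the rewarded action
            target = 1.0 if a == acts[agent] else 0.0
            probs[agent][a] += ALPHA * reward * (target - probs[agent][a])

print([[round(x, 2) for x in p] for p in probs])
# Typically both agents converge on the same pure strategy, mirroring a
# stable rest point of the replicator dynamics.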

Book ChapterDOI
01 Sep 2004-Synthese
TL;DR: This paper deals with the problem of verification of game-like structures by means of symbolic model checking; unbounded model checking (a SAT-based technique) is applied for the first time to the verification of ATEL.
Abstract: This paper deals with the problem of verification of game-like structures by means of symbolic model checking. Alternating-time Temporal Epistemic Logic (ATEL) is used for expressing properties of multi-agent systems represented by alternating epistemic temporal systems as well as concurrent epistemic game structures. Unbounded model checking (a SAT based technique) is applied for the first time to verification of ATEL. An example is given to show an application of the technique.

Journal ArticleDOI
01 Jan 2004-Synthese
TL;DR: Peter Klein has developed a sophisticated version of infinitism according to which all justified beliefs depend upon an infinite regress of reasons, and he successfully responds to the most compelling extant objections to the view.
Abstract: One way to solve the epistemic regress problem would be to show that we can acquire justification by means of an infinite regress. This is infinitism. This view has not been popular, but Peter Klein has developed a sophisticated version of infinitism according to which all justified beliefs depend upon an infinite regress of reasons. Klein's argument for infinitism is unpersuasive, but he successfully responds to the most compelling extant objections to the view. A key component of his position is his claim that an infinite regress is necessary, but not sufficient, for justified belief. This enables infinitism to avoid a number of otherwise compelling objections. However, it commits infinitism to the existence of an additional feature of reasons that is necessary and, together with the regress condition, sufficient for justified belief. The trouble with infinitism is that any such condition could account for the connection between justification and truth only by undermining the rationale for the regress condition itself.

Journal ArticleDOI
Chuang Liu
01 Feb 2004-Synthese
TL;DR: It is argued that dispositional properties should be regarded as admissible properties for laws and that such an inclusion supplies the much needed connection between idealized models and the laws they `produce' or `accommodate'.
Abstract: In this paper, I first give a brief summary of a critique of the traditional theories of approximation and idealization; and after identifying one of the major roles of idealization as detaching component processes or systems from joints of nature, a detailed analysis is given of idealized laws -- which are discoverable and/or applicable -- in such processes and systems (i.e. idealized model systems). Then, arguments are given for the inclusion of dispositional properties in the class of admissible properties for laws; and such an inclusion turns out to be crucial to our understanding of the relation between idealized models and the laws they 'produce' or 'accommodate'. And then I argue that idealized laws so produced or accommodated in the models are either true simpliciter or only approximately true but not so because of the idealizations in question. Finally I compare my theory with some existing theories of laws of nature.

Journal ArticleDOI
01 Jan 2004-Synthese
TL;DR: It is argued that, in order to account for examples where the indexicals `now' and `here' do not refer to the time and location of the utterance, the authors do not have to assume that they have different characters (reference-fixing rules), governed by a single metarule or metacharacter.
Abstract: It is argued that, in order to account for examples where the indexicals `now' and `here' do not refer to the time and location of the utterance, we do not have to assume (pace Quentin Smith) that they have different characters (reference-fixing rules), governed by a single metarule or metacharacter. The traditional, fixed-character view is defended: `now' and `here' always refer to the time and location of the utterance. It is shown that when their referent does not correspond to the time and/or location of the utterance, `now' and `here' work in an anaphoric way, inheriting their reference from another noun phrase. The latter may be explicit or implicit in the discourse. It is also shown that `now' and `here' can inherit their reference from a presupposed or tacit reference. In that case, they are coreferential with what will be labeled a `tacit initiator'. This anaphoric interpretation has the merit of fitting within the Kaplanian distinction between pure indexicals (`now', `here', `today', etc.) and demonstratives (`this', `that', `she', etc.).

Book ChapterDOI
01 Sep 2004-Synthese
TL;DR: The main results are that certain activities of connectionist networks can be interpreted as non-monotonic inferences, and that there is a strict correspondence between the coding of knowledge in Hopfield networks and the knowledge representation in weight-annotated Poole systems.
Abstract: There is a gap between two different modes of computation: the symbolic mode and the subsymbolic (neuron-like) mode. The aim of this paper is to overcome this gap by viewing symbolism as a high-level description of the properties of (a class of) neural networks. Combining methods of algebraic semantics and nonmonotonic logic, the possibility of integrating both modes of viewing cognition is demonstrated. The main results are (a) that certain activities of connectionist networks can be interpreted as non-monotonic inferences, and (b) that there is a strict correspondence between the coding of knowledge in Hopfield networks and the knowledge representation in weight-annotated Poole systems. These results show the usefulness of non-monotonic logic as a descriptive and analytic tool for analyzing emerging properties of connectionist networks. Assuming an exponential development of the weight function, the present account relates to optimality theory — a general framework that aims to integrate insights from symbolism and connectionism. The paper concludes with some speculations about extending the present ideas.