
Showing papers in "Synthese in 1983"


Journal ArticleDOI
01 May 1983-Synthese
TL;DR: In this paper, the authors discuss the potential of coherence theories to explain the truth connection by means of higher-level convictions about probabilities, a move they call doxastic ascent, and defend such a theory.
Abstract: A central issue in epistemology concerns the connection between truth and justification. The burden of our paper is to explain this connection. Reliabilism, defended by Goldman, assumes that the connection is one of reliability. We argue that this assumption is too strong. We argue that foundational theories, such as those articulated by Pollock and Chisholm fail to elucidate the connection. We consider the potentiality of coherence theories to explain the truth connection by means of higher level convictions about probabilities, which we call doxastic ascent, and defend such a theory. Our defense appeals to the work of Reid and contemporary cognitive psychology in order to account for the psychological reality of higher level evaluations.

146 citations


Journal ArticleDOI
01 Feb 1983-Synthese
TL;DR: In this paper, a dyadic conditional-obligation operator O(B/A) is considered as a response to the contrary-to-duty paradox in deontic logic, but it is argued that the dyadic operator alone cannot resolve Chisholm's paradox.
Abstract: Paradoxes thrive in deontic logic. Perhaps the most interesting and significant is Chisholm's 'contrary-to-duty paradox'. It is important because it has been taken to show that the formal representation of deontic reasoning requires a dyadic conditional-obligation operator O(B/A), it ought to be the case that B given A. We begin our own addition to l'histoire d'O with a review of the paradox and the reasoning which leads to the introduction of the dyadic operator. We will see that there have been two quite different kinds of conditional-obligation operators suggested to cope with the paradox. Each one has its virtues but captures only a part of O's personality. We argue that neither is able to resolve the paradox. To accomplish that we will need to introduce considerations of tense, and affirm a distinction between conditional and actual obligation. The system of deontic logic we develop, 3-D, contains all this and a rule for detaching tensed actual-ought statements from conditional-ought statements in certain circumstances. After showing how the paradox is resolved in 3-D we conclude with some suggestions concerning the application of the system to moral and legal reasoning. First, let's review Chisholm's paradox.1 Consider the following sentences: (1) It ought to be the case that Arabella buys a train ticket to visit her grandmother. (2) It ought to be that if Arabella buys the ticket she calls to tell her that she is coming. (3) If Arabella does not buy the ticket it ought to be that she not tell her that she is coming. (4) Arabella does not buy the ticket. It appears that (i) the statements 1-4 are consistent. Furthermore it appears that (ii) no one of these statements logically implies any other one.
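A hedged reconstruction (my own sketch, not from the paper) of how sentences (1)-(4) clash in standard monadic deontic logic, writing T for "Arabella buys the ticket" and C for "she calls":

```latex
% One natural monadic symbolization:
% (1) OT    (2) O(T \to C)    (3) \neg T \to O\neg C    (4) \neg T
\begin{align*}
&O(T \to C) \to (OT \to OC) && \text{(deontic K axiom)}\\
&OC && \text{from (1) and (2)}\\
&O\neg C && \text{from (3) and (4) by modus ponens}\\
&OC \wedge O\neg C && \text{contradicting } OA \to \neg O\neg A
\end{align*}
```

If instead (3) is symbolized as O(¬T → ¬C), consistency is restored, but since T → (¬T → ¬C) is a tautology, OT → O(¬T → ¬C) is a theorem of the standard system, so (3) would then follow from (1), violating the apparent independence (ii). This is the standard dilemma that motivates the dyadic operator.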

73 citations


Journal ArticleDOI
01 Apr 1983-Synthese
TL;DR: In this article, the authors focus on attempts by Wilfrid Sellars and Laurence Bonjour to show that putative immediate knowledge really depends on higher-level knowledge or justified belief about the status of the beliefs involved in the putative immediate knowledge.
Abstract: Immediate knowledge is here construed as true belief that does not owe its status as knowledge to support by other knowledge (or justified belief) of the same subject. The bulk of the paper is devoted to a criticism of attempts to show the impossibility of immediate knowledge. I concentrate on attempts by Wilfrid Sellars and Laurence Bonjour to show that putative immediate knowledge really depends on higher-level knowledge or justified belief about the status of the beliefs involved in the putative immediate knowledge. It is concluded that their arguments are lacking in cogency.

73 citations




Journal ArticleDOI
01 Jan 1983-Synthese
TL;DR: In this article, scenes in Barwise and Perry's situation semantics are treated as partial submodels of the model that represents the actual world, on the assumption that Barwise's semantics stays as close to conventional model-theoretic semantics as his texts allow.
Abstract: of the form 'NP V's S', where V is a perception verb and S is a sentence with the main verb(s) in the naked infinitive form, i.e., in the infinitive form without 'to'. The idea of situation semantics as applied to NI sentences1 is that the embedded sentences (e.g., 'Mary enter(s)' in (1)) are to be evaluated with respect to a partial model which represents something like the totality of what the perceiver perceives. This model, called a "scene" by Barwise, provides a truth value only for some sentences: intuitively, those whose truth or falsity is determined by what the perceiver perceives. (What Barwise actually says is "If our semantics represents the world as some kind of model M, then the scenes and other situations will be something like partial submodels of M" (p. 393).) There is nothing in Barwise (1981) or in Barwise and Perry (1981) to indicate that scenes are not isomorphic to partial models, so I will speak from now on as if scenes are partial submodels of the model that represents the actual world. More generally, I will tend to assume that Barwise's semantics approaches as nearly to ordinary model-theoretic semantics as is consistent with Barwise (1981) and Barwise and Perry (1981). Occasionally I will comment parenthetically on the ways in which situation semantics may diverge from more familiar theories, the general trend of these comments being that

43 citations


Journal ArticleDOI
Ernest Lepore1
01 Feb 1983-Synthese
TL;DR: The aim of this discussion is to reconstruct the case against SS by demonstrating that the concept of truth is central to semantics and that a theory which issues in truth-conditions for the sentences of a language L must be the heart of a semantic theory for L.
Abstract: (among others) all argue that Structural Semantic theories (hereafter, SS) do not articulate relations between expressions and the world, that they do not provide an account of the conditions under which sentences are true, and that therefore these theories are not really semantics. In their place, many philosophers and linguists endorse model-theoretic semantics (hereafter, MTS). They do so because they believe that MTS compensates for what is deficient in SS. My aim in this discussion is to reconstruct the case against SS by demonstrating that the concept of truth is central to semantics and that a theory which issues in truth-conditions for sentences of a language L must be the heart of a semantic theory for L. But I will also argue that MTS theories by themselves, somewhat surprisingly, are inadequate in exactly the same way as SS theories. If I am correct, then the widespread view that MTS can provide either a theory of meaning or a theory of truth-conditions for the sentences of a natural language is mistaken. SS theorists countenance properties and relations like synonymy, antonymy, meaningfulness, meaninglessness or semantic anomaly, redundancy, and ambiguity as a good initial conception of the range of semantics. They do so because, for them, a semantic theory for a language L is a theory of meaning for L and they believe that properties and relations like these are central to our concept of meaning. Therefore, any theory which did not bear on all, or at least many, of these phenomena should be suspect as a semantic theory [8, 14].

40 citations


Journal ArticleDOI
Graeme Forbes1
01 Feb 1983-Synthese
TL;DR: In this paper, two versions of a single modal puzzle, which deserve to be called paradoxes, are presented, along with apparatus in terms of which the apparently conflicting principles that generate them can be rendered consistent.
Abstract: 0. This paper is about two puzzles, or two versions of a single puzzle, which deserve to be called paradoxes, and develops some apparatus in terms of which the apparently conflicting principles which generate the puzzles can be rendered consistent. However, the apparatus itself is somewhat controversial: the puzzles are modal ones, and the resolution to be advocated requires the adoption of a counterpart-theoretic semantics of essentially the kind proposed by David Lewis, which in turn requires qualified rejection of certain modal theses about identity which are valid in S5. Of these, we will label the strongest 'the necessity of identity' and write it as

37 citations



Journal ArticleDOI
01 Mar 1983-Synthese
TL;DR: In this article, the question whether and in what way languages and language use involve convention is addressed, with special reference to David Lewis's account of convention in general, and data are presented which show that Lewis has not captured the sense of 'convention' involved when we speak of adopting a linguistic convention.
Abstract: The question whether and in what way languages and language use involve convention is addressed, with special reference to David Lewis's account of convention in general. Data are presented which show that Lewis has not captured the sense of 'convention' involved when we speak of adopting a linguistic convention. He has, in effect, attempted an account of social conventions. An alternative account of social convention and an account of linguistic convention are sketched.

34 citations


Book ChapterDOI
01 Apr 1983-Synthese
TL;DR: In this paper, it is argued that beliefs, a special class of representations, have their contents limited by the sort of information the system in which they occur can pick up and process.
Abstract: By examining the general conditions in which a structure could come to represent another state of affairs, it is argued that beliefs, a special class of representations, have their contents limited by the sort of information the system in which they occur can pick up and process. If a system—measuring instrument, animal or human being—cannot process information to the effect that something is Q, it cannot represent something as Q. From this it follows (for simple, ostensively acquired concepts at least) that an organism can represent something as Q only if it has the information-processing capabilities for knowing that something is Q.

Journal ArticleDOI
01 Nov 1983-Synthese
TL;DR: In this paper, it is shown that the moral principle of reasonableness has a stringently rational justification in that to deny or violate it is to incur self-contradiction.
Abstract: Rationality and reasonableness are often sharply distinguished from one another and are even held to be in conflict. On this construal, rationality consists in means-end calculation of the most efficient means to one's ends (which are usually taken to be self-interested), while reasonableness consists in equitableness whereby one respects the rights of other persons as well as oneself. To deal with this conflict, it is noted that both rationality and reasonableness are based on reason, which is analyzed as the power of attaining truth, and especially necessary truth. It is then shown that, by the rationality involved in reason, the moral principle of reasonableness, the Principle of Generic Consistency (PGC), has a stringently rational justification in that to deny or violate it is to incur self-contradiction. Objections are considered bearing on relevance and motivation. It is concluded that, where reasonableness and egoistic rationality conflict, the former is rationally superior.


Journal ArticleDOI
01 Nov 1983-Synthese
TL;DR: This paper reviews studies based on Wason's "4-card" selection task and argues that the question whether scientists are rational should be approached by philosophers and psychologists with appropriate respect for the complexities of the issue.
Abstract: Recent advances in the cognitive psychology of inference have been of great interest to philosophers of science. The present paper reviews one such area, namely studies based upon Wason's “4-card” selection task. It is argued that interpretation of the results of the experiments is complex, because a variety of inference strategies may be used by subjects to select evidence needed to confirm or disconfirm a hypothesis. Empirical evidence suggests that which strategy is used depends in part on the semantic, syntactic, and pragmatic context of the inference problem at hand. Since the factors of importance are also present in real-world science, and similarly complicate its interpretation, the selection task, though it does not present a “quick fix”, represents a kind of microcosm of great utility for the understanding of science. Several studies which have examined selection strategies in more complex problem-solving environments are also reviewed, in an attempt to determine the limits of generalizability of the simpler selection tasks. Certain interpretational misuses of laboratory research are described, and a claim made that the issue of whether or not scientists are rational should be approached by philosophers and psychologists with appropriate respect for the complexities of the issue.
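The logical structure of the "4-card" task the paper builds on can be made concrete with a small sketch. The classic version (vowel/even-number rule) is assumed here for illustration; the card faces and helper name are mine, not the paper's:

```python
# Wason's 4-card selection task: which cards must be turned over to test
# the rule "if a card shows a vowel, its other side shows an even number"?
VOWELS = set("AEIOU")

def could_falsify(face):
    """A visible face can hide a counterexample only if it shows a vowel
    (the hidden number might be odd) or an odd number (the hidden letter
    might be a vowel). Even numbers and consonants cannot falsify the rule."""
    if face.isalpha():
        return face.upper() in VOWELS
    return int(face) % 2 == 1

cards = ["A", "K", "4", "7"]
to_turn = [c for c in cards if could_falsify(c)]
print(to_turn)  # the logically required selections: A and 7
```

Most subjects pick A and 4 rather than the logically correct A and 7, which is why the task is such a rich probe of confirmation versus disconfirmation strategies.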

Journal ArticleDOI
Deborah G. Mayo1
01 Dec 1983-Synthese
TL;DR: The crux of the argument is that by being able to objectively control error frequencies, NPT* is able to objectively evaluate what has or has not been learned from the result of a statistical test, and so provides an objective theory of statistics.
Abstract: Theories of statistical testing may be seen as attempts to provide systematic means for evaluating scientific conjectures on the basis of incomplete or inaccurate observational data. The Neyman-Pearson Theory of Testing (NPT) has purported to provide an objective means for testing statistical hypotheses corresponding to scientific claims. Despite their widespread use in science, methods of NPT have themselves been accused of failing to be objective; and the purported objectivity of scientific claims based upon NPT has been called into question. The purpose of this paper is first to clarify this question by examining the conceptions of (I) the function served by NPT in science, and (II) the requirements of an objective theory of statistics upon which attacks on NPT's objectivity are based. Our grounds for rejecting these conceptions suggest altered conceptions of (I) and (II) that might avoid such attacks. Second, we propose a reformulation of NPT, denoted by NPT*, based on these altered conceptions, and argue that it provides an objective theory of statistics. The crux of our argument is that by being able to objectively control error frequencies NPT* is able to objectively evaluate what has or has not been learned from the result of a statistical test.
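The "error frequencies" that NPT controls can be illustrated with a minimal sketch: a one-sided test of H0: mu = 0 against H1: mu = 1 for a normal mean with known sigma. The cutoff and sample size here are my own illustrative choices, not values from the paper:

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def error_rates(c, n, mu0=0.0, mu1=1.0, sigma=1.0):
    """Type I and Type II error frequencies for the test that rejects
    H0: mu = mu0 in favor of H1: mu = mu1 when the sample mean exceeds c."""
    se = sigma / math.sqrt(n)
    alpha = 1.0 - norm_cdf((c - mu0) / se)   # P(reject | H0 true)
    beta = norm_cdf((c - mu1) / se)          # P(accept | H1 true)
    return alpha, beta

alpha, beta = error_rates(c=0.52, n=10)
# with n = 10 and cutoff 0.52, alpha is roughly 0.05 and beta roughly 0.065
```

Fixing alpha in advance and choosing n and c to drive beta down is exactly the sort of objective control of long-run error frequencies on which the paper's argument turns.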

Journal ArticleDOI
01 Mar 1983-Synthese
TL;DR: Aristotle tells us more than once that 'to be' is said in many ways, whatever that means. In the course of considering what it might mean, Owen (1960) noted that, while some people held that 'being' has a single meaning in all its applications, Aristotle was one of those who denied this.
Abstract: Aristotle tells us more than once that ‘to be’ is said in many ways, whatever that means. I had better say straight off that I can find very little in the present paper that tells us what that means. But in the course of considering what it might mean, Owen, a long time ago (1960), told us that, while some people held that ‘being’ has “a single meaning” in all its applications, Aristotle was one of those who denied this. In his view, to be was to be something or other...1

Journal ArticleDOI
01 Jul 1983-Synthese
TL;DR: A basic framework for handling theories and for analyzing a number of key metatheoretical concepts is considered in broad outline; it encourages the application of model-theoretic concepts and results to questions concerning the structure and dynamics of empirical theories.
Abstract: The objective of the present paper is to introduce a new approach to the study of the logical structure of scientific theories. We shall consider, in broad outline, a basic framework for handling theories and for analyzing a number of key metatheoretical concepts. The framework itself may be formally captured within a suitable system of set theory. However, its major innovative and heuristic thrust derives from some notions employed in modern abstract logic; in particular, it encourages the application of model-theoretic concepts and results to questions concerning the structure and dynamics of empirical theories. The viewpoint adopted here shares some features in common with previous attempts at framing metascientific concepts, especially with the 'logistic' or model-theoretic, and with the so-called structuralist approaches. Like the former, it stresses the use of formal semantics in metascience; with the latter, it also appeals to some of the more abstract and structural characteristics of theories. However, it differs from these and other traditional approaches, notably in the role in which logic is portrayed. The nature and function of logic in scientific theorizing is here construed in an unusually liberal fashion: it is neither seen as a constraint on the syntactical form of theories, nor viewed as imposing limitations on the kinds of inferences permitted in science. This flexible and versatile conception of logic will be seen to amount to a somewhat radical departure from traditional usage. It also constitutes what the authors believe to be the clearest grounds for upholding the present framework. It should be mentioned at the outset that no new formal results are brought in this paper;1 nor do we pretend to offer a complete and polished semantical machinery whose fine details are fully developed at each point.
Instead, we shall attempt to present something of the flavour of this framework with a minimum of technical fuss.2 We shall also try to provide some arguments in its support, and to indicate how it may lead to some fresh insights on a range of familiar, and sometimes controversial,

Journal ArticleDOI
01 May 1983-Synthese
TL;DR: The authors argue that a precise formulation of this new relevant-alternatives approach reveals it to be inadequate as a solution to skepticism.
Abstract: Traditionally, skeptics as well as their opponents have agreed that in order to know that p one must be able, by some preferred means, to rule out all the alternatives to p. Recently, however, some philosophers have attempted to avert skepticism not (merely) by weakening the preferred means but rather by articulating a subset of the alternatives to p — the so-called relevant alternatives — and insisting that knowledge that p requires only that we be able (by the preferred means) to rule out members of the set. In this paper I argue that a precise formulation of this new approach reveals it inadequate as a solution to skepticism.

Journal ArticleDOI
01 Aug 1983-Synthese

Journal ArticleDOI
01 Dec 1983-Synthese
TL;DR: Weighted averaging is a method for aggregating the totality of information, both regimented and unregimented, possessed by an individual or group of individuals. The method is justified by Wagner's Theorem, which shows that any method satisfying the conditions of the Irrelevance of Alternatives and Zero Unanimity must, when applied to three or more alternatives, be weighted averaging.
Abstract: Weighted averaging is a method for aggregating the totality of information, both regimented and unregimented, possessed by an individual or group of individuals. The application of such a method may be warranted by a theorem of the calculus of probability, simple conditionalization, or Jeffrey's formula for probability kinematics, all of which average in terms of the prior probability of evidence statements. Weighted averaging may, however, be applied as a method of rational aggregation of the probabilities of diverse perspectives or persons in cases in which the weights cannot be articulated as the prior probabilities of statements of evidence. The method is justified by Wagner's Theorem exhibiting that any method satisfying the conditions of the Irrelevance of Alternatives and Zero Unanimity must, when applied to three or more alternatives, be weighted averaging.
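The aggregation the abstract describes can be sketched concretely. This is an illustrative pooling of several perspectives' probability assignments under assumed weights; the alternatives, weights, and function name are mine, not the paper's:

```python
# Weighted averaging (linear pooling) of several experts' probability
# assignments over the same set of alternatives. Weights are assumed
# nonnegative and to sum to 1; as the paper stresses, they need not be
# prior probabilities of evidence statements.
def aggregate(assignments, weights):
    """assignments: list of dicts mapping each alternative to a probability."""
    alternatives = assignments[0].keys()
    return {a: sum(w * p[a] for w, p in zip(weights, assignments))
            for a in alternatives}

experts = [
    {"rain": 0.7, "snow": 0.2, "clear": 0.1},
    {"rain": 0.4, "snow": 0.4, "clear": 0.2},
    {"rain": 0.1, "snow": 0.3, "clear": 0.6},
]
pooled = aggregate(experts, weights=[0.5, 0.3, 0.2])
# pooled is again a probability assignment: its values sum to 1,
# e.g. pooled["rain"] = 0.5*0.7 + 0.3*0.4 + 0.2*0.1 = 0.49
```

Because each pooled value is a convex combination of probabilities, the result is automatically a probability assignment, which is part of what makes weighted averaging attractive as an aggregation rule.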

Journal ArticleDOI
01 Sep 1983-Synthese
TL;DR: The pedagogical turn has two forms: one asks how a concept or term would be learned, the other how it might be taught; neither is offered as a psychological doctrine about how or what people learn.
Abstract: My concern is with Wittgenstein's "pedagogical turn", the move he makes when he asks, "How would one learn or teach this concept or word?" I shall pay some attention to a hierarchy of modes of teaching that can be discovered in Wittgenstein's later work in order to move on into a consideration of some points about the acquisition of "world pictures" (Weltbilder) in On Certainty. I shall conclude with some comments about a problem in the ethics of education, indoctrination, for On Certainty highlights this problem in an interesting and unusual way. The pedagogical turn has two forms: one asks how a concept or term would be learned, the other how it might be taught. The first of these, as I have tried to show in another context (Macmillan 1981), is usually limited to showing either how complex a particular concept is,1 or else that a particular form of words could not be learned because of its confusedness.2 What is not attempted by Wittgenstein with the learning form is a psychological doctrine about how or what people learn; rather, it is to be seen as part of his larger attack on the nature of language-games and concepts. The fact of learning is a given in his discussion rather than the center of attention.3 Nor does Wittgenstein hold that learning and teaching are the same. This is as it should be, since learning and teaching are quite different concepts. While a question, "Who taught you that?" might be asked in any case where a learning claim has been made, its relevance can be rejected easily by appeals to self-teaching or to experience. As Wittgenstein points out (PI II, 227), there are some things (in this case, expert judgment about the genuineness of expressions of feeling) that are learned but for which teaching is either irrelevant or takes unusual forms.

Journal ArticleDOI
01 Dec 1983-Synthese
TL;DR: In this paper, it is argued that we need a richer version of Bayesian decision theory, admitting both subjective and objective probabilities and providing rational criteria for choice of our prior probabilities.
Abstract: It is argued that we need a richer version of Bayesian decision theory, admitting both subjective and objective probabilities and providing rational criteria for choice of our prior probabilities. We also need a theory of tentative acceptance of empirical hypotheses. There is a discussion of subjective and of objective probabilities and of the relationship between them, as well as a discussion of the criteria used in choosing our prior probabilities, such as the principles of indifference and of maximum entropy, and the simplicity ranking of alternative hypotheses.

Journal ArticleDOI
Ernest Sosa1
01 Apr 1983-Synthese
TL;DR: In this paper, the nature of epistemic justification and its supervenience are discussed, together with their bearing on knowledge.
Abstract: A. Knowledge and Justification: The nature of epistemic justification and its supervenience.

Journal ArticleDOI
01 Nov 1983-Synthese
TL;DR: It is argued that an adequate understanding of the rationality of an agent's actions is not possible without a satisfactory theory of the agent's memory and of the trade-offs involved in management of the memory, particularly involving “compartmentalization” of the belief set.
Abstract: A tacit and highly idealized model of the agent's memory is presupposed in philosophy. The main features of a more psychologically realistic duplex (or n-plex) model are sketched here. It is argued that an adequate understanding of the rationality of an agent's actions is not possible without a satisfactory theory of the agent's memory and of the trade-offs involved in management of the memory, particularly involving “compartmentalization” of the belief set. The discussion identifies some basic constraints on the organization of knowledge representations in general.

Journal ArticleDOI
01 Apr 1983-Synthese
TL;DR: In this paper, the authors examine modest foundationalism in relation to some important criteria of epistemic dependence, including causal and epistemic dependences, and distinguish four kinds of reasons: reasons to believe, reasons one has for believing, reasons for which one believes, and reasons why one believes.
Abstract: This paper is an examination of modest foundationalism in relation to some important criteria of epistemic dependence. The paper distinguishes between causal and epistemic dependence and indicates how each might be related to reasons. Four kinds of reasons are also distinguished: reasons to believe, reasons one has for believing, reasons for which one believes, and reasons why one believes. In the light of all these distinctions, epistemic dependence is contrasted with defeasibility, and it is argued that modest foundationalism is not committed to criteria of epistemic dependence on which foundational beliefs are indefeasible. Modest foundationalism is contrasted with coherentism and is shown to be hospitable to a causal criterion of epistemic dependence, compatible with reliabilism, and neutral with respect to skepticism.

Journal ArticleDOI
01 Aug 1983-Synthese

Journal ArticleDOI
01 Dec 1983-Synthese
TL;DR: In this article, the authors argue that philosophical theories of objective probability have failed to satisfy a methodological standard, a requirement to the effect that the conception offered be specified with the precision appropriate for a physical interpretation of an abstract formal calculus and be fully explicated in terms of concepts, objects or phenomena understood independently of the idea of physical probability.
Abstract: I argue that to the extent to which philosophical theories of objective probability have offered theoretically adequate conceptions of objective probability (in connection with such desiderata as causal and explanatory significance, applicability to single cases, etc.), they have failed to satisfy a methodological standard — roughly, a requirement to the effect that the conception offered be specified with the precision appropriate for a physical interpretation of an abstract formal calculus and be fully explicated in terms of concepts, objects or phenomena understood independently of the idea of physical probability. The significance of this, and of the suggested methodological standard, is then briefly discussed.

Journal ArticleDOI
01 Jan 1983-Synthese
TL;DR: In this paper, James Gibson's specificity theory, on which perceptual intentionality is based on the existence of information specifying its sources in the environment, is contrasted with the classic representational account.
Abstract: For the purposes of this paper the "intentionality of perceiving" is considered to be the ability of a perceiving organism to apprehend objects in the surrounding environment. Whether one studies the subjective (human) experience of perceiving or the objective facts of perceptually guided behavior, behavior as adaptively adjusted to the environment, the importance of perceptual intentionality is obvious (see Koffka, 1935, and Merleau-Ponty, 1947/1962, for phenomenology; Gibson, 1979, and Holt, 1915, for behavior). In this essay I will contrast a startling and revolutionary new theory of how perception can be intentional with the classic account. The classic account is the theory that perceptual intentionality always reduces to the having of a mental representation (Aquila, 1977; Fodor & Pylyshyn, 1981).1 The novel theory is that of James Gibson, for whom intentionality is based on the existence of information specifying its sources in the environment (Gibson, 1979; Turvey, Shaw, Reed, & Mace, 1981). For convenience I will speak of either the representational or the specificity theories of why perceiving is intentional.

Book ChapterDOI
01 Mar 1983-Synthese
TL;DR: An approach to logical and linguistic semantics is sketched which embodies some of the ideas behind Wittgenstein's notion of a language-game; certain linking activities are construed as games in the strict sense of the mathematical theory of games, and the semantics based on them is called game-theoretical semantics.
Abstract: In earlier papers, I have sketched an approach to logical and linguistic semantics which embodies some of the same ideas on which Wittgenstein's notion of language-game is based.1 One of these ideas is that in order to appreciate the semantics of a word (or any other primitive expression of a language) we should study its function in the rule-governed human activities which serve to connect our language (or a fragment of a language) with the world. What Wittgenstein called language-games can typically be considered as such linking activities. In the languages (or parts of languages) I will study in this paper, certain activities of this kind are construed as games in the strict sense of the mathematical theory of games. They are called semantical games, and the semantics based on them is called game-theoretical semantics. Its basic ideas are explained most easily by reference to formal but interpreted first-order languages. Such a language, say L, can be assumed to have a finite number of primitive predicates which are interpreted on some given fixed domain D. Their being interpreted on D amounts to saying that any atomic sentence formed from one of the predicates of L plus the appropriate number of proper names of the elements of D (whether the names are in L or not) has a definite truth-value, true or false. One of the main tasks of any semantics for first-order formal languages is to extend this assignment of truth-values to the rest of the sentences of L. This can be done by defining certain two-person games G(S), one for each sentence S of L. The players are called Myself (or I) and Nature. The game G(S) can be thought of as an attempt on the part of myself to verify S against the schemes of a recalcitrant Nature. This motivates the game rules, which may be formulated as follows:
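The game rules being described can be sketched computationally for a finite domain: Myself chooses at disjunctions and existential quantifiers, Nature chooses at conjunctions and universal quantifiers, and negation makes the players exchange roles; S is true iff Myself has a winning strategy in G(S). This is a toy reconstruction under those assumptions (the tuple encoding and names are mine, not Hintikka's):

```python
# A toy version of the semantical games G(S) over a finite domain.
# Sentences are nested tuples: ("atom", pred, term), ("not", S),
# ("and", S1, S2), ("or", S1, S2), ("all", var, S), ("exists", var, S).
def myself_wins(s, domain, interp, env=None, myself=True):
    """True iff Myself has a winning strategy in the game G(s)."""
    env = env or {}
    op = s[0]
    if op == "atom":
        _, pred, term = s
        holds = env.get(term, term) in interp[pred]
        # Myself wins a true atom; after a role swap, a false one.
        return holds if myself else not holds
    if op == "not":
        # Negation rule: the two players exchange roles.
        return myself_wins(s[1], domain, interp, env, not myself)
    if op in ("or", "and"):
        # Myself chooses a disjunct; Nature chooses a conjunct.
        chooser_is_myself = (op == "or") == myself
        results = (myself_wins(sub, domain, interp, env, myself)
                   for sub in s[1:])
        return any(results) if chooser_is_myself else all(results)
    if op in ("exists", "all"):
        # Myself instantiates existentials; Nature instantiates universals.
        _, var, body = s
        chooser_is_myself = (op == "exists") == myself
        results = (myself_wins(body, domain, interp, {**env, var: d}, myself)
                   for d in domain)
        return any(results) if chooser_is_myself else all(results)
    raise ValueError(f"unknown operator {op!r}")

domain = {1, 2, 3}
interp = {"Even": {2}}  # 'Even' holds of 2 only
print(myself_wins(("exists", "x", ("atom", "Even", "x")), domain, interp))
```

On this encoding, "some element is even" is a win for Myself (choose 2), while "every element is even" is a win for Nature (choose 1 or 3), matching the intended truth-values.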

Journal ArticleDOI
01 Oct 1983-Synthese
TL;DR: It is shown that the causal inference rules which link correlation, a kind of partial correlation, and a conception of causation are invalid, and that a new methodology is required for causal inference.
Abstract: Two kinds of causal inference rules which are widely used by social scientists are investigated. Two conceptions of causation also widely used are explicated — the INUS and probabilistic conceptions of causation. It is shown that the causal inference rules which link correlation, a kind of partial correlation, and a conception of causation are invalid. It is concluded that a new methodology is required for causal inference.
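The kind of failure at issue can be illustrated with a standard common-cause simulation (my own example, not from the paper): a variable Z drives both X and Y, so X and Y are strongly correlated even though neither causes the other, and a rule inferring causation from correlation misfires.

```python
import random

random.seed(0)

# Common-cause structure: Z -> X and Z -> Y, with no arrow between X and Y.
n = 10_000
xs, ys = [], []
for _ in range(n):
    z = random.gauss(0, 1)
    xs.append(z + random.gauss(0, 0.5))  # X = Z + noise
    ys.append(z + random.gauss(0, 0.5))  # Y = Z + noise

def corr(a, b):
    """Pearson correlation coefficient."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / (va * vb) ** 0.5

r = corr(xs, ys)
# r comes out large (around 0.8) despite the absence of any causal
# connection between X and Y; conditioning on Z would remove it
```

Partialling out Z eliminates the association here, which is why the rules under attack appeal to partial correlation; the paper's point is that even those refined rules are invalid in general.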