
Showing papers in "Synthese in 1966"


Book ChapterDOI
01 Dec 1966-Synthese
TL;DR: In this paper, an analysis of grading principles from the viewpoint of statistical decision theory and game theory is presented, with a focus on the problem of devising any general ethical rules of behavior for simple two-person games.
Abstract: The present paper offers an analysis of grading principles from the viewpoint of statistical decision theory and game theory. The mistaken notion is widely held that the plain man is really clear about practical ethical and moral issues and that philosophers need only tidy up certain wayward corners of the subject. Personally, I find difficult even the problem of devising any general ethical rules of behavior for simple two-person games; the ethical complexities of progressive taxation, tariff barriers, or the treatment of sexual psychopaths are beyond any exact conceptual analysis. That decisions are and must be made about these issues no more proves that their ethical aspects are completely understood than the fact that the Romans built bridges proves that they had any quantitative grasp of the mechanical theory of stress.

194 citations



Journal ArticleDOI
01 Dec 1966-Synthese
TL;DR: The immediate basis for a special theory of behavior under uncertainty is the subjective sensation that an action may not uniquely determine the price-income parameters defining the set of possible actions.

74 citations


Journal ArticleDOI
01 Nov 1966-Synthese
TL;DR: A new orchestration of this material is presented in this article, with a perspective on measurement theory which has greater breadth, solidity, and extrapolative thrust than has been attained previously.
Abstract: Among the foundational issues of scientific methodology, the theory of measurement enjoys a notable distinction: namely, attention. For while most conceptual procedures in technical science still remain appallingly underexamined by serious metascience, a voluminous literature has formed around the topics of measurement and scaling, especially in the behavioral sciences of the past two decades. Moreover, the swelling chorus of these contributions has achieved a harmony which increasingly approaches unison, its composition being Campbell's [2] classic theme embellished by modern set-theoretical notions of a formal representation system (e.g., Suppes & Zinnes [17]), together with secondary motifs from Stevens' [14] theory of scale types and a still unfinished coda on 'conjoint measurement' (Luce & Tukey [9]). It is no intent of mine to suggest that anything is basically amiss with this development. Quite the opposite: what has happened in measurement theory is convincing evidence that powerful advances in scientific metatheory are possible when rigorous thinkers are willing to put some intellectual muscle into the enterprise. Even so, the tonal balance of current measurement theory does not ring true to my ear. Importantly distinct melodic lines have become fused where they should be played in counterpoint. The theory of scale types has effected a strange inversion of the 'meaningfulness' air which I find teeth-grittingly discordant. And certain fundamental tones which I would score forte are at present scarcely audible. What I shall here attempt, therefore, is a new orchestration of this material. At no one place will my version differ radically from standard doctrine on these matters; but through an accretion of differences in emphasis and phrasing I hope to convey a perspective on measurement theory which has greater breadth, solidity, and extrapolative thrust than has been attained previously.
Actually, my primary concern will be with scaling, not measurement. For I shall argue that 'measurement' in the tough sense of the word must be distinguished from scaling, and that very little of the literature on

70 citations


Journal ArticleDOI
01 Nov 1966-Synthese

51 citations


Journal ArticleDOI
01 Dec 1966-Synthese
TL;DR: A moral or deontic interpretation of the calculus of intrinsic preferability is proposed which, it is believed, enables us to solve the problem of supererogation.
Abstract: We first summarize and comment upon a 'calculus of intrinsic preferability' which we have presented in detail elsewhere. Then we set forth 'the problem of supererogation', a problem which, according to some, has presented difficulties for deontic logic. And, finally, we propose a moral or deontic interpretation of the calculus of intrinsic preferability which, we believe, enables us to solve the problem of supererogation.

46 citations


Journal Article
01 Jan 1966-Synthese

24 citations


Journal ArticleDOI
01 Dec 1966-Synthese
TL;DR: An approach to model building is presented under the title of a black-box theory of learning, which permits the statement of assumptions of any desired complexity in a language which clearly exhibits their theoretical content.
Abstract: It is argued that current attempts to model human learning behavior commonly fail on one of two counts: either the model assumptions are artificially restricted so as to permit the application of mathematical techniques in deriving their consequences, or else the required complex assumptions are imbedded in computer programs whose technical details obscure the theoretical content of the model. The first failing is characteristic of so-called mathematical models of learning, while the second is characteristic of computer simulation models. An approach to model building which avoids both these failings is presented under the title of a black-box theory of learning. This method permits the statement of assumptions of any desired complexity in a language which clearly exhibits their theoretical content.

14 citations


Journal ArticleDOI
01 Dec 1966-Synthese
TL;DR: The principal task of this paper is to point out some questions which arise in attempting to deal, within Richard Jeffrey's theory, with decision problems about strategies for using information.
Abstract: The theory of subjective probability and utility recently proposed by Professor Richard Jeffrey [1], [2], [3] has several unique features and appears to be in some ways distinctly more satisfactory than earlier theories. There is, however, one very important class of decision problems which is not discussed by Professor Jeffrey: problems concerned with decisions about strategies for using information. The principal task of this paper is to point out some questions which arise in attempting to deal with these decision problems within Jeffrey's theory. To do this it will be necessary to examine in some detail Jeffrey's discussion of the relation between his theory and those of Ramsey and Savage [4], [5]. There is some obscurity in this discussion, which must be cleared up before the problem of policy decisions (or strategies) can be discussed. The theory that Jeffrey proposes is unique in at least two respects: (a) Probabilities and utilities (desirabilities in Jeffrey's terminology) are assigned to the same entities: propositions. (b) Only truth-functional methods of compounding propositions are employed. In this theory two related measures, P and D, defined on a Boolean algebra of propositions (excluding the null element F), are considered which satisfy the following axioms. For all propositions x and y, not identical with F:
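The abstract is cut off before the axioms are stated. For orientation, one axiom of Jeffrey's system is the desirability-averaging law for mutually exclusive propositions: D(x ∨ y) = (P(x)·D(x) + P(y)·D(y)) / (P(x) + P(y)) when x ∧ y = F. A minimal Python sketch, with numbers that are illustrative rather than taken from the paper:

```python
# Jeffrey assigns probability P and desirability D to the same propositions.
# For mutually exclusive x and y (x & y = F), desirability averages:
#     D(x v y) = (P(x)*D(x) + P(y)*D(y)) / (P(x) + P(y))

def desirability_of_disjunction(p_x, d_x, p_y, d_y):
    """D(x v y) for mutually exclusive propositions x, y."""
    return (p_x * d_x + p_y * d_y) / (p_x + p_y)

# Illustrative numbers only: P(x) = 0.2, D(x) = 10, P(y) = 0.3, D(y) = 0.
print(desirability_of_disjunction(0.2, 10.0, 0.3, 0.0))  # 4.0
```

The point of the axiom is visible in the formula: the desirability of a disjunction is not free, but is pinned down by the probabilities and desirabilities of its incompatible disjuncts.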

8 citations


Journal ArticleDOI
01 Dec 1966-Synthese
TL;DR: In this paper, the authors suggest that the actual extent of the difficulties that lie in wait for a measure of cardinal utilities may have been exaggerated, and suggest an approach, or a point of view, from which the measurement of utilities can be placed upon a firm footing.
Abstract: One of the principal objections to the application of the classical concepts and machinery of utilitarianism turns on various caveats, some urged on philosophical and some on economic grounds, regarding the idea of a measure of cardinal utilities. This project, it is argued, encounters insuperable difficulties. Individual preferences cannot be compared interpersonally: we cannot get an intersubjective measure of X's preference for oranges over apples in comparison to Y's preference for peaches over apricots. Considerations of this sort led economists to tend to abandon numerical utilities and to try to make do with comparative preferences. My aim here is to suggest that the actual extent of the difficulties that lie in wait for a measure of cardinal utilities may have been exaggerated, and to propose an approach, or a point of view, from which the measurement of utilities can be placed upon a firm footing. The line of thought to be developed will be presented in two stages: the metrization of an individual's preferences, and the interpersonalization of metrized individual preferences. Consider the preferability-comparisons that a person X may make within a set of three items, a, b, c (which may, at this stage of the discussion, be thought of as either particular commodities or as states of
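The paper's own two-stage construction is not reproduced in the abstract. For orientation only, the standard von Neumann–Morgenstern route to metrizing an individual's preferences elicits, for each intermediate item, the lottery probability at which the agent is indifferent. The sketch below is that textbook construction, not necessarily the author's procedure, and the agent's numbers are hypothetical:

```python
# Metrizing one person's preferences over items ranked a > b > c.
# Standard vNM construction (offered as orientation, not the paper's method).

def metrize(items, indifference_prob):
    """Assign cardinal utilities on [0, 1] to a preference-ordered list.

    items: ordered from most to least preferred.
    indifference_prob(x): the probability p at which the agent is
    indifferent between x for certain and a lottery paying the best
    item with probability p and the worst with probability 1 - p.
    """
    best, worst = items[0], items[-1]
    utilities = {best: 1.0, worst: 0.0}
    for x in items[1:-1]:
        utilities[x] = indifference_prob(x)  # u(x) = p, by vNM linearity
    return utilities

# Hypothetical agent: indifferent between b and a 70/30 gamble on a vs. c.
u = metrize(['a', 'b', 'c'], lambda x: {'b': 0.7}[x])
print(u)  # {'a': 1.0, 'c': 0.0, 'b': 0.7}
```

This handles the first stage (metrization); the second stage, interpersonal comparison, is precisely where the classical objections bite, and is the part the paper addresses.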

7 citations


Journal ArticleDOI
01 Nov 1966-Synthese
TL;DR: A review of four books on probability, confirmation and induction, among them Törnebohm's Information and Confirmation, which examines measures of evidential support applied to arguments between finite samples and finite populations, and I. J. Good's The Estimation of Probabilities, which aims to bridge the gap between the philosopher and the practical statistician.
Abstract: The recent publication of these four books is testimony to the fact that serious philosophical studies on probability, confirmation and induction are tending more and more to steer clear of Hume's problem and to concentrate on developing systematic accounts of criteria for sound inductive inferences and decision making under uncertainty. In Information and Confirmation, Håkan Törnebohm examines the properties and interrelations of several proposed measures of evidential support or confirmation, especially when they are applied to arguments from finite samples to finite populations and from finite populations to finite samples. The measures selected by Törnebohm for study include range measures of probability, likelihoods and measures of semantic information. One of the objectives of I. J. Good's book, The Estimation of Probabilities, is "to bridge the gap between the philosopher and the practical statistician". Whether he has succeeded or not, he has written a book which every philosopher interested in the foundations of statistics will find useful. The central problem tackled by Good is that of determining prior probability distributions for use in deriving posterior probabilities with the aid of likelihoods via Bayes' theorem. In examining this question, Good does not find it necessary to commit himself to one specific interpretation of the calculus of probabilities. Although he offers a valuable catalogue of interpretations and has much to say of interest on this score, he attempts to develop his discussion in a manner which is as neutral as possible regarding interpretation.
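The machinery at the center of Good's problem, deriving posterior probabilities from priors and likelihoods via Bayes' theorem, reduces in the discrete case to a one-line normalization. A minimal sketch with illustrative numbers (the hypotheses and values are hypothetical, not from the book):

```python
# Discrete Bayes update: posterior is proportional to prior times likelihood.

def posterior(prior, likelihood):
    """prior: {hypothesis: P(h)}; likelihood: {hypothesis: P(data | h)}."""
    unnormalized = {h: prior[h] * likelihood[h] for h in prior}
    z = sum(unnormalized.values())  # normalizing constant P(data)
    return {h: p / z for h, p in unnormalized.items()}

# Two hypotheses about a coin, then one observed head.
prior = {'fair': 0.5, 'biased': 0.5}
likelihood = {'fair': 0.5, 'biased': 0.9}  # P(heads | h)
print(posterior(prior, likelihood))
```

The hard part Good addresses is not this arithmetic but the choice of the `prior` dictionary itself, which the update mechanically presupposes.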


Book ChapterDOI
01 Sep 1966-Synthese
TL;DR: The semantic-tableau algorithm is interpreted in algebraic logic, where provable equivalence yields a quotient algebra; the author proposes calling the tableaux 'Beth tableaux', just as the truth-table figures transposed to the Boolean calculus are called 'Boole tableaux'.
Abstract: This paper concerns the interpretation of the 'semantic tableaux' in algebraic logic [1]. I have long been convinced that the tableau algorithm plays a role in algebraic calculi such as the Boolean calculus, the monadic Boolean calculus, and so on. The shift of my concerns from semantic tableaux [2] to algebraic logic took place quite naturally in 1957-1958, by a process traceable in my Notes aux Comptes-Rendus of 1958 [3], [4]. The reason is that the systems on which I was then working (the classical propositional calculus, as such or with von Wright's modalities M, M', M'', and the classical predicate calculus) all have one point in common (which they also share with the intuitionistic calculi): they satisfy the 'replacement theorem'. Thus ⊢ F ↔ G defines, on their sets of formulas, an equivalence relation compatible with the formation laws, so that the quotient is endowed with an algebraic structure. This structure belongs to a species of structures of this kind which includes other members, and the idea arises of applying the algorithm to the study of the identities resulting from the defining axioms of the structures of this species. From then on there is no longer any formalization of logic, and hence no models come into play; the algorithm is to be justified directly from the defining axioms. Up to notation, it introduces a calculus of generalized sequents, for which the same tableau layout can be used. That is why I prefer to give these tableaux the name 'Beth tableaux', just as one gives the name 'Boole tableaux' to the figures of the 'truth table' algorithm when they are transposed to the Boolean calculus. In fact, a new interpretation of the formulas is used.
If L is a class of structures of the species under consideration, large enough for the needs of the subsequent reasoning, and in particular to contain counter-

Book ChapterDOI
01 Sep 1966-Synthese
TL;DR: The author states, as clearly as possible and without a long technical digression, what he considers the single most powerful argument in favor of the use of a non-classical logic in quantum mechanics.
Abstract: The aim of this article is simple. I wish to state as clearly as possible, without a long digression into technical questions, what I consider to be the single most powerful argument in favor of the use of a non-classical logic in quantum mechanics. There is a very large mathematical and philosophical literature on the logic of quantum mechanics, but with a few exceptions this literature provides a very poor intuitive justification for considering a non-classical logic in the first place. The famous article of Birkhoff and von Neumann (1936) is a classic example from the mathematical literature. Although Birkhoff and von Neumann exhaustively examined the development of the properties of the projective geometries and lattice geometries connected with the logic of quantum mechanics, they devote less than a third of a page (p. 831) to the physical reasons that lead one to consider such lattices. What is more, the few lines they do devote to it are far from clear. The philosophical literature is just as bad on this subject. One of the best-known philosophical discussions of it is the one presented in the final chapter of Reichenbach's (1944) book on the foundations of quantum mechanics. Reichenbach offers a three-valued truth-functional logic which seems to have little to do with the propositions of quantum mechanics, whether experimental or theoretical in nature.
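The lattice-theoretic observation underlying the Birkhoff–von Neumann proposal, whatever its intuitive justification, is easy to exhibit: in the lattice of subspaces of a Hilbert space the distributive law fails. A small sketch over R², with an ad hoc encoding of subspaces (illustrative only, not the argument the paper itself develops):

```python
# Subspaces of R^2: 'zero', ('line', (a, b)) for a line through the origin
# with direction (a, b), or 'plane'. Meet is intersection; join is the
# span. Distributivity fails for three distinct lines.
from math import isclose

def parallel(u, v):
    return isclose(u[0] * v[1] - u[1] * v[0], 0.0, abs_tol=1e-12)

def meet(s, t):  # intersection of subspaces
    if s == 'plane': return t
    if t == 'plane': return s
    if s == 'zero' or t == 'zero': return 'zero'
    return s if parallel(s[1], t[1]) else 'zero'

def join(s, t):  # smallest subspace containing both
    if s == 'zero': return t
    if t == 'zero': return s
    if s == 'plane' or t == 'plane': return 'plane'
    return s if parallel(s[1], t[1]) else 'plane'

A = ('line', (1.0, 0.0))
B = ('line', (0.0, 1.0))
C = ('line', (1.0, 1.0))

lhs = meet(C, join(A, B))            # C meet (A join B) = C meet plane = C
rhs = join(meet(C, A), meet(C, B))   # (C meet A) join (C meet B) = zero
print(lhs, rhs)  # the two sides differ, so distributivity fails
```

In a classical (Boolean) logic of propositions the two sides would be equal; their divergence here is the formal core of the case for a non-classical logic.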

Journal ArticleDOI
01 Sep 1966-Synthese
TL;DR: Results in quantum logic show which binding conditions must be taken into account when one theory is exchanged for a better one; the functional theory of corpuscles is a significant example.
Abstract: Conclusion. Quantum logic has been studied since 1935, and during that period many results have been obtained; I have nevertheless refrained from mentioning the results obtained in recent years by logicians in this field. Some of these results are of direct interest for physical theories. In particular, one sees which binding conditions must be taken into account when one effects a change of theory in order to pass to a better theory. The example of the functional theory of corpuscles is significant.

Journal ArticleDOI
S. G. O'Hair1
01 Dec 1966-Synthese
TL;DR: In this article, the authors discuss epistemic concepts such as knowledge, rational belief and evidence, and raise a number of substantive questions in the theory of knowledge, such as what kinds of statements we know or rationally believe to be true, or whether knowledge or rational belief rest on a basis or foundation.
Abstract: 1. The Subject. What is epistemology? We might begin by noting a parallel to the distinction between meta-ethics and normative ethics. Some questions in the theory of knowledge are 'definitional' in character, concerned with the definition or analysis of epistemic concepts such as knowledge, rational belief and evidence. Others are 'substantive', concerned, for example, with what kinds of statement (sentence, proposition or whatever) we know or rationally believe to be true, or with whether knowledge or rational belief rest on a basis or foundation. But this, of course, is only a beginning. What is an epistemic concept? Not only do we lack a characterization of such concepts; there is not even a generally accepted list of them. Knowledge is one, of course, by definition. No doubt rational belief is another. Some philosophers include belief and even truth. Why? Would it not be better to say that belief is a psychological concept and truth a semantic one? Of course, knowledge is often analysed as a case of true belief. But is every concept that appears in the analysis of an epistemic concept itself epistemic? Again, many philosophers consider that the detailed investigation of more particular concepts such as those of induction, memory or perception belongs to epistemology. Once again, why? Further questions can be raised about the substantive issues. What are these issues? How are they related to one another? Are they all aspects of one or two basic issues? Unfortunately, neither of the books reviewed here helps with these questions. Scheffler, whose book is clear and lively, but very selective, virtually ignores substantive questions in epistemology. He concentrates on the question "What is knowledge?", which he rightly construes as definitional (5):

Book ChapterDOI
01 Sep 1966-Synthese
TL;DR: The first version of this bibliography appeared in Dialectica 19 (1965) 166–79; a revised version is reprinted here with the permission of the Editor of Dialectica.
Abstract: The following bibliography lists all published books and articles. Book reviews are not included. This bibliography appeared for the first time in Dialectica 19 (1965) 166–79; a revised version is reprinted here with the permission of the Editor of Dialectica. The list is based upon Beth’s own records. Thanks are due to Mrs. Beth for placing these records at my disposal, and to Mr. J. A. W. Kamp for checking the items on the list against the original books and offprints in Beth’s possession and to Mr. P. G. E. Wesly for the final revision. A complete set of these books and offprints of the articles is also available in the library of the Instituut voor Grondslagenonderzoek en Filosofie der Exacte Wetenschappen at the University of Amsterdam.

Journal ArticleDOI
01 Sep 1966-Synthese
TL;DR: Some remarks are made on the theory of vector spaces over a field F, in whose definition an apartness relation # satisfying certain axioms plays an essential part.
Abstract: Let us first make a few remarks on the theory of vector spaces over a field F. We shall denote the elements of F by the letters a, b, c, ..., and the vectors of the space V by x, y, z, u, .... It is known that the definition of F essentially involves an apartness relation # which must satisfy the following axioms:
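The abstract breaks off before the axioms are listed. Assuming the relation intended is the intuitionistic apartness relation in Heyting's sense (an assumption, since the paper's own list is not preserved here), the standard axioms read:

```latex
% Standard axioms for an apartness relation # (Heyting); an assumed
% reconstruction, since the abstract's own list is cut off.
\begin{align*}
  &\neg(x \mathrel{\#} x)                                   &&\text{(irreflexivity)}\\
  &x \mathrel{\#} y \rightarrow y \mathrel{\#} x            &&\text{(symmetry)}\\
  &x \mathrel{\#} y \rightarrow (x \mathrel{\#} z \vee y \mathrel{\#} z) &&\text{(cotransitivity)}
\end{align*}
```

Constructively, apartness is a positive primitive rather than the mere negation of equality, which is why it must be axiomatized separately.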

Journal ArticleDOI
01 Sep 1966-Synthese
TL;DR: A historical study of Poincaré's conventionalist theory of the nature of geometrical axioms, whose content and merits are still under discussion.
Abstract: The subject I propose to treat is historical in character. I fear that, for this reason, it fits poorly with the other subjects treated in this colloquium. One may remark, however, that E. W. Beth also took a keen interest in several historical questions. Moreover, Poincaré's conventionalist theory of the nature of geometrical axioms is still discussed, in Europe for example by H. Freudenthal, and in America by A. Grünbaum. There is still no agreement on the content and the merits of this theory. There is no doubt that it remains more or less topical.

Book ChapterDOI
01 Sep 1966-Synthese
TL;DR: In this paper, the attractions of the tableau-method, developed by Beth since 1955, are discussed and some possibilities for further research are indicated. But such a large aim necessitates a rather loose manner of describing the subject.
Abstract: In this paper I want to show the attractions of the tableau method, developed by Beth since 1955. Moreover, I shall indicate some possibilities for further research. Such a large aim necessitates a rather loose manner of describing the subject.
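One attraction of Beth's method is how compactly it decides classical propositional logic. Below is a toy tableau-style satisfiability test in Python; the tuple encoding and function names are ad hoc illustrations, not Beth's own formulation:

```python
# Toy tableau method for classical propositional logic.
# Formulas: ('var', name), ('not', f), ('and', f, g), ('or', f, g).
# A branch closes when it contains both p and not-p; a set of formulas
# is satisfiable iff some fully expanded branch remains open.

def satisfiable(branch):
    for i, f in enumerate(branch):
        rest = branch[:i] + branch[i + 1:]
        if f[0] == 'and':                      # alpha rule: extend the branch
            return satisfiable(rest + [f[1], f[2]])
        if f[0] == 'or':                       # beta rule: split the branch
            return satisfiable(rest + [f[1]]) or satisfiable(rest + [f[2]])
        if f[0] == 'not':
            g = f[1]
            if g[0] == 'not':                  # double negation
                return satisfiable(rest + [g[1]])
            if g[0] == 'and':                  # push negation inward
                return satisfiable(rest + [('or', ('not', g[1]), ('not', g[2]))])
            if g[0] == 'or':
                return satisfiable(rest + [('and', ('not', g[1]), ('not', g[2]))])
    # Only literals remain: the branch is open unless it contains p and ~p.
    pos = {f[1] for f in branch if f[0] == 'var'}
    neg = {f[1][1] for f in branch if f[0] == 'not'}
    return not (pos & neg)

def valid(f):
    # f is valid iff the tableau for its negation closes on every branch.
    return not satisfiable([('not', f)])

p = ('var', 'p')
print(valid(('or', p, ('not', p))))  # True: the law of excluded middle
```

The closed/open branch dichotomy is the counterpart of Beth's original presentation, in which a closed tableau for the negation of a formula constitutes a proof of the formula.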

Book ChapterDOI
01 Sep 1966-Synthese
TL;DR: The first part of this article appeared in English translation in The Theory of Models (Proceedings of the 1963 International Symposium at Berkeley, ed. by J. W. Addison, L. Henkin and A. Tarski); its definitions and principal results are recalled here.
Abstract: The first part of this article (Sections 1 to 4 inclusive) appeared in English translation in The Theory of Models (Proceedings of the 1963 International Symposium at Berkeley, ed. by J. W. Addison, L. Henkin and A. Tarski), Studies in Logic and the Foundations of Mathematics, Amsterdam 1965, 96–106. Let us recall its definitions and its principal results.