Journal ISSN: 0039-7857

Synthese 

Springer Science+Business Media
About: Synthese is an academic journal published by Springer Science+Business Media. The journal publishes mainly in the areas of Philosophy of science and Philosophy of language. It has the ISSN identifier 0039-7857. Over its lifetime, 8,112 publications have been published, receiving 133,507 citations.


Papers
Journal Article
01 Aug 1996-Synthese
TL;DR: Fuzzy logic is used in this paper to describe an imprecise logical system, FL, in which the truth-values are fuzzy subsets of the unit interval with linguistic labels such as true, false, not true, very true, quite true, not very true and not very false, etc.
Abstract: The term fuzzy logic is used in this paper to describe an imprecise logical system, FL, in which the truth-values are fuzzy subsets of the unit interval with linguistic labels such as true, false, not true, very true, quite true, not very true and not very false, etc. The truth-value set, ℐ, of FL is assumed to be generated by a context-free grammar, with a semantic rule providing a means of computing the meaning of each linguistic truth-value in ℐ as a fuzzy subset of [0, 1]. Since ℐ is not closed under the operations of negation, conjunction, disjunction and implication, the result of an operation on truth-values in ℐ requires, in general, a linguistic approximation by a truth-value in ℐ. As a consequence, the truth tables and the rules of inference in fuzzy logic are (i) inexact and (ii) dependent on the meaning associated with the primary truth-value true as well as the modifiers very, quite, more or less, etc. Approximate reasoning is viewed as a process of approximate solution of a system of relational assignment equations. This process is formulated as a compositional rule of inference which subsumes modus ponens as a special case. A characteristic feature of approximate reasoning is the fuzziness and nonuniqueness of consequents of fuzzy premisses. Simple examples of approximate reasoning are: (a) Most men are vain; Socrates is a man; therefore, it is very likely that Socrates is vain. (b) x is small; x and y are approximately equal; therefore y is more or less small, where italicized words are labels of fuzzy sets.
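The compositional rule of inference underlying example (b) can be sketched in a few lines of Python. This is a toy illustration on a small discrete universe, not Zadeh's full linguistic-truth-value machinery; the membership values for "small" and "approximately equal" are invented for illustration. Given a fuzzy premise A on X and a fuzzy relation R on X × Y, the max-min composition infers the consequent B(y) = max over x of min(A(x), R(x, y)), which subsumes modus ponens when A and R are crisp.

```python
def compose(A, R):
    """Max-min composition: infer a fuzzy consequent B on Y from
    a fuzzy premise A on X and a fuzzy relation R on X x Y."""
    n_y = len(R[0])
    return [max(min(A[x], R[x][y]) for x in range(len(A)))
            for y in range(n_y)]

# Universe X = Y = {0, 1, 2, 3}; "x is small" as a fuzzy subset of X
# (illustrative membership grades).
small = [1.0, 0.7, 0.3, 0.0]

# "x and y are approximately equal": membership decays with |x - y|.
approx_equal = [[max(0.0, 1.0 - 0.5 * abs(x - y)) for y in range(4)]
                for x in range(4)]

# "x is small; x and y are approximately equal;
#  therefore y is more or less small."
more_or_less_small = compose(small, approx_equal)
print(more_or_less_small)  # -> [1.0, 0.7, 0.5, 0.3]
```

Note how the consequent's membership grades are spread out relative to the premise "small": the inferred fuzzy set is broader, which is exactly the hedge "more or less small".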

1,273 citations

Journal Article
01 Oct 1974-Synthese
TL;DR: This paper argued that many philosophers who accept reductivism do so primarily because they wish to endorse the generality of physics vis-à-vis the special sciences: roughly, the view that all events which fall under the laws of any science are physical events and hence fall under the laws of physics.
Abstract: A typical thesis of positivistic philosophy of science is that all true theories in the special sciences should reduce to physical theories in the long run. This is intended to be an empirical thesis, and part of the evidence which supports it is provided by such scientific successes as the molecular theory of heat and the physical explanation of the chemical bond. But the philosophical popularity of the reductivist program cannot be explained by reference to these achievements alone. The development of science has witnessed the proliferation of specialized disciplines at least as often as it has witnessed their reduction to physics, so the widespread enthusiasm for reduction can hardly be a mere induction over its past successes. I think that many philosophers who accept reductivism do so primarily because they wish to endorse the generality of physics vis-à-vis the special sciences: roughly, the view that all events which fall under the laws of any science are physical events and hence fall under the laws of physics. For such philosophers, saying that physics is basic science and saying that theories in the special sciences must reduce to physical theories have seemed to be two ways of saying the same thing, so that the latter doctrine has come to be a standard construal of the former. In what follows, I shall argue that this is a considerable confusion. What has traditionally been called 'the unity of science' is a much stronger, and much less plausible, thesis than the generality of physics. If this is true it is important. Though reductionism is an empirical doctrine, it is intended to play a regulative role in scientific practice. Reducibility to physics is taken to be a constraint upon the acceptability of theories in the special sciences, with the curious consequence that the more the special sciences succeed, the more they ought to disappear.
Methodological problems about psychology, in particular, arise in just this way: the assumption that the subject-matter of psychology is part of the subject-matter of physics is taken to imply that psychological theories must reduce to physical theories, and it is this latter principle

1,235 citations

Journal Article
01 Jul 2001-Synthese
TL;DR: It is shown how a formal semantic theory of discourse interpretation can be used to define speech acts and to avoid murky issues concerning the metaphysics of action.
Abstract: In this paper, we address several puzzles concerning speech acts, particularly indirect speech acts. We show how a formal semantic theory of discourse interpretation can be used to define speech acts and to avoid murky issues concerning the metaphysics of action. We provide a formally precise definition of indirect speech acts, including the subclass of so-called conventionalized indirect speech acts. This analysis draws heavily on parallels between phenomena at the speech act level and the lexical level. First, we argue that, just as co-predication shows that some words can behave linguistically as if they're 'simultaneously' of incompatible semantic types, certain speech acts behave this way too. Secondly, as Horn and Bayer (1984) and others have suggested, both the lexicon and speech acts are subject to a principle of blocking or "preemption by synonymy": conventionalized indirect speech acts can block their 'paraphrases' from being interpreted as indirect speech acts, even if this interpretation is calculable from Gricean-style principles. We provide a formal model of this blocking, and compare it with existing accounts of lexical blocking.

1,143 citations

Journal Article
01 Jul 1989-Synthese
TL;DR: In this article, an alternative theory of knowing that takes into account the thinking organism's cognitive isolation from "reality" is presented, focusing specifically on the adaptive function of cognition, Piaget's scheme theory, the process of communication, and the subjective perspective on social interaction.
Abstract: The existence of objective knowledge and the possibility of communicating it by means of language have traditionally been taken for granted by educators. Recent developments in the philosophy of science and the historical study of scientific accomplishments have deprived these presuppositions of their former plausibility. Sooner or later, this must have an effect on the teaching of science. In this paper I am presenting a brief outline of an alternative theory of knowing that takes into account the thinking organism’s cognitive isolation from ‘reality’. This orientation was proposed by Vico at the beginning of the 18th century, disregarded for two hundred years, and then propounded independently by Piaget as a developmentally grounded constructivist epistemology. The paper focuses specifically on the adaptive function of cognition, Piaget’s scheme theory, the process of communication, and the subjective perspective on social interaction. In the concluding section it then suggests some of the consequences the shift of epistemological presuppositions might have for the practice of teaching.

1,097 citations

Journal Article
01 Sep 1979-Synthese
TL;DR: The Mathematical Foundations of Quantum Mechanics, as discussed in this paper, was the first to provide a rigorous mathematical formulation of quantum theory and a systematic comparison with classical mechanics, so that the full ramifications of the quantum revolution could be clearly revealed.
Abstract: Classical mechanics was first envisaged by Newton, formed into a powerful tool by Euler, and brought to perfection by Lagrange and Laplace. It has served as the paradigm of science ever since. Even the great revolutions of 19th century physics, namely the Faraday-Maxwell electromagnetic theory and the kinetic theory, were viewed as further support for the complete adequacy of the mechanistic world view. The physicist at the end of the 19th century had a coherent conceptual scheme which, in principle at least, answered all his questions about the world. The only work left to be done was the computing of the next decimal. This consensus began to unravel at the beginning of the 20th century. The work of Planck, Einstein, and Bohr simply could not be made to fit. The series of ad hoc moves by Bohr, Ehrenfest, et al., now called the old quantum theory, was viewed by all as, at best, a stopgap. In the period 1925-27 a new synthesis was formed by Heisenberg, Schrödinger, Dirac and others. This new synthesis was so successful that even today, fifty years later, physicists still teach quantum mechanics as it was formulated by these men. Nevertheless, two foundational tasks remained: that of providing a rigorous mathematical formulation of the theory, and that of providing a systematic comparison with classical mechanics so that the full ramifications of the quantum revolution could be clearly revealed. These tasks are, of course, related, and a possible fringe benefit of the second task might be the pointing of the way 'beyond quantum theory'. These tasks were taken up by von Neumann as a consequence of a seminar on the foundations of quantum mechanics conducted by Hilbert in the fall of 1926. In papers published in 1927 and in his book, The Mathematical Foundations of Quantum Mechanics, von Neumann provided the first completely rigorous

1,055 citations

Performance
Metrics
No. of papers from the Journal in previous years
Year    Papers
2023    252
2022    649
2021    1,390
2020    262
2019    266
2018    299