
Showing papers in "Logique Et Analyse in 2016"


Journal Article
TL;DR: A general model for the rational resolution of disputes about logic is proposed, dispensing with a traditional notion of the a priori in logic and some objections to which this might give rise.
Abstract: In this paper, I propose a general model for the rational resolution of disputes about logic, and discuss a number of its features. These include its dispensing with a traditional notion of the a priori in logic, and some objections to which this might give rise.

21 citations


Journal ArticleDOI
TL;DR: The authors describe how three now almost forgotten mid-20th-century logicians, the American Paul Jacoby and the Frenchmen Augustin Sesmat and Robert Blanché, all ardent Catholics, tried to restore traditional predicate logic to a position of respectability by expanding the classic Square of Opposition to a hexagon of logical relations, showing the logical and cognitive advantages of such an expansion.
Abstract: The present study describes how three now almost forgotten mid-20th-century logicians, the American Paul Jacoby and the Frenchmen Augustin Sesmat and Robert Blanché, all three ardent Catholics, tried to restore traditional predicate logic to a position of respectability by expanding the classic Square of Opposition to a hexagon of logical relations, showing the logical and cognitive advantages of such an expansion. The nature of these advantages is discussed in the context of modern research regarding the relations between logic, language, and cognition. It is desirable to call attention to these attempts, as they are, though almost totally forgotten, highly relevant against the backdrop of the clash between modern and traditional logic. It is argued that this clash was and is unnecessary, as both forms of predicate logic are legitimate, each in its own right. The attempts by Jacoby, Sesmat, and Blanché are, moreover, of interest to the history of logic in a cultural context in that, in their own idiosyncratic ways, they fit into the general pattern of the Catholic cultural revival that took place roughly between the years 1840 and 1960. The Catholic Church had put up stiff resistance to modern mathematical logic, considering it dehumanizing and a threat to Catholic doctrine. Both the wider cultural context and the specific implications for logic are described and analyzed, in conjunction with the more general philosophical and doctrinal issues involved.
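The hexagon adds two corners to the traditional square: U = A∨E and Y = I∧O. As a quick illustration (not taken from the paper), the sketch below verifies the expected hexagonal relations by brute force over all small models, assuming existential import for the subject term; the encoding of terms as finite sets is my own:

```python
from itertools import product

DOMAIN = range(3)

def models():
    # every assignment of each individual to S and/or P, with a
    # nonempty subject term (existential import)
    for bits in product(range(4), repeat=len(DOMAIN)):
        S = {i for i, b in zip(DOMAIN, bits) if b & 1}
        P = {i for i, b in zip(DOMAIN, bits) if b & 2}
        if S:
            yield S, P

def corners(S, P):
    A = S <= P          # all S are P
    E = not (S & P)     # no S are P
    I = bool(S & P)     # some S are P
    O = bool(S - P)     # some S are not P
    return dict(A=A, E=E, I=I, O=O, U=A or E, Y=I and O)

def holds(rel):
    return all(rel(corners(S, P)) for S, P in models())

# contradictories: exactly one of the pair is true in every model
assert holds(lambda c: c['A'] != c['O'])
assert holds(lambda c: c['E'] != c['I'])
assert holds(lambda c: c['U'] != c['Y'])
# contraries never both true; subcontraries never both false
assert holds(lambda c: not (c['A'] and c['E']))
assert holds(lambda c: c['I'] or c['O'])
# subalternations: A entails I, E entails O
assert holds(lambda c: (not c['A']) or c['I'])
assert holds(lambda c: (not c['E']) or c['O'])
print("all hexagon relations verified on 3-element models")
```

Dropping the nonemptiness condition on S breaks the subalternations and subcontraries, which is one way to see why the traditional and modern readings of the square diverge.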

20 citations


Journal ArticleDOI
TL;DR: In this article, the authors argue that one of the two key assumptions on which the proof rests deprives McGee's result of the significance he and the realist want to attribute to it.
Abstract: Mathematical realists have long invoked the categoricity of axiomatizations of arithmetic and analysis to explain how we manage to fix the intended meaning of their respective vocabulary. Can this strategy be extended to set theory? Although traditional wisdom recommends a negative answer to this question, Vann McGee (1997) has offered a proof that purports to show otherwise. I argue that one of the two key assumptions on which the proof rests deprives McGee's result of the significance he and the realist want to attribute to it. I consider two strategies to deal with the problem --- one of which is outlined by McGee himself (2000) --- and argue that both of them fail. I end with some remarks on the prospects for mathematical realism in the light of my discussion.

12 citations


Journal Article
TL;DR: In this paper, a set theory with a universal set, CUSι, is presented, which uses a different sequence of restricted equivalence relations from Church's, such that the singleton function is a 2-equivalence class and hence a set, but lacks unrestricted axioms of sum and product set.
Abstract: A Platonistic set theory with a universal set, CUSι, in the spirit of Alonzo Church’s “Set Theory with a Universal Set,” is presented; this theory uses a different sequence of restricted equivalence relations from Church’s, such that the singleton function is a 2-equivalence class and hence a set, but (like Emerson Mitchell’s set theory, and unlike Church’s), it lacks unrestricted axioms of sum and product set. The theory has an axiom of unrestricted pairwise union, however, and unrestricted complements. An interpretation of the axioms in a set theory similar to Zermelo-Fraenkel set theory with global choice and urelements (which play the role of new sets) is presented, and the interpretations of the axioms are proved, which establishes their relative consistency. The verifications of the basic axioms are presented in considerably greater generality than necessary for the main result, to answer a query of Thomas Forster and Richard Kaye. The existence of the singleton function partially rebuts a conjecture of Church about the unification of his set theory with Quine’s New Foundations, but the natural extension of the theory leads to a variant of the Russell paradox.

11 citations


Journal Article
TL;DR: The project of logic as a theoretical tool useful for the sciences and humanities involves, as a crucial step, logical formalization, the conversion of sentences of natural language to formulas of a formal language.
Abstract: The project of logic as a theoretical tool useful for the sciences and humanities involves, as a crucial step, logical formalization – the conversion of sentences of natural language to formulas of a formal language. But what do we do, exactly, when we do logical formalization? What are the criteria of adequacy of the conversion? To what extent is logic normative? The paper offers answers to these central (but surprisingly rather neglected) questions and shows that getting a proper grasp on the process of formalization is important for understanding the nature of logic. The key point is that logic as a theoretical tool manages to consolidate our linguistic – in particular argumentative – practices by means of attaining a specific sort of reflective equilibrium. The paper provides a detailed discussion of the answers to the above questions implied by this understanding of logic.
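A stock textbook example (not drawn from the paper) of why criteria of adequacy are non-trivial: a single natural-language sentence can admit non-equivalent formalizations.

```latex
% "Everyone loves someone" admits two non-equivalent readings:
\forall x\,\exists y\, L(x,y) \qquad\text{(each person loves some person or other)}
\exists y\,\forall x\, L(x,y) \qquad\text{(some one person is loved by all)}
% The second entails the first but not conversely, so the choice of
% formalization already settles substantive inferential questions.
```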

10 citations


Journal Article
TL;DR: The authors show that this supertask is highly underdetermined, i.e. there are many scenarios consistent with it, and in particular that the notion of uniformity for finitely additive probability measures on the natural numbers emerging from the supertask is unreasonably weak.
Abstract: We mathematically model the supertask, introduced by Hansen (2014; 2015), in which an infinity of gods together select a random natural number by each randomly removing a finite number of balls from an urn, leaving one final ball. We show that this supertask is highly underdetermined, i.e. there are many scenarios consistent with the supertask. In particular we show that the notion of uniformity for finitely additive probability measures on the natural numbers emerging from this supertask is unreasonably weak.
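As a rough finite caricature (my own, not the paper's model), the sketch below contrasts two removal protocols on a finite urn: the survivor's distribution depends entirely on the protocol, which is the finite shadow of the underdetermination the authors prove for the infinite case.

```python
import random
from collections import Counter

def survivor_random(n, rng):
    """Each 'god' removes between 1 and len-1 randomly chosen balls
    from the urn, until a single ball is left."""
    balls = list(range(n))
    while len(balls) > 1:
        k = rng.randrange(1, len(balls))
        for b in rng.sample(balls, k):
            balls.remove(b)
    return balls[0]

def survivor_biased(n):
    """Each 'god' removes the lower half of the remaining balls:
    deterministic, so ball n-1 always survives."""
    balls = list(range(n))
    while len(balls) > 1:
        balls = balls[len(balls) // 2:]
    return balls[0]

rng = random.Random(0)
n, trials = 16, 2000
dist_random = Counter(survivor_random(n, rng) for _ in range(trials))
dist_biased = Counter(survivor_biased(n) for _ in range(trials))
print("random protocol :", dist_random.most_common(4))
print("biased protocol :", dist_biased)   # Counter({15: 2000})
```

Both runs are "a sequence of gods each removing finitely many balls and leaving one", yet the induced measures on survivors differ radically; constraining the limit measure requires assumptions beyond the supertask's description.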

3 citations


Journal Article
TL;DR: In this article, it was shown that the formula □(p ←→ □¬p) is unsatisfiable in the modal logic KD4 characterised by frames that are strict partial orders without maximal elements.
Abstract: We (further) demystify Yablo’s paradox by showing that it can be thought of as the fact that the formula □(p ←→ □¬p) is unsatisfiable in the modal logic KD4 characterised by frames that are strict partial orders without maximal elements. This modal treatment also unifies the two versions of Yablo’s paradox, the original version and its dual.
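A sketch of the unsatisfiability argument, reconstructed here from standard Kripke semantics for transitive, serial, irreflexive frames rather than taken from the paper:

```latex
% Claim: \Box(p \leftrightarrow \Box\neg p) has no model on a frame that is
% a strict partial order without maximal elements.
% Suppose w \Vdash \Box(p \leftrightarrow \Box\neg p).
% By seriality pick v with wRv; then v \Vdash p \leftrightarrow \Box\neg p.
% Case 1: v \Vdash p. Then v \Vdash \Box\neg p. Pick u with vRu (seriality):
%   u \Vdash \neg p, and u \Vdash p \leftrightarrow \Box\neg p (since wRu by
%   transitivity), so u \nVdash \Box\neg p, i.e. some t with uRt and t \Vdash p.
%   But vRt by transitivity, contradicting v \Vdash \Box\neg p.
% Case 2: v \nVdash p. Then v \nVdash \Box\neg p, so some u with vRu and
%   u \Vdash p; since wRu, u \Vdash p \leftrightarrow \Box\neg p, and Case 1
%   now applies at u in place of v.
% Either way we reach a contradiction, so the formula is unsatisfiable.
```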

3 citations


Journal ArticleDOI
TL;DR: In this paper, the authors show that the logic underlying these concepts need not be classical logic, and establish weak sufficient conditions for both the finest splitting theorem from [25] and the least letter-set theorem from [27].
Abstract: When our current beliefs face a certain problem – e.g. when we receive new information contradicting them –, then we should not remove beliefs that are not related to this problem. This principle is known as “minimal mutilation” or “conservativity” [21]. To make it formally precise, Rohit Parikh [32] defined a Relevance axiom for (classical) theory revision, which is based on the notion of a language splitting. I show that both concepts can and should be applied in a much broader context than mere revision of theories in the traditional sense. First, I generalize their application to belief change in general, and strengthen the axiom of relevance in order to make it fully syntax-independent. This is done by making use of the least letter-set representation of a set of formulas [27]. Second, I show that the logic underlying both concepts need not be classical logic and establish weak sufficient conditions for both the finest splitting theorem from [25] and the least letter-set theorem from [27]. Both generalizations are illustrated by means of the paraconsistent logic CLuNs and compared to ideas from [14, 36, 24].
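In the classical case, the least letter-set of a formula consists of exactly the letters its truth value actually depends on. A minimal sketch of that idea (my own, classical logic only, not the paraconsistent generalization discussed in the paper):

```python
from itertools import product

def least_letter_set(letters, phi):
    """Letters on which phi's truth value depends.
    phi maps a valuation dict {letter: bool} to a bool.
    Classically, phi is equivalent to a formula in exactly these letters."""
    relevant = set()
    for p in letters:
        others = [q for q in letters if q != p]
        for bits in product([False, True], repeat=len(others)):
            v = dict(zip(others, bits))
            # p is relevant iff flipping it can flip the formula
            if phi({**v, p: True}) != phi({**v, p: False}):
                relevant.add(p)
                break
    return relevant

# (p & q) | (p & ~q) is equivalent to p: its least letter-set is {p}
phi = lambda v: (v['p'] and v['q']) or (v['p'] and not v['q'])
print(least_letter_set(['p', 'q'], phi))   # {'p'}
```

Relevance in Parikh's sense then requires that revising by input in one cell of a language splitting leaves beliefs in the disjoint cells untouched; the least letter-set makes that requirement syntax-independent.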

3 citations


Journal ArticleDOI
TL;DR: In this article, the authors argue against the doctrine that the existence of a truthmaker necessitates the truth of the proposition it makes true, and argue in detail that truthmaking is a matter of grounding truth, where grounding is a dependency relation that neither entails nor reduces to necessitation.
Abstract: This paper is an argument against Truthmaker Necessitarianism — the doctrine that the existence of a truthmaker necessitates the truth of the proposition it makes true. Armstrong’s sufficiency argument for necessitarianism is examined and shown to be question begging. It is then argued in detail that truthmaking is a matter of grounding truth and that grounding is a dependency relation that neither entails nor reduces to necessitation.

2 citations


Journal ArticleDOI
TL;DR: In this article, a formal deontic logic is proposed to test the validity of legal inferences in Canadian legal discourse, and a semi-formal method based on this logic is presented.
Abstract: The aim of the present paper is to introduce a method to test the validity of legal inferences. We begin by presenting the rationale of our method and then we expose the philosophical foundations of our analysis. If formal philosophy is to be of help to legal discourse, then it must first reflect upon the law's fundamental characteristics that should be taken into account. Our analysis shows that (Canadian) legal discourse possesses three fundamental characteristics which ought to be considered if one wants to represent the formal structure of legal arguments. These characteristics are the presupposed consistency of legal discourse, the fact that there is a hierarchy between norms and obligations to preserve this consistency and the fact that legal inferences are subjected to the principle of deontic consequences. We present a formal deontic logic which is built according to these characteristics and provide the completeness results. Finally, we present a semi-formal method (based on the proposed deontic logic) to test the validity of legal inferences. This paper contributes to the literature insofar as it provides a method that covers a portion of the intuitive validity of legal inferences which is not covered by other frameworks.

1 citation


Journal Article
TL;DR: In this article, the authors distinguish various senses of completeness relevant to the debate and argue that the move to inconsistency-tolerance might not be enough to achieve a complete picture of the mathematical landscape in any of those senses, for there is positive reason to endorse a richer Platonism with a place for trivial parts in the mathematical landscape.
Abstract: Really full blooded platonism (RFBP) aims to achieve a complete picture of the mathematical landscape by accepting inconsistent mathematics. I distinguish various senses of completeness relevant to the debate and argue that the move to inconsistency-tolerance might not be enough to achieve a complete picture of the mathematical landscape in any of those senses, for there is positive reason to endorse a richer Platonism with a place for trivial parts in the mathematical landscape.

Journal ArticleDOI
TL;DR: In this paper, the authors propose to bridge the gap between deontic logic, categorial grammar and category theory by analyzing Forrester's (1984) paradox through the framework of Lambek's (1958) syntactic calculus.
Abstract: The present paper aims to bridge the gap between deontic logic, categorial grammar and category theory. We propose to analyze Forrester's (1984) paradox through the framework of Lambek's (1958) syntactic calculus. We first recall the definition of the syntactic calculus and then explain how Lambek (1988) defines it within the framework of category theory. Then, we briefly present Forrester's paradox in conjunction with standard deontic logic, showing that this paradox contains some features that reflect many problems within the literature. Finally, we analyze Forrester's paradox within the framework of the syntactic calculus and we show how a typed syntax can provide conceptual insight regarding some of the problems that deontic logic faces.
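For readers unfamiliar with the syntactic calculus, a toy reducer (my own, covering only the two application rules, not the full calculus, with illustrative type assignments) shows how typed syntax constrains composition:

```python
def reduces_to(seq, goal):
    r"""Decide whether a sequence of types reduces to [goal] using the two
    application rules of the syntactic calculus:
      backward:  A, A\B  =>  B        forward:  B/A, A  =>  B
    A type is an atom (str), ('\\', A, B) for A\B, or ('/', B, A) for B/A."""
    if list(seq) == [goal]:
        return True
    for i in range(len(seq) - 1):
        left, right = seq[i], seq[i + 1]
        if isinstance(right, tuple) and right[0] == '\\' and right[1] == left:
            if reduces_to(seq[:i] + [right[2]] + seq[i + 2:], goal):
                return True
        if isinstance(left, tuple) and left[0] == '/' and left[2] == right:
            if reduces_to(seq[:i] + [left[1]] + seq[i + 2:], goal):
                return True
    return False

NP, S = 'np', 's'
VP = ('\\', NP, S)       # np\s: a verb phrase takes a subject on its left
ADV = ('\\', VP, VP)     # (np\s)\(np\s): an adverb modifies a VP on its left
print(reduces_to([NP, VP], S))         # True  ("John runs")
print(reduces_to([NP, VP, ADV], S))    # True  ("John runs gently")
print(reduces_to([NP, ADV], S))        # False (ill-typed)
```

Typing an adverb like "gently" as a VP-modifier makes its attachment site explicit, which is the kind of scope discipline the paper brings to bear on Forrester's "gentle murder" paradox.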

Journal Article
TL;DR: In this article, the relationship between second-order comprehension and unrestricted mereological fusion (over atoms) is clarified, and an extension PAF of Peano arithmetic with a new binary mereological notion of "fusion" is introduced.
Abstract: In this article, the relationship between second-order comprehension and unrestricted mereological fusion (over atoms) is clarified. An extension PAF of Peano arithmetic with a new binary mereological notion of “fusion”, and a scheme of unrestricted fusion, is introduced. It is shown that PAF interprets full second-order arithmetic, Z2.
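Schematically (my notation, writing \preceq for parthood; the paper's fusion primitive may differ in detail), the unrestricted fusion scheme mirrors second-order comprehension:

```latex
% Unrestricted fusion scheme (one instance per formula \varphi):
\exists n\,\varphi(n)\;\rightarrow\;\exists x\,\forall n\,\bigl(n \preceq x \leftrightarrow \varphi(n)\bigr)
% Second-order comprehension instance it mirrors:
\exists X\,\forall n\,\bigl(n \in X \leftrightarrow \varphi(n)\bigr)
% reading "n \in X" as "the atom representing n is part of the fusion x"
```

Quantifying over fusions of atoms thus simulates quantifying over sets of numbers, which is the engine behind the interpretation of Z2.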