
Showing papers in "Logic and Logical Philosophy" in 2010


Journal ArticleDOI
TL;DR: The paper presents axiomatizations of six systems of deontic action logic based on Boolean algebra, together with set-theoretical models for them, and shows the relations among the systems and the place of some existing theories in the resulting picture.
Abstract: Within the scope of interest of deontic logic, systems in which names of actions are arguments of deontic operators (deontic action logic) have attracted less interest than purely propositional systems. However, in our opinion, they are even more interesting from both a theoretical and a practical point of view. The foundation for contemporary research was laid by K. Segerberg, who introduced his systems of basic deontic logic of urn model actions in the early 1980s. Nowadays such logics are considered mainly within propositional dynamic logic (PDL). Two approaches can be distinguished: in one of them deontic operators are introduced using dynamic operators and the notion of violation, in the other at least some of them are taken as primitive. The second approach may be further divided into systems based on a Boolean algebra of actions and systems built on top of standard PDL. In the present paper we are interested in the systems of deontic action logic based on Boolean algebra. We present axiomatizations of six systems and set-theoretical models for them. We also show the relations among them and the place of some existing theories in the resulting picture. Such a presentation allows the reader to see the spectrum of possibilities for formalizing the subject.
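
To give a feel for the kind of set-theoretical models involved, here is a minimal Python sketch in which actions are interpreted as sets of possible events and the deontic status of an action is read off a designated set of legal events. This is an illustrative reconstruction under my own assumptions (all names and the example data are invented), not the paper's exact models or axiom systems.

```python
# Illustrative model: every action name denotes a set of events; a fixed set
# of 'legal' events determines which actions are permitted or forbidden.
EVENTS = {'e1', 'e2', 'e3', 'e4'}
LEGAL = {'e1', 'e2'}

actions = {
    'pay_tax': {'e1'},        # realized only by legal events
    'speed':   {'e3', 'e4'},  # realized only by illegal events
    'drive':   {'e2', 'e3'},  # mixed: some realizations legal, some not
}

def permitted(a):
    """Strong permission: every way of performing a is legal."""
    return actions[a] <= LEGAL

def forbidden(a):
    """Prohibition: no way of performing a is legal."""
    return actions[a].isdisjoint(LEGAL)

for a in actions:
    print(a, 'permitted:', permitted(a), 'forbidden:', forbidden(a))
```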

29 citations


Journal ArticleDOI
TL;DR: In this paper, the authors make a further attempt to establish Lupasco's concepts as significant contributions to the history and philosophy of logic, in line with the work of Gödel, general relativity, and the ontological turn in philosophy.
Abstract: The advent of quantum mechanics in the early 20th century had profound consequences for science and mathematics, for philosophy (Schrödinger), and for logic (von Neumann). In 1968, Putnam wrote that quantum mechanics required a revolution in our understanding of logic per se. However, applications of quantum logics have been little explored outside the quantum domain. Dummett saw some implications of quantum logic for truth, but few philosophers applied similar intuitions to epistemology or ontology. Logic remained a truth-functional 'science' of correct propositional reasoning. Starting in 1935, the Franco-Romanian thinker Stéphane Lupasco described a logical system based on the inherent dialectics of energy and accordingly expressed in, and applicable to, complex real processes at higher levels of reality. Unfortunately, Lupasco's fifteen major publications in French went unrecognized by mainstream logic and philosophy, and unnoticed outside a Francophone intellectual community, albeit with some translations into other Romance languages. In English, summaries of Lupasco's logic appeared ca. 2000, but the first major treatment and extension of his system was published in 2008 (see Brenner 2008). This paper is a further attempt to establish Lupasco's concepts as significant contributions to the history and philosophy of logic, in line with the work of Gödel, general relativity, and the ontological turn in philosophy.

29 citations


Journal ArticleDOI
TL;DR: An adaptive logic framework based on CDPM is proposed that is able to deal with deontic conflicts by restricting the inheritance principle while allowing certain rules to be applied as much as possible.
Abstract: Lou Goble proposed powerful conditional deontic logics (CDPM) that are able to deal with deontic conflicts by means of restricting the inheritance principle. One of the central problems for dyadic deontic logics is to properly treat the restricted applicability of the principle of "strengthening the antecedent". In most cases it is desirable to derive, from an obligation A under condition B, that A is also obliged under condition B and C. However, there are important counterexamples. Goble proposed a weakened rational monotonicity principle to tackle this problem. This solution is suboptimal, as for some examples it is counter-intuitive or even leads to explosion. The paper also identifies other problems with Goble's systems. For instance, to make optimal use of the restricted inheritance principle, in many cases the user has to manually add certain statements to the premises. An adaptive logic framework based on CDPM is proposed that is able to tackle these problems: it allows certain rules to be applied as much as possible. In this way counter-intuitive consequences as well as explosion can be avoided and no user intervention is required. Furthermore, for non-conflicting premise sets the adaptive logics are equivalent to Goble's dyadic version of standard deontic logic.
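
For orientation, the two principles at stake can be stated schematically in generic dyadic deontic notation, with O(A | B) read as "A is obligatory under condition B". This is a standard schematic formulation, not necessarily the exact shape the principles take in Goble's systems or in the adaptive framework; those systems restrict precisely these unrestricted forms.

```latex
% Unrestricted inheritance: within a fixed condition C, obligation is closed
% under provable consequence.
\[
  \text{(Inh)}\qquad \frac{\vdash A \rightarrow B}{\vdash O(A \mid C) \rightarrow O(B \mid C)}
\]

% Strengthening the antecedent: an obligation under B persists under the
% logically stronger condition B \wedge C.
\[
  \text{(SA)}\qquad O(A \mid B) \rightarrow O(A \mid B \wedge C)
\]
```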

22 citations


Journal ArticleDOI
TL;DR: Two new first-order paraconsistent logics with De Morgan-type negations and co-implication, called symmetric paraconsistent logic (SPL) and dual paraconsistent logic (DPL), are introduced as Gentzen-type sequent calculi; simple semantics are given for them, and completeness theorems with respect to these semantics are proved.
Abstract: Two new first-order paraconsistent logics with De Morgan-type negations and co-implication, called symmetric paraconsistent logic (SPL) and dual paraconsistent logic (DPL), are introduced as Gentzen-type sequent calculi. The logic SPL is symmetric in the sense that the rule of contraposition is admissible in cut-free SPL. By using this symmetry property, a simpler cut-free sequent calculus for SPL is obtained. The logic DPL is not symmetric, but it has the duality principle. Simple semantics for SPL and DPL are introduced, and the completeness theorems with respect to these semantics are proved. The cut-elimination theorems for SPL and DPL are proved in two ways: one is syntactical, based on the embedding theorems of SPL and DPL into Gentzen's LK; the other is semantical, based on the completeness theorems.

7 citations


Journal ArticleDOI
TL;DR: This paper explicates, informally and formally, what it means to join many objects into a single entity from the point of view of mereology, the theory of the parthood (part of) relation, and motivates basic axioms for that relation.
Abstract: As indicated in the title, this paper is devoted to the problem of defining mereological (collective) sets. Starting from basic properties of sets in mathematics and the differences between them and so-called conglomerates in Section 1, we go on to explicate informally in Section 2 what it means to join many objects into a single entity from the point of view of mereology, the theory of the parthood (part of) relation. In Section 3 we present and motivate basic axioms for the parthood relation and point to their most fundamental consequences. The next three sections are devoted to a formal explication of the notion of mereological set (collective set) in terms of sums, fusions and aggregates. We do not give proofs of all theorems. Some of them are complicated and their presentation would divert the reader's attention from the main topic of the paper. In such cases we indicate where the proofs can be found and analyzed by those who are interested.
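
For reference, one standard choice of core postulates treats (improper) parthood, written here as ⊑, as a partial order. This is a sketch of the usual ground-mereology axioms only; the exact axiom set adopted and motivated in the paper may differ.

```latex
% Parthood as a partial order: reflexivity, antisymmetry, transitivity.
\[
  \forall x\, (x \sqsubseteq x)
\]
\[
  \forall x\, \forall y\, \bigl(x \sqsubseteq y \wedge y \sqsubseteq x \rightarrow x = y\bigr)
\]
\[
  \forall x\, \forall y\, \forall z\, \bigl(x \sqsubseteq y \wedge y \sqsubseteq z \rightarrow x \sqsubseteq z\bigr)
\]
```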

5 citations


Journal ArticleDOI
TL;DR: In this article, the authors analyse Béziau's anti-Slater move and show both its right intuitions and its technical limits, and suggest that Slater's criticism is much akin to a well-known one by Suszko (1975) against the conceivability of many-valued logics.
Abstract: In 1995 Slater argued both against Priest's paraconsistent system LP (1979) and against paraconsistency in general, invoking the fundamental opposition relations ruling the classical logical square. Around 2002 Béziau constructed a double defence of paraconsistency (logical and philosophical), relying, in its philosophical part, on Sesmat's (1951) and Blanché's (1953) "logical hexagon", a geometrical, conservative extension of the logical square, and proposing a new (three-dimensional) "solid of opposition", meant to shed new light on the point raised by Slater. By using n-opposition theory (NOT) we analyse Béziau's anti-Slater move and show both its right intuitions and its technical limits. Moreover, we suggest that Slater's criticism is much akin to a well-known one by Suszko (1975) against the conceivability of many-valued logics. This last criticism has been addressed by Malinowski (1990) and Shramko and Wansing (2005), who developed a family of tenable logical counter-examples to it: trans-Suszkian systems are radically many-valued. This family of new logics has some strange logical features, essentially that each system has more than one consequence operator. We show that a new, deeper part of the aforementioned geometry of logical oppositions (NOT), the "logical poly-simplexes of dimension m", generates new logical-geometrical structures, essentially many-valued, which could be a very natural (and intuitive) geometrical counterpart to the "strange", new, non-Suszkian logics of Malinowski, Shramko and Wansing. By a similar move, the geometry of opposition therefore sheds light both on the foundations of paraconsistent logics and on those of many-valued logics.

5 citations


Journal ArticleDOI
TL;DR: Results include the concrete condition for enriching the system PCL1 with classical negation, a comparison of the notion of "behaving classically" given by da Costa with that given by Waragai and Shidori, and a characterisation of the notion of "behaving classically" given by Waragai and Shidori.
Abstract: In [Waragai & Shidori, 2007], a system of paraconsistent logic called PCL1, which takes a similar approach to that of da Costa, is proposed. The present paper gives further results on this system and related systems. These results include the concrete condition for enriching the system PCL1 with classical negation, a comparison of the concrete notion of "behaving classically" given by da Costa with that given by Waragai and Shidori, and a characterisation of the notion of "behaving classically" given by Waragai and Shidori.

5 citations


Journal ArticleDOI
TL;DR: In this article, the authors examine the very weak modal logics C1, D1, E1, S0.5◦, S0.5◦+(D) and S0.5, give Kripke-style semantics for them, and prove that the considered logics are determined by appropriate classes of models.
Abstract: In the present paper we examine very weak modal logics C1, D1, E1, S0.5◦, S0.5◦+(D), S0.5 and some of their versions which are closed under replacement of tautological equivalents (rte-versions). We give semantics for these logics, formulated by means of Kripke-style models of the form ⟨w, A, V⟩, where w is a «distinguished» world, A is a set of worlds which are alternatives to w, and V is a valuation which assigns truth-values to formulae at worlds such that: (i) for all formulae and all worlds, V preserves the classical conditions for the truth-value operators; (ii) for the world w and any formula ϕ, V(⬜ϕ,w) = 1 iff ∀x∈A V(ϕ,x) = 1; (iii) at other worlds the formula ⬜ϕ has an arbitrary value. Moreover, for the rte-versions of the considered logics we must add the following condition: (iv) V(⬜χ,w) = V(⬜χ[ϕ/ψ],w), if ϕ and ψ are tautologically equivalent. Finally, for C1, D1 and E1 we must add queer models, in which (i) holds and (ii′) V(⬜ϕ,w) = 0 for any formula ϕ. We prove that the considered logics are determined by appropriate classes of the above models.
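
To make condition (ii) concrete, here is a minimal Python sketch of how such a model evaluates a boxed formula at the distinguished world. It illustrates the semantics described above; the formula encoding and all names are my own assumptions, not code from the paper.

```python
# Minimal sketch of a model <w, A, V> for the weak modal logics described above.
# Formulas are tuples: ('p',), ('not', f), ('and', f, g), ('box', f).

def holds(formula, world, model):
    """Truth at a world; V fixes atoms and the (arbitrary) values of boxed
    formulas at worlds other than the distinguished one."""
    w, A, V = model['w'], model['A'], model['V']
    tag = formula[0]
    if tag == 'not':
        return not holds(formula[1], world, model)
    if tag == 'and':
        return holds(formula[1], world, model) and holds(formula[2], world, model)
    if tag == 'box':
        if world == w:
            # condition (ii): box-phi is true at w iff phi holds at every alternative in A
            return all(holds(formula[1], x, model) for x in A)
        # condition (iii): at other worlds the value of box-phi is read off V directly
        return V[(formula, world)]
    # atomic formula: classical valuation
    return V[(formula, world)]

# Example: w = 0 with alternatives {1, 2}; p holds at 1 and 2, so box-p holds at w.
model = {'w': 0, 'A': {1, 2},
         'V': {(('p',), 0): False, (('p',), 1): True, (('p',), 2): True}}
print(holds(('box', ('p',)), 0, model))  # True
```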

4 citations


Journal ArticleDOI
Hans Lycke1
TL;DR: In this paper, a new class of inconsistency-adaptive logics is characterized, namely inconsistency-adaptive modal logics, which cope with inconsistencies in a modal context and allow the derivation of sufficient consequences to adequately explicate the part of human reasoning they are intended for.
Abstract: In this paper, I will characterize a new class of inconsistency-adaptive logics, namely inconsistency-adaptive modal logics. These logics cope with inconsistencies in a modal context. More specifically, when faced with inconsistencies, inconsistency-adaptive modal logics avoid explosion, but still allow the derivation of sufficient consequences to adequately explicate the part of human reasoning they are intended for.

4 citations


Journal ArticleDOI
TL;DR: A class of semantic tableau systems for some dyadic deontic logics is developed and soundness results are obtained for every tableau system.
Abstract: The purpose of this paper is to develop a class of semantic tableau systems for some dyadic deontic logics. We will consider 16 different pure dyadic deontic tableau systems and 32 different alethic dyadic deontic tableau systems. Possible world semantics is used to interpret our formal languages. Some relationships between our systems and well known dyadic deontic logics in the literature are pointed out and soundness results are obtained for every tableau system. Completeness results are obtained for all 16 pure dyadic deontic systems and for 16 alethic dyadic deontic systems.

4 citations


Journal ArticleDOI
TL;DR: In this article, the notion of recursively defined function and the Benardete-Yablo paradox are used, together with some inherent features of causality and time as usually conceived, to derive two results: that no ungrounded causal chain exists and that time has a beginning.
Abstract: We use two logical resources, namely, the notion of recursively defined function and the Benardete-Yablo paradox, together with some inherent features of causality and time, as usually conceived, to derive two results: that no ungrounded causal chain exists and that time has a beginning.

Journal ArticleDOI
TL;DR: In this article, a restriction of R-Mingle with the variable-sharing property and the Ackermann properties is defined; from an intuitive semantical point of view, this restriction is an alternative to Anderson and Belnap's logic of entailment E.
Abstract: A restriction of R-Mingle with the variable-sharing property and the Ackermann properties is defined. From an intuitive semantical point of view, this restriction is an alternative to Anderson and Belnap’s logic of entailment E.

Journal ArticleDOI
TL;DR: This paper proposes a theory based on the procedural semantics of Transparent Intensional Logic (TIL) and explicates the notion of concept in terms of the key notion of TIL, namely construction, viewed as an abstract, algorithmically structured procedure.
Abstract: The goal of this paper is a philosophical explication and logical rectification of the notion of concept. We take into account only those contexts that are relevant from the logical point of view. This means that we are not interested in contexts characteristic of the cognitive sciences, particularly of psychology, where concepts are conceived of as some kind of mental objects or representations. After a brief recapitulation of various theories of concept, in particular those of Frege and Church, we propose our own theory based on the procedural semantics of Transparent Intensional Logic (TIL) and explicate concept in terms of the key notion of TIL, namely construction, viewed as an abstract, algorithmically structured procedure.

Journal ArticleDOI
TL;DR: In this paper, the authors show that Whitehead's definition of point, based on abstractive class and covering, is not adequate and if we admit such a definition it is also questionable that a point exists.
Abstract: This note is motivated by Whitehead’s researches in inclusion-based point-free geometry as exposed in An Inquiry Concerning the Principles of Natural Knowledge and in The concept of Nature. More precisely, we observe that Whitehead’s definition of point, based on the notions of abstractive class and covering, is not adequate. Indeed, if we admit such a definition it is also questionable that a point exists. On the contrary our approach, in which the diameter is a further primitive, enables us to avoid such a drawback. Moreover, since such a notion enables us to define a metric in the set of points, our proposal looks to be a good starting point for a foundation of the geometry metrical in nature (as proposed, for example, by L.M. Blumenthal).

Journal ArticleDOI
TL;DR: The aim of this paper is to present some basic notions of the theory of quantum computing and to compare them with the basic notions of the classical theory of computation, providing the necessary technical preliminaries in a way accessible to a general philosophical audience.
Abstract: The aim of this paper is to present some basic notions of the theory of quantum computing and to compare them with the basic notions of the classical theory of computation. I am convinced that the results of quantum computation theory (QCT) are not only interesting in themselves, but should also be taken into account in discussions concerning the nature of mathematical knowledge. The philosophical discussion will, however, be postponed to another paper. QCT seems not to be well known among philosophers (at least not to the degree it deserves), so the aim of this paper is to provide the necessary technical preliminaries, presented in a way accessible to a general philosophical audience.
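
As a flavour of the basic notions in question, the following sketch contrasts a classical bit with a qubit represented as a normalized pair of amplitudes. This is a standard textbook illustration written for this listing, not material taken from the paper.

```python
import math

# A classical bit is just 0 or 1.  A qubit is a normalized pair of complex
# amplitudes (a, b) over the basis states |0> and |1>; measuring it yields
# 0 with probability |a|^2 and 1 with probability |b|^2.

def measurement_probabilities(a: complex, b: complex):
    norm = abs(a) ** 2 + abs(b) ** 2
    if not math.isclose(norm, 1.0, rel_tol=1e-9):
        raise ValueError("amplitudes must be normalized")
    return {'0': abs(a) ** 2, '1': abs(b) ** 2}

# The state (|0> + |1>)/sqrt(2): an equal superposition, impossible for a classical bit.
print(measurement_probabilities(1 / math.sqrt(2), 1 / math.sqrt(2)))
# {'0': 0.4999..., '1': 0.4999...}
```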

Journal ArticleDOI
TL;DR: In this paper, the authors show that Viger's argument is based on a flawed premise and hence does not in fact demonstrate what he claims it demonstrates, which is that Anselm's so-called ontological argument falls prey to Russell's paradox.
Abstract: In this short note we respond to the claim made by Christopher Viger in [4] that Anselm’s so-called ontological argument falls prey to Russell’s paradox. We show that Viger’s argument is based on a flawed premise and hence does not in fact demonstrate what he claims it demonstrates.

Journal ArticleDOI
TL;DR: In this article, a Routley-Meyer semantics for Ackermann's logics of "strenge Implikation" Π′ and Π″ is provided.
Abstract: The aim of this paper is to provide a Routley-Meyer semantics for Ackermann's logics of "strenge Implikation" Π′ and Π″. Besides the Disjunctive Syllogism, this semantics validates the rules Necessitation and Assertion. Strong completeness theorems for Π′ and Π″ are proved. A brief discussion on Π′, Π″ and paraconsistency is included.

Journal ArticleDOI
TL;DR: In this article, it was shown that for any finite set V with cardinality greater or equal 2, there exists a representation of a quasi-field of sets isomorphic to de Morgan lattice.
Abstract: There is a productive and suggestive approach in philosophical logic based on the idea of generalized truth values. This idea, which stems essentially from the pioneering works by J.M. Dunn, N. Belnap, and which has recently been developed further by Y. Shramko and H. Wansing, is closely connected to the power-setting formation on the base of some initial truth values. Having a set of generalized truth values, one can introduce fundamental logical notions, more specifically, the ones of logical operations and logical entailment. This can be done in two different ways. According to the first one, advanced by M. Dunn, N. Belnap, Y. Shramko and H. Wansing, one defines on the given set of generalized truth values a specific ordering relation (or even several such relations) called the logical order(s), and then interprets logical connectives as well as the entailment relation(s) via this ordering(s). In particular, the negation connective is determined then by the inversion of the logical order. But there is also another method grounded on the notion of a quasi-field of sets, considered by Bialynicki-Birula and Rasiowa. The key point of this approach consists in defining an operation of quasi-complement via the very specific function g and then interpreting entailment just through the relation of set-inclusion between generalized truth values. In this paper, we will give a constructive proof of the claim that, for any finite set V with cardinality greater or equal 2, there exists a representation of a quasi-field of sets isomorphic to de Morgan lattice. In particular, it means that we offer a special procedure, which allows to make our negation de Morgan and our logic relevant.
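
As a concrete illustration of the power-set construction mentioned above, here is a small Python sketch of the best-known case: the four generalized values built from {T, F} (the Dunn-Belnap values), with negation obtained by swapping T and F inside each value, which amounts to inverting the truth order. This is my own illustrative code, not the quasi-complement construction via the function g used in the paper.

```python
from itertools import chain, combinations

# Generalized truth values over the classical set {T, F}: all subsets of {'T', 'F'}.
# This yields the Dunn-Belnap four values: {} (neither), {'T'} (true),
# {'F'} (false), {'T', 'F'} (both).

def power_set(values):
    values = list(values)
    return [frozenset(c) for c in chain.from_iterable(
        combinations(values, r) for r in range(len(values) + 1))]

FOUR = power_set({'T', 'F'})

def negate(v):
    """De Morgan-style negation: swap T and F inside a generalized value,
    i.e. invert the truth order."""
    swap = {'T': 'F', 'F': 'T'}
    return frozenset(swap[x] for x in v)

for v in FOUR:
    print(sorted(v), '->', sorted(negate(v)))
# {} and {T, F} are fixed points; {T} and {F} are swapped.
```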

Journal ArticleDOI
TL;DR: In this paper, the authors examine Belnap's amendments to Rescher's strategy for reasoning with maximal consistent subsets and propose a way out based on the use of First Degree Entailment in combination with Quine's notion of prime implicate.
Abstract: In [5, 6], Belnap proposed a number of amendments to Rescher’s strategy for reasoning with maximal consistent subsets. More recently in [18], Horty explicitly endorsed Belnap’s amendment to address a related problem in handling inconsistent instructions and commands. In this paper, we’ll examine Belnap’s amendment and point out that Belnap’s suggestion in the use of conjunctive containment is open to the very objection he raised. We’ll propose a way out. The strategy turns on the use of First Degree Entailment in combination with Quine’s notion of prime implicate.