Journal ArticleDOI

Fuzzy logic and approximate reasoning

01 Aug 1996 · Synthese (Kluwer Academic Publishers) · Vol. 30, Iss. 3, pp. 238-259
TL;DR: Fuzzy logic is used in this paper to describe an imprecise logical system, FL, in which the truth-values are fuzzy subsets of the unit interval with linguistic labels such as true, false, not true, very true, quite true, not very true and not very false, etc.
Abstract: The term fuzzy logic is used in this paper to describe an imprecise logical system, FL, in which the truth-values are fuzzy subsets of the unit interval with linguistic labels such as true, false, not true, very true, quite true, not very true and not very false, etc. The truth-value set, ℐ, of FL is assumed to be generated by a context-free grammar, with a semantic rule providing a means of computing the meaning of each linguistic truth-value in ℐ as a fuzzy subset of [0, 1]. Since ℐ is not closed under the operations of negation, conjunction, disjunction and implication, the result of an operation on truth-values in ℐ requires, in general, a linguistic approximation by a truth-value in ℐ. As a consequence, the truth tables and the rules of inference in fuzzy logic are (i) inexact and (ii) dependent on the meaning associated with the primary truth-value true as well as the modifiers very, quite, more or less, etc. Approximate reasoning is viewed as a process of approximate solution of a system of relational assignment equations. This process is formulated as a compositional rule of inference which subsumes modus ponens as a special case. A characteristic feature of approximate reasoning is the fuzziness and nonuniqueness of consequents of fuzzy premisses. Simple examples of approximate reasoning are: (a) Most men are vain; Socrates is a man; therefore, it is very likely that Socrates is vain. (b) x is small; x and y are approximately equal; therefore y is more or less small, where italicized words are labels of fuzzy sets.
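As a rough numerical illustration of the compositional rule of inference described above (a minimal sketch with invented membership grades, not taken from the paper), the consequent of the premises "x is small" and "x and y are approximately equal" can be obtained by sup-min composition of a fuzzy set with a fuzzy relation:

```python
# Minimal sketch of the compositional rule of inference (sup-min composition).
# Universe, fuzzy set and fuzzy relation below are invented for illustration only.

U = [1, 2, 3, 4]  # universe of discourse for x and y

# "x is small": a fuzzy subset of U
small = {1: 1.0, 2: 0.6, 3: 0.2, 4: 0.0}

# "x and y are approximately equal": a fuzzy relation on U x U
approx_equal = {(x, y): max(0.0, 1.0 - 0.5 * abs(x - y)) for x in U for y in U}

# Compositional rule of inference: mu_B(y) = sup_x min(mu_A(x), mu_R(x, y))
more_or_less_small = {
    y: max(min(small[x], approx_equal[(x, y)]) for x in U) for y in U
}

print(more_or_less_small)
# {1: 1.0, 2: 0.6, 3: 0.5, 4: 0.2} -- a "spread out" fuzzy set, read as "y is more or less small"
```

With crisp (0/1) membership grades the same composition reduces to ordinary modus ponens, which is the special case mentioned in the abstract.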
Citations
Journal ArticleDOI
TL;DR: The need for problem solvers to choose between alternative systems of beliefs is stressed, and a mechanism by which a problem solver can employ rules guiding choices of what to believe, what to want, and what to do is outlined.

1,909 citations

Journal ArticleDOI
TL;DR: The computational approach to fuzzy quantifiers which is described in this paper may be viewed as a derivative of fuzzy logic and test-score semantics.
Abstract: The generic term fuzzy quantifier is employed in this paper to denote the collection of quantifiers in natural languages whose representative elements are: several, most, much, not many, very many, not very many, few, quite a few, large number, small number, close to five, approximately ten, frequently, etc. In our approach, such quantifiers are treated as fuzzy numbers which may be manipulated through the use of fuzzy arithmetic and, more generally, fuzzy logic. A concept which plays an essential role in the treatment of fuzzy quantifiers is that of the cardinality of a fuzzy set. Through the use of this concept, the meaning of a proposition containing one or more fuzzy quantifiers may be represented as a system of elastic constraints whose domain is a collection of fuzzy relations in a relational database. This representation, then, provides a basis for inference from premises which contain fuzzy quantifiers. For example, from the propositions “Most U's are A's” and “Most A's are B's,” it follows that “Most² U's are B's,” where most² is the fuzzy product of the fuzzy proportion most with itself. The computational approach to fuzzy quantifiers which is described in this paper may be viewed as a derivative of fuzzy logic and test-score semantics. In this semantics, the meaning of a semantic entity is represented as a procedure which tests, scores and aggregates the elastic constraints which are induced by the entity in question.
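The following hedged sketch (not the paper's own code; the membership function chosen for "most" is an invented placeholder) shows how most² can be obtained as the fuzzy product of "most" with itself via the extension principle:

```python
# Sketch of the quantifier "most" as a fuzzy number on [0, 1] and of most^2 as the
# fuzzy product of "most" with itself. The shape of "most" is invented for illustration.

def most(p):
    """Invented piecewise-linear membership for 'most' applied to a proportion p in [0, 1]."""
    if p <= 0.5:
        return 0.0
    if p >= 0.9:
        return 1.0
    return (p - 0.5) / 0.4

grid = [i / 100 for i in range(101)]  # discretized [0, 1] for fuzzy arithmetic by enumeration

def most_squared(r, tol=0.005):
    """mu_{most^2}(r) = sup over p*q ~= r of min(mu_most(p), mu_most(q))  (extension principle)."""
    degrees = [min(most(p), most(q)) for p in grid for q in grid if abs(p * q - r) <= tol]
    return max(degrees) if degrees else 0.0

# most^2 is the quantifier supported by chaining "Most U's are A's" and "Most A's are B's";
# it is weaker than "most", i.e. it admits smaller proportions to a higher degree.
for r in (0.5, 0.6, 0.7, 0.8, 0.9):
    print(r, round(most(r), 2), round(most_squared(r), 2))
```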

1,736 citations

Journal ArticleDOI
TL;DR: Fuzzy logic, the logic underlying approximate (or, equivalently, fuzzy) reasoning, is suggested; it leads to various basic syllogisms which may be used as rules of combination of evidence in expert systems.

1,278 citations

Journal ArticleDOI
TL;DR: In this paper, fuzzy logic is viewed from a nonstandard perspective; its cornerstones and principal distinguishing features are graduation, granulation, precisiation and the concept of a generalized constraint.

1,253 citations


Cites background from "Fuzzy logic and approximate reasoning"

  • ...Basically, fuzzy logic is a precise logic of imprecision and approximate reasoning [83,88]....


Journal ArticleDOI
01 Feb 1986
TL;DR: This paper presents a semantical generalization of logic in which the truth values of sentences are probabilistic values (between 0 and 1); it applies to any logical system for which the consistency of a finite set of sentences can be established.
Abstract: Because many artificial intelligence applications require the ability to reason with uncertain knowledge, it is important to seek appropriate generalizations of logic for that case. We present here a semantical generalization of logic in which the truth values of sentences are probability values (between 0 and 1). Our generalization applies to any logical system for which the consistency of a finite set of sentences can be established. The method described in the present paper combines logic with probability theory in such a way that probabilistic logical entailment reduces to ordinary logical entailment when the probabilities of all sentences are either 0 or 1.
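A minimal sketch of this idea under assumed premises (invented probabilities, not from the paper): enumerate the possible worlds for the sentences involved and bound the probability of a target sentence by linear programming over the distributions that match the premise probabilities:

```python
# Sketch of probabilistic entailment as bounds computed by linear programming over
# possible worlds. Premise probabilities are invented for illustration.
from itertools import product
from scipy.optimize import linprog

worlds = list(product([False, True], repeat=2))  # truth values for (P, Q)

def truth(sentence, world):
    P, Q = world
    return {"P": P, "P->Q": (not P) or Q, "Q": Q}[sentence]

premises = {"P": 0.7, "P->Q": 0.9}  # assumed premise probabilities
target = "Q"

# Equality constraints: world probabilities sum to 1 and reproduce each premise probability
A_eq = [[1.0] * len(worlds)] + [[1.0 if truth(s, w) else 0.0 for w in worlds] for s in premises]
b_eq = [1.0] + list(premises.values())

c = [1.0 if truth(target, w) else 0.0 for w in worlds]
lo = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, 1)] * len(worlds))
hi = linprog([-ci for ci in c], A_eq=A_eq, b_eq=b_eq, bounds=[(0, 1)] * len(worlds))
print(round(lo.fun, 3), round(-hi.fun, 3))  # bounds on P(Q): 0.6 and 0.9 for these premises
```

When the premise probabilities are set to 0 or 1, the bounds collapse and the computation reduces to ordinary logical entailment, as the abstract notes.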

1,240 citations

References
Book
01 Aug 1996
TL;DR: A separation theorem for convex fuzzy sets is proved without requiring that the fuzzy sets be disjoint.
Abstract: A fuzzy set is a class of objects with a continuum of grades of membership. Such a set is characterized by a membership (characteristic) function which assigns to each object a grade of membership ranging between zero and one. The notions of inclusion, union, intersection, complement, relation, convexity, etc., are extended to such sets, and various properties of these notions in the context of fuzzy sets are established. In particular, a separation theorem for convex fuzzy sets is proved without requiring that the fuzzy sets be disjoint.
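To make the summarized operations concrete, here is a minimal sketch using the standard definitions (complement = 1 − μ, union = max, intersection = min) on invented membership grades:

```python
# Minimal sketch of fuzzy-set operations with the standard definitions.
# Membership grades are invented for illustration.

U = ["a", "b", "c", "d"]
A = {"a": 1.0, "b": 0.7, "c": 0.3, "d": 0.0}
B = {"a": 0.2, "b": 0.5, "c": 0.9, "d": 1.0}

complement_A = {u: 1.0 - A[u] for u in U}          # mu_{A'}(u) = 1 - mu_A(u)
union        = {u: max(A[u], B[u]) for u in U}     # mu_{A or B}(u) = max
intersection = {u: min(A[u], B[u]) for u in U}     # mu_{A and B}(u) = min

# Inclusion: A is contained in B iff mu_A(u) <= mu_B(u) for every u
A_subset_of_B = all(A[u] <= B[u] for u in U)

print(union)          # {'a': 1.0, 'b': 0.7, 'c': 0.9, 'd': 1.0}
print(intersection)   # {'a': 0.2, 'b': 0.5, 'c': 0.3, 'd': 0.0}
print(A_subset_of_B)  # False
```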

52,705 citations

01 Jan 1975

8,942 citations


"Fuzzy logic and approximate reasoni..." refers background or methods in this paper

  • ...As pointed out in [4], modus ponens may be viewed as a special case of (3....


  • ...If U is a universe of discourse and F is a fuzzy subset of U, then the compatibility function (or, equivalently, membership function) of F is a mapping μF: U → [0, 1] which associates with each u ∈ U its compatibility (or grade of membership) μF(u), 0 ≤ μF(u) ≤ 1 [4]....


  • ...A more detailed discussion of this concept may be found in [4]....


  • ...To extend the definitions of negation, conjunction, disjunction and implication in L~ to those of FL, it is convenient to employ an extension principle for fuzzy sets which may be stated as follows [4]....


  • ...A more detailed discussion of linguistic modifiers and hedges may be found in [10], [11] and [4]....

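The extension principle mentioned in the excerpts above can be sketched as follows (a hedged illustration with an invented function and membership grades, not the paper's own formulation): a mapping f on the universe is extended to fuzzy subsets by μ_{f(A)}(y) = sup{ μ_A(x) : f(x) = y }.

```python
# Sketch of the extension principle: push a fuzzy set through an ordinary function,
# taking the sup of grades over all points that map to the same image.
# The function and grades below are invented for illustration.

def extend(f, fuzzy_set):
    image = {}
    for x, grade in fuzzy_set.items():
        y = f(x)
        image[y] = max(image.get(y, 0.0), grade)  # sup over all x with f(x) = y
    return image

A = {-2: 0.3, -1: 0.7, 0: 1.0, 1: 0.6, 2: 0.2}
print(extend(lambda x: x * x, A))
# {4: 0.3, 1: 0.7, 0: 1.0} -- e.g. mu(1) = max(0.7, 0.6), mu(4) = max(0.3, 0.2)
```

The same device is what lets pointwise connectives on numerical truth-values be lifted to linguistic truth-values that are themselves fuzzy subsets of [0, 1].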

Book
01 Jan 1953

2,147 citations

Journal ArticleDOI
TL;DR: The implications of this process when some of the attributes of a string are “synthesized”, i.e., defined solely in terms of attributes of the descendants of the corresponding nonterminal symbol, while other attributes are “inherited”, are examined.
Abstract: “Meaning” may be assigned to a string in a context-free language by defining “attributes” of the symbols in a derivation tree for that string. The attributes can be defined by functions associated with each production in the grammar. This paper examines the implications of this process when some of the attributes are “synthesized”, i.e., defined solely in terms of attributes of the descendants of the corresponding nonterminal symbol, while other attributes are “inherited”, i.e., defined in terms of attributes of the ancestors of the nonterminal symbol. An algorithm is given which detects when such semantic rules could possibly lead to circular definition of some attributes. An example is given of a simple programming language defined with both inherited and synthesized attributes, and the method of definition is compared to other techniques for formal specification of semantics which have appeared in the literature.
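As a hedged illustration of a purely synthesized attribute (an invented toy grammar, not Knuth's example), the "value" of each node in a derivation tree can be computed bottom-up from the values of its children:

```python
# Sketch of evaluating a synthesized attribute over a derivation tree:
# each node's value depends only on its descendants. The grammar is invented.

class Node:
    def __init__(self, symbol, children=None, leaf_value=None):
        self.symbol = symbol
        self.children = children or []
        self.leaf_value = leaf_value

def value(node):
    """Synthesized attribute: defined solely in terms of the node's descendants."""
    if node.symbol == "num":                     # terminal: value is the token itself
        return node.leaf_value
    if node.symbol == "sum":                     # E -> E '+' E
        return value(node.children[0]) + value(node.children[1])
    if node.symbol == "prod":                    # E -> E '*' E
        return value(node.children[0]) * value(node.children[1])
    raise ValueError(f"unknown production: {node.symbol}")

# Derivation tree for (2 + 3) * 4
tree = Node("prod", [Node("sum", [Node("num", leaf_value=2), Node("num", leaf_value=3)]),
                     Node("num", leaf_value=4)])
print(value(tree))  # 20
```

An inherited attribute would instead flow information downward from a node's ancestors, and mixing the two directions is what makes the circularity check described in the abstract necessary.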

1,982 citations


"Fuzzy logic and approximate reasoni..." refers methods in this paper

  • ...This technique is related to Knuth's method of synthesized attributes [6]....


Book
01 Jan 1972
TL;DR: The hope is that the algorithms and concepts presented in this book will survive the next generation of computers and programming languages, and that at least some of them will be applicable to fields other than compiler writing.
Abstract: From volume 1 Preface (See Front Matter for full Preface) This book is intended for a one or two semester course in compiling theory at the senior or graduate level. It is a theoretically oriented treatment of a practical subject. Our motivation for making it so is threefold. (1) In an area as rapidly changing as Computer Science, sound pedagogy demands that courses emphasize ideas, rather than implementation details. It is our hope that the algorithms and concepts presented in this book will survive the next generation of computers and programming languages, and that at least some of them will be applicable to fields other than compiler writing. (2) Compiler writing has progressed to the point where many portions of a compiler can be isolated and subjected to design optimization. It is important that appropriate mathematical tools be available to the person attempting this optimization. (3) Some of the most useful and most efficient compiler algorithms, e.g. LR(k) parsing, require a good deal of mathematical background for full understanding. We expect, therefore, that a good theoretical background will become essential for the compiler designer. While we have not omitted difficult theorems that are relevant to compiling, we have tried to make the book as readable as possible. Numerous examples are given, each based on a small grammar, rather than on the large grammars encountered in practice. It is hoped that these examples are sufficient to illustrate the basic ideas, even in cases where the theoretical developments are difficult to follow in isolation. From volume 2 Preface (See Front Matter for full Preface) Compiler design is one of the first major areas of systems programming for which a strong theoretical foundation is becoming available. Volume I of The Theory of Parsing, Translation, and Compiling developed the relevant parts of mathematics and language theory for this foundation and developed the principal methods of fast syntactic analysis. Volume II is a continuation of Volume I, but except for Chapters 7 and 8 it is oriented towards the nonsyntactic aspects of compiler design. The treatment of the material in Volume II is much the same as in Volume I, although proofs have become a little more sketchy. We have tried to make the discussion as readable as possible by providing numerous examples, each illustrating one or two concepts. Since the text emphasizes concepts rather than language or machine details, a programming laboratory should accompany a course based on this book, so that a student can develop some facility in applying the concepts discussed to practical problems. The programming exercises appearing at the ends of sections can be used as recommended projects in such a laboratory. Part of the laboratory course should discuss the code to be generated for such programming language constructs as recursion, parameter passing, subroutine linkages, array references, loops, and so forth.

1,727 citations