
Showing papers on "Fuzzy logic published in 1990"


Journal ArticleDOI
01 Apr 1990
TL;DR: The basic aspects of the FLC (fuzzy logic controller) decision-making logic are examined and several issues, including the definitions of a fuzzy implication, compositional operators, the interpretations of the sentence connectives 'and' and 'also', and fuzzy inference mechanisms, are investigated.
Abstract: For pt.I see ibid., vol.20, no.2, p.404-18, 1990. The basic aspects of the FLC (fuzzy logic controller) decision-making logic are examined. Several issues, including the definitions of a fuzzy implication, compositional operators, the interpretations of the sentence connectives 'and' and 'also', and fuzzy inference mechanisms, are investigated. Defuzzification strategies are discussed. Some representative applications of the FLC, from laboratory level to industrial process control, are briefly reported. Some unsolved problems are described, and further challenges in this field are discussed.
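The inference cycle the survey examines (rule firing via 'and', aggregation via 'also', then defuzzification) can be sketched minimally. The membership functions, rule base, and the weighted-average defuzzification below are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of one FLC inference cycle. The triangular membership
# functions and the two rules are made up for illustration.

def tri(x, a, b, c):
    """Triangular membership function with feet at a and c, peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def flc(error, d_error):
    # Each rule: (membership of error, membership of d_error, output peak)
    rules = [
        (tri(error, -1.0, -0.5, 0.0), tri(d_error, -1.0, -0.5, 0.0), -0.8),
        (tri(error,  0.0,  0.5, 1.0), tri(d_error,  0.0,  0.5, 1.0),  0.8),
    ]
    num = den = 0.0
    for mu_e, mu_de, peak in rules:
        w = min(mu_e, mu_de)      # 'and' interpreted as min (one common choice)
        num += w * peak           # 'also' interpreted as additive aggregation
        den += w
    return num / den if den else 0.0   # weighted-average defuzzification
```

Swapping min for product, or the weighted average for a true centroid over clipped output sets, gives other members of the FLC design space the paper surveys.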

5,502 citations


Journal Article
TL;DR: The fuzzy logic controller (FLC) based on fuzzy logic provides a means of converting a linguistic control strategy based on expert knowledge into an automatic control strategy.
Abstract: During the past several years, fuzzy control has emerged as one of the most active and fruitful areas for research in the applications of fuzzy set theory. Fuzzy control is based on fuzzy logic. The fuzzy logic controller (FLC) based on fuzzy logic provides a means of converting a linguistic control strategy based on expert knowledge into an automatic control strategy. A survey of the FLC is presented; a general methodology for constructing an FLC and assessing its performance is described; and problems that need further research are pointed out.

4,830 citations


Journal ArticleDOI
TL;DR: A fuzzy Petri net model (FPN) is presented to represent the fuzzy production rules of a rule-based system, in which a fuzzy production rule describes the fuzzy relation between two propositions, and an efficient algorithm is proposed to perform fuzzy reasoning automatically.
Abstract: A fuzzy Petri net model (FPN) is presented to represent the fuzzy production rules of a rule-based system, in which a fuzzy production rule describes the fuzzy relation between two propositions. Based on the fuzzy Petri net model, an efficient algorithm is proposed to perform fuzzy reasoning automatically. It can determine whether an antecedent-consequence relationship exists from proposition d_s to proposition d_j, where d_s ≠ d_j. If the degree of truth of proposition d_s is given, the degree of truth of proposition d_j can be evaluated. The formal description of the model and the fuzzy reasoning algorithm are shown in detail. The upper bound of the time complexity of the fuzzy reasoning algorithm is O(nm), where n is the number of places and m is the number of transitions. Its execution time is proportional to the number of nodes in the sprouting tree generated by the algorithm; because the algorithm only generates the necessary reasoning paths from a starting place to a goal place, it can be executed very efficiently.
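The propagation idea can be sketched with a toy forward-reasoning pass, assuming the common FPN rule that a transition fires with strength min(input truth degrees) times its certainty factor, and a place takes the max over the transitions feeding it. The net below is illustrative, not the paper's example.

```python
# Toy forward reasoning over a fuzzy Petri net. Assumed propagation
# rule: transition strength = min(input degrees) * certainty factor;
# a place keeps the max over all transitions that produce it.

def fpn_reason(truth, transitions, passes=None):
    """truth: dict place -> degree of truth; transitions: list of
    (input_places, certainty_factor, output_place) triples."""
    for _ in range(passes or len(transitions)):   # iterate until settled
        for inputs, cf, out in transitions:
            if all(p in truth for p in inputs):
                strength = min(truth[p] for p in inputs) * cf
                truth[out] = max(truth.get(out, 0.0), strength)
    return truth

# chain d_s --(cf 0.9)--> d_1 --(cf 0.8)--> d_j
net = [(["d_s"], 0.9, "d_1"), (["d_1"], 0.8, "d_j")]
degrees = fpn_reason({"d_s": 1.0}, net)
# degrees["d_j"] comes out as 1.0 * 0.9 * 0.8 = 0.72
```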

534 citations


Journal ArticleDOI
TL;DR: A fuzzy supervised classification method in which geographical information is represented as fuzzy sets is described, and results of classifying a Landsat MSS image are presented, and their accuracy is analyzed.
Abstract: A fuzzy supervised classification method in which geographical information is represented as fuzzy sets is described. The algorithm consists of two major steps: the estimation of fuzzy parameters from fuzzy training data, and a fuzzy partition of spectral space. Partial membership of pixels allows component cover classes of mixed pixels to be identified and more accurate statistical parameters to be generated, resulting in a higher classification accuracy. Results of classifying a Landsat MSS image are presented, and their accuracy is analyzed.
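The "partial membership" idea can be illustrated with a fuzzy-c-means-style partition, where each pixel gets a membership in every class from inverse distances to class mean vectors. The means are made up; the paper estimates its parameters from fuzzy training data rather than assuming them.

```python
# Illustrative fuzzy partition of spectral space: partial memberships
# from inverse-distance weights to (assumed) class mean vectors.

def memberships(pixel, class_means, m=2.0):
    d = [sum((p - c) ** 2 for p, c in zip(pixel, mean)) ** 0.5
         for mean in class_means]
    if any(di == 0.0 for di in d):          # pixel sits exactly on a mean
        return [1.0 if di == 0.0 else 0.0 for di in d]
    inv = [(1.0 / di) ** (2.0 / (m - 1.0)) for di in d]
    s = sum(inv)
    return [v / s for v in inv]             # memberships sum to 1

# a mixed pixel between two hypothetical cover-class means
u = memberships((0.2, 0.8), [(0.0, 1.0), (1.0, 0.0)])
# both memberships are nonzero, so both component classes are visible
```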

522 citations


Journal ArticleDOI
TL;DR: It is proved theoretically that the simplest possible fuzzy controller, with two inputs and a nonlinear defuzzification algorithm, is equivalent to a nonfuzzy nonlinear proportional-integral (PI) controller whose proportional gain and integral gain change with the error and the rate of change of error about a setpoint.

476 citations


Journal ArticleDOI
01 May 1990
TL;DR: A method of evidence fusion based on the fuzzy integral is developed, which nonlinearly combines objective evidence, in the form of a fuzzy membership function, with subjective evaluation of the worth of the sources with respect to the decision.
Abstract: A method of evidence fusion, based on the fuzzy integral, is developed. This technique nonlinearly combines objective evidence, in the form of a fuzzy membership function, with subjective evaluation of the worth of the sources with respect to the decision. Various new theoretical properties of this technique are developed, and its applicability to information fusion in computer vision is demonstrated through simulation and with object recognition data from forward-looking infrared imagery.
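The fusion step can be sketched with the Sugeno form of the fuzzy integral: sort the evidence h descending and take max_i min(h_i, g(A_i)), where A_i is the coalition of the i strongest sources and g encodes their subjective worth. For simplicity the sketch assumes the measure values are supplied already aligned with the sorted order; the g values are invented for illustration.

```python
# Sketch of a Sugeno fuzzy integral for evidence fusion. g_nested[i]
# is the (assumed) worth of the coalition of the top-(i+1) sources
# after sorting the evidence in descending order.

def sugeno_integral(h, g_nested):
    """h: objective evidence per source; g_nested: nested measure values."""
    hs = sorted(h, reverse=True)
    return max(min(hi, gi) for hi, gi in zip(hs, g_nested))

# three sources: evidence for one class, and worth of nested coalitions
fused = sugeno_integral([0.9, 0.6, 0.3], [0.4, 0.7, 1.0])
# min pairs: (0.9,0.4)->0.4, (0.6,0.7)->0.6, (0.3,1.0)->0.3; max is 0.6
```

The result is bounded by both the evidence and the worth of the sources supplying it, which is the nonlinearity the abstract refers to.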

431 citations


Journal ArticleDOI
TL;DR: A new geometric proof of the Subsethood Theorem is given, a corollary of which is that the apparently probabilistic relative frequency n_A/N turns out to be the deterministic subsethood S(X, A), the degree to which the sample space X is contained in its subset A.
Abstract: Fuzziness is explored as an alternative to randomness for describing uncertainty. The new sets-as-points geometric view of fuzzy sets is developed. This view identifies a fuzzy set with a point in a unit hypercube and a nonfuzzy set with a vertex of the cube. Paradoxes of two-valued logic and set theory, such as Russell's paradox, correspond to the midpoint of the fuzzy cube. The fundamental questions of fuzzy theory (How fuzzy is a fuzzy set? How much is one fuzzy set a subset of another?) are answered geometrically with the Fuzzy Entropy Theorem, the Fuzzy Subsethood Theorem, and the Entropy-Subsethood Theorem. A new geometric proof of the Subsethood Theorem is given, a corollary of which is that the apparently probabilistic relative frequency n_A/N turns out to be the deterministic subsethood S(X, A), the degree to which the sample space X is contained in its subset A. So the frequency of successful trials is viewed as the degree to which all trials are successful. Recent Bayesian polemics against fuzzy ...
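The two quantities the abstract names are directly computable with sigma-counts (sums of memberships): subsethood S(A, B) = |A ∩ B| / |A|, and fuzzy entropy E(A) = |A ∩ A^c| / |A ∪ A^c|. The small example below, using invented membership vectors, also checks the corollary that S(X, A) recovers the relative frequency n_A/N.

```python
# Subsethood and fuzzy entropy via sigma-counts, with min/max as
# intersection/union. Membership vectors here are illustrative.

def subsethood(a, b):
    """Degree to which fuzzy set a is contained in fuzzy set b."""
    return sum(min(x, y) for x, y in zip(a, b)) / sum(a)

def fuzzy_entropy(a):
    """How fuzzy a is: 0 for crisp sets, 1 at the cube midpoint."""
    return (sum(min(x, 1 - x) for x in a) /
            sum(max(x, 1 - x) for x in a))

X = [1, 1, 1, 1, 1]   # the whole space as a fuzzy set (all memberships 1)
A = [1, 0, 1, 1, 0]   # a crisp subset with n_A = 3 of N = 5 elements
# subsethood(X, A) equals 3/5, the relative frequency n_A/N
# fuzzy_entropy([0.5, 0.5]) equals 1.0: the midpoint is maximally fuzzy
```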

413 citations



Journal ArticleDOI
TL;DR: The state of the art of fuzzy set methodology and algorithms in pattern recognition and clustering is discussed, and the problem of cluster validity, expressed in terms of clustering indices, is addressed.

354 citations


Journal ArticleDOI
TL;DR: A general-purpose fuzzy logic inference engine for real-time control applications, designed and fabricated in a 1.1-μm, 3.3-V, double-level-metal CMOS technology, is discussed.
Abstract: A general-purpose fuzzy logic inference engine for real-time control applications, designed and fabricated in a 1.1-μm, 3.3-V, double-level-metal CMOS technology, is discussed. Up to 102 rules are processed in parallel with a single 688K-transistor device. Features include a dynamically reconfigurable and cascadable architecture, a TTL-compatible host interface, laser-programmable redundancy, a special mode for testability, RAM rule storage, and on-chip fuzzification and defuzzification.

294 citations


BookDOI
01 Nov 1990
TL;DR: This book surveys multiobjective programming under uncertainty, covering both stochastic and fuzzy approaches, including the use of fuzzy logic with linguistic quantifiers in multiobjective decision making and optimization as a step towards more human-consistent models.
Abstract: Contents:
I. The General Framework
1. Multiobjective programming under uncertainty: scope and goals of the book
2. Multiobjective programming: basic concepts and approaches
3. Stochastic programming: numerical solution techniques by semi-stochastic approximation methods
4. Fuzzy programming: a survey of recent developments
II. The Stochastic Approach
1. Overview of different approaches for solving stochastic programming problems with multiple objective functions
2. "STRANGE": an interactive method for multiobjective stochastic linear programming, and "STRANGE-MOMIX": its extension to integer variables
3. Application of STRANGE to energy studies
4. Multiobjective stochastic linear programming with incomplete information: a general methodology
5. Computation of efficient solutions of stochastic optimization problems with applications to regression and scenario analysis
III. The Fuzzy Approach
1. Interactive decision-making for multiobjective programming problems with fuzzy parameters
2. A possibilistic approach for multiobjective programming problems. Efficiency of solutions
3. "FLIP": an interactive method for multiobjective linear programming with fuzzy coefficients
4. Application of the "FLIP" method to farm structure optimization under uncertainty
5. "FULPAL": an interactive method for solving (multiobjective) fuzzy linear programming problems
6. Multiple objective linear programming problems in the presence of fuzzy coefficients
7. Inequality constraints between fuzzy numbers and their use in mathematical programming
8. Using fuzzy logic with linguistic quantifiers in multiobjective decision making and optimization: a step towards more human-consistent models
IV. Stochastic Versus Fuzzy Approaches and Related Issues
1. Stochastic versus possibilistic multiobjective programming
2. A comparison study of "STRANGE" and "FLIP"
3. Multiobjective mathematical programming with inexact data

Journal ArticleDOI
TL;DR: An approach to intelligent PID (proportional integral derivative) control of industrial systems based on the application of fuzzy logic is presented; an appropriate fuzzy matrix determines small changes to nominal controller settings during system operation, leading to improved transient and steady-state behavior of the closed-loop system.
Abstract: An approach to intelligent PID (proportional integral derivative) control of industrial systems which is based on the application of fuzzy logic is presented. This approach assumes that nominal controller parameter settings are available through some classical tuning technique (Ziegler-Nichols, Kalman, etc.). By using an appropriate fuzzy matrix (similar to the MacVicar-Whelan matrix), it is possible to determine small changes to these values during system operation, and these lead to improved transient and steady-state behavior of the closed-loop system. This is achieved at the expense of a small extra computational effort, which can very easily be undertaken by a microprocessor. Several experimental results illustrate the improvements achieved.
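The gain-scheduling idea can be sketched as a small rule matrix, indexed by coarse linguistic labels of the error and its rate, that nudges the nominal gains. The labels, thresholds, and matrix entries below are invented for illustration; they are not the paper's MacVicar-Whelan-style table.

```python
# Hedged sketch of fuzzy gain scheduling around nominal PID settings.
# All numeric values here are illustrative assumptions.

def label(x):
    """Coarse linguistic label: Negative, Zero, or Positive."""
    return "N" if x < -0.1 else ("P" if x > 0.1 else "Z")

# (error label, d_error label) -> fractional nudges (delta_Kp, delta_Ki)
ADJUST = {
    ("P", "P"): (+0.10, +0.05), ("N", "N"): (+0.10, +0.05),
    ("P", "N"): (-0.05,  0.00), ("N", "P"): (-0.05,  0.00),
}

def tuned_gains(kp_nom, ki_nom, error, d_error):
    """Return nominal gains nudged by the rule matrix."""
    dkp, dki = ADJUST.get((label(error), label(d_error)), (0.0, 0.0))
    return kp_nom * (1 + dkp), ki_nom * (1 + dki)

# a large, growing error nudges both gains upward
kp, ki = tuned_gains(2.0, 0.5, error=0.4, d_error=0.3)
```

The microprocessor-friendly cost the abstract mentions is visible here: one table lookup and two multiplications per sample.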




Journal ArticleDOI
TL;DR: The results show that the fuzzy classifier may enable the extraction of information about individual pixels and about subpixel phenomena not addressed by other classifiers.

Journal ArticleDOI
TL;DR: In this article, it is shown that any set of linguistic values of a linguistic variable can be axiomatized, which leads to a notion of hedge algebras, and some intuitive properties of hedges are discussed informally.

Journal ArticleDOI
TL;DR: In this paper, necessary and sufficient conditions for some linear and quadratic equations to have a solution when the parameters are either real or complex fuzzy numbers are presented, and applications in chemistry, economics, finance and physics are presented for these types of equations.

Journal ArticleDOI
TL;DR: In this article, a fuzzy model of reliability analysis is presented; it is based on the operations of dependence and fuzziness contained in qualitative expressions, and on the evaluation of the failure possibility and the error possibility.

Journal ArticleDOI
TL;DR: In this article, the extension principle is used to find Y = f(X_1, ..., X_n) when we substitute fuzzy numbers X_i for the x_i, 1 ≤ i ≤ n.
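The extension principle itself is mechanical on discretized fuzzy numbers: mu_Y(y) is the supremum, over all argument tuples mapping to y, of the minimum of the argument memberships. The supports and membership grades below are illustrative.

```python
# The extension principle on discretized fuzzy numbers:
# mu_Y(y) = sup over (x1, ..., xn) with f(x1, ..., xn) = y
#           of min_i mu_Xi(xi).

from itertools import product

def extend(f, *fuzzy_args):
    """Each fuzzy_arg is a dict mapping value -> membership degree."""
    out = {}
    for combo in product(*(fa.items() for fa in fuzzy_args)):
        y = f(*(v for v, _ in combo))
        mu = min(m for _, m in combo)
        out[y] = max(out.get(y, 0.0), mu)   # sup over the preimage of y
    return out

# "about 2" + "about 3" under ordinary addition
about2 = {1: 0.5, 2: 1.0, 3: 0.5}
about3 = {2: 0.5, 3: 1.0, 4: 0.5}
Y = extend(lambda a, b: a + b, about2, about3)
# Y peaks at 5 (from 2 + 3 with full membership)
```

The same `extend` works for any f and any arity, which is the point of the principle; only the exhaustive enumeration is a cost of the discretization.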

Journal ArticleDOI
TL;DR: Working under constraints suggested by the brain may make traditional computation more difficult, but it may lead to solutions to AI problems that would otherwise be overlooked.
Abstract: In our quest to build intelligent machines, we have but one naturally occurring model: the human brain. It follows that one natural idea for artificial intelligence (AI) is to simulate the functioning of the brain directly on a computer. Indeed, the idea of building an intelligent machine out of artificial neurons has been around for quite some time. Some early results on brain-like mechanisms were achieved in [18], and other researchers pursued this notion through the next two decades, e.g., [1, 4, 19, 21, 24]. Research in neural networks came to a virtual halt in the 1970s, however, when the networks under study were shown to be very weak computationally. Recently, there has been a resurgence of interest in neural networks. There are several reasons for this, including the appearance of faster digital computers on which to simulate larger networks, interest in building massively parallel computers, and, most importantly, the discovery of powerful network learning algorithms.

The new neural network architectures have been dubbed connectionist architectures. For the most part, these architectures are not meant to duplicate the operation of the human brain, but rather receive inspiration from known facts about how the brain works. They are characterized by:

- Large numbers of very simple neuron-like processing elements;
- Large numbers of weighted connections between the elements (the weights on the connections encode the knowledge of a network);
- Highly parallel, distributed control; and
- Emphasis on learning internal representations automatically.

Connectionist researchers conjecture that thinking about computation in terms of the brain metaphor, rather than the digital computer metaphor, will lead to insights into the nature of intelligent behavior.

Computers are capable of amazing feats. They can effortlessly store vast quantities of information. Their circuits operate in nanoseconds. They can perform extensive arithmetic calculations without error.
Humans cannot approach these capabilities. On the other hand, humans routinely perform simple tasks such as walking, talking, and commonsense reasoning. Current AI systems cannot do any of these things better than humans. Why not? Perhaps the structure of the brain is somehow suited to these tasks, and not suited to tasks like high-speed arithmetic calculation. Working under constraints suggested by the brain may make traditional computation more difficult, but it may lead to solutions to AI problems that would otherwise be overlooked.

What constraints, then, does the brain offer us? First of all, individual neurons are extremely slow devices when compared to their counterparts in digital computers. Neurons operate in the millisecond range, an eternity to a VLSI designer. Yet humans can perform extremely complex tasks, like interpreting a visual scene or understanding a sentence, in just a tenth of a second. In other words, we do in about a hundred steps what current computers cannot do in ten million steps. How can this be possible? Unlike a conventional computer, the brain contains a huge number of processing elements that act in parallel. This suggests that in our search for solutions, we look for massively parallel algorithms that require no more than 100 processing steps [9].

Also, neurons are failure-prone devices. They are constantly dying (you have certainly lost a few since you began reading this article), and their firing patterns are irregular. Components in digital computers, on the other hand, must operate perfectly. Why? Such components store bits of information that are available nowhere else in the computer: the failure of one component means a loss of information. Suppose that we built AI programs that were not sensitive to the failure of a few components, perhaps by using redundancy and distributing information across a wide range of components? This would open the possibility of very large-scale implementations.
With current technology, it is far easier to build a billion-component integrated circuit in which 95 percent of the components work correctly than it is to build a perfectly functioning million-component machine [8].

Another thing people seem to be able to do better than computers is handle fuzzy situations. We have very large memories of visual, auditory, and problem-solving episodes, and one key operation in solving new problems is finding closest matches to old situations. Inexact matching is something brain-style models seem to be good at, because of the diffuse and fluid way in which knowledge is represented.

The idea behind connectionism, then, is that we may see significant advances in AI if we approach problems from the point of view of brain-style computation rather than rule-based symbol manipulation. At the end of this article, we will look more closely at the relationship between connectionist and symbolic AI.

Journal ArticleDOI
TL;DR: In this paper, several alternative rules for generating exact choices from fuzzy weak preference relations are introduced and the extent to which exact choice sets generated by these rules satisfy fairly weak rationality conditions is studied.


Proceedings ArticleDOI
05 Dec 1990
TL;DR: It is shown that the direct method of Lyapunov can be used to determine sufficient conditions for global stability of a broad class of fuzzy control schemes, and a measure of robustness is proposed that can be used to evaluate and possibly redesign a given fuzzy control system so as to enhance the range of its stable operation.
Abstract: A new approach to the stability analysis of fuzzy linguistic control (FLC) systems is presented. Specifically, it is shown that the direct method of Lyapunov can be used to determine sufficient conditions for global stability of a broad class of fuzzy control schemes. Moreover, a measure of robustness is proposed that can be used to evaluate and possibly redesign a given fuzzy control system so as to enhance the range of its stable operation. Finally, the application of the proposed methodology is shown, and its implications in terms of control design are demonstrated by means of numeric examples.

Journal ArticleDOI
TL;DR: Fuzzy logic is examined and its application to control systems is discussed, and the possibility of interfacing fuzzy logic to existing control systems is noted.
Abstract: Fuzzy logic is examined, and its application to control systems is discussed. The steps taken to design a fuzzy controller are described, and the possibility of interfacing fuzzy logic to existing control systems is noted. Tools for developing and modeling fuzzy control systems are described.

Book
01 Sep 1990
TL;DR: With a practical, hands-on, applications-oriented approach, the book develops computer models for applications to decision-making processes, introducing the basic notion of relative grades via the fuzzy set theoretic approach.
Abstract: Until this book, the available literature on fuzzy sets has been, at best, scattered throughout industrial and university libraries. Encapsulated here is a sound discussion of the basic theoretical and practical aspects involved in fuzzy database systems. With a practical, hands-on, applications-oriented approach, it develops computer models for applications to decision-making processes, introducing the basic notion of relative grades via the fuzzy set theoretic approach. Also covers fuzzy relational databases and their calculus, and the fuzzy relational (structured) query language (FSQL). The last sections present methods for treating the incomplete information in fuzzy PROLOG database (FPDB) systems. Several examples of knowledge representation, expert systems, fuzzy control, and fuzzy clustering and information retrieval illustrate the theory. An extended sample database is used throughout the book.
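The flavor of a fuzzy relational query can be illustrated in a few lines: rows satisfy a fuzzy predicate to a degree, and a threshold keeps the best matches. The membership function, table, and threshold below are invented; this does not reproduce actual FSQL syntax.

```python
# Toy fuzzy selection over a relation: grade each row by a fuzzy
# predicate and keep rows whose grade meets a threshold. All names
# and values are hypothetical.

def high_salary(s):
    """Assumed membership function for the linguistic term 'high'."""
    return max(0.0, min(1.0, (s - 40000) / 30000))

def fuzzy_select(rows, predicate, threshold):
    """Return (key, grade) pairs for rows matching to at least threshold."""
    graded = [(name, predicate(value)) for name, value in rows]
    return [(name, g) for name, g in graded if g >= threshold]

rows = [("ann", 72000), ("bob", 45000), ("eve", 38000)]
result = fuzzy_select(rows, high_salary, 0.15)
# ann matches fully, bob partially, eve not at all
```

Unlike a crisp WHERE clause, the grade survives into the result, so later query stages can keep ranking by it.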

Journal ArticleDOI
TL;DR: It is shown that a fuzzy subset of a group is a fuzzy subgroup iff the complement of this fuzzy subset is an anti fuzzy subgroup, and it is proved that if a fuzzy set is an anti fuzzy subgroup then the fuzzifications of its lower level subsets are also anti fuzzy subgroups.

Book ChapterDOI
01 Jan 1990
TL;DR: In this paper some results on group decision making under fuzzy preferences are reviewed and how to represent fuzzy preferences is discussed.
Abstract: In this paper some results on group decision making under fuzzy preferences are reviewed. Since preferences of men are often “fuzzy” and the direct interpersonal comparison of preferences is not easy, the above topic is inevitable and quite important. First we discuss how to represent fuzzy preferences. They might be represented by fuzzy choice sets, fuzzy binary relations or fuzzy utility functions. Secondly group decision making situations are classified according to the above levels in representing both individual and group preferences. Some methods are explained in each category.

Journal ArticleDOI
TL;DR: In this article, two approaches for constructing control charts for quality assurance when the observations are in the form of linguistic data are presented; both are based on fuzzy set theory and use fuzzy subsets to model the linguistic terms used to describe product quality.
Abstract: Two approaches for constructing control charts for quality assurance when the observations are in the form of linguistic data are presented. Both approaches are based on fuzzy set theory and use fuzzy subsets to model the linguistic terms used to describe product quality. They differ in the interpretation of the control limits and in the procedure used to reduce the fuzzy subsets to scalars for determining the chart parameters. The results obtained with simulated data suggest that, on the basis of sensitivity to process shifts, the control charts for linguistic data perform better than conventional p control charts. The number of linguistic terms used in classifying the observations was found to influence the sensitivity of these control charts. The transformation method used to obtain the representative values and the amount of fuzziness do not seem to affect the performance of either type of control charts.
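The "representative value" route can be sketched as follows: each linguistic term maps to a fuzzy subset, the subset is reduced to a scalar (here its centroid), and ordinary x-bar-style limits are computed from those scalars. The term definitions and the simple standard-deviation limits are illustrative assumptions, not the paper's transformation methods.

```python
# Sketch of control-chart limits from linguistic observations via
# assumed triangular fuzzy terms reduced to centroid scalars.

TERMS = {  # linguistic term -> triangular fuzzy number (a, b, c)
    "good": (0.0, 0.0, 0.25), "fair": (0.0, 0.25, 0.5),
    "poor": (0.25, 0.5, 0.75), "bad": (0.5, 0.75, 1.0),
}

def centroid(a, b, c):
    """Centroid of a triangular fuzzy number."""
    return (a + b + c) / 3.0

def chart_limits(samples, k=3.0):
    """samples: list of subgroups, each a list of linguistic terms.
    Returns (lower limit, center line, upper limit)."""
    means = [sum(centroid(*TERMS[t]) for t in s) / len(s) for s in samples]
    grand = sum(means) / len(means)
    sd = (sum((m - grand) ** 2 for m in means) / len(means)) ** 0.5
    return grand - k * sd, grand, grand + k * sd

lcl, center, ucl = chart_limits([["good", "good"], ["fair", "poor"]])
```

A new subgroup's mean representative value is then plotted against these limits exactly as on a conventional chart.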

Journal ArticleDOI
TL;DR: The present paper is intended to emphasize and explore the use of fuzzy set theory to develop a methodology that is computationally simple and easy to use in the quantitative assessment of the failure probability of catastrophic events in PSA, particularly in level-I studies.