Journal ArticleDOI

Logical Divergence, Logical Entropy, and Logical Mutual Information in Product MV-Algebras.

16 Feb 2018-Entropy (Multidisciplinary Digital Publishing Institute)-Vol. 20, Iss: 2, pp 129
TL;DR: A new kind of entropy in product MV-algebras, namely the logical entropy and its conditional version, is proposed, and the logical cross entropy and logical divergence are defined.
Abstract: In this paper we propose, using the logical entropy function, a new kind of entropy in product MV-algebras, namely the logical entropy and its conditional version. Fundamental characteristics of these quantities are established, and the results regarding the logical entropy are subsequently used to define the logical mutual information of experiments in the studied case. In addition, we define the logical cross entropy and logical divergence for the examined situation and prove basic properties of the suggested quantities. To illustrate the results, we provide several numerical examples.
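For orientation, the classical counterparts of these notions are easy to state. The sketch below uses ordinary probability distributions P = (p_1, ..., p_n) and Q = (q_1, ..., q_n); in the paper itself the p_i are replaced by state values m(a_i) of partition elements in a product MV-algebra, so this is an illustration of the underlying idea rather than the paper's own formulation:

\[
h(P) = \sum_{i=1}^{n} p_i (1 - p_i) = 1 - \sum_{i=1}^{n} p_i^{2}, \qquad
h(P \,\|\, Q) = 1 - \sum_{i=1}^{n} p_i q_i, \qquad
d(P \,\|\, Q) = \sum_{i=1}^{n} (p_i - q_i)^{2}.
\]

Here h is the logical entropy, h(P‖Q) the logical cross entropy, and d(P‖Q) the logical divergence; the identity d(P‖Q) = 2h(P‖Q) − h(P) − h(Q) ≥ 0 shows that the divergence vanishes exactly when P = Q.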
Citations
Journal ArticleDOI
08 Aug 2018-Entropy
TL;DR: It is proven that the Kullback–Leibler divergence of states on a given product MV-algebra introduced by Markechová and Riečan in (Entropy 2017, 19, 267) can be obtained as the limit of their Rényi divergence.
Abstract: This article deals with new concepts in a product MV-algebra, namely, the concepts of Rényi entropy and Rényi divergence. We define the Rényi entropy of order q of a partition in a product MV-algebra and its conditional version, and we study their properties. It is shown that the proposed concepts are consistent, in the limit of q going to 1, with the Shannon entropy of partitions in a product MV-algebra defined and studied by Petrovičová (Soft Comput. 2000, 4, 41-44). Moreover, we introduce and study the notion of Rényi divergence in a product MV-algebra. It is proven that the Kullback–Leibler divergence of states on a given product MV-algebra introduced by Markechová and Riečan in (Entropy 2017, 19, 267) can be obtained as the limit of their Rényi divergence. In addition, the relationship between the Rényi entropy and the Rényi divergence, as well as the relationship between the Rényi divergence and the Kullback–Leibler divergence in a product MV-algebra, are examined.
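On ordinary probability distributions, the quantities this article generalizes take the familiar forms below; the MV-algebra versions replace probabilities with state values, so this display is a reference point, not the article's own definitions:

\[
H_q(P) = \frac{1}{1-q} \log \sum_{i=1}^{n} p_i^{q}, \qquad
D_q(P \,\|\, Q) = \frac{1}{q-1} \log \sum_{i=1}^{n} p_i^{q} q_i^{1-q}, \qquad q > 0,\ q \neq 1,
\]

with \(\lim_{q \to 1} H_q(P) = -\sum_i p_i \log p_i\) (the Shannon entropy) and \(\lim_{q \to 1} D_q(P \,\|\, Q) = \sum_i p_i \log (p_i / q_i)\) (the Kullback–Leibler divergence), mirroring the limit results proven in the article.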

4 citations

Journal ArticleDOI
09 Aug 2018-Entropy
TL;DR: It is shown that the Tsallis entropy of order α, where α>1, has the property of sub-additivity, and it is proven that the proposed entropy measure is invariant under isomorphism of product MV-algebra dynamical systems.
Abstract: This paper is concerned with the mathematical modelling of Tsallis entropy in product MV-algebra dynamical systems. We define the Tsallis entropy of order α, where α ∈ (0, 1) ∪ (1, ∞), of a partition in a product MV-algebra and its conditional version, and we examine their properties. Among other results, it is shown that the Tsallis entropy of order α, where α > 1, has the property of sub-additivity. This property allows us to define, for α > 1, the Tsallis entropy of a product MV-algebra dynamical system. It is proven that the proposed entropy measure is invariant under isomorphism of product MV-algebra dynamical systems.
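For a probability distribution P = (p_1, ..., p_n), the classical Tsallis entropy that this definition transfers to the MV-algebra setting reads, as a point of reference:

\[
T_\alpha(P) = \frac{1}{\alpha - 1} \left( 1 - \sum_{i=1}^{n} p_i^{\alpha} \right), \qquad \alpha \in (0,1) \cup (1,\infty),
\]

and \(\lim_{\alpha \to 1} T_\alpha(P)\) recovers the Shannon entropy. Note that α = 2 gives \(1 - \sum_i p_i^2\), the logical entropy of the main paper, which is one reason sub-additivity for α > 1 is the key property for building a dynamical-system invariant.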

3 citations

Journal ArticleDOI
11 Apr 2018-Entropy
TL;DR: It is shown that the Shannon entropy and the conditional Shannon entropy of fuzzy partitions can be derived from the R-norm entropy and conditional R-norm entropy of fuzzy partitions, respectively, as the limiting cases for R going to 1.
Abstract: In the presented article, we define the R-norm entropy and the conditional R-norm entropy of partitions of a given fuzzy probability space and study the properties of the suggested entropy measures. In addition, we introduce the concept of R-norm divergence of fuzzy P-measures and we derive fundamental properties of this quantity. Specifically, it is shown that the Shannon entropy and the conditional Shannon entropy of fuzzy partitions can be derived from the R-norm entropy and conditional R-norm entropy of fuzzy partitions, respectively, as the limiting cases for R going to 1; the Kullback–Leibler divergence of fuzzy P-measures may be inferred from the R-norm divergence of fuzzy P-measures as the limiting case for R going to 1. We also provide numerical examples that illustrate the results.
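The crisp R-norm entropy underlying these fuzzy versions is standard (due to Boekee and Van der Lubbe); for a distribution P = (p_1, ..., p_n) it is

\[
H_R(P) = \frac{R}{R-1} \left( 1 - \Big( \sum_{i=1}^{n} p_i^{R} \Big)^{1/R} \right), \qquad R > 0,\ R \neq 1,
\]

and letting R → 1 yields the Shannon entropy, which is the limiting behaviour the article establishes for fuzzy partitions and, analogously, for the R-norm divergence versus the Kullback–Leibler divergence.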

3 citations

Journal ArticleDOI
TL;DR: A general type of entropy of a product MV-algebra dynamical system that includes the logical entropy and the Kolmogorov–Sinai entropy as special cases is introduced, and it is proved that the proposed entropy measure is invariant under isomorphism of product MV-algebra dynamical systems.
Abstract: The present paper is aimed at studying the entropy of dynamical systems in product MV-algebras. First, by using the concept of logical entropy of a partition in a product MV-algebra introduced and studied by Markechova et al. (Entropy 20:129, 2018), we define the logical entropy of a dynamical system in the studied algebraic structure. In addition, we introduce a general type of entropy of a product MV-algebra dynamical system that includes the logical entropy and the Kolmogorov–Sinai entropy as special cases. It is proved that the proposed entropy measure is invariant under isomorphism of product MV-algebra dynamical systems.
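Definitions of this kind typically follow the Kolmogorov–Sinai construction; as a rough sketch in our own notation (with (M, m, τ) a product MV-algebra dynamical system, H an entropy of partitions, and A ∨ B the common refinement of partitions), the entropy of the system would be

\[
h(\tau, A) = \lim_{n \to \infty} \frac{1}{n}\, H\!\left( \bigvee_{i=0}^{n-1} \tau^{i}(A) \right), \qquad
h(\tau) = \sup_{A} h(\tau, A),
\]

where the supremum runs over all partitions A in the product MV-algebra. Substituting a Shannon-type or a logical entropy for H gives the two special cases the paper mentions; this is an assumed outline of the construction, not the paper's exact formulation.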

1 citation

Journal ArticleDOI
01 Oct 2022-Entropy
TL;DR: A multicriteria group decision-making method based on intuitionistic normal clouds and cloud distance entropy is proposed; the VIKOR method, which integrates group utility and individual regret, is extended to the intuitionistic normal cloud environment, yielding the ranking of the alternatives.
Abstract: The uncertainty of information is an important issue that must be faced when dealing with decision-making problems. Randomness and fuzziness are the two most common types of uncertainty. In this paper, we propose a multicriteria group decision-making method based on intuitionistic normal cloud and cloud distance entropy. First, the backward cloud generation algorithm for intuitionistic normal clouds is designed to transform the intuitionistic fuzzy decision information given by all experts into an intuitionistic normal cloud matrix to avoid the loss and distortion of information. Second, the distance measurement of the cloud model is introduced into the information entropy theory, and the concept of cloud distance entropy is proposed. Then, the distance measurement for intuitionistic normal clouds based on numerical features is defined and its properties are discussed, based on which the criterion weight determination method under intuitionistic normal cloud information is proposed. In addition, the VIKOR method, which integrates group utility and individual regret, is extended to the intuitionistic normal cloud environment, and thus the ranking results of the alternatives are obtained. Finally, the effectiveness and practicality of the proposed method are demonstrated by two numerical examples.
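For context, the crisp VIKOR indices that the article extends to the intuitionistic normal cloud environment are, in standard notation (f_ij the rating of alternative i on criterion j, f_j^* and f_j^- the best and worst values of criterion j, w_j the criterion weights, and v a compromise weight, often 0.5):

\[
S_i = \sum_{j} w_j \, \frac{f_j^{*} - f_{ij}}{f_j^{*} - f_j^{-}}, \qquad
R_i = \max_{j} \, w_j \, \frac{f_j^{*} - f_{ij}}{f_j^{*} - f_j^{-}},
\]
\[
Q_i = v \, \frac{S_i - S^{*}}{S^{-} - S^{*}} + (1 - v) \, \frac{R_i - R^{*}}{R^{-} - R^{*}},
\]

with S^* = min_i S_i, S^- = max_i S_i, R^* = min_i R_i, R^- = max_i R_i. Alternatives are ranked by Q_i, which trades off group utility (S_i) against individual regret (R_i) through v; in the article these scalar ratings are replaced by intuitionistic normal clouds compared via the proposed distance measure.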

1 citation

References
Journal ArticleDOI
TL;DR: This final installment of the paper considers the case where the signals or the messages or both are continuously variable, in contrast with the discrete nature assumed until now.
Abstract: In this final installment of the paper we consider the case where the signals or the messages or both are continuously variable, in contrast with the discrete nature assumed until now. To a considerable extent the continuous case can be obtained through a limiting process from the discrete case by dividing the continuum of messages and signals into a large but finite number of small regions and calculating the various parameters involved on a discrete basis. As the size of the regions is decreased these parameters in general approach as limits the proper values for the continuous case. There are, however, a few new effects that appear and also a general change of emphasis in the direction of specialization of the general results to particular cases.
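The limiting process described here is the standard route to differential entropy; as a short illustration in modern notation (not Shannon's own), quantizing a density p(x) into bins of width Δ gives

\[
H(X_\Delta) = -\sum_i p(x_i)\,\Delta \, \log\!\big( p(x_i)\,\Delta \big)
\;\approx\; -\int p(x) \log p(x)\, dx \;-\; \log \Delta,
\]

so as Δ → 0 the discrete entropy diverges through the log Δ term, while the integral part, the differential entropy h(X) = −∫ p log p dx, is what carries over to the continuous case. Its dependence on the coordinate system is arguably one of the "new effects" the passage alludes to.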

65,425 citations

Posted Content
TL;DR: The information deviation between any two finite measures cannot be increased by any statistical operations (Markov morphisms), and it is invariant if and only if the morphism is sufficient for these two measures, as mentioned in this paper.
Abstract: The information deviation between any two finite measures cannot be increased by any statistical operations (Markov morphisms). It is invariant if and only if the morphism is sufficient for these two measures.
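In the familiar discrete special case this is the data-processing property of the Kullback–Leibler divergence: with

\[
D(P \,\|\, Q) = \sum_{i} p_i \log \frac{p_i}{q_i},
\]

any stochastic matrix (Markov kernel) K satisfies D(PK ‖ QK) ≤ D(P ‖ Q), with equality precisely when the operation is sufficient for the pair {P, Q}, i.e., loses nothing for discriminating between them. This restatement is ours; the paper works with general finite measures.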

5,228 citations


"Logical Divergence, Logical Entropy..." refers background in this paper

  • ...The Kullback-Leibler divergence (often shortened to K-L divergence) was proposed in [49] as the distance between two probability distributions and it is currently one of the most basic quantities in information theory....


Book
01 Jan 1990
TL;DR: This book is an updated version of the information theory classic, first published in 1990, with expanded treatment of stationary or sliding-block codes and their relations to traditional block codes, and expanded discussion of results from ergodic theory relevant to information theory.
Abstract: This book is an updated version of the information theory classic, first published in 1990. About one-third of the book is devoted to Shannon source and channel coding theorems; the remainder addresses sources, channels, and codes, and information and distortion measures and their properties. New in this edition:

  • Expanded treatment of stationary or sliding-block codes and their relations to traditional block codes
  • Expanded discussion of results from ergodic theory relevant to information theory
  • Expanded treatment of B-processes, i.e., processes formed by stationary coding of memoryless sources
  • New material on trading off information and distortion, including the Marton inequality
  • New material on the properties of optimal and asymptotically optimal source codes
  • New material on the relationships of source coding and rate-constrained simulation or modeling of random processes

Significant material not covered in other information theory texts includes stationary/sliding-block codes, a geometric view of information theory provided by process distance measures, and general Shannon coding theorems for asymptotic mean stationary sources, which may be neither ergodic nor stationary, and d-bar continuous channels.

1,810 citations

Journal ArticleDOI
TL;DR: In this article, three general methods for obtaining measures of diversity within a population and dissimilarity between populations are discussed: one is based on an intrinsic notion of diversity between individuals, and the others make use of the concepts of entropy and discrimination.
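Assuming this entry is Rao's work on diversity and dissimilarity measures, the method built on an intrinsic notion of difference between individuals is the quadratic entropy,

\[
Q(P) = \sum_{i=1}^{n} \sum_{j=1}^{n} d_{ij} \, p_i \, p_j,
\]

where d_{ij} is a dissimilarity between categories i and j. Taking d_{ij} = 1 for i ≠ j and d_{ii} = 0 reduces Q(P) to 1 − Σ_i p_i², i.e., the logical entropy of the main paper, which makes this a natural background reference.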

1,462 citations