Journal ArticleDOI

R-Norm Entropy and R-Norm Divergence in Fuzzy Probability Spaces

11 Apr 2018-Entropy (Multidisciplinary Digital Publishing Institute)-Vol. 20, Iss: 4, pp 272
TL;DR: It is shown that the Shannon entropy and the conditional Shannon entropy of fuzzy partitions can be derived from the R-norm entropy and conditional R-norm entropy of fuzzy partitions, respectively, as the limiting cases for R going to 1.
Abstract: In the presented article, we define the R-norm entropy and the conditional R-norm entropy of partitions of a given fuzzy probability space and study the properties of the suggested entropy measures. In addition, we introduce the concept of R-norm divergence of fuzzy P-measures and we derive fundamental properties of this quantity. Specifically, it is shown that the Shannon entropy and the conditional Shannon entropy of fuzzy partitions can be derived from the R-norm entropy and conditional R-norm entropy of fuzzy partitions, respectively, as the limiting cases for R going to 1; the Kullback–Leibler divergence of fuzzy P-measures may be inferred from the R-norm divergence of fuzzy P-measures as the limiting case for R going to 1. We also provide numerical examples that illustrate the results.
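The limiting behaviour described in the abstract can be checked numerically in the crisp (non-fuzzy) special case. The sketch below assumes the classical R-norm entropy of Boekee and van der Lubbe, H_R(P) = R/(R−1) · (1 − (Σ p_i^R)^{1/R}), and the analogous R-norm divergence D_R(P‖Q) = R/(R−1) · ((Σ p_i^R q_i^{1−R})^{1/R} − 1); as R → 1 these tend to the Shannon entropy and the Kullback–Leibler divergence (both in nats). The distributions are arbitrary illustrative choices, not taken from the paper.

```python
import math

def r_norm_entropy(p, R):
    """R-norm entropy (nats) of a probability vector p, for R > 0, R != 1."""
    return (R / (R - 1.0)) * (1.0 - sum(pi ** R for pi in p) ** (1.0 / R))

def shannon_entropy(p):
    """Shannon entropy in nats."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def r_norm_divergence(p, q, R):
    """R-norm divergence (nats) between probability vectors p and q, R > 0, R != 1."""
    s = sum(pi ** R * qi ** (1.0 - R) for pi, qi in zip(p, q))
    return (R / (R - 1.0)) * (s ** (1.0 / R) - 1.0)

def kl_divergence(p, q):
    """Kullback-Leibler divergence in nats."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

# as R -> 1, both quantities approach their Shannon/KL limits
for R in (2.0, 1.1, 1.01, 1.001):
    print(R, r_norm_entropy(p, R), r_norm_divergence(p, q, R))
print("Shannon:", shannon_entropy(p), "KL:", kl_divergence(p, q))
```

Running the loop shows both columns settling toward the Shannon entropy and the KL divergence as R approaches 1 from above.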


Citations
Journal ArticleDOI
01 Feb 2022-Entropy
TL;DR: A systematic review on the applications of entropy and related information-theoretical concepts in the design, implementation and evaluation of cryptographic schemes, algorithms, devices and systems is presented.
Abstract: After being introduced by Shannon as a measure of disorder and unavailable information, the notion of entropy has found its applications in a broad range of scientific disciplines. In this paper, we present a systematic review on the applications of entropy and related information-theoretical concepts in the design, implementation and evaluation of cryptographic schemes, algorithms, devices and systems. Moreover, we study existing trends, and establish a roadmap for future research in these areas.

10 citations

Journal ArticleDOI
TL;DR: It is proved that the Renyi entropy of a fuzzy dynamical system is invariant under isomorphism of fuzzy dynamical systems.
Abstract: The present paper is devoted to the study of Renyi entropy in the fuzzy case. We define the Renyi entropy of a fuzzy partition and its conditional version and derive basic properties of the suggested entropy measures. In particular, it is shown that the Renyi entropy of a fuzzy partition is monotonically decreasing. Consequently, using the proposed concept of Renyi entropy, the notion of Renyi entropy of a fuzzy dynamical system is introduced. Finally, it is proved that the Renyi entropy of a fuzzy dynamical system is invariant under isomorphism of fuzzy dynamical systems.
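For a crisp probability distribution, the Rényi entropy underlying this construction is H_α(P) = (1/(1−α)) ln Σ p_i^α for α > 0, α ≠ 1. The sketch below illustrates two classical facts echoed in the abstract: H_α is non-increasing in the order α, and H_α tends to the Shannon entropy as α → 1. The example distribution is an arbitrary illustrative choice.

```python
import math

def renyi_entropy(p, alpha):
    """Renyi entropy of order alpha (nats) of a probability vector p; alpha > 0, alpha != 1."""
    return math.log(sum(pi ** alpha for pi in p)) / (1.0 - alpha)

def shannon_entropy(p):
    """Shannon entropy in nats."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

p = [0.6, 0.25, 0.15]

# H_alpha is non-increasing in alpha for a fixed distribution
alphas = [0.5, 0.9, 1.1, 2.0, 5.0]
values = [renyi_entropy(p, a) for a in alphas]
assert all(values[i] >= values[i + 1] for i in range(len(values) - 1))

# and it tends to the Shannon entropy as alpha -> 1
print(renyi_entropy(p, 1.0001), shannon_entropy(p))
```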

6 citations

Journal ArticleDOI
01 Aug 2019
TL;DR: It is proven that the Kullback–Leibler divergence and Shannon’s entropy of partitions in a given product MV-algebra can be obtained as the limits of their R-norm divergence and R-norm entropy, respectively.
Abstract: The aim of the paper is to extend the results concerning the Shannon entropy and Kullback–Leibler divergence in product MV-algebras to the case of R-norm entropy and R-norm divergence. We define the R-norm entropy of finite partitions in product MV-algebras and its conditional version and derive the basic properties of these entropy measures. In addition, we introduce the concept of R-norm divergence in product MV-algebras and we prove basic properties of this quantity. In particular, it is proven that the Kullback–Leibler divergence and Shannon’s entropy of partitions in a given product MV-algebra can be obtained as the limits of their R-norm divergence and R-norm entropy, respectively.
References
Journal ArticleDOI
TL;DR: This final installment of the paper considers the case where the signals or the messages or both are continuously variable, in contrast with the discrete nature assumed until now.
Abstract: In this final installment of the paper we consider the case where the signals or the messages or both are continuously variable, in contrast with the discrete nature assumed until now. To a considerable extent the continuous case can be obtained through a limiting process from the discrete case by dividing the continuum of messages and signals into a large but finite number of small regions and calculating the various parameters involved on a discrete basis. As the size of the regions is decreased these parameters in general approach as limits the proper values for the continuous case. There are, however, a few new effects that appear and also a general change of emphasis in the direction of specialization of the general results to particular cases.
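The limiting process described above can be illustrated numerically: discretize a continuous density into cells of width Δ, compute the discrete Shannon entropy, and observe that H_discrete + ln Δ approaches the differential entropy as Δ shrinks. The sketch below uses a standard exponential density with rate 1, whose differential entropy is exactly 1 nat; the setup is illustrative and not taken from the paper.

```python
import math

def discrete_entropy_of_exponential(delta, n_cells=200000):
    """Shannon entropy (nats) of an Exp(1) random variable discretized into cells of width delta."""
    h = 0.0
    for k in range(n_cells):
        # exact cell probability: P(k*delta <= X < (k+1)*delta) for X ~ Exp(1)
        pk = math.exp(-k * delta) - math.exp(-(k + 1) * delta)
        if pk > 0:
            h -= pk * math.log(pk)
    return h

# differential entropy of Exp(1) is exactly 1 nat;
# H_discrete + ln(delta) should approach it as delta -> 0
for delta in (0.5, 0.1, 0.01):
    print(delta, discrete_entropy_of_exponential(delta) + math.log(delta))
```

Note that the discrete entropy itself diverges like −ln Δ; only the shifted quantity converges, which is exactly the "general change of emphasis" the passage alludes to when moving from the discrete to the continuous case.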

65,425 citations

Posted Content
TL;DR: The information deviation between any two finite measures cannot be increased by any statistical operations (Markov morphisms), and it is invariant if and only if the morphism is sufficient for these two measures, as mentioned in this paper.
Abstract: The information deviation between any two finite measures cannot be increased by any statistical operations (Markov morphisms). It is invariant if and only if the morphism is sufficient for these two measures.
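The monotonicity statement above (a special case of the data-processing inequality) is easy to check numerically in the finite crisp case: pushing two distributions through the same row-stochastic matrix (a Markov morphism on finite probability vectors) can only shrink their Kullback–Leibler divergence. The distributions and the matrix below are arbitrary illustrative choices.

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence (nats) between finite probability vectors."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def push_forward(p, T):
    """Apply a row-stochastic matrix T (a Markov kernel) to the distribution p."""
    return [sum(p[i] * T[i][j] for i in range(len(p))) for j in range(len(T[0]))]

p = [0.7, 0.2, 0.1]
q = [0.3, 0.3, 0.4]
# each row of T sums to 1, so T maps probability vectors to probability vectors
T = [[0.8, 0.2],
     [0.5, 0.5],
     [0.1, 0.9]]

before = kl_divergence(p, q)
after = kl_divergence(push_forward(p, T), push_forward(q, T))
assert after <= before  # the information deviation cannot increase
print(before, after)
```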

5,228 citations


"R-Norm Entropy and R-Norm Divergenc..." refers background in this paper

  • ...The notion of Kullback–Leibler divergence was introduced in [33] as the distance measure between two probability distributions....


Book
01 Jan 1990
TL;DR: This book is an updated version of the information theory classic, first published in 1990, with expanded treatment of stationary or sliding-block codes and their relations to traditional block codes and discussion of results from ergodic theory relevant to information theory.
Abstract: This book is an updated version of the information theory classic, first published in 1990. About one-third of the book is devoted to Shannon source and channel coding theorems; the remainder addresses sources, channels, and codes, and information and distortion measures and their properties. New in this edition: expanded treatment of stationary or sliding-block codes and their relations to traditional block codes; expanded discussion of results from ergodic theory relevant to information theory; expanded treatment of B-processes, i.e., processes formed by stationary coding of memoryless sources; new material on trading off information and distortion, including the Marton inequality; new material on the properties of optimal and asymptotically optimal source codes; and new material on the relationships of source coding and rate-constrained simulation or modeling of random processes. Significant material not covered in other information theory texts includes stationary/sliding-block codes, a geometric view of information theory provided by process distance measures, and general Shannon coding theorems for asymptotically mean stationary sources, which may be neither ergodic nor stationary, and d-bar continuous channels.

1,810 citations

Journal ArticleDOI
TL;DR: In this article, it is proved that there is a universal group for every effect algebra, as well as a universal vector space over an arbitrary field; the effects in a quantum-mechanical system are the prototypical example of the effect algebras discussed in this paper.
Abstract: The effects in a quantum-mechanical system form a partial algebra and a partially ordered set which is the prototypical example of the effect algebras discussed in this paper. The relationships among effect algebras and such structures as orthoalgebras and orthomodular posets are investigated, as are morphisms and group-valued measures (or charges) on effect algebras. It is proved that there is a universal group for every effect algebra, as well as a universal vector space over an arbitrary field.

911 citations


"R-Norm Entropy and R-Norm Divergenc..." refers background in this paper

  • ...Currently, the subjects of intense study are algebraic systems based on the theory of fuzzy sets, for example, D-posets [34–36], MV-algebras [37–41], and effect algebras [42]....
