Journal ArticleDOI

Logical entropy of quantum dynamical systems

01 Jan 2016-Central European Journal of Physics (Walter de Gruyter GmbH)-Vol. 14, Iss: 1, pp 58
TL;DR: The concepts of logical entropy and conditional logical entropy of finite partitions on a quantum logic are introduced, and a version of the Kolmogorov–Sinai theorem is proved.
Abstract: This paper introduces the concepts of logical entropy and conditional logical entropy of finite partitions on a quantum logic. Some of their ergodic properties are presented. The logical entropy of a quantum dynamical system is also defined, and ergodic properties of dynamical systems on a quantum logic are investigated. Finally, a version of the Kolmogorov–Sinai theorem is proved.
Citations
Posted Content
TL;DR: A novel generalized relative entropy is constructed to avoid some defects of traditional relative entropy, and the experimental results show that the properties of the generalized relative entropy are better than those of relative entropy.
Abstract: Information entropy and its extensions, which are important generalizations of entropy, have been applied in many research domains today. In this paper, a novel generalized relative entropy is constructed to avoid some defects of traditional relative entropy. We present the structure of the generalized relative entropy after a discussion of the defects of relative entropy. Moreover, some properties of the proposed generalized relative entropy are presented and proved. The proposed generalized relative entropy is shown to have a finite range and to be a finite distance metric.

39 citations

Journal ArticleDOI
13 Jun 2017-Entropy
TL;DR: In this paper, a novel generalized relative entropy is constructed to avoid some defects of traditional relative entropy. It is proved to have a finite range and to be a finite distance metric, and the experimental results show that its properties are better than those of relative entropy.
Abstract: Information entropy and its extensions, which are important generalizations of entropy, are currently applied to many research domains. In this paper, a novel generalized relative entropy is constructed to avoid some defects of traditional relative entropy. We present the structure of the generalized relative entropy after a discussion of the defects of relative entropy. Moreover, some properties of the proposed generalized relative entropy are presented and proved. The proposed generalized relative entropy is shown to have a finite range and to be a finite distance metric. Finally, we predict nucleosome positioning in fly and yeast based on generalized relative entropy and relative entropy, respectively. The experimental results show that the properties of the generalized relative entropy are better than those of relative entropy.

39 citations

Journal ArticleDOI
06 Jan 2017
TL;DR: Using the suggested concept of entropy of partitions, the logical entropy of a dynamical system is defined and it is proved that it is the same for two dynamical systems that are isomorphic.
Abstract: In the paper by Riecan and Markechova (Fuzzy Sets Syst. 96, 1998), some fuzzy modifications of Shannon’s and Kolmogorov-Sinai’s entropy were studied and the general scheme involving the presented models was introduced. Our aim in this contribution is to provide analogies of these results for the case of the logical entropy. We define the logical entropy and logical mutual information of finite partitions on the appropriate algebraic structure and prove basic properties of these measures. It is shown that, as a special case, we obtain the logical entropy of fuzzy partitions studied by Markechova and Riecan (Entropy 18, 2016). Finally, using the suggested concept of entropy of partitions we define the logical entropy of a dynamical system and prove that it is the same for two dynamical systems that are isomorphic.

14 citations

Journal ArticleDOI
21 Aug 2017-Entropy
TL;DR: The concepts of logical entropy and logical mutual information of experiments in the intuitionistic fuzzy case are introduced, and an analogy of the Kolmogorov-Sinai theorem on generators for IF-dynamical systems is proved.
Abstract: In this contribution, we introduce the concepts of logical entropy and logical mutual information of experiments in the intuitionistic fuzzy case, and study the basic properties of the suggested measures. Subsequently, by means of the suggested notion of logical entropy of an IF-partition, we define the logical entropy of an IF-dynamical system. It is shown that the logical entropy of IF-dynamical systems is invariant under isomorphism. Finally, an analogy of the Kolmogorov–Sinai theorem on generators for IF-dynamical systems is proved.

13 citations

Journal ArticleDOI
TL;DR: It is shown that, by replacing the Shannon entropy function with the logical entropy function, one obtains results analogous to those of the classical Kolmogorov–Sinai entropy theory of dynamical systems.
Abstract: The main purpose of the paper is to extend the results of Ellerman (Int. J. Semant. Comput. 7:121–145, 2013) to the case of dynamical systems. We define the logical entropy and conditional logical entropy of finite measurable partitions and derive the basic properties of these measures. Subsequently, the suggested concept of logical entropy of finite measurable partitions is used to define the logical entropy of a dynamical system. It is proved that two metrically isomorphic dynamical systems have the same logical entropy. Finally, we provide a logical version of the Kolmogorov–Sinai theorem on generators. Thus, it is shown that by replacing the Shannon entropy function with the logical entropy function, we obtain results analogous to those of the classical Kolmogorov–Sinai entropy theory of dynamical systems.

12 citations

References
Journal ArticleDOI
TL;DR: In this article, three general methods for obtaining measures of diversity within a population and dissimilarity between populations are discussed: one is based on an intrinsic notion of dissimilarity between individuals, and the others make use of the concepts of entropy and discrimination.

1,462 citations

Journal ArticleDOI
TL;DR: In this article, an intrinsic diversity ordering of communities is defined and is shown to be equivalent to stochastic ordering, and the sensitivity of an index to rare species is developed, culminating in a crossing-point theorem and a response theory to perturbations.
Abstract: This paper puts forth the view that diversity is an average property of a community and identifies that property as species rarity. An intrinsic diversity ordering of communities is defined and is shown to be equivalent to stochastic ordering. Also, the sensitivity of an index to rare species is developed, culminating in a crossing-point theorem and a response theory to perturbations. Diversity decompositions, analogous to the analysis of variance, are discussed for two-way classifications and mixtures. The paper concludes with a brief survey of genetic diversity, linguistic diversity, industrial concentration, and income inequality.
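The average-rarity view described in this abstract can be made concrete. As a minimal illustrative sketch (the function names here are hypothetical, not from the paper), a diversity index arises by averaging a rarity function R over the species abundances; the choice R(p) = 1 − p recovers the Gini–Simpson index 1 − Σ p_i², while R(p) = −log p recovers Shannon entropy:

```python
import math

def average_rarity(p, rarity):
    """Diversity as the community-average of a species-rarity function R(p_i)."""
    return sum(p_i * rarity(p_i) for p_i in p)

community = [0.5, 0.3, 0.2]  # relative abundances, summing to 1

# R(p) = 1 - p yields the Gini-Simpson index, 1 - sum(p_i^2)
gini_simpson = average_rarity(community, lambda q: 1 - q)

# R(p) = -log p yields Shannon entropy
shannon = average_rarity(community, lambda q: -math.log(q))

print(gini_simpson, shannon)
```

Different rarity functions thus give different indices within one unifying scheme, which is what allows the sensitivity-to-rare-species analysis mentioned above.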

681 citations

Posted Content
TL;DR: In this article, the authors propose a logic of partitions that is dual to the usual Boolean logic of subsets, where the key concept is a distinction of a partition, an ordered pair of elements in distinct blocks of the partition.
Abstract: The logical basis for information theory is the newly developed logic of partitions that is dual to the usual Boolean logic of subsets. The key concept is a "distinction" of a partition, an ordered pair of elements in distinct blocks of the partition. The logical concept of entropy based on partition logic is the normalized counting measure of the set of distinctions of a partition on a finite set -- just as the usual logical notion of probability based on the Boolean logic of subsets is the normalized counting measure of the subsets (events). Thus logical entropy is a measure on the set of ordered pairs, and all the compound notions of entropy (join entropy, conditional entropy, and mutual information) arise in the usual way from the measure (e.g., the inclusion-exclusion principle) -- just like the corresponding notions of probability. The usual Shannon entropy of a partition is developed by replacing the normalized count of distinctions (dits) by the average number of binary partitions (bits) necessary to make all the distinctions of the partition.
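The counting measure described in this abstract admits a direct computation. As an illustrative sketch (not code from the paper), the following counts distinctions of a partition of a finite set and checks that the normalized count |dit(π)|/n² agrees with the closed form 1 − Σ(|B|/n)²:

```python
from itertools import product

def logical_entropy(partition):
    """Logical entropy as the normalized count of distinctions:
    ordered pairs (x, y) whose elements lie in distinct blocks."""
    block_of = {x: i for i, block in enumerate(partition) for x in block}
    elements = list(block_of)
    n = len(elements)
    dits = sum(1 for x, y in product(elements, repeat=2)
               if block_of[x] != block_of[y])
    return dits / n**2

def logical_entropy_formula(partition):
    """Equivalent closed form: 1 minus the sum of squared block probabilities."""
    n = sum(len(block) for block in partition)
    return 1 - sum((len(block) / n) ** 2 for block in partition)

pi = [{0, 1}, {2}, {3, 4, 5}]
print(logical_entropy(pi), logical_entropy_formula(pi))  # both 22/36
```

The agreement follows because the number of non-distinctions is Σ|B|², so |dit(π)| = n² − Σ|B|².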

43 citations

Posted Content
TL;DR: This paper introduces the logic of partitions on a universe set, which is dual to the usual Boolean logic of subsets, and develops it up through correctness and completeness theorems for a tableau system.
Abstract: Modern categorical logic as well as the Kripke and topological models of intuitionistic logic suggest that the interpretation of ordinary "propositional" logic should in general be the logic of subsets of a given universe set. Partitions on a set are dual to subsets of a set in the sense of the category-theoretic duality of epimorphisms and monomorphisms--which is reflected in the duality between quotient objects and subobjects throughout algebra. If "propositional" logic is thus seen as the logic of subsets of a universe set, then the question naturally arises of a dual logic of partitions on a universe set. This paper is an introduction to that logic of partitions dual to classical subset logic. The paper goes from basic concepts up through the correctness and completeness theorems for a tableau system of partition logic.

37 citations

Journal ArticleDOI
TL;DR: In this article, the authors propose a logic of partitions that is dual to the usual Boolean logic of subsets, where the key concept is a distinction of a partition, an ordered pair of elements in distinct blocks of the partition.
Abstract: The logical basis for information theory is the newly developed logic of partitions that is dual to the usual Boolean logic of subsets. The key concept is a "distinction" of a partition, an ordered pair of elements in distinct blocks of the partition. The logical concept of entropy based on partition logic is the normalized counting measure of the set of distinctions of a partition on a finite set--just as the usual logical notion of probability based on the Boolean logic of subsets is the normalized counting measure of the subsets (events). Thus logical entropy is a measure on the set of ordered pairs, and all the compound notions of entropy (join entropy, conditional entropy, and mutual information) arise in the usual way from the measure (e.g., the inclusion-exclusion principle)--just like the corresponding notions of probability. The usual Shannon entropy of a partition is developed by replacing the normalized count of distinctions (dits) by the average number of binary partitions (bits) necessary to make all the distinctions of the partition.

37 citations