Open Access · Journal Article · DOI

Generalized information functions

Zoltán Daróczy
01 Mar 1970 · Vol. 16, Iss. 1, pp. 36-51
TLDR
The concept of information functions of type β (β > 0) is introduced, and by means of these information functions the entropies of type β are defined, which have a number of interesting algebraic and analytic properties similar to Shannon's entropy.
Abstract
The concept of information functions of type β (β > 0) is introduced and discussed. By means of these information functions the entropies of type β are defined. These entropies have a number of interesting algebraic and analytic properties similar to Shannon's entropy. The capacity of type β (β > 1) of a discrete constant channel is defined by means of the entropy of type β. Examples are given for the computation of the capacity of type β, from which Shannon's capacity can be derived as the limiting case β = 1.
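As a purely illustrative aid (not taken from the paper itself), the sketch below computes the entropy of type β in its commonly cited Daróczy form, H_β(P) = (Σ_i p_i^β − 1)/(2^{1−β} − 1) for β ≠ 1, and checks numerically that Shannon entropy (in bits) is recovered in the limiting case β = 1; the function name and the exact normalization are assumptions based on the standard formulation rather than on the abstract above.

```python
import numpy as np

def entropy_type_beta(p, beta):
    """Entropy of type beta in the assumed Daroczy normalization:
    H_beta(P) = (sum_i p_i**beta - 1) / (2**(1 - beta) - 1) for beta != 1,
    which tends to the Shannon entropy (in bits) as beta -> 1."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                              # zero-probability outcomes contribute nothing
    if np.isclose(beta, 1.0):
        return -np.sum(p * np.log2(p))        # Shannon entropy, the limiting case
    return (np.sum(p ** beta) - 1.0) / (2.0 ** (1.0 - beta) - 1.0)

p = [0.5, 0.25, 0.25]
print(entropy_type_beta(p, 2.0))        # 1.25, the entropy of type beta = 2
print(entropy_type_beta(p, 1.0001))     # close to the Shannon value of 1.5
print(entropy_type_beta(p, 1.0))        # 1.5 bits, the Shannon entropy
```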


Citations
Journal Article · DOI

Paradigms of Cognition

Flemming Topsøe
27 Mar 2017
TL;DR: An abstract, quantitative theory is developed that connects elements of information, key ingredients in the cognitive process. Seemingly unrelated results are thereby unified, providing a general framework for the treatment of a multitude of global optimization problems across a range of disciplines such as geometry, statistics and statistical physics.
Journal Article · DOI

On some information measures

TL;DR: The directed divergence of type β, which generalizes Kullback's directed divergence (information measure), is characterized by a simple method using a set of four postulates, together with a theorem on the unique derivation of the relative information measure of type (α, β).
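As a hedged illustration of the quantity named in this TL;DR, the sketch below uses one standard form of the directed divergence of type β, D_β(P‖Q) = (Σ_i p_i^β q_i^{1−β} − 1)/(2^{β−1} − 1), which reduces to Kullback's directed divergence (in bits) as β → 1; the function name and normalization are assumptions, since the cited paper's exact conventions are not reproduced here.

```python
import numpy as np

def directed_divergence_type_beta(p, q, beta):
    """Directed divergence of type beta (one standard, assumed form):
    D_beta(P||Q) = (sum_i p_i**beta * q_i**(1 - beta) - 1) / (2**(beta - 1) - 1),
    which tends to Kullback's directed divergence (in bits) as beta -> 1."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    keep = p > 0                              # terms with p_i = 0 contribute nothing
    p, q = p[keep], q[keep]
    if np.isclose(beta, 1.0):
        return np.sum(p * np.log2(p / q))     # Kullback's directed divergence
    return (np.sum(p ** beta * q ** (1.0 - beta)) - 1.0) / (2.0 ** (beta - 1.0) - 1.0)

p, q = [0.7, 0.2, 0.1], [1/3, 1/3, 1/3]
print(directed_divergence_type_beta(p, q, 2.0))      # divergence of type beta = 2
print(directed_divergence_type_beta(p, q, 1.0001))   # approaches the Kullback value
print(directed_divergence_type_beta(p, q, 1.0))      # Kullback's divergence in bits
```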
Journal Article · DOI

Bivariate certainty and information measures

TL;DR: The generalized theory of marginal certainty and information measures, as introduced by Van der Lubbe et al., is extended to the bivariate case.
Posted Content

A short characterization of relative entropy

TL;DR: It is shown that earlier proofs of characterization theorems for relative entropy, q-logarithmic entropy, and q-logarithmic relative entropy can be simplified considerably, while at the same time relaxing some of the hypotheses.
Journal Article · DOI

Axioms for (α, β, γ)-entropy of a generalized probability scheme

TL;DR: A measure of information of type (α, β, γ) is characterized by adopting certain axioms parallel to those considered earlier by Havrda and Charvát, along with the recursive relation (1.7).
References
Book

Transmission of information
