Open Access · Journal Article

Correspondence and Independence of Numerical Evaluations of Algorithmic Information Measures

TLDR
K_m proves to be a finer-grained measure and a potential alternative to lossless compression algorithms for small entities, where compression fails. A first beta version of an Online Algorithmic Complexity Calculator (OACC), based on a combination of theoretical concepts and numerical calculations, is also announced.
Abstract
We show that real-value approximations of Kolmogorov-Chaitin complexity (K_m), calculated via the algorithmic Coding theorem from the output frequency of a large set of small deterministic Turing machines with up to 5 states (and 2 symbols), agree with the number of instructions used by the Turing machines producing a string s, which is consistent with strict integer-value program-size complexity. Nevertheless, K_m proves to be a finer-grained measure and a potential alternative to lossless compression algorithms for small entities, where compression fails. We also show that neither K_m nor the number of instructions used shows any correlation with Bennett's Logical Depth LD(s) other than what is predicted by the theory. The agreement between theory and numerical calculations shows that, despite the undecidability of these theoretical measures, approximations are stable and meaningful, even for small programs and for short strings. We also announce a first beta version of an Online Algorithmic Complexity Calculator (OACC), based on a combination of theoretical concepts, as a numerical implementation of the Coding Theorem Method.
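The relation at the heart of the abstract, the algorithmic Coding theorem, ties a string's complexity to its output frequency among small Turing machines: K_m(s) ≈ -log2 D(s), where D(s) is the fraction of halting machines that produce s. A minimal sketch of this conversion, with hypothetical frequency counts chosen purely for illustration:

```python
import math

def coding_theorem_km(output_count, total_halting):
    """Approximate K_m(s) = -log2 D(s), where D(s) is the fraction of
    halting Turing machines whose output is the string s."""
    d_s = output_count / total_halting
    return -math.log2(d_s)

# Hypothetical counts: a frequently produced string vs. a rarer one.
km_common = coding_theorem_km(1024, 1 << 20)  # D(s) = 2^-10 -> K_m = 10.0 bits
km_rare = coding_theorem_km(4, 1 << 20)       # D(s) = 2^-18 -> K_m = 18.0 bits
```

Rarer outputs receive higher complexity values, which is how the method grades short strings more finely than a lossless compressor can.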


Citations
Journal Article

Correlation of automorphism group size and topological properties with program-size complexity evaluations of graphs and complex networks

TL;DR: It is shown that numerical approximations of Kolmogorov complexity (K) of graphs and networks capture some group-theoretic and topological properties of empirical networks, ranging from metabolic to social networks, and of small synthetic networks that are produced.
Posted Content

A Decomposition Method for Global Evaluation of Shannon Entropy and Local Estimations of Algorithmic Complexity

TL;DR: The Block Decomposition Method (BDM) as discussed by the authors builds on the Coding Theorem Method (CTM) to extend local estimations of algorithmic complexity to larger objects.
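The block-decomposition idea can be sketched briefly: partition the object into small blocks, look up each unique block's CTM complexity, and add log2 of the block's multiplicity. The CTM lookup table below is hypothetical (real values come from exhaustive enumeration of small Turing machines):

```python
import math
from collections import Counter

def bdm(string, ctm, block_size=2):
    """Block Decomposition Method sketch: sum the CTM values of the
    unique blocks, plus log2 of how many times each block occurs."""
    blocks = [string[i:i + block_size]
              for i in range(0, len(string) - block_size + 1, block_size)]
    counts = Counter(blocks)
    return sum(ctm[b] + math.log2(n) for b, n in counts.items())

# Hypothetical CTM values for 2-bit blocks (illustrative only).
ctm_table = {"00": 2.5, "01": 3.0, "10": 3.0, "11": 2.5}
print(bdm("01010101", ctm_table))  # one unique block "01", multiplicity 4 -> 5.0
```

Repetition is charged only logarithmically, so a highly regular string stays cheap while the CTM terms capture the local algorithmic content.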
Journal Article

Two-dimensional Kolmogorov complexity and an empirical validation of the Coding theorem method by compressibility

TL;DR: Experiments are presented, showing that the measure is stable in the face of some changes in computational formalism and that results are in agreement with the results obtained using lossless compression algorithms when both methods overlap in their range of applicability.
Posted Content

Methods of Information Theory and Algorithmic Complexity for Network Biology

TL;DR: It is shown that Shannon's information entropy, compressibility and algorithmic complexity quantify different local and global aspects of synthetic and biological data, and it is proved that the Kolmogorov complexity of a labeled graph is a good approximation of its unlabeled Kolmogorov complexity and thus a robust definition of graph complexity.
Journal Article

Structure emerges faster during cultural transmission in children than in adults.

TL;DR: It is shown that over the course of iterated reproduction of two-dimensional random dot patterns transmission accuracy increased to a similar extent in 5- to 8-year-old children and adults whereas algorithmic complexity decreased faster in children, indicating that children require more structure to render complex inputs learnable.
References

An Introduction to Kolmogorov Complexity and Its Applications

TL;DR: The book presents a thorough treatment of the central ideas and their applications of Kolmogorov complexity with a wide range of illustrative applications, and will be ideal for advanced undergraduate students, graduate students, and researchers in computer science, mathematics, cognitive sciences, philosophy, artificial intelligence, statistics, and physics.
Journal Article

Three approaches to the quantitative definition of information

TL;DR: In this article, three approaches to the quantitative definition of information are presented: combinatorial, probabilistic, and algorithmic, the last of which underlies the definition of the complexity of an object as the length of its shortest description.
Journal Article

Clustering by compression

TL;DR: Evidence of successful application in areas as diverse as genomics, virology, languages, literature, music, handwritten digits, astronomy, and combinations of objects from completely different domains, using statistical, dictionary, and block sorting compressors is reported.
Book

Complexity, Entropy and the Physics of Information

TL;DR: In this article, the authors discuss the connections between quantum and classical physics, information and its transfer, computation, and their significance for the formulation of physical theories, but also consider the origins and evolution of the information-processing entities, their complexity, and the manner in which they analyze their perceptions to form models of the Universe.
Posted Content

Clustering by compression

TL;DR: The normalized compression distance (NCD) as discussed by the authors is a parameter-free similarity metric that approximates the universal normalized information distance (NID) by replacing Kolmogorov complexity with a real-world compressor.