Journal ArticleDOI

On the Complexity of Finite Sequences

01 Jan 1976-IEEE Transactions on Information Theory (IEEE)-Vol. 22, Iss: 1, pp 75-81
TL;DR: A new approach to evaluating the complexity ("randomness") of finite sequences is presented; the proposed measure is related to the number of steps in a self-delimiting production process by which a given sequence is presumed to be generated.
Abstract: A new approach to the problem of evaluating the complexity ("randomness") of finite sequences is presented. The proposed complexity measure is related to the number of steps in a self-delimiting production process by which a given sequence is presumed to be generated. It is further related to the number of distinct substrings and the rate of their occurrence along the sequence. The derived properties of the proposed measure are discussed and motivated in conjunction with other well-established complexity criteria.
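
To make the production-process idea concrete: the sequence is scanned left to right and split into its exhaustive history, where each component is the shortest phrase that has not yet appeared as a substring of the preceding text, and the complexity is the number of components. The sketch below is one common way to implement this count; the function name and loop structure are illustrative assumptions, not code from the paper.

    def lz76_complexity(s: str) -> int:
        """Number of components in the exhaustive (Lempel-Ziv 1976)
        parsing of s: each component is the shortest phrase not yet
        seen as a substring of the preceding text."""
        n, phrases, i = len(s), 0, 0
        while i < n:
            k = 1
            # extend while s[i:i+k] still occurs in the text that
            # precedes its last character (overlaps allowed)
            while i + k <= n and s[i:i + k] in s[:i + k - 1]:
                k += 1
            phrases += 1          # s[i:i+k] is a new component
            i += k
        return phrases

    # parses as 0 . 001 . 10 . 100 . 1000 . 101 -> 6 components
    assert lz76_complexity("0001101001000101") == 6

For a typical ("random") binary string of length n the count grows like n / log2(n), which is the usual normalization when the measure is applied to data.
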
Citations
Book
01 Jan 1996
TL;DR: A valuable reference for the novice as well as for the expert who needs a wider scope of coverage within the area of cryptography, this book provides easy and rapid access to information and includes more than 200 algorithms and protocols.
Abstract: From the Publisher: A valuable reference for the novice as well as for the expert who needs a wider scope of coverage within the area of cryptography, this book provides easy and rapid access to information and includes more than 200 algorithms and protocols; more than 200 tables and figures; more than 1,000 numbered definitions, facts, examples, notes, and remarks; and over 1,250 significant references, including brief comments on each paper.

13,597 citations


Cites methods from "On the Complexity of Finite Sequences"

  • ...Another complexity measure, the Ziv-Lempel complexity measure, was proposed by Ziv and Lempel [1273]....


Journal ArticleDOI
TL;DR: The compression ratio achieved by the proposed universal code uniformly approaches the lower bounds on the compression ratios attainable by block-to-variable codes and variable-to-block codes designed to match a completely specified source.
Abstract: A universal algorithm for sequential data compression is presented. Its performance is investigated with respect to a nonprobabilistic model of constrained sources. The compression ratio achieved by the proposed universal code uniformly approaches the lower bounds on the compression ratios attainable by block-to-variable codes and variable-to-block codes designed to match a completely specified source.
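
As a rough illustration of coding sequentially against the already-processed past, the sketch below parses greedily into (offset, length, next-symbol) triples: the longest match into the prefix plus one fresh symbol. It is a simplified reading with an unbounded window and unencoded triples, an assumption for illustration rather than the paper's exact scheme.

    def lz77_parse(s: str):
        """Greedy parsing into (offset, length, next-symbol) triples.
        A match may run past the current position, so long repeated
        runs compress to a single overlapping copy."""
        i, out = 0, []
        while i < len(s):
            best_len, best_off = 0, 0
            for j in range(i):                    # candidate match starts
                k = 0
                while i + k < len(s) - 1 and s[j + k] == s[i + k]:
                    k += 1
                if k > best_len:
                    best_len, best_off = k, i - j
            out.append((best_off, best_len, s[i + best_len]))
            i += best_len + 1
        return out

    def lz77_decode(triples):
        s = ""
        for off, length, nxt in triples:
            for _ in range(length):
                s += s[len(s) - off]              # copy, possibly overlapping
            s += nxt
        return s

    text = "aacaacabcabaaac"
    assert lz77_decode(lz77_parse(text)) == text
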

5,844 citations

Journal ArticleDOI
TL;DR: The proposed concept of compressibility is shown to play a role analogous to that of entropy in classical information theory where one deals with probabilistic ensembles of sequences rather than with individual sequences.
Abstract: Compressibility of individual sequences by the class of generalized finite-state information-lossless encoders is investigated. These encoders can operate in a variable-rate mode as well as a fixed-rate one, and they allow for any finite-state scheme of variable-length-to-variable-length coding. For every individual infinite sequence x a quantity ρ(x) is defined, called the compressibility of x, which is shown to be the asymptotically attainable lower bound on the compression ratio that can be achieved for x by any finite-state encoder. This is demonstrated by means of a constructive coding theorem and its converse that, apart from their asymptotic significance, also provide useful performance criteria for finite and practical data-compression tasks. The proposed concept of compressibility is also shown to play a role analogous to that of entropy in classical information theory where one deals with probabilistic ensembles of sequences rather than with individual sequences. While the definition of ρ(x) allows a different machine for each different sequence to be compressed, the constructive coding theorem leads to a universal algorithm that is asymptotically optimal for all sequences.
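
The universal algorithm mentioned at the end is usually rendered as incremental parsing: every new phrase extends a previously parsed phrase by one symbol, so each phrase can be coded as a dictionary index plus that symbol. The following is a textbook-style sketch under that reading, not the paper's own construction.

    def lz78_parse(s: str):
        """Incremental parsing into (index, symbol) pairs, where index 0
        is the empty phrase. The dictionary is prefix-closed, so a
        trailing repeated phrase can be flushed through its prefix."""
        dictionary = {"": 0}
        out, phrase = [], ""
        for ch in s:
            if phrase + ch in dictionary:
                phrase += ch                      # keep extending a known phrase
            else:
                out.append((dictionary[phrase], ch))
                dictionary[phrase + ch] = len(dictionary)
                phrase = ""
        if phrase:                                # leftover phrase already known
            out.append((dictionary[phrase[:-1]], phrase[-1]))
        return out

    # "ababab" parses as a . b . ab . ab
    assert lz78_parse("ababab") == [(0, "a"), (0, "b"), (1, "b"), (1, "b")]

With c phrases over an alphabet of size A, the pair stream costs roughly c(log2 c + log2 A) bits, which is how the phrase count connects to the compressibility ρ(x) above.
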

3,753 citations

Book
01 Jan 2019
TL;DR: The book presents a thorough treatment of the central ideas of Kolmogorov complexity, with a wide range of illustrative applications, and will be ideal for advanced undergraduate students, graduate students, and researchers in computer science, mathematics, cognitive sciences, philosophy, artificial intelligence, statistics, and physics.
Abstract: The book is outstanding and admirable in many respects. ... is necessary reading for all kinds of readers from undergraduate students to top authorities in the field. Journal of Symbolic Logic Written by two experts in the field, this is the only comprehensive and unified treatment of the central ideas of Kolmogorov complexity and their applications. The book presents a thorough treatment of the subject with a wide range of illustrative applications. Such applications include the randomness of finite objects or infinite sequences, Martin-Löf tests for randomness, information theory, computational learning theory, the complexity of algorithms, and the thermodynamics of computing. It will be ideal for advanced undergraduate students, graduate students, and researchers in computer science, mathematics, cognitive sciences, philosophy, artificial intelligence, statistics, and physics. The book is self-contained in that it contains the basic requirements from mathematics and computer science. Also included are numerous problem sets, comments, source references, and hints to solutions of problems. New topics in this edition include Omega numbers, Kolmogorov-Loveland randomness, universal learning, communication complexity, Kolmogorov's random graphs, time-limited universal distribution, Shannon information, and others.

3,361 citations

Journal ArticleDOI
01 Feb 2007-Brain
TL;DR: The authors critically discuss the literature on seizure prediction, address some of the problems and pitfalls involved in the design and testing of seizure-prediction algorithms, point towards possible future developments, and propose methodological guidelines for future studies on seizure prediction.
Abstract: The sudden and apparently unpredictable nature of seizures is one of the most disabling aspects of the disease epilepsy. A method capable of predicting the occurrence of seizures from the electroencephalogram (EEG) of epilepsy patients would open new therapeutic possibilities. Since the 1970s investigations on the predictability of seizures have advanced from preliminary descriptions of seizure precursors to controlled studies applying prediction algorithms to continuous multi-day EEG recordings. While most of the studies published in the 1990s and around the turn of the millennium yielded rather promising results, more recent evaluations could not reproduce these optimistic findings, thus raising a debate about the validity and reliability of previous investigations. In this review, we will critically discuss the literature on seizure prediction and address some of the problems and pitfalls involved in the designing and testing of seizure-prediction algorithms. We will give an account of the current state of this research field, point towards possible future developments and propose methodological guidelines for future studies on seizure prediction.

1,018 citations


Cites background from "On the Complexity of Finite Sequences"

  • ...This size is defined as the number of different words in a Lempel–Ziv parsing (Lempel and Ziv, 1976) of the symbol sequence....

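A minimal sketch of that method as it is commonly applied to EEG: binarize the signal, count the words of the Lempel-Ziv (1976) parsing, and normalize so values are comparable across window lengths. The median threshold and the n / log2(n) normalization are conventions assumed from the applied literature, not prescriptions of the 1976 paper.

    import math

    def lz_complexity_normalized(signal):
        """Median-binarize a real-valued signal, count the words of the
        Lempel-Ziv (1976) parsing, and normalize by n / log2(n), the
        expected growth rate for a random binary string."""
        med = sorted(signal)[len(signal) // 2]
        s = "".join("1" if x > med else "0" for x in signal)
        words, i, n = 0, 0, len(s)
        while i < n:
            k = 1
            while i + k <= n and s[i:i + k] in s[:i + k - 1]:
                k += 1
            words += 1
            i += k
        return words * math.log2(n) / n

Values near 1 indicate a random-looking window; markedly lower values indicate more regular, repetitive dynamics.
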

References
Journal ArticleDOI
TL;DR: In this article, three approaches to the quantitative definition of information are presented: the combinatorial, the probabilistic, and the algorithmic approach.
Abstract: (1968). Three approaches to the quantitative definition of information. International Journal of Computer Mathematics: Vol. 2, No. 1-4, pp. 157-168.

2,661 citations

Journal ArticleDOI
TL;DR: It is shown that the random elements as defined by Kolmogorov possess all conceivable statistical properties of randomness and can equivalently be considered as the elements which withstand a certain universal stochasticity test.
Abstract: Kolmogorov has defined the conditional complexity of an object y when the object x is already given to us as the minimal length of a binary program which by means of x computes y on a certain asymptotically optimal machine. On the basis of this definition he has proposed to consider those elements of a given large finite population to be random whose complexity is maximal. Almost all elements of the population have a complexity which is close to the maximal value. In this paper it is shown that the random elements as defined by Kolmogorov possess all conceivable statistical properties of randomness. They can equivalently be considered as the elements which withstand a certain universal stochasticity test. The definition is extended to infinite binary sequences and it is shown that the nonrandom sequences form a maximal constructive null set. Finally, the Kollektivs introduced by von Mises obtain a definition which seems to satisfy all intuitive requirements.
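
In symbols (modern notation, assumed here rather than quoted from the paper), the conditional complexity in the first sentence and the maximal-complexity criterion for randomness read:

    % conditional complexity of y given x, for a fixed optimal machine U
    K(y \mid x) = \min \{\, \ell(p) : U(p, x) = y \,\}
    % y in a finite population M counts as random when, for a small constant c,
    K(y \mid M) \ge \log_2 |M| - c
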

1,228 citations

Journal ArticleDOI
TL;DR: A variant of the Kolmogorov concept of complexity is introduced which yields a common theory of finite and infinite random sequences, and several concepts of effective tests are defined and proved to be equivalent.

269 citations

Journal ArticleDOI
TL;DR: An attempt is made to apply information-theoretic computational complexity to meta-mathematics by measuring the difficulty of proving a given set of theorems in terms of the number of bits of axioms that must be assumed and the size of the proofs needed to deduce the theorems from the axioms.
Abstract: An attempt is made to apply information-theoretic computational complexity to meta-mathematics. The paper studies the number of bits of instructions that must be given to a computer for it to perform finite and infinite tasks, and also the time it takes the computer to perform these tasks. This is applied to measuring the difficulty of proving a given set of theorems, in terms of the number of bits of axioms that are assumed, and the size of the proofs needed to deduce the theorems from the axioms.

208 citations