Journal ArticleDOI

Compression of individual sequences via variable-rate coding

TLDR
The proposed concept of compressibility is shown to play a role analogous to that of entropy in classical information theory where one deals with probabilistic ensembles of sequences rather than with individual sequences.
Abstract
Compressibility of individual sequences by the class of generalized finite-state information-lossless encoders is investigated. These encoders can operate in a variable-rate mode as well as a fixed-rate one, and they allow for any finite-state scheme of variable-length-to-variable-length coding. For every individual infinite sequence x, a quantity ρ(x) is defined, called the compressibility of x, which is shown to be the asymptotically attainable lower bound on the compression ratio that can be achieved for x by any finite-state encoder. This is demonstrated by means of a constructive coding theorem and its converse that, apart from their asymptotic significance, also provide useful performance criteria for finite and practical data-compression tasks. The proposed concept of compressibility is also shown to play a role analogous to that of entropy in classical information theory, where one deals with probabilistic ensembles of sequences rather than with individual sequences. While the definition of ρ(x) allows a different machine for each different sequence to be compressed, the constructive coding theorem leads to a universal algorithm that is asymptotically optimal for all sequences.
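The universal algorithm referred to at the end of the abstract is built around incremental parsing of the input into distinct phrases. Below is a minimal Python sketch of such an LZ78-style parse; the function name and the (index, symbol) output format are illustrative choices, not the paper's exact construction.

```python
def lz78_parse(s):
    """Incrementally parse s into distinct phrases (LZ78-style sketch).

    Each phrase extends a previously seen phrase by one symbol and is
    emitted as a (phrase_index, new_symbol) pair; index 0 denotes the
    empty phrase.
    """
    dictionary = {"": 0}           # phrase -> index
    output = []
    phrase = ""
    for symbol in s:
        candidate = phrase + symbol
        if candidate in dictionary:
            phrase = candidate     # keep extending the current match
        else:
            output.append((dictionary[phrase], symbol))
            dictionary[candidate] = len(dictionary)
            phrase = ""
    if phrase:                     # flush a trailing, already-seen phrase
        output.append((dictionary[phrase[:-1]], phrase[-1]))
    return output

# Example: the number of phrases grows slowly for compressible inputs.
print(lz78_parse("aaabbabaabaaabab"))
```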

Citations
Journal ArticleDOI

Finite-state dimension

TL;DR: In this article, it was shown that the finite-state dimension of a sequence is precisely the infimum of all compression ratios achievable on the sequence by information-lossless finite-state compressors.
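Schematically, that characterization can be written as follows; the notation is a paraphrase chosen for illustration rather than the article's own.

```latex
% Schematic paraphrase; notation chosen for illustration.
% For an information-lossless finite-state compressor C and an infinite
% binary sequence x, let x[1..n] denote the first n bits and define the
% asymptotic compression ratio achieved by C on x:
\[
  \rho_C(x) \;=\; \limsup_{n \to \infty} \frac{\lvert C(x[1..n]) \rvert}{n}.
\]
% The cited result characterizes the finite-state dimension of x as the
% best such ratio over all information-lossless finite-state compressors:
\[
  \dim_{\mathrm{FS}}(x) \;=\; \inf_{C} \rho_C(x).
\]
```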
Patent

Adaptive data compression system with systolic string matching logic

TL;DR: An adaptive lossless data compression system with systolic string matching logic, based on an improvement of the LZ1 algorithm, performs compression and decompression at a maximum rate of one symbol per clock cycle.
Journal Article

Effective strong dimension in algorithmic information and computational complexity

TL;DR: The basic properties of effective strong dimensions are developed and a number of results relating them to fundamental aspects of randomness, Kolmogorov complexity, prediction, Boolean circuit-size complexity, polynomial-time degrees, and data compression are proved.
Journal ArticleDOI

Lossless compression of continuous-tone images

TL;DR: In this paper, the authors survey some of the recent advances in lossless compression of continuous-tone images and discuss the modeling paradigms underlying the state-of-the-art algorithms and the principles guiding their design.
Journal ArticleDOI

Estimating the Entropy of Binary Time Series: Methodology, Some Theory and a Simulation Study

TL;DR: It is proved that, like their earlier versions, the two new LZ-based estimators are universally consistent, that is, they converge to the entropy rate for every finite-valued, stationary and ergodic process.
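One simple way to turn an LZ parse into an entropy-rate estimate for a binary string is to normalize the phrase count of an LZ78-style parse. The sketch below uses that classical c(n)·log2 c(n)/n quantity; it illustrates the idea but is not necessarily either of the two estimators analyzed in this paper.

```python
import math

def lz78_phrase_count(bits):
    """Number of phrases in an LZ78-style incremental parse of a binary string."""
    dictionary = {""}
    phrase = ""
    count = 0
    for b in bits:
        phrase += b
        if phrase not in dictionary:
            dictionary.add(phrase)
            count += 1
            phrase = ""
    return count + (1 if phrase else 0)

def lz_entropy_estimate(bits):
    """Crude entropy-rate estimate in bits per symbol: c(n) * log2 c(n) / n."""
    n = len(bits)
    c = lz78_phrase_count(bits)
    return c * math.log2(c) / n if c > 1 else 0.0

# A highly regular sequence should score well below 1 bit/symbol once it is
# long enough; an i.i.d. fair-coin sequence should approach 1 bit/symbol.
print(lz_entropy_estimate("01" * 5000))
```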
References
Book

Information Theory and Reliable Communication

TL;DR: This book covers Coding for Discrete Sources, Techniques for Coding and Decoding, and Source Coding with a Fidelity Criterion.
Journal ArticleDOI

A universal algorithm for sequential data compression

TL;DR: The compression ratio achieved by the proposed universal code uniformly approaches the lower bounds on the compression ratios attainable by block-to-variable codes and variable-to-block codes designed to match a completely specified source.
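That 1977 algorithm parses the input against a sliding window of recently seen text. A minimal Python sketch of the greedy longest-match parse is below; the window size and the (offset, length, next symbol) token format are simplifications chosen for illustration.

```python
def lz77_tokens(s, window=4096):
    """Greedy LZ77-style parse into (offset, length, next_symbol) tokens (sketch)."""
    tokens = []
    i = 0
    while i < len(s):
        start = max(0, i - window)
        best_off, best_len = 0, 0
        # Search the window for the longest match with the upcoming text;
        # matches may overlap the current position, as in classic LZ77.
        for j in range(start, i):
            k = 0
            while i + k < len(s) - 1 and s[j + k] == s[i + k]:
                k += 1
            if k > best_len:
                best_off, best_len = i - j, k
        tokens.append((best_off, best_len, s[i + best_len]))
        i += best_len + 1
    return tokens

print(lz77_tokens("abracadabra"))
```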
Journal ArticleDOI

On the Complexity of Finite Sequences

TL;DR: A new approach to evaluating the complexity ("randomness") of finite sequences is presented; the proposed measure is related to the number of steps in a self-delimiting production process by which a given sequence is presumed to be generated.
Journal ArticleDOI

Coding theorems for individual sequences

TL;DR: The finite-state complexity of a sequence plays a role similar to that of entropy in classical information theory (which deals with probabilistic ensembles of sequences rather than an individual sequence).
Journal ArticleDOI

On Information Lossless Automata of Finite Order

TL;DR: The application of the tests to finite deterministic automata is discussed, and a method of constructing a decoder for a given finite automaton that is information lossless of finite order is described.