Journal ArticleDOI

Compression of individual sequences via variable-rate coding

TLDR
The proposed concept of compressibility is shown to play a role analogous to that of entropy in classical information theory where one deals with probabilistic ensembles of sequences rather than with individual sequences.
Abstract
Compressibility of individual sequences by the class of generalized finite-state information-lossless encoders is investigated. These encoders can operate in a variable-rate mode as well as a fixed-rate one, and they allow for any finite-state scheme of variable-length-to-variable-length coding. For every individual infinite sequence x, a quantity ρ(x) is defined, called the compressibility of x, which is shown to be the asymptotically attainable lower bound on the compression ratio that can be achieved for x by any finite-state encoder. This is demonstrated by means of a constructive coding theorem and its converse that, apart from their asymptotic significance, also provide useful performance criteria for finite and practical data-compression tasks. The proposed concept of compressibility is also shown to play a role analogous to that of entropy in classical information theory where one deals with probabilistic ensembles of sequences rather than with individual sequences. While the definition of ρ(x) allows a different machine for each different sequence to be compressed, the constructive coding theorem leads to a universal algorithm that is asymptotically optimal for all sequences.
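The universal algorithm referred to in the abstract is widely associated with incremental, dictionary-growing parsing. The sketch below is a minimal Python illustration under that assumption; the helper names and the crude per-phrase code-length accounting are illustrative and are not taken from the paper.

```python
from math import ceil, log2


def incremental_parse(sequence):
    """Split `sequence` into phrases, each one a previously seen phrase
    extended by a single new symbol (incremental, dictionary-growing parsing)."""
    phrases = {"": 0}      # phrase -> index; index 0 is the empty phrase
    output = []            # (index of longest known prefix, new symbol) pairs
    current = ""
    for symbol in sequence:
        candidate = current + symbol
        if candidate in phrases:
            current = candidate                   # keep extending a known phrase
        else:
            output.append((phrases[current], symbol))
            phrases[candidate] = len(phrases)     # register the new phrase
            current = ""
    if current:                                   # flush a trailing, already-known phrase
        output.append((phrases[current[:-1]], current[-1]))
    return output


def estimated_compression_ratio(sequence, alphabet_size=2):
    """Crude code-length estimate: each phrase is charged log2(#phrases) bits
    for its prefix index plus log2(alphabet_size) bits for the new symbol,
    divided by the cost of sending the sequence uncoded."""
    pairs = incremental_parse(sequence)
    c = len(pairs)
    symbol_bits = max(1, ceil(log2(alphabet_size)))
    coded_bits = c * (ceil(log2(c + 1)) + symbol_bits)
    raw_bits = len(sequence) * symbol_bits
    return coded_bits / raw_bits


if __name__ == "__main__":
    print(incremental_parse("aababcbababaa"))
    x = "ab" * 5000                               # a highly regular binary-alphabet string
    print(f"estimated compression ratio: {estimated_compression_ratio(x):.3f}")
```

On a highly regular input the number of distinct phrases grows slowly, so the estimated ratio falls well below 1, which is the qualitative behavior the coding theorem formalizes for sequences of low compressibility.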


Citations
Journal ArticleDOI

The algorithmic complexity of multichannel EEGs is sensitive to changes in behavior.

TL;DR: Using a binary partition of EEG activity about the median, redundancy was shown to be insensitive to the size of the data set while being sensitive to changes in the subject's behavioral state; covariance complexity, calculated from the singular value spectrum of a multichannel signal, was also found to be sensitive to changes in behavioral state.
Proceedings ArticleDOI

Identification of data, devices, documents and individuals

TL;DR: The author is concerned with indirect identification, in which the verifier depends on information provided by the object of the identification, and perhaps on supplemental information from another source.
Proceedings ArticleDOI

On the Complexity of Traffic Traces and Implications

TL;DR: This paper introduces the notion of trace complexity, which approximates the entropy rate of a packet trace, and proposes a traffic generator model able to produce a synthetic trace that matches the complexity levels of its corresponding real-world trace.
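As a generic illustration of the compression-complexity connection mentioned here (not the paper's own trace-complexity estimator), the ratio between a trace's compressed and raw lengths can serve as a rough proxy for how close it is to incompressible:

```python
import os
import zlib


def compression_based_complexity(trace: bytes) -> float:
    """Compressed length over raw length: values near 1 suggest a trace that
    is close to incompressible, values near 0 suggest strong regularity."""
    if not trace:
        return 0.0
    return len(zlib.compress(trace, 9)) / len(trace)


if __name__ == "__main__":
    regular_trace = bytes([64, 200, 64, 200]) * 10000  # synthetic, highly periodic trace
    random_trace = os.urandom(40000)                   # stand-in for an unpredictable trace
    print(f"periodic trace: {compression_based_complexity(regular_trace):.3f}")
    print(f"random trace:   {compression_based_complexity(random_trace):.3f}")
```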
Book ChapterDOI

Exponentiation in Finite Fields: Theory and Practice

TL;DR: The main properties of a fast software exponentiation algorithm in GF(2^n) for large n ∈ ℕ are outlined.
Journal ArticleDOI

Data Compression Concepts and Algorithms and Their Applications to Bioinformatics

TL;DR: This work reviews how basic theoretical ideas from data compression, such as the notions of entropy, mutual information, and complexity, have been used to analyze biological sequences in order to discover hidden patterns, infer phylogenetic relationships between organisms, and study viral populations.
References
Book

Information Theory and Reliable Communication

TL;DR: This book includes chapters on Coding for Discrete Sources, Techniques for Coding and Decoding, and Source Coding with a Fidelity Criterion.
Journal ArticleDOI

A universal algorithm for sequential data compression

TL;DR: The compression ratio achieved by the proposed universal code uniformly approaches the lower bounds on the compression ratios attainable by block-to-variable codes and variable-to-block codes designed to match a completely specified source.
Journal ArticleDOI

On the Complexity of Finite Sequences

TL;DR: A new approach to evaluating the complexity ("randomness") of finite sequences is presented; the proposed measure is related to the number of steps in a self-delimiting production process by which a given sequence is presumed to be generated.
Journal ArticleDOI

Coding theorems for individual sequences

TL;DR: The finite-state complexity of a sequence plays a role similar to that of entropy in classical information theory (which deals with probabilistic ensembles of sequences rather than an individual sequence).
Journal ArticleDOI

On Information Lossless Automata of Finite Order

TL;DR: The application of the tests to finite deterministic automata is discussed, and a method is described for constructing a decoder for a given finite automaton that is information lossless of finite order.