Journal ArticleDOI

Compression of individual sequences via variable-rate coding

TLDR
The proposed concept of compressibility is shown to play a role analogous to that of entropy in classical information theory where one deals with probabilistic ensembles of sequences rather than with individual sequences.
Abstract
Compressibility of individual sequences by the class of generalized finite-state information-lossless encoders is investigated. These encoders can operate in a variable-rate mode as well as a fixed-rate one, and they allow for any finite-state scheme of variable-length-to-variable-length coding. For every individual infinite sequence x a quantity ρ(x) is defined, called the compressibility of x, which is shown to be the asymptotically attainable lower bound on the compression ratio that can be achieved for x by any finite-state encoder. This is demonstrated by means of a constructive coding theorem and its converse that, apart from their asymptotic significance, also provide useful performance criteria for finite and practical data-compression tasks. The proposed concept of compressibility is also shown to play a role analogous to that of entropy in classical information theory, where one deals with probabilistic ensembles of sequences rather than with individual sequences. While the definition of ρ(x) allows a different machine for each different sequence to be compressed, the constructive coding theorem leads to a universal algorithm that is asymptotically optimal for all sequences.
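The universal algorithm arising from the constructive coding theorem is the incremental (LZ78) parsing rule: each new phrase is the shortest prefix of the remaining input not seen as a phrase before. A minimal sketch (the function name and the example string are illustrative, not from the paper):

```python
def lz78_phrases(s: str) -> list[str]:
    """Parse s by LZ78 incremental parsing: each phrase is the shortest
    prefix of the remainder that has not appeared as a phrase before."""
    seen: set[str] = set()
    phrases: list[str] = []
    i = 0
    while i < len(s):
        j = i + 1
        # extend the candidate until it is a new phrase (or the input ends)
        while j <= len(s) and s[i:j] in seen:
            j += 1
        phrase = s[i:j]
        seen.add(phrase)
        phrases.append(phrase)
        i = j
    return phrases

# The phrase count c(x) drives the bound in the coding theorem:
# roughly c(x) log c(x) bits suffice to encode x.
print(lz78_phrases("aababcbabcbc"))  # ['a', 'ab', 'abc', 'b', 'abcb', 'c']
```

Sequences that a finite-state encoder can compress well produce few, long phrases, so the per-symbol cost c(x) log c(x) / n approaches the compressibility ρ(x).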


Citations
Proceedings ArticleDOI

Reducing communication overhead in distributed learning by an order of magnitude (almost)

TL;DR: This work explores the effects of applying quantization and encoding to the parameters of distributed models, and shows that, for a neural network, this can be done without slowing down convergence or hurting the generalization of the model.
Proceedings ArticleDOI

Prediction with a short memory

TL;DR: In this paper, the authors consider the problem of predicting the next observation given a sequence of past observations, and study the extent to which accurate prediction requires complex algorithms that explicitly leverage long-range dependencies.
Journal ArticleDOI

Fixed-length entropy coding for robust video compression

TL;DR: This paper proposes the use of fixed-length entropy codes (FLC) as an alternative to VLC in video compression applications and shows that, over noisy channels, the FLC-based codec outperforms VLC-based codecs that use synchronization words.
Journal ArticleDOI

Lempel–Ziv Factorization Powered by Space Efficient Suffix Trees

TL;DR: It is shown that the Lempel–Ziv-77 factorization of a text of length n on an integer alphabet of size σ can be computed in O(n) time with (1+ε)n lg n + O(n) bits (for a constant ε > 0) of working space (including the space for the output, but not the text).
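The object being computed here, the LZ77 factorization, greedily splits the text into factors, each either a fresh character or the longest prefix of the remaining suffix that starts earlier in the text. A naive O(n²) sketch for reference (the efficient suffix-tree approach of the paper is not reproduced here):

```python
def lz77_factorize(s: str):
    """Naive LZ77 factorization: at position i, emit the longest match
    starting before i as (source_position, length), or a literal character
    if no earlier match exists. Matches may overlap the current factor."""
    factors = []
    i = 0
    while i < len(s):
        best_len, best_pos = 0, -1
        for j in range(i):  # try every earlier start position
            l = 0
            while i + l < len(s) and s[j + l] == s[i + l]:
                l += 1
            if l > best_len:
                best_len, best_pos = l, j
        if best_len == 0:
            factors.append(s[i])                 # literal character
            i += 1
        else:
            factors.append((best_pos, best_len))  # (source position, length)
            i += best_len
    return factors

print(lz77_factorize("abababa"))  # ['a', 'b', (0, 5)]
```

Note the self-referential factor (0, 5): it copies past the point where it starts, which is legal in LZ77 and is what makes runs like "aaaa" compress to a literal plus one factor.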
Journal ArticleDOI

Noise prediction for channels with side information at the transmitter

TL;DR: This work derives a simple formula for the capacity of a discrete modulo-additive noise channel Y = X + Z_S when the state process has memory, where the encoder causally observes the random state S that governs the distribution of the noise Z_S.
References
Book

Information Theory and Reliable Communication

TL;DR: This chapter discusses Coding for Discrete Sources, Techniques for Coding and Decoding, and Source Coding with a Fidelity Criterion.
Journal ArticleDOI

A universal algorithm for sequential data compression

TL;DR: The compression ratio achieved by the proposed universal code uniformly approaches the lower bounds on the compression ratios attainable by block-to-variable codes and variable-to-block codes designed to match a completely specified source.
Journal ArticleDOI

On the Complexity of Finite Sequences

TL;DR: A new approach to the problem of evaluating the complexity ("randomness") of finite sequences is presented, related to the number of steps in a self-delimiting production process by which a given sequence is presumed to be generated.
Journal ArticleDOI

Coding theorems for individual sequences

TL;DR: The finite-state complexity of a sequence plays a role similar to that of entropy in classical information theory (which deals with probabilistic ensembles of sequences rather than an individual sequence).
Journal ArticleDOI

On Information Lossless Automata of Finite Order

TL;DR: The application of the tests to finite deterministic automata is discussed, and a method of constructing a decoder for a given finite automaton that is information lossless of finite order is described.