Journal ArticleDOI
Compression of individual sequences via variable-rate coding
Jacob Ziv, A. Lempel
TL;DR
The proposed concept of compressibility is shown to play a role analogous to that of entropy in classical information theory, where one deals with probabilistic ensembles of sequences rather than with individual sequences.
Abstract
Compressibility of individual sequences by the class of generalized finite-state information-lossless encoders is investigated. These encoders can operate in a variable-rate mode as well as a fixed-rate one, and they allow for any finite-state scheme of variable-length-to-variable-length coding. For every individual infinite sequence x, a quantity \rho(x), called the compressibility of x, is defined and shown to be the asymptotically attainable lower bound on the compression ratio achievable for x by any finite-state encoder. This is demonstrated by means of a constructive coding theorem and its converse which, apart from their asymptotic significance, also provide useful performance criteria for finite and practical data-compression tasks. The proposed concept of compressibility is also shown to play a role analogous to that of entropy in classical information theory, where one deals with probabilistic ensembles of sequences rather than with individual sequences. While the definition of \rho(x) allows a different machine for each sequence to be compressed, the constructive coding theorem leads to a universal algorithm that is asymptotically optimal for all sequences.
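The universal algorithm referred to in the abstract is the incremental (LZ78) parsing scheme, which splits the input into distinct phrases, each equal to a previously seen phrase extended by one new symbol. A minimal sketch in Python (function name and output format are illustrative, not the paper's notation):

```python
def lz78_parse(s):
    """Incremental (LZ78) parsing: split s into phrases, each a
    previously seen phrase extended by one symbol, and emit
    (prefix_index, new_symbol) pairs."""
    dictionary = {"": 0}  # phrase -> index (0 is the empty phrase)
    phrases = []
    current = ""
    for ch in s:
        candidate = current + ch
        if candidate in dictionary:
            current = candidate          # keep extending the match
        else:
            phrases.append((dictionary[current], ch))
            dictionary[candidate] = len(dictionary)
            current = ""
    if current:
        # tail already seen: encode as (index of its prefix, last symbol)
        phrases.append((dictionary[current[:-1]], current[-1]))
    return phrases

# "abab" parses into the phrases a | b | ab
print(lz78_parse("abab"))  # → [(0, 'a'), (0, 'b'), (1, 'b')]
```

Because the number of distinct phrases grows sublinearly for compressible sequences, coding each pair in about log(number of phrases) bits yields the asymptotic optimality the abstract describes.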
Citations
Posted Content
Massively-Parallel Lossless Data Decompression
TL;DR: In this article, the authors propose two new techniques to increase the degree of parallelism when decompressing DEFLATE streams (LZ77 compression plus Huffman encoding), achieving a 2X speedup in a head-to-head comparison with several multi-core CPU-based libraries.
Journal ArticleDOI
Natural Language Statistical Features of LSTM-Generated Texts
TL;DR: The authors compared the statistical structure of LSTM-generated language to that of written natural language, and to that of text produced by Markov models of various orders, by assessing word-frequency statistics, long-range correlations, and entropy measures.
Journal ArticleDOI
Information and dynamical systems: a concrete measurement on sporadic dynamics
Fiorella Argenti, Vieri Benci, Paola Cerrai, Alessandro Cordelli, Stefano Galatolo, Giulia Menconi, et al.
TL;DR: A new compression algorithm called CASToRe is introduced, which allows a direct estimation of the information content of the orbits of dynamical systems in the zero-entropy case.
Proceedings ArticleDOI
Malware Classification and Class Imbalance via Stochastic Hashed LZJD
Edward Raff, Charles Nicholas, et al.
TL;DR: This work develops the new SHWeL feature vector representation by extending the recently proposed Lempel-Ziv Jaccard Distance, providing significantly improved accuracy while reducing algorithmic complexity to O(N).
Journal ArticleDOI
Lossless Image Compression by Block Matching
TL;DR: This work generalizes 'LZ1'-type methods (which identify matches in previously seen text) to lossless image compression, examines complexity issues, and finishes by considering practical 2-D implementations for bi-level images.
References
Book
Information Theory and Reliable Communication
TL;DR: This chapter discusses Coding for Discrete Sources, Techniques for Coding and Decoding, and Source Coding with a Fidelity Criterion.
Journal ArticleDOI
A universal algorithm for sequential data compression
Jacob Ziv, A. Lempel
TL;DR: The compression ratio achieved by the proposed universal code uniformly approaches the lower bounds on the compression ratios attainable by block-to-variable codes and variable-to-block codes designed to match a completely specified source.
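The 1977 algorithm cited here parses greedily against a sliding window of recently seen text, emitting (offset, length, next-symbol) triples. A minimal sketch under assumed simplifications (tiny window, greedy longest match, no bit-level coding; the function name and window size are illustrative):

```python
def lz77_parse(s, window=16):
    """Greedy LZ77-style parsing: at each position, find the longest
    match starting inside the sliding window and emit an
    (offset, length, next_symbol) triple."""
    i, out = 0, []
    while i < len(s):
        best_len, best_off = 0, 0
        for j in range(max(0, i - window), i):
            l = 0
            # matches may extend into the lookahead (overlap allowed,
            # as when a short pattern repeats)
            while i + l < len(s) and s[j + l] == s[i + l]:
                l += 1
            if l > best_len:
                best_len, best_off = l, i - j
        nxt = s[i + best_len] if i + best_len < len(s) else ""
        out.append((best_off, best_len, nxt))
        i += best_len + 1
    return out

# "aaaa": literal 'a', then an overlapping copy of length 3
print(lz77_parse("aaaa"))  # → [(0, 0, 'a'), (1, 3, '')]
```

The decoder reverses each triple by copying `length` symbols from `offset` positions back, then appending `next_symbol`, so no dictionary needs to be transmitted.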
Journal ArticleDOI
On the Complexity of Finite Sequences
A. Lempel, Jacob Ziv
TL;DR: A new approach to the problem of evaluating the complexity ("randomness") of finite sequences is presented; the proposed measure is related to the number of steps in a self-delimiting production process by which a given sequence is presumed to be generated.
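The production-process complexity can be sketched as a left-to-right parsing in which each new phrase is extended for as long as it still occurs as a substring of the text parsed so far; the resulting phrase count is the complexity measure. A hedged Python sketch (a common simplified variant of the 1976 definition, in which the final phrase is allowed to be non-novel):

```python
def lz_complexity(s):
    """Count the phrases in a left-to-right exhaustive parsing of s:
    each phrase is the shortest extension that has not yet appeared
    as a substring of everything parsed before it."""
    n = len(s)
    i, c = 0, 0
    while i < n:
        l = 1
        # grow the phrase while s[i:i+l] already occurs in the prefix
        # s[:i+l-1] (i.e., it is reproducible from earlier text)
        while i + l <= n and s[i:i + l] in s[:i + l - 1]:
            l += 1
        c += 1       # one new phrase completed
        i += l
    return c

# Example from Lempel and Ziv (1976): 0|001|10|100|1000|101
print(lz_complexity("0001101001000101"))  # → 6
```

A purely random binary string of length n has complexity near n/log2(n), while highly structured strings yield far fewer phrases, which is what makes the count usable as a randomness measure.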
Journal ArticleDOI
Coding theorems for individual sequences
TL;DR: The finite-state complexity of a sequence plays a role similar to that of entropy in classical information theory (which deals with probabilistic ensembles of sequences rather than an individual sequence).
Journal ArticleDOI
On Information Lossless Automata of Finite Order
TL;DR: The application of the tests to finite deterministic automata is discussed, and a method of constructing a decoder for a given finite automaton that is information lossless of finite order is described.