
Showing papers on "Entropy encoding" published in 1979


15 Mar 1979
TL;DR: Algorithms are designed for coding discrete memoryless sources that have a known symbol probability ordering but unknown probability values; applied to real data with stationary characteristics over the measurement span, they exhibit performance only slightly above the measured entropy values.
Abstract: Some practical adaptive techniques for the efficient noiseless coding of a broad class of data sources are developed and analyzed. Algorithms are designed for coding discrete memoryless sources which have a known symbol probability ordering but unknown probability values. These algorithms are broadly applicable to practical problems because most real data sources can be transformed into this form by simple preprocessing. When applied to real data with stationary characteristics over the measurement span, the algorithms have exhibited performance only slightly above the measured entropy values; when the data characteristics change over the measurement span, performance considerably below the measured average data entropy may be observed.
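A minimal sketch, in Python, of the setting the abstract describes. The paper's actual code options and header format are not given in the abstract, so the Golomb-Rice-style code and the 2-bit block header below are assumptions for illustration: symbols arrive already ranked by probability (rank 0 = most probable), and the coder adapts to the unknown probability values by picking, per block, whichever fixed-parameter code yields the fewest bits.

def rice_encode(rank: int, k: int) -> str:
    """Unary quotient + k-bit binary remainder for a non-negative symbol rank."""
    q, r = rank >> k, rank & ((1 << k) - 1)
    return "1" * q + "0" + (format(r, f"0{k}b") if k else "")

def encode_block(ranks: list, k_options=(0, 1, 2, 3)) -> str:
    """Adaptive step: choose the parameter k that minimizes the coded block length."""
    best = min(k_options, key=lambda k: sum(len(rice_encode(r, k)) for r in ranks))
    header = format(best, "02b")  # assumed 2-bit header telling the decoder which k was used
    return header + "".join(rice_encode(r, best) for r in ranks)

# Low ranks dominate for a well-ordered source, so a small k wins.
print(encode_block([0, 1, 0, 2, 0, 0, 5, 1]))

Because the selection is made per block, a shift in source statistics simply changes which code wins the next block, which is how a scheme of this kind can beat the block-averaged entropy on nonstationary data.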

410 citations


Journal ArticleDOI
TL;DR: In a sample of 220 Frank-lead ECGs, the removal of signal redundancy by second-order prediction or interpolation, with subsequent entropy encoding of the respective residual errors, was investigated; interpolation provided a 6 dB smaller residual error variance than prediction.
Abstract: Compression of digital electrocardiogram (ECG) signals is desirable for two reasons: economic use of storage space for databases and reduction of the data transmission rate for compatibility with telephone lines. In a sample of 220 Frank-lead ECGs, the removal of signal redundancy by second-order prediction or interpolation with subsequent entropy encoding of the respective residual errors was investigated. At the sampling rate of 200 Hz, interpolation provided a 6 dB smaller residual error variance than prediction. A near-optimal value for the interpolation coefficients is 0.5, permitting simple implementation of the algorithm and requiring a word length for arithmetic processing of only 2 bits in excess of the signal precision. For linear prediction, the effects of occasional transmission errors decay exponentially, whereas for interpolation they do not, necessitating error control in certain applications. Encoding of the interpolation errors by a Huffman code truncated to ±5 quantization levels of 30 μV required an average word length of 2.21 bits/sample (upper 96th percentile 3 bits/sample), resulting in data transmission rates of 1327 bits/s (1800 bits/s) for three simultaneous leads sampled at the rate of 200 Hz. Thus, compared with the original signal of 8-bit samples at 500 Hz, the average compression is 9:1. Encoding of the prediction errors required an average word length of 2.67 bits/sample with a 96th percentile of 5.5 bits/sample, making this method less suitable for synchronous transmission.
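A minimal Python sketch of the two residual formers being compared, assuming the conventional definitions; the paper's boundary handling, quantizer, and Huffman table are not reproduced. With coefficients of 0.5, interpolation estimates each sample as the average of its two neighbours, while second-order prediction extrapolates from the two preceding samples.

import numpy as np

def interpolation_residual(x: np.ndarray) -> np.ndarray:
    """e[n] = x[n] - 0.5*(x[n-1] + x[n+1]); interior samples only."""
    e = np.zeros(len(x))
    e[1:-1] = x[1:-1] - 0.5 * (x[:-2] + x[2:])
    return e

def prediction_residual(x: np.ndarray) -> np.ndarray:
    """Second-order prediction xhat[n] = 2*x[n-1] - x[n-2]."""
    e = np.zeros(len(x))
    e[2:] = x[2:] - 2.0 * x[1:-1] + x[:-2]
    return e

# A Huffman code truncated to +/-5 levels would code residuals in
# {-5q, ..., +5q} (q = 30 uV) directly and escape-code anything larger.
x = np.cumsum(np.random.randn(500))  # stand-in for one sampled ECG lead
print(interpolation_residual(x).var(), prediction_residual(x).var())

The asymmetry the abstract notes follows from these formulas: a prediction decoder reuses only past reconstructed samples, so a channel error fades, whereas an interpolation decoder propagates an error into both neighbours' estimates.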

106 citations


Proceedings ArticleDOI
28 Dec 1979
TL;DR: The reconstruction from transform coding is compared with the original, the spatial error signal is quantized and encoded, and the results are compared with conventional DPCM and cosine transform encoding.
Abstract: Cosine transform coding captures the major features of an image at bit rates as low as 0.5 bits per pixel (BPP). However, because the coding is done in transform space, spatial edge information is lost and the images appear soft even at 3 BPP. Spatial techniques such as DPCM with entropy encoding preserve edges but fail ungracefully at about 2 BPP. In this paper we combine the two. The reconstruction from transform coding is compared with the original, and the spatial error signal is quantized and encoded. The results are compared with conventional DPCM and cosine transform encoding.
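A minimal Python sketch of the two-stage idea, with assumptions standing in for details the abstract does not give: a crude low-frequency DCT mask models the transform coder, and uniform quantization models the second-stage quantizer (the entropy-coding step is elided).

import numpy as np
from scipy.fft import dctn, idctn

def dct_reconstruct(img: np.ndarray, keep: int) -> np.ndarray:
    """Stage 1: transform coding, modelled by keeping only the
    top-left keep x keep block of 2-D DCT coefficients."""
    c = dctn(img, norm="ortho")
    mask = np.zeros_like(c)
    mask[:keep, :keep] = 1.0
    return idctn(c * mask, norm="ortho")

def hybrid_code(img: np.ndarray, keep: int, step: float) -> np.ndarray:
    recon = dct_reconstruct(img, keep)       # soft transform-coded image
    error = img - recon                      # spatial error: edges live here
    q_error = np.round(error / step) * step  # stage 2: quantize (then entropy-code)
    return recon + q_error                   # decoder's final reconstruction

img = np.random.rand(64, 64)                 # stand-in for a test image
out = hybrid_code(img, keep=16, step=0.05)
print(np.abs(img - out).max())               # bounded by step / 2

The design point is that the second stage spends its bits exactly where transform coding fails, on sharp edges, so the combination degrades more gracefully than either method alone.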

3 citations


Journal ArticleDOI
TL;DR: The problem of minimal-expense coding (in particular for Rényi entropies, including the Shannon entropy) is discussed, and a new lower bound for the average cost of encoding the messages to be transmitted is obtained.

Abstract: The problem of minimal-expense coding (in particular for Rényi entropies, including the Shannon entropy) is discussed, and a new lower bound for the average cost of encoding the messages to be transmitted is obtained.
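The abstract does not state the new bound itself; for context, the classical lower bound of this type is Campbell's (1965), which ties an exponential codeword-length cost to the Rényi entropy. For a D-ary prefix code with lengths \ell_i on a source P = (p_1, \dots, p_n) and cost parameter t > 0,
\[
  L(t) \;=\; \frac{1}{t}\,\log_D \sum_i p_i\, D^{t\,\ell_i}
  \;\ge\; H_\alpha(P)
  \;=\; \frac{1}{1-\alpha}\,\log_D \sum_i p_i^{\alpha},
  \qquad \alpha = \frac{1}{1+t},
\]
and as t \to 0 the left side tends to the ordinary average length \sum_i p_i \ell_i while H_\alpha(P) tends to the Shannon entropy, recovering the usual noiseless-coding bound as a special case.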

1 citation