
Showing papers on "Entropy encoding published in 1974"


Journal ArticleDOI
TL;DR: Tree coding and tree search strategies are investigated for the discrete-time memoryless Gaussian source encoded for a signal-power-to-mean-squared-error ratio of about 30 dB and a theoretical lower bound on average search effort is derived.
Abstract: Tree codes are known to be capable of performing arbitrarily close to the rate-distortion function for any memoryless source and single-letter fidelity criterion. Tree coding and tree search strategies are investigated for the discrete-time memoryless Gaussian source encoded for a signal-power-to-mean-squared-error ratio of about 30 dB (about 5 binary digits per source output). Also, a theoretical lower bound on average search effort is derived. Two code search strategies (the Viterbi algorithm and the stack algorithm) were simulated in assembly language on a large digital computer. After suitable modifications, both strategies yielded encoding with a signal-to-distortion ratio about 1 dB below the limit set by the rate-distortion function. Although this performance is better than that of any previously known instrumentable scheme, it unfortunately requires search computation of the order of 10^5 machine cycles per source output encoded.
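
As a point of reference for the figures quoted above, here is a minimal sketch (Python, not from the paper) of the distortion-rate limit for a memoryless Gaussian source under squared error, D(R) = σ²·2^(−2R), which puts the quoted "about 30 dB at about 5 binary digits per source output" in context:

```python
import math

def gaussian_snr_limit_db(rate_bits: float) -> float:
    """Signal-to-distortion limit for a memoryless Gaussian source.

    The rate-distortion function gives D(R) = sigma^2 * 2**(-2*R) under
    squared error, so the best attainable SNR in dB is
    10*log10(sigma^2 / D) = 20 * R * log10(2), roughly 6.02 dB per bit.
    """
    return 20.0 * rate_bits * math.log10(2.0)

# At about 5 binary digits per source output the limit is ~30 dB;
# the simulated tree coders in the paper reach roughly 1 dB below it.
for rate in (4, 5, 6):
    print(f"R = {rate} bits/sample -> SNR limit ~ {gaussian_snr_limit_db(rate):.1f} dB")
```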

32 citations


Journal ArticleDOI
TL;DR: The concepts of sliding entropy and sliding signal-to-quantizing-noise (S/N) ratio are developed to measure the way in which the entropy and S/N ratio vary with time during a speech utterance.

Abstract: A study of combining two ways of reducing the redundancy in the digital representation of speech signals is presented. Differential pulse-code modulation (DPCM) encodes the signal into digital form and reduces the redundancy due to correlation in adjacent sample values of the signal. Following this DPCM operation, entropy coding is used to reduce redundancy due to the unequal probabilities of the DPCM quantizer levels to be transmitted. Theoretical studies agree with computer simulation results for real speech signals. The concepts of sliding entropy and sliding signal-to-quantizing-noise (S/N) ratio are developed to measure the way in which the entropy and S/N ratio vary with time during a speech utterance. Plots of these quantities versus time for four different utterances are shown. Both adaptive and nonadaptive quantizers are studied, and both uniform and minimum mean-square-error quantizing rules are included. Buffer length requirements are calculated for the entropy coders.
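
A minimal sketch of the two-stage idea (Python, illustrative only): first-order DPCM with a uniform quantizer, followed by a windowed entropy estimate of the quantizer-level stream in the spirit of the paper's sliding entropy. The predictor, step size, window length, and synthetic input are assumptions made here, not the configurations studied in the paper.

```python
import numpy as np

def dpcm_quantize(x, step=0.05):
    """First-order DPCM with a uniform quantizer (illustrative only)."""
    levels = []
    prediction = 0.0
    for sample in x:
        error = sample - prediction          # prediction error
        level = int(round(error / step))     # uniform quantization
        levels.append(level)
        prediction += level * step           # reconstruction the decoder can track
    return np.array(levels)

def sliding_entropy(levels, window=256):
    """Entropy (bits/sample) of the quantizer-level stream over a sliding
    window, analogous to the paper's 'sliding entropy' measure."""
    out = []
    for start in range(0, len(levels) - window + 1, window // 2):
        chunk = levels[start:start + window]
        _, counts = np.unique(chunk, return_counts=True)
        p = counts / counts.sum()
        out.append(float(-(p * np.log2(p)).sum()))
    return out

# Example on a synthetic, speech-like correlated signal.
rng = np.random.default_rng(0)
x = np.cumsum(rng.normal(scale=0.01, size=4000))
print(sliding_entropy(dpcm_quantize(x))[:5])
```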

14 citations


Journal ArticleDOI
TL;DR: The definition of the rate-distortion function is extended to the case of a stationary-ergodic source with side information, and the appropriate coding theorem is proved.
Abstract: The definition of the rate-distortion function is extended to the case of a stationary-ergodic source with side information, and the appropriate coding theorem is proved. Inequalities between the joint, marginal, and conditional rate-distortion functions for ergodic processes are given, and their implications in terms of universal coding are discussed.
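
For orientation, a compact statement of the objects involved (the notation and the memoryless single-letter form are assumptions made here for illustration; the paper itself treats the stationary-ergodic case):

```latex
% Conditional rate-distortion function with side information Y available
% at both encoder and decoder, single-letter form for a memoryless pair (X,Y):
R_{X|Y}(D) \;=\; \min_{p(\hat{x}\mid x,y)\,:\;\mathbb{E}\,d(X,\hat{X}) \le D} I(X;\hat{X}\mid Y)

% Inequalities of the kind discussed, relating the marginal and
% conditional rate-distortion functions:
R_{X|Y}(D) \;\le\; R_{X}(D) \;\le\; R_{X|Y}(D) + I(X;Y)
```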

10 citations


Journal ArticleDOI
TL;DR: An upper bound on the entropy per run in binary run-length coding is a log a - (a - 1) log (a - 1), where a is the average run length, and this upper bound is attained by a time-quantized Poisson square wave.

Abstract: An upper bound on the entropy per run in binary run-length coding is a \log a - (a - 1)\log (a - 1), where a is the average run length. This upper bound is attained by a time-quantized Poisson square wave.
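
A small numerical check (Python, illustrative) that the bound coincides with the entropy of a geometric run-length distribution of mean a, which is the run-length law of a time-quantized Poisson square wave:

```python
import math

def run_entropy_upper_bound(a: float) -> float:
    """Upper bound on entropy per run: a*log2(a) - (a-1)*log2(a-1), a > 1."""
    return a * math.log2(a) - (a - 1) * math.log2(a - 1)

def geometric_run_entropy(a: float) -> float:
    """Entropy of a geometric run-length distribution with mean a,
    P(k) = p*(1-p)**(k-1) with p = 1/a, i.e. the run lengths of a
    time-quantized Poisson square wave."""
    p = 1.0 / a
    h_binary = -p * math.log2(p) - (1 - p) * math.log2(1 - p)
    return h_binary / p

# The two columns agree, which is the sense in which the bound is attained.
for a in (1.5, 4.0, 10.0):
    print(f"a = {a:4.1f}: bound = {run_entropy_upper_bound(a):.4f}, "
          f"geometric = {geometric_run_entropy(a):.4f}")
```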

8 citations


Journal ArticleDOI
TL;DR: The time information for source-encoded, or compressed, sampled time functions is treated as a two-symbol source, the symbols corresponding to redundant and nonredundant samples, and the entropy of this source is used as a reference for the comparison of time codes.
Abstract: The time information for source-encoded, or compressed, sampled time functions is treated as a two-symbol source, the symbols corresponding to redundant and nonredundant samples. The entropy of this source for both the statistically independent symbol model and the Markov model is used as a reference for the comparison of time codes. Time encoding schemes are classified into two categories: total information codes and partial information codes. The following codes are compared on the basis of the number of timing bits per symbol as a function of the compression ratio: binary sequence code, Lynch-Davisson code, time-of-nonredundant-sample code, cluster code, fixed-length binary run-length code, Huffman code, and the binary-nonconsecutive-one code.
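
For the statistically independent symbol model, the reference entropy has a simple closed form. A brief sketch (Python, illustrative; defining the compression ratio as original samples per nonredundant sample is an assumption made here, and the Markov-model reference is not covered):

```python
import math

def timing_entropy_reference(compression_ratio: float) -> float:
    """Timing bits per original sample under the memoryless two-symbol model:
    a sample is nonredundant with probability p = 1/compression_ratio, so the
    reference is the binary entropy H(p)."""
    p = 1.0 / compression_ratio
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

for cr in (2, 5, 10, 20):
    print(f"CR = {cr:2d} -> H ~ {timing_entropy_reference(cr):.3f} bits/sample")
```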

4 citations