
Showing papers on "Entropy encoding" published in 1971


Book
01 Jan 1971
TL;DR: This chapter justifies why each of these quantities is non-negative, states the numerical relations between them, and draws the generic Venn diagram summarizing them.
Abstract: 1. State the mathematical definitions of H(X), H(Y), H(X,Y), H(Y|X), H(X|Y), I(X;Y) and justify why each of these quantities is non-negative. 2. State the numerical relations (inequalities and decompositions) between all these quantities and draw the generic Venn diagram summarizing them. 3. Consider the particular case where X ⊥ Y: justify which inequalities become equalities and show the corresponding Venn diagram highlighting the situation. 4. Consider the particular case where Y = f(X): justify which inequalities become equalities and show the corresponding Venn diagram highlighting the situation.
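
For reference, the standard identities and inequalities the exercise asks for can be summarized as follows; this is a sketch of the usual textbook relations, not an excerpt from the book itself:

```latex
\begin{align*}
H(X,Y) &= H(X) + H(Y\mid X) = H(Y) + H(X\mid Y),\\
I(X;Y) &= H(X) - H(X\mid Y) = H(Y) - H(Y\mid X) = H(X) + H(Y) - H(X,Y),\\
0 &\le H(X\mid Y) \le H(X), \qquad 0 \le I(X;Y) \le \min\{H(X), H(Y)\},\\
X \perp Y &\;\Rightarrow\; I(X;Y) = 0,\quad H(X\mid Y) = H(X),\quad H(X,Y) = H(X) + H(Y),\\
Y = f(X) &\;\Rightarrow\; H(Y\mid X) = 0,\quad I(X;Y) = H(Y),\quad H(X,Y) = H(X).
\end{align*}
```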

49 citations


Journal Article (DOI)
TL;DR: Additional coding of the DPCM output using entropy coding techniques (Huffman or Shannon-Fano coding) can result in a further increase in the signal-to-quantizing-noise ratio of 5.6 dB without increasing the transmission rate.
Abstract: Much of the redundancy in a speech or television signal is eliminated when it is encoded into digital form by a differential pulse-code-modulation (DPCM) encoder. Additional coding of the DPCM output using entropy coding techniques (Huffman or Shannon-Fano coding) can result in a further increase in the signal-to-quantizing-noise ratio of 5.6 dB without increasing the transmission rate.
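
To illustrate the idea in the abstract, here is a minimal Python sketch, not the authors' implementation, that DPCM-encodes a signal with a uniform quantizer and then builds a Huffman code over the quantized prediction residuals; the function names, step size, and sample signal are illustrative assumptions.

```python
import heapq
from collections import Counter
from itertools import count

def dpcm_encode(samples, step=4):
    """First-order DPCM sketch: quantize the difference between each sample
    and the reconstructed previous sample (step size is an assumption)."""
    symbols, prediction = [], 0
    for x in samples:
        q = round((x - prediction) / step)   # quantized prediction residual
        symbols.append(q)
        prediction += q * step               # reconstruction the decoder would track
    return symbols

def huffman_code(symbols):
    """Build a Huffman code (symbol -> bitstring) from symbol frequencies."""
    freq = Counter(symbols)
    tiebreak = count()                       # avoids comparing dicts on equal freqs
    heap = [[f, next(tiebreak), {s: ""}] for s, f in freq.items()]
    heapq.heapify(heap)
    if len(heap) == 1:                       # degenerate single-symbol source
        return {s: "0" for s in freq}
    while len(heap) > 1:
        lo = heapq.heappop(heap)
        hi = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in lo[2].items()}
        merged.update({s: "1" + c for s, c in hi[2].items()})
        heapq.heappush(heap, [lo[0] + hi[0], next(tiebreak), merged])
    return heap[0][2]

if __name__ == "__main__":
    signal = [10, 12, 15, 15, 14, 13, 13, 20, 28, 30]   # toy input samples
    residuals = dpcm_encode(signal)
    code = huffman_code(residuals)
    bitstream = "".join(code[s] for s in residuals)
    print(residuals)
    print(code)
    print(len(bitstream), "bits")
```

Because the DPCM residuals are strongly peaked around zero, the variable-length Huffman code spends fewer bits on the common small residuals, which is the mechanism behind the rate or SNR gain the paper reports.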

29 citations