Journal ArticleDOI

Hyper-Trellis Decoding of Pixel-Domain Wyner–Ziv Video Coding

TL;DR: A new decoding algorithm based on a hyper-trellis, in which multiple states of the original code trellis are combined, significantly improves performance without changing the complexity of the decoder.
Abstract: In this paper, we present a new decoding algorithm for the Wyner-Ziv (WZ) video coding scheme based on turbo codes. In this scheme, a video frame is encoded using a turbo code, and only a subset of the parity bits is sent to the decoder. At the decoder, the temporal correlation of the video sequence is exploited by using the previous frame as noisy side information (SI) for the current frame. However, there is a mismatch between the SI, which is available as pixel values, and the binary code bits. Previous implementations of the decoder use suboptimal approaches that convert pixel values to soft information for code bits. We present a new decoding algorithm for this application based on decoding on a hyper-trellis, in which multiple states of the original code trellis are combined. We show that this approach significantly improves performance without changing the complexity of the decoder. We also introduce a new technique for the WZ decoder to exploit the spatial correlation within a frame without requiring transform-domain encoding at the encoder, thereby reducing its complexity. Simulation results for fixed-rate transmission show a 9-10 dB improvement in the peak signal-to-noise ratio when compared to a WZ video codec that does bitwise decoding and utilizes only the temporal correlation.
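The pixel/bit mismatch described in the abstract can be made concrete with a toy computation. Below, a Laplacian residual model relates a pixel to its side-information value (a common assumption in WZ coding; the scale `b`, the 4-bit pixel depth, and the function names are illustrative, not taken from the paper): the decoder can form symbol-level posteriors directly from the SI pixel, whereas a bitwise decoder must marginalize each bit plane separately, discarding the dependence between bits of the same pixel.

```python
import math

def symbol_posteriors(y, b=2.0, nbits=4):
    """P(x = v | y) for each pixel value v, assuming a Laplacian
    residual x - y with scale b (illustrative parameter)."""
    vals = range(2 ** nbits)
    w = [math.exp(-abs(v - y) / b) for v in vals]
    s = sum(w)
    return [wi / s for wi in w]

def bitwise_llrs(post, nbits=4):
    """Marginalize symbol posteriors into one LLR per bit plane,
    as a bitwise decoder must do (losing inter-bit dependence)."""
    llrs = []
    for k in range(nbits - 1, -1, -1):  # MSB first
        p1 = sum(p for v, p in enumerate(post) if (v >> k) & 1)
        llrs.append(math.log((1 - p1) / p1))
    return llrs

post = symbol_posteriors(y=5.0)
print(max(range(16), key=lambda v: post[v]))  # most likely symbol: 5
print(bitwise_llrs(post))
```

The hyper-trellis construction avoids this marginalization: by merging trellis states, branch metrics can be computed from the symbol posteriors themselves rather than from per-bit LLRs.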


Citations
Journal ArticleDOI
TL;DR: Numerical results verify that the proposed scheme remarkably improves the rate distortion performance of distributed video coding at low bitrate.
Abstract: Available distributed video coding codecs are mostly based on a decoder rate control scheme in which the parity bits needed for decoding are requested over a feedback channel. However, frequent requests over the feedback channel increase the transmission delay. Feedback-free distributed video coding, relying on encoder rate control in the literature, overcomes this shortcoming. However, when performing parity bitrate estimation and related operations, bit-plane-based feedback-free distributed video coding systems usually require highly accurate bitrate estimation and high-quality side information at the encoder. In this paper, we propose a frame-level distributed video coding system based on encoder rate control. The innovations comprise three parts: 1) an adaptive coding mode selection algorithm that exploits both temporal and spatial correlation and reduces the complexity of the encoder; 2) a bit-plane rearrangement method that makes the coding rate on each bit-plane homogeneous, effectively relaxing the accuracy requirement on parity bitrate prediction and improving the efficiency of rate estimation; 3) a frame-level parity bitrate estimation scheme based on a look-up table that further enhances the efficiency of rate estimation. Numerical results verify that the proposed scheme remarkably improves the rate-distortion performance of distributed video coding at low bitrates.
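The encoder-side rate estimation problem this abstract addresses can be illustrated with a minimal sketch (not the paper's actual look-up table; the function names and the binary-symmetric correlation model are assumptions): under a Slepian-Wolf view, the parity rate needed per bit-plane is governed by the conditional entropy H(X|Y), which for a binary symmetric correlation with crossover probability p equals the binary entropy H_b(p).

```python
import math

def binary_entropy(p):
    """H_b(p) in bits; 0 at p in {0, 1}."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def estimate_parity_rate(frame_bits, si_bits):
    """Estimate the parity bitrate needed to correct the side
    information toward the source, from the empirical crossover
    probability between the two bit streams (Slepian-Wolf bound
    H(X|Y) = H_b(p) under a binary symmetric correlation model)."""
    p = sum(a != b for a, b in zip(frame_bits, si_bits)) / len(frame_bits)
    return binary_entropy(p)  # parity bits per source bit

frame = [1, 0, 1, 1, 0, 0, 1, 0]
si    = [1, 0, 0, 1, 0, 0, 1, 1]  # 2 of 8 bits flipped -> p = 0.25
print(estimate_parity_rate(frame, si))  # ~0.811 parity bits per source bit
```

A real encoder-rate-control codec would map such estimates through a trained look-up table and add a safety margin, since underestimating the rate causes a decoding failure that cannot be repaired without a feedback channel.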

5 citations

Dissertation
18 Nov 2010
TL;DR: This thesis proposes new solutions for distributed video coding, especially in multi-camera settings: a new rate-distortion model and its applications, novel side-information generation techniques, and a detailed study of the Wyner-Ziv decoder.
Abstract: Since 2002, distributed video coding has become a major paradigm because of its attractive theoretical results and promising target applications. In such a compression system, all inter-frame comparison is shifted from the encoder to the decoder, which implies an important complexity reduction at the encoder and, moreover, independent encoding of each camera in the case of multiview compression. This thesis aims at proposing new solutions for distributed video coding, especially in multi-camera settings. These contributions address several aspects of the distributed video coding paradigm: a new rate-distortion model and its applications, novel side-information generation techniques, and finally a detailed study of the Wyner-Ziv decoder. All these new approaches aim at enhancing the rate-distortion performance or at leading to a better comprehension of the coder's behavior. They are explained in detail in this manuscript, preceded by a complete overview of their context.

4 citations

Journal Article

4 citations


Cites methods from "Hyper-Trellis Decoding of Pixel-Dom..."

  • ...Besides, channel codes such as turbo codes [22], [23], LDPC codes [24], [31], and trellis codes [25], [26] have been employed for DVC so that the bit rate can approach the conditional entropy of the video source given the side information....


30 Jun 2010
TL;DR: The thesis focuses on the problems that typically emerge when exploiting temporal correlation solely at the decoder, and finds that DVC can outperform intra coding with a similar encoder complexity; however, for a less constrained encoder complexity, conventional inter coding outperforms DVC by a large margin.
Abstract: The main focus of video encoding in the past twenty years has been on video broadcasting. A video is captured and encoded by professional equipment and then watched on varying consumer devices. Consequently, the focus was to increase the quality and to keep down the decoder complexity. In more recent years we observe a shift in user behavior, from solely consuming video to also producing and sharing video. As opposed to professional cameras, such constrained media devices are limited by the encoder complexity. This thesis addresses Distributed Video Coding (DVC) as a possible solution for very low complexity video encoding. Straightforward intra coding techniques at the encoder are combined with exploiting motion information at the decoder side. In particular, the thesis focuses on the problems that typically emerge when exploiting temporal correlation solely at the decoder. The thesis covers performance limitations of different DVC aspects, namely channel coding, motion estimation at the decoder, and quantization. All proposed schemes focus on allowing real-time encoding. In channel coding, we investigate decoder-based modeling. In motion estimation at the decoder, we focus on true motion-based extrapolation. In quantization, we propose a trade-off between adaptivity and overhead. Finally, we compare the derived solutions for each DVC aspect with their counterparts in conventional video coding. We find that DVC can outperform intra coding with a similar encoder complexity. However, for a less constrained encoder complexity, conventional inter coding outperforms DVC by a large margin.

3 citations


Cites methods from "Hyper-Trellis Decoding of Pixel-Dom..."

  • ..., [47], [12], and [18], and, therefore, following the argumentation in [30], our transform domain codec uses a Laplacian distribution....


Proceedings ArticleDOI
22 May 2011
TL;DR: According to the experimental results, under the same total bit budget, the distributed-video-coding-only scheme proves more robust than the scheme with forward error correction, with a gain of up to 1 dB in PSNR.
Abstract: We study the scenario of pixel-domain distributed video coding for noisy transmission environments and propose a method to allocate the available rate between source coding and channel coding to generate a robust video stream. Having observed in experiments the uncertainty of the source and channel coding rates, we model them as random variables via offline training, estimate the decoding failure probability, and calculate the mean end-to-end distortion. Adaptive quantization is performed for each slice to minimize its mean end-to-end distortion. With this joint source-channel rate allocation, we compare the robustness of two coding prototypes, namely distributed video coding and distributed video coding with forward error correction. According to our experimental results, under the same total bit budget, the distributed-video-coding-only scheme proves more robust than the latter, with a gain of up to 1 dB in PSNR.
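The allocation principle in this abstract, minimizing mean end-to-end distortion over candidate quantizers given a decoding-failure probability, can be sketched as follows. The operating points, concealment distortion, and function names below are hypothetical; the actual system estimates the failure probability from offline-trained models of the source and channel coding rates.

```python
def mean_distortion(d_quant, p_fail, d_conceal):
    """Mean end-to-end distortion: decoding succeeds with
    probability 1 - p_fail (distortion = quantization error only),
    else the slice is concealed (distortion = d_conceal)."""
    return (1 - p_fail) * d_quant + p_fail * d_conceal

def pick_quantizer(candidates, d_conceal):
    """Choose, per slice, the candidate minimizing mean distortion.
    Each candidate is (d_quant, p_fail): finer quantizers have lower
    d_quant but need more rate, hence a higher failure probability
    under a fixed bit budget (values below are hypothetical)."""
    return min(candidates, key=lambda c: mean_distortion(c[0], c[1], d_conceal))

# Hypothetical offline-trained operating points for one slice:
candidates = [
    (4.0, 0.02),   # coarse quantizer, rarely fails
    (1.0, 0.10),   # medium
    (0.25, 0.60),  # fine quantizer, often exceeds the budget
]
best = pick_quantizer(candidates, d_conceal=50.0)
print(best)  # -> (4.0, 0.02)
```

Here the coarse quantizer wins because its low failure probability outweighs its larger quantization error; with a cheaper concealment strategy, a finer quantizer could become preferable.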

2 citations


Cites methods from "Hyper-Trellis Decoding of Pixel-Dom..."

  • ...Symbol-based decoding is implemented by trellis merging [7]....


References
Book
01 Jan 1991
TL;DR: The authors examine the roles of entropy, inequalities, and randomness in data compression and in the design and construction of codes.
Abstract: Preface to the Second Edition. Preface to the First Edition. Acknowledgments for the Second Edition. Acknowledgments for the First Edition. 1. Introduction and Preview. 1.1 Preview of the Book. 2. Entropy, Relative Entropy, and Mutual Information. 2.1 Entropy. 2.2 Joint Entropy and Conditional Entropy. 2.3 Relative Entropy and Mutual Information. 2.4 Relationship Between Entropy and Mutual Information. 2.5 Chain Rules for Entropy, Relative Entropy, and Mutual Information. 2.6 Jensen's Inequality and Its Consequences. 2.7 Log Sum Inequality and Its Applications. 2.8 Data-Processing Inequality. 2.9 Sufficient Statistics. 2.10 Fano's Inequality. Summary. Problems. Historical Notes. 3. Asymptotic Equipartition Property. 3.1 Asymptotic Equipartition Property Theorem. 3.2 Consequences of the AEP: Data Compression. 3.3 High-Probability Sets and the Typical Set. Summary. Problems. Historical Notes. 4. Entropy Rates of a Stochastic Process. 4.1 Markov Chains. 4.2 Entropy Rate. 4.3 Example: Entropy Rate of a Random Walk on a Weighted Graph. 4.4 Second Law of Thermodynamics. 4.5 Functions of Markov Chains. Summary. Problems. Historical Notes. 5. Data Compression. 5.1 Examples of Codes. 5.2 Kraft Inequality. 5.3 Optimal Codes. 5.4 Bounds on the Optimal Code Length. 5.5 Kraft Inequality for Uniquely Decodable Codes. 5.6 Huffman Codes. 5.7 Some Comments on Huffman Codes. 5.8 Optimality of Huffman Codes. 5.9 Shannon-Fano-Elias Coding. 5.10 Competitive Optimality of the Shannon Code. 5.11 Generation of Discrete Distributions from Fair Coins. Summary. Problems. Historical Notes. 6. Gambling and Data Compression. 6.1 The Horse Race. 6.2 Gambling and Side Information. 6.3 Dependent Horse Races and Entropy Rate. 6.4 The Entropy of English. 6.5 Data Compression and Gambling. 6.6 Gambling Estimate of the Entropy of English. Summary. Problems. Historical Notes. 7. Channel Capacity. 7.1 Examples of Channel Capacity. 7.2 Symmetric Channels. 7.3 Properties of Channel Capacity. 
7.4 Preview of the Channel Coding Theorem. 7.5 Definitions. 7.6 Jointly Typical Sequences. 7.7 Channel Coding Theorem. 7.8 Zero-Error Codes. 7.9 Fano's Inequality and the Converse to the Coding Theorem. 7.10 Equality in the Converse to the Channel Coding Theorem. 7.11 Hamming Codes. 7.12 Feedback Capacity. 7.13 Source-Channel Separation Theorem. Summary. Problems. Historical Notes. 8. Differential Entropy. 8.1 Definitions. 8.2 AEP for Continuous Random Variables. 8.3 Relation of Differential Entropy to Discrete Entropy. 8.4 Joint and Conditional Differential Entropy. 8.5 Relative Entropy and Mutual Information. 8.6 Properties of Differential Entropy, Relative Entropy, and Mutual Information. Summary. Problems. Historical Notes. 9. Gaussian Channel. 9.1 Gaussian Channel: Definitions. 9.2 Converse to the Coding Theorem for Gaussian Channels. 9.3 Bandlimited Channels. 9.4 Parallel Gaussian Channels. 9.5 Channels with Colored Gaussian Noise. 9.6 Gaussian Channels with Feedback. Summary. Problems. Historical Notes. 10. Rate Distortion Theory. 10.1 Quantization. 10.2 Definitions. 10.3 Calculation of the Rate Distortion Function. 10.4 Converse to the Rate Distortion Theorem. 10.5 Achievability of the Rate Distortion Function. 10.6 Strongly Typical Sequences and Rate Distortion. 10.7 Characterization of the Rate Distortion Function. 10.8 Computation of Channel Capacity and the Rate Distortion Function. Summary. Problems. Historical Notes. 11. Information Theory and Statistics. 11.1 Method of Types. 11.2 Law of Large Numbers. 11.3 Universal Source Coding. 11.4 Large Deviation Theory. 11.5 Examples of Sanov's Theorem. 11.6 Conditional Limit Theorem. 11.7 Hypothesis Testing. 11.8 Chernoff-Stein Lemma. 11.9 Chernoff Information. 11.10 Fisher Information and the Cramér-Rao Inequality. Summary. Problems. Historical Notes. 12. Maximum Entropy. 12.1 Maximum Entropy Distributions. 12.2 Examples. 12.3 Anomalous Maximum Entropy Problem. 12.4 Spectrum Estimation.
12.5 Entropy Rates of a Gaussian Process. 12.6 Burg's Maximum Entropy Theorem. Summary. Problems. Historical Notes. 13. Universal Source Coding. 13.1 Universal Codes and Channel Capacity. 13.2 Universal Coding for Binary Sequences. 13.3 Arithmetic Coding. 13.4 Lempel-Ziv Coding. 13.5 Optimality of Lempel-Ziv Algorithms. Summary. Problems. Historical Notes. 14. Kolmogorov Complexity. 14.1 Models of Computation. 14.2 Kolmogorov Complexity: Definitions and Examples. 14.3 Kolmogorov Complexity and Entropy. 14.4 Kolmogorov Complexity of Integers. 14.5 Algorithmically Random and Incompressible Sequences. 14.6 Universal Probability. 14.7 Kolmogorov Complexity. 14.9 Universal Gambling. 14.10 Occam's Razor. 14.11 Kolmogorov Complexity and Universal Probability. 14.12 Kolmogorov Sufficient Statistic. 14.13 Minimum Description Length Principle. Summary. Problems. Historical Notes. 15. Network Information Theory. 15.1 Gaussian Multiple-User Channels. 15.2 Jointly Typical Sequences. 15.3 Multiple-Access Channel. 15.4 Encoding of Correlated Sources. 15.5 Duality Between Slepian-Wolf Encoding and Multiple-Access Channels. 15.6 Broadcast Channel. 15.7 Relay Channel. 15.8 Source Coding with Side Information. 15.9 Rate Distortion with Side Information. 15.10 General Multiterminal Networks. Summary. Problems. Historical Notes. 16. Information Theory and Portfolio Theory. 16.1 The Stock Market: Some Definitions. 16.2 Kuhn-Tucker Characterization of the Log-Optimal Portfolio. 16.3 Asymptotic Optimality of the Log-Optimal Portfolio. 16.4 Side Information and the Growth Rate. 16.5 Investment in Stationary Markets. 16.6 Competitive Optimality of the Log-Optimal Portfolio. 16.7 Universal Portfolios. 16.8 Shannon-McMillan-Breiman Theorem (General AEP). Summary. Problems. Historical Notes. 17. Inequalities in Information Theory. 17.1 Basic Inequalities of Information Theory. 17.2 Differential Entropy. 17.3 Bounds on Entropy and Relative Entropy. 17.4 Inequalities for Types.
17.5 Combinatorial Bounds on Entropy. 17.6 Entropy Rates of Subsets. 17.7 Entropy and Fisher Information. 17.8 Entropy Power Inequality and Brunn-Minkowski Inequality. 17.9 Inequalities for Determinants. 17.10 Inequalities for Ratios of Determinants. Summary. Problems. Historical Notes. Bibliography. List of Symbols. Index.

45,034 citations


"Hyper-Trellis Decoding of Pixel-Dom..." refers background in this paper

  • ...The first approach [6]–[9] is based on explicit random binning techniques [10], [11] that are typically used in achievability proofs in information theory....


Book
01 Jan 1983

25,017 citations

Book
01 Jan 1990
TL;DR: The updated new edition of the classic Introduction to Algorithms is intended primarily for use in undergraduate or graduate courses in algorithms or data structures and presents a rich variety of algorithms and covers them in considerable depth while making their design and analysis accessible to all levels of readers.
Abstract: From the Publisher: The updated new edition of the classic Introduction to Algorithms is intended primarily for use in undergraduate or graduate courses in algorithms or data structures. Like the first edition, this text can also be used for self-study by technical professionals since it discusses engineering issues in algorithm design as well as the mathematical aspects. In its new edition, Introduction to Algorithms continues to provide a comprehensive introduction to the modern study of algorithms. The revision has been updated to reflect changes in the years since the book's original publication. New chapters on the role of algorithms in computing and on probabilistic analysis and randomized algorithms have been included. Sections throughout the book have been rewritten for increased clarity, and material has been added wherever a fuller explanation has seemed useful or new information warrants expanded coverage. As in the classic first edition, this new edition of Introduction to Algorithms presents a rich variety of algorithms and covers them in considerable depth while making their design and analysis accessible to all levels of readers. Further, the algorithms are presented in pseudocode to make the book easily accessible to students from all programming language backgrounds. Each chapter presents an algorithm, a design technique, an application area, or a related topic. The chapters are not dependent on one another, so the instructor can organize his or her use of the book in the way that best suits the course's needs. Additionally, the new edition offers a 25% increase over the first edition in the number of problems, giving the book 155 problems and over 900 exercises that reinforce the concepts the students are learning.

21,651 citations


"Hyper-Trellis Decoding of Pixel-Dom..." refers background in this paper

  • ...A proper vertex coloring is harmonious if every pair of colors appears on at most one pair of adjacent vertices [28]....


  • ...A vertex coloring is proper if no two adjacent vertices are assigned the same color [28]....


01 Jan 2005

19,250 citations

01 Nov 1985
TL;DR: This month's guest columnist, Steve Bible, N7HPR, is completing a master’s degree in computer science at the Naval Postgraduate School in Monterey, California, and his research area closely follows his interest in amateur radio.
Abstract: Spread Spectrum It’s not just for breakfast anymore! Don't blame me, the title is the work of this month's guest columnist, Steve Bible, N7HPR (n7hpr@tapr.org). While cruising the net recently, I noticed a sudden bump in the number of times Spread Spectrum (SS) techniques were mentioned in the amateur digital areas. While QEX has discussed SS in the past, we haven't touched on it in this forum. Steve was a frequent cogent contributor, so I asked him to give us some background. Steve enlisted in the Navy in 1977 and became a Data Systems Technician, a repairman of shipboard computer systems. In 1985 he was accepted into the Navy’s Enlisted Commissioning Program and attended the University of Utah where he studied computer science. Upon graduation in 1988 he was commissioned an Ensign and entered Nuclear Power School. His subsequent assignment was onboard the USS Georgia, a trident submarine stationed in Bangor, Washington. Today Steve is a Lieutenant and he is completing a master’s degree in computer science at the Naval Postgraduate School in Monterey, California. His areas of interest are digital communications, amateur satellites, VHF/UHF contesting, and QRP. His research area closely follows his interest in amateur radio. His thesis topic is Multihop Packet Radio Routing Protocol Using Dynamic Power Control. Steve is also the AMSAT Area Coordinator for the Monterey Bay area. Here's Steve, I'll have some additional comments at the end.

8,781 citations


"Hyper-Trellis Decoding of Pixel-Dom..." refers methods in this paper

  • ...channels are combined using diversity combining techniques [21] in order to improve performance....
