Topic

Sequential decoding

About: Sequential decoding is a research topic. Over its lifetime, 8,667 publications have been published within this topic, receiving 204,271 citations.


Papers
Proceedings ArticleDOI
01 Dec 2003
TL;DR: Analytic expressions for the exact probability of erasure for systematic, rate-1/2 convolutional codes used to communicate over the binary erasure channel and decoded using the soft-input, soft-output (SISO) and a posteriori probability (APP) algorithms are given.
Abstract: Analytic expressions for the exact probability of erasure for systematic, rate-1/2 convolutional codes used to communicate over the binary erasure channel and decoded using the soft-input, soft-output (SISO) and a posteriori probability (APP) algorithms are given. An alternative forward-backward algorithm which produces the same result as the SISO algorithm is also given. This low-complexity implementation, based upon lookup tables, is of interest for systems that use convolutional codes, such as turbo codes.

57 citations
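The forward-backward (APP) recursion this entry refers to can be made concrete with a short simulation. The sketch below is a toy under stated assumptions, not the paper's analysis or its lookup-table implementation: it assumes a memory-2 systematic rate-1/2 code with generator [1, 1 + D + D^2], an unterminated trellis, and a brute-force BCJR-style pass in which an erased symbol simply imposes no constraint on a branch.

```python
import numpy as np

N_STATES = 4  # memory-2 code, 4 trellis states (assumed for this sketch)

def encode(bits):
    """Systematic encoder: emits (u_k, u_k ^ u_{k-1} ^ u_{k-2}) per input bit."""
    state, out = 0, []
    for u in bits:
        parity = u ^ (state & 1) ^ ((state >> 1) & 1)
        out.append((u, parity))
        state = ((state << 1) | u) & (N_STATES - 1)
    return out

def app_decode_bec(received):
    """received[k] = (systematic, parity), each 0, 1, or None (erased).
    Returns P(u_k = 1 | received); 0.5 means the bit is still unresolved."""
    def consistent(rx, tx):
        return all(r is None or r == t for r, t in zip(rx, tx))

    # Enumerate trellis branches once: (state, input) -> (outputs, next state).
    branch = {(s, u): ((u, u ^ (s & 1) ^ ((s >> 1) & 1)),
                       ((s << 1) | u) & (N_STATES - 1))
              for s in range(N_STATES) for u in (0, 1)}

    n = len(received)
    alpha = np.zeros((n + 1, N_STATES)); alpha[0, 0] = 1.0   # encoder starts in state 0
    beta = np.zeros((n + 1, N_STATES)); beta[n, :] = 1.0     # unterminated trellis
    for k in range(n):                                       # forward pass
        for (s, u), (out, s_next) in branch.items():
            if consistent(received[k], out):
                alpha[k + 1, s_next] += alpha[k, s]
        alpha[k + 1] /= alpha[k + 1].sum()
    for k in range(n - 1, -1, -1):                           # backward pass
        for (s, u), (out, s_next) in branch.items():
            if consistent(received[k], out):
                beta[k, s] += beta[k + 1, s_next]
    post = []
    for k in range(n):                                       # combine passes per bit
        p = np.zeros(2)
        for (s, u), (out, s_next) in branch.items():
            if consistent(received[k], out):
                p[u] += alpha[k, s] * beta[k + 1, s_next]
        post.append(p[1] / p.sum())
    return post

rng = np.random.default_rng(1)
bits = rng.integers(0, 2, 10).tolist()
rx = [tuple(b if rng.random() > 0.3 else None for b in pair)   # BEC, erasure prob. 0.3
      for pair in encode(bits)]
print(bits)
print([round(float(p), 2) for p in app_decode_bec(rx)])
```

Posteriors of exactly 0 or 1 correspond to recovered bits, while 0.5 marks bits the decoder must leave erased; the probability of that residual-erasure event is the quantity the paper characterizes analytically.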

Patent
Yoshikazu Kobayashi
31 Mar 1998
TL;DR: An image decoding apparatus that generates a decoded image from a code sequence is presented. It includes an entropy decoding unit, implemented by the computer, that reads one code of the code sequence out of the memory via the bus and performs entropy decoding on the read code to generate a decoded value.
Abstract: An image decoding apparatus generates a decoded image from a code sequence. The decoding apparatus has a bus, a computer and a memory, wherein the computer and the memory are connected to each other via the bus. The code sequence is generated by performing orthogonal transform, quantization and entropy coding on image data, and is stored in the memory. The decoding apparatus includes an entropy decoding unit, implemented by the computer, for reading one code out of the code sequence stored in the memory via the bus and performing entropy decoding on the read code to generate a decoded value. The apparatus also includes a coefficient generating unit, implemented by the computer, for generating at least one orthogonal transform coefficient according to the generated decoded value. A writing unit, also implemented by the computer, writes the generated orthogonal transform coefficient or coefficients into the memory via the bus. A decode controlling unit is further provided for instructing the entropy decoding unit to process the next code in the code sequence.

57 citations
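As a rough illustration of the data flow described in the abstract above (and only that; this is not the patented apparatus or any real codec), the sketch below wires together placeholder units. The names EntropyDecoder, CoefficientGenerator, Writer, and DecodeController, and the toy (run, level) rule standing in for real entropy coding, are all assumptions made for the example.

```python
class Memory:
    """Stands in for the shared memory reached over the bus."""
    def __init__(self, code_sequence):
        self.code_sequence = list(code_sequence)
        self.coefficients = []

class EntropyDecoder:
    """Reads one code at a time and turns it into a decoded value."""
    def __init__(self, memory):
        self.memory, self.pos = memory, 0
    def decode_next(self):
        if self.pos >= len(self.memory.code_sequence):
            return None
        code = self.memory.code_sequence[self.pos]
        self.pos += 1
        return code  # placeholder: a real decoder would consult a VLC table

class CoefficientGenerator:
    """Expands a decoded (run, level) value into transform coefficients."""
    def generate(self, value):
        run, level = value
        return [0] * run + [level]

class Writer:
    """Writes generated coefficients back into memory."""
    def __init__(self, memory):
        self.memory = memory
    def write(self, coeffs):
        self.memory.coefficients.extend(coeffs)

class DecodeController:
    """Drives the loop: decode one code, generate coefficients, write, repeat."""
    def __init__(self, memory):
        self.decoder = EntropyDecoder(memory)
        self.generator = CoefficientGenerator()
        self.writer = Writer(memory)
    def run(self):
        while (value := self.decoder.decode_next()) is not None:
            self.writer.write(self.generator.generate(value))

mem = Memory([(0, 35), (2, -4), (5, 1)])   # toy (run, level) codes
DecodeController(mem).run()
print(mem.coefficients)                    # [35, 0, 0, -4, 0, 0, 0, 0, 0, 1]
```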

Journal ArticleDOI
29 Jun 1997
TL;DR: A detailed analysis of generalized minimum distance (GMD) decoding algorithms for Euclidean-space codes is presented, and it is proved that although the decoding regions are polyhedral, they are essentially always nonconvex.
Abstract: We present a detailed analysis of generalized minimum distance (GMD) decoding algorithms for Euclidean space codes. In particular, we completely characterize GMD decoding regions in terms of receiver front-end properties. This characterization is used to show that GMD decoding regions have intricate geometry. We prove that although these decoding regions are polyhedral, they are essentially always nonconvex. We furthermore show that conventional performance parameters, such as error-correction radius and effective error coefficient, do not capture the essential geometric features of a GMD decoding region, and thus do not provide a meaningful measure of performance. As an alternative, probabilistic estimates of, and upper bounds upon, the performance of GMD decoding are developed. Furthermore, extensive simulation results, for both low-dimensional and high-dimensional sphere-packings, are presented. These simulations show that multilevel codes in conjunction with multistage GMD decoding provide significant coding gains at a very low complexity. Simulated performance, in both cases, is in remarkably close agreement with our probabilistic approximations.

57 citations
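For context on the algorithm whose decoding regions the paper above analyzes, the following is a minimal sketch of one common form of the GMD procedure: erase progressively more of the least reliable positions and rescore each resulting errors-and-erasures decoding. The (3,1) repetition code and the correlation metric used here are assumptions chosen only so the sketch runs end to end; they have nothing to do with the multilevel codes in the paper.

```python
import numpy as np

def gmd_decode(received, d_min, ee_decode):
    """received: real-valued BPSK channel outputs for one codeword.
    ee_decode: an assumed errors-and-erasures decoder for the underlying code."""
    hard = (received < 0).astype(int)      # hard decisions: negative -> 1
    reliab = np.abs(received)              # reliability of each decision
    order = np.argsort(reliab)             # least reliable positions first
    best, best_metric = None, -np.inf
    # Erase 0, 2, 4, ... of the least reliable positions, up to d_min - 1.
    for n_erase in range(0, d_min, 2):
        erasures = set(order[:n_erase].tolist())
        candidate = ee_decode(hard.copy(), erasures)
        if candidate is None:
            continue
        signs = 1 - 2 * candidate          # map bits 0/1 to +1/-1
        metric = float(np.dot(signs, received))   # correlation with the channel output
        if metric > best_metric:
            best, best_metric = candidate, metric
    return best

def repetition_ee_decode(hard, erasures):
    """Errors-and-erasures decoder for the (3,1) repetition code (d_min = 3)."""
    votes = [hard[i] for i in range(3) if i not in erasures]
    if not votes:
        return None
    bit = int(sum(votes) * 2 > len(votes))  # majority vote on unerased positions
    return np.array([bit, bit, bit])

rx = np.array([0.9, -0.1, 0.7])            # +1 sent three times, one noisy sample
print(gmd_decode(rx, d_min=3, ee_decode=repetition_ee_decode))   # -> [0 0 0]
```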

Journal ArticleDOI
TL;DR: In this article, the authors gave an explicit construction of a family of capacity-achieving binary t-write WOM codes for any number of writes t, which have polynomial time encoding and decoding algorithms.
Abstract: In this paper, we give an explicit construction of a family of capacity-achieving binary t-write WOM codes for any number of writes t, which have polynomial-time encoding and decoding algorithms. The block length of our construction is N = (t/ε)^{O(t/(δε))}, where ε is the gap to capacity, and encoding and decoding run in time N^{1+δ}. This is the first deterministic construction achieving these parameters. Our techniques also apply to larger alphabets.

57 citations
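For readers unfamiliar with WOM codes: a write-once memory allows cells to change only from 0 to 1, and a t-write WOM code lets t messages be written under that constraint. The sketch below shows the classic Rivest-Shamir 2-write code (2 bits stored twice in 3 cells) purely as a small illustration of the concept; it is not the capacity-achieving t-write construction of the paper above.

```python
# First-generation codewords: message -> cells (cells may only change 0 -> 1).
FIRST = {(0, 0): (0, 0, 0), (0, 1): (1, 0, 0),
         (1, 0): (0, 1, 0), (1, 1): (0, 0, 1)}
# Second-generation codewords are the bitwise complements of the first ones.
COMPLEMENT = {m: tuple(1 - c for c in cw) for m, cw in FIRST.items()}

def wom_write(cells, message):
    """Return new cell contents encoding `message`, never flipping a 1 back to 0."""
    if sum(cells) <= 1:                               # still in first generation
        target = FIRST[message]
        if all(c <= t for c, t in zip(cells, target)):
            return target                             # first write (or unchanged rewrite)
    target = COMPLEMENT[message]                      # second-generation codeword
    assert all(c <= t for c, t in zip(cells, target)), "WOM constraint violated"
    return target

def wom_read(cells):
    """Decode: low-weight states use FIRST, high-weight states its complement."""
    if sum(cells) <= 1:
        return next(m for m, cw in FIRST.items() if cw == cells)
    return next(m for m, cw in COMPLEMENT.items() if cw == cells)

cells = (0, 0, 0)
cells = wom_write(cells, (1, 0)); print(cells, wom_read(cells))   # (0, 1, 0) (1, 0)
cells = wom_write(cells, (0, 1)); print(cells, wom_read(cells))   # (0, 1, 1) (0, 1)
```

The point of the example is the rate: 4 bits of data are stored in 3 cells across the two writes, which already beats writing each message into fresh cells, and the paper's construction pushes this idea to arbitrarily many writes at rates approaching capacity.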

Journal ArticleDOI
TL;DR: The authors extend this pragmatic approach to the case where the core of the trellis decoder is a Viterbi decoder for a punctured version of the de facto standard, rate 1/2 convolutional code.
Abstract: A single convolutional code of fixed rate can be punctured to form a class of higher rate convolutional codes. The authors extend this pragmatic approach to the case where the core of the trellis decoder is a Viterbi decoder for a punctured version of the de facto standard, rate 1/2 convolutional code.

57 citations
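The puncturing idea in the abstract above can be shown with a short sketch. The rate-3/4 pattern used below ([1 1 0] / [1 0 1]) is a commonly used one but is an assumption here, and the rate-1/2 Viterbi decoder itself is assumed to exist elsewhere; the point is only that punctured positions are re-inserted as neutral soft values so the unmodified mother-code decoder can be reused.

```python
PATTERN = [[1, 1, 0],   # which outputs of encoder stream 0 are transmitted
           [1, 0, 1]]   # which outputs of encoder stream 1 are transmitted
PERIOD = len(PATTERN[0])

def puncture(coded):
    """coded: list of (c0, c1) pairs from the rate-1/2 mother encoder."""
    out = []
    for k, (c0, c1) in enumerate(coded):
        if PATTERN[0][k % PERIOD]:
            out.append(c0)
        if PATTERN[1][k % PERIOD]:
            out.append(c1)
    return out

def depuncture(channel_values, n_pairs, neutral=0.0):
    """Re-insert a neutral soft value (zero metric contribution) at punctured
    slots, so an ordinary rate-1/2 Viterbi decoder can run on the result."""
    it = iter(channel_values)
    pairs = []
    for k in range(n_pairs):
        c0 = next(it) if PATTERN[0][k % PERIOD] else neutral
        c1 = next(it) if PATTERN[1][k % PERIOD] else neutral
        pairs.append((c0, c1))
    return pairs

coded = [(1, 0), (0, 1), (1, 1), (0, 0), (1, 0), (1, 1)]   # toy mother-code output
tx = puncture(coded)        # 8 bits sent for 6 information bits -> rate 3/4
print(tx)
print(depuncture([1.0 if b else -1.0 for b in tx], n_pairs=len(coded)))
```

Because the deleted positions contribute nothing to any branch metric, the same rate-1/2 decoder serves every punctured rate, which is the pragmatic reuse the authors describe.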


Network Information
Related Topics (5)
MIMO: 62.7K papers, 959.1K citations, 90% related
Fading: 55.4K papers, 1M citations, 90% related
Base station: 85.8K papers, 1M citations, 89% related
Wireless network: 122.5K papers, 2.1M citations, 87% related
Wireless: 133.4K papers, 1.9M citations, 86% related
Performance Metrics
No. of papers in the topic in previous years:
2023: 51
2022: 112
2021: 24
2020: 26
2019: 22
2018: 32