Topic

List decoding

About: List decoding is a research topic. Over its lifetime, 7,251 publications have been published within this topic, receiving 151,182 citations.


Papers
Journal ArticleDOI
TL;DR: A decoding algorithm is constructed which turns out to be a generalization of the Peterson algorithm for decoding BCH codes.
Abstract: A class of codes derived from algebraic plane curves is constructed. The concepts and results from algebraic geometry that are used are explained in detail; no further knowledge of algebraic geometry is needed. Parameters, generator and parity-check matrices are given. The main result is a decoding algorithm which turns out to be a generalization of the Peterson algorithm for decoding BCH codes.

128 citations
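
A minimal sketch of Peterson-style syndrome decoding, the classical BCH/Reed-Solomon procedure that the paper above generalizes to algebraic-geometry codes. The field GF(7), the primitive element 3, and the one-error example are illustrative assumptions, not parameters taken from the paper.

```python
# Toy Peterson decoder over the prime field GF(7) (assumed parameters,
# chosen only so the arithmetic stays readable).
p = 7          # field size
alpha = 3      # a primitive element of GF(7): its powers are 3, 2, 6, 4, 5, 1

def syndromes(received, t):
    """S_j = r(alpha^j) for j = 1..2t; all zeros means no error detected."""
    return [sum(c * pow(alpha, j * i, p) for i, c in enumerate(received)) % p
            for j in range(1, 2 * t + 1)]

def solve_mod_p(A, b):
    """Gaussian elimination mod p; returns x with A x = b, or None if singular."""
    n = len(A)
    M = [row[:] + [rhs] for row, rhs in zip(A, b)]
    for col in range(n):
        piv = next((r for r in range(col, n) if M[r][col]), None)
        if piv is None:
            return None
        M[col], M[piv] = M[piv], M[col]
        inv = pow(M[col][col], p - 2, p)
        M[col] = [x * inv % p for x in M[col]]
        for r in range(n):
            if r != col and M[r][col]:
                f = M[r][col]
                M[r] = [(x - f * y) % p for x, y in zip(M[r], M[col])]
    return [M[r][n] for r in range(n)]

def peterson_error_locations(received, t):
    """Solve the syndrome system for the error-locator polynomial
    Lambda(x) = 1 + L_1 x + ... + L_t x^t, then find its roots by search."""
    S = syndromes(received, t)
    if not any(S):
        return []
    # Peterson's linear system: row j is S_j..S_{j+t-1} times [L_t..L_1]^T = -S_{j+t}
    A = [[S[i + j] for j in range(t)] for i in range(t)]
    b = [(-S[t + i]) % p for i in range(t)]
    lam = solve_mod_p(A, b)
    if lam is None:
        return None            # fewer than t errors; a full decoder retries with smaller t
    coeffs = [1] + lam[::-1]   # [1, L_1, ..., L_t]
    # Position i is in error iff alpha^{-i} is a root of Lambda (Chien-style search).
    return [i for i in range(len(received))
            if sum(c * pow(pow(alpha, -i, p), k, p) for k, c in enumerate(coeffs)) % p == 0]

# Example: [6, 2, 1, 0, 0, 0] encodes x^2 + 2x + 6, a codeword of the t = 1 code;
# corrupting position 4 gives [6, 2, 1, 0, 5, 0], and the decoder locates the error.
print(peterson_error_locations([6, 2, 1, 0, 5, 0], t=1))   # -> [4]
```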

Journal ArticleDOI
TL;DR: This work presents a polynomial-time constructible, asymptotically good family of binary codes of rate Ω(ε⁴) that can be list-decoded in polynomial time from up to a fraction (1/2 - ε) of errors, using lists of size O(ε⁻²).
Abstract: Informally, an error-correcting code has "nice" list-decodability properties if every Hamming ball of "large" radius has a "small" number of codewords in it. We report linear codes with nontrivial list-decodability: i.e., codes of large rate that are nicely list-decodable, and codes of large distance that are not nicely list-decodable. Specifically, on the positive side, we show that there exist codes of rate R and block length n that have at most c codewords in every Hamming ball of radius H⁻¹(1 - R - 1/c)·n. This answers the main open question from the work of Elias (1957). This result also has consequences for the construction of concatenated codes of good rate that are list-decodable from a large fraction of errors, improving previous results of Guruswami and Sudan (see IEEE Trans. Inform. Theory, vol. 45, pp. 1757-1767, Sept. 1999, and Proc. 32nd ACM Symp. Theory of Computing (STOC), Portland, OR, pp. 181-190, May 2000) in this vein. Specifically, for every ε > 0, we present a polynomial-time constructible, asymptotically good family of binary codes of rate Ω(ε⁴) that can be list-decoded in polynomial time from up to a fraction (1/2 - ε) of errors, using lists of size O(ε⁻²). On the negative side, we show that for every δ and c, there exist τ < δ, c₁ > 0, and an infinite family of linear codes {C_i} such that if n_i denotes the block length of C_i, then C_i has minimum distance at least δ·n_i and contains more than c₁·n_i^c codewords in some Hamming ball of radius τ·n_i. While this result is still far from known bounds on the list-decodability of linear codes, it is the first to bound the "radius for list-decodability by a polynomial-sized list" away from the minimum distance of the code.

128 citations
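
The combinatorial notion in the abstract (few codewords in every Hamming ball of a given radius) can be made concrete with a brute-force list decoder that simply enumerates the ball. The small binary [6,3] generator matrix below is an arbitrary illustration, not a code from the paper, and this exhaustive approach is exponential in the dimension, unlike the polynomial-time decoders the paper constructs.

```python
# Brute-force list decoding of a tiny binary linear code (illustrative only).
from itertools import product

G = [  # generator matrix of an arbitrary binary [6,3] code
    [1, 0, 0, 1, 1, 0],
    [0, 1, 0, 1, 0, 1],
    [0, 0, 1, 0, 1, 1],
]

def codewords(G):
    """Enumerate all 2^k codewords of the binary code generated by G."""
    k = len(G)
    for msg in product([0, 1], repeat=k):
        yield tuple(sum(m * g for m, g in zip(msg, col)) % 2 for col in zip(*G))

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def list_decode(received, radius):
    """All codewords within the given Hamming radius of `received`;
    the code is nicely list-decodable at this radius if the list is always small."""
    return [c for c in codewords(G) if hamming(c, received) <= radius]

# The ball of radius 2 around this arbitrary received word contains three codewords.
print(list_decode((1, 1, 0, 0, 1, 0), radius=2))
```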

Journal ArticleDOI
TL;DR: This paper provides a survey of the existing literature on the decoding of algebraic-geometric codes and shows what has been done, discusses what still has to be done, and poses some open problems.
Abstract: This paper provides a survey of the existing literature on the decoding of algebraic-geometric codes. Definitions, theorems, and cross references will be given. We show what has been done, discuss what still has to be done, and pose some open problems.

127 citations

Journal ArticleDOI
TL;DR: In this article, a universal decoding procedure for finite-state channels is proposed, which achieves an error probability with an error exponent that, for large enough block length, is equal to the random coding error exponent associated with the optimal maximum likelihood decoding procedure.
Abstract: Universal decoding procedures for finite-state channels are discussed. Although the channel statistics are not known, universal decoding can achieve an error probability with an error exponent that, for large enough block length (or constraint length in case of convolutional codes), is equal to the random-coding error exponent associated with the optimal maximum-likelihood decoding procedure for the given channel. The same approach is applied to sequential decoding, yielding a universal sequential decoding procedure with a cutoff rate and an error exponent that are equal to those achieved by the classical sequential decoding procedure.

126 citations
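
As a rough illustration of the universal-decoding idea (picking a codeword without knowing the channel statistics), here is a toy maximum empirical mutual information (MMI) decoder for a memoryless channel. This is only a hedged sketch of the general concept; the finite-state and sequential procedures analyzed in the paper are different, and the four-word codebook below is made up.

```python
# Toy universal (MMI-style) decoder: rank codewords by the empirical mutual
# information between codeword and received sequence, using no channel model.
import math
from collections import Counter

def empirical_mi(x, y):
    """Empirical mutual information (in bits) of the paired sequence (x_i, y_i)."""
    n = len(x)
    joint = Counter(zip(x, y))
    px, py = Counter(x), Counter(y)
    return sum((c / n) * math.log2(c * n / (px[a] * py[b]))
               for (a, b), c in joint.items())

def mmi_decode(received, codebook):
    """Choose the codeword whose empirical dependence on `received` is strongest."""
    return max(codebook, key=lambda cw: empirical_mi(cw, received))

# Made-up codebook of four length-6 binary codewords (illustration only).
codebook = [(0, 0, 0, 0, 0, 0), (1, 1, 1, 0, 0, 0), (0, 0, 1, 1, 1, 0), (1, 1, 0, 1, 0, 1)]
print(mmi_decode((1, 1, 1, 0, 1, 0), codebook))   # -> (1, 1, 1, 0, 0, 0)
```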

Journal ArticleDOI
TL;DR: A method for improving the performance of low-density parity-check (LDPC) codes in the high-SNR (error floor) region is presented; the method is universal, as it can be applied to any LDPC code/channel/decoding algorithm, and it improves performance at the expense of increased code length.
Abstract: We discuss error floor asymptotics and present a method for improving the performance of low-density parity-check (LDPC) codes in the high SNR (error floor) region. The method is based on Tanner graph covers that do not have trapping sets from the original code. The advantages of the method are that it is universal, as it can be applied to any LDPC code/channel/decoding algorithm, and that it improves performance at the expense of increasing the code length, without losing the code regularity, without changing the decoding algorithm, and, under certain conditions, without lowering the code rate. The proposed method can also be modified to construct convolutional LDPC codes. The method is illustrated by modifying Tanner, MacKay, and Margulis codes to improve performance on the binary symmetric channel (BSC) under the Gallager B decoding algorithm. Decoding results on the AWGN channel are also presented to illustrate that optimizing codes for one channel/decoding algorithm can lead to performance improvements on other channels.

125 citations
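
For context on the decoding setting, below is a minimal hard-decision bit-flipping decoder for a small parity-check code on the BSC. It is a simpler relative of the Gallager B message-passing algorithm named in the abstract, not the paper's graph-cover construction, and the toy check matrix H (here the (7,4) Hamming code) stands in for a real LDPC code.

```python
# Toy bit-flipping decoder on the BSC: flip the bit involved in the most
# unsatisfied parity checks until every check is satisfied (or we give up).

H = [  # parity-check matrix of the (7,4) Hamming code, used as a stand-in
    [1, 1, 0, 1, 1, 0, 0],
    [1, 0, 1, 1, 0, 1, 0],
    [0, 1, 1, 1, 0, 0, 1],
]

def bit_flip_decode(received, H, max_iters=20):
    word = list(received)
    for _ in range(max_iters):
        unsatisfied = [sum(h * w for h, w in zip(row, word)) % 2 for row in H]
        if not any(unsatisfied):
            return word                        # reached a valid codeword
        # For each bit, count the failing checks it participates in.
        votes = [sum(u and row[j] for u, row in zip(unsatisfied, H))
                 for j in range(len(word))]
        word[votes.index(max(votes))] ^= 1     # flip the most suspicious bit
    return word                                # decoding failure after max_iters

# Example: the all-zero codeword with bit 3 flipped by the channel is corrected.
print(bit_flip_decode([0, 0, 0, 1, 0, 0, 0], H))   # -> [0, 0, 0, 0, 0, 0, 0]
```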


Network Information
Related Topics (5)
Base station: 85.8K papers, 1M citations, 89% related
Fading: 55.4K papers, 1M citations, 89% related
Wireless network: 122.5K papers, 2.1M citations, 87% related
Network packet: 159.7K papers, 2.2M citations, 87% related
Wireless: 133.4K papers, 1.9M citations, 86% related
Performance Metrics
No. of papers in the topic in previous years:
Year    Papers
2023    84
2022    153
2021    79
2020    78
2019    82
2018    94