Topic
List decoding
About: List decoding is a research topic. Over its lifetime, 7251 publications on this topic have appeared, receiving 151182 citations.
Papers published on a yearly basis
Papers
TL;DR: This paper presents three algorithms based on stochastic computation to reduce the decoding complexity of non-binary low-density parity-check codes, the first of which matches sum-product performance over Galois fields of low order and small variable node degree, and studies the performance and complexity of all three.
Abstract: Despite the outstanding performance of non-binary low-density parity-check (LDPC) codes over many communication channels, they are not in widespread use yet. This is due to the high implementation complexity of their decoding algorithms, even those that compromise performance for the sake of simplicity. In this paper, we present three algorithms based on stochastic computation to reduce the decoding complexity. The first is a purely stochastic algorithm with error-correcting performance matching that of the sum-product algorithm (SPA) for LDPC codes over Galois fields with low order and a small variable node degree. We also present a modified version which reduces the number of decoding iterations required while remaining purely stochastic and having a low per-iteration complexity. The second algorithm, relaxed half-stochastic (RHS) decoding, combines elements of the SPA and the stochastic decoder and uses successive relaxation to match the error-correcting performance of the SPA. Furthermore, it uses fewer iterations than the purely stochastic algorithm and does not have limitations on the field order and variable node degree of the codes it can decode. The third algorithm, NoX, is a fully stochastic specialization of RHS for codes with a variable node degree 2 that offers similar performance, but at a significantly lower computational complexity. We study the performance and complexity of the algorithms; noting that all have lower per-iteration complexity than SPA and that RHS can have comparable average per-codeword computational complexity, and NoX a lower one.
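The core mechanism shared by the RHS and purely stochastic decoders above is representing a probability as a random bit stream and recovering it by smoothing. A minimal sketch of that idea, with all parameter values chosen for illustration only (the paper's decoders operate on LDPC message-passing graphs, not a single stream):

```python
import random

def bernoulli_stream(p, n, rng):
    """Encode probability p as a stream of n stochastic bits."""
    return [1 if rng.random() < p else 0 for _ in range(n)]

def relaxed_tracker(bits, beta=0.05, init=0.5):
    """Successive relaxation: exponentially smooth the incoming bit
    stream to track the underlying probability, the mechanism RHS uses
    to combine stochastic messages with SPA-style soft values."""
    est = init
    for b in bits:
        est = (1.0 - beta) * est + beta * b
    return est

rng = random.Random(0)
bits = bernoulli_stream(0.8, 5000, rng)
est = relaxed_tracker(bits)
print(est)
```

With a relaxation factor of 0.05 the tracker settles near the encoded probability of 0.8; a smaller beta gives a lower-variance but slower estimate, which is exactly the iterations-versus-accuracy trade-off the abstract describes.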
37 citations
18 Jun 2000
TL;DR: It is shown that maximum likelihood (ML) sequential decoding and maximum a posteriori (MAP) sequence estimation give significant decoding improvements over hard decisions alone, and that it is possible to exploit the inherent meaning of the codewords without additional transmission of side information, which yields a further gain.
Abstract: We present the results of two methods for soft decoding of variable-length codes. We first show that maximum likelihood (ML) sequential decoding and maximum a posteriori (MAP) sequence estimation gives significant decoding improvements over hard decisions alone, then we show that further improvements can be gained by additional transmission of the symbol length. Finally, we show that it is possible to make use of the inherent meaning of the codewords without additional transmission of side information which results in a further gain.
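The difficulty with variable-length codes is that bit errors desynchronize symbol boundaries, so soft decoding must score whole symbol sequences. A toy exhaustive ML sequence decoder over a hypothetical three-symbol codebook (the codebook and soft-value model are illustrative assumptions, not the paper's setup):

```python
import math
from itertools import product

# Hypothetical variable-length codebook: symbol -> bits
CODE = {"a": [0], "b": [1, 0], "c": [1, 1]}

def sequence_likelihood(bits, soft):
    """Log-likelihood of a bit sequence given soft channel outputs,
    where soft[i] is the received probability that bit i is 1."""
    return sum(math.log(p1 if b == 1 else 1.0 - p1)
               for b, p1 in zip(bits, soft))

def ml_decode(soft, max_symbols=4):
    """Exhaustive ML sequence decoding: try every symbol sequence whose
    encoding has exactly len(soft) bits and keep the most likely one."""
    best, best_ll = None, float("-inf")
    for n in range(1, max_symbols + 1):
        for seq in product(CODE, repeat=n):
            bits = [b for s in seq for b in CODE[s]]
            if len(bits) != len(soft):
                continue
            ll = sequence_likelihood(bits, soft)
            if ll > best_ll:
                best, best_ll = "".join(seq), ll
    return best

# "ba" encodes to 1,0,0; the noisy soft values still favour it
print(ml_decode([0.9, 0.2, 0.1]))  # -> ba
```

The exhaustive search is exponential in the number of symbols; practical decoders replace it with a trellis search, but the scoring principle is the same.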
37 citations
TL;DR: An improved method for the fast correlation attack on certain stream ciphers is presented. The algorithm employs two decoding approaches: list decoding, in which a candidate is assigned to the list based on the most reliable information sets, and minimum distance decoding based on Hamming distance. The performance and complexity of the proposed algorithm are analyzed.
Abstract: An improved method for the fast correlation attack on certain stream ciphers is presented. The proposed algorithm employs the following decoding approaches: list decoding, in which a candidate is assigned to the list based on the most reliable information sets, and minimum distance decoding based on Hamming distance. Performance and complexity of the proposed algorithm are considered. A desirable characteristic of the proposed algorithm is its theoretical analyzability, so that its performance can also be estimated in cases where corresponding experiments are not feasible due to current technological limitations. The algorithm is compared with relevant recently reported algorithms, and its advantages are pointed out. Finally, the proposed algorithm is considered in the security evaluation context of a proposal (NESSIE) for stream ciphers.
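The two-stage structure described above, a reliability-ranked candidate list followed by minimum-distance selection, can be sketched on a toy code. This is illustrative only: the actual attack recovers LFSR states rather than searching an explicit codeword list, and the codebook and reliabilities below are made up.

```python
def list_decode_min_distance(received, reliabilities, codewords, list_size=4):
    """Stage 1: rank codewords by agreement with the received word on the
    most reliable positions and keep a short list.
    Stage 2: return the list entry at minimum Hamming distance to the
    full received word."""
    # Most reliable half of the positions
    order = sorted(range(len(received)), key=lambda i: -reliabilities[i])
    top = order[: len(received) // 2]

    def reliable_agreement(cw):
        return sum(cw[i] == received[i] for i in top)

    candidates = sorted(codewords, key=reliable_agreement,
                        reverse=True)[:list_size]

    def hamming(cw):
        return sum(a != b for a, b in zip(cw, received))

    return min(candidates, key=hamming)

codebook = [[0] * 6, [1] * 6, [0, 1] * 3, [1, 0] * 3]  # toy length-6 code
rx = [1, 1, 0, 1, 1, 1]                 # all-ones word with one flipped bit
rel = [0.9, 0.8, 0.1, 0.7, 0.9, 0.6]    # flipped position is least reliable
print(list_decode_min_distance(rx, rel, codebook))  # -> [1, 1, 1, 1, 1, 1]
```

Ranking on reliable positions first keeps the list short without discarding the true codeword, which is what makes the attack's complexity analyzable.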
37 citations
31 Oct 2007
TL;DR: An image decoding apparatus capable of decoding coded bit streams with different coding schemes is described; it includes a coding scheme decision section for deciding the coding scheme from identification information multiplexed into a coded bit stream, a setting unit for setting header information for a second coding scheme in accordance with header information of a first coding scheme, and a decoder for decoding image data coded in the first scheme using that header information.
Abstract: An image decoding apparatus is capable of decoding coded bit streams with different coding schemes. The image decoding apparatus includes a coding scheme decision section for deciding a coding scheme from coding scheme identification information multiplexed into a coded bit stream, a setting unit for setting header information on a second coding scheme in accordance with header information in a first coding scheme, and a decoder for decoding image coded data in the first coding scheme using the header information set for the second coding scheme.
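The apparatus structure, a scheme decision from a stream identifier, then header translation, then decoding, can be sketched as a small dispatch. Everything here (scheme identifiers, header field names) is a hypothetical stand-in; the patent does not specify concrete formats:

```python
# Hypothetical scheme identifiers multiplexed into the bit stream
SCHEME_IDS = {0x01: "scheme_a", 0x02: "scheme_b"}

def translate_header(header_a):
    """Map (hypothetical) first-scheme header fields onto the fields
    the second-scheme decoder expects."""
    return {"width": header_a["w"], "height": header_a["h"]}

def decode(bitstream):
    """Decide the scheme, translate the header if needed, then hand the
    data to a decoder driven by the translated header."""
    scheme = SCHEME_IDS.get(bitstream["scheme_id"], "unknown")
    if scheme == "scheme_a":
        hdr = translate_header(bitstream["header"])
    else:
        hdr = bitstream["header"]
    return {"scheme": scheme, "header": hdr}

print(decode({"scheme_id": 0x01, "header": {"w": 640, "h": 480}}))
```

The point of the design is that a single decoder core (for the second scheme's header layout) serves both schemes, with only the header-setting unit differing per scheme.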
37 citations
TL;DR: The conjecture of Rujan on error-correcting codes is proven: errors in decoding signals transmitted through noisy channels are minimized when the signals are decoded at a particular finite temperature.
Abstract: The conjecture of Rujan on error-correcting codes is proven. Errors in decoding of signals transmitted through noisy channels assume the smallest values when signals are decoded at a particular finite temperature. This finite-temperature decoding is compared with the conventional maximum likelihood decoding, which corresponds to the T = 0 case. The method of gauge transformation in the spin glass theory is useful in this argument.
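Finite-temperature decoding amounts to weighting each codeword by its likelihood raised to the power 1/T and deciding each bit from the resulting marginal; T approaching 0 recovers maximum likelihood, where the single best codeword dominates. A toy sketch on a hypothetical repetition code over a binary symmetric channel (the paper works in the spin-glass formulation, not this enumeration):

```python
import math

def finite_temperature_decode(received, codewords, noise_p, T):
    """Symbol-wise decoding at temperature T: weight each codeword by
    exp(log-likelihood / T) under a binary symmetric channel with
    crossover probability noise_p, then threshold each bit's marginal."""
    weights = []
    for cw in codewords:
        d = sum(a != b for a, b in zip(cw, received))
        ll = d * math.log(noise_p) + (len(cw) - d) * math.log(1.0 - noise_p)
        weights.append(math.exp(ll / T))
    Z = sum(weights)
    marginals = [sum(w for w, cw in zip(weights, codewords) if cw[i] == 1) / Z
                 for i in range(len(received))]
    return [1 if m > 0.5 else 0 for m in marginals]

codebook = [[0, 0, 0], [1, 1, 1]]   # toy repetition code
mpm = finite_temperature_decode([1, 0, 1], codebook, noise_p=0.2, T=1.0)
print(mpm)  # -> [1, 1, 1]
```

The paper's result is that the bit error rate of this marginal decoding is minimized at a specific finite temperature matched to the channel noise (the Nishimori condition), not at T = 0.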
37 citations