Proceedings ArticleDOI

Viterbi Decoder Using Zynq-7000 AP-SoC

TL;DR: This paper presents an SoC-based hardware-software co-design approach to implementing the Viterbi decoder, together with the entire communication system comprising a random binary pattern generator, convolution encoder, QPSK modulator, QPSK demodulator and quantizer, built on a Zynq-7000 All Programmable SoC chip.
Abstract: Data transmission over the wireless transmission channel is adversely affected by attenuation, distortion, interference, and noise, which hamper the ability of the receiver to correctly receive the transmitted message signal. Thus, error detection and correction methods are implemented to mitigate these effects. The convolution encoder is one such channel encoding technique used at the transmitter end for deep-space and wireless communication, whereas at the receiver end the Viterbi decoder decodes the encoded data. The Viterbi algorithm is based on the principle of maximum likelihood: the optimal trellis path, i.e. the path followed at the encoder, is identified using the cumulative Hamming distance. This paper presents an SoC-based hardware-software co-design approach to implementing the Viterbi decoder along with the entire communication system, comprising a random binary pattern generator, convolution encoder, QPSK modulator, QPSK demodulator and quantizer, built on a Zynq-7000 All Programmable SoC chip. The blocks are designed in Verilog HDL (Hardware Description Language) using the Vivado IDE (Integrated Development Environment) 2017.4 by Xilinx and the HDL Coder toolbox of MATLAB 2018a by MathWorks. A Simulink model of the communication chain is implemented to simulate the design, considering the effect of AWGN (Additive White Gaussian Noise) in the channel, and the partial design is then translated for implementation on a Zynq-7000 AP SoC. This design can find potential applications in satellite communication and SDR (Software Defined Radio).
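As a concrete illustration of the decoding principle described in the abstract, the Python sketch below models a rate-1/2 convolution encoder and a hard-decision Viterbi decoder that selects the trellis path with the smallest cumulative Hamming distance. It is not the paper's Verilog design: the constraint length K = 3 and the octal generators (7, 5) are assumptions made here for illustration, and `conv_encode`, `hamming` and `viterbi_decode` are illustrative names.

```python
def conv_encode(bits, gens=(0b111, 0b101), K=3):
    """Rate-1/2 convolution encoder: two parity bits per input bit."""
    state = 0
    out = []
    for b in list(bits) + [0] * (K - 1):            # K-1 tail zeros flush the shift register
        state = ((state << 1) | b) & ((1 << K) - 1)
        for g in gens:
            out.append(bin(state & g).count("1") % 2)   # parity of the tapped register bits
    return out

def hamming(a, b):
    """Hamming distance between two equal-length bit sequences."""
    return sum(x != y for x, y in zip(a, b))

def viterbi_decode(rx, gens=(0b111, 0b101), K=3):
    """Hard-decision Viterbi decoder: keep, for each trellis state, the
    surviving path with the smallest cumulative Hamming distance."""
    n_states = 1 << (K - 1)
    metrics = {0: 0}                                 # encoder starts in the all-zero state
    paths = {0: []}
    for t in range(0, len(rx), 2):
        symbol = rx[t:t + 2]
        new_metrics, new_paths = {}, {}
        for s, m in metrics.items():
            for b in (0, 1):                         # hypothesised input bit
                reg = (s << 1) | b                   # full K-bit register contents
                nxt = reg & (n_states - 1)           # next trellis state
                expected = [bin(reg & g).count("1") % 2 for g in gens]
                cand = m + hamming(symbol, expected)            # branch metric
                if nxt not in new_metrics or cand < new_metrics[nxt]:
                    new_metrics[nxt] = cand          # add-compare-select
                    new_paths[nxt] = paths[s] + [b]
        metrics, paths = new_metrics, new_paths
    best = min(metrics, key=metrics.get)             # maximum-likelihood survivor
    return paths[best][:-(K - 1)]                    # drop the tail bits
```

With the assumed code (free distance 5), `viterbi_decode(conv_encode([1, 0, 1, 1, 0, 0, 1]))` returns the original seven bits, and still does so when up to two of the encoded bits are inverted before decoding, which is the error-correcting behaviour the abstract attributes to the decoder.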
Citations
Book ChapterDOI
01 Jan 2022
TL;DR: The Viterbi algorithm is an efficient method to decode convolution-encoded data using the concept of maximum likelihood estimation, as discussed by the authors; their decoder successfully decodes data for any constraint length and corrects errors of up to 4 bits.
Abstract: The Viterbi algorithm is an efficient method to decode convolution-encoded data using the concept of maximum likelihood estimation. This paper presents the design and FPGA implementation of a Viterbi decoder for satellite communication. The design is coded in Verilog HDL using the Vivado 2017.4 tool, with QuestaSim for behavioral and post-layout simulations. The decoder is designed to decode the output of a rate-1/2 convolution encoder. It successfully decodes data for any constraint length and corrects errors of up to 4 bits. The implementation is done on the Zynq-7000 development board. The maximum operating frequency achieved is 221.9 MHz with a power consumption of 37.62 mW.
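The rate-1/2 decoding and multi-bit error correction claimed here can be sanity-checked in software with the sketch functions given under the abstract above. The snippet below is a hypothetical test, not the chapter's Verilog testbench; `residual_errors` is an illustrative name, and how many flipped bits are actually corrected depends on where they fall relative to the code's free distance, so a figure such as "4 bits" is always tied to a particular code and error pattern.

```python
import random

def residual_errors(msg, n_flips, gens=(0b111, 0b101), K=3):
    """Encode msg, flip n_flips coded bits at random, decode, count bit errors."""
    coded = conv_encode(msg, gens, K)              # sketch functions defined further above
    for i in random.sample(range(len(coded)), n_flips):
        coded[i] ^= 1
    decoded = viterbi_decode(coded, gens, K)
    return sum(d != m for d, m in zip(decoded, msg))

msg = [random.randint(0, 1) for _ in range(64)]
print(residual_errors(msg, n_flips=4))             # often 0; clustered flips can exceed the code's power
```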
References
Journal ArticleDOI
TL;DR: The upper bound is obtained for a specific probabilistic nonsequential decoding algorithm which is shown to be asymptotically optimum for rates above R_0 and whose performance bears certain similarities to that of sequential decoding algorithms.
Abstract: The probability of error in decoding an optimal convolutional code transmitted over a memoryless channel is bounded from above and below as a function of the constraint length of the code. For all but pathological channels the bounds are asymptotically (exponentially) tight for rates above R_0, the computational cutoff rate of sequential decoding. As a function of constraint length the performance of optimal convolutional codes is shown to be superior to that of block codes of the same length, the relative improvement increasing with rate. The upper bound is obtained for a specific probabilistic nonsequential decoding algorithm which is shown to be asymptotically optimum for rates above R_0 and whose performance bears certain similarities to that of sequential decoding algorithms.
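For readers unfamiliar with R_0: as an illustrative special case (the reference itself treats general memoryless channels, so the channel model below is an assumption made here), the computational cutoff rate of a binary symmetric channel with crossover probability p has the closed form

$$ R_0 \;=\; 1 - \log_2\!\left(1 + 2\sqrt{p\,(1-p)}\right) \quad \text{bits per channel use,} $$

so the regime "rates above R_0" in which the bounds are exponentially tight shrinks as the channel becomes noisier (R_0 tends to 0 as p tends to 1/2).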

6,804 citations

01 Jan 1967

1,701 citations


"Viterbi Decoder Using Zynq-7000 AP-..." refers background or methods in this paper

  • ...Viterbi algorithm [12] is based on principles of maximum likelihood where the optimal trellis path is identified that is followed at the encoder using cumulative hamming distance....


  • ...Received convolution coded signal gets distorted when traveling through the noisy medium which acts as an input to the Viterbi decoder that works out the maximum likelihood of the pattern followed at encoder to decode this signal [12]....


Book
30 Jun 1981
TL;DR: This book covers the fundamental concepts of coding, group and block codes, soft-decision decoding, algebraic techniques for multiple error correction, convolutional code structure and Viterbi decoding, other convolutional decoding techniques, and system applications.
Abstract: 1. Fundamental Concepts of Coding. 2. Group Codes. 3. Simple Nonalgebraic Decoding Techniques for Group Codes. 4. Soft Decision Decoding of Block Codes. 5. Algebraic Techniques for Multiple Error Correction. 6. Convolutional Code Structure and Viterbi Decoding. 7. Other Convolutional Decoding Techniques. 8. System Applications. Appendix A: Code Generators for BCH Codes. Appendix B: Code Generators for Convolutional Codes (B.1 Viterbi Decoding; B.2 Table Look-up Decoding; B.3 Threshold Decoding; B.4 Sequential Decoding). References.

1,208 citations


"Viterbi Decoder Using Zynq-7000 AP-..." refers methods in this paper

  • ...3) Traceback: In the decoder, the Traceback [1] is the block which recovers the received data based on all the information from the ACS....

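In hardware terms, the ACS stage referred to in the quotation above stores one survivor decision bit per state per time step, and the traceback unit later walks those decisions backwards to release the decoded bits. The Python sketch below models just that traceback step; it assumes the decision bits are already available as `decisions[t][s]` (the bit identifying which predecessor of state `s` won the add-compare-select at step `t`), an arrangement chosen here for illustration rather than taken from the paper.

```python
def traceback(decisions, final_state, K=3):
    """Recover the decoded bits from per-state ACS survivor decisions.

    decisions[t][s] holds the bit that was shifted out of the winning
    predecessor of state s at step t, so it identifies which of the two
    possible predecessor states the ACS stage selected there.  States are
    the (K-1)-bit encoder memory with the newest bit in the LSB.
    """
    state = final_state
    bits = []
    for dec in reversed(decisions):
        bits.append(state & 1)                          # newest input bit sits in the state's LSB
        state = (state >> 1) | (dec[state] << (K - 2))  # rebuild the predecessor state
    bits.reverse()
    return bits
```

A real traceback unit performs this walk over a sliding window of survivor memory rather than over the whole block, but the state-reconstruction step is the same.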

Book
01 Jan 2002
TL;DR: This book covers linear block codes, Hamming, Golay and Reed-Muller codes, cyclic and BCH codes, Reed-Solomon codes, binary convolutional codes with Viterbi decoding, soft-decision decoding, iteratively decodable codes, and the combination of codes with digital modulation.
Abstract: Preface. Foreword. The ECC web site. 1. Introduction. 1.1 Error correcting coding: Basic concepts. 1.1.1 Block codes and convolutional codes. 1.1.2 Hamming distance, Hamming spheres and error correcting capability. 1.2 Linear block codes. 1.2.1 Generator and parity-check matrices. 1.2.2 The weight is the distance. 1.3 Encoding and decoding of linear block codes. 1.3.1 Encoding with G and H. 1.3.2 Standard array decoding. 1.3.3 Hamming spheres, decoding regions and the standard array. 1.4 Weight distribution and error performance. 1.4.1 Weight distribution and undetected error probability over a BSC. 1.4.2 Performance bounds over BSC, AWGN and fading channels. 1.5 General structure of a hard-decision decoder of linear codes. Problems. 2. Hamming, Golay and Reed-Muller codes. 2.1 Hamming codes. 2.1.1 Encoding and decoding procedures. 2.2 The binary Golay code. 2.2.1 Encoding. 2.2.2 Decoding. 2.2.3 Arithmetic decoding of the extended (24, 12, 8) Golay code. 2.3 Binary Reed-Muller codes. 2.3.1 Boolean polynomials and RM codes. 2.3.2 Finite geometries and majority-logic decoding. Problems. 3. Binary cyclic codes and BCH codes. 3.1 Binary cyclic codes. 3.1.1 Generator and parity-check polynomials. 3.1.2 The generator polynomial. 3.1.3 Encoding and decoding of binary cyclic codes. 3.1.4 The parity-check polynomial. 3.1.5 Shortened cyclic codes and CRC codes. 3.1.6 Fire codes. 3.2 General decoding of cyclic codes. 3.2.1 GF(2m) arithmetic. 3.3 Binary BCH codes. 3.3.1 BCH bound. 3.4 Polynomial codes. 3.5 Decoding of binary BCH codes. 3.5.1 General decoding algorithm for BCH codes. 3.5.2 The Berlekamp-Massey algorithm (BMA). 3.5.3 PGZ decoder. 3.5.4 Euclidean algorithm. 3.5.5 Chien search and error correction. 3.5.6 Errors-and-erasures decoding. 3.6 Weight distribution and performance bounds. 3.6.1 Error performance evaluation. Problems. 4. Nonbinary BCH codes: Reed-Solomon codes. 4.1 RS codes as polynomial codes. 4.2 From binary BCH to RS codes. 4.3 Decoding RS codes. 4.3.1 Remarks on decoding algorithms. 4.3.2 Errors-and-erasures decoding. 4.4 Weight distribution. Problems. 5. Binary convolutional codes. 5.1 Basic structure. 5.1.1 Recursive systematic convolutional codes. 5.1.2 Free distance. 5.2 Connections with block codes. 5.2.1 Zero-tail construction. 5.2.2 Direct-truncation construction. 5.2.3 Tail-biting construction. 5.2.4 Weight distributions. 5.3 Weight enumeration. 5.4 Performance bounds. 5.5 Decoding: Viterbi algorithm with Hamming metrics. 5.5.1 Maximum-likelihood decoding and metrics. 5.5.2 The Viterbi algorithm. 5.5.3 Implementation issues. 5.6 Punctured convolutional codes. 5.6.1 Implementation issues related to punctured convolutional codes. 5.6.2 RCPC codes. Problems. 6. Modifying and combining codes. 6.1 Modifying codes. 6.1.1 Shortening. 6.1.2 Extending. 6.1.3 Puncturing. 6.1.4 Augmenting, expurgating and lengthening. 6.2 Combining codes. 6.2.1 Time sharing of codes. 6.2.2 Direct sums of codes. 6.2.3 The |u|u + v|-construction and related techniques. 6.2.4 Products of codes. 6.2.5 Concatenated codes. 6.2.6 Generalized concatenated codes. 7. Soft-decision decoding. 7.1 Binary transmission over AWGN channels. 7.2 Viterbi algorithm with Euclidean metric. 7.3 Decoding binary linear block codes with a trellis. 7.4 The Chase algorithm. 7.5 Ordered statistics decoding. 7.6 Generalized minimum distance decoding. 7.6.1 Sufficient conditions for optimality. 7.7 List decoding. 7.8 Soft-output algorithms. 7.8.1 Soft-output Viterbi algorithm. 
7.8.2 Maximum-a posteriori (MAP) algorithm. 7.8.3 Log-MAP algorithm. 7.8.4 Max-Log-MAP algorithm. 7.8.5 Soft-output OSD algorithm. Problems. 8. Iteratively decodable codes. 8.1 Iterative decoding. 8.2 Product codes. 8.2.1 Parallel concatenation: Turbo codes. 8.2.2 Serial concatenation. 8.2.3 Block product codes. 8.3 Low-density parity-check codes. 8.3.1 Tanner graphs. 8.3.2 Iterative hard-decision decoding: The bit-flip algorithm. 8.3.3 Iterative probabilistic decoding: Belief propagation. Problems. 9. Combining codes and digital modulation. 9.1 Motivation. 9.1.1 Examples of signal sets. 9.1.2 Coded modulation. 9.1.3 Distance considerations. 9.2 Trellis-coded modulation (TCM). 9.2.1 Set partitioning and trellis mapping. 9.2.2 Maximum-likelihood. 9.2.3 Distance considerations and error performance. 9.2.4 Pragmatic TCM and two-stage decoding. 9.3 Multilevel coded modulation. 9.3.1 Constructions and multistage decoding. 9.3.2 Unequal error protection with MCM. 9.4 Bit-interleaved coded modulation. 9.4.1 Gray mapping. 9.4.2 Metric generation: De-mapping. 9.4.3 Interleaving. 9.5 Turbo trellis-coded modulation. 9.5.1 Pragmatic turbo TCM. 9.5.2 Turbo TCM with symbol interleaving. 9.5.3 Turbo TCM with bit interleaving. Problems. Appendix A: Weight distributions of extended BCH codes. A.1 Length 8. A.2 Length 16. A.3 Length 32. A.4 Length 64. A.5 Length 128. Bibliography. Index.

506 citations


"Viterbi Decoder Using Zynq-7000 AP-..." refers background in this paper

  • ...Among forward error correction (FEC) and error detection techniques [4], the latter are much simpler....


  • ...BMU maps the levels of code words into BMs according to their likelihood and BMs are the Hamming distances [4] between the received code words and expected branches....

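For a rate-1/2 code with hard decisions, the BMU described in the quotation above reduces to computing four Hamming distances per received symbol, one for each possible 2-bit branch word. The sketch below is illustrative, not the paper's design; with the soft-decision quantizer mentioned in the abstract the metric would instead weight the quantization levels.

```python
def branch_metrics(rx_pair):
    """Hard-decision BMU for a rate-1/2 code: Hamming distance from the
    received 2-bit symbol to each of the four possible branch words."""
    r0, r1 = rx_pair
    return {(b0, b1): (r0 != b0) + (r1 != b1)
            for b0 in (0, 1) for b1 in (0, 1)}

# Example: branch_metrics((1, 0)) -> {(0, 0): 1, (0, 1): 2, (1, 0): 0, (1, 1): 1}
```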

01 Feb 1993
TL;DR: A proposal for further research into the use of the Viterbi Algorithm in signature verification is presented; this is the authors' current area of research.
Abstract: This paper is a tutorial introduction to the Viterbi Algorithm, reinforced by an example use of the algorithm in the area of error correction in communications channels. Some extensions to the basic algorithm are also discussed briefly. Some of the many application areas where the Viterbi Algorithm has been used are considered, including its use in communications, target tracking, and pattern recognition problems. A proposal for further research into the use of the Viterbi Algorithm in signature verification is then presented; this is the authors' current area of research.

69 citations


"Viterbi Decoder Using Zynq-7000 AP-..." refers methods in this paper

  • ...There are different channel encoding techniques, but probably the most commonly used is the convolution encoder, and its counterpart at the receiver end is the Viterbi decoder [8]....
