Author

Takayuki Nozaki

Bio: Takayuki Nozaki is an academic researcher at Yamaguchi University. He has contributed to research on low-density parity-check (LDPC) codes and decoding methods, has an h-index of 8, and has co-authored 49 publications receiving 167 citations. His previous affiliations include Kanagawa University and the Tokyo Institute of Technology.

Papers
Proceedings Article
01 Jan 2014
TL;DR: This paper proposes a fountain code whose space decoding complexity is nearly equal to that of Raptor codes; simulation results show that the proposed fountain coding system outperforms the Raptor coding system in terms of the overhead for the received packets.
Abstract: Fountain codes based on non-binary low-density parity-check (LDPC) codes have good decoding performance when the number of source packets is finite. However, the space complexity of the decoding algorithm for fountain codes based on non-binary LDPC codes grows exponentially with the degree of the field extension. Zigzag decodable codes generate the output packets from the source packets by using shifts and exclusive OR, and are known to be efficiently decoded by the zigzag decoder. In this paper, by applying zigzag decodable coding to fountain codes, we propose a fountain code whose space decoding complexity is nearly equal to that of Raptor codes. Simulation results show that the proposed fountain coding system outperforms the Raptor coding system in terms of the overhead for the received packets.

15 citations
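The shift-and-XOR idea behind zigzag decodable codes can be illustrated with a minimal two-packet sketch. The function names, the two-packet setting, and the one-bit shift pattern here are illustrative assumptions, not the construction from the paper (which, like Raptor codes, draws many packets from a degree distribution):

```python
# Minimal sketch of zigzag-decodable encoding/decoding for two source
# packets of bits. Output y1 is the plain XOR of the sources; y2 XORs s1
# with a one-bit-shifted copy of s2, so the first bit of y2 exposes s1[0]
# and the decoder can "zigzag" between y2 and y1 to recover both packets.

def zz_encode(s1, s2):
    L = len(s1)
    y1 = [s1[i] ^ s2[i] for i in range(L)]           # unshifted combination
    y2 = [s1[0]]                                     # s2 shifted by one bit
    y2 += [s1[i] ^ s2[i - 1] for i in range(1, L)]
    y2.append(s2[L - 1])                             # tail of the shifted s2
    return y1, y2

def zz_decode(y1, y2):
    L = len(y1)
    s1, s2 = [0] * L, [0] * L
    s1[0] = y2[0]                     # exposed by the shift
    s2[0] = y1[0] ^ s1[0]
    for i in range(1, L):             # alternate ("zigzag") between packets
        s1[i] = y2[i] ^ s2[i - 1]
        s2[i] = y1[i] ^ s1[i]
    return s1, s2

s1, s2 = [1, 0, 1, 1], [0, 1, 1, 0]
assert zz_decode(*zz_encode(s1, s2)) == (s1, s2)
```

The key point of the sketch is that the shift makes one source bit appear alone in an output packet, and each recovered bit unlocks the next one, so decoding needs only bitwise XOR and no finite-field arithmetic.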

Journal ArticleDOI
TL;DR: This paper proposes a fountain coding system with lower space decoding complexity and a lower decoding erasure rate than Raptor coding systems, and analyzes the overhead for the received packets, the decoding erasure rate, the decoding complexity, and the asymptotic overhead of the proposed fountain code.
Abstract: This paper proposes a fountain coding system which has lower space decoding complexity and a lower decoding erasure rate than Raptor coding systems. The main idea of the proposed fountain code is to employ shifts and exclusive OR to generate the output packets. This technique is known as the zigzag decodable code, which is efficiently decoded by the zigzag decoder. In other words, this paper proposes a fountain code based on the zigzag decodable code. Moreover, we analyze the overhead for the received packets, the decoding erasure rate, the decoding complexity, and the asymptotic overhead of the proposed fountain code. As a result, we show that the proposed fountain code outperforms Raptor codes in terms of the overhead and the decoding erasure rate. Simulation results show that the proposed fountain coding system outperforms the Raptor coding system in terms of the overhead and the space decoding complexity.

10 citations

Journal ArticleDOI
TL;DR: This paper presents the analytical solution of the covariance evolution for irregular LDPC code ensembles, which yields the scaling parameter used to estimate block erasure probabilities of finite-length low-density parity-check codes.
Abstract: The scaling law developed by Amraoui et al. is a powerful technique to estimate the block erasure probabilities of finite-length low-density parity-check (LDPC) codes. Solving a system of differential equations called covariance evolution, one can obtain the scaling parameter. However, the covariance evolution had not been analytically solved. In this paper, we present the analytical solution of the covariance evolution for irregular LDPC code ensembles.

10 citations

Proceedings ArticleDOI
05 Jun 2011
TL;DR: A necessary and sufficient condition for successful decoding of zigzag cycle codes over the MBIOS channel by the BP decoder is clarified, and the non-binary LDPC code ensemble is expurgated to analyze and lower the error floor.
Abstract: In this paper, we investigate the error floors of non-binary low-density parity-check (LDPC) codes transmitted over memoryless binary-input output-symmetric (MBIOS) channels. We clarify a necessary and sufficient condition for successful decoding of zigzag cycle codes over the MBIOS channel by the BP decoder. Using this condition, we expurgate the non-binary LDPC code ensemble to analyze and lower the error floor. Finally, we show upper and lower bounds on the error floors of the expurgated LDPC code ensemble over the MBIOS channel.

10 citations


Cited by
Book ChapterDOI
01 Jan 1993
TL;DR: It is shown that by using a sufficiently large number of relays in the proper manner, circuits can be built which are arbitrarily reliable, regardless of how unreliable the original relays are.
Abstract: An investigation is made of relays whose reliability can be described in simple terms by means of probabilities. It is shown that by using a sufficiently large number of these relays in the proper manner, circuits can be built which are arbitrarily reliable, regardless of how unreliable the original relays are. Various properties of these circuits are elucidated.

256 citations
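The redundancy argument can be seen numerically with the classic series-parallel "quad": replace each relay by four relays, two series pairs wired in parallel. This is an illustrative construction in the spirit of the abstract, not necessarily the exact circuits analyzed in the paper:

```python
# A relay closes correctly with probability p. A "quad" of four relays
# (two series pairs in parallel) works if either series pair works:
#   h(p) = 1 - (1 - p**2)**2
# The map h has a fixed point at (sqrt(5) - 1) / 2 ~ 0.618; for any p
# above it, iterating the construction (building quads out of quads)
# drives the circuit's reliability toward 1, however unreliable growth
# in component count that requires.

def quad(p):
    return 1.0 - (1.0 - p * p) ** 2

p = 0.9
for level in range(4):
    p = quad(p)
    print(f"after level {level + 1}: reliability {p:.8f}")
```

Each level of redundancy quadruples the relay count but squares-down the failure probability, which is the "arbitrarily reliable circuits from unreliable relays" phenomenon in miniature.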

Book
04 Nov 2013
TL;DR: This monograph focuses on some of the key modern mathematical tools that are used for the derivation of concentration inequalities, on their links to information theory, and on their various applications to communications and coding.
Abstract: Concentration inequalities have been the subject of exciting developments during the last two decades, and have been intensively studied and used as a powerful tool in various areas. These include convex geometry, functional analysis, statistical physics, mathematical statistics, pure and applied probability theory (e.g., concentration of measure phenomena in random graphs, random matrices, and percolation), information theory, theoretical computer science, learning theory, and dynamical systems.

This monograph focuses on some of the key modern mathematical tools that are used for the derivation of concentration inequalities, on their links to information theory, and on their various applications to communications and coding. In addition to being a survey, this monograph also includes various new recent results derived by the authors.

The first part of the monograph introduces classical concentration inequalities for martingales, as well as some recent refinements and extensions. The power and versatility of the martingale approach is exemplified in the context of codes defined on graphs and iterative decoding algorithms, as well as codes for wireless communication.

The second part of the monograph introduces the entropy method, an information-theoretic technique for deriving concentration inequalities for functions of many independent random variables. The basic ingredients of the entropy method are discussed first in conjunction with the closely related topic of logarithmic Sobolev inequalities, which are typical of the so-called functional approach to studying the concentration of measure phenomenon. The discussion of logarithmic Sobolev inequalities is complemented by a related viewpoint based on probability in metric spaces. This viewpoint centers around the so-called transportation-cost inequalities, whose roots are in information theory.
Some representative results on concentration for dependent random variables are briefly summarized, with emphasis on their connections to the entropy method. Finally, we discuss several applications of the entropy method and related information-theoretic tools to problems in communications and coding. These include strong converses, empirical distributions of good channel codes with non-vanishing error probability, and an information-theoretic converse for concentration of measure.

211 citations
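A concrete instance of the martingale inequalities surveyed in this monograph is the Azuma-Hoeffding bound for a sum of n independent ±1 steps, P(|S_n| ≥ t) ≤ 2·exp(−t²/(2n)). The following Monte-Carlo sketch (an illustration of the general technique, not an example from the monograph) checks the bound empirically:

```python
import math
import random

# Azuma-Hoeffding for S_n = X_1 + ... + X_n with X_i in {-1, +1}:
# S_n is a martingale with bounded differences |S_k - S_{k-1}| <= 1, so
#   P(|S_n| >= t) <= 2 * exp(-t**2 / (2 * n)).

def hoeffding_bound(n, t):
    return 2.0 * math.exp(-t * t / (2.0 * n))

def empirical_tail(n, t, trials, seed=0):
    """Fraction of seeded random walks with |S_n| >= t."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        s = sum(rng.choice((-1, 1)) for _ in range(n))
        if abs(s) >= t:
            hits += 1
    return hits / trials

n, t = 100, 30
# The empirical tail frequency sits below the analytic bound.
assert empirical_tail(n, t, trials=2000) <= hoeffding_bound(n, t)
```

The bound is loose here (a three-standard-deviation event), which is typical: martingale inequalities trade tightness for generality, holding for any bounded-difference function rather than just sums.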