Journal ArticleDOI
Network information flow
TL;DR: This work reveals that it is in general not optimal to regard the information to be multicast as a "fluid" which can simply be routed or replicated, and that by employing coding at the nodes, which the work refers to as network coding, bandwidth can in general be saved.
Abstract: We introduce a new class of problems called network information flow which is inspired by computer network applications. Consider a point-to-point communication network on which a number of information sources are to be multicast to certain sets of destinations. We assume that the information sources are mutually independent. The problem is to characterize the admissible coding rate region. This model subsumes all previously studied models along the same line. We study the problem with one information source, and we have obtained a simple characterization of the admissible coding rate region. Our result can be regarded as the max-flow min-cut theorem for network information flow. Contrary to one's intuition, our work reveals that it is in general not optimal to regard the information to be multicast as a "fluid" which can simply be routed or replicated. Rather, by employing coding at the nodes, which we refer to as network coding, bandwidth can in general be saved. This finding may have significant impact on the future design of switching systems.
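The bandwidth saving from coding at the nodes is usually illustrated with the well-known butterfly network: a source multicasts two bits to two sinks over unit-capacity edges, and the bottleneck node forwards the XOR of the bits instead of routing one of them. The sketch below is illustrative (the abstract itself gives no example); the function name and topology encoding are ours.

```python
# Butterfly network sketch: source s multicasts bits a and b to sinks t1 and
# t2 over unit-capacity edges. Pure routing forces the bottleneck edge to
# carry either a or b, starving one sink; sending a XOR b instead lets each
# sink combine it with the bit it receives directly and decode both.

def butterfly_multicast(a, b):
    """Return the (a, b) pair recovered at each sink when the middle node codes."""
    coded = a ^ b            # network coding at the bottleneck node
    # t1 receives a directly and the coded bit: a XOR (a XOR b) = b.
    t1 = (a, a ^ coded)
    # t2 receives b directly and the coded bit: (a XOR b) XOR b = a.
    t2 = (coded ^ b, b)
    return t1, t2

for a in (0, 1):
    for b in (0, 1):
        t1, t2 = butterfly_multicast(a, b)
        assert t1 == (a, b) and t2 == (a, b)
```

With routing alone, one network use cannot deliver both bits to both sinks; with the single XOR, both sinks recover both bits, matching the max-flow bound of the paper.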
Citations
Journal ArticleDOI
Decentralized erasure codes for distributed networked storage
TL;DR: In this article, the problem of constructing an erasure code for storage over a network when the data sources are distributed is considered, and it is shown that decentralized erasure codes are optimally sparse, and lead to reduced communication, storage and computation cost over random linear coding.
Proceedings ArticleDOI
Iterative Network and Channel Decoding for the Two-Way Relay Channel
TL;DR: A joint network-channel coding method is described for this channel model, where channel codes are used at both users and a network code is used at the relay; together these form a distributed turbo code, referred to as a turbo network code.
Journal ArticleDOI
Networks, Matroids, and Non-Shannon Information Inequalities
TL;DR: The Vámos network is constructed, and it is proved that Shannon-type information inequalities are insufficient even for computing network coding capacities of multiple-unicast networks.
Proceedings ArticleDOI
Achieving minimum-cost multicast: a decentralized approach based on network coding
TL;DR: Decentralized algorithms that compute minimum-cost subgraphs for establishing multicast connections in networks that use coding, coupled with existing decentralized schemes for constructing network codes, constitute a fully decentralized approach for achieving minimum-cost multicast.
Posted Content
Reliable Physical Layer Network Coding
Bobak Nazer,Michael Gastpar +1 more
TL;DR: In this paper, the authors explore the core ideas behind linear network coding and the possibilities it offers for communication over interference-limited wireless networks, and present some simple examples of such techniques.
References
Book
Elements of information theory
Thomas M. Cover,Joy A. Thomas +1 more
TL;DR: The authors examine the role of entropy, inequality, and randomness in the design and construction of codes.
Journal ArticleDOI
Factor graphs and the sum-product algorithm
TL;DR: A generic message-passing algorithm, the sum-product algorithm, operates on a factor graph and computes, either exactly or approximately, various marginal functions derived from the global function.
Journal ArticleDOI
Noiseless coding of correlated information sources
David Slepian,Jack K. Wolf +1 more
TL;DR: The minimum numbers of bits per character, R_X and R_Y, needed to encode two correlated sequences so that they can be faithfully reproduced are determined under a variety of assumptions regarding the encoders and decoders.
Journal ArticleDOI
Linear network coding
TL;DR: This work formulates the multicast problem and proves that linear coding suffices to achieve the optimum, namely the max-flow from the source to each receiving node.
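The decoding side of linear network coding can be sketched concretely: a receiver that collects as many linearly independent GF(2) combinations of the source symbols as the max-flow allows recovers them by Gaussian elimination. This is an illustrative sketch, not code from the cited paper; the coding vectors and symbol values below are made up, and symbols are single bits for brevity.

```python
# A receiver collects packets of the form (coding vector, payload), each a
# GF(2) linear combination of the source symbols. Given independent coding
# vectors, Gaussian elimination over GF(2) recovers the symbols.

def gf2_solve(rows):
    """Solve the augmented system [A | b] over GF(2); rows are bit lists."""
    n = len(rows)
    for col in range(n):
        pivot = next(r for r in range(col, n) if rows[r][col])
        rows[col], rows[pivot] = rows[pivot], rows[col]
        for r in range(n):
            if r != col and rows[r][col]:
                rows[r] = [a ^ b for a, b in zip(rows[r], rows[col])]
    return [row[-1] for row in rows]

# Source symbols s and three received packets with independent coding
# vectors; each payload is the GF(2) inner product of its vector with s.
s = [1, 0, 1]
vectors = [[1, 0, 0], [1, 1, 0], [0, 1, 1]]
payloads = [sum(v[i] & s[i] for i in range(3)) % 2 for v in vectors]

decoded = gf2_solve([v + [p] for v, p in zip(vectors, payloads)])
assert decoded == s
```

Intermediate nodes never need to know the message: they only emit linear combinations, and any receiver reached by max-flow-many independent combinations can invert the system.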
Journal ArticleDOI
Achievable rates for multiple descriptions
Abbas El Gamal,Thomas M. Cover +1 more
TL;DR: These rates, expressed in terms of Shannon mutual informations between auxiliary random variables, are shown to be optimal for deterministic distortion measures.