Journal ArticleDOI

Network information flow

TLDR
This work reveals that it is in general not optimal to regard the information to be multicast as a "fluid" that can simply be routed or replicated; by employing coding at the nodes, referred to as network coding, bandwidth can in general be saved.
Abstract
We introduce a new class of problems called network information flow which is inspired by computer network applications. Consider a point-to-point communication network on which a number of information sources are to be multicast to certain sets of destinations. We assume that the information sources are mutually independent. The problem is to characterize the admissible coding rate region. This model subsumes all previously studied models along the same line. We study the problem with one information source, and we have obtained a simple characterization of the admissible coding rate region. Our result can be regarded as the max-flow min-cut theorem for network information flow. Contrary to one's intuition, our work reveals that it is in general not optimal to regard the information to be multicast as a "fluid" which can simply be routed or replicated. Rather, by employing coding at the nodes, which we refer to as network coding, bandwidth can in general be saved. This finding may have significant impact on future design of switching systems.
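The bandwidth saving from coding at intermediate nodes is usually illustrated with the butterfly network. Below is a minimal Python sketch; the topology, node names, and function name are illustrative, not taken from the paper's figures:

```python
# Butterfly network: source s must multicast two bits (b1, b2) to sinks t1 and t2.
# Every edge carries one bit per use. With routing alone, the shared middle edge
# can forward only one of the two bits, so one sink falls short; with coding,
# the middle node sends b1 XOR b2 and both sinks recover both bits.

def multicast_with_coding(b1: int, b2: int):
    a_out = b1             # node a forwards b1 to t1 and to the middle node m
    b_out = b2             # node b forwards b2 to t2 and to m
    m_out = a_out ^ b_out  # node m codes: sends b1 XOR b2 on the bottleneck edge

    # Sink t1 hears b1 directly and (b1 XOR b2) via m; it recovers b2 by XOR.
    t1 = (a_out, a_out ^ m_out)
    # Sink t2 hears b2 directly and (b1 XOR b2) via m; it recovers b1 by XOR.
    t2 = (b_out ^ m_out, b_out)
    return t1, t2

# Both sinks receive both bits in a single network use:
for b1 in (0, 1):
    for b2 in (0, 1):
        t1, t2 = multicast_with_coding(b1, b2)
        assert t1 == (b1, b2) and t2 == (b1, b2)
```

With pure routing the middle edge can replicate only one of the two bits per use, so one sink would need an extra network use; the XOR lets both sinks decode at multicast rate 2.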


Citations
Proceedings ArticleDOI

Quantitative information flow as network flow capacity

TL;DR: A new technique is presented for determining how much information about a program's secret inputs is revealed by its public outputs; it achieves a more precise quantitative result by computing a maximum flow of information between the inputs and outputs.
Journal ArticleDOI

Universal Secure Network Coding via Rank-Metric Codes

TL;DR: The problem of securing a network coding communication system against an eavesdropper is considered, and a coding scheme is proposed that achieves the maximum possible rate of n - μ packets, shown to be optimal under the assumption of zero-error communication.
Journal ArticleDOI

Feedback Capacity of the Gaussian Interference Channel to Within 2 Bits

TL;DR: In this article, the capacity region of the two-user Gaussian IC with feedback was characterized to within 2 bits/s/Hz, and the symmetric capacity to within 1 bit/s/Hz.
Journal ArticleDOI

Network information flow with correlated sources

TL;DR: In this paper, it is shown that perfect reconstruction is possible if and only if H(U_S | U_{S^c}) < Σ_{i∈S, j∈S^c} C_ij for all S ⊆ {0, …, M} with S ≠ ∅ and 0 ∉ S.
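For intuition, a cut condition of this form can be checked numerically in a simplified star topology where each source has a direct link to the sink. The joint pmf, the capacities, and the helper names below are illustrative assumptions, not the paper's general network model:

```python
from itertools import combinations
from math import log2

# Simplified cut condition: reconstruction of correlated sources at a single
# sink is feasible iff, for every nonempty subset S of sources, the conditional
# entropy H(U_S | U_{S^c}) is below the total capacity leaving S toward the sink.

joint = {(0, 0): 0.45, (0, 1): 0.05, (1, 0): 0.05, (1, 1): 0.45}
cap = {1: 0.8, 2: 0.8}  # capacity of the direct link from source i to the sink

def entropy(pmf):
    return -sum(p * log2(p) for p in pmf.values() if p > 0)

def marginal(pmf, idx):
    out = {}
    for outcome, p in pmf.items():
        key = tuple(outcome[i] for i in idx)
        out[key] = out.get(key, 0.0) + p
    return out

def cut_condition_holds(pmf, cap):
    sources = sorted(cap)
    n = len(sources)
    H_all = entropy(pmf)
    for k in range(1, n + 1):
        for S in combinations(range(n), k):
            Sc = [i for i in range(n) if i not in S]
            # H(U_S | U_Sc) = H(U_all) - H(U_Sc)
            h_cond = H_all - (entropy(marginal(pmf, Sc)) if Sc else 0.0)
            if h_cond >= sum(cap[sources[i]] for i in S):
                return False
    return True

assert cut_condition_holds(joint, cap)
```

Shrinking the capacities (e.g. to 0.4 per link) makes the per-source constraint H(U_i | U_j) < C_i fail for this pmf, so the check returns False.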
Proceedings ArticleDOI

Adaptive network coding and scheduling for maximizing throughput in wireless networks

TL;DR: This paper proposes a general framework to develop optimal and adaptive joint network coding and scheduling schemes, and applies this framework to two different network coding architectures: COPE, a scheme recently proposed in [7], and XOR-Sym, a new scheme presented here.
References
Book

Elements of information theory

TL;DR: The author examines the role of entropy, inequality, and randomness in the design and construction of codes in a rapidly changing environment.
Journal ArticleDOI

Factor graphs and the sum-product algorithm

TL;DR: A generic message-passing algorithm, the sum-product algorithm, which operates in a factor graph and computes, either exactly or approximately, various marginal functions derived from the global function.
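On a tiny chain factor graph the sum-product computation reduces to a single message; a minimal sketch with illustrative factor values, verified against brute-force marginalization:

```python
# Sum-product on the chain factor graph p(x1, x2) ∝ f1(x1) f2(x1, x2) f3(x2),
# with binary variables. Messages are vectors indexed by variable value.
f1 = [0.6, 0.4]                  # unary factor on x1
f3 = [0.3, 0.7]                  # unary factor on x2
f2 = [[0.9, 0.1], [0.2, 0.8]]    # pairwise factor f2[x1][x2]

# Message from factor f2 to x1: sum over x2 of f2(x1, x2) * (message f3 -> x2).
m_f2_to_x1 = [sum(f2[a][b] * f3[b] for b in (0, 1)) for a in (0, 1)]
belief_x1 = [f1[a] * m_f2_to_x1[a] for a in (0, 1)]
Z = sum(belief_x1)
marginal_x1 = [v / Z for v in belief_x1]

# Brute-force check: the message-passing marginal matches direct summation.
joint = [[f1[a] * f2[a][b] * f3[b] for b in (0, 1)] for a in (0, 1)]
Zj = sum(sum(row) for row in joint)
brute = [sum(joint[a]) / Zj for a in (0, 1)]
assert all(abs(m - b) < 1e-12 for m, b in zip(marginal_x1, brute))
```

On tree-structured graphs this message schedule gives exact marginals; on graphs with cycles the same updates run iteratively as loopy belief propagation and are only approximate.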
Journal ArticleDOI

Noiseless coding of correlated information sources

TL;DR: The minimum number of bits per character R_X and R_Y needed to encode these sequences so that they can be faithfully reproduced under a variety of assumptions regarding the encoders and decoders is determined.
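The Slepian-Wolf bounds on those rates can be computed for a toy joint distribution; the pmf below is an illustrative assumption, not data from the paper:

```python
from math import log2

# Joint pmf of (X, Y); values are illustrative.
p = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

def H(pmf):
    """Entropy in bits of a pmf given as {outcome: probability}."""
    return -sum(q * log2(q) for q in pmf.values() if q > 0)

H_XY = H(p)
H_X = H({x: sum(q for (a, _), q in p.items() if a == x) for x in (0, 1)})
H_Y = H({y: sum(q for (_, b), q in p.items() if b == y) for y in (0, 1)})

# Slepian-Wolf region: R_X >= H(X|Y), R_Y >= H(Y|X), R_X + R_Y >= H(X, Y).
H_X_given_Y = H_XY - H_Y
H_Y_given_X = H_XY - H_X
```

For this pmf each marginal needs a full bit alone (H_X = H_Y = 1), yet the sum rate H(X, Y) ≈ 1.72 bits is below H_X + H_Y = 2, which is the gain from exploiting correlation.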
Journal ArticleDOI

Linear network coding

TL;DR: This work formulates this multicast problem and proves that linear coding suffices to achieve the optimum, which is the max-flow from the source to each receiving node.
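Under linear network coding, decoding at a sink amounts to solving a linear system over a finite field. A sketch over GF(2) with illustrative coefficient vectors (the solver and its name are assumptions, not the paper's construction):

```python
# Linear network coding over GF(2): each edge carries a linear combination of
# the source bits, and a sink decodes by inverting the coefficient matrix of
# the combinations it received.

def solve_gf2(rows, rhs):
    """Solve A x = b over GF(2) by Gaussian elimination.
    rows: list of coefficient lists (assumed invertible); rhs: received bits."""
    n = len(rows)
    A = [row[:] + [b] for row, b in zip(rows, rhs)]  # augmented matrix
    for col in range(n):
        pivot = next(r for r in range(col, n) if A[r][col])
        A[col], A[pivot] = A[pivot], A[col]
        for r in range(n):
            if r != col and A[r][col]:
                A[r] = [x ^ y for x, y in zip(A[r], A[col])]
    return [A[r][n] for r in range(n)]

# A sink with max-flow 2 receives two combinations of source bits (x1, x2):
# one edge carries x1 (coefficients [1, 0]), the other x1 XOR x2 ([1, 1]).
x1, x2 = 1, 0
received = [x1, x1 ^ x2]
assert solve_gf2([[1, 0], [1, 1]], received) == [x1, x2]
```

The result cited here says such coefficient assignments can always be chosen so that every sink's matrix is invertible whenever the max-flow to that sink is at least the source rate.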
Journal ArticleDOI

Achievable rates for multiple descriptions

TL;DR: These rates are shown to be optimal for deterministic distortion measures, expressed in terms of random variables and Shannon mutual information.