Journal ArticleDOI

Edge-Cut Bounds on Network Coding Rates

01 Mar 2006-Journal of Network and Systems Management (Springer US)-Vol. 14, Iss: 1, pp 49-67
TL;DR: A new bound on communication rates is developed that applies to network coding, a promising active network application in which processors transmit packets that are general functions, for example a bit-wise XOR, of selected received packets.
Abstract: Active networks are network architectures with processors that are capable of executing code carried by the packets passing through them. A critical network management concern is the optimization of such networks and tight bounds on their performance serve as useful design benchmarks. A new bound on communication rates is developed that applies to network coding, which is a promising active network application that has processors transmit packets that are general functions, for example a bit-wise XOR, of selected received packets. The bound generalizes an edge-cut bound on routing rates by progressively removing edges from the network graph and checking whether certain strengthened d-separation conditions are satisfied. The bound improves on the cut-set bound and its efficacy is demonstrated by showing that routing is rate-optimal for some commonly cited examples in the networking literature.
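The strengthened bound itself needs the paper's d-separation machinery, but the classical edge-cut bound on routing that it generalizes is simple: if deleting an edge set disconnects every source in a chosen demand set from its sink, then the total capacity of the deleted edges upper-bounds the sum of those sessions' routing rates. The following is a minimal brute-force sketch of that classical routing bound only; the function names and the toy network are illustrative, not taken from the paper.

```python
# Minimal sketch (not the paper's algorithm) of the classical edge-cut bound on
# routing rates: if removing an edge set disconnects source s_i from sink t_i for
# every listed session, the total capacity of the removed edges upper-bounds the
# sum of those sessions' routing rates. The paper strengthens this so that, under
# additional d-separation-style conditions, the bound also applies to network coding.
from itertools import combinations
from collections import deque

def reachable(nodes, edges, src, dst):
    """BFS reachability over a directed edge list."""
    adj = {u: [] for u in nodes}
    for u, v in edges:
        adj[u].append(v)
    seen, queue = {src}, deque([src])
    while queue:
        u = queue.popleft()
        if u == dst:
            return True
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                queue.append(v)
    return False

def edge_cut_routing_bound(nodes, capacity, sessions, max_cut_size=4):
    """Smallest edge-cut bound on the sum of the given sessions' routing rates.

    capacity: dict mapping a directed edge (u, v) to its capacity
    sessions: list of (source, sink) pairs that the cut must all disconnect
    """
    edges = list(capacity)
    best = float("inf")
    for k in range(1, max_cut_size + 1):
        for cut in combinations(edges, k):
            remaining = [e for e in edges if e not in cut]
            if all(not reachable(nodes, remaining, s, t) for s, t in sessions):
                best = min(best, sum(capacity[e] for e in cut))
    return best

# Toy example: a hypothetical 4-node network with unit-capacity edges.
nodes = [1, 2, 3, 4]
capacity = {(1, 2): 1, (2, 3): 1, (2, 4): 1, (3, 4): 1}
print(edge_cut_routing_bound(nodes, capacity, [(1, 4)]))  # -> 1
```

The paper's contribution is to identify conditions on the graph that remains after the edge removal under which the same number also bounds network coding rates, not just routing rates.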


Citations
Book
16 Jan 2012
TL;DR: This book gives a comprehensive treatment of network information theory and its applications, providing the first unified coverage of both classical and recent results, including successive cancellation and superposition coding, MIMO wireless communication, network coding, and cooperative relaying.
Abstract: This comprehensive treatment of network information theory and its applications provides the first unified coverage of both classical and recent results. With an approach that balances the introduction of new models and new coding techniques, readers are guided through Shannon's point-to-point information theory, single-hop networks, multihop networks, and extensions to distributed computing, secrecy, wireless communication, and networking. Elementary mathematical tools and techniques are used throughout, requiring only basic knowledge of probability, whilst unified proofs of coding theorems are based on a few simple lemmas, making the text accessible to newcomers. Key topics covered include successive cancellation and superposition coding, MIMO wireless communication, network coding, and cooperative relaying. Also covered are feedback and interactive communication, capacity approximations and scaling laws, and asynchronous and random access channels. This book is ideal for use in the classroom, for self-study, and as a reference for researchers and engineers in industry and academia.

2,442 citations

Book
01 Jun 2007
TL;DR: This article reviews progress in cooperative communication networks and intends its presentation to be a tutorial for the reader who is familiar with information theory concepts but has not actively followed the field.
Abstract: This article reviews progress in cooperative communication networks. Our survey is by no means exhaustive. Instead, we assemble a representative sample of recent results to serve as a roadmap for the area. Our emphasis is on wireless networks, but many of the results apply to cooperation in wireline networks and mixed wireless/wireline networks. We intend our presentation to be a tutorial for the reader who is familiar with information theory concepts but has not actively followed the field. For the active researcher, this contribution should serve as a useful digest of significant results. This article is meant to encourage readers to find new ways to apply the ideas of network cooperation and should make the area sufficiently accessible to network designers to contribute to the advancement of networking practice.

334 citations

Book
Gerhard Kramer
25 Jun 2008
TL;DR: This survey builds up knowledge on random coding, binning, superposition coding, and capacity converses by introducing progressively more sophisticated tools for a selection of source and channel models.
Abstract: This survey reviews fundamental concepts of multi-user information theory. Starting with typical sequences, the survey builds up knowledge on random coding, binning, superposition coding, and capacity converses by introducing progressively more sophisticated tools for a selection of source and channel models. The problems addressed include: Source Coding; Rate-Distortion and Multiple Descriptions; Capacity-Cost; The Slepian–Wolf Problem; The Wyner-Ziv Problem; The Gelfand-Pinsker Problem; The Broadcast Channel; The Multiaccess Channel; The Relay Channel; The Multiple Relay Channel; and The Multiaccess Channel with Generalized Feedback. The survey also includes a review of basic probability and information theory.

290 citations

Journal ArticleDOI
TL;DR: The Vamos network is constructed, and it is proved that Shannon-type information inequalities are insufficient even for computing network coding capacities of multiple-unicast networks.
Abstract: We define a class of networks, called matroidal networks, which includes as special cases all scalar-linearly solvable networks, and in particular solvable multicast networks. We then present a method for constructing matroidal networks from known matroids. We specifically construct networks that play an important role in proving results in the literature, such as the insufficiency of linear network coding and the unachievability of network coding capacity. We also construct a new network, from the Vamos matroid, which we call the Vamos network, and use it to prove that Shannon-type information inequalities are in general not sufficient for computing network coding capacities. To accomplish this, we obtain a capacity upper bound for the Vamos network using a non-Shannon-type information inequality discovered in 1998 by Zhang and Yeung, and then show that it is smaller than any such bound derived from Shannon-type information inequalities. This is the first application of a non-Shannon-type inequality to network coding. We also compute the exact routing capacity and linear coding capacity of the Vamos network. Finally, using a variation of the Vamos network, we prove that Shannon-type information inequalities are insufficient even for computing network coding capacities of multiple-unicast networks.
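For reference, the 1998 Zhang-Yeung result mentioned above is, in its commonly quoted form, the following inequality on any four discrete random variables A, B, C, D; it is "non-Shannon-type" because it cannot be derived from the basic Shannon (polymatroid) inequalities alone:

$$2I(C;D) \;\le\; I(A;B) + I(A;C,D) + 3I(C;D \mid A) + I(C;D \mid B).$$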

279 citations

Book
01 Nov 2012
TL;DR: A review covering the concept of cognitive radio, the capacity of cognitive radio networks, and propagation issues for cognitive radio.
Abstract: Widely regarded as one of the most promising emerging technologies for driving the future development of wireless communications, cognitive radio has the potential to mitigate the problem of increasing radio spectrum scarcity through dynamic spectrum allocation. Drawing on fundamental elements of information theory, network theory, propagation, optimisation and signal processing, a team of leading experts present a systematic treatment of the core physical and networking principles of cognitive radio and explore key design considerations for the development of new cognitive radio systems. Containing all the underlying principles you need to develop practical applications in cognitive radio, this book is an essential reference for students, researchers and practitioners alike in the field of wireless communications and signal processing.

236 citations

References
Book
01 Jan 1991
TL;DR: The authors examine the role of entropy, information inequalities, and randomness in the design and construction of codes.
Abstract: Preface to the Second Edition. Preface to the First Edition. Acknowledgments for the Second Edition. Acknowledgments for the First Edition. 1. Introduction and Preview. 1.1 Preview of the Book. 2. Entropy, Relative Entropy, and Mutual Information. 2.1 Entropy. 2.2 Joint Entropy and Conditional Entropy. 2.3 Relative Entropy and Mutual Information. 2.4 Relationship Between Entropy and Mutual Information. 2.5 Chain Rules for Entropy, Relative Entropy, and Mutual Information. 2.6 Jensen's Inequality and Its Consequences. 2.7 Log Sum Inequality and Its Applications. 2.8 Data-Processing Inequality. 2.9 Sufficient Statistics. 2.10 Fano's Inequality. Summary. Problems. Historical Notes. 3. Asymptotic Equipartition Property. 3.1 Asymptotic Equipartition Property Theorem. 3.2 Consequences of the AEP: Data Compression. 3.3 High-Probability Sets and the Typical Set. Summary. Problems. Historical Notes. 4. Entropy Rates of a Stochastic Process. 4.1 Markov Chains. 4.2 Entropy Rate. 4.3 Example: Entropy Rate of a Random Walk on a Weighted Graph. 4.4 Second Law of Thermodynamics. 4.5 Functions of Markov Chains. Summary. Problems. Historical Notes. 5. Data Compression. 5.1 Examples of Codes. 5.2 Kraft Inequality. 5.3 Optimal Codes. 5.4 Bounds on the Optimal Code Length. 5.5 Kraft Inequality for Uniquely Decodable Codes. 5.6 Huffman Codes. 5.7 Some Comments on Huffman Codes. 5.8 Optimality of Huffman Codes. 5.9 Shannon-Fano-Elias Coding. 5.10 Competitive Optimality of the Shannon Code. 5.11 Generation of Discrete Distributions from Fair Coins. Summary. Problems. Historical Notes. 6. Gambling and Data Compression. 6.1 The Horse Race. 6.2 Gambling and Side Information. 6.3 Dependent Horse Races and Entropy Rate. 6.4 The Entropy of English. 6.5 Data Compression and Gambling. 6.6 Gambling Estimate of the Entropy of English. Summary. Problems. Historical Notes. 7. Channel Capacity. 7.1 Examples of Channel Capacity. 7.2 Symmetric Channels. 7.3 Properties of Channel Capacity. 7.4 Preview of the Channel Coding Theorem. 7.5 Definitions. 7.6 Jointly Typical Sequences. 7.7 Channel Coding Theorem. 7.8 Zero-Error Codes. 7.9 Fano's Inequality and the Converse to the Coding Theorem. 7.10 Equality in the Converse to the Channel Coding Theorem. 7.11 Hamming Codes. 7.12 Feedback Capacity. 7.13 Source-Channel Separation Theorem. Summary. Problems. Historical Notes. 8. Differential Entropy. 8.1 Definitions. 8.2 AEP for Continuous Random Variables. 8.3 Relation of Differential Entropy to Discrete Entropy. 8.4 Joint and Conditional Differential Entropy. 8.5 Relative Entropy and Mutual Information. 8.6 Properties of Differential Entropy, Relative Entropy, and Mutual Information. Summary. Problems. Historical Notes. 9. Gaussian Channel. 9.1 Gaussian Channel: Definitions. 9.2 Converse to the Coding Theorem for Gaussian Channels. 9.3 Bandlimited Channels. 9.4 Parallel Gaussian Channels. 9.5 Channels with Colored Gaussian Noise. 9.6 Gaussian Channels with Feedback. Summary. Problems. Historical Notes. 10. Rate Distortion Theory. 10.1 Quantization. 10.2 Definitions. 10.3 Calculation of the Rate Distortion Function. 10.4 Converse to the Rate Distortion Theorem. 10.5 Achievability of the Rate Distortion Function. 10.6 Strongly Typical Sequences and Rate Distortion. 10.7 Characterization of the Rate Distortion Function. 10.8 Computation of Channel Capacity and the Rate Distortion Function. Summary. Problems. Historical Notes. 11. Information Theory and Statistics. 11.1 Method of Types. 11.2 Law of Large Numbers. 
11.3 Universal Source Coding. 11.4 Large Deviation Theory. 11.5 Examples of Sanov's Theorem. 11.6 Conditional Limit Theorem. 11.7 Hypothesis Testing. 11.8 Chernoff-Stein Lemma. 11.9 Chernoff Information. 11.10 Fisher Information and the Cramér-Rao Inequality. Summary. Problems. Historical Notes. 12. Maximum Entropy. 12.1 Maximum Entropy Distributions. 12.2 Examples. 12.3 Anomalous Maximum Entropy Problem. 12.4 Spectrum Estimation. 12.5 Entropy Rates of a Gaussian Process. 12.6 Burg's Maximum Entropy Theorem. Summary. Problems. Historical Notes. 13. Universal Source Coding. 13.1 Universal Codes and Channel Capacity. 13.2 Universal Coding for Binary Sequences. 13.3 Arithmetic Coding. 13.4 Lempel-Ziv Coding. 13.5 Optimality of Lempel-Ziv Algorithms. Summary. Problems. Historical Notes. 14. Kolmogorov Complexity. 14.1 Models of Computation. 14.2 Kolmogorov Complexity: Definitions and Examples. 14.3 Kolmogorov Complexity and Entropy. 14.4 Kolmogorov Complexity of Integers. 14.5 Algorithmically Random and Incompressible Sequences. 14.6 Universal Probability. 14.7 Kolmogorov Complexity. 14.9 Universal Gambling. 14.10 Occam's Razor. 14.11 Kolmogorov Complexity and Universal Probability. 14.12 Kolmogorov Sufficient Statistic. 14.13 Minimum Description Length Principle. Summary. Problems. Historical Notes. 15. Network Information Theory. 15.1 Gaussian Multiple-User Channels. 15.2 Jointly Typical Sequences. 15.3 Multiple-Access Channel. 15.4 Encoding of Correlated Sources. 15.5 Duality Between Slepian-Wolf Encoding and Multiple-Access Channels. 15.6 Broadcast Channel. 15.7 Relay Channel. 15.8 Source Coding with Side Information. 15.9 Rate Distortion with Side Information. 15.10 General Multiterminal Networks. Summary. Problems. Historical Notes. 16. Information Theory and Portfolio Theory. 16.1 The Stock Market: Some Definitions. 16.2 Kuhn-Tucker Characterization of the Log-Optimal Portfolio. 16.3 Asymptotic Optimality of the Log-Optimal Portfolio. 16.4 Side Information and the Growth Rate. 16.5 Investment in Stationary Markets. 16.6 Competitive Optimality of the Log-Optimal Portfolio. 16.7 Universal Portfolios. 16.8 Shannon-McMillan-Breiman Theorem (General AEP). Summary. Problems. Historical Notes. 17. Inequalities in Information Theory. 17.1 Basic Inequalities of Information Theory. 17.2 Differential Entropy. 17.3 Bounds on Entropy and Relative Entropy. 17.4 Inequalities for Types. 17.5 Combinatorial Bounds on Entropy. 17.6 Entropy Rates of Subsets. 17.7 Entropy and Fisher Information. 17.8 Entropy Power Inequality and Brunn-Minkowski Inequality. 17.9 Inequalities for Determinants. 17.10 Inequalities for Ratios of Determinants. Summary. Problems. Historical Notes. Bibliography. List of Symbols. Index.

45,034 citations


Additional excerpts

  • ...7, we choose E_d = {(2, 3), (4, 3), (2, 5), (4, 5)}, S_d = {1, 2, 3}, [π(1), π(2), π(3)] = [3, 1, 2] and the resulting graph G_{E_d} is shown in Fig....

  • ...7 we choose E_d = {(3, 2), (3, 4), (5, 2), (5, 4)}, S_d = {2, 3}, [π(1), π(2)] = [2, 3]....

Book
01 Jan 1988
TL;DR: Probabilistic Reasoning in Intelligent Systems is a complete and accessible account of the theoretical foundations and computational methods that underlie plausible reasoning under uncertainty, offering a coherent explication of probability as a language for reasoning with partial belief.
Abstract: From the Publisher: Probabilistic Reasoning in Intelligent Systems is a complete and accessible account of the theoretical foundations and computational methods that underlie plausible reasoning under uncertainty. The author provides a coherent explication of probability as a language for reasoning with partial belief and offers a unifying perspective on other AI approaches to uncertainty, such as the Dempster-Shafer formalism, truth maintenance systems, and nonmonotonic logic. The author distinguishes syntactic and semantic approaches to uncertainty—and offers techniques, based on belief networks, that provide a mechanism for making semantics-based systems operational. Specifically, network-propagation techniques serve as a mechanism for combining the theoretical coherence of probability theory with modern demands of reasoning-systems technology: modular declarative inputs, conceptually meaningful inferences, and parallel distributed computation. Application areas include diagnosis, forecasting, image interpretation, multi-sensor fusion, decision support systems, plan recognition, planning, speech recognition—in short, almost every task requiring that conclusions be drawn from uncertain clues and incomplete information. Probabilistic Reasoning in Intelligent Systems will be of special interest to scholars and researchers in AI, decision theory, statistics, logic, philosophy, cognitive psychology, and the management sciences. Professionals in the areas of knowledge-based systems, operations research, engineering, and statistics will find theoretical and computational tools of immediate practical use. The book can also be used as an excellent text for graduate-level courses in AI, operations research, or applied probability.
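The d-separation criterion from Pearl's belief-network framework is the graphical condition that the edge-cut bound above strengthens, so a concrete test is useful context. The sketch below checks d-separation in a DAG via the standard moralized-ancestral-graph criterion; the data structures and helper names are illustrative, not taken from either the book or the paper.

```python
# Minimal sketch of a standard d-separation test on a DAG, using the
# "moralized ancestral graph" criterion: X and Y are d-separated by Z iff
# X and Y are disconnected in the moral graph of the ancestral subgraph of
# X, Y, and Z after deleting Z. (Illustrative helper names only.)
from collections import deque

def ancestors(parents, nodes):
    """All nodes in `nodes` plus their ancestors, given a dict child -> parent list."""
    out, stack = set(nodes), list(nodes)
    while stack:
        for p in parents.get(stack.pop(), []):
            if p not in out:
                out.add(p)
                stack.append(p)
    return out

def d_separated(parents, x, y, z):
    """True iff node sets x and y are d-separated given z in the DAG `parents`."""
    keep = ancestors(parents, set(x) | set(y) | set(z))
    # Moralize the ancestral subgraph: connect co-parents, drop edge directions.
    undirected = {v: set() for v in keep}
    for child in keep:
        ps = [p for p in parents.get(child, []) if p in keep]
        for p in ps:
            undirected[p].add(child)
            undirected[child].add(p)
        for i in range(len(ps)):
            for j in range(i + 1, len(ps)):
                undirected[ps[i]].add(ps[j])
                undirected[ps[j]].add(ps[i])
    # Delete the conditioning set and test connectivity between x and y.
    blocked = set(z)
    seen, queue = set(x) - blocked, deque(set(x) - blocked)
    while queue:
        u = queue.popleft()
        if u in y:
            return False
        for v in undirected[u] - blocked:
            if v not in seen:
                seen.add(v)
                queue.append(v)
    return True

# Chain a -> b -> c: a and c are d-separated given b, but not given nothing.
parents = {"b": ["a"], "c": ["b"]}
print(d_separated(parents, {"a"}, {"c"}, {"b"}))  # True
print(d_separated(parents, {"a"}, {"c"}, set()))  # False
```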

15,671 citations

Journal ArticleDOI
TL;DR: This work reveals that it is in general not optimal to regard the information to be multicast as a "fluid" which can simply be routed or replicated, and by employing coding at the nodes, which the work refers to as network coding, bandwidth can in general be saved.
Abstract: We introduce a new class of problems called network information flow which is inspired by computer network applications. Consider a point-to-point communication network on which a number of information sources are to be multicast to certain sets of destinations. We assume that the information sources are mutually independent. The problem is to characterize the admissible coding rate region. This model subsumes all previously studied models along the same line. We study the problem with one information source, and we have obtained a simple characterization of the admissible coding rate region. Our result can be regarded as the max-flow min-cut theorem for network information flow. Contrary to one's intuition, our work reveals that it is in general not optimal to regard the information to be multicast as a "fluid" which can simply be routed or replicated. Rather, by employing coding at the nodes, which we refer to as network coding, bandwidth can in general be saved. This finding may have significant impact on future design of switching systems.
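The canonical example of the bandwidth savings described here is the butterfly network: two unit-rate source bits must reach two sinks across a unit-capacity bottleneck, and forwarding their XOR on the bottleneck lets both sinks decode both bits, which routing alone cannot achieve at these rates. A tiny illustrative sketch (my own construction, not code from the paper):

```python
# Sketch of the classic butterfly-network example of network coding: the middle
# (bottleneck) edge can carry only one bit per use, but sending b1 XOR b2 over it
# lets both sinks recover both source bits. (Illustrative only.)

def butterfly(b1: int, b2: int):
    # Bottleneck node codes its two incoming bits into a single packet.
    coded = b1 ^ b2
    # Sink 1 receives b1 directly plus the coded packet; sink 2 receives b2 plus it.
    sink1 = (b1, b1 ^ coded)   # decodes to (b1, b2)
    sink2 = (coded ^ b2, b2)   # decodes to (b1, b2)
    return sink1, sink2

for b1 in (0, 1):
    for b2 in (0, 1):
        assert butterfly(b1, b2) == ((b1, b2), (b1, b2))
print("both sinks recover both source bits in every case")
```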

8,533 citations


"Edge-Cut Bounds on Network Coding R..." refers background in this paper

  • ...For example, it is known that linear network coding is optimal for multicasting a single source in directed networks [1], [9]....

  • ...The terminals can further perform network coding [1], [9], i....

  • ...7, we choose E_d = {(2, 3), (4, 3), (2, 5), (4, 5)}, S_d = {1, 2, 3}, [π(1), π(2), π(3)] = [3, 1, 2] and the resulting graph G_{E_d} is shown in Fig....

  • ...Network coding has been intensely studied since [1] presented a novel coding scheme that attains a cut-set bound for multicasting in networks....

Book
01 Jan 1962
TL;DR: Ford and Fulkerson set the foundation for the study of network flow problems, developed powerful computational tools for solving and analyzing network flow models, and furthered the understanding of linear programming.
Abstract: In this classic book, first published in 1962, L. R. Ford, Jr., and D. R. Fulkerson set the foundation for the study of network flow problems. The models and algorithms introduced in Flows in Networks are used widely today in the fields of transportation systems, manufacturing, inventory planning, image processing, and Internet traffic. The techniques presented by Ford and Fulkerson spurred the development of powerful computational tools for solving and analyzing network flow models, and also furthered the understanding of linear programming. In addition, the book helped illuminate and unify results in combinatorial mathematics while emphasizing proofs based on computationally efficient construction. Flows in Networks is rich with insights that remain relevant to current research in engineering, management, and other sciences. This landmark work belongs on the bookshelf of every researcher working with networks.
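The max-flow min-cut theorem at the heart of this book is also the prototype for the cut-set and edge-cut bounds discussed above: the maximum s-t flow equals the minimum total capacity of edges whose removal separates s from t. Below is a compact augmenting-path (Edmonds-Karp) sketch with illustrative names and a toy network, not code from the book.

```python
# Minimal Edmonds-Karp (BFS augmenting paths) sketch of a max-flow computation;
# by the Ford-Fulkerson max-flow min-cut theorem, the value returned equals the
# capacity of a minimum edge cut separating source from sink. (Illustrative only.)
from collections import deque

def max_flow(capacity, source, sink):
    """capacity: dict mapping the directed edge (u, v) to its capacity."""
    # Residual capacities, with reverse edges initialized to zero.
    residual = dict(capacity)
    for (u, v) in capacity:
        residual.setdefault((v, u), 0)
    adj = {}
    for (u, v) in residual:
        adj.setdefault(u, set()).add(v)

    flow = 0
    while True:
        # BFS for a shortest augmenting path in the residual graph.
        parent = {source: None}
        queue = deque([source])
        while queue and sink not in parent:
            u = queue.popleft()
            for v in adj.get(u, ()):
                if v not in parent and residual[(u, v)] > 0:
                    parent[v] = u
                    queue.append(v)
        if sink not in parent:
            return flow
        # Find the bottleneck along the path and push that much flow.
        path, v = [], sink
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        push = min(residual[e] for e in path)
        for (u, v) in path:
            residual[(u, v)] -= push
            residual[(v, u)] += push
        flow += push

# Unit-capacity diamond: two edge-disjoint s-t paths, so max flow (= min cut) is 2.
capacity = {("s", "a"): 1, ("s", "b"): 1, ("a", "t"): 1, ("b", "t"): 1}
print(max_flow(capacity, "s", "t"))  # -> 2
```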

4,341 citations