Proceedings ArticleDOI

Information capacity and power control in single-cell multiuser communications

18 Jun 1995, Vol. 1, pp. 331-335
TL;DR: It is shown that power control can increase capacity over a perfectly power-controlled (Gaussian) channel, especially when the number of users is large, and an examination of the bit error rate with antipodal signalling reveals the inherent diversity in multiuser communications over fading channels.
Abstract: We consider a power control scheme for maximizing the information capacity of the uplink in single-cell multiuser communications with frequency-flat fading, under the assumption that the users' attenuations are measured perfectly. Its main characteristics are that only one user transmits over the entire bandwidth at any particular time instant and that users are allocated more power when their channels are good, and less when they are bad. Moreover, these features are independent of the statistics of the fading. Numerical results are presented for the case of single-path Rayleigh fading. We show that an increase in capacity over a perfectly power-controlled (Gaussian) channel can be achieved, especially if the number of users is large. By examining the bit error rate with antipodal signalling, we show the inherent diversity in multiuser communications over fading channels.
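
As a concrete illustration of the scheme just described, here is a minimal Monte Carlo sketch (my own construction under assumed parameters, not code from the paper): only the instantaneously strongest user transmits, and its power follows the water-filling rule p(g) = max(1/lam - 1/g, 0); the water level lam, the bisection search, and all names are illustrative choices.

    # Sketch: Rayleigh fading -> exponential power gains, unit noise,
    # average-power constraint P_avg. The strongest of K users transmits;
    # the water level lam is found by bisection on the power constraint.
    import numpy as np

    rng = np.random.default_rng(0)
    K, T, P_avg = 8, 100_000, 1.0            # users, time slots, average power
    g_best = rng.exponential(1.0, size=(T, K)).max(axis=1)

    def avg_power(lam):
        return np.maximum(1.0 / lam - 1.0 / g_best, 0.0).mean()

    lo, hi = 1e-6, 1e6                       # bisect on the water level
    for _ in range(100):
        lam = 0.5 * (lo + hi)
        lo, hi = (lam, hi) if avg_power(lam) > P_avg else (lo, lam)

    p = np.maximum(1.0 / lam - 1.0 / g_best, 0.0)
    print("ergodic rate:", np.log2(1.0 + p * g_best).mean(),
          "bit/s/Hz; AWGN baseline:", np.log2(1.0 + P_avg))

With several users the scheduled rate typically exceeds the AWGN baseline log2(1 + P_avg), consistent with the capacity increase reported above.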
Citations
Book
01 Jan 2005

9,038 citations

Journal ArticleDOI
TL;DR: Results show that, even though the interuser channel is noisy, cooperation leads not only to an increase in capacity for both users but also to a more robust system, where users' achievable rates are less susceptible to channel variations.
Abstract: Mobile users' data rate and quality of service are limited by the fact that, within the duration of any given call, they experience severe variations in signal attenuation, thereby necessitating the use of some type of diversity. In this two-part paper, we propose a new form of spatial diversity, in which diversity gains are achieved via the cooperation of mobile users. Part I describes the user cooperation strategy, while Part II (see ibid., p.1939-48) focuses on implementation issues and performance analysis. Results show that, even though the interuser channel is noisy, cooperation leads not only to an increase in capacity for both users but also to a more robust system, where users' achievable rates are less susceptible to channel variations.

6,621 citations


Cites background from "Information capacity and power cont..."

  • ...We, of course, expect the robustness to improve even more as the interuser channel quality ( ) improves....

Journal ArticleDOI
01 Jun 2002
TL;DR: This work proposes the use of multiple transmit antennas to induce large and fast channel fluctuations so that multiuser diversity can still be exploited, and shows that true beamforming gains can be achieved when there are sufficient users, even though very limited channel feedback is needed.
Abstract: Multiuser diversity is a form of diversity inherent in a wireless network, provided by independent time-varying channels across the different users. The diversity benefit is exploited by tracking the channel fluctuations of the users and scheduling transmissions to users when their instantaneous channel quality is near the peak. The diversity gain increases with the dynamic range of the fluctuations and is thus limited in environments with little scattering and/or slow fading. In such environments, we propose the use of multiple transmit antennas to induce large and fast channel fluctuations so that multiuser diversity can still be exploited. The scheme can be interpreted as opportunistic beamforming and we show that true beamforming gains can be achieved when there are sufficient users, even though very limited channel feedback is needed. Furthermore, in a cellular system, the scheme plays an additional role of opportunistic nulling of the interference created on users of adjacent cells. We discuss the design implications of implementing this scheme in a complete wireless system.
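
A tiny Monte Carlo sketch (my illustration with assumed unit-mean Rayleigh fading, not the authors' code) of the multiuser-diversity effect this scheme exploits: serving the instantaneously strongest of K independent users raises throughput as K grows, which opportunistic beamforming induces artificially when natural fluctuations are small.

    import numpy as np

    rng = np.random.default_rng(1)
    snr, T = 1.0, 200_000                        # mean SNR per user, time slots
    for K in (1, 2, 4, 8, 16):
        best = rng.exponential(1.0, size=(T, K)).max(axis=1)  # best user per slot
        print(K, "users:", round(np.log2(1.0 + snr * best).mean(), 3), "bit/s/Hz")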

3,041 citations


Cites methods from "Information capacity and power cont..."

  • ...is best motivated by an information-theoretic result of Knopp and Humblet [9]....

  • ...[9] R. Knopp and P. Humblet, “Information capacity and power control in single cell multiuser communications,” in Proc....

  • ...This multiuser diversity is best motivated by an information-theoretic result of Knopp and Humblet [9]....

Journal ArticleDOI
TL;DR: For applications with loose delay constraints, such that the topology changes over the time-scale of packet delivery, the per-session throughput can be increased dramatically by exploiting a form of multiuser diversity via packet relaying.
Abstract: The capacity of ad hoc wireless networks is constrained by the mutual interference of concurrent transmissions between nodes. We study a model of an ad hoc network where n nodes communicate in random source-destination pairs. These nodes are assumed to be mobile. We examine the per-session throughput for applications with loose delay constraints, such that the topology changes over the time-scale of packet delivery. Under this assumption, the per-user throughput can increase dramatically when nodes are mobile rather than fixed. This improvement can be achieved by exploiting a form of multiuser diversity via packet relaying.

2,736 citations


Cites background from "Information capacity and power cont..."

  • ...This multiuser diversity is best motivated by an information-theoretic result of Knopp and Humblet [4]....

Book
16 Jan 2012
TL;DR: This book gives a comprehensive treatment of network information theory and its applications, providing the first unified coverage of both classical and recent results, including successive cancellation and superposition coding, MIMO wireless communication, network coding, and cooperative relaying.
Abstract: This comprehensive treatment of network information theory and its applications provides the first unified coverage of both classical and recent results. With an approach that balances the introduction of new models and new coding techniques, readers are guided through Shannon's point-to-point information theory, single-hop networks, multihop networks, and extensions to distributed computing, secrecy, wireless communication, and networking. Elementary mathematical tools and techniques are used throughout, requiring only basic knowledge of probability, whilst unified proofs of coding theorems are based on a few simple lemmas, making the text accessible to newcomers. Key topics covered include successive cancellation and superposition coding, MIMO wireless communication, network coding, and cooperative relaying. Also covered are feedback and interactive communication, capacity approximations and scaling laws, and asynchronous and random access channels. This book is ideal for use in the classroom, for self-study, and as a reference for researchers and engineers in industry and academia.

2,442 citations

References
Book
01 Jan 1991
TL;DR: The authors examine the role of entropy, inequalities, and randomness in data compression and in the design and analysis of codes.
Abstract: Preface to the Second Edition. Preface to the First Edition. Acknowledgments for the Second Edition. Acknowledgments for the First Edition. 1. Introduction and Preview. 1.1 Preview of the Book. 2. Entropy, Relative Entropy, and Mutual Information. 2.1 Entropy. 2.2 Joint Entropy and Conditional Entropy. 2.3 Relative Entropy and Mutual Information. 2.4 Relationship Between Entropy and Mutual Information. 2.5 Chain Rules for Entropy, Relative Entropy, and Mutual Information. 2.6 Jensen's Inequality and Its Consequences. 2.7 Log Sum Inequality and Its Applications. 2.8 Data-Processing Inequality. 2.9 Sufficient Statistics. 2.10 Fano's Inequality. Summary. Problems. Historical Notes. 3. Asymptotic Equipartition Property. 3.1 Asymptotic Equipartition Property Theorem. 3.2 Consequences of the AEP: Data Compression. 3.3 High-Probability Sets and the Typical Set. Summary. Problems. Historical Notes. 4. Entropy Rates of a Stochastic Process. 4.1 Markov Chains. 4.2 Entropy Rate. 4.3 Example: Entropy Rate of a Random Walk on a Weighted Graph. 4.4 Second Law of Thermodynamics. 4.5 Functions of Markov Chains. Summary. Problems. Historical Notes. 5. Data Compression. 5.1 Examples of Codes. 5.2 Kraft Inequality. 5.3 Optimal Codes. 5.4 Bounds on the Optimal Code Length. 5.5 Kraft Inequality for Uniquely Decodable Codes. 5.6 Huffman Codes. 5.7 Some Comments on Huffman Codes. 5.8 Optimality of Huffman Codes. 5.9 Shannon-Fano-Elias Coding. 5.10 Competitive Optimality of the Shannon Code. 5.11 Generation of Discrete Distributions from Fair Coins. Summary. Problems. Historical Notes. 6. Gambling and Data Compression. 6.1 The Horse Race. 6.2 Gambling and Side Information. 6.3 Dependent Horse Races and Entropy Rate. 6.4 The Entropy of English. 6.5 Data Compression and Gambling. 6.6 Gambling Estimate of the Entropy of English. Summary. Problems. Historical Notes. 7. Channel Capacity. 7.1 Examples of Channel Capacity. 7.2 Symmetric Channels. 7.3 Properties of Channel Capacity. 7.4 Preview of the Channel Coding Theorem. 7.5 Definitions. 7.6 Jointly Typical Sequences. 7.7 Channel Coding Theorem. 7.8 Zero-Error Codes. 7.9 Fano's Inequality and the Converse to the Coding Theorem. 7.10 Equality in the Converse to the Channel Coding Theorem. 7.11 Hamming Codes. 7.12 Feedback Capacity. 7.13 Source-Channel Separation Theorem. Summary. Problems. Historical Notes. 8. Differential Entropy. 8.1 Definitions. 8.2 AEP for Continuous Random Variables. 8.3 Relation of Differential Entropy to Discrete Entropy. 8.4 Joint and Conditional Differential Entropy. 8.5 Relative Entropy and Mutual Information. 8.6 Properties of Differential Entropy, Relative Entropy, and Mutual Information. Summary. Problems. Historical Notes. 9. Gaussian Channel. 9.1 Gaussian Channel: Definitions. 9.2 Converse to the Coding Theorem for Gaussian Channels. 9.3 Bandlimited Channels. 9.4 Parallel Gaussian Channels. 9.5 Channels with Colored Gaussian Noise. 9.6 Gaussian Channels with Feedback. Summary. Problems. Historical Notes. 10. Rate Distortion Theory. 10.1 Quantization. 10.2 Definitions. 10.3 Calculation of the Rate Distortion Function. 10.4 Converse to the Rate Distortion Theorem. 10.5 Achievability of the Rate Distortion Function. 10.6 Strongly Typical Sequences and Rate Distortion. 10.7 Characterization of the Rate Distortion Function. 10.8 Computation of Channel Capacity and the Rate Distortion Function. Summary. Problems. Historical Notes. 11. Information Theory and Statistics. 11.1 Method of Types. 11.2 Law of Large Numbers. 
11.3 Universal Source Coding. 11.4 Large Deviation Theory. 11.5 Examples of Sanov's Theorem. 11.6 Conditional Limit Theorem. 11.7 Hypothesis Testing. 11.8 Chernoff-Stein Lemma. 11.9 Chernoff Information. 11.10 Fisher Information and the Cramér-Rao Inequality. Summary. Problems. Historical Notes. 12. Maximum Entropy. 12.1 Maximum Entropy Distributions. 12.2 Examples. 12.3 Anomalous Maximum Entropy Problem. 12.4 Spectrum Estimation. 12.5 Entropy Rates of a Gaussian Process. 12.6 Burg's Maximum Entropy Theorem. Summary. Problems. Historical Notes. 13. Universal Source Coding. 13.1 Universal Codes and Channel Capacity. 13.2 Universal Coding for Binary Sequences. 13.3 Arithmetic Coding. 13.4 Lempel-Ziv Coding. 13.5 Optimality of Lempel-Ziv Algorithms. Compression. Summary. Problems. Historical Notes. 14. Kolmogorov Complexity. 14.1 Models of Computation. 14.2 Kolmogorov Complexity: Definitions and Examples. 14.3 Kolmogorov Complexity and Entropy. 14.4 Kolmogorov Complexity of Integers. 14.5 Algorithmically Random and Incompressible Sequences. 14.6 Universal Probability. 14.7 Kolmogorov Complexity. 14.9 Universal Gambling. 14.10 Occam's Razor. 14.11 Kolmogorov Complexity and Universal Probability. 14.12 Kolmogorov Sufficient Statistic. 14.13 Minimum Description Length Principle. Summary. Problems. Historical Notes. 15. Network Information Theory. 15.1 Gaussian Multiple-User Channels. 15.2 Jointly Typical Sequences. 15.3 Multiple-Access Channel. 15.4 Encoding of Correlated Sources. 15.5 Duality Between Slepian-Wolf Encoding and Multiple-Access Channels. 15.6 Broadcast Channel. 15.7 Relay Channel. 15.8 Source Coding with Side Information. 15.9 Rate Distortion with Side Information. 15.10 General Multiterminal Networks. Summary. Problems. Historical Notes. 16. Information Theory and Portfolio Theory. 16.1 The Stock Market: Some Definitions. 16.2 Kuhn-Tucker Characterization of the Log-Optimal Portfolio. 16.3 Asymptotic Optimality of the Log-Optimal Portfolio. 16.4 Side Information and the Growth Rate. 16.5 Investment in Stationary Markets. 16.6 Competitive Optimality of the Log-Optimal Portfolio. 16.7 Universal Portfolios. 16.8 Shannon-McMillan-Breiman Theorem (General AEP). Summary. Problems. Historical Notes. 17. Inequalities in Information Theory. 17.1 Basic Inequalities of Information Theory. 17.2 Differential Entropy. 17.3 Bounds on Entropy and Relative Entropy. 17.4 Inequalities for Types. 17.5 Combinatorial Bounds on Entropy. 17.6 Entropy Rates of Subsets. 17.7 Entropy and Fisher Information. 17.8 Entropy Power Inequality and Brunn-Minkowski Inequality. 17.9 Inequalities for Determinants. 17.10 Inequalities for Ratios of Determinants. Summary. Problems. Historical Notes. Bibliography. List of Symbols. Index.

45,034 citations


"Information capacity and power cont..." refers background in this paper

  • ...This is an example of a broadcast channel or a one-to-many communication problem (again see [1])....

  • ...The uplink refers to the information flow from the users to the base station and is an example of a classic multiuser channel, or a many-to-one communication problem (see [1])....

  • ...simply a Gaussian multiuser channel whose capacity region is defined by the following set of equations [1]...

Book
01 Jan 1968
TL;DR: This book covers coding for discrete sources, techniques for coding and decoding, and source coding with a fidelity criterion.
Abstract: Communication Systems and Information Theory. A Measure of Information. Coding for Discrete Sources. Discrete Memoryless Channels and Capacity. The Noisy-Channel Coding Theorem. Techniques for Coding and Decoding. Memoryless Channels with Discrete Time. Waveform Channels. Source Coding with a Fidelity Criterion. Index.

6,684 citations

Journal ArticleDOI
TL;DR: Information-theoretic considerations are used to determine upper bounds on the rates that can be reliably transmitted over a two-ray propagation-path mobile radio channel model operating in a time-division multiple-access (TDMA) regime under given decoding delay constraints.
Abstract: We present some information-theoretic considerations used to determine upper bounds on the information rates that can be reliably transmitted over a two-ray propagation path mobile radio channel model, operating in a time-division multiple-access (TDMA) regime, under given decoding delay constraints. The sense in which reliability is measured is addressed, and in the interesting cases where the decoding delay constraint plays a significant role, the maximal achievable rate (capacity) is specified in terms of capacity versus outage. In this case, no coding capacity in the strict Shannon sense exists. Simple schemes for time and space diversity are examined, and their potential benefits are illuminated from an information-theoretic standpoint. In our presentation, we chose to specialize to the TDMA protocol for the sake of clarity and convenience. Our main arguments and results extend directly to certain variants of other multiple-access protocols such as code-division multiple access (CDMA) and frequency-division multiple access (FDMA), provided that no fast feedback from the receiver to the transmitter is available.
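
In the delay-constrained regime described above, a standard way to state capacity versus outage (my notation, not necessarily the paper's) is

\[ C_\epsilon = \sup\{\, R : \Pr\big(\log_2(1 + \mathrm{SNR}\,|h|^2) < R\big) \le \epsilon \,\}, \]

the largest rate supportable in all but a fraction \epsilon of fading states; since some fading states defeat any fixed rate, no strict Shannon capacity exists.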

1,216 citations

Journal ArticleDOI
A.D. Wyner1
TL;DR: Shannon-theoretic limits are obtained for a very simple cellular multiple-access system, along with a scheme that does not require joint decoding of all the users and is, in many cases, close to optimal.
Abstract: We obtain Shannon-theoretic limits for a very simple cellular multiple-access system. In our model the received signal at a given cell site is the sum of the signals transmitted from within that cell, plus a factor α (0 ≤ α ≤ 1) times the sum of the signals transmitted from the adjacent cells, plus ambient Gaussian noise. Although this simple model is scarcely realistic, it nevertheless has enough meat so that the results yield considerable insight into the workings of real systems. We consider both a one-dimensional linear cellular array and the familiar two-dimensional hexagonal cellular pattern. The discrete-time channel is memoryless. We assume that N contiguous cells have active transmitters in the one-dimensional case, and that N² contiguous cells have active transmitters in the two-dimensional case. There are K transmitters per cell. Most of our results are obtained for the limiting case as N → ∞. The results include the following. (1) We define C_N and Ĉ_N as the largest achievable rate per transmitter in the usual Shannon-theoretic sense in the one- and two-dimensional cases, respectively (assuming that all signals are jointly decoded). We find expressions for lim_{N→∞} C_N and lim_{N→∞} Ĉ_N. (2) As the interference parameter α increases from 0, C_N and Ĉ_N increase or decrease according to whether the signal-to-noise ratio is less than or greater than unity. (3) Optimal performance is attainable using TDMA within the cell, but using TDMA for adjacent cells is distinctly suboptimal. (4) We suggest a scheme which does not require joint decoding of all the users and is, in many cases, close to optimal.
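
The model in this abstract can be written compactly. For the one-dimensional array (notation mine), the signal received at cell site m is

\[ y_m = \sum_{k=1}^{K} x_{m,k} + \alpha \sum_{k=1}^{K} \big(x_{m-1,k} + x_{m+1,k}\big) + z_m, \qquad 0 \le \alpha \le 1, \]

where x_{m,k} is the signal of the k-th transmitter in cell m and z_m is the ambient Gaussian noise.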

787 citations


"Information capacity and power cont..." refers methods in this paper

  • ...The capacity of the uplink channel without fading for the multicell case, modelled both as linear and hexagonal arrays, is treated in [3]....

Journal ArticleDOI
TL;DR: The channel capacity in a Rayleigh fading environment is derived and shown to be always lower than that of a Gaussian-noise channel, and diversity schemes are shown to improve it.
Abstract: The channel capacity in a Rayleigh fading environment is derived. The result shows that the channel capacity in a Rayleigh fading environment is always lower than that in a Gaussian-noise environment. When operating a digital transmission in a mobile radio environment that has Rayleigh fading statistics, it is very important to know the degradation in channel capacity due to Rayleigh fading, and also to what degree diversity schemes can raise the channel capacity. Curves are generated to show the degradation of channel capacity in a Rayleigh fading environment and its improvement by diversity schemes.
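
A short Monte Carlo check (my sketch, assuming unit-mean Rayleigh fading and selection diversity over L branches) reproduces both trends: the single-branch Rayleigh capacity falls below the Gaussian-channel value, as Jensen's inequality predicts, and diversity narrows the gap.

    import numpy as np

    rng = np.random.default_rng(2)
    snr, T = 10.0, 500_000                   # mean SNR, Monte Carlo samples
    print("AWGN:", round(np.log2(1.0 + snr), 3), "bit/s/Hz")
    for L in (1, 2, 4):                      # selection-diversity branches
        h2 = rng.exponential(1.0, size=(T, L)).max(axis=1)  # best branch gain
        print("L =", L, ":", round(np.log2(1.0 + snr * h2).mean(), 3), "bit/s/Hz")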

371 citations