
Upper and lower bounds

About: Upper and lower bounds is a research topic. Over the lifetime, 56,902 publications have been published within this topic, receiving 1,143,379 citations. The topic is also known as: majoring or minoring element.
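For reference, the order-theoretic definitions behind the topic are brief; the display below states them in standard textbook notation (this wording is generic and not drawn from any paper listed here).

\[
\text{For a partially ordered set } (P,\le) \text{ and } S \subseteq P:\quad
u \in P \text{ is an upper bound of } S \iff \forall s \in S,\; s \le u; \qquad
\ell \in P \text{ is a lower bound of } S \iff \forall s \in S,\; \ell \le s.
\]

The least upper bound (supremum) and the greatest lower bound (infimum), when they exist, are the tightest such elements.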


Papers
Journal Article (DOI)
TL;DR: A coding theorem and weak converse are proved, a necessary and sufficient condition for positive capacity is derived, and upper and lower bounds on the capacity are obtained; the bounds coincide for channels with symmetric structure.
Abstract: The problem of transmitting information in a specified direction over a communication channel with three terminals is considered. Examples are given of the various ways of sending information. Basic inequalities for average mutual information rates are obtained. A coding theorem and weak converse are proved and a necessary and sufficient condition for a positive capacity is derived. Upper and lower bounds on the capacity are obtained, which coincide for channels with symmetric structure.

1,727 citations
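As a generic illustration of how results of this kind are organized (a sketch of the standard proof pattern, not the specific expressions derived in this paper), an achievability argument and a converse sandwich the capacity, and the capacity is determined exactly whenever the two bounds meet:

\[
\underbrace{R_{-}}_{\text{coding theorem / achievability}} \;\le\; C \;\le\; \underbrace{R_{+}}_{\text{weak converse}},
\qquad R_{-} = R_{+} \;\Longrightarrow\; C = R_{-},
\]

which is the situation the abstract reports for channels with symmetric structure.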

Journal Article (DOI)
TL;DR: Fault-tolerant consensus protocols are given for various cases of partial synchrony and various fault models, using new protocols for fault-tolerant “distributed clocks” that allow partially synchronous processors to reach some approximately common notion of time.
Abstract: The concept of partial synchrony in a distributed system is introduced. Partial synchrony lies between the cases of a synchronous system and an asynchronous system. In a synchronous system, there is a known fixed upper bound D on the time required for a message to be sent from one processor to another and a known fixed upper bound P on the relative speeds of different processors. In an asynchronous system no fixed upper bounds D and P exist. In one version of partial synchrony, fixed bounds D and P exist, but they are not known a priori. The problem is to design protocols that work correctly in the partially synchronous system regardless of the actual values of the bounds D and P. In another version of partial synchrony, the bounds are known, but are only guaranteed to hold starting at some unknown time T, and protocols must be designed to work correctly regardless of when time T occurs. Fault-tolerant consensus protocols are given for various cases of partial synchrony and various fault models. Lower bounds that show in most cases that our protocols are optimal with respect to the number of faults tolerated are also given. Our consensus protocols for partially synchronous processors use new protocols for fault-tolerant “distributed clocks” that allow partially synchronous processors to reach some approximately common notion of time.

1,613 citations
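One recurring device in this setting is coping with a delay bound that exists but is unknown by letting timeouts grow until they eventually exceed the true bound. The Python sketch below illustrates only that idea; broadcast, collect, and decide are hypothetical stub functions, and this is not the consensus protocols of the paper.

import time

def run_rounds(broadcast, collect, decide, initial_timeout=0.1):
    # Sketch: the true message-delay bound D exists but is unknown, so the
    # per-round timeout is doubled each round. Once the timeout exceeds D,
    # every later round behaves as it would in a synchronous system.
    timeout = initial_timeout
    round_no = 0
    while True:
        round_no += 1
        broadcast(round_no)                      # send this round's message to all processors
        deadline = time.monotonic() + timeout
        received = []
        while time.monotonic() < deadline:
            received.extend(collect())           # gather whatever messages have arrived so far
            time.sleep(0.01)                     # avoid busy-waiting
        value = decide(round_no, received)       # protocol-specific rule; returns None to continue
        if value is not None:
            return value
        timeout *= 2                             # escalate to compensate for the unknown bound D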

Proceedings Article (DOI)
05 Dec 1989
TL;DR: An exact characterization of the ability of the rate monotonic scheduling algorithm to meet the deadlines of a periodic task set is presented, together with a stochastic analysis that gives the probability distribution of the breakdown utilization of randomly generated task sets.
Abstract: An exact characterization of the ability of the rate monotonic scheduling algorithm to meet the deadlines of a periodic task set is presented. In addition, a stochastic analysis which gives the probability distribution of the breakdown utilization of randomly generated task sets is presented. It is shown that as the task set size increases, the task computation times become of little importance, and the breakdown utilization converges to a constant determined by the task periods. For uniformly distributed tasks, a breakdown utilization of 88% is a reasonable characterization. A case is shown in which the average-case breakdown utilization reaches the worst-case lower bound of C.L. Liu and J.W. Layland (1973).

1,582 citations
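For context, the worst-case lower bound referred to above is the Liu and Layland utilization bound, while an exact test checks the cumulative time demand at a finite set of scheduling points. The Python sketch below uses the standard textbook formulation of both (implicit deadlines equal to periods are assumed; it illustrates the gap between the worst-case bound and what real task sets tolerate, not the paper's stochastic analysis).

import math

def liu_layland_bound(n):
    # Worst-case schedulable utilization for n periodic tasks under rate monotonic
    # priorities (Liu & Layland 1973): n * (2**(1/n) - 1), which tends to ln 2 ~ 0.693.
    return n * (2 ** (1.0 / n) - 1)

def rm_schedulable(tasks):
    # Exact (necessary and sufficient) test for implicit-deadline periodic tasks,
    # priorities ordered by period. tasks is a list of (C, T) pairs with integer periods.
    tasks = sorted(tasks, key=lambda ct: ct[1])
    for i, (_, Ti) in enumerate(tasks):
        # Task i is schedulable iff its own demand plus higher-priority interference
        # fits within t for some scheduling point t <= Ti.
        points = {Ti}
        for _, Tj in tasks[:i]:
            points.update(k * Tj for k in range(1, Ti // Tj + 1))
        if not any(
            sum(Cj * math.ceil(t / Tj) for Cj, Tj in tasks[: i + 1]) <= t
            for t in sorted(points)
        ):
            return False
    return True

# Harmonic periods: utilization is 1.0, well above the n = 3 bound of about 0.78,
# yet the exact test accepts the set, showing how far beyond the worst-case lower
# bound a favorable task set (and the ~88% typical breakdown utilization) can go.
print(liu_layland_bound(3))                      # ~0.7798
print(rm_schedulable([(1, 2), (1, 4), (2, 8)]))  # True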

Journal Article (DOI)
TL;DR: It is shown that while the ordinary capacity of a memoryless channel with feedback is equal to that of the same channel without feedback, the zero error capacity may be greater, and a solution is given to the problem of evaluating C_0F.
Abstract: The zero error capacity C_0 of a noisy channel is defined as the least upper bound of rates at which it is possible to transmit information with zero probability of error. Various properties of C_0 are studied; upper and lower bounds and methods of evaluation of C_0 are given. Inequalities are obtained for C_0 of the "sum" and "product" of two given channels. The analogous problem of the zero error capacity C_0F for a channel with a feedback link is considered. It is shown that while the ordinary capacity of a memoryless channel with feedback is equal to that of the same channel without feedback, the zero error capacity may be greater. A solution is given to the problem of evaluating C_0F.

1,581 citations
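In the graph-theoretic language that later became standard for this quantity (a restatement assumed here, not the paper's own notation), C_0 depends only on the channel's confusability graph G, whose vertices are the inputs and whose edges join inputs that can produce the same output:

\[
C_0 \;=\; \lim_{n\to\infty} \frac{1}{n}\log_2 \alpha\!\left(G^{\boxtimes n}\right)
\;=\; \sup_{n\ge 1} \frac{1}{n}\log_2 \alpha\!\left(G^{\boxtimes n}\right),
\qquad
\log_2 \alpha(G) \;\le\; C_0 \;\le\; \log_2 \vartheta(G),
\]

where \alpha is the independence number, G^{\boxtimes n} is the n-fold strong product, and the upper bound via the Lovász number \vartheta is a later (1979) refinement of the bounding approach surveyed in this line of work.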

Journal Article (DOI)
TL;DR: A simple algorithm for computing channel capacity is suggested that consists of a mapping from the set of channel input probability vectors into itself such that the sequence of probability vectors generated by successive applications of the mapping converges to the vector that achieves the capacity of the given channel.
Abstract: By defining mutual information as a maximum over an appropriate space, channel capacities can be defined as double maxima and rate-distortion functions as double minima. This approach yields valuable new insights regarding the computation of channel capacities and rate-distortion functions. In particular, it suggests a simple algorithm for computing channel capacity that consists of a mapping from the set of channel input probability vectors into itself such that the sequence of probability vectors generated by successive applications of the mapping converges to the vector that achieves the capacity of the given channel. Analogous algorithms are then provided for computing rate-distortion functions and constrained channel capacities. The algorithms apply both to discrete and to continuous alphabet channels or sources. In addition, a formalization of the theory of channel capacity in the presence of constraints is included. Among the examples is the calculation of close upper and lower bounds to the rate-distortion function of a binary symmetric Markov source.

1,472 citations
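The iteration described above is what is now usually called the Blahut-Arimoto algorithm. A minimal Python sketch follows; the stopping rule based on the gap between running upper and lower capacity bounds is the standard textbook device and is assumed here rather than quoted from the paper.

import numpy as np

def channel_capacity(W, tol=1e-9, max_iter=10000):
    # Blahut-Arimoto iteration for a discrete memoryless channel.
    # W[x, y] = P(Y = y | X = x). Returns (capacity in bits, optimizing input distribution).
    W = np.asarray(W, dtype=float)
    p = np.full(W.shape[0], 1.0 / W.shape[0])    # start from the uniform input distribution
    lower = 0.0
    for _ in range(max_iter):
        q = p @ W                                # output distribution induced by p
        with np.errstate(divide="ignore", invalid="ignore"):
            logs = np.where(W > 0, np.log2(W / q), 0.0)
        c = np.exp2((W * logs).sum(axis=1))      # c[x] = 2 ** D( W(.|x) || q )
        lower = np.log2(p @ c)                   # running lower bound on the capacity
        upper = np.log2(c.max())                 # running upper bound on the capacity
        if upper - lower < tol:                  # the two bounds converge to C
            break
        p = p * c / (p @ c)                      # the self-map on input probability vectors
    return lower, p

# Binary symmetric channel with crossover probability 0.1:
# capacity = 1 - H2(0.1) ~ 0.531 bits, achieved by the uniform input.
print(channel_capacity([[0.9, 0.1], [0.1, 0.9]]))

For this symmetric example the two bounds already coincide at the first step; for asymmetric channels the gap shrinks over successive iterations, mirroring the "close upper and lower bounds" theme of the abstract.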


Network Information
Related Topics (5)
Bounded function: 77.2K papers, 1.3M citations (91% related)
Matrix (mathematics): 105.5K papers, 1.9M citations (89% related)
Eigenvalues and eigenvectors: 51.7K papers, 1.1M citations (89% related)
Probability distribution: 40.9K papers, 1.1M citations (89% related)
Markov chain: 51.9K papers, 1.3M citations (88% related)
Performance Metrics
No. of papers in the topic in previous years:
Year   Papers
2024   1
2023   1,761
2022   3,754
2021   2,833
2020   3,089
2019   2,954