Topic

Average-case complexity

About: Average-case complexity is a research topic. Over its lifetime, 1,749 publications have been published within this topic, receiving 44,972 citations.


Papers
Book Chapter DOI
19 Aug 1996
TL;DR: An algorithm is presented that constructs, for any problem instance of degree d and fan-out k, a communication schedule with total communication time at most qd + k^(1/q)·(d−1), for any integer q ≥ 2.
Abstract: We consider the Multi-Message Multicasting problem for the n-processor fully connected static network. We present an efficient algorithm to construct a communication schedule with total communication time at most d^2, where d is the maximum number of messages a processor may send (or receive). We present an algorithm that constructs, for any problem instance of degree d and fan-out k (the maximum number of processors that may receive a given message), a communication schedule with total communication time at most qd + k^(1/q)·(d−1), for any integer q ≥ 2. The time complexity bound for our algorithm is O(n·d·(q + k^(1/q))·q). Our main result is a linear-time approximation algorithm with a smaller approximation bound for small values of k (k < 100). We discuss applications and show how to adapt our algorithms to dynamic networks such as the Benes network, the interconnection network used in the Meiko CS-2.

6 citations
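To make the guarantee concrete, the following minimal Python sketch evaluates the schedule-length bound qd + k^(1/q)·(d−1) quoted in the abstract and picks the integer q ≥ 2 that minimizes it. The example values of d and k and the search cap on q are illustrative assumptions, not taken from the paper.

```python
# Sketch: evaluate the schedule-length bound q*d + k**(1/q) * (d - 1)
# from the abstract above, and pick the integer q >= 2 that minimizes it.
# The search range for q and the example d, k are assumptions.

def schedule_bound(d: int, k: int, q: int) -> float:
    """Upper bound on total communication time for degree d, fan-out k."""
    return q * d + k ** (1.0 / q) * (d - 1)

def best_q(d: int, k: int, q_max: int = 20) -> tuple[int, float]:
    """Smallest bound over q = 2 .. q_max (hypothetical cap)."""
    return min(((q, schedule_bound(d, k, q)) for q in range(2, q_max + 1)),
               key=lambda t: t[1])

if __name__ == "__main__":
    d, k = 8, 64          # example degree and fan-out, chosen arbitrarily
    q, bound = best_q(d, k)
    print(f"q = {q}: total communication time <= {bound:.1f} (vs d^2 = {d*d})")
```

For these illustrative values the optimized bound comes out well below the d^2 guarantee of the simpler schedule, which is the point of the parameter q.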

Journal Article DOI
TL;DR: This paper proposes an enhanced IKSD algorithm that combines column-norm ordering (channel ordering) with the Manhattan metric to improve performance and reduce computational complexity.
Abstract: The main challenge in MIMO systems is designing detection algorithms that combine low computational complexity with high performance, accurately detecting the transmitted signals. Prior work established Maximum Likelihood Detection (MLD) as the optimal detector, but its complexity grows exponentially with the number of transmit antennas and the constellation size, making it impractical to implement. Alternative algorithms such as K-best sphere detection (KSD) and Improved K-best sphere detection (IKSD) achieve close-to-Maximum-Likelihood (ML) performance at lower computational complexity. In this paper, we propose an enhanced IKSD algorithm that combines column-norm ordering (channel ordering) with the Manhattan metric to improve performance and reduce computational complexity. Simulation results show that channel ordering improves performance and reduces complexity, and that the Manhattan metric alone reduces complexity; their combination improves performance and reduces complexity more than channel ordering alone. Our proposed algorithm can therefore be considered a feasible complexity-reduction scheme suitable for practical implementation.

6 citations
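The two ingredients the paper combines can be illustrated in isolation. The toy Python sketch below is an assumption-laden illustration, not the authors' IKSD implementation: it sorts the channel-matrix columns by norm before detection and contrasts the Manhattan metric |y − Hs|_1 with the usual squared Euclidean metric. The 4×4 real-valued channel and BPSK-like symbols are hypothetical choices.

```python
# Sketch of the two ingredients combined in the paper: column-norm
# (channel) ordering and the Manhattan metric as a cheaper branch metric.
# This is an illustrative toy, not the authors' IKSD implementation.
import numpy as np

rng = np.random.default_rng(0)

def column_norm_order(H: np.ndarray) -> np.ndarray:
    """Column indices sorted by descending column norm, so more
    reliable symbols are detected first (one common convention)."""
    return np.argsort(-np.linalg.norm(H, axis=0))

def euclidean_metric(y, H, s):
    return float(np.sum((y - H @ s) ** 2))   # squaring needs multiplications

def manhattan_metric(y, H, s):
    return float(np.sum(np.abs(y - H @ s)))  # only additions after H @ s

# Toy 4x4 real-valued MIMO system with BPSK-like symbols {-1, +1}.
H = rng.standard_normal((4, 4))
s = rng.choice([-1.0, 1.0], size=4)
y = H @ s + 0.1 * rng.standard_normal(4)

order = column_norm_order(H)
H_ord, s_ord = H[:, order], s[order]
print("detection order:", order)
print("euclidean:", euclidean_metric(y, H_ord, s_ord))
print("manhattan:", manhattan_metric(y, H_ord, s_ord))
```

In a full K-best detector these metrics rank the candidate tree branches kept at each level; the Manhattan variant trades the exact ML metric for cheaper arithmetic, which is the complexity saving the abstract describes.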

Journal Article DOI
TL;DR: A Necessary Information Complexity notion is introduced to quantify the minimum amount of information needed for the existence of a Probabilistic Approximate equilibrium in statistical ensembles of games.
Abstract: In this work, we study Static and Dynamic Games on Large Networks of interacting agents, assuming that the players have a statistical description of the interaction graph as well as some local information. Inspired by Statistical Physics, we consider statistical ensembles of games and define a Probabilistic Approximate equilibrium notion for such ensembles. A Necessary Information Complexity notion is introduced to quantify the minimum amount of information needed for the existence of a Probabilistic Approximate equilibrium. We then focus on some special classes of games for which it is possible to derive upper and/or lower bounds for the complexity. First, static and dynamic games on random graphs are studied and their complexity is determined as a function of the graph connectivity. In the low-complexity case, we compute Probabilistic Approximate equilibrium strategies. We then consider static games on lattices and derive upper and lower bounds for the complexity using contraction mapping ideas. An LQ game on a large ring is also studied numerically. Using a reduction technique, approximate equilibrium strategies are computed, and it turns out that the complexity is relatively low.

6 citations
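As a hedged illustration of the contraction-mapping idea mentioned in the abstract (a toy model, not the paper's game): in a quadratic game on a ring where each player's best response is β times the average of its two neighbours' actions plus a constant θ, the best-response map is a sup-norm contraction for |β| < 1, so iterating it converges to the unique equilibrium. The cost function and parameters below are assumptions chosen for illustration.

```python
# Toy quadratic game on a ring, illustrating the contraction-mapping idea
# mentioned in the abstract. Each player i minimizes
#   (x_i - beta * (x_{i-1} + x_{i+1}) / 2 - theta)^2,
# so the best response is x_i = beta * avg(neighbours) + theta.
# For |beta| < 1 this map is a sup-norm contraction, so best-response
# iteration converges. Cost function and parameters are assumptions.
import numpy as np

def best_response(x: np.ndarray, beta: float, theta: float) -> np.ndarray:
    neigh = (np.roll(x, 1) + np.roll(x, -1)) / 2.0
    return beta * neigh + theta

def solve_ring_game(n=1000, beta=0.6, theta=1.0, tol=1e-10):
    x = np.zeros(n)
    for it in range(10_000):
        x_new = best_response(x, beta, theta)
        if np.max(np.abs(x_new - x)) < tol:   # sup-norm convergence check
            return x_new, it
        x = x_new
    return x, it

x, iters = solve_ring_game()
# On the symmetric ring the fixed point is x* = theta / (1 - beta).
print(f"converged in {iters} iterations; x[0] = {x[0]:.6f}, "
      f"theta/(1-beta) = {1.0 / (1 - 0.6):.6f}")
```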

Journal Article DOI
TL;DR: The 3-Satisfiability problem is analyzed and the existence of fast decision procedures for this problem over the reals is examined based on certain conditions on the discrete setting.
Abstract: Relations between discrete and continuous complexity models are considered. The present paper is devoted to combining both models. In particular, we analyze the 3-Satisfiability problem. The existence of fast decision procedures for this problem over the reals is examined, based on certain conditions in the discrete setting. Moreover, we study the behaviour of exponential-time computations over the reals depending on the real complexity of 3-Satisfiability. This is done using tools from complexity theory over the integers.

6 citations
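One standard bridge between the discrete problem and the reals is the textbook arithmetization sketched below (a generic encoding, not necessarily the paper's reduction): a 3-CNF formula is satisfiable iff an associated polynomial has a real zero, with each variable forced into {0, 1} by the constraint x(x − 1) = 0 and each clause forced to contain a true literal.

```python
# Standard arithmetization of 3-SAT over the reals (a generic textbook
# encoding, not necessarily the paper's reduction): a 3-CNF formula is
# satisfiable iff P(x) = 0 has a real solution, where P sums the squared
# clause polynomials and the squared Booleanness terms x_i * (x_i - 1).
from itertools import product

def clause_poly(clause, x):
    """Positive literal i contributes (1 - x_i), negated contributes x_i,
    so the product is 0 exactly when some literal in the clause is true."""
    p = 1.0
    for lit in clause:
        v = x[abs(lit) - 1]
        p *= (1.0 - v) if lit > 0 else v
    return p

def P(clauses, x):
    s = sum(clause_poly(c, x) ** 2 for c in clauses)
    s += sum((v * (v - 1.0)) ** 2 for v in x)
    return s

# Example: (x1 or not x2 or x3) and (not x1 or x2 or not x3)
clauses = [(1, -2, 3), (-1, 2, -3)]
# Brute-force check over {0,1}^3: satisfiable iff min P == 0.
best = min(product([0.0, 1.0], repeat=3), key=lambda x: P(clauses, x))
print("assignment:", best, "P =", P(clauses, best))
```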

Journal Article DOI
TL;DR: An approach to the complexity measure is proposed here, using the quantum information formalism, taking advantage of the generality of classical-based complexities, and capable of expressing these systems' complexity in a framework other than its algorithmic counterparts.
Abstract: In the past decades, efforts at quantifying system complexity with a general tool have usually relied on Shannon's classical information framework, addressing the disorder of the system through the Boltzmann–Gibbs–Shannon entropy or one of its extensions. In recent years, however, there have been attempts to quantify algorithmic complexity in quantum systems based on Kolmogorov algorithmic complexity, obtaining results that disagree with the classical approach. An approach to the complexity measure is therefore proposed here, using the quantum information formalism, taking advantage of the generality of classical-based complexities, and capable of expressing these systems' complexity in a framework other than its algorithmic counterparts. To this end, the Shiner–Davison–Landsberg (SDL) complexity framework is considered jointly with the linear entropy of the density operators representing the analyzed systems, and with the tangle as the entanglement measure. The proposed measure is then applied to a family of maximally entangled mixed states.

6 citations
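The ingredients named in the abstract can be computed directly for a simple two-qubit family. The sketch below assumes the simplest SDL form C = Δ(1 − Δ) with disorder Δ given by the normalized linear entropy, and tangle = concurrence² via Wootters' formula; it uses Werner states as the example family, an illustrative stand-in for the maximally entangled mixed states studied in the paper.

```python
# Sketch of the quantities combined in the abstract, computed for two-qubit
# Werner states rho = p |Phi+><Phi+| + (1 - p) I/4. Assumptions: SDL in its
# simplest form C = Delta * (1 - Delta) with Delta the normalized linear
# entropy; tangle = concurrence^2 via Wootters' formula; Werner states as
# an illustrative family (not necessarily the paper's MEMS family).
import numpy as np

def linear_entropy(rho: np.ndarray) -> float:
    """Normalized linear entropy d/(d-1) * (1 - Tr rho^2), in [0, 1]."""
    d = rho.shape[0]
    return d / (d - 1) * (1.0 - np.trace(rho @ rho).real)

def sdl_complexity(delta: float) -> float:
    """Shiner-Davison-Landsberg complexity, simplest (alpha=beta=1) form."""
    return delta * (1.0 - delta)

def tangle(rho: np.ndarray) -> float:
    """Concurrence squared via Wootters' formula for two qubits."""
    sy = np.array([[0, -1j], [1j, 0]])
    R = rho @ np.kron(sy, sy) @ rho.conj() @ np.kron(sy, sy)
    lam = np.sqrt(np.abs(np.sort(np.linalg.eigvals(R).real)[::-1]))
    return max(0.0, lam[0] - lam[1] - lam[2] - lam[3]) ** 2

phi = np.zeros(4); phi[0] = phi[3] = 1 / np.sqrt(2)   # Bell state |Phi+>
bell = np.outer(phi, phi)

for p in (0.0, 0.5, 1.0):
    rho = p * bell + (1 - p) * np.eye(4) / 4
    delta = linear_entropy(rho)
    print(f"p={p:.1f}  linear entropy={delta:.3f}  "
          f"SDL={sdl_complexity(delta):.3f}  tangle={tangle(rho):.3f}")
```

Note how the SDL measure vanishes at both extremes (pure Bell state, p = 1, and maximally mixed state, p = 0) and peaks in between, which is the intended behaviour of a complexity measure that separates order from disorder.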


Network Information
Related Topics (5)
Time complexity: 36K papers, 879.5K citations (89% related)
Approximation algorithm: 23.9K papers, 654.3K citations (87% related)
Data structure: 28.1K papers, 608.6K citations (83% related)
Upper and lower bounds: 56.9K papers, 1.1M citations (83% related)
Computational complexity theory: 30.8K papers, 711.2K citations (83% related)
Performance
Metrics
No. of papers in the topic in previous years
Year    Papers
2022    2
2021    6
2020    10
2019    9
2018    10
2017    32