
Average-case complexity

About: Average-case complexity is the study of the computational resources an algorithm requires on average over inputs drawn from a probability distribution, as opposed to its worst-case behavior. Over the lifetime, 1749 publications have been published within this topic, receiving 44972 citations.


Papers
Proceedings ArticleDOI
01 Oct 2018
TL;DR: A maximum induced matching algorithm for permutation graphs is proposed that runs in $O(|V|k(G)\log\log(|V|))$ worst-case time and $O(|V|\sqrt{|V|}\log\log(|V|))$ average-case time, where $k(G)$ is the cardinality of the minimum clique cover set; both bounds improve on the best known algorithm.
Abstract: Let $G = (V,~E)$ be an undirected graph, where V is the vertex set and E is the edge set. A subset M of E is an induced matching of G if M is a matching of G and no two edges in M are joined by an edge of G. Finding a maximum induced matching is an NP-hard problem on general graphs, even on bipartite graphs. However, the problem can be solved in polynomial time in some special graph classes, such as weakly chordal, chordal, interval, and circular-arc graphs. In this paper, we introduce a maximum induced matching algorithm for permutation graphs with $O(|V|k(G)\log \log (|V|))$ worst-case time complexity and $O(|V|\sqrt {|V|}\log \log (|V|))$ average-case time complexity, where $k(G)$ is the cardinality of the minimum clique cover set. The approach is to reduce the size of the vertex set of $L(G)^{2}$ without changing the cardinality of its maximum independent set. Our algorithm has better time complexity than the best known algorithm in both the worst case and the average case.
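To make the definition above concrete, here is a minimal sketch (not from the paper; the function name and graph representation are illustrative) that checks whether a set of edges M is an induced matching of G: M must be a matching, and no edge of G may join endpoints of two distinct edges of M.

```python
# Hypothetical checker for the induced-matching property defined above.
# A graph is given as an iterable of 2-element edges; names are illustrative.

def is_induced_matching(edges, M):
    edge_set = {frozenset(e) for e in edges}
    M = [frozenset(e) for e in M]
    # M must be a subset of E.
    if any(e not in edge_set for e in M):
        return False
    for i in range(len(M)):
        for j in range(i + 1, len(M)):
            # Matching: edges of M are pairwise vertex-disjoint.
            if M[i] & M[j]:
                return False
            # Induced: no edge of G joins an endpoint of M[i] to one of M[j].
            for u in M[i]:
                for v in M[j]:
                    if frozenset((u, v)) in edge_set:
                        return False
    return True
```

For example, on the path a-b-c-d the matching {ab, cd} is not induced (edge bc joins them), while on the path a-b-c-d-e the matching {ab, de} is.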
Journal ArticleDOI
TL;DR: It is shown that the combined use of Cooley-Tukey and Good-Thomas decompositions in the reduced wise algorithm decreases the computational complexity of the resulting convolution algorithm as compared to that of the algorithms in the support set.
Abstract: A reduction operation and the design of a reduced wise algorithm over a set of known algorithms of unvarying complexity are addressed. A direct convolution algorithm and the best-known algorithms based on fast discrete orthogonal transforms (with Cooley-Tukey and Good-Thomas decompositions and the Rader algorithm for short lengths) are used as a support set of known algorithms. It is shown that their combined use in the reduced wise algorithm decreases the computational complexity of the resulting convolution algorithm as compared to that of the algorithms in the support set.
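For reference, the baseline against which such fast-transform decompositions are measured is direct convolution. The sketch below (not the paper's reduced algorithm) implements plain O(n^2) cyclic convolution; Cooley-Tukey, Good-Thomas, and Rader techniques reduce this to roughly O(n log n) by factoring the transform length.

```python
# Direct cyclic convolution: the O(n^2) baseline that fast discrete
# orthogonal transforms improve upon. Purely illustrative.

def cyclic_convolve(a, b):
    n = len(a)
    assert len(b) == n, "cyclic convolution needs equal lengths"
    # c[i] = sum_j a[j] * b[(i - j) mod n]
    return [sum(a[j] * b[(i - j) % n] for j in range(n)) for i in range(n)]
```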
Proceedings ArticleDOI
01 Oct 2013
TL;DR: An algorithm is introduced that performs with improved efficiency irrespective of the underlying data structure used, and that can determine equality with O(1) memory in a single O(n) pass.
Abstract: We present a new algorithm to test the equality of sets, which is highly efficient in terms of space and time complexity. Most existing methodologies make use of specialized data structures such as `tries' to perform operations on sets and provide probabilistic results. In this paper, we introduce an algorithm that can perform with improved efficiency irrespective of the underlying data structure used. When compared with algorithms in the existing literature, this method can determine equality with reduced memory requirements, O(1), and in a single pass, O(n). We show, both mathematically and through experiments, that the proposed method fares better than its predecessors.
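One common way to get a single-pass, O(1)-memory, probabilistic set-equality test (not necessarily the method of this paper, which is not reproduced here) is to combine per-element hashes with a commutative operation, so the fingerprint is independent of iteration order:

```python
# Illustrative sketch only: an order-independent set fingerprint built by
# XOR-combining per-element SHA-256 hashes. Equal sets always produce equal
# fingerprints; unequal sets collide only with negligible probability.
# Note: XOR cancels repeated elements, so this fingerprints sets, not lists.
import hashlib

def set_fingerprint(items):
    fp = 0
    for x in items:
        h = hashlib.sha256(repr(x).encode()).digest()
        fp ^= int.from_bytes(h, "big")  # commutative combine: order-independent
    return fp

def sets_probably_equal(a, b):
    return set_fingerprint(a) == set_fingerprint(b)
```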
Journal ArticleDOI
TL;DR: Three numerical algorithms are analyzed for computing the solution of the differential equation governing the covariance matrix of the states of a linear time-invariant dynamical system forced by white Gaussian noise.
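The equation in question is the matrix differential equation dP/dt = A P + P Aᵀ + Q for an LTI system dx/dt = A x + w with white Gaussian noise of intensity Q. As a generic illustration (the paper's three specific algorithms are not reproduced here), a forward-Euler propagation looks like:

```python
# Hedged sketch: forward-Euler integration of dP/dt = A*P + P*A^T + Q,
# the state-covariance differential equation for an LTI system driven by
# white Gaussian noise. Pure-Python matrix helpers for self-containment.

def mat_add(X, Y):
    return [[x + y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]

def mat_mul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def transpose(X):
    return [list(r) for r in zip(*X)]

def propagate_covariance(A, Q, P0, dt, steps):
    P = [row[:] for row in P0]
    for _ in range(steps):
        dP = mat_add(mat_add(mat_mul(A, P), mat_mul(P, transpose(A))), Q)
        P = mat_add(P, [[dt * v for v in row] for row in dP])
    return P
```

For a stable A the solution converges to the steady-state (Lyapunov) covariance satisfying A P + P Aᵀ + Q = 0; e.g. for A = -I and Q = 2I the limit is P = I.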
Proceedings ArticleDOI
11 Apr 2000
TL;DR: An algorithm for adaptive image partitioning achieving a range of rates under a computational complexity constraint is developed and results in a reduction of the computational complexity as compared to other known algorithms at the same rate-distortion operating point.
Abstract: Fractal image coding is a relatively new technique for compact representation of an image by exploiting self-similarities between parts of the image and other parts of it at a higher resolution. The various parts at different resolutions are consequences of a partition grid and splitting criterion applied to the image. We propose a partitioning criterion which takes into consideration the computational complexity of the encoding process. Based on this criterion, we develop an algorithm for adaptive image partitioning achieving a range of rates under a computational complexity constraint. The proposed algorithm results in a reduction of the computational complexity as compared to other known algorithms at the same rate-distortion operating point.
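To illustrate adaptive partitioning in general terms, the sketch below uses a simple variance-threshold quadtree split. This stands in for the paper's complexity-constrained criterion, which is not reproduced here; function names and the threshold `tau` are illustrative.

```python
# Generic quadtree partitioning sketch (NOT the paper's criterion):
# a square block is split into four quadrants when its pixel variance
# exceeds a threshold tau; near-uniform blocks are kept whole.

def quadtree_partition(img, x, y, size, tau, min_size=2):
    block = [img[r][x:x + size] for r in range(y, y + size)]
    vals = [v for row in block for v in row]
    mean = sum(vals) / len(vals)
    var = sum((v - mean) ** 2 for v in vals) / len(vals)
    if size <= min_size or var <= tau:
        return [(x, y, size)]          # keep block: (x, y, side length)
    h = size // 2
    return (quadtree_partition(img, x, y, h, tau, min_size)
            + quadtree_partition(img, x + h, y, h, tau, min_size)
            + quadtree_partition(img, x, y + h, h, tau, min_size)
            + quadtree_partition(img, x + h, y + h, h, tau, min_size))
```

A flat image yields a single block, while an image with a distinct region is split into smaller blocks around the detail, trading encoding work for fidelity.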

Network Information
Related Topics (5)
Time complexity
36K papers, 879.5K citations
89% related
Approximation algorithm
23.9K papers, 654.3K citations
87% related
Data structure
28.1K papers, 608.6K citations
83% related
Upper and lower bounds
56.9K papers, 1.1M citations
83% related
Computational complexity theory
30.8K papers, 711.2K citations
83% related
Performance
Metrics
No. of papers in the topic in previous years
Year    Papers
2022    2
2021    6
2020    10
2019    9
2018    10
2017    32