Topic

Average-case complexity

About: Average-case complexity is a research topic. Over its lifetime, 1749 publications have been published within this topic, receiving 44972 citations.


Papers
Proceedings ArticleDOI
26 Jun 2012
TL;DR: It is proved that an associative algebra A has minimal rank if and only if the Alder-Strassen bound is also tight for the multiplicative complexity of A, and that if A is local or superbasic, then every optimal quadratic computation for A is almost bilinear.
Abstract: We prove that an associative algebra $A$ has minimal rank if and only if the Alder-Strassen bound is also tight for the multiplicative complexity of $A$, that is, the multiplicative complexity of $A$ is $2 \dim A - t_A$, where $t_A$ denotes the number of maximal two-sided ideals of $A$. This generalizes a result by E. Feig, who proved this for division algebras. Furthermore, we show that if $A$ is local or superbasic, then every optimal quadratic computation for $A$ is almost bilinear.
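To make the bound concrete, here is a small worked instance in LaTeX for the diagonal algebra $A = k^n$; this example is chosen purely for illustration and is not taken from the paper.

```latex
% Worked instance of the Alder-Strassen bound R(A) >= 2 dim A - t_A,
% shown for the diagonal algebra A = k^n (an illustrative choice, not
% an example from the paper).
\documentclass{article}
\usepackage{amsmath}
\begin{document}
Let $A = k^n$ with componentwise multiplication. Then $\dim A = n$, and the
maximal two-sided ideals are $M_i = \{a \in A : a_i = 0\}$ for
$i = 1, \dots, n$, so $t_A = n$. The Alder-Strassen bound gives
\[
  R(A) \;\ge\; 2\dim A - t_A \;=\; 2n - n \;=\; n,
\]
and the componentwise products $c_i = a_i b_i$ use exactly $n$ bilinear
multiplications, so the bound is tight and $A$ has minimal rank.
\end{document}
```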
Book ChapterDOI
01 Jan 2012
TL;DR: The simulation results indicate that the proposed Block-FFT algorithm exhibits the same BER performance as the other two algorithms while sharply reducing computational complexity and memory requirements, and it is easier to implement.
Abstract: A new method to implement the Block-FFT algorithm is investigated, based on a comparative study of how the inverse of the system matrix, the crucial part of joint data detection algorithms, is computed. The computational complexity, memory requirements, and BER performance of the proposed Block-FFT algorithm are compared to those of the standard Cholesky decomposition algorithm and the approximate Cholesky decomposition algorithm. The simulation results indicate that the proposed Block-FFT algorithm exhibits the same BER performance as the other two algorithms while sharply reducing computational complexity and memory requirements. Furthermore, it is easier to implement.
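The abstract does not reproduce the algorithm itself. The sketch below only illustrates the general idea behind FFT-based inversion: when the system matrix is (approximated as) circulant, FFT diagonalization can replace a Cholesky factorization, cutting a direct O(n^3) solve to O(n log n). The matrix sizes and values here are invented for the demonstration.

```python
# Illustrative sketch (not the paper's code): solving a circulant system
# via FFT diagonalization instead of Cholesky factorization. Treating the
# joint-detection system matrix as circulant is an assumption made here
# for demonstration purposes.
import numpy as np

n = 8
c = np.zeros(n)
c[0], c[1], c[-1] = 4.0, 1.0, 1.0   # first column of an SPD symmetric circulant
b = np.arange(1.0, n + 1.0)

# Dense circulant matrix: C[i, j] = c[(i - j) % n]
C = np.array([[c[(i - j) % n] for j in range(n)] for i in range(n)])

# Cholesky route, O(n^3): C = L L^T, then two triangular solves
L = np.linalg.cholesky(C)
x_chol = np.linalg.solve(L.T, np.linalg.solve(L, b))

# FFT route, O(n log n): C = F^{-1} diag(fft(c)) F, so x = ifft(fft(b) / fft(c))
x_fft = np.fft.ifft(np.fft.fft(b) / np.fft.fft(c)).real

print(np.allclose(x_chol, x_fft))  # True: both routes give the same solution
```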
Journal ArticleDOI
TL;DR: This paper defines restriction complexity, computes its values for some functions, and determines its range in the form of upper and lower bounds, showing that restriction complexity is close to the cardinality-of-support measure in an almost-all sense, as defined by Abu-Mostafa.
Abstract: The complexity of Boolean functions can be computed in many ways, and various complexity measures exist based on different models of representation of the Boolean function. These measures range from very coarse and simple to very fine and hard to compute. The measure introduced here, called restriction complexity, is based on the number of restrictions of a Boolean function. In this paper we define restriction complexity, compute its values for some functions, and determine its range in the form of upper and lower bounds. We also show that restriction complexity is close to the cardinality-of-support measure in an almost-all sense, as defined by Abu-Mostafa.
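The paper's precise definition of restriction complexity is not given in the abstract. The following Python sketch shows one plausible reading, counting the distinct subfunctions obtained by fixing subsets of variables to constants; both this definition and the helper restriction_count are hypothetical and may differ from the paper's measure.

```python
# Hypothetical sketch: one plausible reading of a restriction-based measure.
# A restriction fixes a subset of the variables to constants; here we count
# the distinct subfunctions this produces. The paper's exact definition of
# "restriction complexity" may differ; this is only an illustration.
from itertools import combinations, product

def restriction_count(f, n):
    """Count distinct subfunctions of f over all restrictions of n variables."""
    subfunctions = set()
    for k in range(n + 1):                         # size of the fixed subset
        for fixed in combinations(range(n), k):    # which variables are fixed
            for vals in product((0, 1), repeat=k): # constants they are fixed to
                assign = dict(zip(fixed, vals))
                free = [i for i in range(n) if i not in assign]
                pos = {v: j for j, v in enumerate(free)}
                # Truth table of the restricted function over the free variables
                table = tuple(
                    f([assign[i] if i in assign else fv[pos[i]]
                       for i in range(n)])
                    for fv in product((0, 1), repeat=len(free))
                )
                # Key on (free variables, table): a modeling choice made here
                subfunctions.add((tuple(free), table))
    return len(subfunctions)

xor3 = lambda x: x[0] ^ x[1] ^ x[2]
and3 = lambda x: x[0] & x[1] & x[2]
print(restriction_count(xor3, 3), restriction_count(and3, 3))
```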
Proceedings ArticleDOI
13 Nov 2014
TL;DR: A new sorting technique based on the divide & conquer approach, named the Fuse sort algorithm, is presented: a comparison-based sort with O(n log log n) time and linear space.
Abstract: Computational complexity is a fundamental research area in computer science that has attracted the interest of many researchers, and a vast number of sorting algorithms have been proposed in the past. Existing techniques sort a large number of elements in O(n log n) time in the average case. This paper presents a new sorting technique based on the divide & conquer approach, named the Fuse sort algorithm, a comparison-based sort with O(n log log n) time and linear space. An a priori and mathematical analysis of the proposed algorithm is given, and a case study against merge sort is performed based on several factors.
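The abstract does not describe Fuse sort itself, so no attempt is made to reproduce it here. For reference, the sketch below is a minimal merge sort, the classic O(n log n) divide & conquer comparison sort that the paper's case study uses as its baseline.

```python
# Minimal merge sort: the O(n log n) divide & conquer baseline the paper's
# case study compares against. Fuse sort itself is not specified in the
# abstract and is not reproduced here.
def merge_sort(a):
    """Sort a list using top-down merge sort."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    # Merge the two sorted halves
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]

print(merge_sort([5, 2, 9, 1, 5, 6]))  # [1, 2, 5, 5, 6, 9]
```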
Proceedings ArticleDOI
14 Nov 2005
TL;DR: A new k-DT algorithm is proposed that divides the problem into D 1-dimensional problems; its accuracy and computational complexity are compared to those of the existing raster-scanning and propagation approaches.
Abstract: The signed k-distance transformation (k-DT) computes the k nearest prototypes from each location on a discrete regular grid within a given D-dimensional volume. We propose a new k-DT algorithm that divides the problem into D 1-dimensional problems and compare its accuracy and computational complexity to those of the existing raster-scanning and propagation approaches.
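The decomposed algorithm is not detailed in the abstract. As a reference point, the sketch below is a brute-force k-DT that exhaustively computes, for every cell of a regular grid, the indices of the k nearest prototypes; the function name, parameters, and test values are illustrative, not from the paper.

```python
# Brute-force k-DT baseline (not the paper's decomposed algorithm): for each
# cell of a D-dimensional grid, find the k nearest prototypes by exhaustive
# distance computation. The paper's method splits this into D 1-dimensional
# passes; this O(#cells * #prototypes) version only defines the output a
# faster algorithm must reproduce.
import numpy as np

def brute_force_kdt(shape, prototypes, k):
    """Return indices of the k nearest prototypes for every grid cell.

    shape      : grid dimensions, e.g. (32, 32)
    prototypes : (m, D) array of prototype coordinates
    k          : number of nearest prototypes to keep
    """
    grid = np.stack(np.meshgrid(*[np.arange(s) for s in shape],
                                indexing="ij"), axis=-1)    # (*shape, D)
    # Squared Euclidean distance from every cell to every prototype
    diff = grid[..., None, :] - prototypes                  # (*shape, m, D)
    d2 = (diff ** 2).sum(axis=-1)                           # (*shape, m)
    return np.argsort(d2, axis=-1)[..., :k]                 # (*shape, k)

protos = np.array([[2.0, 3.0], [10.0, 10.0], [20.0, 5.0]])
nearest = brute_force_kdt((32, 32), protos, k=2)
print(nearest[0, 0])  # indices of the 2 prototypes nearest to cell (0, 0)
```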

Network Information
Related Topics (5)
Time complexity: 36K papers, 879.5K citations (89% related)
Approximation algorithm: 23.9K papers, 654.3K citations (87% related)
Data structure: 28.1K papers, 608.6K citations (83% related)
Upper and lower bounds: 56.9K papers, 1.1M citations (83% related)
Computational complexity theory: 30.8K papers, 711.2K citations (83% related)
Performance Metrics
No. of papers in the topic in previous years:

Year    Papers
2022    2
2021    6
2020    10
2019    9
2018    10
2017    32