
Showing papers on "Average-case complexity published in 2019"


Journal ArticleDOI
TL;DR: It is shown that a certain “product” lower bound method of Impagliazzo and Williams (CCC 2010) fails to capture P^NP communication complexity up to polynomial factors, which answers a question of Papakonstantinou, Scheder, and Song (CCC 2014).
Abstract: We prove that the P^NP-type query complexity (alternatively, decision list width) of any Boolean function f is quadratically related to the P^NP-type communication complexity of a lifted version of f. As an application, we show that a certain “product” lower bound method of Impagliazzo and Williams (CCC 2010) fails to capture P^NP communication complexity up to polynomial factors, which answers a question of Papakonstantinou, Scheder, and Song (CCC 2014).
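For readers unfamiliar with the decision-list model referenced above: a decision list is an ordered sequence of (condition, output-bit) rules evaluated top to bottom, returning the bit of the first rule that fires. The precise width measure used for P^NP query complexity is defined in the paper; the sketch below (with hypothetical rules) only illustrates the model itself.

```python
def eval_decision_list(rules, default, x):
    """Evaluate a decision list: return the output bit of the first
    rule whose condition accepts input x, or the default bit."""
    for cond, bit in rules:
        if cond(x):
            return bit
    return default

# Hypothetical 3-bit example; each rule is a (predicate, output-bit) pair.
rules = [
    (lambda x: x[0] == 1 and x[1] == 1, 1),  # fires on inputs starting 11...
    (lambda x: x[2] == 1, 0),
]
print(eval_decision_list(rules, 1, (1, 1, 0)))  # first rule fires -> 1
print(eval_decision_list(rules, 1, (0, 0, 1)))  # second rule fires -> 0
```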

31 citations


Proceedings ArticleDOI
19 Mar 2019
TL;DR: In this article, the problem of counting k-cliques in s-uniform Erdős–Rényi hypergraphs G(n, c, s) with edge density c is considered, and it is shown that its fine-grained average-case complexity can be based on its worst-case complexity.
Abstract: The complexity of clique problems on Erdős–Rényi random graphs has become a central topic in average-case complexity. Algorithmic phase transitions in these problems have been shown to have broad connections, ranging from mixing of Markov chains and statistical physics to information-computation gaps in high-dimensional statistics. We consider the problem of counting k-cliques in s-uniform Erdős–Rényi hypergraphs G(n, c, s) with edge density c and show that its fine-grained average-case complexity can be based on its worst-case complexity. We prove the following:

• Dense Erdős–Rényi hypergraphs: Counting k-cliques on G(n, c, s) with k and c constant matches its worst-case complexity up to a polylog(n) factor. Assuming ETH, it takes n^Ω(k) time to count k-cliques in G(n, c, s) if k and c are constant.

• Sparse Erdős–Rényi hypergraphs: When c = Θ(n^−α), for each fixed α our reduction yields a different average-case phase diagram depicting a tradeoff between runtime and k. Assuming the best known worst-case algorithms are optimal, in the graph case s = 2 we establish that the exponent in n of the optimal running time for k-clique counting in G(n, c, s) is ωk/3 − Cα·binom(k, 2) + O_{k,α}(1), where ω/9 ≤ C ≤ 1 and ω is the matrix multiplication constant. In the hypergraph case s ≥ 3, we show a lower bound with exponent k − α·binom(k, s) + O_{k,α}(1), which surprisingly is tight against algorithmic achievability exactly for the set of c above the Erdős–Rényi k-clique percolation threshold.

Our reduction yields the first known average-case hardness result on Erdős–Rényi hypergraphs based on a worst-case hardness assumption. We also analyze several natural algorithms for counting k-cliques in G(n, c, s) that establish our upper bounds in the sparse case c = Θ(n^−α).
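For orientation, the naive baseline that the n^Ω(k) lower bound is measured against simply enumerates all k-subsets of vertices. A minimal Python sketch for the graph case s = 2 (brute force only; this is not the paper's reduction or its faster sparse algorithms):

```python
import itertools
import random

def sample_gnp(n, c, seed=0):
    """Sample an Erdős–Rényi graph G(n, c) as a set of frozenset edges."""
    rng = random.Random(seed)
    return {frozenset(e) for e in itertools.combinations(range(n), 2)
            if rng.random() < c}

def count_k_cliques(n, edges, k):
    """Naive ~n^k-time k-clique counter: check every k-subset of vertices."""
    return sum(
        1
        for verts in itertools.combinations(range(n), k)
        if all(frozenset(p) in edges for p in itertools.combinations(verts, 2))
    )

edges = sample_gnp(12, 0.5)
print(count_k_cliques(12, edges, 3))  # number of triangles in this sample
```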

28 citations


Proceedings ArticleDOI
24 Jun 2019
TL;DR: It is shown that complexity analysis of probabilistic higher-order functional programs can be carried out compositionally by way of a type system, and that any average-case polynomial-time Turing machine can be encoded as a term typable in $\ell\pmb{\mathsf{RPCF}}$.
Abstract: We show that complexity analysis of probabilistic higher-order functional programs can be carried out compositionally by way of a type system. The introduced type system is a significant extension of refinement types. On the one hand, the presence of probabilistic effects requires adopting a form of dynamic distribution type, subject to a coupling-based subtyping discipline. On the other hand, recursive definitions are proved terminating by way of Lyapunov ranking functions. We prove not only that the obtained type system, called $\ell \pmb{\mathsf{RPCF}}$ , provides a sound methodology for average-case complexity analysis, but also that it is extensionally complete, in the sense that any average-case polynomial-time Turing machine can be encoded as a term typable in $\ell\pmb{\mathsf{RPCF}}$ .

18 citations


Journal ArticleDOI
TL;DR: Computational complexity bounds for various classes of functions computed by cost register automata are given.

6 citations


Journal ArticleDOI
TL;DR: It is proved that the average number of comparisons for both presented algorithms, ITS and BQS, is smaller than for a correct implementation of the BS algorithm, which is widely used in the literature.

5 citations



Posted Content
TL;DR: The average-case complexity of a branch-and-bound algorithm for the Minimum Dominating Set problem in random graphs in the G(n,p) model is studied, and phase transitions between subexponential and exponential average-case complexities are identified.
Abstract: We study the average-case complexity of a branch-and-bound algorithm for the Minimum Dominating Set problem in random graphs in the G(n,p) model. We identify phase transitions between subexponential and exponential average-case complexities, depending on the growth of the probability p with respect to the number n of nodes.
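The paper's specific branching rule is not reproduced in this listing; the following is a generic branch-and-bound sketch for Minimum Dominating Set on a G(n,p) sample, shown only to illustrate the algorithm family being analyzed. It stores closed neighborhoods, branches on an undominated vertex (some vertex of its closed neighborhood must be chosen), and prunes against the incumbent solution.

```python
import itertools
import random

def sample_gnp_adj(n, p, seed=0):
    """Sample G(n, p) as closed neighborhoods: adj[v] contains v and its neighbors."""
    rng = random.Random(seed)
    adj = {v: {v} for v in range(n)}
    for u, w in itertools.combinations(range(n), 2):
        if rng.random() < p:
            adj[u].add(w)
            adj[w].add(u)
    return adj

def min_dominating_set(adj):
    """Exact branch-and-bound: returns a minimum dominating set."""
    best = [set(adj)]  # trivial incumbent: all vertices dominate the graph

    def branch(chosen, dominated):
        if len(chosen) >= len(best[0]):
            return  # prune: this branch cannot beat the incumbent
        undom = next((v for v in adj if v not in dominated), None)
        if undom is None:
            best[0] = set(chosen)  # every vertex is dominated
            return
        for cand in adj[undom]:  # some vertex of N[undom] must be in the set
            branch(chosen | {cand}, dominated | adj[cand])

    branch(set(), set())
    return best[0]

star = {0: {0, 1, 2, 3, 4}, 1: {1, 0}, 2: {2, 0}, 3: {3, 0}, 4: {4, 0}}
print(min_dominating_set(star))  # the center alone dominates the star
```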

4 citations


Posted Content
TL;DR: This work considers the problem of counting k-cliques in s-uniform Erdős–Rényi hypergraphs G(n, c, s) with edge density c and proves that its fine-grained average-case complexity can be based on its worst-case complexity.
Abstract: We consider the problem of counting $k$-cliques in $s$-uniform Erdős–Rényi hypergraphs $G(n,c,s)$ with edge density $c$, and show that its fine-grained average-case complexity can be based on its worst-case complexity. We prove the following:

1. Dense Erdős–Rényi graphs and hypergraphs: Counting $k$-cliques on $G(n,c,s)$ with $k$ and $c$ constant matches its worst-case time complexity up to a $\mathrm{polylog}(n)$ factor. Assuming randomized ETH, it takes $n^{\Omega(k)}$ time to count $k$-cliques in $G(n,c,s)$ if $k$ and $c$ are constant.

2. Sparse Erdős–Rényi graphs and hypergraphs: When $c = \Theta(n^{-\alpha})$, we give several algorithms exploiting the sparsity of $G(n, c, s)$ that are faster than the best known worst-case algorithms. Complementing this, based on a fine-grained worst-case assumption, our results imply a different average-case phase diagram for each fixed $\alpha$ depicting a tradeoff between a runtime lower bound and $k$. Surprisingly, in the hypergraph case ($s \ge 3$), these lower bounds are tight against our algorithms exactly when $c$ is above the Erdős–Rényi $k$-clique percolation threshold.

This is the first worst-case-to-average-case hardness reduction for a problem on Erdős–Rényi hypergraphs that we are aware of. We also give a variant of our result for computing the parity of the $k$-clique count that tolerates higher error probability.

2 citations


Journal ArticleDOI
21 Nov 2019
TL;DR: The robustness of the average-case O_emp(n log₂ n) complexity of smart sort is conjectured as a result of studies with various regression models and factorial design experiments.
Abstract: This research article is a systematic study exploring the parameterised behaviour of smart sort, a comparison-based sorting algorithm. Our observations for quick sort led us to conjecture that, for sufficiently large samples of fixed size, the average-case runtime complexity is y_avg(n, t_d) = O_emp(t_d), where y_avg denotes the average complexity with parameters n and t_d denoting the input size and the frequency of an element (tie density), respectively. The notation O_emp (also called empirical-O) is the statistical bound estimate obtained by running computer experiments. The performance of heap sort is better for discrete inputs with low k values (or, equivalently, high t_d values), and the runtime reaches its maximum beyond a threshold k. These two behaviours are opposite. Smart sort, which is designed by combining the key routines of the standard quick sort and heap sort algorithms, is expected to behave optimally with respect to the different input parameters. The robustness of the average-case O_emp(n log₂ n) complexity of smart sort is conjectured as a result of studies with various regression models and factorial design experiments.
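Smart sort itself is not specified in this listing, but the empirical-O methodology described above can be illustrated by counting comparisons over random trials with ties and normalizing by n log₂ n. The sketch below uses Python's built-in sort as a stand-in comparison sort; the tie-density parameterisation is an assumption for illustration, not the paper's experimental design.

```python
import math
import random

class Counted:
    """Wrap a value so comparisons made by sorted() can be counted."""
    count = 0
    def __init__(self, v):
        self.v = v
    def __lt__(self, other):
        Counted.count += 1
        return self.v < other.v

def avg_comparisons(n, tie_density, trials=20, seed=0):
    """Average comparisons to sort n elements; higher tie_density
    means fewer distinct keys, i.e. more tied elements."""
    rng = random.Random(seed)
    k = max(1, round(n / tie_density))  # number of distinct keys
    total = 0
    for _ in range(trials):
        data = [Counted(rng.randrange(k)) for _ in range(n)]
        Counted.count = 0
        sorted(data)
        total += Counted.count
    return total / trials

# If the average cost is Theta(n log2 n), this ratio should stabilise.
for n in (1000, 4000, 16000):
    ratio = avg_comparisons(n, tie_density=4) / (n * math.log2(n))
    print(n, round(ratio, 3))
```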

1 citation