
Showing papers by "Rezaul Chowdhury published in 2000"


Journal ArticleDOI
TL;DR: A simplified complexity analysis of the BOTTOM-UP-HEAPSORT variant from a different viewpoint is presented, showing that it requires n log₂ n + n element comparisons in the worst case and n log₂ n + 0.42n comparisons on average.
Abstract: McDiarmid and Reed (1989) presented a variant of BOTTOM-UP-HEAPSORT which requires n log₂ n + n element comparisons (for n = 2^(h+1) − 1) in the worst case, but needs an extra storage of n bits. Ingo Wegener (1992) analyzed the average- and worst-case complexity of the algorithm, but his analysis is very long and complex. In this paper we present a simplified complexity analysis of the same algorithm from a different viewpoint. For n = 2^(h+1) − 1, we show that it requires n log₂ n + n element comparisons in the worst case and n log₂ n + 0.42n comparisons on average.

3 citations
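For context, the core idea behind BOTTOM-UP-HEAPSORT is that after the root of a max-heap is removed, its replacement first follows the path of larger children all the way to a leaf (one comparison per level instead of two), and is then climbed back up to its final position. A minimal sketch of that strategy, assuming a plain array-based max-heap (not the McDiarmid–Reed bit-augmented variant analyzed in the paper):

```python
def bottom_up_heapsort(a):
    """Sort a list in place using the bottom-up sift strategy:
    descend along larger children to a leaf, then climb back up."""
    a = list(a)
    n = len(a)

    def leaf_search(i, end):
        # Follow the larger child at each level down to a leaf.
        j = i
        while 2 * j + 2 < end:
            j = 2 * j + 1 if a[2 * j + 1] > a[2 * j + 2] else 2 * j + 2
        if 2 * j + 1 < end:  # node with a single (left) child
            j = 2 * j + 1
        return j

    def sift(i, end):
        x = a[i]
        j = leaf_search(i, end)
        # Climb up until a position whose element is >= x.
        while a[j] < x:
            j = (j - 1) // 2
        # Shift the path elements up by one level and place x.
        while j > i:
            a[j], x = x, a[j]
            j = (j - 1) // 2
        a[i] = x

    for i in range(n // 2 - 1, -1, -1):  # build a max-heap
        sift(i, n)
    for end in range(n - 1, 0, -1):      # repeatedly extract the max
        a[0], a[end] = a[end], a[0]
        sift(0, end)
    return a
```

The saving comes from the climb-back phase: in a heap, the displaced element (a former leaf) tends to sink nearly to the bottom, so the upward search is short on average.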


Posted Content
TL;DR: A new data structure for the double-ended priority queue, called the min-max fine heap, which combines the techniques used in the fine heap and the traditional min-max heap; the analysis of its standard operations indicates that the new structure outperforms the traditional one.
Abstract: In this paper we present a new data structure for the double-ended priority queue, called the min-max fine heap, which combines the techniques used in the fine heap and the traditional min-max heap. The standard operations on the proposed structure are also presented, and their analysis indicates that the new structure outperforms the traditional one.

2 citations
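For context, the traditional min-max heap that the proposed structure builds on alternates "min" and "max" levels: a node on an even (min) level is no larger than any of its descendants, and a node on an odd (max) level is no smaller, so both the minimum (the root) and the maximum (one of the root's children) are found in O(1). A small sketch of that ordering invariant, assuming the usual implicit array layout (this illustrates the traditional structure, not the paper's min-max fine heap):

```python
def is_min_max_heap(a):
    """Check the min-max heap invariant on an implicit array heap:
    even-depth nodes are <= all descendants, odd-depth nodes are >=."""
    n = len(a)

    def descendants(i):
        for c in (2 * i + 1, 2 * i + 2):
            if c < n:
                yield c
                yield from descendants(c)

    for i in range(n):
        depth = (i + 1).bit_length() - 1  # depth of node i in the tree
        for d in descendants(i):
            if depth % 2 == 0 and a[i] > a[d]:
                return False
            if depth % 2 == 1 and a[i] < a[d]:
                return False
    return True
```

In such a heap the minimum is `a[0]` and the maximum is `max(a[1], a[2])` (for size > 1), which is what makes it a natural base for a double-ended priority queue.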


Journal ArticleDOI
TL;DR: A new mergesort algorithm is presented which can sort n (= 2^(h+1) − 1) elements using no more than n log₂(n+1) − (13/12)n − 1 element comparisons in the worst case.
Abstract: In this paper, we present a new mergesort algorithm which can sort n (= 2^(h+1) − 1) elements using no more than n log₂(n+1) − (13/12)n − 1 element comparisons in the worst case. The algorithm includes a heap (fine heap) creation phase as a pre-processing step; then, for each internal node v, the left and right subheaps are merged into a sorted list of the elements under that node. Experimental results show that the algorithm requires only n log₂(n+1) − 1.2n element comparisons on average, but it requires extra space for n LINK fields.
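The merging scheme described above can be illustrated with a simplified sketch: build a max-heap, then produce the sorted list of each subtree by merging the children's sorted lists and appending the node's own value (the subtree maximum). This is only illustrative, using an ordinary array-based max-heap in place of the paper's fine heap with LINK fields:

```python
def heap_mergesort(a):
    """Sort by building a max-heap, then merging each node's
    left and right subtrees bottom-up into sorted lists."""
    a = list(a)
    n = len(a)

    def sift_down(i, end):
        while True:
            big, l, r = i, 2 * i + 1, 2 * i + 2
            if l < end and a[l] > a[big]:
                big = l
            if r < end and a[r] > a[big]:
                big = r
            if big == i:
                return
            a[i], a[big] = a[big], a[i]
            i = big

    for i in range(n // 2 - 1, -1, -1):  # heap creation (pre-processing)
        sift_down(i, n)

    def merge(xs, ys):
        out, i, j = [], 0, 0
        while i < len(xs) and j < len(ys):
            if xs[i] <= ys[j]:
                out.append(xs[i]); i += 1
            else:
                out.append(ys[j]); j += 1
        return out + xs[i:] + ys[j:]

    def sorted_subtree(i):
        if i >= n:
            return []
        # Merge the two subheaps; the node itself is the subtree max.
        return merge(sorted_subtree(2 * i + 1),
                     sorted_subtree(2 * i + 2)) + [a[i]]

    return sorted_subtree(0)
```

The comparison saving comes from the heap phase: after it, every node already dominates its subtree, so each merge deals with lists that are "almost sorted" relative to a plain top-down mergesort.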

Journal ArticleDOI
TL;DR: An iterative algorithm is presented for calculating the square root of a real number with an arbitrary order of convergence, using formulae derived from the binomial theorem to reduce the number of division operations required.
Abstract: In this paper an iterative algorithm is presented for calculating the square root of a real number with an arbitrary order of convergence, using formulae derived by applying the binomial theorem. The primary objective is to reduce the number of division operations required.
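One way to realize such a scheme, as an illustrative reconstruction rather than the paper's exact formulae: write √a = x·(1 + e)^(1/2) with e = a/x² − 1 for the current estimate x, and truncate the binomial expansion of (1 + e)^(1/2) after k terms. The truncation error is O(e^k), giving order-k convergence, while each iteration needs only a single division:

```python
def sqrt_binomial(a, order=3, iterations=6):
    """Approximate sqrt(a) by iterating a truncated binomial series.
    Hypothetical reconstruction: with e = a/x**2 - 1, we have
    sqrt(a) = x * (1 + e)**0.5; keeping `order` terms of the
    binomial expansion gives convergence of that order."""
    x = a if a >= 1 else 1.0          # crude initial guess keeps |e| < 1
    for _ in range(iterations):
        e = a / (x * x) - 1.0         # the only division per iteration
        series, t = 1.0, 1.0
        for k in range(1, order):
            # Build C(1/2, k) * e**k incrementally:
            # C(1/2, k) = C(1/2, k-1) * (1/2 - (k-1)) / k
            t *= (0.5 - (k - 1)) / k * e
            series += t
        x *= series
    return x
```

The design point matches the abstract's stated objective: raising `order` adds only cheap multiplications per iteration, while the division count stays at one.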