Journal ArticleDOI

A novel compression algorithm for electrocardiogram signals based on the linear prediction of the wavelet coefficients

TL;DR: A new algorithm for electrocardiogram (ECG) compression encodes the linearly predicted residuals of the signal's wavelet coefficients, reducing the bit rate while keeping the reconstructed signal's distortion at a clinically acceptable level.
About: This article was published in Digital Signal Processing on 2003-10-01 and has received 97 citations to date. It focuses on the topics: Wavelet transform & Stationary wavelet transform.
Citations
Journal ArticleDOI
Sabah M. Ahmed1
01 Sep 2008
TL;DR: An ECG compressor based on the optimal selection of wavelet filters and threshold levels in different subbands, achieving maximum data-volume reduction while guaranteeing reconstruction quality; the added computational complexity is the price paid for the improved compression performance.
Abstract: Although most of the theoretical and implementation aspects of wavelet-based algorithms for ElectroCardioGram (ECG) signal compression are well studied, many issues related to the choice of wavelet filters and the selection of threshold levels remain unresolved. Utilizing the optimal mother wavelet leads to localization and maximization of the wavelet coefficients' values in the wavelet domain. This paper presents an ECG compressor based on the optimal selection of wavelet filters and threshold levels in different subbands, achieving maximum data-volume reduction while guaranteeing reconstruction quality. The proposed algorithm starts by segmenting the ECG signal into frames, where each frame is decomposed into m subbands through optimized wavelet filters. The resulting wavelet coefficients are thresholded: those with absolute values below the specified threshold levels in all subbands are deleted, and the remaining coefficients are encoded with a modified version of the run-length coding scheme. Before encoding, the threshold levels are adjusted in an optimum manner until the predefined compression ratio and signal quality are achieved. Extensive experimental tests were performed by applying the algorithm to ECG records from the MIT-BIH Arrhythmia Database [1]. The compression ratio (CR), the percent root-mean-square difference (PRD), and the zero-mean percent root-mean-square difference (PRD1) are used to measure the algorithm's performance (high CR with excellent reconstruction quality). From the obtained results, it can be deduced that the optimized, signal-dependent wavelet outperforms the standard Daubechies and Coiflet wavelets. However, the computational complexity of the proposed technique is the price paid for the improvement in the compression performance measures.
Finally, it should be noted that the proposed method offers flexible control over the quality of the reconstructed signals and the volume of the compressed signals by setting target PRD and CR values a priori, respectively.
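The threshold-then-run-length pipeline described above can be sketched as follows. This is a minimal illustration only: a one-level Haar decomposition stands in for the paper's optimized, signal-dependent filters, and `thr` is an arbitrary demonstration threshold rather than an optimized one.

```python
import numpy as np

def haar_analysis(x):
    # One-level Haar decomposition (stand-in for the optimized wavelet filters)
    a = (x[0::2] + x[1::2]) / np.sqrt(2)  # approximation subband
    d = (x[0::2] - x[1::2]) / np.sqrt(2)  # detail subband
    return a, d

def threshold_and_rle(coeffs, thr):
    # Zero out small coefficients, then run-length encode the zero runs
    kept = np.where(np.abs(coeffs) >= thr, coeffs, 0.0)
    encoded, run = [], 0
    for c in kept:
        if c == 0.0:
            run += 1
        else:
            encoded.append((run, float(c)))  # (zeros preceding, coefficient value)
            run = 0
    if run:
        encoded.append((run, None))          # trailing run of zeros
    return encoded

x = np.array([2.0, 2.1, 2.0, 8.0, 8.1, 2.0, 2.0, 2.0])  # toy frame
a, d = haar_analysis(x)
code = threshold_and_rle(d, thr=0.5)
```

In a full compressor, `thr` would be tuned per subband until the target CR and PRD are met, and the `(run, value)` pairs would then be entropy coded.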

2 citations

Journal ArticleDOI
TL;DR: A hybrid two-stage ECG signal compressor based on DCT and DWT is proposed, offering an optimized CR and PRD suitable for most monitoring and diagnostic applications.
Abstract: Over the years, a variety of linear transforms have been developed, including the discrete Fourier transform (DFT), discrete wavelet transform (DWT), and discrete cosine transform (DCT), each with its own advantages and disadvantages. Among these techniques, DWT has proven to be very efficient for ECG signal coding. It is proposed to develop a hybrid two-stage ECG signal compressor based on DCT and DWT. The proposed method is a hybrid ECG compression technique based on wavelet transformation of the DCT coefficients of the signal. The interaction of DCT analysis with DWT transformations, signal thresholding, and coding are a few of the many outstanding challenges in ECG compression. The proposed method offers an optimized CR and PRD, which would be suitable for most monitoring and diagnostic applications. Keywords: DCT, DWT, Wavelet, Image Processing, Signal Compression.
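The two-stage cascade outlined in the abstract (a DCT for energy compaction, followed by a wavelet transform of the DCT coefficients) can be sketched as follows. The toy frame and the one-level Haar step are illustrative assumptions, not the paper's exact configuration; both stages are orthonormal, so the signal energy is preserved through the cascade.

```python
import numpy as np

def dct2(x):
    # Orthonormal DCT-II written out directly from its definition
    N = len(x)
    n = np.arange(N)
    C = np.cos(np.pi * (2 * n[None, :] + 1) * n[:, None] / (2 * N))
    X = C @ x
    X[0] *= np.sqrt(1.0 / N)
    X[1:] *= np.sqrt(2.0 / N)
    return X

def haar_step(x):
    # Stage two: one Haar DWT level applied to the DCT coefficients
    return (x[0::2] + x[1::2]) / np.sqrt(2), (x[0::2] - x[1::2]) / np.sqrt(2)

x = np.cos(2 * np.pi * np.arange(8) / 8)  # toy "ECG" frame
stage1 = dct2(x)                          # stage 1: DCT energy compaction
approx, detail = haar_step(stage1)        # stage 2: DWT of DCT coefficients
```

A real compressor would follow this with thresholding and coding of `approx`/`detail`, exactly as in the single-transform schemes.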

2 citations

Proceedings ArticleDOI
09 Jul 2009
TL;DR: This paper presents a discussion of ECG signal compression using the basis pursuit (BP) approach applied to several overcomplete wavelet dictionaries, based on an “optimal” superposition of dictionary elements obtained by minimizing the l1 norm.
Abstract: This paper presents a discussion of ECG signal compression using the basis pursuit (BP) approach applied to several overcomplete wavelet dictionaries. The compression is based on an “optimal” superposition of dictionary elements, obtained by minimizing the l1 norm. The best results have been obtained with the Coiflet 4 dictionary.
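The l1 minimization at the heart of basis pursuit can be sketched as a linear program via the standard split of the coefficient vector into positive and negative parts. This assumes SciPy is available, and the tiny identity-plus-constant dictionary is a made-up example, not one of the paper's overcomplete wavelet dictionaries.

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(D, s):
    # min ||a||_1  s.t.  D a = s, via the split a = u - v with u, v >= 0
    n = D.shape[1]
    c = np.ones(2 * n)                   # objective: sum(u) + sum(v) = ||a||_1
    A_eq = np.hstack([D, -D])            # D(u - v) = s
    res = linprog(c, A_eq=A_eq, b_eq=s, bounds=[(0, None)] * (2 * n))
    uv = res.x
    return uv[:n] - uv[n:]

# Overcomplete dictionary: four identity atoms plus one constant atom
D = np.hstack([np.eye(4), np.ones((4, 1)) / 2.0])
s = np.array([0.5, 0.5, 0.5, 0.5])
a = basis_pursuit(D, s)
```

Basis pursuit picks the single constant atom (l1 norm 1) over the four identity atoms (l1 norm 2), illustrating why it yields sparse superpositions over redundant dictionaries.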

2 citations

References
Journal ArticleDOI
John Makhoul1
01 Apr 1975
TL;DR: This paper gives an exposition of linear prediction in the analysis of discrete signals as a linear combination of its past values and present and past values of a hypothetical input to a system whose output is the given signal.
Abstract: This paper gives an exposition of linear prediction in the analysis of discrete signals. The signal is modeled as a linear combination of its past values and of present and past values of a hypothetical input to a system whose output is the given signal. In the frequency domain, this is equivalent to modeling the signal spectrum by a pole-zero spectrum. The major part of the paper is devoted to all-pole models. The model parameters are obtained by a least-squares analysis in the time domain. Two methods result, depending on whether the signal is assumed to be stationary or nonstationary. The same results are then derived in the frequency domain. The resulting spectral matching formulation allows for the modeling of selected portions of a spectrum, for arbitrary spectral shaping in the frequency domain, and for the modeling of continuous as well as discrete spectra. This also leads to a discussion of the advantages and disadvantages of the least-squares error criterion. A spectral interpretation is given to the normalized minimum prediction error. Applications of the normalized error are given, including the determination of an "optimal" number of poles. The use of linear prediction in data compression is reviewed. For purposes of transmission, particular attention is given to the quantization and encoding of the reflection (or partial correlation) coefficients. Finally, a brief introduction to pole-zero modeling is given.
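The least-squares, all-pole fit described in the abstract can be sketched in a few lines. The AR(2) toy signal below is an illustrative assumption; because it is noiseless, the least-squares fit recovers the generating coefficients exactly.

```python
import numpy as np

def lpc(x, order):
    # All-pole model fit by least squares on past samples:
    #   x[n] ~ a[0]*x[n-1] + a[1]*x[n-2] + ... + a[order-1]*x[n-order]
    rows = [x[n - order:n][::-1] for n in range(order, len(x))]
    A = np.array(rows)
    b = x[order:]
    a, *_ = np.linalg.lstsq(A, b, rcond=None)
    return a

# Noiseless AR(2) process: x[n] = 1.2*x[n-1] - 0.6*x[n-2]
x = np.zeros(50)
x[0], x[1] = 1.0, 0.5
for n in range(2, 50):
    x[n] = 1.2 * x[n - 1] - 0.6 * x[n - 2]

a = lpc(x, order=2)
```

On real signals the residual is nonzero, and it is precisely this residual that a predictive compressor (such as the cited ECG scheme) quantizes and encodes.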

4,206 citations

Book
05 Sep 1978
TL;DR: A textbook treatment of digital speech processing, covering digital models for the speech signal, time-domain processing, digital waveform representations, short-time Fourier analysis, homomorphic processing, linear predictive coding, and man-machine communication by voice.
Abstract: 1. Introduction. 2. Fundamentals of Digital Speech Processing. 3. Digital Models for the Speech Signal. 4. Time-Domain Models for Speech Processing. 5. Digital Representation of the Speech Waveform. 6. Short-Time Fourier Analysis. 7. Homomorphic Speech Processing. 8. Linear Predictive Coding of Speech. 9. Digital Speech Processing for Man-Machine Communication by Voice.

3,103 citations

Journal ArticleDOI
TL;DR: The perfect reconstruction condition is posed as a Bezout identity, and it is shown how it is possible to find all higher-degree complementary filters based on an analogy with the theory of Diophantine equations.
Abstract: The wavelet transform is compared with the more classical short-time Fourier transform approach to signal analysis. Then the relations between wavelets, filter banks, and multiresolution signal processing are explored. A brief review is given of perfect reconstruction filter banks, which can be used both for computing the discrete wavelet transform, and for deriving continuous wavelet bases, provided that the filters meet a constraint known as regularity. Given a low-pass filter, necessary and sufficient conditions for the existence of a complementary high-pass filter that will permit perfect reconstruction are derived. The perfect reconstruction condition is posed as a Bezout identity, and it is shown how it is possible to find all higher-degree complementary filters based on an analogy with the theory of Diophantine equations. An alternative approach based on the theory of continued fractions is also given. These results are used to design highly regular filter banks, which generate biorthogonal continuous wavelet bases with symmetries.
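The perfect-reconstruction property can be checked concretely with the simplest such filter bank, the Haar pair, written here in polyphase form. This is a minimal sketch: the paper's contribution is designing longer, highly regular filters, for which the same analysis/synthesis round trip must also be exact.

```python
import numpy as np

S2 = np.sqrt(2)

def haar_analysis(x):
    # Two-channel, critically sampled analysis bank (Haar case)
    a = (x[0::2] + x[1::2]) / S2   # low-pass channel, downsampled by 2
    d = (x[0::2] - x[1::2]) / S2   # high-pass channel, downsampled by 2
    return a, d

def haar_synthesis(a, d):
    # Synthesis bank: recover the even/odd samples and interleave them
    x = np.empty(2 * len(a))
    x[0::2] = (a + d) / S2
    x[1::2] = (a - d) / S2
    return x

x = np.random.default_rng(0).standard_normal(16)
a, d = haar_analysis(x)
xr = haar_synthesis(a, d)       # xr reproduces x exactly (perfect reconstruction)
```

Iterating the low-pass channel of such a bank computes the discrete wavelet transform used throughout the ECG compression schemes above.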

1,804 citations

Journal ArticleDOI
TL;DR: The theoretical bases behind the direct ECG data compression schemes are presented and classified into three categories: tolerance-comparison compression, DPCM, and entropy coding methods and a framework for evaluation and comparison of ECG compression schemes is presented.
Abstract: Electrocardiogram (ECG) compression techniques are compared, and a unified view of these techniques is established. ECG data compression schemes are presented in two major groups: direct data compression and transformation methods. The direct data compression techniques are ECG differential pulse code modulation (DPCM) and entropy coding, AZTEC, Turning-point, CORTES, Fan and SAPA algorithms, peak-picking, and cycle-to-cycle compression methods. The transformation methods include Fourier, Walsh, and Karhunen-Loeve transforms. The theoretical bases behind the direct ECG data compression schemes are presented and classified into three categories: tolerance-comparison compression, DPCM, and entropy coding methods. A framework for evaluation and comparison of ECG compression schemes is presented.
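Of the direct methods surveyed, first-order DPCM is the easiest to sketch: predict each sample from the previous reconstruction, quantize the residual, and (in a full codec) entropy-code the resulting small integers. The step size `q` below is an arbitrary illustrative choice; the reconstruction error per sample is bounded by `q/2`.

```python
import numpy as np

def dpcm_encode(x, q=0.1):
    # First-order DPCM: quantize the residual between each sample and the
    # previous *reconstructed* sample, so encoder and decoder stay in sync
    codes, pred = [], 0.0
    for s in x:
        c = int(round((s - pred) / q))   # quantized prediction residual
        codes.append(c)
        pred = pred + c * q              # decoder-matched reconstruction
    return codes

def dpcm_decode(codes, q=0.1):
    out, pred = [], 0.0
    for c in codes:
        pred = pred + c * q
        out.append(pred)
    return np.array(out)

x = np.array([0.0, 0.07, 0.12, 0.2, 0.23])  # toy samples
codes = dpcm_encode(x)                       # small integers, cheap to entropy-code
xr = dpcm_decode(codes)
```

Because slowly varying signals produce residual codes clustered near zero, a subsequent entropy coder (the third category in the survey) can represent them in very few bits.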

690 citations

Journal ArticleDOI
TL;DR: Pilot data from a blind evaluation of compressed ECGs by cardiologists suggest that the clinically useful information present in the original ECG signals is preserved at 8:1 compression, and that in most cases 16:1 compressed ECGs remain clinically useful.
Abstract: Wavelets and wavelet packets have recently emerged as powerful tools for signal compression. Wavelet and wavelet packet-based compression algorithms based on embedded zerotree wavelet (EZW) coding are developed for electrocardiogram (ECG) signals, and eight different wavelets are evaluated for their ability to compress Holter ECG data. Pilot data from a blind evaluation of compressed ECGs by cardiologists suggest that the clinically useful information present in the original ECG signals is preserved at 8:1 compression, and that in most cases 16:1 compressed ECGs are clinically useful.

445 citations