Journal ArticleDOI

Wavelet and wavelet packet compression of electrocardiograms

01 May 1997-IEEE Transactions on Biomedical Engineering (IEEE Trans Biomed Eng)-Vol. 44, Iss: 5, pp 394-402
TL;DR: Pilot data from a blind evaluation of compressed ECGs by cardiologists suggest that the clinically useful information present in original ECG signals is preserved by 8:1 compression, and in most cases 16:1 compressed ECGs are clinically useful.
Abstract: Wavelets and wavelet packets have recently emerged as powerful tools for signal compression. Wavelet and wavelet packet-based compression algorithms based on embedded zerotree wavelet (EZW) coding are developed for electrocardiogram (ECG) signals, and eight different wavelets are evaluated for their ability to compress Holter ECG data. Pilot data from a blind evaluation of compressed ECGs by cardiologists suggest that the clinically useful information present in original ECG signals is preserved by 8:1 compression, and in most cases 16:1 compressed ECGs are clinically useful.
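As a rough illustration of the kind of pipeline the abstract describes, the sketch below decomposes a block of samples with a discrete wavelet transform and keeps only the largest-magnitude coefficients. It is a minimal stand-in, not the authors' EZW coder: the wavelet name 'db6', the block length, and the keep fraction are illustrative assumptions, and PyWavelets/NumPy are assumed to be available.

```python
import numpy as np
import pywt

def compress_block(x, wavelet="db6", level=5, keep_fraction=0.125):
    """Keep only the largest-magnitude wavelet coefficients (about 8:1 thinning)."""
    coeffs = pywt.wavedec(x, wavelet, level=level)
    flat = np.concatenate(coeffs)
    k = max(1, int(len(flat) * keep_fraction))
    thresh = np.sort(np.abs(flat))[-k]              # magnitude of the k-th largest coefficient
    return [np.where(np.abs(c) >= thresh, c, 0.0) for c in coeffs]

def reconstruct_block(coeffs, wavelet="db6", n=None):
    """Inverse transform; trim to the original block length if given."""
    y = pywt.waverec(coeffs, wavelet)
    return y if n is None else y[:n]

# Toy usage on a synthetic 1024-sample block standing in for a Holter ECG segment
x = np.sin(2 * np.pi * np.linspace(0, 4, 1024)) + 0.05 * np.random.randn(1024)
kept = compress_block(x)
y = reconstruct_block(kept, n=len(x))
```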
Citations
Journal ArticleDOI
TL;DR: A novel electrocardiogram (ECG) compression method is developed by adapting an adaptive Fourier decomposition (AFD) algorithm hybridized with a symbol substitution (SS) technique; the SS stage performs lossless compression enhancement and built-in data encryption, which is pivotal for e-health.
Abstract: This paper presents a novel electrocardiogram (ECG) compression method for e-health applications by adapting an adaptive Fourier decomposition (AFD) algorithm hybridized with a symbol substitution (SS) technique. The compression consists of two stages: the first-stage AFD executes efficient lossy compression with high fidelity; the second-stage SS performs lossless compression enhancement and built-in data encryption, which is pivotal for e-health. Validated with 48 ECG records from the MIT-BIH arrhythmia benchmark database, the proposed method achieves an averaged compression ratio (CR) of 17.6–44.5 and a percentage root mean square difference (PRD) of 0.8–2.0% with a highly linear and robust PRD-CR relationship, pushing the compression performance into a previously unexploited region. As such, this paper provides an attractive candidate ECG compression method for pervasive e-health applications.
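For context, the two figures of merit quoted above are standard and easy to state in code. The sketch below uses the common definition of PRD without mean removal; the cited paper may normalise slightly differently.

```python
import numpy as np

def compression_ratio(original_bits, compressed_bits):
    """CR = size of the original bitstream over size of the compressed bitstream."""
    return original_bits / compressed_bits

def prd_percent(x, x_rec):
    """Percentage root-mean-square difference between original and reconstruction."""
    x = np.asarray(x, dtype=float)
    x_rec = np.asarray(x_rec, dtype=float)
    return 100.0 * np.sqrt(np.sum((x - x_rec) ** 2) / np.sum(x ** 2))
```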

96 citations

Journal ArticleDOI
TL;DR: A filter bank-based algorithm for ECG compression is developed that utilises a nearly-perfect reconstruction cosine modulated filter bank to split the incoming signals into several subband signals that are then quantised through thresholding and Huffman encoded.
Abstract: A filter bank-based algorithm for ECG compression is developed. The proposed method utilises a nearly-perfect reconstruction cosine modulated filter bank to split the incoming signals into several subband signals that are then quantised through thresholding and Huffman encoded. The advantage of the proposed method is that the threshold is chosen so that the quality of the retrieved signal is guaranteed. It is shown that the compression ratio achieved is an improvement over those obtained by previously reported thresholding-based algorithms.
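The quantise-threshold-and-Huffman-encode stage described above can be sketched roughly as follows, assuming the subband signals have already been produced by the analysis filter bank. The thresholding rule and step size here are placeholders, not the paper's quality-guaranteeing threshold selection.

```python
import heapq
from collections import Counter

import numpy as np

def threshold_and_quantise(subband, threshold, step):
    """Zero samples below the threshold, then quantise the survivors uniformly."""
    kept = np.where(np.abs(subband) < threshold, 0.0, subband)
    return np.round(kept / step).astype(int)

def huffman_code(symbols):
    """Return {symbol: bitstring} built from the symbols' frequencies."""
    freq = Counter(int(s) for s in symbols)
    heap = [[n, i, [s, ""]] for i, (s, n) in enumerate(freq.items())]
    heapq.heapify(heap)
    if len(heap) == 1:                              # degenerate single-symbol case
        return {heap[0][2][0]: "0"}
    next_id = len(heap)
    while len(heap) > 1:
        lo = heapq.heappop(heap)
        hi = heapq.heappop(heap)
        for pair in lo[2:]:                         # prepend a 0 to the lighter subtree
            pair[1] = "0" + pair[1]
        for pair in hi[2:]:                         # prepend a 1 to the heavier subtree
            pair[1] = "1" + pair[1]
        heapq.heappush(heap, [lo[0] + hi[0], next_id] + lo[2:] + hi[2:])
        next_id += 1
    return {sym: code for sym, code in heap[0][2:]}

# Toy usage on one "subband"
band = np.array([0.02, -1.3, 0.6, 4.8, -0.04, 2.1, 0.0, -3.3])
symbols = threshold_and_quantise(band, threshold=0.1, step=0.5)
codes = huffman_code(symbols)
```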

94 citations

Journal ArticleDOI
TL;DR: The proposed wavelet-threshold-based ECG signal compression technique, using a uniform scalar zero-zone quantizer (USZZQ) and Huffman coding of the differencing significance map (DSM), achieves the required compression ratio with low reconstruction error for a GSM-based cellular telemedicine system.
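A uniform scalar zero-zone ("dead-zone") quantiser of the kind named in this summary can be sketched as below: coefficients inside the zero zone map to symbol 0, the rest to uniform bins. The parameter names and the midpoint reconstruction rule are my assumptions, not details taken from the cited paper.

```python
import numpy as np

def uszz_quantise(coeffs, zero_zone, delta):
    """Map |c| <= zero_zone to symbol 0; quantise the rest uniformly with step delta."""
    c = np.asarray(coeffs, dtype=float)
    mag = np.abs(c)
    q = np.where(mag <= zero_zone, 0.0, np.floor((mag - zero_zone) / delta) + 1.0)
    return (np.sign(c) * q).astype(int)

def uszz_dequantise(q, zero_zone, delta):
    """Reconstruct each nonzero symbol at the midpoint of its quantisation bin."""
    q = np.asarray(q, dtype=float)
    mag = np.where(q == 0.0, 0.0, zero_zone + (np.abs(q) - 0.5) * delta)
    return np.sign(q) * mag
```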

94 citations

Journal ArticleDOI
01 Jun 2001
TL;DR: Simulation results illustrate that both methods can enhance medical data compression in a hybrid mobile telemedical system that integrates these algorithmic approaches for real-time ECG transmission with high CRs and low NRMSE, especially in low-bandwidth mobile systems.
Abstract: This paper evaluates the compression performance and characteristics of two wavelet-based compression schemes for electrocardiogram (ECG) signals suitable for real-time telemedical applications. The two proposed methods, namely the optimal zonal wavelet coding (OZWC) method and the wavelet transform higher order statistics-based coding (WHOSC) method, are used to assess ECG compression. The WHOSC method employs higher order statistics (HOS) and uses multirate processing with the autoregressive HOS model technique to provide increased robustness in the coding scheme. The OZWC algorithm is based on the optimal wavelet-based zonal coding method developed for the class of discrete "Lipschitzian" signals. Both methodologies were evaluated using the normalized rms error (NRMSE), average compression ratio (CR), and bits-per-sample criteria, applied to abnormal clinical ECG data samples selected from the MIT-BIH database and the Creighton University Cardiac Center database. Simulation results illustrate that both methods can enhance medical data compression in a hybrid mobile telemedical system that integrates these algorithmic approaches for real-time ECG transmission with high CRs and low NRMSE, especially in low-bandwidth mobile systems.
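The evaluation criteria mentioned here (NRMSE, CR, and bits per sample) can be written down directly; the sketch below uses the usual RMS normalisation, which may differ in detail from the paper's definition.

```python
import numpy as np

def nrmse(x, x_rec):
    """RMS reconstruction error normalised by the RMS value of the original signal."""
    x = np.asarray(x, dtype=float)
    x_rec = np.asarray(x_rec, dtype=float)
    return np.sqrt(np.mean((x - x_rec) ** 2)) / np.sqrt(np.mean(x ** 2))

def bits_per_sample(compressed_bits, n_samples):
    """Average number of coded bits spent per ECG sample."""
    return compressed_bits / n_samples
```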

84 citations


Cites background or methods from "Wavelet and wavelet packet compress..."

  • ...Previous ECG data compression techniques have been reported in the literature and include direct time-domain techniques [4], transform-domain techniques [5]–[7], average beat subtraction [8], and different modeling methods [9]–[12]....

  • ...In recent years, wavelet-based compression techniques and tools have received significant attention, especially for different biomedical signal-processing applications [12], [25]....

Journal ArticleDOI
TL;DR: The results show that the N-PR cosine-modulated filter bank method outperforms the WP technique in both quality and efficiency.

82 citations


Cites methods from "Wavelet and wavelet packet compress..."

  • ...On the other hand, the input signal is processed by taking non-overlapping blocks of samples whose size is taken to be a power of two [6; 7]....

  • ...In [4] and [5] bit allocation was chosen in a DWT diagram, as was the case in [6] and [7], where both the Embedded Zerotree Wavelet (EZW) and the Set Partitioning In Hierarchical Tree (SPIHT) algorithms, both of which have shown very good results in image coding, were applied to ECGs....

References
Journal ArticleDOI
TL;DR: In this paper, it is shown that the difference of information between the approximation of a signal at the resolutions 2^(j+1) and 2^j (where j is an integer) can be extracted by decomposing this signal on a wavelet orthonormal basis of L^2(R^n), the vector space of measurable, square-integrable n-dimensional functions.
Abstract: Multiresolution representations are effective for analyzing the information content of images. The properties of the operator which approximates a signal at a given resolution were studied. It is shown that the difference of information between the approximation of a signal at the resolutions 2^(j+1) and 2^j (where j is an integer) can be extracted by decomposing this signal on a wavelet orthonormal basis of L^2(R^n), the vector space of measurable, square-integrable n-dimensional functions. In L^2(R), a wavelet orthonormal basis is a family of functions which is built by dilating and translating a unique function ψ(x). This decomposition defines an orthogonal multiresolution representation called a wavelet representation. It is computed with a pyramidal algorithm based on convolutions with quadrature mirror filters. Wavelet representation lies between the spatial and Fourier domains. For images, the wavelet representation differentiates several spatial orientations. The application of this representation to data compression in image coding, texture discrimination and fractal analysis is discussed.
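A minimal numerical sketch of one analysis step of the pyramidal algorithm described above follows: the finer approximation is filtered by a quadrature mirror pair and downsampled by two. Haar filters are used purely for brevity; the paper's construction covers general orthonormal wavelet bases.

```python
import numpy as np

h = np.array([1.0, 1.0]) / np.sqrt(2.0)   # Haar lowpass (scaling) filter
g = np.array([1.0, -1.0]) / np.sqrt(2.0)  # Haar highpass (wavelet) filter

def analysis_step(approx_fine):
    """One pyramid step: finer approximation -> (coarser approximation, detail)."""
    a = np.convolve(approx_fine, h)[1::2]  # lowpass filter, keep every other sample
    d = np.convolve(approx_fine, g)[1::2]  # highpass filter, keep every other sample
    return a, d

x = np.array([4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 5.0])
a1, d1 = analysis_step(x)    # approximation and detail one resolution level down
a2, d2 = analysis_step(a1)   # next coarser level of the pyramid
```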

20,028 citations

Book
01 May 1992
TL;DR: This book presents the continuous and discrete wavelet transforms, multiresolution analysis, and the construction and regularity of orthonormal bases of compactly supported wavelets.
Abstract: Introduction Preliminaries and notation The what, why, and how of wavelets The continuous wavelet transform Discrete wavelet transforms: Frames Time-frequency density and orthonormal bases Orthonormal bases of wavelets and multiresolutional analysis Orthonormal bases of compactly supported wavelets More about the regularity of compactly supported wavelets Symmetry for compactly supported wavelet bases Characterization of functional spaces by means of wavelets Generalizations and tricks for orthonormal wavelet bases References Indexes.

16,073 citations

Journal ArticleDOI
TL;DR: The regularity of compactly supported wavelets and the symmetry of wavelet bases are discussed, with the focus on orthonormal wavelet bases rather than the continuous wavelet transform.
Abstract: Introduction Preliminaries and notation The what, why, and how of wavelets The continuous wavelet transform Discrete wavelet transforms: Frames Time-frequency density and orthonormal bases Orthonormal bases of wavelets and multiresolutional analysis Orthonormal bases of compactly supported wavelets More about the regularity of compactly supported wavelets Symmetry for compactly supported wavelet bases Characterization of functional spaces by means of wavelets Generalizations and tricks for orthonormal wavelet bases References Indexes.

14,157 citations

Journal ArticleDOI
Ingrid Daubechies
TL;DR: Orthonormal bases of compactly supported wavelets with arbitrarily high regularity are constructed, starting from a review of multiresolution analysis and of several algorithms in vision decomposition and reconstruction.
Abstract: We construct orthonormal bases of compactly supported wavelets, with arbitrarily high regularity. The order of regularity increases linearly with the support width. We start by reviewing the concept of multiresolution analysis as well as several algorithms in vision decomposition and reconstruction. The construction then follows from a synthesis of these different approaches.
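As an illustration (not taken from the paper), the shortest compactly supported Daubechies scaling filter satisfies the defining conditions named above, which can be checked numerically:

```python
import numpy as np

s = np.sqrt(3.0)
h = np.array([1 + s, 3 + s, 3 - s, 1 - s]) / (4.0 * np.sqrt(2.0))  # D4 scaling filter

print(np.isclose(h.sum(), np.sqrt(2.0)))        # sum of h_k equals sqrt(2)
print(np.isclose(np.dot(h, h), 1.0))            # unit norm
print(np.isclose(np.dot(h[:2], h[2:]), 0.0))    # orthogonal to its shift by two samples
g = h[::-1] * np.array([1.0, -1.0, 1.0, -1.0])  # associated highpass (wavelet) filter
print(np.isclose(g.sum(), 0.0))                 # zeroth moment of the wavelet filter vanishes
```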

8,588 citations


"Wavelet and wavelet packet compress..." refers methods in this paper

  • ...In the work described in this paper, the wavelet was chosen to be Daubechies' W6 wavelet [10], which is illustrated in Figure 1....

Journal ArticleDOI
TL;DR: The image coding results, calculated from actual file sizes and images reconstructed by the decoding algorithm, are either comparable to or surpass previous results obtained through much more sophisticated and computationally complex methods.
Abstract: Embedded zerotree wavelet (EZW) coding, introduced by Shapiro (see IEEE Trans. Signal Processing, vol.41, no.12, p.3445, 1993), is a very effective and computationally simple technique for image compression. We offer an alternative explanation of the principles of its operation, so that the reasons for its excellent performance can be better understood. These principles are partial ordering by magnitude with a set partitioning sorting algorithm, ordered bit plane transmission, and exploitation of self-similarity across different scales of an image wavelet transform. Moreover, we present a new and different implementation based on set partitioning in hierarchical trees (SPIHT), which provides even better performance than our previously reported extension of EZW that surpassed the performance of the original EZW. The image coding results, calculated from actual file sizes and images reconstructed by the decoding algorithm, are either comparable to or surpass previous results obtained through much more sophisticated and computationally complex methods. In addition, the new coding and decoding procedures are extremely fast, and they can be made even faster, with only small loss in performance, by omitting entropy coding of the bit stream by the arithmetic code.
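A toy sketch of the successive-approximation idea behind EZW and SPIHT follows; it ignores the zerotree/set-partitioning machinery that codes coefficient positions efficiently and only shows how bit-plane passes refine the reconstruction. The coefficient values are illustrative.

```python
import numpy as np

def successive_approximation(coeffs, n_passes=4):
    """Return the decoder-side reconstruction after each bit-plane pass."""
    c = np.asarray(coeffs, dtype=float)
    T = 2.0 ** np.floor(np.log2(np.max(np.abs(c))))   # initial bit-plane threshold
    snapshots = []
    for _ in range(n_passes):
        significant = np.abs(c) >= T
        # Everything a decoder could rebuild from the bit planes sent so far:
        approx = np.where(significant,
                          np.sign(c) * (np.floor(np.abs(c) / T) + 0.5) * T,
                          0.0)
        snapshots.append(approx)
        T /= 2.0
    return snapshots

c = np.array([63.0, -34.0, 49.0, 10.0, 7.0, 13.0, -12.0, 7.0])  # illustrative coefficients
for k, a in enumerate(successive_approximation(c), start=1):
    print(f"after pass {k}: rms error = {np.sqrt(np.mean((c - a) ** 2)):.2f}")
```

Because every pass only sharpens the previous reconstruction, truncating the bitstream after any pass still yields the best approximation those bit planes allow, which is what makes the code "embedded".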

5,890 citations


Additional excerpts

  • ...algorithm was inspired by that in [28]....
