Journal ArticleDOI

Wavelet and wavelet packet compression of electrocardiograms

01 May 1997-IEEE Transactions on Biomedical Engineering (IEEE Trans Biomed Eng)-Vol. 44, Iss: 5, pp 394-402
TL;DR: Pilot data from a blind evaluation of compressed ECGs by cardiologists suggest that the clinically useful information present in original ECG signals is preserved by 8:1 compression, and in most cases 16:1 compressed ECGs are clinically useful.
Abstract: Wavelets and wavelet packets have recently emerged as powerful tools for signal compression. Wavelet and wavelet packet compression algorithms based on embedded zerotree wavelet (EZW) coding are developed for electrocardiogram (ECG) signals, and eight different wavelets are evaluated for their ability to compress Holter ECG data. Pilot data from a blind evaluation of compressed ECGs by cardiologists suggest that the clinically useful information present in original ECG signals is preserved by 8:1 compression, and in most cases 16:1 compressed ECGs are clinically useful.
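The coder evaluated in the paper is EZW-based; as a rough illustration of why wavelet transforms compress ECGs well, the sketch below keeps only the largest fraction of wavelet coefficients and reconstructs. This is a minimal sketch, not the paper's algorithm: PyWavelets with 'db3' (a six-tap Daubechies wavelet) and the synthetic spike-train signal are assumptions, and keeping 1/8 of the coefficients is not the same as a true 8:1 coded bit rate.

```python
import numpy as np
import pywt

def compress_keep_fraction(x, wavelet="db3", level=5, keep=1.0 / 8):
    """Decompose x, zero all but the largest `keep` fraction of wavelet
    coefficients, and reconstruct from what remains."""
    coeffs = pywt.wavedec(x, wavelet, level=level)
    flat = np.concatenate(coeffs)
    k = max(1, int(keep * flat.size))
    thresh = np.sort(np.abs(flat))[-k]            # k-th largest magnitude
    kept = [c * (np.abs(c) >= thresh) for c in coeffs]
    return pywt.waverec(kept, wavelet)[: x.size]

# Toy "ECG": a noisy periodic spike train standing in for real data.
t = np.arange(2048)
x = np.exp(-((t % 256) - 40) ** 2 / 18.0) + 0.01 * np.random.randn(t.size)
x8 = compress_keep_fraction(x, keep=1 / 8)        # keeps 1/8 of the coefficients
```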
Citations
Proceedings ArticleDOI
01 Nov 2009
TL;DR: An efficient electrocardiogram (ECG) signal compression method based on the wavelet transform is presented; it combines an adapted SPIHT (Set Partitioning In Hierarchical Trees) method with a VKTP (Vector K-Tree Partitioning) coder to improve the compression ratio while maintaining good signal quality.
Abstract: In this paper, an efficient electrocardiogram (ECG) signal compression method based on the wavelet transform is presented. The proposed method combines an adapted SPIHT (Set Partitioning In Hierarchical Trees) method with a VKTP (Vector K-Tree Partitioning) coder. The SPIHT method relies on the wavelet transform, which is well suited to concentrating the energy of the signal in few coefficients. By using the VKTP algorithm to encode the bit stream generated by the SPIHT algorithm, we achieve high compression performance. Tests of this lossy compression/decompression technique are performed on many ECG records from the Arrhythmia Database. The results illustrate the ability of the proposed approach to improve the compression ratio while maintaining good signal quality.

14 citations


Cites background from "Wavelet and wavelet packet compress..."

  • ...11:1, which is better than the coders in [2] and [7]....


  • ...Hilton [7] has presented a wavelet and wavelet packet-based EZW encoder....

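The abstract above credits SPIHT's performance to the wavelet transform concentrating signal energy in few coefficients. Below is a minimal sketch of that energy-compaction claim, assuming PyWavelets and a synthetic spiky test signal (not the authors' ECG records or wavelet choice):

```python
import numpy as np
import pywt

def energy_compaction(x, wavelet="db3", level=5):
    """Cumulative fraction of total energy captured by the largest
    wavelet coefficients, taken in decreasing order of magnitude."""
    flat = np.concatenate(pywt.wavedec(x, wavelet, level=level))
    e = np.sort(flat ** 2)[::-1]
    return np.cumsum(e) / e.sum()

t = np.arange(4096)
x = np.exp(-((t % 512) - 80) ** 2 / 25.0)      # spiky, ECG-like toy signal
frac = energy_compaction(x)
print("energy in top 5% of coefficients:", frac[int(0.05 * frac.size)])
```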

Proceedings ArticleDOI
22 Oct 2007
TL;DR: A novel algorithm for EMG signal compression using the wavelet transform is proposed; compared with other wavelet-based algorithms, it achieved better performance in compression ratio and fidelity of the reconstructed signal.
Abstract: Despite the growing interest in the transmission and storage of electromyographic signals for long periods of time, only a few studies have dealt with the compression of these signals. In this article we propose a novel algorithm for EMG signal compression using the wavelet transform. For EMG signals acquired during isometric contractions, the proposed algorithm provided compression factors ranging from 50 to 90%, with an average PRD ranging from 1.4 to 7.5%. The proposed method uses a new scheme for normalizing the wavelet coefficients. The wavelet coefficients are quantized using dynamic bit allocation, which is carried out by a Kohonen neural network. After quantization, the coefficients are encoded using an arithmetic encoder. The compression results of the proposed algorithm were compared with those of other wavelet-based algorithms; the proposed algorithm achieved better performance in both compression ratio and fidelity of the reconstructed signal.

14 citations


Cites background or methods from "Wavelet and wavelet packet compress..."

  • ...The set of filters used in this work was the biorthogonal 9/7, which has been shown to be very effective in the compression of ECG signals [4,5]....


  • ...Previous works have dealt with the compression of other kinds of biomedical signals, such as the electrocardiogram [3,4,5] and the electroencephalogram [6]....

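The PRD and compression-factor figures quoted above follow standard conventions; here is a small sketch of both, assuming the common formulas rather than this paper's exact definitions (some papers subtract the signal mean in the PRD denominator, so both variants are exposed):

```python
import numpy as np

def prd(x, x_rec, remove_mean=False):
    """Percent root-mean-square difference between an original signal
    and its reconstruction."""
    x = np.asarray(x, dtype=float)
    x_rec = np.asarray(x_rec, dtype=float)
    ref = x - x.mean() if remove_mean else x
    return 100.0 * np.sqrt(np.sum((x - x_rec) ** 2) / np.sum(ref ** 2))

def compression_factor(original_bytes, compressed_bytes):
    """File-size reduction in percent -- the convention behind figures
    such as 'compression factors ranging from 50 to 90%'."""
    return 100.0 * (1.0 - compressed_bytes / original_bytes)

print(prd([1.0, 2.0, 3.0], [1.1, 1.9, 3.0]))   # ~3.8%
```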

Journal ArticleDOI
TL;DR: This paper presents a simple and efficient method for guaranteeing reconstruction quality, measured using the new distortion index wavelet-weighted PRD (WWPRD), which more accurately reflects the real clinical distortion of the compressed signal.
Abstract: Guaranteeing ECG signal quality in wavelet lossy compression methods is essential for clinical acceptability of the reconstructed signals. In this paper, we present a simple and efficient method for guaranteeing reconstruction quality, measured using the new distortion index wavelet-weighted PRD (WWPRD), which more accurately reflects the real clinical distortion of the compressed signal. The method is based on the wavelet transform and subsequent coding with the set partitioning in hierarchical trees (SPIHT) algorithm. By thresholding the WWPRD in the wavelet transform domain, a very precise reconstruction error can be achieved, yielding clinically useful reconstructed signals. Because of its computational efficiency, the method is suitable for real-time operation and is thus very useful for real-time telecardiology systems. The method is extensively tested on two different ECG databases. The results show that the method controls quality very accurately, not only in mean value but also with a low standard deviation. The effects of ECG baseline wander and noise on compression are also discussed. Baseline wander degrades performance when the WWPRD index is used to guarantee quality, because this index is normalized by the signal energy; it is therefore better to remove it before compression. Noise, on the other hand, increases the signal energy, causing an artificial increase in the coded signal bit rate. Clinical validation by cardiologists showed that a WWPRD value of 10% preserves signal quality, and they therefore recommend this value for use in the compression system.

14 citations


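A hedged sketch of a wavelet-weighted PRD in the spirit of the WWPRD described above. The exact subband weighting used by the authors is not given here, so weights proportional to each subband's share of the total l1-norm are an assumption, as are the PyWavelets 'db3'/5-level decomposition defaults:

```python
import numpy as np
import pywt

def wwprd(x, x_rec, wavelet="db3", level=5):
    """Compute a PRD per wavelet subband, then combine them with weights
    given by each subband's share of the total l1-norm (assumed form)."""
    cx = pywt.wavedec(np.asarray(x, float), wavelet, level=level)
    cr = pywt.wavedec(np.asarray(x_rec, float), wavelet, level=level)
    l1 = np.array([np.abs(c).sum() for c in cx])
    weights = l1 / l1.sum()
    prd_j = np.array([
        100.0 * np.sqrt(np.sum((a - b) ** 2) / (np.sum(a ** 2) + 1e-12))
        for a, b in zip(cx, cr)
    ])
    return float(weights @ prd_j)
```

Because each subband's error is normalized by that subband's own energy, distortion in low-energy but clinically important bands weighs more than in a plain PRD.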

Journal ArticleDOI
TL;DR: It is shown that a normal beat can be modelled using 18 parameters, while only 15 parameters are needed to reconstruct a premature ventricular contraction beat.

14 citations


Cites methods from "Wavelet and wavelet packet compress..."

  • ...On the other hand, the same model can be applied to simulate a specific ECG, and may also be required by some compression techniques [1–7]....


Proceedings ArticleDOI
18 Mar 2011
TL;DR: An effective software-based ECG data compression algorithm is proposed, together with a reconstruction algorithm using the reversed logic; the reconstructed data are seen to preserve the significant ECG signal morphology.
Abstract: An efficient and reliable electrocardiogram (ECG) compression system can increase the processing speed of real-time ECG transmission as well as reduce the amount of data storage in long-term ECG recording. In the present paper, an effective software-based ECG data compression algorithm is proposed. The whole algorithm is written in C. The algorithm is tested on ECG data from all 12 leads, taken from the PTB Diagnostic ECG Database (PTB-DB). In this compression methodology, all R-peaks are first detected by a differentiation technique and the QRS regions are located. To achieve strictly lossless compression in the QRS regions and tolerable lossy compression in the rest of the signal, two different compression algorithms have been developed. In the lossless method, a difference array is generated from the input ECG "Voltage" values, and the differences are multiplied by a suitably large integer to convert them to integers. These integers are then grouped in both the forward and reverse directions according to logical criteria, and all grouped numbers, along with sign bits and other necessary information (positions of critical numbers, forward/reverse grouping, etc.), are converted into their corresponding ASCII characters. In the lossy regions, the sampling frequency of the original ECG signal is first halved; the "Voltage" values gathered from the input ECG data are then amplified and grouped in the forward direction only, and all grouped numbers, along with sign bits and other necessary information, are likewise converted into their corresponding ASCII characters. It is observed that the proposed algorithm reduces the file size significantly. The data reconstruction algorithm has also been developed using the reversed logic, and the reconstructed data are seen to preserve the significant ECG signal morphology.

14 citations
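The abstract above detects R-peaks by a differentiation technique. Below is a minimal sketch of that idea, assuming SciPy's find_peaks, a fixed fractional threshold, and a refractory period; this is an illustration, not the authors' C implementation:

```python
import numpy as np
from scipy.signal import find_peaks

def detect_r_peaks(ecg, fs, min_rr_s=0.3):
    """Differentiate, square to emphasize the steep QRS slopes, then pick
    peaks above a data-driven threshold, enforcing a refractory period so
    two detections cannot fall inside one beat."""
    d = np.diff(ecg)                    # differentiation step
    feature = d ** 2                    # squaring accentuates QRS energy
    thr = 0.3 * feature.max()           # crude fixed threshold; real systems adapt it
    peaks, _ = find_peaks(feature, height=thr, distance=int(min_rr_s * fs))
    return peaks + 1                    # roughly compensate for np.diff's shift

# Example with a toy spike train sampled at a hypothetical 360 Hz:
fs = 360
t = np.arange(10 * fs)
ecg = np.exp(-((t % fs) - 100) ** 2 / 8.0)
print(detect_r_peaks(ecg, fs))
```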

References
Journal ArticleDOI
TL;DR: In this paper, it is shown that the difference of information between the approximation of a signal at the resolutions 2^(j+1) and 2^j (where j is an integer) can be extracted by decomposing this signal on a wavelet orthonormal basis of L^2(R^n), the vector space of measurable, square-integrable n-dimensional functions.
Abstract: Multiresolution representations are effective for analyzing the information content of images. The properties of the operator which approximates a signal at a given resolution were studied. It is shown that the difference of information between the approximation of a signal at the resolutions 2^(j+1) and 2^j (where j is an integer) can be extracted by decomposing this signal on a wavelet orthonormal basis of L^2(R^n), the vector space of measurable, square-integrable n-dimensional functions. In L^2(R), a wavelet orthonormal basis is a family of functions built by dilating and translating a unique function psi(x). This decomposition defines an orthogonal multiresolution representation called a wavelet representation. It is computed with a pyramidal algorithm based on convolutions with quadrature mirror filters. The wavelet representation lies between the spatial and Fourier domains. For images, the wavelet representation differentiates several spatial orientations. The application of this representation to data compression in image coding, texture discrimination, and fractal analysis is discussed.

20,028 citations
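A minimal sketch of one level of the pyramidal algorithm described above, using the Haar quadrature-mirror pair so that the "difference of information" between resolutions 2^(j+1) and 2^j is explicit. The Haar choice is for brevity only, not Mallat's general construction:

```python
import numpy as np

def haar_analysis(x):
    """One pyramid level: the average channel is the coarser approximation,
    the difference channel is the detail lost between the two resolutions."""
    x = np.asarray(x, dtype=float)
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)   # approximation at the coarser scale
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)   # detail: the "difference of information"
    return a, d

def haar_synthesis(a, d):
    """Exact inverse: interleave the reconstructed even/odd samples."""
    x = np.empty(2 * a.size)
    x[0::2] = (a + d) / np.sqrt(2.0)
    x[1::2] = (a - d) / np.sqrt(2.0)
    return x

x = np.random.randn(16)
a, d = haar_analysis(x)
assert np.allclose(haar_synthesis(a, d), x)   # perfect reconstruction
```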

Book
01 May 1992
TL;DR: This book presents the what, why, and how of wavelets: the continuous and discrete wavelet transforms, multiresolution analysis, and orthonormal bases of compactly supported wavelets, including their regularity and symmetry.
Abstract (table of contents): Introduction; Preliminaries and notation; The what, why, and how of wavelets; The continuous wavelet transform; Discrete wavelet transforms: frames; Time-frequency density and orthonormal bases; Orthonormal bases of wavelets and multiresolution analysis; Orthonormal bases of compactly supported wavelets; More about the regularity of compactly supported wavelets; Symmetry for compactly supported wavelet bases; Characterization of functional spaces by means of wavelets; Generalizations and tricks for orthonormal wavelet bases; References; Indexes.

16,073 citations


Journal ArticleDOI
Ingrid Daubechies
TL;DR: This work constructs orthonormal bases of compactly supported wavelets with arbitrarily high regularity, starting from a review of the concept of multiresolution analysis as well as several algorithms in vision decomposition and reconstruction.
Abstract: We construct orthonormal bases of compactly supported wavelets, with arbitrarily high regularity. The order of regularity increases linearly with the support width. We start by reviewing the concept of multiresolution analysis as well as several algorithms in vision decomposition and reconstruction. The construction then follows from a synthesis of these different approaches.

8,588 citations


"Wavelet and wavelet packet compress..." refers methods in this paper

  • ...In the work described in this paper, the wavelet was chosen to be Daubechies' W6 wavelet [10], which is illustrated in Figure 1....

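The excerpt above notes that Hilton's coder used Daubechies' W6 wavelet. Assuming W6 denotes the six-coefficient Daubechies filter (named 'db3' in PyWavelets — an identification made here, not stated in the source), it can be inspected like this:

```python
import pywt

# 'db3' is the six-tap Daubechies filter; equating it with the paper's
# "W6" is an assumption based on the six-coefficient support.
w = pywt.Wavelet("db3")
print(len(w.dec_lo), w.dec_lo)        # the 6 analysis lowpass coefficients
phi, psi, grid = w.wavefun(level=8)   # sampled scaling function and wavelet
```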

Journal ArticleDOI
TL;DR: The image coding results, calculated from actual file sizes and images reconstructed by the decoding algorithm, are either comparable to or surpass previous results obtained through much more sophisticated and computationally complex methods.
Abstract: Embedded zerotree wavelet (EZW) coding, introduced by Shapiro (see IEEE Trans. Signal Processing, vol.41, no.12, p.3445, 1993), is a very effective and computationally simple technique for image compression. We offer an alternative explanation of the principles of its operation, so that the reasons for its excellent performance can be better understood. These principles are partial ordering by magnitude with a set partitioning sorting algorithm, ordered bit plane transmission, and exploitation of self-similarity across different scales of an image wavelet transform. Moreover, we present a new and different implementation based on set partitioning in hierarchical trees (SPIHT), which provides even better performance than our previously reported extension of EZW that surpassed the performance of the original EZW. The image coding results, calculated from actual file sizes and images reconstructed by the decoding algorithm, are either comparable to or surpass previous results obtained through much more sophisticated and computationally complex methods. In addition, the new coding and decoding procedures are extremely fast, and they can be made even faster, with only small loss in performance, by omitting entropy coding of the bit stream by the arithmetic code.

5,890 citations


Additional excerpts

  • ...algorithm was inspired by that in [28]....

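The SPIHT abstract above describes partial ordering by magnitude and ordered bit-plane transmission. Below is a minimal sketch of that significance ordering: coefficients are emitted in decreasing magnitude order, one power-of-two threshold at a time. The zerotree and set-partitioning bookkeeping that makes EZW/SPIHT efficient is deliberately omitted:

```python
import numpy as np

def bitplane_passes(coeffs):
    """Yield, per bit plane, the indices of coefficients that first become
    significant there -- the partial ordering by magnitude that EZW/SPIHT
    transmit, without the set-partitioning machinery."""
    mag = np.abs(np.asarray(coeffs, dtype=float))
    n = int(np.floor(np.log2(mag.max())))       # coarsest significant bit plane
    found = np.zeros(mag.size, dtype=bool)
    for plane in range(n, -1, -1):
        newly = (~found) & (mag >= 2 ** plane)  # significance test against 2^plane
        found |= newly
        yield plane, np.flatnonzero(newly)

for plane, idx in bitplane_passes([34.0, -9.0, 5.5, 1.2, -20.0]):
    print(plane, idx)   # planes 5..0; index 0 first, then 4, 1, 2, finally 3
```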