Journal ArticleDOI

A wavelet transform-based ECG compression method guaranteeing desired signal quality

01 Dec 1998 - IEEE Transactions on Biomedical Engineering (IEEE) - Vol. 45, Iss. 12, pp. 1414-1419

TL;DR: A new electrocardiogram compression method based on an orthonormal wavelet transform and an adaptive quantization strategy is presented, by which a predetermined percent root mean square difference (PRD) can be guaranteed with a high compression ratio and low implementation complexity.

Abstract: This paper presents a new electrocardiogram (ECG) compression method based on orthonormal wavelet transform and an adaptive quantization strategy, by which a predetermined percent root mean square difference (PRD) can be guaranteed with high compression ratio and low implementation complexity.
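The quality-guarantee idea lends itself to a short sketch: decompose the signal with an orthonormal wavelet, quantize the coefficients uniformly, and search for the coarsest step size whose reconstruction still meets the target PRD. The sketch below is illustrative only; the PyWavelets calls, the db5 (10-tap Daubechies) wavelet, the bisection search, and all function names are assumptions, not the paper's exact algorithm, and the quantized indices would still need an entropy coder (the paper uses LZW).

```python
# Illustrative sketch only (not the authors' exact algorithm): quantize DWT
# coefficients with the coarsest uniform step whose reconstruction still meets a
# target PRD. Wavelet choice, bisection search and entropy coding are assumptions.
import numpy as np
import pywt  # PyWavelets

def prd(x, x_rec):
    """Percent root-mean-square difference between original and reconstruction."""
    return 100.0 * np.sqrt(np.sum((x - x_rec) ** 2) / np.sum(x ** 2))

def compress_to_target_prd(x, target_prd=4.0, wavelet="db5", level=5, iters=30):
    coeffs = pywt.wavedec(x, wavelet, level=level)          # subband decomposition
    sizes = [c.size for c in coeffs]
    flat = np.concatenate(coeffs)

    lo, hi = 0.0, max(float(np.max(np.abs(flat))), 1e-12)   # bounds on the step size
    best_q, best_step = np.round(flat), 1.0
    for _ in range(iters):
        step = 0.5 * (lo + hi)
        q = np.round(flat / step)
        # rebuild the per-subband coefficient list and synthesize the signal
        rec_coeffs, start = [], 0
        for n in sizes:
            rec_coeffs.append(q[start:start + n] * step)
            start += n
        x_rec = pywt.waverec(rec_coeffs, wavelet)[: x.size]
        if prd(x, x_rec) <= target_prd:
            best_q, best_step = q, step                      # feasible: try a coarser step
            lo = step
        else:
            hi = step                                        # too coarse: refine
    # best_q (integer indices) plus best_step would then go to an entropy coder
    return best_q.astype(np.int32), best_step, sizes
```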



Citations
Journal ArticleDOI
TL;DR: In this review, the emerging role of the wavelet transform in the interrogation of the ECG is discussed in detail, where both the continuous and the discrete transform are considered in turn.
Abstract: The wavelet transform has emerged over recent years as a powerful time-frequency analysis and signal coding tool favoured for the interrogation of complex nonstationary signals. Its application to biosignal processing has been at the forefront of these developments where it has been found particularly useful in the study of these, often problematic, signals: none more so than the ECG. In this review, the emerging role of the wavelet transform in the interrogation of the ECG is discussed in detail, where both the continuous and the discrete transform are considered in turn.
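For readers who want to reproduce the two forms of analysis discussed in the review, a minimal sketch using PyWavelets is given below; the synthetic signal, wavelet choices, and sampling rate are illustrative assumptions, not taken from the review.

```python
# Minimal sketch (not from the review): discrete vs continuous wavelet analysis of a
# synthetic ECG-like signal with PyWavelets. Signal, wavelets and rates are assumptions.
import numpy as np
import pywt

fs = 360                                                # assumed sampling rate (Hz)
t = np.arange(0, 2.0, 1.0 / fs)
ecg_like = np.sin(2 * np.pi * 1.2 * t) + 0.4 * np.sin(2 * np.pi * 15 * t)

# Discrete transform: a small set of multiresolution subbands (approximation + details)
coeffs = pywt.wavedec(ecg_like, "db4", level=5)

# Continuous transform: a dense time-scale map, handy for visual interrogation
scales = np.arange(1, 64)
cwt_coeffs, freqs = pywt.cwt(ecg_like, scales, "morl", sampling_period=1.0 / fs)

print(len(coeffs), cwt_coeffs.shape)                    # 6 subbands, a (63, 720) scalogram
```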

713 citations


Cites methods from "A wavelet transform-based ECG compr..."

  • ...In a later paper (Chen and Itoh 1998), again using D10 wavelets, they incorporate an adaptive quantization strategy which allows a predetermined desired signal quality to be achieved. Miaou and Lin (2000) also propose a quality driven compression methodology based on Daubechies wavelets and later (Miaou and Lin 2002) on biorthogonal wavelets. The latter algorithm adopts the set partitioning of hierarchical tree (SPIHT) coding strategy. Miaou et al (2002) have also proposed a dynamic vector quantization method employing tree codevectors in a single codebook. Some examples of original and compressed signals from this work are shown in figure 27. Bradie (1996) suggested the use of a wavelet-packet-based algorithm for compression of the ECG. When compared to the Karhunen–Loeve transform (KLT) applied to the same data the WP method generated significantly lower data rates at less than one-third the computational effort with generally excellent reconstructed signal quality. However, Blanchett et al (1998) report at least as good compression results for a KLT-based method. By first normalizing beat periods using multirate processing and normalizing beat amplitudes Ramakrishnan and Saha (1997) converted the ECG into a near cyclostationary sequence....

Journal ArticleDOI
J.D. Gibson
01 Apr 1987

385 citations

Proceedings ArticleDOI
24 Aug 2008
TL;DR: This work shows how a novel multi-resolution symbolic representation can be used to index datasets which are several orders of magnitude larger than anything else considered in the literature, allowing for the exact mining of truly massive real world datasets.
Abstract: Current research in indexing and mining time series data has produced many interesting algorithms and representations. However, the algorithms and the size of data considered have generally not been representative of the increasingly massive datasets encountered in science, engineering, and business domains. In this work, we show how a novel multi-resolution symbolic representation can be used to index datasets which are several orders of magnitude larger than anything else considered in the literature. Our approach allows both fast exact search and ultra fast approximate search. We show how to exploit the combination of both types of search as sub-routines in data mining algorithms, allowing for the exact mining of truly massive real world datasets, containing millions of time series.
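The symbolic representation builds on SAX-style discretization. A rough sketch of that underlying idea is shown below; the breakpoints follow the usual Gaussian-quantile construction, but the segment count, alphabet size, and function names are illustrative, and the paper's multi-resolution (iSAX) indexing adds considerably more machinery.

```python
# Rough sketch of the SAX idea underlying a multi-resolution symbolic representation:
# z-normalize, reduce with PAA, then map segment means to symbols via breakpoints of a
# standard normal distribution. Parameters here are illustrative, not the paper's.
import numpy as np

BREAKPOINTS = {4: [-0.67, 0.0, 0.67], 8: [-1.15, -0.67, -0.32, 0.0, 0.32, 0.67, 1.15]}

def sax(series, n_segments=8, alphabet_size=4):
    x = (series - series.mean()) / (series.std() + 1e-12)       # z-normalize
    paa = x.reshape(n_segments, -1).mean(axis=1)                 # piecewise aggregate means
    return np.searchsorted(BREAKPOINTS[alphabet_size], paa)      # symbol indices

word = sax(np.sin(np.linspace(0, 6.28, 64)))                     # e.g. [2 3 3 2 1 0 0 1]
```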

340 citations


Cites background from "A wavelet transform-based ECG compr..."

  • ...For example, in the medical domain it is frequently done for both the wavelet [5] and cosine [3] representations....


Journal ArticleDOI
TL;DR: An electrocardiogram (ECG) compression algorithm, called analysis by synthesis ECG compressor (ASEC), is introduced and was found to be superior to several well-known ECG compression algorithms at all tested bit rates.
Abstract: An electrocardiogram (ECG) compression algorithm, called analysis by synthesis ECG compressor (ASEC), is introduced. The ASEC algorithm is based on analysis by synthesis coding, and consists of a beat codebook, long and short-term predictors, and an adaptive residual quantizer. The compression algorithm uses a defined distortion measure in order to efficiently encode every heartbeat, with minimum bit rate, while maintaining a predetermined distortion level. The compression algorithm was implemented and tested with both the percentage rms difference (PRD) measure and the recently introduced weighted diagnostic distortion (WDD) measure. The compression algorithm has been evaluated with the MIT-BIH Arrhythmia Database. A mean compression rate of approximately 100 bits/s (compression ratio of about 30:1) has been achieved with a good reconstructed signal quality (WDD below 4% and PRD below 8%). The ASEC was compared with several well-known ECG compression algorithms and was found to be superior at all tested bit rates. A mean opinion score (MOS) test was also applied. The testers were three independent expert cardiologists. As in the quantitative test, the proposed compression algorithm was found to be superior to the other tested compression algorithms.
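A heavily simplified sketch of the analysis-by-synthesis loop is given below: each beat is predicted from a codebook entry and the residual is quantized just finely enough to stay under a distortion budget. The beat alignment, codebook handling, quantizer ladder, and PRD budget are placeholder assumptions; the actual ASEC coder also includes long- and short-term predictors and the WDD measure, none of which are reproduced here.

```python
# Highly simplified sketch of the analysis-by-synthesis idea (not the ASEC code):
# predict each beat from a codebook of past beats and spend just enough bits on the
# residual to stay under a distortion budget. Beats are assumed length-normalized.
import numpy as np

def prd(x, y):
    return 100.0 * np.sqrt(np.sum((x - y) ** 2) / np.sum(x ** 2))

def encode_beat(beat, codebook, max_prd=8.0, steps=(64, 32, 16, 8, 4, 2, 1)):
    # "long-term prediction": pick the best matching template beat
    idx = int(np.argmin([np.sum((beat - c) ** 2) for c in codebook]))
    residual = beat - codebook[idx]
    for step in steps:                          # coarsest step first = fewest bits
        q = np.round(residual / step)
        synth = codebook[idx] + q * step        # "synthesis" inside the encoder
        if prd(beat, synth) <= max_prd:
            return idx, q.astype(np.int16), step, synth
    q = np.round(residual)                      # fall back to the finest step
    return idx, q.astype(np.int16), 1, codebook[idx] + q
```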

152 citations


Cites background or result from "A wavelet transform-based ECG compr..."

  • ...The results in [6], [8], [10] are Fig....


  • ...[8], [10], because the signal was not processed to have zero...


Journal ArticleDOI
01 Jan 2006
TL;DR: Comparative results with existing quality measures show that the new measure is insensitive to error variation, is accurate, and correlates very well with subjective tests.
Abstract: Electrocardiograph (ECG) compression techniques are gaining momentum due to the huge database requirements and wide band communication channels needed to maintain high quality ECG transmission. Advances in computer software and hardware enable the birth of new techniques in ECG compression, aiming at high compression rates. In general, most of the introduced ECG compression techniques depend in their performance evaluation on either inaccurate measures or measures targeting random behavior of error. In this paper, a new wavelet-based quality measure is proposed. The new approach is based on decomposing the segment of interest into frequency bands, where a weighted score is given to each band depending on its dynamic range and its diagnostic significance. A performance evaluation of the measure is conducted quantitatively and qualitatively. Comparative results with existing quality measures show that the new measure is insensitive to error variation, is accurate, and correlates very well with subjective tests.
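The band-weighting idea can be sketched briefly: decompose the error into wavelet subbands and accumulate a weighted per-band error. The weights, wavelet, and decomposition level below are made-up illustrations, not the values proposed in the paper.

```python
# Sketch of the band-weighted idea only (weights are invented for illustration, not the
# paper's values): compare original and reconstruction subband by subband and weight
# each band's error by an assumed diagnostic significance.
import numpy as np
import pywt

def banded_quality(x, x_rec, wavelet="db4", level=4, weights=(0.1, 0.15, 0.2, 0.25, 0.3)):
    orig_bands = pywt.wavedec(x, wavelet, level=level)
    rec_bands = pywt.wavedec(x_rec, wavelet, level=level)
    score = 0.0
    for w, ob, rb in zip(weights, orig_bands, rec_bands):
        band_err = np.sqrt(np.sum((ob - rb) ** 2) / (np.sum(ob ** 2) + 1e-12))
        score += w * band_err                   # heavier weight -> more diagnostic band
    return 100.0 * score
```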

138 citations


Cites background from "A wavelet transform-based ECG compr..."

  • ...The complexity of WDD and lack of standard code for comparison make it difficult to be adopted for quantifying a reconstructed signal’s quality....


  • ...Three main components should be integrated for proper performance testing: compression measure, reconstruction error, and computational complexity [6]....



References
Journal ArticleDOI
Ingrid Daubechies
TL;DR: This work constructs orthonormal bases of compactly supported wavelets, with arbitrarily high regularity, by reviewing the concept of multiresolution analysis as well as several algorithms in vision decomposition and reconstruction.
Abstract: We construct orthonormal bases of compactly supported wavelets, with arbitrarily high regularity. The order of regularity increases linearly with the support width. We start by reviewing the concept of multiresolution analysis as well as several algorithms in vision decomposition and reconstruction. The construction then follows from a synthesis of these different approaches.
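A quick way to see the practical consequence of orthonormality, namely perfect reconstruction through the analysis/synthesis pyramid, is sketched below; in PyWavelets the 10-tap Daubechies filter referenced elsewhere on this page corresponds to 'db5' (an assumption about library naming, not something stated in the paper).

```python
# Sketch: an orthonormal Daubechies basis gives perfect reconstruction through the
# decomposition/reconstruction pyramid (up to floating-point precision).
import numpy as np
import pywt

x = np.random.randn(1024)
coeffs = pywt.wavedec(x, "db5", level=5)        # multiresolution decomposition
x_rec = pywt.waverec(coeffs, "db5")[: x.size]   # synthesis, trimmed to original length
print(np.max(np.abs(x - x_rec)))                # ~1e-12, i.e. machine precision
```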

8,350 citations


"A wavelet transform-based ECG compr..." refers methods in this paper

  • ...Since detailed mathematical aspects of wavelet theory can be found elsewhere [16], here, we shall merely describe the structure of a DOWT-based coding system shown in Fig....


  • ...The proposed algorithm was implemented on a SparcStation 2 computer, where the wavelet-based filters with 10-taps were designed by Daubechies’s algorithm [16], the layer was set to , the buffer size for segmenting input ECG signals was set to , and the Lempel–Ziv–Welch (LZW) encoder [20] was chosen as the entropy encoder for simplicity....


Journal ArticleDOI
TL;DR: A new compression algorithm is introduced that is based on principles not found in existing commercial methods, in that it dynamically adapts to the redundancy characteristics of the data being compressed, and serves to illustrate system problems inherent in using any compression scheme.
Abstract: Data stored on disks and tapes or transferred over communications links in commercial computer systems generally contains significant redundancy. A mechanism or procedure which recodes the data to lessen the redundancy could possibly double or triple the effective data densities in stored or communicated data. Moreover, if compression is automatic, it can also aid in reducing software development costs. A transparent compression mechanism could permit the use of "sloppy" data structures, in that empty space or sparse encoding of data would not greatly expand the use of storage space or transfer time; however, that requires a good compression procedure. Several problems encountered when common compression methods are integrated into computer systems have prevented the widespread use of automatic data compression. For example: (1) poor runtime execution speeds interfere in the attainment of very high data rates; (2) most compression techniques are not flexible enough to process different types of redundancy; (3) blocks of compressed data that have unpredictable lengths present storage space management problems. Each compression strategy poses a different set of these problems and, consequently, the use of each strategy is restricted to applications where its inherent weaknesses present no critical problems. This article introduces a new compression algorithm that is based on principles not found in existing commercial methods. This algorithm avoids many of the problems associated with older methods in that it dynamically adapts to the redundancy characteristics of the data being compressed. An investigation into possible applications of this algorithm yields insight into the compressibility of various types of data and serves to illustrate system problems inherent in using any compression scheme. For readers interested in simple but subtle procedures, some details of this algorithm and its implementations are also described. The focus throughout this article will be on transparent compression, in which the computer programmer is not aware of the existence of compression except in system performance. This form of compression is "noiseless": the decompressed data is an exact replica of the input data, and the compression apparatus is given no special program information, such as data type or usage statistics. Transparency is perceived to be important because putting an extra burden on the application programmer would cause... (This article was written while Welch was employed at Sperry Research Center; he is now employed with Digital Equipment Corporation.)
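The adaptive-dictionary behaviour described above is easy to illustrate with the textbook form of the LZW encoder; the sketch below is a minimal Python rendering, not Welch's hardware-oriented implementation, and it omits code-width management and dictionary reset.

```python
# Minimal LZW encoder sketch (textbook version): the dictionary adapts to repeated
# substrings as it goes, so no statistics need to be supplied up front.
def lzw_encode(data: bytes):
    dictionary = {bytes([i]): i for i in range(256)}   # start with all single bytes
    next_code, w, out = 256, b"", []
    for byte in data:
        wc = w + bytes([byte])
        if wc in dictionary:
            w = wc                       # keep extending the current match
        else:
            out.append(dictionary[w])    # emit the code for the longest known string
            dictionary[wc] = next_code   # learn the new string
            next_code += 1
            w = bytes([byte])
    if w:
        out.append(dictionary[w])
    return out

print(lzw_encode(b"TOBEORNOTTOBEORTOBEORNOT"))  # repeated phrases reuse learned codes
```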

2,341 citations


"A wavelet transform-based ECG compr..." refers methods in this paper

  • ...The proposed algorithm was implemented on a SparcStation 2 computer, where the wavelet-based filters with 10-taps were designed by Daubechies's algorithm [16], the layer was set to , the buffer size for segmenting input ECG signals was set to , and the Lempel–Ziv–Welch (LZW) encoder [20] was chosen as the entropy encoder for simplicity....


Journal ArticleDOI
TL;DR: The theoretical bases behind the direct ECG data compression schemes are presented and classified into three categories: tolerance-comparison compression, DPCM, and entropy coding methods; a framework for evaluation and comparison of ECG compression schemes is also presented.
Abstract: Electrocardiogram (ECG) compression techniques are compared, and a unified view of these techniques is established. ECG data compression schemes are presented in two major groups: direct data compression and transformation methods. The direct data compression techniques are ECG differential pulse code modulation (DPCM) and entropy coding, AZTEC, Turning-point, CORTES, Fan and SAPA algorithms, peak-picking, and cycle-to-cycle compression methods. The transformation methods include Fourier, Walsh, and Karhunen-Loeve transforms. The theoretical bases behind the direct ECG data compression schemes are presented and classified into three categories: tolerance-comparison compression, DPCM, and entropy coding methods. A framework for evaluation and comparison of ECG compression schemes is presented.
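One of the direct methods listed above, the turning-point (TP) algorithm, is simple enough to sketch: it halves the number of samples while trying to retain local extrema. The version below follows the common textbook description and is not necessarily identical to the rules in the cited survey.

```python
# Sketch of the turning-point (TP) idea: examine samples in pairs and keep the one
# where the local slope changes sign, giving roughly 2:1 sample reduction.
import numpy as np

def turning_point(x):
    out = [x[0]]
    x0 = x[0]
    for i in range(1, len(x) - 1, 2):        # process samples in pairs
        x1, x2 = x[i], x[i + 1]
        s1, s2 = np.sign(x1 - x0), np.sign(x2 - x1)
        keep = x1 if (s1 * s2) < 0 else x2   # keep x1 only if the slope turns there
        out.append(keep)
        x0 = keep
    return np.asarray(out)                    # roughly half as many samples as x
```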

649 citations


"A wavelet transform-based ECG compr..." refers methods in this paper

  • ...In most cases, direct methods are superior to transform methods with respect to system complexity and the error control mechanism, however, transform methods usually achieve higher compression ratios and are insensitive to the noise contained in original ECG signals [1]....


  • ...In direct methods, the compression is done directly on the ECG samples; examples include the amplitude zone time epoch coding (AZTEC), the turning point (TP), the coordinate reduction time encoding system (CORTES), the scan-along polygonal approximation (SAPA), peak-picking, cycle-to-cycle, and differential pulse code modulation (DPCM) [1]–[4]....


Journal ArticleDOI
TL;DR: Pilot data from a blind evaluation of compressed ECG's by cardiologists suggest that the clinically useful information present in original ECG signals is preserved by 8:1 compression, and in most cases 16:1 compressed ECGs are clinically useful.
Abstract: Wavelets and wavelet packets have recently emerged as powerful tools for signal compression. Wavelet and wavelet packet-based compression algorithms based on embedded zerotree wavelet (EZW) coding are developed for electrocardiogram (ECG) signals, and eight different wavelets are evaluated for their ability to compress Holter ECG data. Pilot data from a blind evaluation of compressed ECG's by cardiologists suggest that the clinically useful information present in original ECG signals is preserved by 8:1 compression, and in most cases 16:1 compressed ECG's are clinically useful.
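A crude stand-in for the compression ratios quoted above can be sketched by keeping only the largest wavelet coefficients (roughly 1/8 or 1/16 of them) and reconstructing; this thresholding proxy does not implement EZW's zerotree/bit-plane coding, and the wavelet choice and helper name are assumptions.

```python
# Thresholding proxy only (not EZW): keep the largest wavelet coefficients so that
# roughly `fraction` of them survive, zero the rest, and reconstruct.
import numpy as np
import pywt

def keep_fraction(x, fraction=1 / 8, wavelet="db4", level=5):
    coeffs = pywt.wavedec(x, wavelet, level=level)
    flat = np.concatenate(coeffs)
    thr = np.quantile(np.abs(flat), 1.0 - fraction)        # magnitude threshold
    kept = [np.where(np.abs(c) >= thr, c, 0.0) for c in coeffs]
    return pywt.waverec(kept, wavelet)[: x.size]
```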

434 citations