Journal ArticleDOI

A wavelet transform-based ECG compression method guaranteeing desired signal quality

01 Dec 1998-IEEE Transactions on Biomedical Engineering (IEEE)-Vol. 45, Iss: 12, pp 1414-1419
TL;DR: A new electrocardiogram compression method based on orthonormal wavelet transform and an adaptive quantization strategy, by which a predetermined percent root mean square difference (PRD) can be guaranteed with high compression ratio and low implementation complexity, is presented.
Abstract: This paper presents a new electrocardiogram (ECG) compression method based on orthonormal wavelet transform and an adaptive quantization strategy, by which a predetermined percent root mean square difference (PRD) can be guaranteed with high compression ratio and low implementation complexity.
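
To make the quality-on-demand idea concrete, here is a minimal sketch, assuming PyWavelets: quantize the wavelet coefficients with a uniform step and bisect on the step size until the reconstruction just meets the requested PRD. The bisection search and the db5/5-level settings are illustrative stand-ins for the paper's actual adaptive quantization strategy.

```python
# A minimal sketch, assuming PyWavelets; not the paper's exact algorithm.
import numpy as np
import pywt

def prd(x, x_rec):
    """Percent root-mean-square difference between original and reconstruction."""
    return 100.0 * np.sqrt(np.sum((x - x_rec) ** 2) / np.sum(x ** 2))

def compress_to_prd(x, target_prd, wavelet="db5", level=5):
    coeffs = pywt.wavedec(x, wavelet, level=level)
    arr, slices = pywt.coeffs_to_array(coeffs)
    lo, hi = 1e-6, float(np.max(np.abs(arr)))   # step sizes: lo meets the budget, hi does not
    for _ in range(40):                         # bisection on the quantizer step
        step = 0.5 * (lo + hi)
        q = np.round(arr / step) * step         # uniform quantization + dequantization
        rec = pywt.waverec(
            pywt.array_to_coeffs(q, slices, output_format="wavedec"), wavelet
        )[: len(x)]
        if prd(x, rec) > target_prd:
            hi = step                           # too much distortion: need a finer step
        else:
            lo = step                           # within budget: try a coarser step
    symbols = np.round(arr / lo).astype(np.int32)
    return symbols, lo                          # integer symbols, ready for entropy coding
```
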
Citations
Journal ArticleDOI
01 May 2007
TL;DR: The presented method is based on the irregular temporal distribution of medical data in the signal and takes advantage of a variable sampling frequency for automatically detected VCG loops, reducing the data set volume.
Abstract: This paper discusses the principles, implementation details, and advantages of a sequence coding algorithm applied to the compression of vectorcardiograms (VCG). The main novelty of the proposed method is the automatic management of distortion distribution, controlled by the local signal contents in both technical and medical aspects. As in clinical practice, the VCG loops representing P, QRS, and T waves in three-dimensional (3-D) space are considered here as three simultaneous sequences of objects. Because of the similarity of neighboring loops, encoding the prediction error values significantly reduces the data set volume. The residual values are decorrelated with the discrete cosine transform (DCT) and truncated at a certain energy threshold. The presented method is based on the irregular temporal distribution of medical data in the signal and takes advantage of a variable sampling frequency for automatically detected VCG loops. The features of the proposed algorithm are confirmed by the results of a numerical experiment carried out for a wide range of real records. The average data reduction ratio reaches a value of 8.15, while the percent root-mean-square difference (PRD) distortion ratio for the most important sections of the signal does not exceed 1.1%.
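
The loop-to-loop coding idea described above fits in a few lines: subtract the previous loop from the current one, decorrelate the residual with a DCT, and keep leading coefficients until a chosen fraction of the residual energy is retained. In this sketch the function names and the 0.999 energy threshold are illustrative, not taken from the paper.

```python
# A minimal sketch of prediction + DCT + energy-threshold truncation.
import numpy as np
from scipy.fft import dct, idct

def encode_loop(prev_loop, cur_loop, energy_keep=0.999):
    residual = cur_loop - prev_loop              # prediction from the previous loop
    c = dct(residual, norm="ortho")              # decorrelate the residual
    energy = np.cumsum(c ** 2) / np.sum(c ** 2)
    k = int(np.searchsorted(energy, energy_keep)) + 1
    return c[:k]                                 # truncate at the energy threshold

def decode_loop(prev_loop, coeffs, n):
    c = np.zeros(n)
    c[: len(coeffs)] = coeffs                    # zero-fill the dropped coefficients
    return prev_loop + idct(c, norm="ortho")     # inverse DCT + add the prediction
```
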

5 citations


Cites background from "A wavelet transform-based ECG compr..."

  • ...Quality-on-demand algorithms [1-6] are certainly useful tools, however the necessity to maintain the same quality for all electrocardiogram sections may be questionable....


Journal ArticleDOI
TL;DR: In this article, a one-dimensional complex Discrete Anamorphic Stretch Transform (DAST) is proposed for precompression of the ECG signal for real-time transmission using channels with limited bandwidth.

5 citations

Proceedings ArticleDOI
01 Feb 2015
TL;DR: The simulation results show that the proposed method achieves a high compression ratio at relatively low distortion in comparison with other methods, and clearly demonstrates the algorithm's potential to save storage space, bandwidth, and power, especially for data transmission in tele-healthcare or m-healthcare systems.
Abstract: This paper explores a new algorithm for ECG signal compression based on joint multiresolution analysis (J-MRA) using a Gaussian pyramid and wavelet analysis. From a signal-compression perspective, MRA plays a key role in representing a signal with a small number of coefficients while retaining its essential features. The proposed algorithm was tested on 10-second segments of 19 ECG signals from the MIT-BIH Arrhythmia database and compared with recent contemporary techniques. The simulation results show that the proposed method achieves a high compression ratio at relatively low distortion in comparison with other methods. Across the simulations, the average compression is 86.14% at a PRD of 4.96%, and the correlation between the original and reconstructed signals is 0.998. This clearly demonstrates the algorithm's efficiency in saving storage space, bandwidth, and power consumption, especially in data transmission for tele-healthcare or m-healthcare systems.
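
For reference, the three figures of merit quoted in this abstract (compression percentage, PRD, and correlation) can be computed as below. This follows the common definitions, which may differ in detail from the paper's own bookkeeping.

```python
# Standard definitions of the three quoted figures of merit.
import numpy as np

def ecg_metrics(x, x_rec, bits_original, bits_compressed):
    prd = 100.0 * np.sqrt(np.sum((x - x_rec) ** 2) / np.sum(x ** 2))
    compression_pct = 100.0 * (1.0 - bits_compressed / bits_original)
    corr = np.corrcoef(x, x_rec)[0, 1]   # correlation of original vs. reconstruction
    return compression_pct, prd, corr
```
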

5 citations

Proceedings ArticleDOI
01 Nov 2009
TL;DR: This work proposes to evaluate the HOS (kurtosis) in each wavelet band to denoise an MECG signal, and shows significant improvement in denoising MECG signals.
Abstract: Multichannel electrocardiogram (MECG) signal denoising can be described as a process of removing the clinically unimportant content present in the signal. Higher-order statistics (HOS) can help to retain the finer details of an electrocardiogram (ECG) signal, which can effectively reduce the noise levels in an MECG signal. In this work, it is proposed to evaluate the HOS (kurtosis) in each wavelet band to denoise an MECG signal. Thresholding levels are derived from the values of the fourth-order cumulant, kurtosis, of the wavelet coefficients and the energy contribution efficiency (ECE) of the wavelet sub-bands. The performance of this method for compressed signals is evaluated using the percentage root-mean-square difference (PRD), weighted PRD (WPRD), and wavelet-weighted percentage root-mean-square difference (WWPRD). The proposed algorithm is tested with the database of the CSE Multilead Measurement Library. The results show significant improvement in denoising the MECG signals.
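
A minimal sketch of the idea, assuming PyWavelets and SciPy: soft-threshold only the near-Gaussian (low-kurtosis) detail bands, on the view that spiky, high-kurtosis bands carry QRS detail worth keeping. The kurtosis cutoff and the universal-threshold rule below are illustrative choices; the paper also weighs each band's energy contribution efficiency (ECE), which this sketch omits.

```python
# Kurtosis-guided wavelet thresholding; an illustrative simplification.
import numpy as np
import pywt
from scipy.stats import kurtosis

def hos_denoise(x, wavelet="db4", level=4, kurt_cutoff=3.0):
    coeffs = pywt.wavedec(x, wavelet, level=level)
    out = [coeffs[0]]                                  # keep the approximation band
    for d in coeffs[1:]:
        if kurtosis(d, fisher=False) < kurt_cutoff:    # near-Gaussian band: mostly noise
            sigma = np.median(np.abs(d)) / 0.6745      # robust noise estimate
            thr = sigma * np.sqrt(2.0 * np.log(len(d)))
            d = pywt.threshold(d, thr, mode="soft")
        out.append(d)
    return pywt.waverec(out, wavelet)[: len(x)]
```
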

5 citations

Journal ArticleDOI
TL;DR: This study found that the EZ algorithm achieves the best compression efficiency in a low-noise environment, and that the WH algorithm is competitive for use in high-error environments, although its short-term performance degrades with abnormal or contaminated ECG signals.
Abstract: The use of wireless networks bears great practical importance for the instantaneous transmission of ECG signals during movement. In this paper, three typical wavelet-based ECG compression algorithms, Rajoub (RA), Embedded Zerotree Wavelet (EZ), and Wavelet Transform Higher-Order Statistics Coding (WH), were evaluated to find an appropriate ECG compression algorithm for scalable and reliable wireless tele-cardiology applications, particularly over a CDMA network. The short-term and long-term performance characteristics of the three algorithms were analyzed using normal, abnormal, and measurement noise-contaminated ECG signals from the MIT-BIH database. In addition to the processing delay measurement, compression efficiency and reconstruction sensitivity to error were also evaluated via simulation models including the noise-free channel model, random noise channel model, and CDMA channel model, as well as over an actual CDMA network currently operating in Korea. This study found that the EZ algorithm achieves the best compression efficiency in a low-noise environment, and that the WH algorithm is competitive for use in high-error environments, although its short-term performance degrades with abnormal or contaminated ECG signals.

5 citations

References
Journal ArticleDOI
Ingrid Daubechies
TL;DR: This work constructs orthonormal bases of compactly supported wavelets with arbitrarily high regularity, starting from a review of multiresolution analysis and of several algorithms in vision decomposition and reconstruction.
Abstract: We construct orthonormal bases of compactly supported wavelets, with arbitrarily high regularity. The order of regularity increases linearly with the support width. We start by reviewing the concept of multiresolution analysis as well as several algorithms in vision decomposition and reconstruction. The construction then follows from a synthesis of these different approaches.
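
As a quick illustration, assuming PyWavelets: db5 is a compactly supported 10-tap member of this family (plausibly the filter length the ECG paper above reports using), and one Mallat analysis/synthesis step reconstructs the signal exactly, as expected of an orthonormal basis.

```python
# Inspecting a Daubechies filter pair and verifying perfect reconstruction.
import numpy as np
import pywt

w = pywt.Wavelet("db5")
print(len(w.dec_lo))                 # 10 taps: compact support
print(w.dec_lo)                      # lowpass analysis filter coefficients

x = np.random.randn(256)
cA, cD = pywt.dwt(x, w)              # one decomposition step
x_rec = pywt.idwt(cA, cD, w)         # one reconstruction step
print(np.allclose(x, x_rec))         # True: orthonormal basis, perfect reconstruction
```
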

8,588 citations


"A wavelet transform-based ECG compr..." refers methods in this paper

  • ...Since detailed mathematical aspects of wavelet theory can be found elsewhere [16], here, we shall merely describe the structure of a DOWT-based coding system shown in Fig....


  • ...The proposed algorithm was implemented on a SparcStation 2 computer, where the wavelet-based filters with 10-taps were designed by Daubechies’s algorithm [16], the layer was set to , the buffer size for segmenting input ECG signals was set to , and the Lempel–Ziv–Welch (LZW) encoder [20] was chosen as the entropy encoder for simplicity....


Journal ArticleDOI
TL;DR: A new compression algorithm is introduced that is based on principles not found in existing commercial methods in that it dynamically adapts to the redundancy characteristics of the data being compressed, and serves to illustrate system problems inherent in using any compression scheme.
Abstract: Data stored on disks and tapes or transferred over communications links in commercial computer systems generally contains significant redundancy. A mechanism or procedure which recodes the data to lessen the redundancy could possibly double or triple the effective data densities in stored or communicated data. Moreover, if compression is automatic, it can also aid in the reduction of software development costs. A transparent compression mechanism could permit the use of "sloppy" data structures, in that empty space or sparse encoding of data would not greatly expand the use of storage space or transfer time; however, that requires a good compression procedure. Several problems encountered when common compression methods are integrated into computer systems have prevented the widespread use of automatic data compression. For example: (1) poor runtime execution speeds interfere in the attainment of very high data rates; (2) most compression techniques are not flexible enough to process different types of redundancy; (3) blocks of compressed data that have unpredictable lengths present storage-space management problems. Each compression strategy poses a different set of these problems and, consequently, the use of each strategy is restricted to applications where its inherent weaknesses present no critical problems. This article introduces a new compression algorithm that is based on principles not found in existing commercial methods. This algorithm avoids many of the problems associated with older methods in that it dynamically adapts to the redundancy characteristics of the data being compressed. An investigation into possible applications of this algorithm yields insight into the compressibility of various types of data and serves to illustrate system problems inherent in using any compression scheme. For readers interested in simple but subtle procedures, some details of this algorithm and its implementations are also described. The focus throughout this article will be on transparent compression, in which the computer programmer is not aware of the existence of compression except in system performance. This form of compression is "noiseless" (the decompressed data is an exact replica of the input data), and the compression apparatus is given no special program information, such as data type or usage statistics. Transparency is perceived to be important because putting an extra burden on the application programmer would cause… (This article was written while Welch was employed at Sperry Research Center; he is now with Digital Equipment Corporation.)
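
The adaptive dictionary scheme this article introduces, and which the ECG paper above borrows as its entropy coder, is compact enough to sketch in full: the string table starts with all single bytes and grows by one entry per emitted code, adapting on the fly to whatever redundancy the input contains.

```python
# A compact LZW encoder in its classic textbook form.
def lzw_encode(data: bytes) -> list:
    table = {bytes([i]): i for i in range(256)}   # initial single-byte entries
    w, out = b"", []
    for b in data:
        wb = w + bytes([b])
        if wb in table:
            w = wb                                # extend the current match
        else:
            out.append(table[w])                  # emit the code for the longest match
            table[wb] = len(table)                # learn the new string
            w = bytes([b])
    if w:
        out.append(table[w])
    return out

print(lzw_encode(b"TOBEORNOTTOBEORTOBEORNOT"))    # repeated phrases collapse to codes
```
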

2,426 citations


"A wavelet transform-based ECG compr..." refers methods in this paper

  • ...The proposed algorithm was implemented on a SparcStation 2 computer, where the wavelet-based filters with 10-taps were designed by Daubechies’s algorithm [16], the layer was set to , the buffer size for segmenting input ECG signals was set to , and the Lempel–Ziv–Welch (LZW) encoder [20] was chosen as the entropy encoder for simplicity....


Journal ArticleDOI
TL;DR: The theoretical bases behind the direct ECG data compression schemes are presented and classified into three categories: tolerance-comparison compression, DPCM, and entropy coding methods; a framework for evaluation and comparison of ECG compression schemes is also presented.
Abstract: Electrocardiogram (ECG) compression techniques are compared, and a unified view of these techniques is established. ECG data compression schemes are presented in two major groups: direct data compression and transformation methods. The direct data compression techniques are ECG differential pulse code modulation (DPCM) and entropy coding, AZTEC, Turning-point, CORTES, Fan and SAPA algorithms, peak-picking, and cycle-to-cycle compression methods. The transformation methods include Fourier, Walsh, and Karhunen-Loeve transforms. The theoretical bases behind the direct ECG data compression schemes are presented and classified into three categories: tolerance-comparison compression, DPCM, and entropy coding methods. A framework for evaluation and comparison of ECG compression schemes is presented.
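
As one concrete example from this survey's taxonomy of direct methods, the turning point (TP) algorithm gives a fixed 2:1 compression by keeping, from each incoming pair of samples, the one that preserves local slope changes (peaks and valleys). The sketch below is the textbook formulation, not code from the survey.

```python
# Turning point (TP): fixed 2:1 direct compression of an ECG sample stream.
import numpy as np

def turning_point(x):
    sign = lambda v: (v > 0) - (v < 0)
    saved = [x[0]]
    i = 1
    while i + 1 < len(x):
        x0, x1, x2 = saved[-1], x[i], x[i + 1]
        if sign(x1 - x0) * sign(x2 - x1) < 0:
            saved.append(x1)        # x1 is a turning point (peak or valley): keep it
        else:
            saved.append(x2)        # monotone run: keep the later sample
        i += 2
    return np.array(saved)
```
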

690 citations


"A wavelet transform-based ECG compr..." refers methods in this paper

  • ...In most cases, direct methods are superior to transform methods with respect to system complexity and the error control mechanism, however, transform methods usually achieve higher compression ratios and are insensitive to the noise contained in original ECG signals [1]....


  • ...In direct methods, the compression is done directly on the ECG samples; examples include the amplitude zone time epoch coding (AZTEC), the turning point (TP), the coordinate reduction time encoding system (CORTES), the scan-along polygonal approximation (SAPA), peak-picking, cycle-to-cycle, and differential pulse code modulation (DPCM) [1]–[4]....


Journal ArticleDOI
TL;DR: Pilot data from a blind evaluation of compressed ECG's by cardiologists suggest that the clinically useful information present in original ECG signals is preserved by 8:1 compression, and in most cases 16:1 compressed ECGs are clinically useful.
Abstract: Wavelets and wavelet packets have recently emerged as powerful tools for signal compression. Wavelet and wavelet packet-based compression algorithms based on embedded zerotree wavelet (EZW) coding are developed for electrocardiogram (ECG) signals, and eight different wavelets are evaluated for their ability to compress Holter ECG data. Pilot data from a blind evaluation of compressed ECG's by cardiologists suggest that the clinically useful information present in original ECG signals is preserved by 8:1 compression, and in most cases 16:1 compressed ECG's are clinically useful.
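
The embedded property that makes EZW attractive for graded quality (8:1 versus 16:1) can be sketched as successive significance passes against a halving threshold: newly significant coefficient positions are emitted most-significant bitplane first, so the stream can be cut at any budget. The sketch below omits the zerotree symbols and the subordinate refinement pass of the real coder.

```python
# A heavily simplified sketch of embedded (bitplane-ordered) significance coding.
import numpy as np

def significance_passes(coeffs, n_passes=4):
    t = 2.0 ** np.floor(np.log2(np.max(np.abs(coeffs))))  # initial threshold
    significant, stream = set(), []
    for _ in range(n_passes):
        for i, c in enumerate(coeffs):
            if i not in significant and abs(c) >= t:
                significant.add(i)
                stream.append((i, "+" if c > 0 else "-")) # position and sign
        t /= 2.0                                          # move to the next bitplane
    return stream
```
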

445 citations