
Proceedings ArticleDOI

Electrocardiogram Compression Technique Using DWT-Based Residue Encoder with Desired Reconstruction Quality

01 Jan 2018

TL;DR: A new compression technique that exploits the high correlation between consecutive beats of an electrocardiogram (ECG), computing a progressive average beat and residues from groups of beats and compressing them using the Discrete Wavelet Transform (DWT).

Abstract: We propose a new compression technique which exploits the high correlation between consecutive beats of an electrocardiogram (ECG). A single progressive average beat (PAB) and residues were computed from every 10 ECG beats and subjected to compression using the Discrete Wavelet Transform (DWT). Reconstruction errors were controlled by selective reconstruction of the wavelet coefficients from the PAB and residues, using a 5% PRDN and a 99% energy reconstruction efficiency (ERE) criterion. Finally, the selected coefficients were encoded using a lossless compressor, a combination of delta and run-length encoders. An average CR of 8.96 and an absolute maximum error of 0.08 mV were obtained over 32 MIT-BIH Arrhythmia (MITA) records from PhysioNet. Major clinical features such as QRS amplitude, T height, P height, and QT interval showed low distortion even when the target PRDN limit was relaxed to 15%.
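To make the pipeline concrete, here is a minimal Python sketch of the PAB/residue decomposition and the PRDN-driven coefficient selection. It is a sketch under assumptions: the function names, the db4 wavelet, the decomposition level, and the greedy magnitude-ordered selection loop are ours rather than the paper's, and the 99% ERE criterion and the delta-plus-run-length lossless stage are omitted.

```python
import numpy as np
import pywt

def prdn(x, y):
    """Normalized percent RMS difference between original x and reconstruction y."""
    return 100.0 * np.linalg.norm(x - y) / np.linalg.norm(x - np.mean(x))

def select_coeffs(signal, target_prdn=5.0, wavelet="db4", level=4):
    """Keep the largest-magnitude DWT coefficients, adding them one by one
    until the inverse transform meets the PRDN target. Quadratic in the
    signal length; meant only for illustration."""
    coeffs, slices = pywt.coeffs_to_array(pywt.wavedec(signal, wavelet, level=level))
    order = np.argsort(np.abs(coeffs))[::-1]          # largest first
    pruned = np.zeros_like(coeffs)
    for idx in order:
        pruned[idx] = coeffs[idx]
        rec = pywt.waverec(
            pywt.array_to_coeffs(pruned, slices, output_format="wavedec"), wavelet)
        if prdn(signal, rec[: len(signal)]) <= target_prdn:
            break
    return pruned

def compress_group(beats, target_prdn=5.0):
    """Compress one group of equal-length beats (2-D array, e.g. 10 x samples):
    one coefficient set for the PAB plus one per residue."""
    pab = beats.mean(axis=0)                           # progressive average beat
    residues = beats - pab                             # inter-beat redundancy removed
    return [select_coeffs(pab, target_prdn)] + \
           [select_coeffs(r, target_prdn) for r in residues]
```

The long zero runs that this thresholding leaves in each `pruned` array are exactly what the delta and run-length encoders of the final lossless stage would exploit.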



Citations
Journal ArticleDOI
TL;DR: A hybrid lossy compression technique ensures on-demand quality of ECG data, specified either as distortion or as compression ratio; a useful outcome is the low reconstruction time in rapid screening of long arrhythmia records, since only abnormal beats are presented for evaluation.
Abstract: In long-term electrocardiogram (ECG) recording for arrhythmia monitoring, using a uniform compression strategy throughout the entire data to achieve high compression efficiency may result in unacceptable distortion of abnormal beats. The presented work addresses this problem, which is rarely discussed in published research. A support vector machine (SVM)-based binary classifier was implemented to identify the abnormal beats, achieving a classifier sensitivity (SE) and negative predictive value (NPV) of 99.89% and 0.003%, respectively, with 34 records from the MIT-BIH Arrhythmia database (mitdb). A hybrid lossy compression technique was implemented to ensure on-demand quality, specified either as distortion or as compression ratio (CR) of the ECG data. Wavelet-based compression was applied to the abnormal beats, while consecutive normal beats were compressed in groups using a hybrid encoder combining wavelets and principal component analysis. Finally, a neural network-based model, tuned offline by particle swarm optimization (PSO), was used to allocate the optimal quantization level of the transform-domain coefficients generated by the hybrid encoder. The proposed technique was evaluated on four morphology tags, "A," "F," "L," and "V," from mitdb, achieving less than 2% PRDN and less than 1% in two diagnostic distortion measures for abnormal beats. Overall, an average CR of 19.78 and a PRDN of 3.34% were obtained. A useful outcome of the proposed technique is the low reconstruction time in rapid screening of long arrhythmia records, when only abnormal beats are presented for evaluation.
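The abnormal/normal routing described here is easy to skeletonize. In the hypothetical Python below, `route_and_compress` and the two coder callbacks are our names, the SVM is assumed to be already trained on beat features, and the wavelet/PCA hybrid and the PSO-tuned quantization stay behind opaque functions.

```python
import numpy as np
from sklearn.svm import SVC

def route_and_compress(beats, features, classifier: SVC,
                       compress_abnormal, compress_normal_group):
    """Send each abnormal beat to its own wavelet coder; pool runs of
    consecutive normal beats and compress each run as one group."""
    streams, normal_run = [], []
    labels = classifier.predict(features)          # 1 = abnormal, 0 = normal
    for beat, abnormal in zip(beats, labels):
        if abnormal:
            if normal_run:                         # flush the pending normal run
                streams.append(compress_normal_group(np.array(normal_run)))
                normal_run = []
            streams.append(compress_abnormal(beat))
        else:
            normal_run.append(beat)
    if normal_run:                                 # trailing run of normals
        streams.append(compress_normal_group(np.array(normal_run)))
    return streams
```

Because abnormal beats are coded independently, a screening tool can decode only those streams, which is the source of the low reconstruction time the abstract mentions.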

7 citations


Cites methods from "Electrocardiogram Compression Technique Using DWT-Based Residue Encoder with Desired Reconstruction Quality"

  • ...For this, a group of 10 consecutive normal beats (N) was formed to compute an average and residuals [21]....


References
Journal ArticleDOI
TL;DR: A real-time algorithm that reliably recognizes QRS complexes based upon digital analyses of slope, amplitude, and width of ECG signals and automatically adjusts thresholds and parameters periodically to adapt to such ECG changes as QRS morphology and heart rate.
Abstract: We have developed a real-time algorithm for detection of the QRS complexes of ECG signals. It reliably recognizes QRS complexes based upon digital analyses of slope, amplitude, and width. A special digital bandpass filter reduces false detections caused by the various types of interference present in ECG signals. This filtering permits use of low thresholds, thereby increasing detection sensitivity. The algorithm automatically adjusts thresholds and parameters periodically to adapt to such ECG changes as QRS morphology and heart rate. For the standard 24 h MIT/BIH arrhythmia database, this algorithm correctly detects 99.3 percent of the QRS complexes.

5,782 citations


"Electrocardiogram Compression Techn..." refers methods in this paper

  • ...Preprocessing, beat extraction, PAB and residue computation: at the first stage, the raw ECG data set was filtered using a 4th-order low-pass Butterworth filter with a cut-off frequency of 70 Hz, followed by detection of R-peaks using the Pan-Tompkins algorithm [18]....
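Since the Pan-Tompkins detector [18] anchors the beat extraction quoted above, a heavily simplified Python sketch may help. The fixed threshold and refractory period are our simplifications (the published algorithm adapts its thresholds continuously and adds search-back logic), and the paper's 70 Hz Butterworth low-pass happens before this step.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def detect_qrs(ecg, fs):
    """Simplified Pan-Tompkins-style QRS detector. Returns indices of
    peaks in the integrated signal, i.e. QRS regions rather than exact
    R-sample locations."""
    # 1. Band-pass 5-15 Hz to emphasize QRS energy and reject drift/noise.
    b, a = butter(2, [5 / (fs / 2), 15 / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, ecg)
    # 2. Differentiate (slope), square (amplitude), integrate (~150 ms width).
    squared = np.diff(filtered) ** 2
    win = int(0.150 * fs)
    mwi = np.convolve(squared, np.ones(win) / win, mode="same")
    # 3. Fixed threshold plus 200 ms refractory period (the real algorithm
    #    adapts both from running signal/noise peak estimates).
    threshold = 0.5 * np.max(mwi)
    refractory = int(0.200 * fs)
    peaks, last = [], -refractory
    for i in range(1, len(mwi) - 1):
        if mwi[i] > threshold and mwi[i - 1] <= mwi[i] > mwi[i + 1]:
            if i - last >= refractory:
                peaks.append(i)
                last = i
    return np.array(peaks)
```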

Journal ArticleDOI
TL;DR: A wavelet electrocardiogram (ECG) data codec based on the set partitioning in hierarchical trees (SPIHT) compression algorithm is proposed and is significantly more efficient in compression and in computation than previously proposed ECG compression schemes.
Abstract: A wavelet electrocardiogram (ECG) data codec based on the set partitioning in hierarchical trees (SPIHT) compression algorithm is proposed in this paper. The SPIHT algorithm (A. Said and W.A. Pearlman, IEEE Trans. Circuits Syst. Video Technol., vol. 6, pp. 243-250, 1996) has achieved notable success in still image coding. The authors modified the algorithm for the one-dimensional case and applied it to compression of ECG data. Experiments on selected records from the MIT-BIH arrhythmia database revealed that the proposed codec is significantly more efficient in compression and in computation than previously proposed ECG compression schemes. The coder also attains exact bit rate control and generates a bit stream progressive in quality or rate.

493 citations


"Electrocardiogram Compression Techn..." refers background or methods in this paper

  • ...Based on the encoding procedure, the WT compression algorithms are grouped into three categories: (i) threshold-based algorithms [9-11], (ii) embedded coding algorithms [12-13], (iii) vector quantization (VQ) of wavelet coefficients [14]....

  • ...In [12-13] the embedded wavelet coding takes a considerable amount of execution time due to the large number of comparisons involved in checking whether a coefficient has descended from a zero-tree, at both the encoder and the decoder....
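To see where those comparisons go, here is a toy bit-plane significance coder in Python. It is emphatically not SPIHT or EZW: the set-partitioning/zerotree bookkeeping whose descendant checks cost the time noted above is omitted, along with the sign and refinement passes; only the progressive-truncation property survives. It assumes at least one nonzero coefficient.

```python
import numpy as np

def significance_bits(coeffs, n_planes=8):
    """Emit one significance bit per coefficient per bit-plane, from the
    most significant threshold downward."""
    bits = []
    t = 2.0 ** np.floor(np.log2(np.max(np.abs(coeffs))))
    for _ in range(n_planes):
        bits.extend(int(abs(c) >= t) for c in coeffs)
        t /= 2.0
    # The stream may be truncated after any plane and still decoded into a
    # coarser approximation: the "progressive in quality or rate" property.
    return bits
```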

Journal ArticleDOI
TL;DR: Pilot data from a blind evaluation of compressed ECGs by cardiologists suggest that the clinically useful information present in original ECG signals is preserved by 8:1 compression, and in most cases 16:1 compressed ECGs are clinically useful.
Abstract: Wavelets and wavelet packets have recently emerged as powerful tools for signal compression. Wavelet and wavelet packet-based compression algorithms based on embedded zerotree wavelet (EZW) coding are developed for electrocardiogram (ECG) signals, and eight different wavelets are evaluated for their ability to compress Holter ECG data. Pilot data from a blind evaluation of compressed ECGs by cardiologists suggest that the clinically useful information present in original ECG signals is preserved by 8:1 compression, and in most cases 16:1 compressed ECGs are clinically useful.

434 citations


"Electrocardiogram Compression Techn..." refers background or methods or result in this paper

  • ...The work [12] reports results for the MITA record 117....

  • ...Based on the encoding procedure, the WT compression algorithms are grouped into three categories: (i) threshold-based algorithms [9-11], (ii) embedded coding algorithms [12-13], (iii) vector quantization (VQ) of wavelet coefficients [14]....

  • ...In [12-13] the embedded wavelet coding takes a considerable amount of execution time due to the large number of comparisons involved in checking whether a coefficient has descended from a zero-tree, at both the encoder and the decoder....

Journal ArticleDOI
TL;DR: The method of Fourier descriptors (FDs) is presented for ECG data compression; it is resistant to noisy signals and simple, requiring only implementation of the forward and inverse FFT.
Abstract: The method of Fourier descriptors (FD's) is presented for ECG data compression. The two-lead ECG data are segmented into QRS complexes and S-Q intervals, expressed as a complex sequence, and are Fourier transformed to obtain the FD's. A few lower order descriptors symmetrically situated with respect to the dc coefficient represent the data in the Fourier (compressed) domain. While compression ratios of 10:1 are feasible for the S-Q interval, the clinical information requirements limit this ratio to 3:1 for the QRS complex. With an overall compression ratio greater than 7, the quality of the reconstructed signal is well suited for morphological studies. The method is resistant to noisy signals and is simple, requiring implementation of forward and inverse FFT. The results of compression of ECG data obtained from more than 50 subjects with rhythm and morphological abnormalities are presented.
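The FD construction is compact enough to sketch directly. In the hypothetical Python below, the two leads are packed into one complex sequence and only the low-order FFT coefficients around DC are kept; the `keep` budget is illustrative, whereas the paper sets it per segment type (roughly 3:1 for the QRS complex, up to 10:1 for the S-Q interval).

```python
import numpy as np

def fd_compress(z, keep=10):
    """Keep the 2*keep+1 lowest-order Fourier descriptors of a complex
    two-lead segment z = lead1 + 1j*lead2, symmetric about DC."""
    fd = np.fft.fft(z)
    order = np.fft.fftfreq(len(z)) * len(z)        # signed harmonic index
    mask = np.abs(order) <= keep
    return fd[mask], mask

def fd_reconstruct(kept, mask):
    fd = np.zeros(len(mask), dtype=complex)
    fd[mask] = kept
    return np.fft.ifft(fd)                         # .real -> lead1, .imag -> lead2

# Usage: z = lead1 + 1j * lead2
#        kept, mask = fd_compress(z, keep=10)
#        z_hat = fd_reconstruct(kept, mask)
```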

175 citations


"Electrocardiogram Compression Techn..." refers background in this paper

  • ...became popular after the 1980s due to their ability to represent the signal's entropy through energy compaction and intra- or inter-beat correlations [3-8]....

Journal ArticleDOI
TL;DR: A new electrocardiogram compression method based on the orthonormal wavelet transform and an adaptive quantization strategy, by which a predetermined percent root mean square difference (PRD) can be guaranteed with high compression ratio and low implementation complexity, is presented.
Abstract: This paper presents a new electrocardiogram (ECG) compression method based on orthonormal wavelet transform and an adaptive quantization strategy, by which a predetermined percent root mean square difference (PRD) can be guaranteed with high compression ratio and low implementation complexity.

134 citations
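Finally, the PRD guarantee in this last reference can be pictured as a search over the quantizer step size. The bisection, the db4 wavelet, and the single global step below are our assumptions; the paper's adaptive strategy quantizes subbands individually.

```python
import numpy as np
import pywt

def prd(x, y):
    """Percent RMS difference between original x and reconstruction y."""
    return 100.0 * np.linalg.norm(x - y) / np.linalg.norm(x)

def quantize_to_prd(signal, target_prd=2.0, wavelet="db4", level=5):
    """Bisect on a uniform quantizer step: coarser steps compress more but
    raise PRD, so find the coarsest step still meeting the target."""
    coeffs, slices = pywt.coeffs_to_array(pywt.wavedec(signal, wavelet, level=level))
    lo, hi = 1e-6, float(np.max(np.abs(coeffs)))
    best = coeffs
    for _ in range(40):
        step = 0.5 * (lo + hi)
        q = np.round(coeffs / step) * step
        rec = pywt.waverec(
            pywt.array_to_coeffs(q, slices, output_format="wavedec"), wavelet)
        if prd(signal, rec[: len(signal)]) <= target_prd:
            best, lo = q, step          # target met: try an even coarser step
        else:
            hi = step                   # too coarse: refine
    return best
```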