Journal ArticleDOI

Preserving Abnormal Beat Morphology in Long-Term ECG Recording: An Efficient Hybrid Compression Approach

01 May 2020-IEEE Transactions on Instrumentation and Measurement (Institute of Electrical and Electronics Engineers (IEEE))-Vol. 69, Iss: 5, pp 2084-2092

TL;DR: A hybrid lossy compression technique was implemented to ensure on-demand quality, in terms of either distortion or compression ratio of ECG data. A useful outcome is the low reconstruction time during rapid screening of long arrhythmia records, in which only abnormal beats are presented for evaluation.

Abstract: In long-term electrocardiogram (ECG) recording for arrhythmia monitoring, using a uniform compression strategy throughout the entire data to achieve high compression efficiency may result in unacceptable distortion of abnormal beats. The presented work addresses this problem, which is rarely discussed in published research. A support vector machine (SVM)-based binary classifier was implemented to identify the abnormal beats, achieving a classifier sensitivity (SE) and negative predictive value (NPV) of 99.89% and 0.003%, respectively, with 34 records from the MIT-BIH Arrhythmia database (mitdb). A hybrid lossy compression technique was implemented to ensure on-demand quality, in terms of either distortion or compression ratio (CR) of the ECG data. A wavelet-based compression was applied to the abnormal beats, while the consecutive normal beats were compressed in groups using a hybrid encoder employing a combination of wavelet transform and principal component analysis. Finally, a neural network-based intelligent model, tuned offline by a particle swarm optimization (PSO) technique, was used to allocate the optimal quantization levels of the transform-domain coefficients generated by the hybrid encoder. The proposed technique was evaluated with four types of morphology tags, "A," "F," "L," and "V," from the mitdb database, achieving less than 2% PRDN and less than 1% in two diagnostic distortion measures for abnormal beats. Overall, an average CR of 19.78 and a PRDN of 3.34% were obtained. A useful outcome of the proposed technique is the low reconstruction time during rapid screening of long arrhythmia records, in which only abnormal beats are presented for evaluation.
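The distortion figures above are reported as PRDN. As a point of reference, a minimal sketch of how PRD, PRDN, and CR are conventionally computed (the formulas are standard; the signals and names here are illustrative, not the paper's code):

```python
import numpy as np

def prd(x, x_rec):
    """Percentage root-mean-square difference (not mean-removed)."""
    return 100.0 * np.sqrt(np.sum((x - x_rec) ** 2) / np.sum(x ** 2))

def prdn(x, x_rec):
    """Normalized PRD: the mean of the original is removed, so the
    measure is independent of any DC offset in the recording."""
    return 100.0 * np.sqrt(np.sum((x - x_rec) ** 2)
                           / np.sum((x - np.mean(x)) ** 2))

def compression_ratio(n_original_bits, n_compressed_bits):
    """CR as conventionally reported: original size over compressed size."""
    return n_original_bits / n_compressed_bits

# Tiny demonstration on a synthetic beat
t = np.linspace(0, 1, 360)               # one second at 360 Hz (mitdb rate)
x = np.sin(2 * np.pi * 5 * t) + 0.5      # toy signal with a DC offset
x_rec = x + 0.01 * np.random.default_rng(0).standard_normal(t.size)
```

Because PRDN removes the baseline, it is unaffected by a constant offset in both signals, which is why it is preferred over PRD for recordings with drifting baselines.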



Citations
Proceedings ArticleDOI
01 Jun 2020
TL;DR: A dynamic method based on compressed sensing is proposed to reconstruct multi-lead electrocardiography signals in support of the Internet of Medical Things; the sensing matrix is dynamically evaluated from the signal samples acquired by the first lead.
Abstract: This paper proposes a dynamic method based on Compressed Sensing (CS) to reconstruct multi-lead electrocardiography (ECG) signals in support of Internet-of-Medical-Things. Specifically, the sensing matrix is dynamically evaluated through the signal samples acquired by the first lead. The experimental evaluation demonstrates that, compared to the traditional CS multi-lead method adopting a random sensing matrix, the proposed dynamic method exhibits a lower difference from the original ECG signal.
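The "traditional" baseline this paper compares against is compressed-sensing acquisition with a fixed random sensing matrix, y = Φx. A hedged sketch of that acquisition step only (dimensions, the Bernoulli matrix choice, and the toy signal are illustrative; reconstruction solvers are omitted):

```python
import numpy as np

rng = np.random.default_rng(42)

n = 256          # samples per ECG window (illustrative)
m = 64           # measurements, i.e. a 4x reduction of the window

# Random Bernoulli sensing matrix: the static baseline that the
# dynamic, first-lead-derived matrix above is compared against.
phi = rng.choice([-1.0, 1.0], size=(m, n)) / np.sqrt(m)

x = np.sin(2 * np.pi * np.arange(n) / 32)   # stand-in for one ECG window
y = phi @ x                                  # compressed measurements
```

The dynamic method's advantage, per the abstract, comes from adapting Φ to samples from the first lead rather than drawing it at random.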

10 citations


Cites methods from "Preserving Abnormal Beat Morphology..."

  • ...method for ECG signals is evaluated by the Percentage of Root-mean-squared Difference (PRD) [2]–[5], [7], [20]–[22], as, for PRD values lower than 9%, the clinical information...


  • ...In classic compression methods, a digital algorithm extracts only the signal features from the Nyquist rate samples [3]–[5]....


Journal ArticleDOI
TL;DR: This study points out drawbacks of existing compression algorithms and presents a new compression algorithm that is properly described, tested, and objectively compared with other authors' work, serving as an example of how standardization should look.
Abstract: Compression of the ECG signal is essential, especially in the area of signal transmission in telemedicine. Many compression algorithms exist, described in varying levels of detail, tested on various datasets, and with performance expressed in different ways; there is a lack of standardization in this area. This study points out these drawbacks and presents a new compression algorithm that is properly described, tested, and objectively compared with other authors' work, serving as an example of how standardization should look. The single-cycle fractal-based (SCyF) compression algorithm is introduced and tested on 4 different databases: the CSE database, the MIT-BIH arrhythmia database, a high-frequency signal, and the Brno University of Technology ECG quality database (BUT QDB). The SCyF algorithm is always compared with a well-known algorithm based on the wavelet transform and set partitioning in hierarchical trees, in terms of efficiency (2 methods) and quality/distortion of the signal after compression (12 methods). A detailed analysis of the results is provided. The results of the SCyF compression algorithm reach up to avL = 0.4460 bps and PRDN = 2.8236%.

5 citations

Journal ArticleDOI
TL;DR: An alternative method for compressed sensing and reconstruction of ECG that is patient agnostic and offers a high compression ratio is introduced; it preserves the structure of heartbeats, including the exact positions of R waves, and reduces the noise interfering with ECG signals.
Abstract: This paper introduces an alternative method for compressed sensing and reconstruction of ECG that is patient agnostic and offers a high compression ratio. The high compression ratio is achieved by high decimation of the measurement signal and its post requantization, further decreasing the number of bits needed for information transfer. The sensing method also incorporates a QRS detector to detect exact R wave positions for signal segmentation before compression. ECG signal is also normalized in amplitude and offset, which maintains the bit resolution during requantization. The reconstruction employs a simple dynamic ECG model, parameters of which are calculated from the measurement signal by the Differential Evolution algorithm. The proposed method was evaluated using the MIT-BIH arrhythmia database and compared with two wavelet dictionary reconstruction methods. The proposed method keeps the structure of heartbeats preserved including the exact positions of R waves, and it reduces the noise interfering with ECG signals.
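The bit savings in the sensing side above come from decimating the measurement signal and requantizing it after amplitude/offset normalization. A hedged sketch of that step (the decimation factor, bit depth, and function names are illustrative, not taken from the paper):

```python
import numpy as np

def normalize(x):
    """Remove offset and scale to [0, 1] so requantization uses the
    full available bit resolution."""
    x = x - x.min()
    return x / x.max()

def decimate_and_requantize(x, factor=4, bits=6):
    """Keep every `factor`-th sample, then requantize to `bits` bits,
    reducing both the sample count and the bits per sample."""
    y = normalize(x[::factor])
    levels = 2 ** bits - 1
    return np.round(y * levels).astype(np.uint8)

x = np.sin(2 * np.pi * np.arange(512) / 64)   # stand-in for a measurement signal
codes = decimate_and_requantize(x, factor=4, bits=6)
```

With these toy settings, 512 full-resolution samples become 128 six-bit codes; the paper's reconstruction then fits a dynamic ECG model to such measurements via Differential Evolution.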

1 citation

Journal ArticleDOI
TL;DR: A multi-lead electrocardiogram (MECG) compression technique, which preserves pathological information in different affected leads while achieving high overall compression, is described.
Abstract: In this paper, we describe a multi-lead electrocardiogram (MECG) compression technique, which preserves pathological information in different affected leads while achieving high overall compression...

Cites background or methods from "Preserving Abnormal Beat Morphology..."

  • ...The current research is motivated by the work [30], where it is established that lossy hybrid compression can be utilized both for preserving abnormal beat morphology and for achieving higher compression in arrhythmic ECG data....


  • ...In the present work, the basic principle of hybrid encoder of [30] has been adopted, and the abnormality detection component has been discarded....


Journal ArticleDOI
TL;DR: In this article, a single-cycle fractal-based compression algorithm and a compression algorithm based on a combination of wavelet transform and set partitioning in hierarchical trees are used to compress 125 15-lead ECG signals from the CSE database.
Abstract: The performance of ECG signal compression is influenced by many factors. However, no single study has focused primarily on the possible effects of ECG pathologies on the performance of compression algorithms. This study evaluates whether the pathologies present in ECG signals affect the efficiency and quality of compression. A single-cycle fractal-based compression algorithm and a compression algorithm based on a combination of wavelet transform and set partitioning in hierarchical trees are used to compress 125 15-lead ECG signals from the CSE database. The rhythm and morphology of these signals are newly annotated as physiological or pathological. The compression performance results are statistically evaluated. Using the two compression algorithms, physiological signals are compressed with better quality than pathological signals according to 8 and 9 out of 12 quality metrics, respectively. Moreover, it was statistically proven that pathological signals were compressed with lower efficiency than physiological signals. Signals with physiological rhythm and physiological morphology were compressed with the best quality; the worst results were reported for the group of signals with pathological rhythm and pathological morphology. This study is the first to deal with the effects of ECG pathologies on the performance of compression algorithms. Signal-by-signal rhythm and morphology annotations (physiological/pathological) for the CSE database are newly published.

References
Journal ArticleDOI
TL;DR: A real-time algorithm that reliably recognizes QRS complexes based upon digital analyses of slope, amplitude, and width of ECG signals and automatically adjusts thresholds and parameters periodically to adapt to such ECG changes as QRS morphology and heart rate.
Abstract: We have developed a real-time algorithm for detection of the QRS complexes of ECG signals. It reliably recognizes QRS complexes based upon digital analyses of slope, amplitude, and width. A special digital bandpass filter reduces false detections caused by the various types of interference present in ECG signals. This filtering permits use of low thresholds, thereby increasing detection sensitivity. The algorithm automatically adjusts thresholds and parameters periodically to adapt to such ECG changes as QRS morphology and heart rate. For the standard 24 h MIT/BIH arrhythmia database, this algorithm correctly detects 99.3 percent of the QRS complexes.
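The slope/amplitude/width analysis described above follows a well-known preprocessing chain: bandpass filtering, differentiation, squaring, and moving-window integration. A hedged numpy-only sketch of the last three stages (the bandpass stage and the original integer-coefficient filters at 200 Hz are omitted for brevity):

```python
import numpy as np

def pan_tompkins_stages(x, fs=200, win_ms=150):
    """Derivative -> squaring -> moving-window integration, the core
    preprocessing chain of the Pan-Tompkins QRS detector."""
    deriv = np.diff(x, prepend=x[0])     # emphasizes the steep QRS slope
    squared = deriv ** 2                  # all-positive, amplifies large slopes
    w = max(1, int(fs * win_ms / 1000))   # ~150 ms integration window
    kernel = np.ones(w) / w
    return np.convolve(squared, kernel, mode="same")

# Toy signal: two sharp "R waves" on a flat baseline
x = np.zeros(400)
x[100], x[300] = 1.0, 1.0
feature = pan_tompkins_stages(x)
```

In the full algorithm, adaptive thresholds on this feature signal (and on the bandpassed signal) pick out QRS complexes while tracking morphology and heart-rate changes.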

5,782 citations


"Preserving Abnormal Beat Morphology..." refers methods in this paper

  • ...The R-peaks were detected using a modification of the Pan-Tompkins algorithm [22]....


Journal ArticleDOI
TL;DR: The correlation between the proposed WDD measure and the MOS test measure (MOS/sub error/) was found superior to the correlation betweenThe popular PRD measure andThe MOS/ sub error/.
Abstract: In this paper, a new distortion measure for electrocardiogram (ECG) signal compression, called weighted diagnostic distortion (WDD), is introduced. The WDD measure is designed for comparing the distortion between the original ECG signal and the reconstructed ECG signal (after compression). The WDD is based on PQRST complex diagnostic features (such as P wave duration, QT interval, T shape, and ST elevation) of the original and reconstructed ECG signals. Unlike other conventional distortion measures [e.g., the percentage root mean square (rms) difference, or PRD], the WDD contains direct diagnostic information and thus is more meaningful and useful. Four compression algorithms were implemented (AZTEC, SAPA2, LTP, ASEC) in order to evaluate the WDD. A mean opinion score (MOS) test was applied to assess the quality of the reconstructed signals and to compare the quality measure (MOS_error) with the proposed WDD measure and the popular PRD measure. The evaluators in the MOS test were three independent expert cardiologists, who studied the reconstructed ECG signals in blind and semiblind tests. The correlation between the proposed WDD measure and the MOS test measure (MOS_error) was found to be superior to the correlation between the popular PRD measure and MOS_error.
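The WDD is structured as a normalized weighted sum of per-feature differences between the diagnostic feature vectors of the original and reconstructed beats. A hedged sketch of that weighted form (the feature values, weights, and function name are illustrative; the actual WDD specifies particular features and weightings):

```python
import numpy as np

def weighted_diagnostic_distortion(beta_orig, beta_rec, weights):
    """Normalized weighted distance between diagnostic feature vectors,
    in the spirit of the WDD: each feature difference is normalized by
    the original feature value, then weighted by diagnostic importance."""
    beta_orig = np.asarray(beta_orig, dtype=float)
    beta_rec = np.asarray(beta_rec, dtype=float)
    w = np.asarray(weights, dtype=float)
    delta = (beta_orig - beta_rec) / beta_orig   # relative feature error
    return 100.0 * (delta @ (w * delta)) / w.sum()

# Illustrative features: QRS duration (ms), QT interval (ms), R amplitude (mV)
orig = [95.0, 380.0, 1.10]
rec  = [96.0, 382.0, 1.08]
wdd = weighted_diagnostic_distortion(orig, rec, weights=[2.0, 1.0, 1.0])
```

Because each term is a relative error on a clinically meaningful feature, a small WDD directly indicates that diagnostic signatures survived compression, which sample-wise measures like PRD cannot guarantee.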

361 citations


"Preserving Abnormal Beat Morphology..." refers methods in this paper

  • ...%) of WDD and WEDD shows that abnormal beat morphology could be preserved in the compressed records....


  • ...To evaluate the clinical acceptability of the abnormal beats, two additional parameters, viz., weighted diagnostic distortion (WDD) [26] and wavelet energy-based diagnostic distortion (WEDD) [27] were computed for each type of abnormality as shown in Table III....



  • ...For computing the WDD, the following features were evaluated: 1) QRS duration; 2) P-wave height; 3) P-wave duration; 4) R-R interval; 5) QRS amplitude; 6) PR interval; 7) QT interval; 8) T-wave height....


  • ...(11) The low WDD and WEDD values clearly show that the clinical signatures of the abnormal beats are preserved in the reconstructed data....


Journal ArticleDOI
TL;DR: The proposed algorithm analyzes ECG data utilizing XWT and explores the resulting spectral differences and heuristically determined mathematical formula extracts the parameter(s) from the WCS and WCOH that are relevant for classification of normal and abnormal cardiac patterns.
Abstract: In this paper, we use cross wavelet transform (XWT) for the analysis and classification of electrocardiogram (ECG) signals. The cross-correlation between two time-domain signals gives a measure of similarity between two waveforms. The application of the continuous wavelet transform to two time series and the cross examination of the two decompositions reveal localized similarities in time and frequency. Application of the XWT to a pair of data yields wavelet cross spectrum (WCS) and wavelet coherence (WCOH). The proposed algorithm analyzes ECG data utilizing XWT and explores the resulting spectral differences. A pathologically varying pattern from the normal pattern in the QT zone of the inferior leads shows the presence of inferior myocardial infarction. A normal beat ensemble is selected as the absolute normal ECG pattern template, and the coherence between various other normal and abnormal subjects is computed. The WCS and WCOH of various ECG patterns show distinguishing characteristics over two specific regions R1 and R2, where R1 is the QRS complex area and R2 is the T-wave region. The Physikalisch-Technische Bundesanstalt diagnostic ECG database is used for evaluation of the methods. A heuristically determined mathematical formula extracts the parameter(s) from the WCS and WCOH. Empirical tests establish that the parameter(s) are relevant for classification of normal and abnormal cardiac patterns. The overall accuracy, sensitivity, and specificity after combining the three leads are obtained as 97.6%, 97.3%, and 98.8%, respectively.
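The wavelet cross spectrum described above is the product of one signal's continuous wavelet transform with the complex conjugate of the other's. A hedged sketch of that computation with a direct-convolution Morlet CWT (scales, the toy signals, and all names are illustrative; this is not the paper's implementation):

```python
import numpy as np

def morlet_cwt(x, scales, w0=6.0):
    """Continuous wavelet transform with a complex Morlet wavelet,
    computed by direct convolution (illustrative, not optimized)."""
    out = np.empty((len(scales), x.size), dtype=complex)
    for i, s in enumerate(scales):
        t = np.arange(-4 * s, 4 * s + 1)
        psi = np.exp(1j * w0 * t / s) * np.exp(-(t / s) ** 2 / 2) / np.sqrt(s)
        out[i] = np.convolve(x, np.conj(psi[::-1]), mode="same")
    return out

def cross_wavelet_spectrum(x, y, scales):
    """WCS: CWT of x times the complex conjugate of the CWT of y;
    its magnitude localizes shared power in the time-scale plane."""
    return morlet_cwt(x, scales) * np.conj(morlet_cwt(y, scales))

t = np.arange(512)
x = np.sin(2 * np.pi * t / 32)
y = np.sin(2 * np.pi * t / 32 + 0.3)   # same rhythm, phase shifted
wcs = cross_wavelet_spectrum(x, y, scales=np.array([8.0, 16.0, 32.0]))
```

In the cited work, such spectra (and the associated coherence) computed over the QRS and T-wave regions supply the parameters used to separate normal from abnormal patterns.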

222 citations


"Preserving Abnormal Beat Morphology..." refers methods in this paper

  • ...Among these, [5], [6], [11], and [28] used mitdb data, [29] and [30] used other databases, while [5], [6], and [30] achieved very high SE using various threshold-based and SVM-based clustering techniques....


Journal ArticleDOI
TL;DR: A technique to truthfully classify ECG signal data into two classes (abnormal and normal class) using various neural classifier is proposed using Back Propagation Network, Feed Forward Network, and Multilayered Perceptron.
Abstract: This paper deals with ECG signal analysis based on an artificial neural network and combined (discrete wavelet transform and morphology) features. We propose a technique to reliably classify ECG signal data into two classes (abnormal and normal) using various neural classifiers. The MIT-BIH arrhythmia database was utilized, and 45 one-minute recordings (25 of the normal class and 20 of the abnormal class) were selected out of 48 files based on the types of beats present in them. The total of 64 features is separated into two sets, 48 DWT-based features and 16 morphological features of the ECG signal, which are given as input to the classifier. Three neural network classifiers, Back Propagation Network (BPN), Feed Forward Network (FFN), and Multilayer Perceptron (MLP), are employed to classify the ECG signal. The classifier performance is measured in terms of Sensitivity (Se), Positive Predictivity (PP), and Specificity (SP). The best system performance, 100% accuracy, is achieved using the MLP.
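A hedged sketch of the kind of DWT-based feature extraction feeding such classifiers (a Haar transform with per-level detail statistics here; the cited work's actual 48 DWT features and 16 morphological features are not specified in this summary):

```python
import numpy as np

def haar_dwt_level(x):
    """One level of the Haar discrete wavelet transform:
    approximation (low-pass) and detail (high-pass) coefficients."""
    x = x[: len(x) // 2 * 2]                 # truncate to even length
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)
    return approx, detail

def dwt_features(beat, levels=3):
    """Summary statistics of detail coefficients per level: a common
    compact DWT feature set for beat classification."""
    feats = []
    a = np.asarray(beat, dtype=float)
    for _ in range(levels):
        a, d = haar_dwt_level(a)
        feats += [d.mean(), d.std(), np.abs(d).max()]
    return np.array(feats)

beat = np.sin(2 * np.pi * np.arange(256) / 64)   # stand-in for one beat
f = dwt_features(beat)
```

The resulting fixed-length feature vector (here 9 values from 3 levels) is what a BPN/FFN/MLP classifier would consume in place of raw samples.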

154 citations


"Preserving Abnormal Beat Morphology..." refers background or methods in this paper

  • ...Among these, [5], [6], [11], and [28] used mitdb data, [29] and [30] used other databases, while [5], [6], and [30] achieved very high SE using various threshold-based and SVM-based clustering techniques....


  • ...The published researches on long-term ECG recording are mainly directed to following areas: abnormal beat morphology detection in ECG monitoring [5], [6], and quality controlled data compression [7], [8]....


Journal ArticleDOI
01 Jan 2010
TL;DR: An ECG signal processing method with a quad level vector (QLV) is proposed for the ECG Holter system; the QLV serves both the compression flow and the classification flow to achieve better performance with low computational complexity.
Abstract: An ECG signal processing method with a quad level vector (QLV) is proposed for the ECG Holter system. The ECG processing consists of a compression flow and a classification flow, and the QLV is proposed for both flows to achieve better performance with low computational complexity. The compression algorithm is performed using an ECG skeleton and Huffman coding. Unit block size optimization, adaptive threshold adjustment, and 4-bit-wise Huffman coding methods are applied to reduce the processing cost while maintaining signal quality. Heartbeat segmentation and R-peak detection methods are employed for the classification algorithm. The performance is evaluated using the Massachusetts Institute of Technology-Boston's Beth Israel Hospital Arrhythmia Database, and a noise-robustness test is also performed for the reliability of the algorithm. The average compression ratio is 16.9:1 with a 0.641% percentage root mean square difference value, and the encoding rate is 6.4 kbps. The accuracy of the R-peak detection is 100% without noise and 95.63% in the worst case with -10-dB SNR noise. The overall processing cost is reduced by 45.3% with the proposed compression techniques.
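The entropy-coding stage of such skeleton-based compressors is Huffman coding over the extracted symbols. A hedged stdlib-only sketch of generic Huffman code construction (the cited work uses a 4-bit-wise variant; the toy delta symbols here are illustrative):

```python
import heapq
from collections import Counter

def huffman_code(symbols):
    """Build a prefix code from symbol frequencies with a heap:
    repeatedly merge the two least frequent subtrees."""
    freq = Counter(symbols)
    if len(freq) == 1:                      # degenerate single-symbol case
        return {next(iter(freq)): "0"}
    heap = [(n, i, {s: ""}) for i, (s, n) in enumerate(freq.items())]
    heapq.heapify(heap)
    next_id = len(heap)                     # tie-breaker so dicts never compare
    while len(heap) > 1:
        n1, _, c1 = heapq.heappop(heap)
        n2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + b for s, b in c1.items()}
        merged.update({s: "1" + b for s, b in c2.items()})
        heapq.heappush(heap, (n1 + n2, next_id, merged))
        next_id += 1
    return heap[0][2]

deltas = [0, 0, 1, 0, -1, 0, 0, 2, 0, 0]    # toy delta-coded skeleton samples
code = huffman_code(deltas)
bits = "".join(code[s] for s in deltas)
```

Frequent symbols (here the zero deltas of a flat skeleton) receive the shortest codewords, which is what makes delta-plus-Huffman effective on slowly varying ECG segments.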

149 citations


"Preserving Abnormal Beat Morphology..." refers methods in this paper

  • ...[18] describe a waveform delineation and information level to extract the skeleton of ECG, followed by Delta-Huffman coder to compress the data....
