Journal ArticleDOI

ECG data compression using truncated singular value decomposition

01 Dec 2001 - Vol. 5, Iss. 4, pp. 290-299
TL;DR: The results showed that the truncated SVD method can provide efficient coding with high compression ratios, demonstrating it as an effective technique for ECG data storage or signal transmission.
Abstract: The method of truncated singular value decomposition (SVD) is proposed for electrocardiogram (ECG) data compression. The signal decomposition capability of SVD is exploited to extract the significant feature components of the ECG by decomposing the signal into a set of basic patterns with associated scaling factors. Because of the strong interbeat correlation among ECG cycles, the signal information is concentrated in a small number of singular values and their associated singular vectors. Therefore, only these significant singular triplets need to be retained as the compressed data from which the original signal is retrieved; the insignificant components can be truncated to eliminate redundancy in the ECG data. The Massachusetts Institute of Technology-Beth Israel Hospital (MIT-BIH) arrhythmia database was used to evaluate the compression performance and the fidelity of the retrieved ECG signals. The method achieved an average data rate of 143.2 b/s with a relatively low reconstruction error. These results show that the truncated SVD method can provide efficient coding with high compression ratios, and its computational efficiency relative to other techniques makes it an effective approach for ECG data storage or transmission.
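To make the compression scheme concrete, the following is a minimal sketch (not the authors' implementation) of truncated SVD applied to a beat-aligned ECG matrix: beats are stacked row-wise after period normalization, and only the first k singular triplets are kept as the compressed data. The synthetic beat matrix and the choice of k = 3 are purely illustrative assumptions.

```python
import numpy as np

def truncated_svd_compress(beat_matrix, k):
    """Keep only the first k singular triplets of a beat-aligned ECG matrix.

    beat_matrix : (n_beats, n_samples) array, one period-normalized beat per row.
    Returns the truncated factors, which together form the compressed data.
    """
    U, s, Vt = np.linalg.svd(beat_matrix, full_matrices=False)
    return U[:, :k], s[:k], Vt[:k, :]

def truncated_svd_reconstruct(Uk, sk, Vtk):
    """Rebuild an approximation of the original beat matrix from k triplets."""
    return (Uk * sk) @ Vtk

# Illustrative usage with a synthetic quasi-periodic signal (not MIT-BIH data):
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 200)
template = np.exp(-((t - 0.5) ** 2) / 0.002)          # crude QRS-like pulse
beats = np.array([template + 0.01 * rng.standard_normal(t.size) for _ in range(64)])

Uk, sk, Vtk = truncated_svd_compress(beats, k=3)
approx = truncated_svd_reconstruct(Uk, sk, Vtk)
prd = 100 * np.linalg.norm(beats - approx) / np.linalg.norm(beats)
print(f"PRD with 3 singular triplets: {prd:.2f}%")
```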
Citations
01 Jan 2011
TL;DR: A thorough experimental study was conducted to show the superiority of the generalization capability of the Relevance Vector Machine (RVM) over the Extreme Learning Machine (ELM) approach in the automatic classification of ECG beats.

6 citations

Proceedings ArticleDOI
03 Sep 2007
TL;DR: This paper proposes a methodology for ECG (electrocardiograms) data compression based on R-R segmentation, which uses a segment dictionary combined with an efficient form of progressive error codification.
Abstract: This paper proposes a methodology for ECG (electrocardiograms) data compression based on R-R segmentation. An ECG can be seen as a quasi-periodic signal, where it is possible to find many similarities between heart beats. These similarities are explored by the proposed compression scheme through the use of a segment dictionary combined with an efficient form of progressive error codification. The dictionary is able to incorporate new patterns, in order to assure the algorithm adapts to changes in signal morphology. Experimental results reveal that high compression ratios are possible for highly regular signals, with irregular signals still achieving acceptable results.
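As a hedged illustration of the dictionary idea described in this abstract (not the authors' codec), the sketch below matches each period-normalized beat against stored templates and emits either a template index with a residual or a new dictionary entry; the matching criterion and tolerance are assumptions.

```python
import numpy as np

def encode_beats(beats, tol=0.05):
    """Greedy dictionary coding of period-normalized heart beats.

    Each beat is matched against stored templates; if the normalized error is
    below `tol`, only the template index and the residual are emitted,
    otherwise the beat itself is added to the dictionary as a new pattern.
    """
    dictionary, stream = [], []
    for beat in beats:
        best_idx, best_err = -1, np.inf
        for i, template in enumerate(dictionary):
            err = np.linalg.norm(beat - template) / np.linalg.norm(beat)
            if err < best_err:
                best_idx, best_err = i, err
        if best_err < tol:
            stream.append(("match", best_idx, beat - dictionary[best_idx]))
        else:
            dictionary.append(beat.copy())
            stream.append(("new", len(dictionary) - 1, None))
    return dictionary, stream

def decode_beats(dictionary, stream):
    """Rebuild the beat sequence from the dictionary and the coded stream."""
    out = []
    for kind, idx, residual in stream:
        out.append(dictionary[idx] + (residual if residual is not None else 0))
    return np.array(out)
```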

6 citations


Cites background from "ECG data compression using truncate..."

  • ...Experimental results reveal that high compression ratios are possible for highly regular signals, with irregular signals still achieving acceptable results....


Journal ArticleDOI
TL;DR: An information theory based multiscale singular value decomposition (SVD) is proposed for multilead electrocardiogram (ECG) signal processing, and the quality of all processed signals falls under the excellent category.

6 citations

Dissertation
01 Jan 2009
TL;DR: An automated ECG analysis method based on machine learning, signal processing and time-frequency analysis is developed to identify a number of fiducial points in ECG waveforms so that timing intervals and a smooth T-wave segment can be extracted for morphology analysis.
Abstract: This thesis investigates the mechanisms underlying drug-induced arrhythmia and proposes a new approach for the automated analysis of the electrocardiogram (ECG). The current method of assessing the cardiac safety of new drugs in clinical trials is by the measurement and analysis of the QT interval. However, the sensitivity and specificity of the QT interval has been questioned and alternative biomarkers based on T-wave morphology have been proposed in the literature. The mechanisms underlying drug effects on T-wave morphology are not clearly understood. Therefore, a combined approach of forward cardiac modelling and inverse ECG analysis is adopted to investigate the effects of sotalol, a compound known to have pro-arrhythmic effects, on ventricular repolarisation.

A computational model of sotalol and IKr, an ion channel that plays a critical role in ventricular repolarisation, was developed. This model was incorporated into a model of the human ventricular myocyte, and subsequently arranged in a 1-D fibre model of 200 cells. The model was used to assess the effect of sotalol on IKr, action potential duration and biomarkers of ventricular repolarisation derived from the simulated ECG.

In parallel, an automated ECG analysis method based on machine learning, signal processing and time-frequency analysis is developed to identify a number of fiducial points in ECG waveforms so that timing intervals and a smooth T-wave segment can be extracted for morphology analysis. The approach is to train a hidden Markov model (HMM) using a data set of ECG waveforms and the corresponding expert annotations. The signal is first encoded using the undecimated wavelet transform (UWT). The UWT coefficients are used for R-peak detection, signal encoding for the HMM and a wavelet de-noising procedure. Using the Viterbi algorithm, the trained HMM is then applied to a subset of the ECG signal to infer the fiducial points for each heart beat. Furthermore, a method for deriving a confidence measure based on the trained HMM is implemented so that a level of confidence can be associated with the automated annotations. Finally, the T-wave segment is extracted from the de-noised ECG signal for morphology characterisation.

This thesis contributes to the literature on automated characterisation of drug effects on ventricular repolarisation in three different ways. Firstly, it investigates the mechanisms underlying the effects of drug inhibition of IKr on ventricular repolarisation as captured by the simulated ECG signal. Secondly, it shows how the combination of UWT encoding and HMM inference can be effectively used to segment 24-hour Holter ECG recordings. Evaluation of the segmentation algorithm on a clinical ECG data set demonstrates the ability of the algorithm to overcome problems associated with existing automated systems, and hence provide a more robust analysis of ECG signals. Finally, the thesis provides insight into the drug effects of sotalol on ventricular repolarisation as captured by biomarkers extracted from the surface ECG.
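The HMM inference step mentioned above relies on standard Viterbi decoding; the sketch below is a generic log-domain implementation, with the wavelet-derived observation model abstracted into a precomputed emission log-likelihood matrix (state definitions such as P wave, QRS and T wave are assumed for illustration, not taken from the thesis).

```python
import numpy as np

def viterbi(log_A, log_pi, log_B):
    """Most likely hidden-state path for an HMM, computed in the log domain.

    log_A  : (S, S) log transition matrix.
    log_pi : (S,)   log initial-state probabilities.
    log_B  : (T, S) per-sample emission log-likelihoods (e.g. derived from
             undecimated-wavelet features; abstracted away in this sketch).
    Returns an integer state path of length T, i.e. one waveform label
    (such as P, QRS, T or baseline) per sample.
    """
    T, S = log_B.shape
    delta = np.empty((T, S))
    back = np.zeros((T, S), dtype=int)
    delta[0] = log_pi + log_B[0]
    for t in range(1, T):
        scores = delta[t - 1][:, None] + log_A      # (previous state, next state)
        back[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) + log_B[t]
    path = np.empty(T, dtype=int)
    path[-1] = delta[-1].argmax()
    for t in range(T - 2, -1, -1):                  # backtrack the best path
        path[t] = back[t + 1, path[t + 1]]
    return path
```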

6 citations


Cites methods from "ECG data compression using truncate..."

  • ...SVD has been applied to the ECG in a variety of applications, including noise filtering [270], ECG compression [271] and removing foetal ECG signals from maternal ECG signals [272]....


Journal ArticleDOI
TL;DR: A novel algorithm for the compression of ECG signals to reduce energy consumption in RHMs, in which discrete Krawtchouk moments act as a feature extractor and the accelerated Ant Lion Optimizer selects the optimum features that achieve the best reconstructed signal.
Abstract: Remote Healthcare Monitoring Systems (RHMs) that use ECG signals are very effective tools for the early diagnosis of various heart conditions. However, these systems are still confronted with a problem that reduces their efficiency: energy consumption in wearable devices, which are battery-powered and have limited storage. This paper presents a novel algorithm for the compression of ECG signals to reduce energy consumption in RHMs. The proposed algorithm uses discrete Krawtchouk moments as a feature extractor to obtain features from the ECG signal. Then the accelerated Ant Lion Optimizer (AALO) selects the optimum features that achieve the best reconstructed signal. Our proposed algorithm is extensively validated using two benchmark datasets: MIT-BIH arrhythmia and ECG-ID. The proposed algorithm achieves average values of compression ratio (CR), percent root-mean-square difference (PRD), signal-to-noise ratio (SNR), peak signal-to-noise ratio (PSNR), and quality score (QS) of 15.56, 0.69, 44.52, 49.04 and 23.92, respectively. The comparison demonstrates the advantages of the proposed compression algorithm over recent algorithms with respect to the mentioned performance metrics. It was also tested and compared against other existing algorithms with respect to processing time, compression speed and computational efficiency. The obtained results show that the proposed algorithm clearly outperforms them in terms of processing time (6.89 s), compression speed (4640.19 bps) and computational efficiency (2.95). The results also indicate that the proposed compression algorithm reduces energy consumption in a wearable device by decreasing the wake-up time by 3600 ms.
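The quoted figures of merit have widely used definitions; the sketch below shows one common way to compute them (the exact variants used in the cited paper, e.g. whether the signal mean is removed before computing PRD, are assumptions).

```python
import numpy as np

def compression_metrics(original, reconstructed, original_bits, compressed_bits):
    """Common ECG compression figures of merit (standard definitions; the cited
    paper may use slightly different variants, such as mean-subtracted PRD)."""
    err = original - reconstructed
    cr = original_bits / compressed_bits                           # compression ratio
    prd = 100 * np.linalg.norm(err) / np.linalg.norm(original)     # percent RMS difference
    snr = 10 * np.log10(np.sum(original ** 2) / np.sum(err ** 2))  # signal-to-noise ratio (dB)
    psnr = 10 * np.log10(np.max(np.abs(original)) ** 2 / np.mean(err ** 2))  # peak SNR (dB)
    qs = cr / prd                                                  # quality score
    return {"CR": cr, "PRD": prd, "SNR": snr, "PSNR": psnr, "QS": qs}
```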

6 citations

References
Book
01 Jan 1983

34,729 citations


"ECG data compression using truncate..." refers background in this paper

  • ...Therefore, the SVD of the matrix $X$ can be performed as $X = U \Sigma V^{T}$ [20], where the columns of $U$ and $V$ are the left and right singular vectors, respectively....


Journal ArticleDOI
TL;DR: The theoretical bases behind the direct ECG data compression schemes are presented and classified into three categories: tolerance-comparison compression, DPCM, and entropy coding methods; a framework for the evaluation and comparison of ECG compression schemes is also presented.
Abstract: Electrocardiogram (ECG) compression techniques are compared, and a unified view of these techniques is established. ECG data compression schemes are presented in two major groups: direct data compression and transformation methods. The direct data compression techniques are ECG differential pulse code modulation (DPCM) and entropy coding, AZTEC, Turning-point, CORTES, Fan and SAPA algorithms, peak-picking, and cycle-to-cycle compression methods. The transformation methods include Fourier, Walsh, and Karhunen-Loeve transforms. The theoretical bases behind the direct ECG data compression schemes are presented and classified into three categories: tolerance-comparison compression, DPCM, and entropy coding methods. A framework for evaluation and comparison of ECG compression schemes is presented.
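As a minimal example of the "direct data compression" category surveyed here, the sketch below implements closed-loop first-order DPCM; the step size is an assumption, and a practical codec would add a more elaborate predictor and an entropy coder on top of the integer symbols.

```python
def dpcm_encode(signal, step=8):
    """Closed-loop first-order DPCM: each sample is predicted by the previous
    *reconstructed* sample, and the quantized prediction error is emitted."""
    symbols = []
    prediction = 0.0
    for x in signal:
        q = int(round((x - prediction) / step))   # quantized prediction error
        symbols.append(q)
        prediction += q * step                    # track the decoder-side value
    return symbols, step

def dpcm_decode(symbols, step):
    """Accumulate the quantized prediction errors to rebuild the signal."""
    out, value = [], 0.0
    for q in symbols:
        value += q * step
        out.append(value)
    return out
```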

690 citations


"ECG data compression using truncate..." refers methods in this paper

  • ...The compression techniques for an ECG have been extensively discussed [1] and can be classified into the following three major categories....


Journal ArticleDOI
TL;DR: A wavelet electrocardiogram (ECG) data codec based on the set partitioning in hierarchical trees (SPIHT) compression algorithm is proposed and is significantly more efficient in compression and in computation than previously proposed ECG compression schemes.
Abstract: A wavelet electrocardiogram (ECG) data codec based on the set partitioning in hierarchical trees (SPIHT) compression algorithm is proposed in this paper. The SPIHT algorithm (A. Said and W.A. Pearlman, IEEE Trans. Ccts. Syst. II, vol. 6, p. 243-50, 1996) has achieved notable success in still image coding. The authors modified the algorithm for the one-dimensional case and applied it to compression of ECG data. Experiments on selected records from the MIT-BIH arrhythmia database revealed that the proposed codec is significantly more efficient in compression and in computation than previously proposed ECG compression schemes. The coder also attains exact bit rate control and generates a bit stream progressive in quality or rate.
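SPIHT itself is an involved bit-plane set-partitioning coder; the sketch below only illustrates the simpler transform-and-discard step that wavelet ECG codecs build on, using the PyWavelets package. The wavelet choice, decomposition level and kept-coefficient fraction are assumptions, and no entropy coding of the significance map is attempted.

```python
import numpy as np
import pywt  # PyWavelets

def wavelet_threshold(signal, wavelet="bior4.4", level=5, keep=0.05):
    """Keep only the largest `keep` fraction of wavelet coefficients.

    This is *not* SPIHT: it only shows the transform-and-discard stage that
    codecs such as the one above refine with set partitioning and efficient
    coding of the significance map.
    """
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    flat = np.concatenate([c.ravel() for c in coeffs])
    thresh = np.quantile(np.abs(flat), 1 - keep)                   # magnitude cutoff
    coeffs = [np.where(np.abs(c) >= thresh, c, 0.0) for c in coeffs]
    return pywt.waverec(coeffs, wavelet)[: len(signal)]
```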

521 citations

Journal ArticleDOI
TL;DR: Pilot data from a blind evaluation of compressed ECG's by cardiologists suggest that the clinically useful information present in original ECG signals is preserved by 8:1 compression, and in most cases 16:1 compressed ECGs are clinically useful.
Abstract: Wavelets and wavelet packets have recently emerged as powerful tools for signal compression. Wavelet and wavelet packet-based compression algorithms based on embedded zerotree wavelet (EZW) coding are developed for electrocardiogram (ECG) signals, and eight different wavelets are evaluated for their ability to compress Holter ECG data. Pilot data from a blind evaluation of compressed ECG's by cardiologists suggest that the clinically useful information present in original ECG signals is preserved by 8:1 compression, and in most cases 16:1 compressed ECG's are clinically useful.

445 citations


"ECG data compression using truncate..." refers methods in this paper

  • ...[23]) provides a better performance than previous wavelet-based methods (Hilton [22] and Djohan et al....


Journal ArticleDOI
TL;DR: A preprocessing program developed for real-time monitoring of the electrocardiogram by digital computer has proved useful for rhythm analysis.
Abstract: A preprocessing program developed for real-time monitoring of the electrocardiogram by digital computer has proved useful for rhythm analysis. The program suppresses low amplitude signals, reduces the data rate by a factor of about 10, and codes the result in a form convenient for analysis.
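The data-rate reduction described here comes from AZTEC's zero-order interpolation; the sketch below implements only that plateau-forming step (the conversion of short plateau runs into slopes, which completes the AZTEC algorithm, is omitted), with an illustrative aperture value.

```python
def aztec_plateaus(signal, aperture=10):
    """Zero-order interpolation step of AZTEC: emit (value, length) plateaus
    whenever the running min/max span exceeds the aperture.  The full AZTEC
    algorithm additionally converts runs of short plateaus into slopes; that
    refinement is omitted from this sketch."""
    plateaus = []
    lo = hi = signal[0]
    length = 0
    for x in signal:
        if max(hi, x) - min(lo, x) > aperture and length > 0:
            plateaus.append(((lo + hi) / 2.0, length))   # close the current plateau
            lo = hi = x
            length = 1
        else:
            lo, hi = min(lo, x), max(hi, x)
            length += 1
    plateaus.append(((lo + hi) / 2.0, length))
    return plateaus
```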

374 citations


"ECG data compression using truncate..." refers methods in this paper

  • ...2) Direct time-domain techniques: including amplitude zone time epoch coding (AZTEC), delta coding, and entropy coding [2]–[4]....
