Journal ArticleDOI

ECG data compression using truncated singular value decomposition

01 Dec 2001-Vol. 5, Iss: 4, pp 290-299
TL;DR: The results showed that the truncated SVD method can provide efficient coding with high compression ratios, and they demonstrated the method as an effective technique for ECG data storage or signal transmission.
Abstract: The method of truncated singular value decomposition (SVD) is proposed for electrocardiogram (ECG) data compression. The signal-decomposition capability of SVD is exploited to extract the significant feature components of the ECG by decomposing the signal into a set of basic patterns with associated scaling factors. Because of the strong interbeat correlation among ECG cycles, the signal information is mostly concentrated within a small number of singular values and their related singular vectors. Therefore, only the relevant parts of the singular triplets need to be retained as the compressed data for retrieving the original signals; the insignificant remainder can be truncated to eliminate redundancy. The Massachusetts Institute of Technology-Beth Israel Hospital (MIT-BIH) arrhythmia database was used to evaluate the compression performance and the recoverability of the retrieved ECG signals. The method achieved an average data rate of 143.2 b/s with a relatively low reconstruction error. These results showed that the truncated SVD method can provide efficient coding with high compression ratios, and its computational efficiency in comparison with other techniques demonstrated the method as an effective technique for ECG data storage or signal transmission.
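The compression idea in the abstract can be illustrated with a minimal numpy sketch. This is not the paper's exact algorithm: beat segmentation, period normalization, and quantization/coding of the retained triplets are assumed to happen elsewhere, and the toy data below is synthetic.

```python
import numpy as np

def svd_compress(beats, k):
    """Keep the top-k singular triplets of a beat-aligned ECG matrix.

    beats: (m, n) array with one pre-segmented ECG cycle per row.
    Returns (U_k, s_k, Vt_k) -- the compressed representation.
    """
    U, s, Vt = np.linalg.svd(beats, full_matrices=False)
    return U[:, :k], s[:k], Vt[:k, :]

def svd_reconstruct(Uk, sk, Vtk):
    """Rebuild an approximation of the beat matrix from the retained triplets."""
    return (Uk * sk) @ Vtk  # scale columns of U_k by singular values

# Toy data: 20 strongly correlated "beats" of 50 samples each.
rng = np.random.default_rng(0)
template = np.sin(np.linspace(0, 2 * np.pi, 50))
beats = np.vstack([template + 0.01 * rng.standard_normal(50)
                   for _ in range(20)])

Uk, sk, Vtk = svd_compress(beats, k=2)
approx = svd_reconstruct(Uk, sk, Vtk)
# Percent root-mean-square difference (PRD), a common ECG distortion measure.
prd = 100 * np.linalg.norm(beats - approx) / np.linalg.norm(beats)
```

Storing k(m + n + 1) numbers instead of m·n is where the compression comes from; the strong interbeat correlation is what lets a small k keep the PRD low.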
Citations
Journal ArticleDOI
TL;DR: A novel, simple but discriminative algorithm using a minimum number of physiological signals and a time-varying singular value decomposition (TSVD) approach that provides a computationally efficient and robust characterization of the signals in the presence of individual differences and noise.

22 citations

Journal ArticleDOI
TL;DR: Lossless compression schemes for ECG signals based on neural network predictors and entropy encoders are presented, and it is shown that superior performance in terms of the distortion parameters of the reconstructed signals can be achieved with the proposed schemes.
Abstract: This paper presents lossless compression schemes for ECG signals based on neural network predictors and entropy encoders. Decorrelation is achieved by nonlinear prediction in the first stage, and encoding of the residues is done by lossless entropy encoders in the second stage. Different types of lossless encoders, such as Huffman, arithmetic, and run-length encoders, are used. The performances of the proposed neural network predictor-based compression schemes are evaluated using standard distortion and compression efficiency measures. Selected records from the MIT-BIH arrhythmia database are used for performance evaluation. The proposed compression schemes are compared with linear predictor-based compression schemes, and it is shown that about 11% improvement in compression efficiency can be achieved for neural network predictor-based schemes with the same quality and a similar setup. They are also compared with other known ECG compression methods, and the experimental results show that superior performance in terms of the distortion parameters of the reconstructed signals can be achieved with the proposed schemes.
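The two-stage structure described above (prediction, then entropy coding of the residues) can be sketched with a fixed linear predictor standing in for the paper's neural network. The second-order coefficients below are an illustrative assumption, and the entropy is estimated empirically rather than by running an actual coder.

```python
import numpy as np

def predict_residuals(x, a1=2.0, a2=-1.0):
    """First stage: decorrelate with the linear predictor
    x_hat[n] = a1*x[n-1] + a2*x[n-2], returning integer residues."""
    x = np.asarray(x, dtype=np.int64)
    pred = np.round(a1 * x[1:-1] + a2 * x[:-2]).astype(np.int64)
    return x[2:] - pred

def empirical_entropy(values):
    """Bits/sample an ideal entropy coder (second stage) would need."""
    _, counts = np.unique(values, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

# A smooth quantized signal: residues need far fewer bits than raw samples.
x = np.round(100 * np.sin(np.linspace(0, 4 * np.pi, 1000))).astype(np.int64)
raw_bits = empirical_entropy(x)
residual_bits = empirical_entropy(predict_residuals(x))
```

The gap between `raw_bits` and `residual_bits` is the compression headroom the predictor creates; a better (e.g., nonlinear) predictor narrows the residue distribution further.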

21 citations

Proceedings ArticleDOI
01 Sep 2016
TL;DR: Quantitative results show the superiority of the SAM scheme against state-of-the-art techniques: compression ratios of up to 35-, 70-, and 180-fold are generally achievable for PPG, ECG, and RESP signals, respectively, while reconstruction errors remain between 2% and 7% and the input signal morphology is preserved.
Abstract: Wearable devices allow the seamless and inexpensive gathering of biomedical signals such as electrocardiograms (ECG), photoplethysmograms (PPG), and respiration traces (RESP). They are battery operated and resource constrained, and as such need dedicated algorithms to optimally manage energy and memory. In this work, we design SAM, a Subject-Adaptive (lossy) coMpression technique for physiological quasi-periodic signals. It achieves a substantial reduction in their data volume, allowing efficient storage and transmission, and thus helping extend the devices' battery life. SAM is based upon a subject-adaptive dictionary, which is learned and refined at runtime exploiting the time-adaptive self-organizing map (TASOM) unsupervised learning algorithm. Quantitative results show the superiority of our scheme against state-of-the-art techniques: compression ratios of up to 35-, 70- and 180-fold are generally achievable respectively for PPG, ECG and RESP signals, while reconstruction errors (RMSE) remain between 2% and 7% and the input signal morphology is preserved.
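A toy dictionary coder conveys the flavor of this approach. The TASOM-learned, subject-adaptive dictionary is replaced here by a naive grow-on-miss codebook, so this is only a conceptual sketch of "encode a segment as an index when a close codeword exists":

```python
import numpy as np

def dict_compress(segments, max_rmse):
    """Encode each segment as a codebook index when a close match exists;
    otherwise send it verbatim and grow the codebook (a naive stand-in
    for a learned, subject-adaptive dictionary)."""
    codebook, encoded = [], []
    for seg in segments:
        if codebook:
            errs = [np.sqrt(np.mean((seg - c) ** 2)) for c in codebook]
            i = int(np.argmin(errs))
            if errs[i] <= max_rmse:
                encoded.append(("idx", i))  # cheap: just an index
                continue
        codebook.append(seg.copy())
        encoded.append(("raw", seg.copy()))  # expensive: full segment
    return encoded

def dict_decompress(encoded):
    """Rebuild the codebook on the decoder side from the 'raw' entries."""
    codebook, out = [], []
    for tag, payload in encoded:
        if tag == "raw":
            codebook.append(payload)
            out.append(payload)
        else:
            out.append(codebook[payload])
    return out
</ ```

Quasi-periodic signals mean most segments hit the codebook, so the stream is dominated by small indices; the real scheme continuously refines dictionary entries at runtime instead of freezing them.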

20 citations

Journal ArticleDOI
TL;DR: The quantization strategy for the extracted non-zero wavelet coefficients (NZWC), combining RLE, Huffman, and arithmetic encoding of the NZWC with a resulting lookup table, allows the accomplishment of high compression ratios with good-quality reconstructed signals.
Abstract: This paper presents a new quality-controlled, wavelet-based compression method for electrocardiogram (ECG) signals. Initially, an ECG signal is decomposed using the wavelet transform. Then, the resulting coefficients are iteratively thresholded to guarantee that a predefined goal percent root mean square difference (GPRD) is matched within tolerable boundaries. The quantization strategy for the extracted non-zero wavelet coefficients (NZWC), combining RLE, Huffman, and arithmetic encoding of the NZWC with a resulting lookup table, allows the accomplishment of high compression ratios with good-quality reconstructed signals. Keywords—ECG compression, Non-uniform Max-Lloyd quantizer, PRD, Quality-Controlled, Wavelet transform
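The iterative thresholding loop described above can be sketched with a multilevel Haar transform standing in for the paper's wavelet and a simple linear threshold sweep standing in for its search strategy; both substitutions are assumptions for illustration.

```python
import numpy as np

def haar_forward(x):
    """Multilevel orthonormal Haar transform (len(x) must be a power of two)."""
    out = np.asarray(x, dtype=float).copy()
    n = len(out)
    while n > 1:
        a = (out[0:n:2] + out[1:n:2]) / np.sqrt(2)  # approximations
        d = (out[0:n:2] - out[1:n:2]) / np.sqrt(2)  # details
        out[:n // 2], out[n // 2:n] = a, d
        n //= 2
    return out

def haar_inverse(c):
    """Invert haar_forward level by level."""
    out = np.asarray(c, dtype=float).copy()
    n = 1
    while n < len(out):
        a, d = out[:n].copy(), out[n:2 * n].copy()
        out[0:2 * n:2] = (a + d) / np.sqrt(2)
        out[1:2 * n:2] = (a - d) / np.sqrt(2)
        n *= 2
    return out

def threshold_to_prd(x, target_prd):
    """Sweep the threshold upward, keeping the largest one whose
    reconstruction still meets the PRD goal (in percent)."""
    coeffs = haar_forward(x)
    best = coeffs
    for t in np.linspace(0, np.abs(coeffs).max(), 200):
        kept = np.where(np.abs(coeffs) >= t, coeffs, 0.0)
        prd = 100 * np.linalg.norm(x - haar_inverse(kept)) / np.linalg.norm(x)
        if prd > target_prd:
            break
        best = kept
    return best
```

The zeroed coefficients are what RLE and the entropy coders then exploit; the real method additionally quantizes the survivors with a non-uniform Max-Lloyd quantizer.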

19 citations

Journal ArticleDOI
TL;DR: A relatively simple and cost-effective method for early detection of premonitory symptoms of Alzheimer's disease, based on the analysis of handwriting data, achieved performance suggesting that it could potentially be used in clinical devices.
Abstract: Early detection of Alzheimer's disease (AD) has attracted the attention of the scientific and clinical community because of its application in control, early care, and treatment. The development of a cost-effective but reliable method is a challenge in this field. To address this challenge, this study presents an efficient algorithm based on the analysis of handwriting data. Detection of premonitory symptoms using handwriting data can be difficult due to individual differences and the effects of different sources of variability and noise. For this purpose, a noise-robust paradigm was adopted that was insensitive to small variations. It was based on the singular value decomposition technique and a sparse non-negative least-squares classifier. To find the best results, the effects of single- and dual-task conditions as well as several handwriting time series, such as horizontal, vertical, and absolute velocity, acceleration, pressure, and trajectory curvature, were studied. The discriminant capability of the proposed method was studied in 13 subjects with mild cognitive impairment (MCI), 15 with AD, and 15 healthy participants. They performed four writing tasks under single- and dual-task conditions. The new feature extracted from vertical acceleration yielded a high average accuracy rate of 100% in classification between healthy controls and subjects with MCI. An average accuracy rate of 93.5% was also obtained in discriminating between healthy controls and AD patients. Further investigation confirmed that using the proposed features under the dual-task condition could enhance the detection rate. Achieving high performance with a relatively simple and cost-effective method demonstrated that it could potentially be used in clinical devices.

18 citations

References
Book
01 Jan 1983

34,729 citations


"ECG data compression using truncate..." refers background in this paper

  • ...Therefore, the SVD of the matrix A can be performed as A = UΣVᵀ [20], where the columns of U and V are the left and right singular vectors, respectively....


Journal ArticleDOI
TL;DR: The theoretical bases behind the direct ECG data compression schemes are presented and classified into three categories: tolerance-comparison compression, DPCM, and entropy coding methods and a framework for evaluation and comparison of ECG compression schemes is presented.
Abstract: Electrocardiogram (ECG) compression techniques are compared, and a unified view of these techniques is established. ECG data compression schemes are presented in two major groups: direct data compression and transformation methods. The direct data compression techniques are ECG differential pulse code modulation (DPCM) and entropy coding, AZTEC, Turning-point, CORTES, Fan and SAPA algorithms, peak-picking, and cycle-to-cycle compression methods. The transformation methods include Fourier, Walsh, and Karhunen-Loeve transforms. The theoretical bases behind the direct ECG data compression schemes are presented and classified into three categories: tolerance-comparison compression, DPCM, and entropy coding methods. A framework for evaluation and comparison of ECG compression schemes is presented.

690 citations


"ECG data compression using truncate..." refers methods in this paper

  • ...The compression techniques for an ECG have been extensively discussed [ 1 ] and can be classified into the following three major categories....


Journal ArticleDOI
TL;DR: A wavelet electrocardiogram (ECG) data codec based on the set partitioning in hierarchical trees (SPIHT) compression algorithm is proposed and is significantly more efficient in compression and in computation than previously proposed ECG compression schemes.
Abstract: A wavelet electrocardiogram (ECG) data codec based on the set partitioning in hierarchical trees (SPIHT) compression algorithm is proposed in this paper. The SPIHT algorithm (A. Said and W.A. Pearlman, IEEE Trans. Circuits Syst. II, vol. 6, pp. 243-250, 1996) has achieved notable success in still image coding. The authors modified the algorithm for the one-dimensional case and applied it to compression of ECG data. Experiments on selected records from the MIT-BIH arrhythmia database revealed that the proposed codec is significantly more efficient in compression and in computation than previously proposed ECG compression schemes. The coder also attains exact bit rate control and generates a bit stream progressive in quality or rate.

521 citations

Journal ArticleDOI
TL;DR: Pilot data from a blind evaluation of compressed ECGs by cardiologists suggest that the clinically useful information present in original ECG signals is preserved by 8:1 compression, and in most cases 16:1 compressed ECGs are clinically useful.
Abstract: Wavelets and wavelet packets have recently emerged as powerful tools for signal compression. Wavelet and wavelet packet-based compression algorithms based on embedded zerotree wavelet (EZW) coding are developed for electrocardiogram (ECG) signals, and eight different wavelets are evaluated for their ability to compress Holter ECG data. Pilot data from a blind evaluation of compressed ECGs by cardiologists suggest that the clinically useful information present in original ECG signals is preserved by 8:1 compression, and in most cases 16:1 compressed ECGs are clinically useful.

445 citations


"ECG data compression using truncate..." refers methods in this paper

  • ...[23]) provides a better performance than previous wavelet-based methods (Hilton [22] and Djohan et al....


Journal ArticleDOI
TL;DR: A preprocessing program developed for real-time monitoring of the electrocardiogram by digital computer has proved useful for rhythm analysis.
Abstract: A preprocessing program developed for real-time monitoring of the electrocardiogram by digital computer has proved useful for rhythm analysis. The program suppresses low amplitude signals, reduces the data rate by a factor of about 10, and codes the result in a form convenient for analysis.
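The roughly tenfold data-rate reduction described above is the AZTEC idea: replace runs of samples with flat plateaus. A simplified sketch (plateaus only, omitting AZTEC's slope segments) looks like this:

```python
import numpy as np

def zoh_compress(x, eps):
    """Emit (value, run_length) plateaus; a plateau extends for as long as
    all of its samples fit inside a band of height 2*eps."""
    plateaus = []
    lo = hi = x[0]
    start = 0
    for i in range(1, len(x)):
        v = x[i]
        if max(hi, v) - min(lo, v) <= 2 * eps:
            lo, hi = min(lo, v), max(hi, v)  # sample fits: extend plateau
        else:
            plateaus.append(((lo + hi) / 2, i - start))
            lo = hi = v
            start = i
    plateaus.append(((lo + hi) / 2, len(x) - start))
    return plateaus

def zoh_decompress(plateaus):
    """Expand each plateau back into a constant run of samples."""
    return np.concatenate([np.full(n, v) for v, n in plateaus])

x = np.sin(np.linspace(0, 2 * np.pi, 500))
plateaus = zoh_compress(x, eps=0.1)
rec = zoh_decompress(plateaus)
```

Each sample lies within ±eps of its plateau's midpoint, and the plateau count rather than the sample count sets the output rate, which is why low-amplitude, slowly varying stretches compress so well.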

374 citations


"ECG data compression using truncate..." refers methods in this paper

  • ...2) Direct time-domain techniques: including amplitude zone time epoch coding (AZTEC), delta coding, and entropy coding [2]–[4]....
