Journal ArticleDOI

ECG data compression using truncated singular value decomposition

01 Dec 2001 - Vol. 5, Iss. 4, pp. 290-299
TL;DR: The results showed that the truncated SVD method can provide efficient coding with high compression ratios, demonstrating it to be an effective technique for ECG data storage or signal transmission.
Abstract: The method of truncated singular value decomposition (SVD) is proposed for electrocardiogram (ECG) data compression. The signal decomposition capability of SVD is exploited to extract the significant feature components of the ECG by decomposing the ECG into a set of basic patterns with associated scaling factors. Because of the strong interbeat correlation among ECG cycles, the signal information is mostly concentrated within a small number of singular values and their related singular vectors. Therefore, only the relevant parts of the singular triplets need to be retained as the compressed data for retrieving the original signals, and the insignificant components can be truncated to eliminate redundancy in the ECG data. The Massachusetts Institute of Technology-Beth Israel Hospital (MIT-BIH) arrhythmia database was used to evaluate the compression performance and the recoverability of the retrieved ECG signals. An average data rate of 143.2 b/s was achieved with a relatively low reconstruction error. These results showed that the truncated SVD method can provide efficient coding with high compression ratios. Its computational efficiency in comparison with other techniques demonstrates the method to be an effective technique for ECG data storage or signal transmission.
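The scheme the abstract describes can be sketched in a few lines of NumPy: stack period-normalized beats into a matrix, keep only the k largest singular triplets, and measure distortion with the percentage root-mean-square difference (PRD). This is an illustrative sketch of the general technique, not the authors' exact implementation; the synthetic beats and the choice k = 3 are assumptions for demonstration.

```python
import numpy as np

def truncated_svd_compress(beats, k):
    """Keep only the k largest singular triplets of a beat matrix
    (one period-normalized heartbeat per row)."""
    U, s, Vt = np.linalg.svd(beats, full_matrices=False)
    return U[:, :k], s[:k], Vt[:k, :]

def truncated_svd_reconstruct(Uk, sk, Vtk):
    """Rebuild an approximation of the beat matrix from the kept triplets."""
    return (Uk * sk) @ Vtk

def prd(original, reconstructed):
    """Percentage root-mean-square difference, a common ECG distortion measure."""
    return 100.0 * np.sqrt(((original - reconstructed) ** 2).sum()
                           / (original ** 2).sum())

# Synthetic, strongly interbeat-correlated "beats" (illustrative only).
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 200)
template = np.exp(-((t - 0.5) ** 2) / 0.002)   # QRS-like bump
beats = np.stack([template * (1 + 0.05 * rng.standard_normal())
                  + 0.01 * rng.standard_normal(t.size)
                  for _ in range(40)])

Uk, sk, Vtk = truncated_svd_compress(beats, k=3)
approx = truncated_svd_reconstruct(Uk, sk, Vtk)
```

Storing U_k, the k singular values, and V_k takes (m + n + 1)·k numbers instead of the m·n original samples, which is where the compression comes from when the beats are strongly correlated.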
Citations
Journal ArticleDOI
TL;DR: In this paper, a machine vision approach is proposed for automatic inspection of micro-defects in patterned TFT-LCD surfaces, which is based on a global image reconstruction scheme using singular value decomposition.
Abstract: Thin film transistor-liquid crystal displays (TFT-LCDs) have become increasingly attractive and popular as display devices. A machine vision approach is proposed for automatic inspection of microdefects in patterned TFT-LCD surfaces. The proposed method is based on a global image reconstruction scheme using singular value decomposition. A partition procedure that separates the input image into non-overlapping sub-images is used to reduce the computation time of singular value decomposition. Taking the pixel image as a matrix, the singular values on the decomposed diagonal matrix represent different structural details of the TFT-LCD image. The proposed method first selects the dominant singular values that represent the repetitive orthogonal-line texture of the TFT-LCD surface. It then reconstructs the matrix by excluding the dominant singular values. The resulting image can effectively remove the background texture and preserve anomalies distinctly. The experiments have evaluated a variety of TFT-LCD mic...
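The reconstruction-by-exclusion idea in this abstract can be illustrated with a toy NumPy example: zero out the dominant singular values of a periodic texture so that mainly the anomaly survives in the residual image. The texture, the defect, and the choice of one dominant singular value are illustrative assumptions, not the paper's actual parameters.

```python
import numpy as np

def remove_dominant_structure(image, n_dominant):
    """Reconstruct the image with its n_dominant largest singular values
    zeroed, suppressing the repetitive background structure."""
    U, s, Vt = np.linalg.svd(image, full_matrices=False)
    s = s.copy()
    s[:n_dominant] = 0.0
    return (U * s) @ Vt

# Toy "TFT-LCD" image: a periodic line texture plus a small bright defect.
x = np.arange(64)
image = np.tile(np.sin(2 * np.pi * x / 8), (64, 1))
image[30:34, 30:34] += 2.0   # the defect

residual = remove_dominant_structure(image, n_dominant=1)
# The residual energy concentrates around the defect; the texture is largely removed.
```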

45 citations


Cites background from "ECG data compression using truncate..."

  • ...…(Konstantinides et al. 1997, Ibrahim et al. 1998, Kamm and Nagy 1998), and image compression and reconstruction (Steriti and Fiddy 1993, Chandrasekaran et al. 1997, Song and Zhang 1999, Cagnoli and Ulrych 2001, Hoge et al. 2001, Popesuc et al. 2001, Selivanov and Lecomte 2001, Wei et al. 2001)....


Journal ArticleDOI
TL;DR: An electrocardiogram (ECG) data compression scheme using gain-shape vector quantization is presented; both the visual and objective quality are excellent even at low bit rates.
Abstract: An electrocardiogram (ECG) data compression scheme is presented using gain-shape vector quantization. The proposed approach exploits the fact that ECG signals generally show redundancy among adjacent heartbeats and adjacent samples. An ECG signal is QRS detected and segmented according to the detected fiducial points. The segmented heartbeats are vector quantized, and the residual signals are calculated and encoded using the AREA algorithm. The experimental results show that with the proposed method both the visual and objective quality are excellent even at low bit rates. An average PRD of 5.97% at 127 b/s is obtained for the entire 48 records in the MIT-BIH database. The proposed method also outperforms others for the same test dataset.
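Gain-shape vector quantization, the core of this scheme, factors each vector into a scalar gain and a unit-norm shape codeword. The following sketch assumes a random unit-norm codebook purely for illustration; a real codec would train the codebook (e.g. with k-means on training beats) and would also encode the residual as the abstract describes.

```python
import numpy as np

def gain_shape_encode(x, codebook):
    """Pick the unit-norm shape codeword best aligned with x (largest
    |inner product|) and use the projection onto it as the gain."""
    scores = codebook @ x
    idx = int(np.argmax(np.abs(scores)))
    return float(scores[idx]), idx

def gain_shape_decode(gain, idx, codebook):
    """Reconstruction is simply gain times the selected shape."""
    return gain * codebook[idx]

# Hypothetical unit-norm shape codebook (illustrative, not trained).
rng = np.random.default_rng(1)
codebook = rng.standard_normal((16, 32))
codebook /= np.linalg.norm(codebook, axis=1, keepdims=True)

x = 3.0 * codebook[5] + 0.05 * rng.standard_normal(32)  # vector near codeword 5
gain, idx = gain_shape_encode(x, codebook)
x_hat = gain_shape_decode(gain, idx, codebook)
```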

45 citations


Cites background or methods from "ECG data compression using truncate..."

  • ...There are several approaches, such as the transformation in [9] and the multirate approach in [20]....


  • ...Although the multirate approach produces no error with sufficiently high interpolation factor [20], it is considered computationally more complex than the transformation in [9]....


  • ...According to the transformation in [9], one heartbeat segment can be converted into a segment that holds the same morphology but has a different data length....


  • ...Therefore, in this paper, the transformation used in [9] is adopted for period normalization....


Journal ArticleDOI
TL;DR: New preprocessing techniques for electrocardiogram signals, namely, DC equalization and complexity sorting, are presented, which when applied can improve current 2-D compression algorithms.
Abstract: In this brief, we present new preprocessing techniques for electrocardiogram signals, namely, DC equalization and complexity sorting, which when applied can improve current 2-D compression algorithms. In experiments with signals from the Massachusetts Institute of Technology - Beth Israel Hospital (MIT-BIH) database, the proposed techniques outperform many state-of-the-art schemes described in the literature.
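As a rough illustration of the kind of preprocessing the brief names (assuming "DC equalization" removes each beat's mean level and "complexity sorting" reorders beats by a simple complexity proxy such as total variation; the authors' exact definitions may differ):

```python
import numpy as np

def dc_equalize(beat_matrix):
    """Subtract each beat's mean so every row sits at the same (zero) DC level."""
    return beat_matrix - beat_matrix.mean(axis=1, keepdims=True)

def complexity_sort(beat_matrix):
    """Reorder beats by total variation (a simple complexity proxy) so that
    rows of similar complexity become adjacent in the 2-D representation."""
    tv = np.abs(np.diff(beat_matrix, axis=1)).sum(axis=1)
    order = np.argsort(tv)
    return beat_matrix[order], order

# Toy beats with differing DC offsets.
rng = np.random.default_rng(2)
beats = rng.standard_normal((8, 50)) + rng.uniform(-1.0, 1.0, size=(8, 1))
eq = dc_equalize(beats)
sorted_beats, order = complexity_sort(eq)
```

Both steps make adjacent rows of the 2-D beat image more alike, which is what image-style compressors exploit.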

44 citations


Cites background from "ECG data compression using truncate..."

  • ...Several authors have successfully used period normalization [1], [3], [4], which tends to lead to significant improvements....


  • ...To correct this behavior and better exploit the interbeat dependencies, Wei et al. in [4] proposed the period normalization, also adopted by [3], which changes the length of all periods to a common value....

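Period normalization as described in these snippets can be realized by resampling every segmented beat to a common length, for example with linear interpolation. This minimal sketch shows one common way to do it, not necessarily the exact transformation of [4]/[9]; the 200-sample target length is an arbitrary choice.

```python
import numpy as np

def normalize_period(beat, target_len):
    """Resample one heartbeat to target_len samples by linear interpolation,
    equalizing its duration while preserving its morphology."""
    src = np.linspace(0.0, 1.0, len(beat))
    dst = np.linspace(0.0, 1.0, target_len)
    return np.interp(dst, src, beat)

# Beats of unequal length mapped onto a common 200-sample grid.
beats = [np.sin(np.linspace(0.0, np.pi, n)) for n in (180, 205, 231)]
normalized = np.stack([normalize_period(b, 200) for b in beats])
```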

Journal ArticleDOI
TL;DR: The experimental results obtained on the MIT-BIH Arrhythmia database showed that for all feature representations adopted in this work, the GP detector trained with only 600 beats from PVC and non-PVC classes can provide an overall accuracy and a sensitivity above 90% on 20 records.
Abstract: In this paper, we propose to investigate the capabilities of two kernel methods for the detection and classification of premature ventricular contraction (PVC) arrhythmias in electrocardiogram (ECG) signals. These kernel methods are the support vector machine and the Gaussian process (GP). We propose to study these two classifiers with various feature representations of ECG signals, such as morphology, discrete wavelet transform, higher-order statistics, and S transform. The experimental results obtained on 48 records (i.e., 109,887 beats) of the MIT-BIH Arrhythmia database showed that for all feature representations adopted in this work, the GP detector trained with only 600 beats from PVC and non-PVC classes can provide an overall accuracy and a sensitivity above 90% on 20 records (i.e., 49,774 beats) and 28 records (i.e., 60,113 beats) seen and unseen, respectively, during the training phase.

44 citations


Cites methods from "ECG data compression using truncate..."

  • ...Then, after extracting the three temporal features of interest, we normalized to the same periodic length the duration of the segmented ECG cycles according to the procedure reported in [25]....


Proceedings ArticleDOI
23 Apr 2013
TL;DR: The capabilities of two domain adaptation methods recently proposed in the machine learning literature are investigated: domain transfer SVM and the importance-weighted kernel logistic regression method.
Abstract: The detection and classification of heart arrhythmias using electrocardiogram (ECG) signals has been an active area of research in the literature. Usually, to assess the effectiveness of a proposed classification method, training and test data are extracted from the same ECG record. However, in real scenarios test data may come from different records. In this case, the classification results may be less accurate due to the statistical shift between these samples. In order to solve this issue, we investigate, in this paper, the capabilities of two domain adaptation methods recently proposed in the machine learning literature. The first is known as domain transfer SVM, whereas the second is the importance-weighted kernel logistic regression method. To assess the effectiveness of both methods, the MIT-BIH arrhythmia database is used in the experiments.

43 citations


Cites methods from "ECG data compression using truncate..."

  • ...Then, after extracting the temporal features of interest, we normalized to the same periodic length the duration of the segmented ECG cycles according to the procedure reported in [23]....


References
Book
01 Jan 1983

34,729 citations


"ECG data compression using truncate..." refers background in this paper

  • ...Therefore, the SVD of the matrix A can be performed as A = UΣV^T [20], where the columns of U and V are the left and right singular vectors, respectively....


Journal ArticleDOI
TL;DR: The theoretical bases behind the direct ECG data compression schemes are presented and classified into three categories: tolerance-comparison compression, DPCM, and entropy coding methods and a framework for evaluation and comparison of ECG compression schemes is presented.
Abstract: Electrocardiogram (ECG) compression techniques are compared, and a unified view of these techniques is established. ECG data compression schemes are presented in two major groups: direct data compression and transformation methods. The direct data compression techniques are ECG differential pulse code modulation (DPCM) and entropy coding, AZTEC, Turning-point, CORTES, Fan and SAPA algorithms, peak-picking, and cycle-to-cycle compression methods. The transformation methods include Fourier, Walsh, and Karhunen-Loeve transforms. The theoretical bases behind the direct ECG data compression schemes are presented and classified into three categories: tolerance-comparison compression, DPCM, and entropy coding methods. A framework for evaluation and comparison of ECG compression schemes is presented.

690 citations


"ECG data compression using truncate..." refers methods in this paper

  • ...The compression techniques for an ECG have been extensively discussed [ 1 ] and can be classified into the following three major categories....


Journal ArticleDOI
TL;DR: A wavelet electrocardiogram (ECG) data codec based on the set partitioning in hierarchical trees (SPIHT) compression algorithm is proposed and is significantly more efficient in compression and in computation than previously proposed ECG compression schemes.
Abstract: A wavelet electrocardiogram (ECG) data codec based on the set partitioning in hierarchical trees (SPIHT) compression algorithm is proposed in this paper. The SPIHT algorithm (A. Said and W.A. Pearlman, IEEE Trans. Ccts. Syst. II, vol. 6, p. 243-50, 1996) has achieved notable success in still image coding. The authors modified the algorithm for the one-dimensional case and applied it to compression of ECG data. Experiments on selected records from the MIT-BIH arrhythmia database revealed that the proposed codec is significantly more efficient in compression and in computation than previously proposed ECG compression schemes. The coder also attains exact bit rate control and generates a bit stream progressive in quality or rate.

521 citations

Journal ArticleDOI
TL;DR: Pilot data from a blind evaluation of compressed ECG's by cardiologists suggest that the clinically useful information present in original ECG signals is preserved by 8:1 compression, and in most cases 16:1 compressed ECGs are clinically useful.
Abstract: Wavelets and wavelet packets have recently emerged as powerful tools for signal compression. Wavelet and wavelet packet-based compression algorithms based on embedded zerotree wavelet (EZW) coding are developed for electrocardiogram (ECG) signals, and eight different wavelets are evaluated for their ability to compress Holter ECG data. Pilot data from a blind evaluation of compressed ECG's by cardiologists suggest that the clinically useful information present in original ECG signals is preserved by 8:1 compression, and in most cases 16:1 compressed ECG's are clinically useful.

445 citations


"ECG data compression using truncate..." refers methods in this paper

  • ...[23]) provides a better performance than previous wavelet-based methods (Hilton [22] and Djohan et al....


Journal ArticleDOI
TL;DR: A preprocessing program developed for real-time monitoring of the electrocardiogram by digital computer has proved useful for rhythm analysis.
Abstract: A preprocessing program developed for real-time monitoring of the electrocardiogram by digital computer has proved useful for rhythm analysis. The program suppresses low amplitude signals, reduces the data rate by a factor of about 10, and codes the result in a form convenient for analysis.

374 citations


"ECG data compression using truncate..." refers methods in this paper

  • ...2) Direct time-domain techniques: including amplitude zone time epoch coding (AZTEC), delta coding, and entropy coding [2]–[4]....
