Journal ArticleDOI

ECG data compression using truncated singular value decomposition

01 Dec 2001-Vol. 5, Iss: 4, pp 290-299
TL;DR: The results showed that the truncated SVD method can provide efficient coding with high compression ratios, demonstrating it to be an effective technique for ECG data storage or signal transmission.
Abstract: The method of truncated singular value decomposition (SVD) is proposed for electrocardiogram (ECG) data compression. The signal-decomposition capability of SVD is exploited to extract the significant feature components of the ECG by decomposing the signal into a set of basic patterns with associated scaling factors. Because of the strong interbeat correlation among ECG cycles, the signal information is concentrated in a small number of singular values and their associated singular vectors. Therefore, only the relevant parts of the singular triplets need to be retained as the compressed data for reconstructing the original signal, and the insignificant remainder can be truncated to eliminate redundancy. The Massachusetts Institute of Technology-Beth Israel Hospital (MIT-BIH) arrhythmia database was used to evaluate the compression performance and the recoverability of the retrieved ECG signals. The method achieved an average data rate of 143.2 b/s with a relatively low reconstruction error. These results show that the truncated SVD method can provide efficient coding with high compression ratios, and its computational efficiency relative to other techniques makes it an effective approach for ECG data storage or signal transmission.
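The scheme the abstract describes can be sketched in a few lines: stack aligned beats as rows of a matrix, keep only the top-k singular triplets, and reconstruct from them. The synthetic beat matrix, noise model, and choice of k below are illustrative assumptions, not values taken from the paper:

```python
import numpy as np

def svd_compress(beats, k):
    """Keep only the top-k singular triplets of a beat matrix (rows = aligned beats)."""
    U, s, Vt = np.linalg.svd(beats, full_matrices=False)
    return U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# Synthetic stand-in for aligned ECG cycles: one QRS-like spike per beat,
# with small per-beat amplitude variation and additive noise.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 200)
template = np.exp(-((t - 0.5) ** 2) / 0.002)
beats = np.stack([template * (1 + 0.05 * rng.standard_normal())
                  + 0.01 * rng.standard_normal(t.size) for _ in range(50)])

k = 3
approx = svd_compress(beats, k)

# PRD (percentage root-mean-square difference): the reconstruction-quality
# measure conventionally used for ECG compression.
prd = 100 * np.linalg.norm(beats - approx) / np.linalg.norm(beats)

# Compression ratio: numbers stored for k triplets vs. the full matrix.
stored = k * (beats.shape[0] + beats.shape[1] + 1)
cr = beats.size / stored
```

Because the beats are strongly correlated, a single dominant singular value captures most of the energy, so k can stay very small relative to the number of beats while keeping the PRD low.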
Citations
Journal ArticleDOI
TL;DR: A novel algorithm based on a stochastic gradient ascent procedure which allows for a reliable on-line PPG signal quality estimation that can be implemented on mobile sensor devices with limited processing power is presented.

5 citations

Journal ArticleDOI
01 Nov 2015
TL;DR: A hybrid ECG compression technique based on DWT and on reducing the correlation between signal samples and beats is presented in this paper and tested on records extracted from the MIT-BIH arrhythmia database.
Abstract: A hybrid ECG compression technique based on DWT and on reducing the correlation between signal samples and beats is presented in this paper. It starts by segmenting the ECG signal into blocks of 1024 samples each. Then, a DPCM approach is adopted to remove the redundancy between successive samples. This yields a residual signal with a QRS-complex-like waveform, without the P-, T-, and U-waves. The first QRS-complex-like wave is then isolated, and each succeeding one is subtracted from its predecessor to remove the redundancy between signal beats. The next step depends on the application: for telediagnosis, the resulting residual signal is wavelet transformed, while for telemonitoring both the first QRS-complex-like wave and the residual signal are wavelet transformed. In both cases the resulting wavelet coefficients are thresholded based on energy packing efficiency and coded using a modified run-length algorithm. The performance of the proposed algorithm has been tested on records extracted from the MIT-BIH arrhythmia database. Simulation results illustrate the excellent quality of the reconstructed signal, with a percentage root-mean-square difference below 1.5% and compression ratios greater than 20.
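The two redundancy-removal stages described above (sample-wise DPCM, then beat-to-beat subtraction) are both invertible transforms, which a minimal sketch can make concrete. The helper names and toy beats below are illustrative, not taken from the paper:

```python
def dpcm(x):
    """Sample-wise DPCM: keep the first sample, then successive differences."""
    return [x[0]] + [x[i] - x[i - 1] for i in range(1, len(x))]

def dpcm_inverse(d):
    out = [d[0]]
    for v in d[1:]:
        out.append(out[-1] + v)
    return out

def beat_residuals(beats):
    """Keep the first beat; replace each later beat by its difference from the previous one."""
    return [beats[0]] + [[a - b for a, b in zip(beats[i], beats[i - 1])]
                         for i in range(1, len(beats))]

def beat_residuals_inverse(res):
    out = [res[0]]
    for r in res[1:]:
        out.append([a + b for a, b in zip(r, out[-1])])
    return out

# Toy "beats": nearly identical cycles, so the beat residuals are close to
# zero and cheap to code afterwards with thresholding + run-length coding.
beats = [[0, 1, 5, 1, 0], [0, 1, 5.1, 1, 0], [0, 1.1, 5, 1, 0]]
residuals = beat_residuals([dpcm(b) for b in beats])
restored = [dpcm_inverse(d) for d in beat_residuals_inverse(residuals)]
```

After both stages only the first beat carries large values; every later residual is near zero, which is exactly what makes the subsequent wavelet thresholding effective.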

5 citations


Cites methods or result from "ECG data compression using truncate..."

  • ...algorithms is available in these publications [16]-[23]....


  • ...Simulation results have clearly shown the effectiveness of the proposed algorithm in getting high compression ratios with acceptable reconstruction signal quality compared to recently published results [16]-[23]....


  • ...Methods belonging to this category are: peak picking, linear prediction, neural networks, long term prediction, vector quantization and singular value decomposition (SVD) methods [13]-[16]....


  • ...Similarly, SVD based techniques have been presented based on the beat correlation properties of ECG signal to enhance compression performance of the algorithm [16]....


  • ...Tables (3) and (4) include the comparative performance study of the proposed method and the earlier published algorithms [16]-[23]....


Journal ArticleDOI
TL;DR: Compression using the dual tree complex wavelet transform (DT-CWT) has been proposed, which results in many wavelet coefficients getting close to zero, and Set Partitioning in Hierarchical Trees (SPIHT) coding is used along with DT-CWT to compress the data.
Abstract: Electrocardiogram (ECG) records the electrical potentials of the heart. ECG reveals a lot of useful information on the normal and abnormal conditions of heart. It is very difficult to analyse ECG signals as they are non-stationary in nature. There is a need to compress the ECG signal in an efficient way to reduce the amount of data that is transmitted, stored, and analysed without losing the significant clinical information. In this paper, compression using dual tree complex wavelet transform (DT-CWT) has been proposed, that results in many wavelet coefficients getting close to zero. To improve the compression ratio, Set Partitioning in Hierarchical Tree (SPIHT) coding is used along with DT-CWT to compress data. The proposed method gives better compression ratios and reduced reconstruction errors compared to stationary wavelet transform (SWT). Experimental results of DT-CWT based SPIHT are shown on many MIT-BIH records which show improved performance by 35.19% over existing methods namely, SWT and...

5 citations

Journal ArticleDOI
25 Oct 2016-Symmetry
TL;DR: A novel method based on the generalized Gabor direction pattern (GGDP) and a weighted discrepancy measurement model (WDMM) is proposed to overcome the defects of traditional texture feature description methods, and it outperforms other existing classical methods.
Abstract: Texture feature description is a remarkable challenge in the fields of computer vision and pattern recognition. Since the traditional texture feature description method, the local binary pattern (LBP), is unable to acquire more detailed direction information and is always sensitive to noise, we propose a novel method based on the generalized Gabor direction pattern (GGDP) and a weighted discrepancy measurement model (WDMM) to overcome those defects. Firstly, a novel patch-structure direction pattern (PDP) is proposed, which can extract rich feature information and is insensitive to noise. Then, motivated by the search for a description method that can explore richer and more discriminant texture features while reducing the high dimensionality of the local Gabor feature vector, we extend PDP to form the GGDP method with a multi-channel Gabor space. Furthermore, WDMM, which can effectively measure the feature distance between two images, is presented for the classification and recognition of image samples. Simulated experiments on the Olivetti Research Laboratory (ORL), Carnegie Mellon University pose, illumination, and expression (CMUPIE), and Yale B face databases under different illumination or facial expression conditions indicate that the proposed method outperforms other existing classical methods.
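For context, the baseline this reference sets out to improve, the basic 3x3 local binary pattern, can be computed as below. This is a minimal sketch; the clockwise sampling order and the >= comparison are one common convention, not necessarily the paper's:

```python
def lbp_3x3(image):
    """Basic local binary pattern: threshold each pixel's 8 neighbours
    against the centre value and pack the results into an 8-bit code."""
    h, w = len(image), len(image[0])
    # Neighbour offsets, clockwise from the top-left corner.
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    codes = [[0] * (w - 2) for _ in range(h - 2)]
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            centre = image[i][j]
            code = 0
            for bit, (di, dj) in enumerate(offsets):
                if image[i + di][j + dj] >= centre:
                    code |= 1 << bit
            codes[i - 1][j - 1] = code
    return codes
```

A flat patch sets every bit (code 255), and any single noisy neighbour flips a bit, which illustrates the noise sensitivity the abstract criticizes; GGDP replaces this binary thresholding with direction information pooled over multi-channel Gabor responses.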

5 citations

Journal ArticleDOI
TL;DR: The effect of inter-subject data variance on emotion recognition, important data annotation techniques for emotion recognition and their comparison, data pre-processing techniques for each physiological signal, data splitting techniques for improving the generalization of emotion recognition models, and different multimodal fusion techniques and their comparison are reviewed.
Abstract: Physiological signals are the most reliable form of signals for emotion recognition, as they cannot be controlled deliberately by the subject. Existing review papers on emotion recognition based on physiological signals surveyed only the regular steps involved in the workflow of emotion recognition such as pre-processing, feature extraction, and classification. While these are important steps, such steps are required for any signal processing application. Emotion recognition poses its own set of challenges that are very important to address for a robust system. Thus, to bridge the gap in the existing literature, in this paper, we review the effect of inter-subject data variance on emotion recognition, important data annotation techniques for emotion recognition and their comparison, data pre-processing techniques for each physiological signal, data splitting techniques for improving the generalization of emotion recognition models and different multimodal fusion techniques and their comparison. Finally, we discuss key challenges and future directions in this field.

4 citations

References
Book
01 Jan 1983

34,729 citations


"ECG data compression using truncate..." refers background in this paper

  • ...Therefore, the SVD of the matrix A can be performed as A = U S V^T [20], where the columns of U and V are the left and right singular vectors, respectively....


Journal ArticleDOI
TL;DR: The theoretical bases behind the direct ECG data compression schemes are presented and classified into three categories: tolerance-comparison compression, DPCM, and entropy coding methods and a framework for evaluation and comparison of ECG compression schemes is presented.
Abstract: Electrocardiogram (ECG) compression techniques are compared, and a unified view of these techniques is established. ECG data compression schemes are presented in two major groups: direct data compression and transformation methods. The direct data compression techniques are ECG differential pulse code modulation (DPCM) and entropy coding, AZTEC, Turning-point, CORTES, Fan and SAPA algorithms, peak-picking, and cycle-to-cycle compression methods. The transformation methods include Fourier, Walsh, and Karhunen-Loeve transforms. The theoretical bases behind the direct ECG data compression schemes are presented and classified into three categories: tolerance-comparison compression, DPCM, and entropy coding methods. A framework for evaluation and comparison of ECG compression schemes is presented.
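The tolerance-comparison category this survey describes can be illustrated with a minimal zero-order-hold compressor in the spirit of AZTEC's plateaus. This is a generic sketch, not any specific published algorithm:

```python
def zoh_compress(samples, eps):
    """Zero-order-hold tolerance comparison: emit (value, run_length) pairs,
    starting a new plateau whenever a sample drifts more than eps from the
    currently held value."""
    pairs = []
    hold, run = samples[0], 1
    for v in samples[1:]:
        if abs(v - hold) <= eps:
            run += 1
        else:
            pairs.append((hold, run))
            hold, run = v, 1
    pairs.append((hold, run))
    return pairs

def zoh_decompress(pairs):
    out = []
    for value, run in pairs:
        out.extend([value] * run)
    return out

# Flat baseline segments collapse into single pairs; the per-sample
# reconstruction error is bounded by eps by construction.
x = [0.0, 0.05, -0.1, 1.0, 1.1, 0.95]
pairs = zoh_compress(x, 0.2)
y = zoh_decompress(pairs)
```

This is why such methods compress isoelectric ECG segments so well but need a separate slope primitive (as AZTEC has) to avoid staircase distortion on QRS complexes.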

690 citations


"ECG data compression using truncate..." refers methods in this paper

  • ...The compression techniques for an ECG have been extensively discussed [1] and can be classified into the following three major categories....


Journal ArticleDOI
TL;DR: A wavelet electrocardiogram (ECG) data codec based on the set partitioning in hierarchical trees (SPIHT) compression algorithm is proposed and is significantly more efficient in compression and in computation than previously proposed ECG compression schemes.
Abstract: A wavelet electrocardiogram (ECG) data codec based on the set partitioning in hierarchical trees (SPIHT) compression algorithm is proposed in this paper. The SPIHT algorithm (A. Said and W.A. Pearlman, IEEE Trans. Ccts. Syst. II, vol. 6, p. 243-50, 1996) has achieved notable success in still image coding. The authors modified the algorithm for the one-dimensional case and applied it to compression of ECG data. Experiments on selected records from the MIT-BIH arrhythmia database revealed that the proposed codec is significantly more efficient in compression and in computation than previously proposed ECG compression schemes. The coder also attains exact bit rate control and generates a bit stream progressive in quality or rate.

521 citations

Journal ArticleDOI
TL;DR: Pilot data from a blind evaluation of compressed ECGs by cardiologists suggest that the clinically useful information present in original ECG signals is preserved by 8:1 compression, and in most cases 16:1 compressed ECGs are clinically useful.
Abstract: Wavelets and wavelet packets have recently emerged as powerful tools for signal compression. Wavelet and wavelet packet-based compression algorithms based on embedded zerotree wavelet (EZW) coding are developed for electrocardiogram (ECG) signals, and eight different wavelets are evaluated for their ability to compress Holter ECG data. Pilot data from a blind evaluation of compressed ECGs by cardiologists suggest that the clinically useful information present in original ECG signals is preserved by 8:1 compression, and in most cases 16:1 compressed ECGs are clinically useful.
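Zerotree coding details aside, the core idea in this reference (wavelet transform, then keep only the largest coefficients) can be sketched with a plain multilevel Haar transform. The 1/8 keep ratio mirrors the 8:1 figure quoted above, but the Haar basis and test signal are simplifying assumptions, not the paper's setup:

```python
import math

def haar_forward(x):
    """Full multilevel Haar DWT of a power-of-two-length signal.
    Returns [final approximation] + detail bands, coarsest first."""
    x = list(x)
    details = []
    while len(x) > 1:
        half = len(x) // 2
        approx = [(x[2 * i] + x[2 * i + 1]) / math.sqrt(2) for i in range(half)]
        detail = [(x[2 * i] - x[2 * i + 1]) / math.sqrt(2) for i in range(half)]
        details = detail + details   # prepend so coarser bands come first
        x = approx
    return x + details

def haar_inverse(coeffs):
    x, pos, size = coeffs[:1], 1, 1
    while pos < len(coeffs):
        detail = coeffs[pos:pos + size]
        nxt = []
        for a, d in zip(x, detail):
            nxt.append((a + d) / math.sqrt(2))
            nxt.append((a - d) / math.sqrt(2))
        x = nxt
        pos += size
        size *= 2
    return x

def compress_8_to_1(signal):
    """Keep only the largest 1/8 of Haar coefficients (by magnitude), zero the rest."""
    coeffs = haar_forward(signal)
    keep = max(1, len(coeffs) // 8)
    cutoff = sorted((abs(c) for c in coeffs), reverse=True)[keep - 1]
    kept = [c if abs(c) >= cutoff else 0.0 for c in coeffs]
    return haar_inverse(kept)
```

Smooth signals concentrate their energy in the coarse coefficients, so discarding 7/8 of them changes the waveform only slightly; EZW improves on plain thresholding by also coding the positions of the surviving coefficients efficiently.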

445 citations


"ECG data compression using truncate..." refers methods in this paper

  • ...[23]) provides a better performance than previous wavelet-based methods (Hilton [22] and Djohan et al....


Journal ArticleDOI
TL;DR: A preprocessing program developed for real-time monitoring of the electrocardiogram by digital computer has proved useful for rhythm analysis.
Abstract: A preprocessing program developed for real-time monitoring of the electrocardiogram by digital computer has proved useful for rhythm analysis. The program suppresses low amplitude signals, reduces the data rate by a factor of about 10, and codes the result in a form convenient for analysis.

374 citations


"ECG data compression using truncate..." refers methods in this paper

  • ...2) Direct time-domain techniques: including amplitude zone time epoch coding (AZTEC), delta coding, and entropy coding [2]–[4]....
