Journal ArticleDOI

ECG data compression using truncated singular value decomposition

01 Dec 2001 - Vol. 5, Iss. 4, pp. 290-299
TL;DR: The results showed that the truncated SVD method can provide efficient coding with high compression ratios, and its computational efficiency demonstrates it as an effective technique for ECG data storage or signal transmission.
Abstract: The method of truncated singular value decomposition (SVD) is proposed for electrocardiogram (ECG) data compression. The signal decomposition capability of SVD is exploited to extract the significant feature components of the ECG by decomposing the signal into a set of basic patterns with associated scaling factors. Because of the strong interbeat correlation among ECG cycles, the signal information is mostly concentrated in a small number of singular values and their related singular vectors. Therefore, only the relevant parts of the singular triplets need to be retained as the compressed data for retrieving the original signals; the insignificant remainder can be truncated to eliminate the redundancy in the ECG data. The Massachusetts Institute of Technology-Beth Israel Hospital (MIT-BIH) arrhythmia database was used to evaluate the compression performance and the recoverability of the retrieved ECG signals. The method achieved an average data rate of 143.2 b/s with a relatively low reconstruction error. These results show that the truncated SVD method can provide efficient coding with high compression ratios. Its computational efficiency in comparison with other techniques demonstrates that it is an effective technique for ECG data storage or signal transmission.
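The decompose-then-truncate step described in the abstract can be sketched in a few lines (a minimal NumPy illustration with a synthetic beat matrix standing in for segmented, period-normalized ECG cycles; all names and parameter values here are ours, not the paper's):

```python
import numpy as np

# Hypothetical beat matrix: rows are aligned ECG cycles (assumed
# already segmented and period-normalized, as the paper describes).
rng = np.random.default_rng(0)
template = np.sin(np.linspace(0, 2 * np.pi, 200))          # shared beat shape
beats = template + 0.05 * rng.standard_normal((30, 200))   # 30 correlated cycles

# Truncated SVD: keep only the k largest singular triplets.
U, s, Vt = np.linalg.svd(beats, full_matrices=False)
k = 3
approx = U[:, :k] * s[:k] @ Vt[:k, :]

# Percent root-mean-square difference (PRD) of the reconstruction.
prd = 100 * np.linalg.norm(beats - approx) / np.linalg.norm(beats)

# Compression ratio: original samples vs. stored triplet entries
# (one left vector, one right vector, one singular value per triplet).
cr = beats.size / (k * (U.shape[0] + Vt.shape[1] + 1))
```

Because the interbeat correlation concentrates the energy in the leading triplets, a small k already yields a low PRD while the stored data shrink substantially.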
Citations
Journal ArticleDOI
TL;DR: A novel dictionary learning algorithm is proposed that employs k-medoids clustering optimized by k-means++ and builds dictionaries by searching for and using representative samples; this avoids the interference of dirty data and thus boosts the classification performance of ECG systems based on vector quantization features.
Abstract: We improve the dictionary learning algorithm for vector quantization of ECG. The algorithm is employed to extract features from ECG, can avoid interference from dirty data, and is capable of increasing classification accuracy; an initial cluster-center selection method is utilized to speed up the algorithm. Vector quantization (VQ) can perform efficient feature extraction from the electrocardiogram (ECG), with the advantages of dimensionality reduction and increased accuracy. However, the existing dictionary learning algorithms for vector quantization are sensitive to dirty data, which compromises classification accuracy. To tackle this problem, we propose a novel dictionary learning algorithm that employs k-medoids clustering optimized by k-means++ and builds dictionaries by searching for and using representative samples, which avoids the interference of dirty data and thus boosts the classification performance of ECG systems based on vector quantization features. We apply our algorithm to vector quantization feature extraction for ECG beat classification, and compare it with popular features such as the sampling-point feature, the fast Fourier transform feature, and the discrete wavelet transform feature, as well as with our previous beat vector quantization feature. The results show that the proposed method yields the highest accuracy and is capable of reducing the computational complexity of an ECG beat classification system. The proposed dictionary learning algorithm provides more efficient encoding of ECG beats, and can improve ECG classification systems based on encoded features.
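The key idea, restricting codewords to actual samples (medoids) so that isolated dirty beats cannot drag a centroid off the data, can be sketched as follows (an illustrative k-medoids with k-means++-style seeding, not the authors' exact algorithm; the toy data stand in for ECG feature vectors):

```python
import numpy as np

rng = np.random.default_rng(1)
# Toy "beats": three clean clusters plus a couple of corrupted samples.
clean = np.repeat(np.eye(3), 20, axis=0) + 0.1 * rng.standard_normal((60, 3))
dirty = rng.uniform(-5, 5, (2, 3))
X = np.vstack([clean, dirty])

def kmedoids(X, k, iters=20, rng=rng):
    # Pairwise squared distances between all samples.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    # k-means++-style seeding: prefer points far from current seeds.
    seeds = [int(rng.integers(len(X)))]
    for _ in range(k - 1):
        p = d2[:, seeds].min(axis=1)
        seeds.append(int(rng.choice(len(X), p=p / p.sum())))
    medoids = np.array(seeds)
    for _ in range(iters):
        labels = d2[:, medoids].argmin(axis=1)
        # Move each medoid to the member minimizing within-cluster cost;
        # unlike a k-means centroid, it must remain a real sample.
        medoids = np.array([
            np.where(labels == j)[0][
                d2[np.ix_(labels == j, labels == j)].sum(axis=0).argmin()]
            for j in range(k)
        ])
    return medoids, labels

medoids, labels = kmedoids(X, k=3)
codebook = X[medoids]   # dictionary of representative beats
codes = labels          # VQ feature: index of nearest codeword per beat
```

Each beat is then represented by its codeword index, giving the dimensionality reduction the abstract refers to.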

39 citations

Journal ArticleDOI
01 Mar 2011
TL;DR: A medical-grade WLAN architecture for remote ECG monitoring is proposed, which employs the point-coordination function (PCF) for medium access control and Reed-Solomon coding for error control; the reliability of ECG transmission exceeds 99.99% with an initial buffering delay of only 2.4 s.
Abstract: In telecardiology, electrocardiogram (ECG) signals from a patient are acquired by sensors and transmitted in real time to medical personnel across a wireless network. The use of IEEE 802.11 wireless LANs (WLANs), which are already deployed in many hospitals, can provide ubiquitous connectivity and thus allow cardiology patients greater mobility. However, engineering issues, including the error-prone nature of wireless channels and the unpredictable delay and jitter due to the nondeterministic nature of access to the wireless medium, need to be addressed before telecardiology can be safely realized. We propose a medical-grade WLAN architecture for remote ECG monitoring, which employs the point-coordination function (PCF) for medium access control and Reed-Solomon coding for error control. Realistic simulations with uncompressed two-lead ECG data from the MIT-BIH arrhythmia database demonstrate reliable wireless ECG monitoring; the reliability of ECG transmission exceeds 99.99% with the initial buffering delay of only 2.4 s.
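The reliability gain from the Reed-Solomon layer can be illustrated with a back-of-the-envelope erasure model (our simplifying assumption, not the paper's simulation: an RS(n, k) block is recoverable whenever at most n − k of its n packets are lost, with independent packet-loss probability p):

```python
from math import comb

def block_success(n, k, p):
    # Probability that at most n - k erasures occur among n packets,
    # i.e. the RS(n, k) block can be reconstructed.
    return sum(comb(n, i) * p**i * (1 - p)**(n - i)
               for i in range(n - k + 1))

p = 0.05                                # assumed 5% packet-loss rate
plain = block_success(1, 1, p)          # uncoded: a lost packet is lost data
coded = block_success(15, 11, p)        # RS(15, 11): tolerates 4 lost packets
```

Even at a 5% loss rate, the coded block survives with probability above 99.9%, which is the kind of margin a medical-grade link needs; the cost is the redundancy (15/11 overhead) and the buffering delay.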

38 citations

Journal ArticleDOI
TL;DR: A new approach that aims to assist the human user (cardiologist) in the labeling task by automatically removing training samples with potential mislabeling problems, based on a genetic optimization process.

36 citations

Journal ArticleDOI
TL;DR: This paper focuses on providing a comparison of the major techniques of ECG data compression, which are intended to attain losslessly compressed data with a relatively high compression ratio (CR) and low percent root-mean-square difference (PRD).
Abstract: Electrocardiogram (ECG) data compression reduces storage requirements and enables a more efficient telecardiology system for cardiac analysis and diagnosis. ECG compression without loss of diagnostic information is based on the fact that consecutive samples of the digitized ECG carry redundant information that can be removed with very little computing effort. This paper focuses on providing a comparison of the major techniques (direct, transform, parameter extraction, and 2D approaches) of ECG data compression, which are intended to attain losslessly compressed data with a relatively high compression ratio (CR) and low percent root-mean-square difference (PRD). The paper concludes with the presentation of a framework for the evaluation and comparison of ECG compression schemes. Keywords: ECG; Compression; CR; PRD; PRDN; QS
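The figures of merit this survey uses, CR, PRD, and the normalized PRDN, can be made concrete with a short sketch (function and variable names are ours):

```python
import numpy as np

def compression_ratio(original_bits, compressed_bits):
    # CR: how many original bits each compressed bit represents.
    return original_bits / compressed_bits

def prd(x, x_rec):
    # Percent root-mean-square difference between original and
    # reconstructed signals.
    return 100 * np.sqrt(np.sum((x - x_rec) ** 2) / np.sum(x ** 2))

def prdn(x, x_rec):
    # Normalized PRD: the mean is removed from the denominator, so the
    # figure does not shrink just because the recording has a large
    # baseline offset.
    return 100 * np.sqrt(np.sum((x - x_rec) ** 2)
                         / np.sum((x - x.mean()) ** 2))

x = np.sin(np.linspace(0, 4 * np.pi, 500)) + 1.0   # offset ECG-like signal
x_rec = x + 0.01                                   # tiny reconstruction error
cr = compression_ratio(500 * 11, 500 * 11 / 8)     # e.g. 11-bit samples, 8:1
```

For a signal with a nonzero baseline, PRDN is larger (and more honest) than PRD, which is why surveys report both.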

34 citations


Cites background or methods from "ECG data compression using truncate..."

  • ...The “cut and align beats approach and 2D DCT” and “period normalization and truncated SVD algorithm” are available preprocessing techniques for obtaining good compression results in ECG [51, 52]....


  • ...Many researchers have proposed ECG compression techniques by treating the 1D ECG signal as a 2D image and exploiting the inter- and intra-beat correlations in the encoder [37, 43, 50, 51]....


Journal ArticleDOI
TL;DR: A thorough experimental study is presented showing the superior generalization capability of the Extreme Learning Machine (ELM) compared with the support vector machine (SVM) approach in the automatic classification of ECG beats.
Abstract: An electrocardiogram, or ECG, is an electrical recording of the heart and is used in the investigation of heart disease. ECG signals can be classified as normal or abnormal. The classification of ECG signals is presently performed with the support vector machine (SVM), but the generalization performance of the SVM classifier is not sufficient for the correct classification of ECG signals. To overcome this problem, the ELM classifier is used, which works by searching for the best values of the parameters that tune its discriminant function, and upstream by looking for the best subset of features to feed the classifier. The experiments were conducted on ECG data from the PhysioNet arrhythmia database to classify five kinds of abnormal waveforms and normal beats. This paper presents a thorough experimental study showing the superior generalization capability of the Extreme Learning Machine (ELM) compared with the support vector machine (SVM) approach in the automatic classification of ECG beats. In particular, the sensitivity of the ELM classifier is tested and compared with that of the SVM and two other classifiers, the k-nearest neighbor classifier (kNN) and the radial basis function neural network classifier (RBF), with respect to the curse of dimensionality and the number of available training beats. The obtained results clearly confirm the superiority of the ELM approach over traditional classifiers.
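The source of the ELM's speed, a random hidden layer whose output weights are solved in closed form by least squares, can be sketched as follows (a toy illustration on synthetic two-class data, not the paper's experimental setup):

```python
import numpy as np

rng = np.random.default_rng(0)

def elm_fit(X, Y, hidden=50, rng=rng):
    W = rng.standard_normal((X.shape[1], hidden))   # random input weights
    b = rng.standard_normal(hidden)                 # random biases
    H = np.tanh(X @ W + b)                          # hidden-layer activations
    # Output weights via least squares -- the only "training" step.
    beta, *_ = np.linalg.lstsq(H, Y, rcond=None)
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Toy "beats": two Gaussian blobs standing in for normal and abnormal
# ECG feature vectors.
X = np.vstack([rng.standard_normal((50, 4)) - 2,
               rng.standard_normal((50, 4)) + 2])
Y = np.array([[1, 0]] * 50 + [[0, 1]] * 50)         # one-hot labels
W, b, beta = elm_fit(X, Y)
acc = (elm_predict(X, W, b, beta).argmax(1) == Y.argmax(1)).mean()
```

Because only the output layer is fit, training reduces to one linear solve, in contrast to the iterative optimization an SVM or neural network requires.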

31 citations

References
Book
01 Jan 1983

34,729 citations


"ECG data compression using truncate..." refers background in this paper

  • ...Therefore, the SVD of the matrix can be performed as A = UΣVᵀ [20], where the columns of U and V are the left and right singular vectors, respectively....


Journal ArticleDOI
TL;DR: The theoretical bases behind the direct ECG data compression schemes are presented and classified into three categories: tolerance-comparison compression, DPCM, and entropy coding methods and a framework for evaluation and comparison of ECG compression schemes is presented.
Abstract: Electrocardiogram (ECG) compression techniques are compared, and a unified view of these techniques is established. ECG data compression schemes are presented in two major groups: direct data compression and transformation methods. The direct data compression techniques are ECG differential pulse code modulation (DPCM) and entropy coding, AZTEC, Turning-point, CORTES, Fan and SAPA algorithms, peak-picking, and cycle-to-cycle compression methods. The transformation methods include Fourier, Walsh, and Karhunen-Loeve transforms. The theoretical bases behind the direct ECG data compression schemes are presented and classified into three categories: tolerance-comparison compression, DPCM, and entropy coding methods. A framework for evaluation and comparison of ECG compression schemes is presented.

690 citations


"ECG data compression using truncate..." refers methods in this paper

  • ...The compression techniques for an ECG have been extensively discussed [1] and can be classified into the following three major categories....


Journal ArticleDOI
TL;DR: A wavelet electrocardiogram (ECG) data codec based on the set partitioning in hierarchical trees (SPIHT) compression algorithm is proposed and is significantly more efficient in compression and in computation than previously proposed ECG compression schemes.
Abstract: A wavelet electrocardiogram (ECG) data codec based on the set partitioning in hierarchical trees (SPIHT) compression algorithm is proposed in this paper. The SPIHT algorithm (A. Said and W.A. Pearlman, IEEE Trans. Ccts. Syst. II, vol. 6, p. 243-50, 1996) has achieved notable success in still image coding. The authors modified the algorithm for the one-dimensional case and applied it to compression of ECG data. Experiments on selected records from the MIT-BIH arrhythmia database revealed that the proposed codec is significantly more efficient in compression and in computation than previously proposed ECG compression schemes. The coder also attains exact bit rate control and generates a bit stream progressive in quality or rate.

521 citations

Journal ArticleDOI
TL;DR: Pilot data from a blind evaluation of compressed ECGs by cardiologists suggest that the clinically useful information present in original ECG signals is preserved by 8:1 compression, and in most cases 16:1 compressed ECGs are clinically useful.
Abstract: Wavelets and wavelet packets have recently emerged as powerful tools for signal compression. Wavelet and wavelet packet-based compression algorithms based on embedded zerotree wavelet (EZW) coding are developed for electrocardiogram (ECG) signals, and eight different wavelets are evaluated for their ability to compress Holter ECG data. Pilot data from a blind evaluation of compressed ECGs by cardiologists suggest that the clinically useful information present in original ECG signals is preserved by 8:1 compression, and in most cases 16:1 compressed ECGs are clinically useful.

445 citations


"ECG data compression using truncate..." refers methods in this paper

  • ...[23]) provides a better performance than previous wavelet-based methods (Hilton [22] and Djohan et al....


Journal ArticleDOI
TL;DR: A preprocessing program developed for real-time monitoring of the electrocardiogram by digital computer has proved useful for rhythm analysis.
Abstract: A preprocessing program developed for real-time monitoring of the electrocardiogram by digital computer has proved useful for rhythm analysis. The program suppresses low amplitude signals, reduces the data rate by a factor of about 10, and codes the result in a form convenient for analysis.

374 citations


"ECG data compression using truncate..." refers methods in this paper

  • ...2) Direct time-domain techniques: including amplitude zone time epoch coding (AZTEC), delta coding, and entropy coding [2]–[4]....

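Delta coding, the simplest of the direct time-domain techniques named in the excerpt above, can be sketched as follows (an illustrative sketch, not the cited algorithm): store the first sample plus successive differences, which stay small over slowly varying ECG segments and are therefore cheap to entropy-code.

```python
import numpy as np

def delta_encode(x):
    # First sample plus successive differences.
    x = np.asarray(x, dtype=np.int64)
    return x[0], np.diff(x)

def delta_decode(first, deltas):
    # Cumulative sum restores the original samples exactly (lossless).
    return np.concatenate([[first], first + np.cumsum(deltas)])

samples = np.array([512, 514, 515, 513, 510, 511])   # toy 10-bit ECG samples
first, deltas = delta_encode(samples)
restored = delta_decode(first, deltas)
```

Here the differences fit in far fewer bits than the 10-bit samples, which is where the rate reduction comes from; an entropy coder on the deltas completes the scheme.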