Journal ArticleDOI

ECG data compression using truncated singular value decomposition

01 Dec 2001 - Vol. 5, Iss. 4, pp. 290-299
TL;DR: The results showed that the truncated SVD method can provide efficient coding with high compression ratios, demonstrating it as an effective technique for ECG data storage or signal transmission.
Abstract: The method of truncated singular value decomposition (SVD) is proposed for electrocardiogram (ECG) data compression. The signal decomposition capability of SVD is exploited to extract the significant feature components of the ECG by decomposing the ECG into a set of basic patterns with associated scaling factors. Because of the strong interbeat correlation among ECG cycles, the signal information is mostly concentrated within a certain number of singular values and their related singular vectors. Therefore, only the relevant parts of the singular triplets need to be retained as the compressed data for retrieving the original signals; the insignificant components can be truncated to eliminate the redundancy in the ECG data. The Massachusetts Institute of Technology-Beth Israel Hospital (MIT-BIH) arrhythmia database was used to evaluate the compression performance and the recoverability of the retrieved ECG signals. An average data rate of 143.2 b/s was achieved with a relatively low reconstruction error. These results showed that the truncated SVD method can provide efficient coding with high compression ratios, and its computational efficiency compared with other techniques demonstrates it as an effective technique for ECG data storage or signal transmission.
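For a concrete picture of the technique, the sketch below shows the beat-matrix form of truncated-SVD compression in Python. It is a minimal illustration, not the paper's implementation: the beat segmentation, period normalization, and coding stages are omitted, and the synthetic beats, the rank k, and the function names are assumptions made for the example.

```python
# Minimal sketch (assumed helpers and toy data): truncated-SVD compression of a
# beat-aligned ECG matrix, keeping only the leading singular triplets.
import numpy as np

def compress_ecg(beat_matrix, k):
    """beat_matrix: (n_beats, beat_length) array of aligned ECG cycles.
    Returns the k leading singular triplets as the compressed representation."""
    U, s, Vt = np.linalg.svd(beat_matrix, full_matrices=False)
    return U[:, :k], s[:k], Vt[:k, :]

def reconstruct_ecg(U_k, s_k, Vt_k):
    """Rebuild the rank-k approximation of the beat matrix."""
    return (U_k * s_k) @ Vt_k

# Toy usage: 64 strongly correlated synthetic "beats" of 200 samples each.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 200)
template = np.exp(-((t - 0.5) ** 2) / 0.002)               # crude QRS-like pulse
beats = template[None, :] * (1 + 0.05 * rng.standard_normal((64, 1)))
beats += 0.01 * rng.standard_normal(beats.shape)           # measurement noise

U_k, s_k, Vt_k = compress_ecg(beats, k=2)
approx = reconstruct_ecg(U_k, s_k, Vt_k)
prd = 100 * np.linalg.norm(beats - approx) / np.linalg.norm(beats)
print(f"PRD of the rank-2 reconstruction: {prd:.2f}%")
```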
Citations
Journal ArticleDOI
01 Sep 2008
TL;DR: A thorough experimental study shows the superiority of the generalization capability of the support vector machine (SVM) approach in the automatic classification of electrocardiogram (ECG) beats, and suggests that further substantial improvements in classification accuracy can be achieved by the proposed PSO-SVM classification system.
Abstract: The aim of this paper is twofold. First, we present a thorough experimental study to show the superiority of the generalization capability of the support vector machine (SVM) approach in the automatic classification of electrocardiogram (ECG) beats. Second, we propose a novel classification system based on particle swarm optimization (PSO) to improve the generalization performance of the SVM classifier. For this purpose, we have optimized the SVM classifier design by searching for the best values of the parameters that tune its discriminant function, and upstream by looking for the best subset of features that feed the classifier. The experiments were conducted on the basis of ECG data from the Massachusetts Institute of Technology-Beth Israel Hospital (MIT-BIH) arrhythmia database to classify five kinds of abnormal waveforms and normal beats. In particular, they were organized so as to test the sensitivity of the SVM classifier and that of two reference classifiers used for comparison, i.e., the k-nearest neighbor (kNN) classifier and the radial basis function (RBF) neural network classifier, with respect to the curse of dimensionality and the number of available training beats. The obtained results clearly confirm the superiority of the SVM approach as compared to traditional classifiers, and suggest that further substantial improvements in terms of classification accuracy can be achieved by the proposed PSO-SVM classification system. On average, over three experiments making use of a different total number of training beats (250, 500, and 750, respectively), the PSO-SVM yielded an overall accuracy of 89.72% on 40,438 test beats selected from 20 patient records, against 85.98%, 83.70%, and 82.34% for the SVM, the kNN, and the RBF classifiers, respectively.
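The sketch below illustrates the general PSO-SVM idea on synthetic data: a small particle swarm searches log-scaled values of the SVM parameters C and gamma, using cross-validated accuracy as fitness. It is an assumption-laden simplification of the paper's system, which additionally selects feature subsets and works on ECG beats; the swarm size, inertia and acceleration constants, and search ranges below are illustrative.

```python
# Assumed, simplified sketch: a small particle swarm tunes SVM hyperparameters
# (log2 C, log2 gamma) on synthetic data, with cross-validated accuracy as fitness.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = make_classification(n_samples=500, n_features=20, n_informative=8,
                           random_state=0)

def fitness(particle):
    """Mean 3-fold accuracy for the (log2 C, log2 gamma) encoded in `particle`."""
    C, gamma = 2.0 ** particle[0], 2.0 ** particle[1]
    return cross_val_score(SVC(C=C, gamma=gamma), X, y, cv=3).mean()

rng = np.random.default_rng(1)
n_particles, n_iterations = 10, 15
pos = rng.uniform(-5, 5, size=(n_particles, 2))            # positions in log2 space
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = np.array([fitness(p) for p in pos])
gbest = pbest[pbest_val.argmax()].copy()

for _ in range(n_iterations):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, -5, 5)
    vals = np.array([fitness(p) for p in pos])
    improved = vals > pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmax()].copy()

print("best (log2 C, log2 gamma):", gbest, "accuracy:", round(pbest_val.max(), 4))
```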

480 citations


Cites methods from "ECG data compression using truncate..."

  • ...In this paper, we shall explore the simple SV count as a fitness criterion in the PSO optimization framework because of its simplicity and effectiveness, as shown in the classification of hyperspectral remote sensing images [16]....


Journal ArticleDOI
TL;DR: By compressing the size of the dictionary in the time domain, this work is able to speed up the pattern recognition algorithm by a factor of 3.4 to 4.8, without sacrificing the high signal-to-noise ratio of the original scheme presented previously.
Abstract: Magnetic resonance (MR) fingerprinting is a technique for acquiring and processing MR data that simultaneously provides quantitative maps of different tissue parameters through a pattern recognition algorithm. A predefined dictionary models the possible signal evolutions simulated using the Bloch equations with different combinations of various MR parameters, and pattern recognition is completed by computing the inner product between the observed signal and each of the predicted signals within the dictionary. Though this matching algorithm has been shown to accurately predict the MR parameters of interest, one desires a more efficient method to obtain the quantitative images. We propose to compress the dictionary using the singular value decomposition, which will provide a low-rank approximation. By compressing the size of the dictionary in the time domain, we are able to speed up the pattern recognition algorithm by a factor of 3.4 to 4.8, without sacrificing the high signal-to-noise ratio of the original scheme presented previously.
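A minimal sketch of the dictionary-compression idea follows: the dictionary is projected onto its leading right singular vectors, and matching is done by inner products in that low-rank space. The dictionary here is a synthetic stand-in (smooth exponential curves rather than Bloch simulations), and the entry count, time-point count, and rank are illustrative assumptions.

```python
# Assumed sketch: SVD-compressed dictionary matching. Fingerprints are stood in
# for by smooth exponential curves; real MR fingerprinting uses Bloch simulations.
import numpy as np

rng = np.random.default_rng(0)
n_entries, n_timepoints, rank = 2000, 500, 25

# Hypothetical two-parameter dictionary of smooth signal evolutions.
t = np.linspace(0, 1, n_timepoints)
p1 = rng.uniform(0.1, 2.0, n_entries)[:, None]
p2 = rng.uniform(0.02, 0.3, n_entries)[:, None]
dictionary = (1 - np.exp(-t / p1)) * np.exp(-t / p2)
dictionary /= np.linalg.norm(dictionary, axis=1, keepdims=True)

# Compress the time dimension with a truncated SVD of the dictionary.
U, s, Vt = np.linalg.svd(dictionary, full_matrices=False)
Vk = Vt[:rank, :].T                          # (n_timepoints, rank) projection basis
dict_compressed = dictionary @ Vk            # (n_entries, rank)

# Pattern recognition by inner products in the compressed space.
observed = dictionary[123] + 0.01 * rng.standard_normal(n_timepoints)
obs_compressed = observed @ Vk
best_match = int(np.argmax(dict_compressed @ obs_compressed))
print("matched entry:", best_match)          # typically 123 or a near-identical entry
```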

253 citations


Cites background or methods from "ECG data compression using truncate..."

  • ...Data compression using the SVD has been extensively studied, for example in the compression of ECG signals [6] and for images [7], [8]....


  • ...and the energy ratio [6] represents the fraction of the energy retained in the rank- approximation...


Journal ArticleDOI
TL;DR: Because the proposed real-time data compression and transmission algorithm can compress and transmit data in real time, it can serve as an optimal biosignal data transmission method for limited-bandwidth communication between e-health devices.
Abstract: This paper introduces a real-time data compression and transmission algorithm between e-health terminals for a periodic ECG signal. The proposed algorithm consists of five compression procedures and four reconstruction procedures. In order to evaluate the performance of the proposed algorithm, the algorithm was applied to all 48 recordings of the MIT-BIH arrhythmia database, and the compression ratio (CR), percent root mean square difference (PRD), percent root mean square difference normalized (PRDN), RMS, SNR, and quality score (QS) values were obtained. The result showed that the CR was 27.9:1 and the PRD was 2.93 on average for all 48 data instances with a 15% window size. In addition, the performance of the algorithm was compared to those of similar algorithms introduced recently by others. It was found that the proposed algorithm showed clearly superior performance in all 48 data instances at a compression ratio lower than 15:1, whereas it showed similar or slightly inferior PRD performance for a data compression ratio higher than 20:1. In light of the fact that the similarity with the original data becomes meaningless when the PRD is higher than 2, the proposed algorithm shows significantly better performance compared to the performance levels of other algorithms. Moreover, because the algorithm can compress and transmit data in real time, it can serve as an optimal biosignal data transmission method for limited-bandwidth communication between e-health devices.
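Since the abstract evaluates performance with CR, PRD, PRDN, RMS, SNR, and QS, the snippet below collects these figures of merit as they are commonly defined in the ECG compression literature. Exact conventions (for example, how the baseline is handled in PRDN) vary between papers, so treat these as representative definitions rather than the authors' exact formulas.

```python
# Representative definitions of the quoted figures of merit (conventions vary).
import numpy as np

def compression_ratio(original_bits, compressed_bits):
    """CR: size of the original bit stream over the size of the compressed one."""
    return original_bits / compressed_bits

def prd(x, x_rec):
    """Percent root-mean-square difference between original and reconstruction."""
    return 100.0 * np.sqrt(np.sum((x - x_rec) ** 2) / np.sum(x ** 2))

def prdn(x, x_rec):
    """PRD normalized by removing the mean of the original signal."""
    return 100.0 * np.sqrt(np.sum((x - x_rec) ** 2) / np.sum((x - x.mean()) ** 2))

def rms_error(x, x_rec):
    """Root-mean-square reconstruction error."""
    return float(np.sqrt(np.mean((x - x_rec) ** 2)))

def snr_db(x, x_rec):
    """Signal-to-noise ratio of the reconstruction, in dB."""
    return 10.0 * np.log10(np.sum((x - x.mean()) ** 2) / np.sum((x - x_rec) ** 2))

def quality_score(cr, prd_value):
    """QS: compression ratio divided by PRD, as commonly defined."""
    return cr / prd_value
```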

173 citations

Journal ArticleDOI
TL;DR: A novel approach for generating the wavelet that best represents the ECG beats in terms of discrimination capability is proposed, which makes use of the polyphase representation of the wavelet filter bank and formulates the design problem within a particle swarm optimization (PSO) framework.

161 citations

Journal ArticleDOI
TL;DR: In this article, a global image reconstruction scheme using the singular value decomposition (SVD) is proposed to eliminate the periodic, repetitive patterns of the textured image while preserving the anomalies in the restored image.
Abstract: Thin film transistor liquid crystal displays (TFT-LCDs) have become increasingly popular and dominant as display devices. Surface defects on TFT panels not only cause visual failure, but also result in electrical failure and loss of LCD operational functionality. In this paper, we propose a global approach for automatic visual inspection of micro defects on TFT panel surfaces. Since the geometrical structure of a TFT panel surface involves repetitive horizontal and vertical elements, it can be classified as a structural texture in the image. The proposed method does not rely on local features of textures. It is based on a global image reconstruction scheme using the singular value decomposition (SVD). Taking the image as a matrix of pixels, the singular values on the decomposed diagonal matrix represent different degrees of detail in the textured image. By selecting the proper singular values that represent the background texture of the surface and reconstructing the matrix without the selected singular values, we can eliminate the periodic, repetitive patterns of the textured image and preserve the anomalies in the restored image. In the experiments, we have evaluated a variety of micro defects, including pinholes, scratches, particles and fingerprints, on TFT panel surfaces, and the result reveals that the proposed method is effective for LCD defect inspection.
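The sketch below illustrates the core reconstruction idea: zero the dominant singular values that model the repetitive background texture and rebuild the image, so that anomalies remain in the residual. The toy grid pattern, the simulated defect, and the choice of three background components are assumptions for illustration only.

```python
# Assumed sketch: zero the dominant singular values that model the periodic
# background texture, then reconstruct; anomalies remain in the residual image.
import numpy as np

def remove_background(image, n_background=3):
    """image: 2-D float array. Returns the reconstruction with the n_background
    largest singular values removed (i.e., the defect-preserving residual)."""
    U, s, Vt = np.linalg.svd(image, full_matrices=False)
    s_kept = s.copy()
    s_kept[:n_background] = 0.0
    return (U * s_kept) @ Vt

# Toy usage: a repetitive grid texture with one small simulated micro defect.
x = np.linspace(0, 8 * np.pi, 256)
texture = np.outer(np.sin(x), np.sin(x)) + 1.0
texture[100:105, 180:185] += 0.8
residual = remove_background(texture, n_background=3)
print("mean |residual| in defect region:", np.abs(residual[100:105, 180:185]).mean())
print("mean |residual| elsewhere       :", np.abs(residual).mean())
```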

144 citations


Cites background from "ECG data compression using truncate..."

  • ...[10,11,16], and image compression and reconstruction [1,2,9,15,19,20,22,27,28]....


References
Journal ArticleDOI
TL;DR: It is demonstrated that for the low sample rate and coarse quantization required for ambulatory recording, without sufficient temporal resolution in beat location, beat subtraction does not significantly improve compression, and may even worsen compression performance.
Abstract: A strategy is evaluated for compression of ambulatory electrocardiograms (ECGs) that uses average beat subtraction and Huffman coding of the differenced residual signal. A sample rate of 100 sps and a quantization level of 35 μV are selected to minimize the mean-square-error distortion while maintaining a data rate that allows 24 h of two-channel ECG data to be stored in less than 4 MB of memory. With this method, sample rate, and quantization level, the ambulatory ECG is compressed and stored in real time with an average data rate of 174 b/s per channel. It is demonstrated that, for the low sample rate and coarse quantization required for ambulatory recording, without sufficient temporal resolution in beat location, beat subtraction does not significantly improve compression, and may even worsen compression performance. It is also shown that with average beat subtraction, compression is improved if multiple beat averages are maintained. Improvement is most significant for ECG signals that exhibit frequent ectopic beats.
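A rough sketch of the average-beat-subtraction idea is given below: quantized beats are aligned as rows, the average beat is subtracted, and the entropy of the residual is estimated as a proxy for the Huffman-coded bit rate. The synthetic beats, the single (rather than multiple) beat average, and the omission of the actual Huffman coder and beat-location step are simplifications for illustration.

```python
# Assumed sketch: average beat subtraction on quantized, aligned beats, with the
# empirical entropy of the residual used as a proxy for the Huffman-coded rate.
import numpy as np

def entropy_bits_per_sample(values):
    """Zeroth-order empirical entropy of integer-valued samples, in bits/sample."""
    _, counts = np.unique(values, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 100)                           # one beat per row, ~100 sps
template = np.exp(-((t - 0.4) ** 2) / 0.002)         # crude QRS-like pulse
beats = template[None, :] + 0.02 * rng.standard_normal((200, 100))
quantized = np.round(beats / 0.035).astype(int)      # coarse quantization step

average_beat = np.round(quantized.mean(axis=0)).astype(int)
residual = quantized - average_beat                  # residual to be entropy coded
# (the paper additionally differences the residual and may keep multiple averages)

print("bits/sample, quantized beats :", round(entropy_bits_per_sample(quantized), 2))
print("bits/sample, residual signal :", round(entropy_bits_per_sample(residual), 2))
```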

117 citations


"ECG data compression using truncate..." refers background in this paper

  • ...For instance, long-term prediction [12] and average beat subtraction [13] reportedly use the beat-to-beat correlation....


Proceedings ArticleDOI
20 Sep 1995
TL;DR: This paper proposes a new ECG signal compression algorithm using a discrete symmetric wavelet transform that may find applications in digital Holter recording, in ECG signal archiving, and in ECG data transmission through communication channels.
Abstract: This paper proposes a new ECG signal compression algorithm using a discrete symmetric wavelet transform. This proposed compression scheme may find applications in digital Holter recording, in ECG signal archiving, and in ECG data transmission through communication channels. Using the new method, a compression ratio of 8 to 1 can be achieved with PRD=3.9%, in contrast to the AZTEC compression ratio of 6.8 to 1 with PRD=10.0% and the fan algorithm compression ratio of 7.4 to 1 with PRD=8.1%.
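The snippet below sketches the general wavelet-thresholding approach to ECG compression using PyWavelets. The paper's specific symmetric wavelet, decomposition depth, and coding stage are not reproduced; the 'sym4' wavelet, five decomposition levels, and the keep-one-eighth threshold are illustrative choices.

```python
# Assumed sketch of wavelet-thresholding compression (requires PyWavelets); the
# wavelet, level, and keep ratio are illustrative, not the paper's exact choices.
import numpy as np
import pywt

rng = np.random.default_rng(0)
t = np.linspace(0, 4, 1440)                                # 4 s of synthetic signal
signal = np.sin(2 * np.pi * 1.2 * t) + 0.05 * rng.standard_normal(t.size)

coeffs = pywt.wavedec(signal, "sym4", level=5)             # near-symmetric wavelet
all_coeffs = np.concatenate(coeffs)
threshold = np.quantile(np.abs(all_coeffs), 0.875)         # keep roughly 1/8 of them
kept = [pywt.threshold(c, threshold, mode="hard") for c in coeffs]
reconstructed = pywt.waverec(kept, "sym4")[: signal.size]

prd = 100 * np.linalg.norm(signal - reconstructed) / np.linalg.norm(signal)
n_nonzero = sum(int(np.count_nonzero(c)) for c in kept)
print(f"kept {n_nonzero}/{all_coeffs.size} coefficients, PRD = {prd:.2f}%")
```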

106 citations

Journal ArticleDOI
TL;DR: In a sample of 220 Frank-lead ECGs, the removal of signal redundancy by second-order prediction or interpolation with subsequent entropy encoding of the respective residual errors was investigated; interpolation provided a 6 dB smaller residual error variance than prediction.
Abstract: Compression of digital electrocardiogram (ECG) signals is desirable for two reasons: economic use of storage space for databases and reduction of the data transmission rate for compatibility with telephone lines. In a sample of 220 Frank-lead ECGs, the removal of signal redundancy by second-order prediction or interpolation with subsequent entropy encoding of the respective residual errors was investigated. At the sampling rate of 200 Hz, interpolation provided a 6 dB smaller residual error variance than prediction. A near-optimal value for the interpolation coefficients is 0.5, permitting simple implementation of the algorithm and requiring a word length for arithmetic processing of only 2 bits in extent of the signal precision. For linear prediction, the effects of occasional transmission errors decay exponentially, whereas for interpolation they do not, necessitating error control in certain applications. Encoding of the interpolation errors by a Huffman code truncated to ±5 quantization levels of 30 μV required an average word length of 2.21 bits/sample (upper 96th percentile 3 bits/sample), resulting in data transmission rates of 1327 bits/s (1800 bits/s) for three simultaneous leads sampled at the rate of 200 Hz. Thus, compared with the original signal of 8-bit samples at 500 Hz, the average compression is 9:1. Encoding of the prediction errors required an average word length of 2.67 bits/sample with a 96th percentile of 5.5 bits/sample, making this method less suitable for synchronous transmission.
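The snippet below illustrates the two redundancy-removal schemes compared in the abstract on a synthetic signal: a common second-order polynomial predictor from past samples, and interpolation from both neighbors with coefficient 0.5, each followed by an empirical entropy estimate of the residual. The test signal, quantization step, and the particular predictor form are assumptions for illustration.

```python
# Assumed sketch comparing second-order prediction with 0.5-coefficient
# interpolation on a quantized synthetic signal, via residual entropy estimates.
import numpy as np

def entropy_bits_per_sample(values):
    _, counts = np.unique(values, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(0)
t = np.arange(2000) / 200.0                           # 200 Hz sampling
waveform = np.sin(2 * np.pi * 1.2 * t) + 0.05 * rng.standard_normal(t.size)
x = np.round(waveform / 0.030).astype(int)            # coarse quantization step

# One common second-order predictor: x_hat[n] = 2*x[n-1] - x[n-2]
prediction_residual = x[2:] - (2 * x[1:-1] - x[:-2])

# Interpolation with coefficient 0.5: x_hat[n] = 0.5 * (x[n-1] + x[n+1])
interpolation_residual = x[1:-1] - np.round(0.5 * (x[:-2] + x[2:])).astype(int)

print("prediction residual   :",
      round(entropy_bits_per_sample(prediction_residual), 2), "bits/sample")
print("interpolation residual:",
      round(entropy_bits_per_sample(interpolation_residual), 2), "bits/sample")
```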

106 citations

Journal ArticleDOI
TL;DR: A new concept of decomposition of a signal into component periodic waveforms is presented; the signal is configured for the sequential extraction of successively weaker components with different period lengths.
Abstract: This paper presents a new concept of decomposition of a signal into component periodic waveforms. The singular value decomposition (SVD) is used for the detection of periodicity and separation of the component signals. The signal is configured for the sequential extraction of successively weaker components with different period lengths. The approach enjoys the numerical stability associated with SVD.

93 citations

Journal ArticleDOI
TL;DR: An orthogonalization method to eliminate unwanted signal components in standard 12-lead exercise electrocardiograms (ECG's) is presented and it is observed that the first two decomposed channels with highest energy are sufficient to reconstruct the ST-segment and J-point.
Abstract: An orthogonalization method to eliminate unwanted signal components in standard 12-lead exercise electrocardiograms (ECG's) is presented in this work. A singular-value-decomposition-based algorithm is proposed to decompose the signal into two time-orthogonal subspaces; one containing the ECG and the other containing artifacts like baseline wander and electromyogram. The method makes use of redundancy in 12-lead ECG. The same method is also tested for reconstruction of a completely lost channel. The on-line implementation of the method is given. It is observed that the first two decomposed channels with highest energy are sufficient to reconstruct the ST-segment and J-point. The dimension of the signal space, on the other hand, does not exceed three. Data from 23 patients, with duration ranging from 9 to 21 min, are used.
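A minimal sketch of the subspace-separation idea follows: the leads are stacked as rows, the leading singular components are kept as the low-dimensional ECG subspace, and the remainder is treated as artifact. The twelve synthetic leads, the rank-2 cardiac content, and the baseline-wander model below are illustrative assumptions, not the paper's on-line algorithm.

```python
# Assumed sketch: split a multi-lead recording into a low-rank "ECG" subspace and
# a residual "artifact" subspace with the SVD; leads and ranks are illustrative.
import numpy as np

def split_subspaces(leads, signal_rank=2):
    """leads: (n_leads, n_samples) array. Returns (signal_part, artifact_part)."""
    U, s, Vt = np.linalg.svd(leads, full_matrices=False)
    signal = (U[:, :signal_rank] * s[:signal_rank]) @ Vt[:signal_rank, :]
    return signal, leads - signal

rng = np.random.default_rng(0)
t = np.linspace(0, 10, 2000)
basis = np.vstack([np.sin(2 * np.pi * 1.2 * t), np.sin(2 * np.pi * 2.4 * t)])
mixing = rng.uniform(0.3, 1.0, (12, 2))
cardiac = mixing @ basis                              # rank-2 content shared by leads
wander = np.outer(rng.uniform(-0.1, 0.1, 12), np.sin(2 * np.pi * 0.05 * t))
recording = cardiac + wander + 0.02 * rng.standard_normal(cardiac.shape)

signal_part, artifact_part = split_subspaces(recording, signal_rank=2)
error = np.linalg.norm(signal_part - cardiac) / np.linalg.norm(cardiac)
print("relative error of the recovered cardiac subspace:", round(error, 3))
```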

85 citations


"ECG data compression using truncate..." refers methods in this paper

  • ...The concept was also presented in literature [19], in which SVD was used to remove the artifacts and electromyogram in a 12-lead ECG, and tested for reconstruction of a completely lost channel....
