Journal ArticleDOI

ECG data compression using truncated singular value decomposition

01 Dec 2001-Vol. 5, Iss: 4, pp 290-299
TL;DR: The results showed that the truncated SVD method can provide efficient coding with high compression ratios, demonstrating it as an effective technique for ECG data storage and signal transmission.
Abstract: The method of truncated singular value decomposition (SVD) is proposed for electrocardiogram (ECG) data compression. The signal decomposition capability of SVD is exploited to extract the significant feature components of the ECG by decomposing the ECG into a set of basic patterns with associated scaling factors. Because of the strong interbeat correlation among ECG cycles, the signal information is mostly concentrated within a small number of singular values and their related singular vectors. Therefore, only the relevant singular triplets need to be retained as the compressed data for retrieving the original signals; the insignificant components can be truncated to eliminate redundancy in the ECG data. The Massachusetts Institute of Technology-Beth Israel Hospital (MIT-BIH) arrhythmia database was used to evaluate the compression performance and the recoverability of the retrieved ECG signals. An average data rate of 143.2 b/s was achieved with a relatively low reconstruction error. These results showed that the truncated SVD method can provide efficient coding with high compression ratios. Its computational efficiency in comparison with other techniques demonstrated the method as an effective technique for ECG data storage and signal transmission.
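The core idea of the abstract can be sketched in a few lines of NumPy. This is a hedged illustration, not the paper's implementation: the synthetic "beats" stand in for segmented, period-normalized ECG cycles, and the choice k = 3 is arbitrary.

```python
import numpy as np

# Assumption: beats are already segmented and period-normalized so that
# each row of X is one ECG cycle; here we fake them with a noisy sinusoid.
rng = np.random.default_rng(0)
beat = np.sin(np.linspace(0, 2 * np.pi, 100))          # stand-in for one cycle
X = np.vstack([beat + 0.01 * rng.standard_normal(100) for _ in range(50)])

U, s, Vt = np.linalg.svd(X, full_matrices=False)

k = 3                                                  # retained singular triplets
X_hat = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]          # rank-k reconstruction

# Compression ratio: store k*(m + n + 1) numbers instead of m*n.
m, n = X.shape
cr = (m * n) / (k * (m + n + 1))
prd = 100 * np.linalg.norm(X - X_hat) / np.linalg.norm(X)
```

Because the rows are strongly correlated (as ECG cycles are), almost all the energy sits in the first few singular triplets, so the rank-3 reconstruction error stays small while the storage cost drops by roughly an order of magnitude.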
Citations
Journal ArticleDOI
TL;DR: This paper presents a high performance quality-guaranteed two-dimensional (2D) single-lead ECG compression algorithm using singular value decomposition (SVD) and lossless-ASCII-character-encoding (LLACE)-based techniques.

14 citations

Proceedings ArticleDOI
18 Dec 2007
TL;DR: A new and simple target data rate (TDR)-driven wavelet-threshold-based cardiac signal compression algorithm is presented for mobile telemedicine applications; it is less complex because it does not require QRS detection, amplitude and period normalization, or period sorting.
Abstract: One of the emerging issues in telehealth care is how effectively to exploit the limited but well-established mobile technologies that are now almost globally available. The main challenge is to develop a mobile telemedicine system that transmits biosignals directly to a specialist in an emergency medical care unit for monitoring and diagnosis using an unmodified mobile telephone, providing the patient's information on the spot without unnecessary delays in seeking care, accessing a health facility, or receiving adequate care at the facility. Cardiovascular diseases (CVD) are widespread health problems with unpredictable and life-threatening consequences in most regions of the world, and transmitting cardiac data for their diagnosis over the limited capacity of GSM/GPRS/EDGE/UMTS networks calls for biosignal compression, which is the focus of this paper. Therefore, a new and simple target data rate (TDR)-driven wavelet-threshold-based cardiac signal compression algorithm is presented for mobile telemedicine applications. The performance of the compression system is assessed in terms of compression efficiency, reconstructed signal quality, and coding delay. The algorithm is tested using MIT-BIH ECG database records and qdheart PCG database records, and the experimental results are compared with other wavelet-based ECG coders. The presented algorithm is less complex because it does not require QRS detection, amplitude and period normalization, or period sorting.
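A minimal illustration of the wavelet-threshold idea, assuming a one-level orthonormal Haar transform written in pure NumPy and a keep-fraction in place of the paper's target-data-rate-driven threshold selection; the signal is synthetic, not an MIT-BIH record.

```python
import numpy as np

def haar_fwd(x):
    """One level of the orthonormal Haar DWT: approximation and detail bands."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)
    return a, d

def haar_inv(a, d):
    """Exact inverse of haar_fwd."""
    x = np.empty(2 * a.size)
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x

# Hard-threshold the detail band so that only a target fraction of
# coefficients survives -- a stand-in for TDR-driven threshold selection.
rng = np.random.default_rng(1)
x = np.sin(np.linspace(0, 8 * np.pi, 1024)) + 0.02 * rng.standard_normal(1024)
a, d = haar_fwd(x)

keep = 0.1                                    # keep the ~10% largest details
thr = np.quantile(np.abs(d), 1 - keep)
d_t = np.where(np.abs(d) >= thr, d, 0.0)

x_hat = haar_inv(a, d_t)
prd = 100 * np.linalg.norm(x - x_hat) / np.linalg.norm(x)
```

Because the transform is orthonormal, the reconstruction error equals the energy of the discarded coefficients, so for a smooth signal plus low-level noise the distortion stays small even when most detail coefficients are zeroed.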

13 citations


Cites background from "ECG data compression using truncate..."

  • ...These techniques can increase computational complexity and the memory space requirements for realtime system [4], [5]....

Journal ArticleDOI
01 Dec 2021
TL;DR: In this paper, a simple but efficient method is proposed that utilizes singular value decomposition (SVD) to decompose ECG signals and then applies the decompressed data to a convolutional neural network (CNN) and a support vector machine (SVM) for classification.
Abstract: Electrocardiogram (ECG) monitoring systems are widely applied in tele-cardiology healthcare programs, where ECG signals must be compressed for transmission and storage. Previous studies attempted to achieve a high-quality decompressed signal with a compression ratio as high as possible. In this paper, we investigated ECG arrhythmia classification performance on ECG signals decompressed after lossy compression with a high compression ratio. We proposed a simple but efficient method utilizing singular value decomposition (SVD) to decompose ECG signals, then applied the decompressed data to a convolutional neural network (CNN) and a support vector machine (SVM) for classification. Using an optimization method with accuracy and compression ratio as objective functions, the highest average accuracy obtained is above 96% when the selected number of singular values is only 3. The evaluation results illustrated that the decompressed ECG signal, even with relatively high distortion, can still achieve satisfying performance in arrhythmia classification. Thus, we showed that the real-time nature of remote mobile ECG monitoring systems can be greatly improved, to the benefit of the many people in need of ECG diagnosis.
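The compress-then-classify pipeline can be sketched as follows. Assumptions: two synthetic beat morphologies stand in for MIT-BIH arrhythmia records, and a nearest-centroid rule stands in for the paper's CNN/SVM classifiers; only the rank-3 SVD truncation matches the paper's setting.

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0, 2 * np.pi, 64)
# Two synthetic beat morphologies (stand-ins for normal vs. arrhythmic cycles).
class0 = np.sin(t)
class1 = np.sin(t) + 0.8 * np.sin(3 * t)
X = np.vstack([c + 0.05 * rng.standard_normal(64)
               for c in [class0] * 40 + [class1] * 40])
y = np.array([0] * 40 + [1] * 40)

# Lossy compression: keep only 3 singular triplets, as in the paper.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
X_hat = U[:, :3] @ np.diag(s[:3]) @ Vt[:3, :]

# Nearest-centroid classification on the decompressed beats (a stand-in
# for CNN/SVM; train and test sets are not split in this toy sketch).
c0, c1 = X_hat[y == 0].mean(axis=0), X_hat[y == 1].mean(axis=0)
pred = (np.linalg.norm(X_hat - c1, axis=1)
        < np.linalg.norm(X_hat - c0, axis=1)).astype(int)
acc = (pred == y).mean()
```

The point the abstract makes survives the toy setting: the class-discriminating structure lives in the leading singular subspace, so classification accuracy holds up even after aggressive rank-3 truncation.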

13 citations

Proceedings ArticleDOI
16 Jul 2008
TL;DR: This work attempts to evaluate how closely the objective quality measures agree with subjective assessment; the investigation may help to suggest a better quality criterion for optimizing rate-distortion algorithms.
Abstract: Measurement of quality is of fundamental importance to electrocardiogram (ECG) signal processing applications. A number of distortion measures are used for ECG signal quality assessment. A simple and widely used distortion measure is the percentage root mean square difference (PRD). It is an attractive measure due to its simplicity and mathematical convenience. But PRD is not a good measure of the true compression error and results in poor diagnostic relevance. In this paper, we discuss the advantages and drawbacks of the objective distortion measures using different compressed signals. Extensive analysis shows that although some distortion measures correlate well with the subjective evaluation for distortions resulting from a given compression method, they may not be reliable for evaluating other compression distortions. It is also concluded that a distortion measure should be subjectively meaningful, so that large and small quantitative values correspond to bad and good quality respectively. This work attempts to evaluate how closely the objective quality measures agree with subjective assessment; the investigation may help to suggest a better quality criterion for optimizing rate-distortion algorithms. Experimental results show that the wavelet energy based diagnostic distortion (WEDD) measure is significantly better than other measures. This measure is sensitive to ECG feature changes and insensitive to smoothing of low-level background noise.
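The PRD discussed above is easy to state in code, and this sketch also shows one reason it can mislead: whether the DC baseline is included in the reference energy changes the reported distortion by an order of magnitude for the same absolute error (synthetic signal, not one of the paper's test records).

```python
import numpy as np

def prd(x, x_hat, remove_mean=True):
    """Percentage root-mean-square difference. With remove_mean=True the
    reference energy excludes the DC baseline (the mean-removed variant)."""
    ref = x - x.mean() if remove_mean else x
    return 100 * np.linalg.norm(x - x_hat) / np.linalg.norm(ref)

x = np.sin(np.linspace(0, 4 * np.pi, 500)) + 5.0   # signal riding on a baseline
x_hat = x + 0.05                                    # constant reconstruction offset

p_total = prd(x, x_hat, remove_mean=False)  # baseline energy hides the error
p_ac = prd(x, x_hat, remove_mean=True)      # mean-removed reference
```

The identical absolute error yields a far smaller PRD when the DC offset is counted in the denominator, which is one concrete way a "low" PRD can have poor diagnostic relevance.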

13 citations


Cites background or methods from "ECG data compression using truncate..."

  • ...In most of the methods, the compression performance is evaluated using the noisy records [5], [6]....

  • ...This may mislead the judgement of the signal quality when the signal contains more noise [6]....

  • ...Under this condition, filtering effect due to lossy compression methodologies employed may be dissimilar [6]....

  • ...Many lossy ECG compression methods are reported in literature and their performances are assessed in terms of compression ratio, reconstructed signal quality and coding delay [1]-[6]....

Journal ArticleDOI
TL;DR: A novel illumination-insensitive texture feature descriptor (Weber synergistic center-surround pattern) is presented, along with a novel distance measurement model, the weighted similarity measurement model (WSMM), which improves classification accuracy by fully utilizing the information content and orientation distribution of each pattern.

13 citations

References
Book
01 Jan 1983

34,729 citations


"ECG data compression using truncate..." refers background in this paper

  • ...Therefore, the SVD of the matrix A can be performed as A = UΣVᵀ [20], where the columns of U and V are the left and right singular vectors, respectively....

Journal ArticleDOI
TL;DR: The theoretical bases behind the direct ECG data compression schemes are presented and classified into three categories: tolerance-comparison compression, DPCM, and entropy coding methods and a framework for evaluation and comparison of ECG compression schemes is presented.
Abstract: Electrocardiogram (ECG) compression techniques are compared, and a unified view of these techniques is established. ECG data compression schemes are presented in two major groups: direct data compression and transformation methods. The direct data compression techniques are ECG differential pulse code modulation (DPCM) and entropy coding, AZTEC, Turning-point, CORTES, Fan and SAPA algorithms, peak-picking, and cycle-to-cycle compression methods. The transformation methods include Fourier, Walsh, and Karhunen-Loeve transforms. The theoretical bases behind the direct ECG data compression schemes are presented and classified into three categories: tolerance-comparison compression, DPCM, and entropy coding methods. A framework for evaluation and comparison of ECG compression schemes is presented.
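Two of the direct-compression categories named here, DPCM and entropy coding, can be illustrated together: first-order differencing concentrates a smooth quantized signal into a few symbols, lowering the empirical entropy (bits per sample) that an entropy coder could approach. This is a sketch on a synthetic sinusoid, not ECG data.

```python
import numpy as np

def empirical_entropy(q):
    """Bits per sample under an i.i.d. model of the quantized sequence --
    a lower bound on what a simple entropy coder would spend."""
    _, counts = np.unique(q, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

# Quantized smooth signal: many distinct amplitude levels, high entropy.
x = np.round(100 * np.sin(np.linspace(0, 6 * np.pi, 2000))).astype(int)
h_raw = empirical_entropy(x)

# First-order DPCM residual: successive differences take only a few
# values for a slowly varying signal, so the entropy drops sharply.
h_dpcm = empirical_entropy(np.diff(x))
```

The gap between `h_raw` and `h_dpcm` is exactly the leverage DPCM hands to the entropy coder: prediction first, then code the small residuals.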

690 citations


"ECG data compression using truncate..." refers methods in this paper

  • ...The compression techniques for an ECG have been extensively discussed [ 1 ] and can be classified into the following three major categories....

Journal ArticleDOI
TL;DR: A wavelet electrocardiogram (ECG) data codec based on the set partitioning in hierarchical trees (SPIHT) compression algorithm is proposed and is significantly more efficient in compression and in computation than previously proposed ECG compression schemes.
Abstract: A wavelet electrocardiogram (ECG) data codec based on the set partitioning in hierarchical trees (SPIHT) compression algorithm is proposed in this paper. The SPIHT algorithm (A. Said and W.A. Pearlman, IEEE Trans. Circuits Syst. Video Technol., vol. 6, pp. 243-250, 1996) has achieved notable success in still image coding. The authors modified the algorithm for the one-dimensional case and applied it to compression of ECG data. Experiments on selected records from the MIT-BIH arrhythmia database revealed that the proposed codec is significantly more efficient in compression and in computation than previously proposed ECG compression schemes. The coder also attains exact bit rate control and generates a bit stream progressive in quality or rate.

521 citations

Journal ArticleDOI
TL;DR: Pilot data from a blind evaluation of compressed ECGs by cardiologists suggest that the clinically useful information present in original ECG signals is preserved by 8:1 compression, and in most cases 16:1 compressed ECGs are clinically useful.
Abstract: Wavelets and wavelet packets have recently emerged as powerful tools for signal compression. Wavelet and wavelet packet-based compression algorithms based on embedded zerotree wavelet (EZW) coding are developed for electrocardiogram (ECG) signals, and eight different wavelets are evaluated for their ability to compress Holter ECG data. Pilot data from a blind evaluation of compressed ECGs by cardiologists suggest that the clinically useful information present in original ECG signals is preserved by 8:1 compression, and in most cases 16:1 compressed ECGs are clinically useful.

445 citations


"ECG data compression using truncate..." refers methods in this paper

  • ...[23]) provides a better performance than previous wavelet-based methods (Hilton [22] and Djohan et al....

Journal ArticleDOI
TL;DR: A preprocessing program developed for real-time monitoring of the electrocardiogram by digital computer has proved useful for rhythm analysis.
Abstract: A preprocessing program developed for real-time monitoring of the electrocardiogram by digital computer has proved useful for rhythm analysis. The program suppresses low amplitude signals, reduces the data rate by a factor of about 10, and codes the result in a form convenient for analysis.
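The tolerance-comparison idea behind this kind of AZTEC-style preprocessing can be sketched as a zero-order-hold compressor: runs of samples that stay within a tolerance band are replaced by (value, length) plateaus. This is a simplified stand-in for AZTEC (which also encodes slopes), run on a synthetic trace.

```python
import numpy as np

def zoh_compress(x, tol):
    """Tolerance-comparison (zero-order-hold) compression: extend the
    current plateau while its sample range stays within tol, then emit
    a (midpoint value, run length) pair and start a new plateau."""
    pairs = []
    start = 0
    lo = hi = x[0]
    for i in range(1, len(x)):
        nlo, nhi = min(lo, x[i]), max(hi, x[i])
        if nhi - nlo > tol:
            pairs.append(((lo + hi) / 2, i - start))   # close current plateau
            start, lo, hi = i, x[i], x[i]
        else:
            lo, hi = nlo, nhi
    pairs.append(((lo + hi) / 2, len(x) - start))
    return pairs

def zoh_decompress(pairs):
    """Rebuild the signal as flat plateaus; error is bounded by tol / 2."""
    return np.concatenate([np.full(n, v) for v, n in pairs])

x = 100 * np.sin(np.linspace(0, 4 * np.pi, 1000))      # stand-in slow trace
pairs = zoh_compress(x, tol=10.0)
x_hat = zoh_decompress(pairs)
max_err = np.max(np.abs(x_hat - x))
```

Suppressing low-amplitude wiggle into plateaus is what buys the roughly tenfold data-rate reduction the abstract describes, at the cost of a guaranteed but bounded amplitude error of at most half the tolerance.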

374 citations


"ECG data compression using truncate..." refers methods in this paper

  • ...2) Direct time-domain techniques: including amplitude zone time epoch coding (AZTEC), delta coding, and entropy coding [2]–[4]....
