Journal ArticleDOI

ECG data compression using truncated singular value decomposition

01 Dec 2001-Vol. 5, Iss: 4, pp 290-299
TL;DR: The results showed that the truncated SVD method can provide efficient coding with high compression ratios and demonstrated the method as an effective technique for ECG data storage or signal transmission.
Abstract: The method of truncated singular value decomposition (SVD) is proposed for electrocardiogram (ECG) data compression. The signal decomposition capability of SVD is exploited to extract the significant feature components of the ECG by decomposing the ECG into a set of basic patterns with associated scaling factors. Because of the strong interbeat correlation among ECG cycles, the signal information is mostly concentrated in a small number of singular values and their related singular vectors. Therefore, only the relevant parts of the singular triplets need to be retained as the compressed data for retrieving the original signals; the insignificant components can be truncated to eliminate redundancy. The Massachusetts Institute of Technology-Beth Israel Hospital (MIT-BIH) arrhythmia database was used to evaluate the compression performance and the recoverability of the retrieved ECG signals. The method achieved an average data rate of 143.2 b/s with a relatively low reconstruction error. These results showed that the truncated SVD method can provide efficient coding with high compression ratios. The computational efficiency of the SVD method, compared with other techniques, demonstrates that it is an effective technique for ECG data storage and signal transmission.
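The beat-stacking idea in the abstract can be sketched as follows. This is a hedged illustration with synthetic beats, not the paper's data or parameters: aligned ECG cycles are stacked into a matrix, only the top-k singular triplets are kept, and the compression ratio and percent root-mean-square difference (PRD) are computed from what is stored.

```python
import numpy as np

# Illustrative sketch of truncated-SVD compression (synthetic "beats",
# assumed names and parameters; not the paper's implementation).

rng = np.random.default_rng(0)
n_beats, beat_len = 32, 200
t = np.linspace(0, 1, beat_len)
template = np.exp(-((t - 0.5) ** 2) / 0.002)          # crude QRS-like spike
X = np.vstack([template + 0.01 * rng.standard_normal(beat_len)
               for _ in range(n_beats)])              # strongly correlated beats

U, s, Vt = np.linalg.svd(X, full_matrices=False)

k = 3                                                 # retained singular triplets
X_hat = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# Storage: k left vectors (n_beats), k right vectors (beat_len), k values
stored = k * (n_beats + beat_len + 1)
cr = X.size / stored                                  # compression ratio
prd = 100 * np.linalg.norm(X - X_hat) / np.linalg.norm(X)

print(f"CR = {cr:.1f}:1, PRD = {prd:.2f}%")
```

Because the beats are nearly identical, a rank far below min(n_beats, beat_len) already reconstructs the matrix closely, which is exactly the redundancy the truncation exploits.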
Citations
Proceedings ArticleDOI
01 Dec 2017
TL;DR: This paper aims at the construction of new indices through clustering of the AAindex database with correlation distance, and suggests that, owing to the correlation of these new maps with groups of AAindex indices (in clusters), they have the potential to be used for numerical representation of protein sequences in different studies.
Abstract: As a first step of genomics signal processing, the alphabetical sequence is mapped to a numerical one. The choice of mapping technique depends on the application and affects the result of the study. Since biological function is the result of amino acid interactions, a meaningful way to convert a sequence from alphabetical to numerical form is to use the physico-chemical and biochemical properties of amino acids. The AAindex database is a rich collection of such properties that can be used for numerical representation of proteins. Each of these properties gives a viewpoint in the study of biological functions. Taking into account all AAindex indices leads to a multi-viewpoint representation and provides more options to observe and study the target biological phenomena. But this advantage increases the number of variables, the space dimension, and the computation time. Since the AAindex indices are correlated, compact versions of the correlated indices are extracted to handle the increase in space dimension. This paper aims at the construction of new indices through clustering of the AAindex database with correlation distance. The results suggest that, owing to the correlation of these new maps with groups of AAindex indices (in clusters), they have the potential to be used for numerical representation of protein sequences in different studies.

3 citations


Cites methods from "ECG data compression using truncate..."

  • ...The method of Singular Value Decomposition was proposed in 1970, and has been applied in a wide range of areas such as: image compression, texture processing, and feature extraction [25]....


Journal ArticleDOI
TL;DR: The results obtained were positive with low PRD, PRDN and PMAE at different compression ratios compared to many other loss-type compressing methods, proving the high efficiency of the proposed algorithm.
Abstract: Compressing the ECG signal is considered a feasible solution for supporting a system to manipulate the package size, a major factor leading to congestion in an ECG wireless network. Hence, this paper proposes a compression algorithm, called the advanced two-state algorithm, which achieves three necessary characteristics: a) flexibility towards all ECG signal conditions, b) the ability to adapt to each requirement of the package size and c) be simple enough. In this algorithm, the ECG pattern is divided into two categories: “complex” durations such as QRS complexes, are labeled as low-state durations, and “plain” duratio ns such P or T waves, are labeled as high-state durations. Each duration type can be compressed at different compression ratios, and Piecewise Cubic Spline can be used for reconstructing the signal. For evaluation, the algorithm was applied to 48 records of the MIT-BIH arrhythmia database (clear PQRST complexes) and 9 records of the CU ventricular tachyarrhythmia database (unclear PQRST complexes). Parameters including Compression Ratio (CR), Percentage Root mean square Difference (PRD), Percentage Root mean square Difference, Normalized (PRDN), root mean square (RMS), Signal-to-noise Ratio (SNR) and a new proposed index called Peak Maximum Absolute Error (PMAE) were used to comprehensively evaluate the performance of the algorithm. Eventually, the results obtained were positive with low PRD, PRDN and PMAE at different compression ratios compared to many other loss-type compressing methods, proving the high efficiency of the proposed algorithm. All in all, with its extremely low-cost computation, versatility and good-quality reconstruction, this algorithm could be applied to a number of wireless applications to control package size and overcome congested situations.
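A minimal sketch of the two-state idea follows. The slope-based state detector, the sampling densities, and all names below are assumptions for illustration, not the paper's implementation: high-slope ("complex") regions are kept densely, plain regions sparsely, and the signal is rebuilt with a piecewise cubic spline through the kept samples.

```python
import numpy as np
from scipy.interpolate import CubicSpline

def two_state_compress(x, slope_thresh, dense_step=2, sparse_step=10):
    # Mark samples to keep: a sparse baseline everywhere, plus dense
    # coverage wherever the local slope marks a "complex" duration.
    slope = np.abs(np.gradient(x))
    keep = np.zeros(len(x), dtype=bool)
    keep[::sparse_step] = True                   # "plain" (high-state) sampling
    complex_idx = np.where(slope > slope_thresh)[0]
    keep[complex_idx[::dense_step]] = True       # "complex" (low-state) sampling
    keep[[0, -1]] = True                         # anchor the endpoints
    idx = np.flatnonzero(keep)
    return idx, x[idx]

def two_state_decompress(idx, samples, n):
    # Piecewise cubic spline reconstruction through the kept samples
    return CubicSpline(idx, samples)(np.arange(n))

t = np.linspace(0, 1, 1000)
x = np.exp(-((t - 0.5) ** 2) / 0.0005) + 0.1 * np.sin(2 * np.pi * 3 * t)
idx, kept = two_state_compress(x, slope_thresh=0.02)
x_hat = two_state_decompress(idx, kept, len(x))

cr = len(x) / len(kept)
prd = 100 * np.linalg.norm(x - x_hat) / np.linalg.norm(x)
print(f"CR = {cr:.1f}:1, PRD = {prd:.2f}%")
```

The point of the two-state split is visible here: the spike survives reconstruction because it is sampled densely, while the slowly varying background tolerates sparse sampling, so the overall compression ratio stays high.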

3 citations


Cites methods from "ECG data compression using truncate..."

  • ...Likewise, H[40-42] and 2-D methods[43-48] are even more computationally complex and are difficult for implementing in wireless applications as a step....


Proceedings ArticleDOI
11 Dec 2009
TL;DR: The SVD method is shown to be suitable for denoising the ECG signal; the algorithm for calculating the Singular Value Ratio (SVR) spectrum is improved, and a constructive approach for analyzing characteristic patterns is proposed.
Abstract: The Singular Value Decomposition (SVD) method is introduced to denoise the ECG signal during spaceflight. The theoretical basis of the SVD method is briefly reviewed, and the denoising procedure is demonstrated on a segment of real ECG signal. We improve the algorithm for calculating the Singular Value Ratio (SVR) spectrum and propose a constructive approach for analyzing characteristic patterns. The ECG signal is reproduced very well and the noise is suppressed effectively, showing that the SVD method is suitable for denoising the ECG signal.
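The generic SVD-denoising step this work builds on can be sketched as follows. This is a hedged illustration only; the paper's improved SVR-spectrum calculation and pattern analysis are not reproduced. The 1-D signal is embedded in a Hankel trajectory matrix, small singular values are truncated, and the anti-diagonals are averaged back into a series.

```python
import numpy as np

def svd_denoise(x, window, rank):
    # Embed the signal in a Hankel (trajectory) matrix, truncate the SVD,
    # then Hankelize back by averaging anti-diagonals.
    n = len(x)
    cols = n - window + 1
    H = np.column_stack([x[i:i + window] for i in range(cols)])
    U, s, Vt = np.linalg.svd(H, full_matrices=False)
    H_hat = U[:, :rank] @ np.diag(s[:rank]) @ Vt[:rank, :]
    out = np.zeros(n)
    count = np.zeros(n)
    for j in range(cols):
        out[j:j + window] += H_hat[:, j]
        count[j:j + window] += 1
    return out / count

rng = np.random.default_rng(1)
t = np.linspace(0, 1, 400)
clean = np.sin(2 * np.pi * 5 * t)
noisy = clean + 0.3 * rng.standard_normal(len(t))
den = svd_denoise(noisy, window=50, rank=2)   # a sine occupies rank 2

err_before = np.linalg.norm(noisy - clean)
err_after = np.linalg.norm(den - clean)
print(f"error before: {err_before:.2f}, after: {err_after:.2f}")
```

In practice the retained rank would be chosen from the singular-value (or SVR) spectrum rather than fixed in advance, which is the part the cited paper refines.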

3 citations


Cites methods from "ECG data compression using truncate..."

  • ...Although the SVD method has been widely used in data processing, it has rarely been applied to ECG denoising ([8-10])....


Proceedings ArticleDOI
17 Jul 2019
TL;DR: An approach based on Principal Component Analysis (PCA) has been proposed to compress the pre-processed ECG signal and decompress it efficiently, such that the maximum amount of variance is retained.
Abstract: Several cardiac disorders can be diagnosed by meticulous analysis of ECG signals, and the quality of the signal determines the accuracy of the diagnosis. ECG recordings are usually large and contaminated with noise; compressing them makes storage and transmission easier. Hence, it is important to pre-process (denoise) the signal and compress it as far as possible. Much work has been done on ECG compression in the recent past, using both time-domain and transform-domain techniques. In this work, an approach based on Principal Component Analysis (PCA) is proposed to compress the pre-processed ECG signal and decompress it efficiently, such that the maximum amount of variance is retained. The algorithm was tested on 28 ECG signals from the MIT-BIH database. To analyze its performance, CR (Compression Ratio) and PRD (Percent Root mean square Difference) were used as performance parameters. The proposed method achieves a good CR along with a small PRD in comparison with algorithms proposed by other researchers.
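The variance-retention idea can be sketched as follows. The synthetic beats, the 99% variance threshold, and the helper names are assumptions for illustration, not the cited paper's pipeline: the number of principal components kept is the smallest number whose cumulative explained variance reaches the target.

```python
import numpy as np

def pca_compress(X, var_target=0.99):
    # Center the beat matrix, take the SVD, and keep the smallest number
    # of components whose cumulative variance reaches var_target.
    mu = X.mean(axis=0)
    Xc = X - mu
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    var_ratio = np.cumsum(s ** 2) / np.sum(s ** 2)
    k = int(np.searchsorted(var_ratio, var_target) + 1)
    scores = Xc @ Vt[:k].T                    # projections onto k components
    return mu, Vt[:k], scores, k

def pca_decompress(mu, components, scores):
    return scores @ components + mu

rng = np.random.default_rng(2)
t = np.linspace(0, 1, 180)
beat = np.exp(-((t - 0.4) ** 2) / 0.001)      # crude QRS-like template
X = np.vstack([beat * (1 + 0.2 * rng.standard_normal())
               + 0.002 * rng.standard_normal(len(t)) for _ in range(40)])

mu, comps, scores, k = pca_compress(X)
X_hat = pca_decompress(mu, comps, scores)
prd = 100 * np.linalg.norm(X - X_hat) / np.linalg.norm(X)
print(f"components kept: {k}, PRD = {prd:.2f}%")
```

Storing the mean beat, k components, and the per-beat scores replaces the full beat matrix, which is where the compression comes from.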

3 citations


Cites background from "ECG data compression using truncate..."

  • ...The data points are normalized in order to ensure that the data occupies fixed number of columns in each row [14]....


08 Jul 2008
TL;DR: In this article, principal component analysis (PCA) is used for de-noising the measurement data and quantifying the underlying manufacturing uncertainty in turbine blades, and a method for dimensionality reduction has been proposed which utilizes prior information available on the variance of measurement error for different measurement types.
Abstract: Efficient design of turbine blades is critical to the performance of an aircraft engine. An area of significant research interest is the capture of manufacturing uncertainty in the shapes of these turbine blades. The available data used for estimating this manufacturing uncertainty inevitably contains the effects of measurement error/noise. In the present work, we propose the application of Principal Component Analysis (PCA) for denoising the measurement data and quantifying the underlying manufacturing uncertainty. Once the PCA is performed, a method for dimensionality reduction is proposed that utilizes prior information on the variance of the measurement error for different measurement types. Numerical studies indicate that approximately 82% of the variation in the measurements from their design values is accounted for by manufacturing uncertainty, while the remaining 18% is filtered out as measurement error.

3 citations

References
Book
01 Jan 1983

34,729 citations


"ECG data compression using truncate..." refers background in this paper

  • ...Therefore, the SVD of the matrix can be performed as [20], where are the left and right singular vectors, respectively....


Journal ArticleDOI
TL;DR: The theoretical bases behind the direct ECG data compression schemes are presented and classified into three categories: tolerance-comparison compression, DPCM, and entropy coding methods and a framework for evaluation and comparison of ECG compression schemes is presented.
Abstract: Electrocardiogram (ECG) compression techniques are compared, and a unified view of these techniques is established. ECG data compression schemes are presented in two major groups: direct data compression and transformation methods. The direct data compression techniques are ECG differential pulse code modulation (DPCM) and entropy coding, AZTEC, Turning-point, CORTES, Fan and SAPA algorithms, peak-picking, and cycle-to-cycle compression methods. The transformation methods include Fourier, Walsh, and Karhunen-Loeve transforms. The theoretical bases behind the direct ECG data compression schemes are presented and classified into three categories: tolerance-comparison compression, DPCM, and entropy coding methods. A framework for evaluation and comparison of ECG compression schemes is presented.

690 citations


"ECG data compression using truncate..." refers methods in this paper

  • ...The compression techniques for an ECG have been extensively discussed [ 1 ] and can be classified into the following three major categories....


Journal ArticleDOI
TL;DR: A wavelet electrocardiogram (ECG) data codec based on the set partitioning in hierarchical trees (SPIHT) compression algorithm is proposed and is significantly more efficient in compression and in computation than previously proposed ECG compression schemes.
Abstract: A wavelet electrocardiogram (ECG) data codec based on the set partitioning in hierarchical trees (SPIHT) compression algorithm is proposed in this paper. The SPIHT algorithm (A. Said and W.A. Pearlman, IEEE Trans. Circuits Syst. II, vol. 6, p. 243-50, 1996) has achieved notable success in still image coding. The authors modified the algorithm for the one-dimensional case and applied it to compression of ECG data. Experiments on selected records from the MIT-BIH arrhythmia database revealed that the proposed codec is significantly more efficient in compression and in computation than previously proposed ECG compression schemes. The coder also attains exact bit rate control and generates a bit stream progressive in quality or rate.

521 citations

Journal ArticleDOI
TL;DR: Pilot data from a blind evaluation of compressed ECG's by cardiologists suggest that the clinically useful information present in original ECG signals is preserved by 8:1 compression, and in most cases 16:1 compressed ECGs are clinically useful.
Abstract: Wavelets and wavelet packets have recently emerged as powerful tools for signal compression. Wavelet and wavelet packet-based compression algorithms based on embedded zerotree wavelet (EZW) coding are developed for electrocardiogram (ECG) signals, and eight different wavelets are evaluated for their ability to compress Holter ECG data. Pilot data from a blind evaluation of compressed ECG's by cardiologists suggest that the clinically useful information present in original ECG signals is preserved by 8:1 compression, and in most cases 16:1 compressed ECG's are clinically useful.

445 citations


"ECG data compression using truncate..." refers methods in this paper

  • ...[23]) provides a better performance than previous wavelet-based methods (Hilton [22] and Djohan et al....


Journal ArticleDOI
TL;DR: A preprocessing program developed for real-time monitoring of the electrocardiogram by digital computer has proved useful for rhythm analysis.
Abstract: A preprocessing program developed for real-time monitoring of the electrocardiogram by digital computer has proved useful for rhythm analysis. The program suppresses low amplitude signals, reduces the data rate by a factor of about 10, and codes the result in a form convenient for analysis.
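A plateau-only sketch of AZTEC-style coding, the family of direct time-domain methods this reference belongs to, is shown below. This is a simplification: the full algorithm also encodes slopes, and the tolerance and names here are illustrative assumptions. Runs of samples that stay within a tolerance band are replaced by (value, run-length) pairs.

```python
import numpy as np

def aztec_plateaus(x, eps):
    # Replace each run of samples whose min-max band stays within 2*eps
    # by a single (midpoint value, run length) pair.
    out = []
    lo = hi = x[0]
    start = 0
    for i in range(1, len(x)):
        new_lo, new_hi = min(lo, x[i]), max(hi, x[i])
        if new_hi - new_lo > 2 * eps:     # band exceeded: close the plateau
            out.append(((hi + lo) / 2, i - start))
            lo = hi = x[i]
            start = i
        else:
            lo, hi = new_lo, new_hi
    out.append(((hi + lo) / 2, len(x) - start))
    return out

def aztec_expand(plateaus):
    # Zero-order-hold reconstruction from (value, length) pairs
    return np.concatenate([np.full(n, v) for v, n in plateaus])

x = np.concatenate([np.zeros(50), np.linspace(0, 1, 10),
                    np.ones(5), np.linspace(1, 0, 10), np.zeros(50)])
pl = aztec_plateaus(x, eps=0.05)
x_hat = aztec_expand(pl)
print(len(pl), "plateaus for", len(x), "samples")
```

Flat stretches collapse to a single pair each, which is how this style of coding reaches roughly an order-of-magnitude data reduction on signals with long isoelectric segments; steep QRS-like ramps, by contrast, break into many short plateaus, which is why the full AZTEC adds slope encoding.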

374 citations


"ECG data compression using truncate..." refers methods in this paper

  • ...2) Direct time-domain techniques: including amplitude zone time epoch coding (AZTEC), delta coding, and entropy coding [2]–[4]....
