Journal ArticleDOI

ECG data compression using truncated singular value decomposition

01 Dec 2001-Vol. 5, Iss: 4, pp 290-299
TL;DR: The results showed that the truncated SVD method can provide efficient coding with high compression ratios and demonstrated it as an effective technique for ECG data storage or signal transmission.
Abstract: The method of truncated singular value decomposition (SVD) is proposed for electrocardiogram (ECG) data compression. The signal decomposition capability of SVD is exploited to extract the significant feature components of the ECG by decomposing the ECG into a set of basic patterns with associated scaling factors. Because of the strong interbeat correlation among ECG cycles, the signal information is mostly concentrated within a small number of singular values and their related singular vectors. Therefore, only these relevant singular triplets need to be retained as the compressed data for retrieving the original signals; the insignificant triplets can be truncated to eliminate the redundancy in the ECG data. The Massachusetts Institute of Technology-Beth Israel Hospital (MIT-BIH) arrhythmia database was used to evaluate the compression performance and the recoverability of the retrieved ECG signals. The method achieved an average data rate of 143.2 b/s with a relatively low reconstruction error. These results showed that the truncated SVD method can provide efficient coding with high compression ratios. Its computational efficiency, compared with other techniques, demonstrates that it is an effective technique for ECG data storage or signal transmission.
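To make the compression idea concrete, here is a minimal sketch (not the authors' implementation) of truncating the SVD of a beat-aligned ECG matrix and measuring the reconstruction error. It assumes beats have already been segmented and resampled to a common length; the synthetic beat matrix, the chosen rank, and the PRD measure are illustrative assumptions.

```python
import numpy as np

def compress_truncated_svd(beat_matrix, rank):
    """Keep only the `rank` largest singular triplets of a matrix whose
    rows are time-aligned ECG beats."""
    # Thin SVD: beat_matrix = U @ diag(s) @ Vt
    U, s, Vt = np.linalg.svd(beat_matrix, full_matrices=False)
    return U[:, :rank], s[:rank], Vt[:rank, :]

def reconstruct(U_r, s_r, Vt_r):
    """Rebuild an approximation of the beat matrix from the retained triplets."""
    return U_r @ np.diag(s_r) @ Vt_r

# Illustrative usage on a synthetic, strongly correlated "beat" matrix
# (100 beats x 250 samples); a real use would build the matrix from
# segmented, period-normalized ECG cycles.
rng = np.random.default_rng(0)
template = np.sin(np.linspace(0, 2 * np.pi, 250)) ** 3
beats = template + 0.05 * rng.standard_normal((100, 250))
U_r, s_r, Vt_r = compress_truncated_svd(beats, rank=3)
approx = reconstruct(U_r, s_r, Vt_r)
prd = 100 * np.linalg.norm(beats - approx) / np.linalg.norm(beats)
print(f"PRD: {prd:.2f}%")  # percent RMS difference, a common ECG distortion measure
```

The compressed representation is just the retained triplets, so its size grows as rank × (beats + samples + 1) values rather than beats × samples.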
Citations
Journal ArticleDOI
16 Jun 2021-IRBM
TL;DR: A methodological review of different ECG data compression techniques based on their experimental performance on ECG records of the Massachusetts Institute of Technology-Beth Israel Hospital (MIT-BIH) arrhythmia database; it also covers different validation methods for ECG compression techniques.
Abstract: Objective: Globally, cardiovascular diseases (CVDs) are among the leading causes of death. Electrocardiogram (ECG) signals are widely used in medical screening and diagnostic procedures for CVDs, and early detection requires acquisition of longer ECG recordings. This has triggered the development of personal healthcare systems that cardio-patients can use to manage the disease. These systems continuously record, store, and transmit ECG data via wired/wireless communication channels, which raises issues such as limited data storage, limited bandwidth, and limited battery life. ECG data compression techniques can resolve all of these issues. Method: Numerous ECG data compression techniques have been proposed in the past. This paper presents a methodological review of different ECG data compression techniques based on their experimental performance on ECG records of the Massachusetts Institute of Technology-Beth Israel Hospital (MIT-BIH) arrhythmia database. Results: The experimental performance of different compression techniques depends on several parameters, and the existing techniques are validated using different distortion measures. Conclusion: This study elaborates the advantages and disadvantages of different ECG data compression techniques and surveys their validation methods. Although compression techniques have been developed very widely, validation of compression methods is still an open research area for achieving efficient and reliable performance.

11 citations

Journal ArticleDOI
TL;DR: Evaluation results show that the proposed algorithm provides a good compression performance; in particular, the mean opinion score of the reconstructed signal falls under the category “very good” as per the gold standard subjective measure.
Abstract: Advancements in electronics and miniaturized device fabrication technologies have enabled simultaneous acquisition of multiple biosignals (MBioSigs), but the compression of MBioSigs remains unexplored to date. This paper presents, to the best of our knowledge for the first time, a robust singular value decomposition (SVD) and American Standard Code for Information Interchange (ASCII) character encoding-based algorithm for compression of MBioSigs. At the preprocessing stage, MBioSigs are denoised, downsampled, and then transformed to a two-dimensional (2-D) data array. SVD of the 2-D array is carried out and the dimensionality of the singular values is reduced. The resulting matrix is then compressed by a lossless ASCII character encoding-based technique. The proposed compression algorithm can be used in a variety of modes, such as lossless, with or without the downsampling operation. The compressed file is then uploaded to a hypertext preprocessor (PHP)-based website for remote monitoring applications. Evaluation results show that the proposed algorithm provides good compression performance; in particular, the mean opinion score of the reconstructed signal falls under the category “very good” as per the gold standard subjective measure.
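The pipeline above can be sketched roughly as follows; the uniform quantizer and base64 packing below are stand-ins for the paper's specific ASCII character-encoding scheme, and all names and parameter values are illustrative assumptions.

```python
import base64
import numpy as np

def svd_ascii_compress(signals_2d, rank, n_bits=8):
    """Rank-reduce a 2-D biosignal array with SVD, uniformly quantize the
    retained factors, and pack them into a printable-ASCII string."""
    U, s, Vt = np.linalg.svd(signals_2d, full_matrices=False)
    factors = np.concatenate([U[:, :rank].ravel(), s[:rank], Vt[:rank, :].ravel()])
    lo, hi = factors.min(), factors.max()
    # Uniform quantization to n_bits, then byte packing as printable ASCII.
    q = np.round((factors - lo) / (hi - lo) * (2 ** n_bits - 1)).astype(np.uint8)
    payload = base64.b64encode(q.tobytes()).decode("ascii")
    header = (signals_2d.shape, rank, float(lo), float(hi))  # needed to invert the steps
    return header, payload
```

Decompression would simply reverse these steps: decode the ASCII payload, dequantize, reshape the factors, and multiply them back together.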

10 citations


Cites methods from "ECG data compression using truncate..."

  • ...Therefore, the use of SVD is highly beneficial for dimension-reduction or compression of correlated signals without jeopardizing their morphologies [25], [27]....


Proceedings ArticleDOI
03 May 2009
TL;DR: The morphological information of the ECG is incorporated to improve the compression algorithm in terms of compression ratio, PRD, and CC; since there is no need to detect ECG complexes, the algorithm is more robust and accurate.
Abstract: In this paper a new, efficient fractal-based compression algorithm is proposed for electrocardiogram signals. The self-similarities in ECG signals make them suitable for efficient compression with fractal-based methods. In the proposed method, as in the basic fractal compression scheme, each part of the signal is mapped to another part with a reasonable error, and the transform maps are stored instead of the original signal samples. The signal is then rebuilt by applying these transforms iteratively to an arbitrary initial signal. Here, the morphological information of the ECG is incorporated to improve the compression algorithm in terms of compression ratio, PRD, and CC. As a novel point, and contrary to other methods, there is no need to detect ECG complexes, which makes the algorithm more robust and accurate. Fixed-size blocks, rotated transformed blocks, and optimal coefficients for maximum similarity between blocks are employed. The proposed algorithm was tested on a reasonable set of MIT-BIH database signals, and the experiments showed that it outperforms all reported fractal-based methods.
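For orientation, here is a heavily simplified sketch of 1-D fractal block coding: each fixed-size range block is mapped to the best-fitting downsampled domain block under an affine (scale, offset) map, and decoding iterates the stored maps from an arbitrary starting signal. It omits the paper's morphological weighting and rotated blocks; the block sizes and the contractivity clamp are this sketch's assumptions.

```python
import numpy as np

def fractal_encode_1d(x, r=16, step=8):
    """Map each fixed-size range block of the signal to the best-fitting
    downsampled domain block under an affine (scale, offset) transform."""
    n = len(x) - len(x) % r
    x = np.asarray(x[:n], dtype=float)
    ranges = x.reshape(-1, r)
    # Domain blocks: length 2r, downsampled to r by pairwise averaging.
    starts = range(0, n - 2 * r + 1, step)
    domains = np.array([x[s:s + 2 * r].reshape(r, 2).mean(axis=1) for s in starts])
    maps = []
    for rb in ranges:
        best = None
        for k, db in enumerate(domains):
            scale = float(np.clip(np.polyfit(db, rb, 1)[0], -0.9, 0.9))  # contractive scale
            offset = float(rb.mean() - scale * db.mean())                # least-squares offset
            err = np.sum((scale * db + offset - rb) ** 2)
            if best is None or err < best[0]:
                best = (err, k, scale, offset)
        maps.append(best[1:])  # store (domain index, scale, offset)
    return maps, n, r, step

def fractal_decode_1d(maps, n, r, step, n_iter=10):
    """Iterate the stored maps starting from an arbitrary (here zero) signal."""
    y = np.zeros(n)
    for _ in range(n_iter):
        domains = np.array([y[s:s + 2 * r].reshape(r, 2).mean(axis=1)
                            for s in range(0, n - 2 * r + 1, step)])
        y = np.concatenate([scale * domains[k] + offset
                            for k, scale, offset in maps])
    return y
```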

10 citations

Journal ArticleDOI
TL;DR: The main idea is to separate the different components of the signal, select the most relevant components, which are used to quantify inter-limb deviation in singular value space, and demonstrate that the use of the new features outperforms most previous methods.

10 citations

Journal ArticleDOI
TL;DR: This study proposes a mathematical approach to reconstruct gene regulatory networks at a coarse-grained level from high-throughput gene expression data; the approach provides the a posteriori probability that a given gene regulates positively, negatively, or does not regulate each of the network genes.
Abstract: In the postgenome era many efforts have been dedicated to systematically elucidating the complex web of interacting genes and proteins. These efforts include experimental and computational methods. Microarray technology offers an opportunity for monitoring gene expression levels at the genome scale. By recourse to information theory, this study proposes a mathematical approach to reconstruct gene regulatory networks at a coarse-grained level from high-throughput gene expression data. The method provides the a posteriori probability that a given gene regulates positively, negatively, or does not regulate each of the network genes. The approach also allows the introduction of prior knowledge and the quantification of the information gained from the experimental data used in the inference procedure. This information gain can be used to choose the genes that will be perturbed in subsequent experiments in order to refine our knowledge about the architecture of the underlying gene regulatory network. The performance of the proposed approach has been studied by in numero experiments. Our results suggest that the approach is suitable for size-limited problems, such as recovering a small subnetwork of interest by perturbing selected genes.
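As a rough illustration of the kind of output described (a posterior over "activates / represses / no regulation" plus an information gain used to rank candidate perturbations), here is a generic Bayesian sketch. The three-hypothesis likelihoods are assumed to come from some expression model that this snippet does not specify, so it is not the paper's actual inference procedure.

```python
import numpy as np

def regulation_posterior(likelihoods, prior=(1 / 3, 1 / 3, 1 / 3)):
    """Posterior over {activates, represses, no regulation} for one candidate
    regulator-target pair, given per-hypothesis likelihoods of the observed
    expression data (how those likelihoods are computed is left unspecified)."""
    unnorm = np.asarray(prior, dtype=float) * np.asarray(likelihoods, dtype=float)
    return unnorm / unnorm.sum()

def entropy_bits(p):
    """Shannon entropy of a discrete distribution, in bits."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def information_gain(prior, posterior):
    """Entropy reduction from prior to posterior; larger values mean the data
    were more informative about this regulatory link."""
    return entropy_bits(prior) - entropy_bits(posterior)

post = regulation_posterior(likelihoods=[0.7, 0.1, 0.2])
print(post, information_gain([1 / 3, 1 / 3, 1 / 3], post))
```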

10 citations

References
Book
01 Jan 1983

34,729 citations


"ECG data compression using truncate..." refers background in this paper

  • ...Therefore, the SVD of the matrix can be written as the product UΣVᵀ [20], where the columns of U and V are the left and right singular vectors, respectively....


Journal ArticleDOI
TL;DR: The theoretical bases behind the direct ECG data compression schemes are presented and classified into three categories: tolerance-comparison compression, DPCM, and entropy coding methods; a framework for evaluation and comparison of ECG compression schemes is also presented.
Abstract: Electrocardiogram (ECG) compression techniques are compared, and a unified view of these techniques is established. ECG data compression schemes are presented in two major groups: direct data compression and transformation methods. The direct data compression techniques are ECG differential pulse code modulation (DPCM) and entropy coding, AZTEC, Turning-point, CORTES, Fan and SAPA algorithms, peak-picking, and cycle-to-cycle compression methods. The transformation methods include Fourier, Walsh, and Karhunen-Loeve transforms. The theoretical bases behind the direct ECG data compression schemes are presented and classified into three categories: tolerance-comparison compression, DPCM, and entropy coding methods. A framework for evaluation and comparison of ECG compression schemes is presented.
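As one concrete instance of the direct data compression techniques listed above, the Turning Point method (a fixed 2:1 sample reduction) can be sketched as follows; boundary handling is simplified and the implementation details are this sketch's assumptions rather than the survey's.

```python
import numpy as np

def turning_point(x):
    """Classic 2:1 Turning Point reduction: from each incoming pair of samples,
    retain the one that preserves a slope sign change ("turning point") relative
    to the last retained sample."""
    x = np.asarray(x, dtype=float)
    out = [x[0]]
    x0 = x[0]
    for i in range(1, len(x) - 1, 2):
        x1, x2 = x[i], x[i + 1]
        # If the slope changes sign at x1, x1 is the turning point and is kept;
        # otherwise x2 is kept.
        kept = x1 if np.sign(x1 - x0) * np.sign(x2 - x1) < 0 else x2
        out.append(kept)
        x0 = kept
    return np.array(out)
```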

690 citations


"ECG data compression using truncate..." refers methods in this paper

  • ...The compression techniques for an ECG have been extensively discussed [1] and can be classified into the following three major categories....


Journal ArticleDOI
TL;DR: A wavelet electrocardiogram (ECG) data codec based on the set partitioning in hierarchical trees (SPIHT) compression algorithm is proposed and is significantly more efficient in compression and in computation than previously proposed ECG compression schemes.
Abstract: A wavelet electrocardiogram (ECG) data codec based on the set partitioning in hierarchical trees (SPIHT) compression algorithm is proposed in this paper. The SPIHT algorithm (A. Said and W.A. Pearlman, IEEE Trans. Ccts. Syst. II, vol. 6, p. 243-50, 1996) has achieved notable success in still image coding. The authors modified the algorithm for the one-dimensional case and applied it to compression of ECG data. Experiments on selected records from the MIT-BIH arrhythmia database revealed that the proposed codec is significantly more efficient in compression and in computation than previously proposed ECG compression schemes. The coder also attains exact bit rate control and generates a bit stream progressive in quality or rate.
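SPIHT itself is an involved set-partitioning coder, so the sketch below shows only the wavelet-and-threshold front end that such ECG codecs build on, using a hand-rolled Haar transform; it is not the proposed codec, and the level count and keep fraction are assumptions.

```python
import numpy as np

def haar_forward(x, levels=4):
    """Multi-level Haar wavelet transform (signal length must be divisible by 2**levels)."""
    coeffs, a = [], np.asarray(x, dtype=float)
    for _ in range(levels):
        pairs = a.reshape(-1, 2)
        coeffs.append((pairs[:, 0] - pairs[:, 1]) / np.sqrt(2))  # detail coefficients
        a = (pairs[:, 0] + pairs[:, 1]) / np.sqrt(2)             # approximation
    coeffs.append(a)
    return coeffs

def keep_largest(coeffs, keep=0.1):
    """Zero out all but the largest `keep` fraction of coefficients (by magnitude);
    only the surviving values and their positions would then be entropy coded."""
    flat = np.concatenate(coeffs)
    cut = np.quantile(np.abs(flat), 1 - keep)
    return [np.where(np.abs(c) >= cut, c, 0.0) for c in coeffs]
```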

521 citations

Journal ArticleDOI
TL;DR: Pilot data from a blind evaluation of compressed ECGs by cardiologists suggest that the clinically useful information present in original ECG signals is preserved by 8:1 compression, and in most cases 16:1 compressed ECGs are clinically useful.
Abstract: Wavelets and wavelet packets have recently emerged as powerful tools for signal compression. Wavelet and wavelet packet-based compression algorithms based on embedded zerotree wavelet (EZW) coding are developed for electrocardiogram (ECG) signals, and eight different wavelets are evaluated for their ability to compress Holter ECG data. Pilot data from a blind evaluation of compressed ECGs by cardiologists suggest that the clinically useful information present in original ECG signals is preserved by 8:1 compression, and in most cases 16:1 compressed ECGs are clinically useful.

445 citations


"ECG data compression using truncate..." refers methods in this paper

  • ...[23]) provides a better performance than previous wavelet-based methods (Hilton [22] and Djohan et al....


Journal ArticleDOI
TL;DR: A preprocessing program developed for real-time monitoring of the electrocardiogram by digital computer has proved useful for rhythm analysis.
Abstract: A preprocessing program developed for real-time monitoring of the electrocardiogram by digital computer has proved useful for rhythm analysis. The program suppresses low amplitude signals, reduces the data rate by a factor of about 10, and codes the result in a form convenient for analysis.
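A much-simplified sketch of this kind of data-rate reduction (a zero-order hold within an amplitude aperture, in the spirit of the AZTEC technique referenced in the quote below) is given here; the aperture value and the omission of slope segments are this sketch's assumptions.

```python
import numpy as np

def plateau_encode(x, aperture):
    """Replace each run of samples whose peak-to-peak excursion stays within
    `aperture` by one (duration, value) pair. A full AZTEC-style coder would
    additionally emit slope segments for very short plateaus."""
    x = np.asarray(x, dtype=float)
    segments, start, lo, hi = [], 0, x[0], x[0]
    for i in range(1, len(x)):
        new_lo, new_hi = min(lo, x[i]), max(hi, x[i])
        if new_hi - new_lo > aperture:
            segments.append((i - start, (lo + hi) / 2))  # close the current plateau
            start, lo, hi = i, x[i], x[i]
        else:
            lo, hi = new_lo, new_hi
    segments.append((len(x) - start, (lo + hi) / 2))
    return segments
```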

374 citations


"ECG data compression using truncate..." refers methods in this paper

  • ...2) Direct time-domain techniques: including amplitude zone time epoch coding (AZTEC), delta coding, and entropy coding [2]–[4]....
