Journal ArticleDOI

ECG data compression using truncated singular value decomposition

01 Dec 2001 - Vol. 5, Iss. 4, pp. 290-299
TL;DR: The results showed that the truncated SVD method can provide efficient coding with high compression ratios, demonstrating it as an effective technique for ECG data storage and signal transmission.
Abstract: The method of truncated singular value decomposition (SVD) is proposed for electrocardiogram (ECG) data compression. The signal decomposition capability of SVD is exploited to extract the significant feature components of the ECG by decomposing the ECG into a set of basic patterns with associated scaling factors. Because of the strong interbeat correlation among ECG cycles, the signal information is mostly concentrated within a small number of singular values and their related singular vectors. Therefore, only the relevant parts of the singular triplets need to be retained as the compressed data for retrieving the original signals; the insignificant components can be truncated to eliminate redundancy. The Massachusetts Institute of Technology-Beth Israel Hospital (MIT-BIH) arrhythmia database was used to evaluate the compression performance and the recoverability of the retrieved ECG signals. The method achieved an average data rate of 143.2 b/s with a relatively low reconstruction error. These results showed that the truncated SVD method can provide efficient coding with high compression ratios. Its computational efficiency, compared with other techniques, demonstrates that it is an effective technique for ECG data storage and signal transmission.
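The core mechanics are compact enough to sketch. Below is a minimal illustration (Python with NumPy) of the beat-matrix idea, assuming beats have already been detected and length-normalized; the paper's period normalization, quantization, and encoding stages are not reproduced, and all parameter values are illustrative.

```python
# Hedged sketch of truncated-SVD compression of a beat matrix (not the
# authors' exact pipeline). Rows are pre-segmented, length-normalized beats,
# so strong interbeat correlation concentrates energy in a few triplets.
import numpy as np

def compress_svd(beats: np.ndarray, k: int):
    """beats: (n_beats, beat_len); keep the k largest singular triplets."""
    U, s, Vt = np.linalg.svd(beats, full_matrices=False)
    return U[:, :k], s[:k], Vt[:k, :]   # the retained compressed data

def decompress_svd(Uk, sk, Vtk):
    return (Uk * sk) @ Vtk              # rank-k reconstruction

# Synthetic, highly correlated "beats" (placeholder signal, not MIT-BIH data)
rng = np.random.default_rng(0)
template = np.sin(np.linspace(0, 2 * np.pi, 200)) ** 3
beats = np.vstack([template + 0.01 * rng.standard_normal(200) for _ in range(50)])
Uk, sk, Vtk = compress_svd(beats, k=3)
err = np.linalg.norm(beats - decompress_svd(Uk, sk, Vtk)) / np.linalg.norm(beats)
print(f"relative reconstruction error: {err:.4f}")  # small: energy sits in few triplets
```

Storing k(n_beats + beat_len + 1) numbers instead of n_beats × beat_len samples is where the compression ratio comes from.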
Citations
Proceedings ArticleDOI
TL;DR: This work uses a singular value decomposition (SVD) approach to localize structural damage under large, spatially and temporally varying EOCs and shows that the SVD-based approach successfully localizes damage where current temperature-compensated baseline subtraction methods fail.
Abstract: Guided waves can propagate long distances and are sensitive to subtle structural damage. Guided-wave based damage localization often requires extracting the scatter signal(s) produced by damage, typically obtained by subtracting an intact baseline record from a record under test. However, in practical applications, environmental and operational conditions (EOC) dramatically affect guided wave signals. In this case, the baseline subtraction process can no longer perfectly remove the baseline, thereby defeating localization algorithms. In previous work, we showed that singular value decomposition (SVD) can be used to detect the presence of damage under large EOC variations, because it can differentiate the trends of damage from other EOC variations. This capability implies that SVD can also robustly extract a scatter signal, originating from damage in the structure, that is not affected by temperature variation. This process allows us to extract a scatter signal without the challenges associated with traditional temperature compensation and baseline subtraction routines. In this work, we use this approach to localize structural damage under large, spatially and temporally varying EOCs. We collect pitch-catch records from randomly placed PZT transducers on an aluminum plate while it undergoes temperature variations. Damage is introduced to the plate during the monitoring period. We then use our SVD method to extract the scatter signal from the records and use it to localize damage with the delay-and-sum method. For comparison, we also apply several temperature compensation methods to the records and then perform baseline subtraction. We show that our SVD-based approach successfully localizes damage while current temperature-compensated baseline subtraction methods fail.
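A hedged sketch of the rank-truncation step this abstract describes (the authors' actual decomposition, windowing, and delay-and-sum imaging details are not reproduced; `r` is an assumed, illustrative rank):

```python
# Sketch: records collected over time are stacked as rows; slowly varying
# EOC/baseline energy concentrates in the leading singular components, so
# removing the rank-r part leaves residuals dominated by damage scatter.
import numpy as np

def extract_scatter(records: np.ndarray, r: int) -> np.ndarray:
    """records: (n_records, n_samples); residual after removing rank-r part."""
    U, s, Vt = np.linalg.svd(records, full_matrices=False)
    baseline = (U[:, :r] * s[:r]) @ Vt[:r, :]
    return records - baseline  # candidate scatter signals, one per record
```

The residual rows corresponding to post-damage records would then feed a delay-and-sum imager in place of conventional baseline-subtracted signals.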

8 citations


Cites background from "ECG data compression using truncate..."


Journal ArticleDOI
01 Dec 2021
TL;DR: This manuscript builds a purpose-made ECG dataset containing signals recorded under both exercise and rest conditions and evaluates ECG human identification (ECGID) performance on it, finding that current methods which support identification of individuals at rest cannot deliver equally satisfying performance under exercise, exposing a deficiency of existing ECG identification algorithms.
Abstract: As a core technology in the field of information security, human biometric recognition has become a focus of researchers' attention during the past few years; it is based on a myriad of biometric features including fingerprint, face, retina, etc. Because it is very difficult to forge, the electrocardiogram (ECG) has great potential for identification, yet previous experiments have considered only the rest condition. In this manuscript, we overcome the oversimplification of previous research, build our own ECG dataset containing signals recorded under both exercise and rest conditions, and evaluate the resulting ECG human identification (ECGID) performance, especially the influence of exercise on the experiment. By applying several established learning algorithms to our dataset, we find that current methods which can well support the identification of individuals at rest cannot present equally satisfying performance under exercise, exposing a deficiency of existing ECG identification algorithms.
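The evaluation protocol the abstract describes can be illustrated with a short, hypothetical sketch (the scikit-learn classifier and feature arrays are placeholders; the authors' algorithms and dataset are not reproduced): enroll on rest beats, then test on both held-out rest beats and exercise beats.

```python
# Illustrative ECGID evaluation: rest->rest vs. rest->exercise accuracy.
# Per-beat feature extraction is assumed to have been done upstream.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

def evaluate_ecgid(rest_X, rest_y, ex_X, ex_y):
    Xtr, Xte, ytr, yte = train_test_split(
        rest_X, rest_y, test_size=0.3, stratify=rest_y, random_state=0)
    clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(Xtr, ytr)
    print("rest -> rest accuracy:    ", accuracy_score(yte, clf.predict(Xte)))
    print("rest -> exercise accuracy:", accuracy_score(ex_y, clf.predict(ex_X)))
```

The gap between the two printed accuracies is precisely the deficiency the paper reports.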

8 citations

Journal ArticleDOI
TL;DR: A new CLE system based on the novel application of integrated singular value decomposition (SVD), cepstrum analysis, and a sparse non-negative least-squares coding method was proposed, which extracted both algebraic and harmonic information.
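Only the TL;DR is available here, but two of the named ingredients are standard enough to sketch (the dictionary and signal below are placeholders; the paper's integrated pipeline is not reproduced):

```python
# Real cepstrum (harmonic information) plus a sparse non-negative
# least-squares coding step, one standard reading of the TL;DR's terms.
import numpy as np
from scipy.optimize import nnls

def real_cepstrum(x: np.ndarray) -> np.ndarray:
    # log-magnitude spectrum back to quefrency: harmonics become a peak
    return np.fft.ifft(np.log(np.abs(np.fft.fft(x)) + 1e-12)).real

f = real_cepstrum(np.sin(np.linspace(0, 40 * np.pi, 64)))       # placeholder signal
D = np.abs(np.random.default_rng(0).standard_normal((64, 16)))  # placeholder dictionary
w, resid = nnls(D, f)   # w >= 0 minimizing ||D w - f||_2; tends to be sparse
```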

8 citations

Journal ArticleDOI
01 Oct 2022 - IRBM
TL;DR: In this article, a novel electrocardiogram data compression technique utilizing modified run-length encoding of wavelet coefficients is presented; the technique can be applied to the compression of ECG records from Holter monitoring.
Abstract: In cardiac patient care, compression of long-term ECG data is essential to minimize data storage requirements and transmission cost. Hence, this paper presents a novel electrocardiogram data compression technique which utilizes modified run-length encoding of wavelet coefficients. First, a wavelet transform is applied to the ECG data, decomposing it and packing most of the energy into a small number of transform coefficients. The wavelet coefficients are then quantized using dead-zone quantization, which discards small-valued coefficients lying in the dead-zone interval while mapping the remaining coefficients to the formulated quantized output interval. Among all the quantized coefficients, an average value is assigned to those coefficients for which the energy packing efficiency is less than 99.99%. The obtained coefficients are encoded using modified run-length coding, which offers a higher compression ratio than conventional run-length coding without any loss of information. Compression performance of the proposed technique is evaluated using different ECG records taken from the MIT-BIH arrhythmia database. The average compression ratio, percent root mean square difference, normalized percent mean square difference, and signal-to-noise ratio over 48 ECG records are 17.18, 3.92, 6.36, and 28.27 dB, respectively. The compression results obtained by the proposed technique are better than those of recently introduced techniques. The proposed technique can be utilized for compression of ECG records from Holter monitoring.
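A compact sketch of the described pipeline (the wavelet choice, decomposition level, and dead-zone width are assumed values, and the paper's energy-packing-efficiency averaging step is omitted):

```python
# Wavelet transform -> dead-zone quantization -> run-length coding of the
# (frequent) zero runs, the lossless step that the paper modifies.
import numpy as np
import pywt

def compress(ecg: np.ndarray, wavelet="bior4.4", level=5, dz=0.05):
    coeffs = pywt.wavedec(ecg, wavelet, level=level)      # energy packing
    flat, slices = pywt.coeffs_to_array(coeffs)
    q = np.where(np.abs(flat) < dz, 0.0, np.round(flat / dz) * dz)  # dead zone
    runs, i = [], 0
    while i < len(q):                                     # (value, run) pairs
        j = i + 1
        while j < len(q) and q[j] == q[i] == 0.0:
            j += 1
        runs.append((q[i], j - i))
        i = j
    return runs, slices
```

Because dead-zone quantization zeroes most detail coefficients, zero runs dominate the coefficient stream, which is what makes run-length coding pay off.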

7 citations

Proceedings ArticleDOI
03 Aug 2010
TL;DR: The resulting statistical system identification is based on estimating the multivariate probability density function of the system outputs, whose convergence towards the density computed by kernel estimation has also been proved by verifying the asymptotic vanishing of the Kullback-Leibler divergence.
Abstract: In this paper an effective unsupervised statistical identification technique for nonstationary nonlinear systems is presented. This technique extracts from the system outputs the multivariate relationships of the system's natural modes by means of the separation property of the Karhunen-Loeve transform (KLT). It then applies a Self-Organizing Map (SOM) to the KLT output vectors to give an optimal representation of the data. Finally, it exploits an optimized Expectation-Maximization (EM) algorithm to find the optimal parameters of a Gaussian mixture model. The resulting statistical system identification is thus based on the estimation of the multivariate probability density function (PDF) of the system outputs, whose convergence towards that computed by kernel estimation has also been proved by verifying the asymptotic vanishing of the Kullback-Leibler divergence. A large number of simulations on ECG signals demonstrated the validity and excellent performance of this technique, along with its applicability to noninvasive diagnosis of a large class of medical pathologies originating from unknown physiological factors that are impractical to measure.
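A compressed sketch of the statistical core (the SOM stage and the authors' EM optimizations are omitted; component counts are illustrative), noting that for centered data the KLT coincides with PCA:

```python
# KLT/PCA separates the natural modes; a Gaussian mixture fitted by EM then
# models the multivariate PDF of the transformed system outputs.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.mixture import GaussianMixture

def fit_output_pdf(outputs: np.ndarray, n_modes=8, n_components=4):
    z = PCA(n_components=n_modes).fit_transform(outputs)  # KLT on centered data
    gmm = GaussianMixture(n_components=n_components, random_state=0).fit(z)
    return gmm  # gmm.score_samples(z) gives the log-PDF used for diagnosis
```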

7 citations


Cites methods from "ECG data compression using truncate..."

  • ...The mean beat period (MBE) is chosen as the reference length [6]....

    [...]

References
Book
01 Jan 1983

34,729 citations


"ECG data compression using truncate..." refers background in this paper

  • ...Therefore, the SVD of the matrix A can be performed as A = UΣVᵀ [20], where the columns of U and V are the left and right singular vectors, respectively....

    [...]

Journal ArticleDOI
TL;DR: The theoretical bases behind the direct ECG data compression schemes are presented and classified into three categories: tolerance-comparison compression, DPCM, and entropy coding methods; a framework for evaluation and comparison of ECG compression schemes is also presented.
Abstract: Electrocardiogram (ECG) compression techniques are compared, and a unified view of these techniques is established. ECG data compression schemes are presented in two major groups: direct data compression and transformation methods. The direct data compression techniques are ECG differential pulse code modulation (DPCM) and entropy coding, AZTEC, Turning-point, CORTES, Fan and SAPA algorithms, peak-picking, and cycle-to-cycle compression methods. The transformation methods include Fourier, Walsh, and Karhunen-Loeve transforms. The theoretical bases behind the direct ECG data compression schemes are presented and classified into three categories: tolerance-comparison compression, DPCM, and entropy coding methods. A framework for evaluation and comparison of ECG compression schemes is presented.
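The survey's DPCM-plus-entropy-coding category rests on one observation that a toy computation can make concrete (the signal below is synthetic, not an ECG record): differencing correlated samples shrinks the empirical entropy, so the difference stream codes cheaper.

```python
# First differences of a correlated signal need fewer bits per sample.
import numpy as np

def empirical_entropy(x: np.ndarray) -> float:
    _, counts = np.unique(x, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())      # bits per sample

sig = np.round(100 * np.sin(np.linspace(0, 20 * np.pi, 5000))).astype(int)
print(empirical_entropy(sig))                  # raw samples
print(empirical_entropy(np.diff(sig)))         # DPCM residual: lower entropy
```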

690 citations


"ECG data compression using truncate..." refers methods in this paper

  • ...The compression techniques for an ECG have been extensively discussed [ 1 ] and can be classified into the following three major categories....

    [...]

Journal ArticleDOI
TL;DR: A wavelet electrocardiogram (ECG) data codec based on the set partitioning in hierarchical trees (SPIHT) compression algorithm is proposed and is significantly more efficient in compression and in computation than previously proposed ECG compression schemes.
Abstract: A wavelet electrocardiogram (ECG) data codec based on the set partitioning in hierarchical trees (SPIHT) compression algorithm is proposed in this paper. The SPIHT algorithm (A. Said and W.A. Pearlman, IEEE Trans. Circuits Syst. Video Technol., vol. 6, pp. 243-250, 1996) has achieved notable success in still image coding. The authors modified the algorithm for the one-dimensional case and applied it to the compression of ECG data. Experiments on selected records from the MIT-BIH arrhythmia database revealed that the proposed codec is significantly more efficient in compression and in computation than previously proposed ECG compression schemes. The coder also attains exact bit-rate control and generates a bit stream progressive in quality or rate.

521 citations

Journal ArticleDOI
TL;DR: Pilot data from a blind evaluation of compressed ECGs by cardiologists suggest that the clinically useful information present in original ECG signals is preserved by 8:1 compression, and in most cases 16:1 compressed ECGs are clinically useful.
Abstract: Wavelets and wavelet packets have recently emerged as powerful tools for signal compression. Wavelet and wavelet packet-based compression algorithms based on embedded zerotree wavelet (EZW) coding are developed for electrocardiogram (ECG) signals, and eight different wavelets are evaluated for their ability to compress Holter ECG data. Pilot data from a blind evaluation of compressed ECGs by cardiologists suggest that the clinically useful information present in original ECG signals is preserved by 8:1 compression, and in most cases 16:1 compressed ECGs are clinically useful.

445 citations


"ECG data compression using truncate..." refers methods in this paper

  • ...[23]) provides a better performance than previous wavelet-based methods (Hilton [22] and Djohan et al....

    [...]

Journal ArticleDOI
TL;DR: A preprocessing program developed for real-time monitoring of the electrocardiogram by digital computer has proved useful for rhythm analysis.
Abstract: A preprocessing program developed for real-time monitoring of the electrocardiogram by digital computer has proved useful for rhythm analysis. The program suppresses low amplitude signals, reduces the data rate by a factor of about 10, and codes the result in a form convenient for analysis.
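The data-rate reduction the abstract mentions can be illustrated with a much-simplified, AZTEC-style plateau coder (real AZTEC also emits slope segments; the tolerance is an assumed value):

```python
# Replace runs of samples confined to a tolerance band with (value, length)
# plateaus; flat ECG segments collapse, cutting the data rate substantially.
import numpy as np

def plateau_encode(x: np.ndarray, tol: float):
    out, i = [], 0
    while i < len(x):
        j = i + 1
        while j < len(x) and np.ptp(x[i:j + 1]) <= tol:
            j += 1
        out.append((float(np.mean(x[i:j])), j - i))  # one plateau per run
        i = j
    return out
```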

374 citations


"ECG data compression using truncate..." refers methods in this paper

  • ...2) Direct time-domain techniques: including amplitude zone time epoch coding (AZTEC), delta coding, and entropy coding [2]–[4]....

    [...]