Journal ArticleDOI

ECG data compression using truncated singular value decomposition

01 Dec 2001-Vol. 5, Iss: 4, pp 290-299
TL;DR: The results showed that the truncated SVD method can provide efficient coding with high compression ratios and demonstrated it to be an effective technique for ECG data storage or signal transmission.
Abstract: The method of truncated singular value decomposition (SVD) is proposed for electrocardiogram (ECG) data compression. The signal decomposition capability of SVD is exploited to extract the significant feature components of the ECG by decomposing the ECG into a set of basic patterns with associated scaling factors. Because of the strong interbeat correlation among ECG cycles, the signal information is mostly concentrated in a small number of singular values and their related singular vectors. Therefore, only these relevant singular triplets need to be retained as the compressed data for retrieving the original signals, and the insignificant components can be truncated to eliminate redundancy in the ECG data. The Massachusetts Institute of Technology-Beth Israel Hospital (MIT-BIH) arrhythmia database was used to evaluate the compression performance and the recoverability of the retrieved ECG signals. An average data rate of 143.2 b/s was achieved with a relatively low reconstruction error. These results showed that the truncated SVD method can provide efficient coding with high compression ratios. The computational efficiency of the SVD method, compared with other techniques, makes it an effective technique for ECG data storage or signal transmission.
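
To make the compression idea concrete, here is a minimal Python sketch of rank-k truncation of a beat matrix. It assumes the ECG has already been segmented into beats and period-normalized to a common length; the array names, the rank k, and the synthetic test signal are illustrative assumptions, not the paper's actual parameters or data.

```python
import numpy as np

def compress_beats(beat_matrix: np.ndarray, k: int):
    """Keep only the k largest singular triplets of an (n_beats x n_samples) matrix."""
    # Strong beat-to-beat correlation concentrates the signal energy in the first
    # few singular values, so a low-rank truncation retains the ECG features.
    U, s, Vt = np.linalg.svd(beat_matrix, full_matrices=False)
    return U[:, :k], s[:k], Vt[:k, :]              # stored as the compressed data

def reconstruct_beats(Uk: np.ndarray, sk: np.ndarray, Vtk: np.ndarray) -> np.ndarray:
    """Rebuild an approximation of the beat matrix from the retained triplets."""
    return (Uk * sk) @ Vtk

# Toy usage: 64 synthetic beats of 200 samples each, rank-8 truncation.
rng = np.random.default_rng(0)
template = np.sin(np.linspace(0.0, np.pi, 200)) ** 5          # crude QRS-like bump
beats = template + 0.02 * rng.standard_normal((64, 200))
Uk, sk, Vtk = compress_beats(beats, k=8)
approx = reconstruct_beats(Uk, sk, Vtk)
prd = 100.0 * np.linalg.norm(beats - approx) / np.linalg.norm(beats)
print(f"PRD = {prd:.2f}%")                                    # percent RMS difference
```
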
Citations
Proceedings ArticleDOI
01 Apr 2009
TL;DR: This paper proposes a compression method aimed at preserving and exploiting the different diagnostic importance of different ECG segments, making smart use of context information, i.e. information about the patient's condition.
Abstract: The use of telemedicine capabilities to manage aged and chronically ill cardiac patients is becoming common practice. The usefulness and diagnostic value of classical ECG monitoring and recording can be enhanced by jointly collecting and analysing data detected by other sensors (e.g. movement detectors), which make it possible to associate specific cardiac events with the patient's environment and activity at the time the events occur. In this scenario, characterized by a continuous growth of the data volume to be stored and transmitted, data compression plays a crucial role. In this paper we propose a compression method aimed at preserving and exploiting the different diagnostic importance of different ECG segments, making smart use of context information, i.e. information about the patient's condition. Specifically, we focus on a 2D compression method that exploits the features of JPEG2000 compression, and we propose a novel paradigm for context-adaptive compression of ECG data.
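
For intuition about the 2D approach referred to here, the sketch below shows only the cut-and-align step that turns a 1D ECG into a beat image; the R-peak locations, the window width, and the function name are assumptions, and the context-adaptive JPEG2000 coding described in the paper is not reproduced.

```python
import numpy as np

def beats_to_image(ecg: np.ndarray, r_peaks: np.ndarray, width: int) -> np.ndarray:
    """Cut-and-align step of 2D ECG compression: one beat per row, R peaks centred."""
    half = width // 2
    rows = [ecg[r - half : r + half]
            for r in r_peaks
            if r - half >= 0 and r + half <= len(ecg)]
    # A 2D image codec (JPEG2000 in the cited work) can then exploit both the
    # sample-to-sample (horizontal) and beat-to-beat (vertical) correlation.
    return np.vstack(rows)
```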

4 citations


Cites background or methods from "ECG data compression using truncate..."

  • ...In [7] it is shown that the period transformation described above will not cause distortion in the normalized segments of the recovered waveforms....


  • ...Two-dimensional (2-D) approaches to heartbeat signal compression have recently been proposed [4] [5] [6] [7] in order to exploit both sample-to-sample and beat-to-beat correlation....


Journal ArticleDOI
TL;DR: The PCA appropriately classified two groups of women by age (young and middle-aged) using principal component coefficients extracted from the PSD of consecutive normal RR intervals.
Abstract: The purpose of this study was to investigate the application of the principal component analysis (PCA) technique to the power spectral density function (PSD) of consecutive normal RR intervals (iRR), aiming to assess its ability to discriminate healthy women according to age group: a young group (20–25 years old) and a middle-aged group (40–60 years old). Thirty healthy, non-smoking female volunteers were investigated (13 young [mean ± SD (median): 22.8 ± 0.9 years (23.0)] and 17 middle-aged [51.7 ± 5.3 years (50.0)]). The iRR sequence was collected for ten minutes in the morning, with the volunteers breathing spontaneously in the supine position, using a heart rate monitor. After selecting the iRR segment (5 min) with the smallest variance, an autoregressive model was used to estimate the PSD. Five principal component coefficients, extracted from the PSD signals, were retained for analysis with a Mahalanobis distance classifier. A threshold established by logistic regression allowed separation of the groups with 100% specificity, 83.2% sensitivity and 93.3% total accuracy. The PCA appropriately classified the two groups of women in relation to age (young and middle-aged) based on PSD analysis of consecutive normal RR intervals.
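
As a rough illustration of the pipeline described above, the sketch below estimates a PSD per RR segment and reduces it to five principal-component coefficients. It substitutes a Welch spectrum for the study's autoregressive estimator and a plain logistic-regression fit for the Mahalanobis-based threshold; all names and data in the commented usage are hypothetical.

```python
import numpy as np
from scipy.signal import welch
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression

def psd_pca_features(rr_segments, fs=4.0, n_components=5):
    """PSD of each evenly resampled RR segment, reduced to a few PCA coefficients."""
    # Assumes every segment has the same length (e.g. 5 min resampled at 4 Hz).
    psds = np.array([welch(seg - np.mean(seg), fs=fs, nperseg=256)[1]
                     for seg in rr_segments])
    pca = PCA(n_components=n_components)
    return pca.fit_transform(psds), pca

# Hypothetical usage with two groups of RR series (young vs. middle-aged):
# X, _ = psd_pca_features(rr_young + rr_middle)
# y = [0] * len(rr_young) + [1] * len(rr_middle)
# clf = LogisticRegression().fit(X, y)   # threshold separating the two groups
```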

4 citations


Cites methods from "ECG data compression using truncate..."

  • ...In Cardiology, the PCA can be used for different purposes, such as: detection and classification of heart beats (Wei et al., 2001), analysis of the heterogeneity of ventricular repolarization (Acar et al., 1999; Pueyo et al., 2009), detection of atrial fibrillation (Faes et al., 2001; Castells et…...


  • ...In Cardiology, the PCA can be used for different purposes, such as: detection and classification of heart beats (Wei et al., 2001), analysis of the heterogeneity of ventricular repolarization (Acar et al....


Journal ArticleDOI
TL;DR: This work presents an electrocardiogram (ECG) compression processor for wireless sensors with configurable lossless and lossy data compression, implemented in an SMIC 40 nm CMOS process with low power consumption and a high compression ratio.
Abstract: This work presents an electrocardiogram (ECG) compression processor for wireless sensors with configurable lossless and lossy data compression. Lifting wavelet transforms of 9/7-M and 5/3 are employed for signal decomposition instead of traditional wavelets. A hybrid encoding scheme improves compression efficiency by encoding the higher scales of the decomposed coefficients with modified embedded zero-tree wavelet (EZW) coding and the lowest scale with Huffman coding. In addition, a transposable register matrix for coefficient buffering during EZW encoding lowers the processing frequency without extra register resources. Implemented in an SMIC 40 nm CMOS process, the processor takes a total gate count of only 10.8K with 92 nW power consumption at 0.5 V, and achieves a compression ratio of 2.71 for lossless compression and 14.9 for lossy compression with a PRD of 0.39%.
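
The lossless path rests on integer lifting; below is a minimal single-level LeGall 5/3 lifting sketch for illustration. It uses periodic boundary extension for brevity (the JPEG 2000 standard uses symmetric extension), assumes an even-length input, and makes no attempt at the 9/7-M transform, the EZW/Huffman hybrid encoder, or the register-matrix hardware described in the paper.

```python
import numpy as np

def lift53_forward(x: np.ndarray):
    """One level of the integer LeGall 5/3 lifting transform (even-length input)."""
    x = x.astype(np.int64)
    even, odd = x[0::2].copy(), x[1::2].copy()
    # Predict: detail d[n] = x[2n+1] - floor((x[2n] + x[2n+2]) / 2)
    odd -= (even + np.roll(even, -1)) // 2
    # Update: approximation s[n] = x[2n] + floor((d[n-1] + d[n] + 2) / 4)
    even += (odd + np.roll(odd, 1) + 2) // 4
    return even, odd

def lift53_inverse(approx: np.ndarray, detail: np.ndarray) -> np.ndarray:
    """Exact inverse: undo the update step, then the predict step."""
    even = approx - ((detail + np.roll(detail, 1) + 2) // 4)
    odd = detail + ((even + np.roll(even, -1)) // 2)
    x = np.empty(even.size * 2, dtype=np.int64)
    x[0::2], x[1::2] = even, odd
    return x

# Quick check of perfect reconstruction on random integer samples.
sig = np.random.default_rng(1).integers(-2048, 2048, size=512)
assert np.array_equal(lift53_inverse(*lift53_forward(sig)), sig)
```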

4 citations

Book ChapterDOI
01 Jan 2017
TL;DR: This chapter proposes an encryption process with a 4-round, five-step structure: random pixel insertion, row separation, substitution of each separated row, row combination, and rotation.
Abstract: Remote health-care monitoring systems communicate biomedical information (e.g. the electrocardiogram (ECG)) over insecure networks. Protecting the integrity, authentication and confidentiality of the medical data is a challenging issue. This chapter proposes an encryption process with a 4-round, five-step structure that includes random pixel insertion, row separation, substitution of each separated row, row combination, and rotation. The accuracy and security of the proposed method for 2D ECG encryption are evaluated on the MIT-BIH arrhythmia database.

4 citations


Cites methods from "ECG data compression using truncate..."

  • ...The “cut and align beats approach and 2D DCT” and “period normalization and truncated SVD algorithm” are available preprocessing techniques to get good compression results in ECG (Wei et al., 2001; Lee et al., 1999)....


Journal Article
TL;DR: It is proven that exact recovery of the original signal is achieved when it is stretched, and the limit to which the signal can be shrunk without producing significant distortion is analysed using an objective measure of distortion.
Abstract: The irregularity of ECG heartbeat durations makes the application of two-dimensional ECG compression algorithms a challenge. In this paper, an efficient alternative solution for ECG period normalization is proposed. Each ECG heartbeat is transformed into the SVD domain formed from the LPC filter impulse response matrix, where only a few components contain most of the energy of the signal. The transformed signal is zero-padded or truncated to match the desired length and then multiplied by a basis of higher or lower dimension, respectively, to form a normalized ECG heartbeat. The reverse steps are applied to recover the heartbeat with its original length. It is proven that exact recovery of the original signal is achieved when it is stretched. On the other hand, the limit to which the signal can be shrunk without producing significant distortion is analysed using an objective measure of distortion. In addition, it is shown that the singular vectors are orthogonal sinusoids, which leads to a reduction in the computational complexity of the algorithm.
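
A simplified sketch of the length-normalization idea follows: an orthonormal basis is taken from the SVD of a lower-triangular impulse-response matrix, the coefficients are zero-padded (stretch) or truncated (shrink), and the beat is rebuilt in the new length. A first-order decaying impulse response stands in for the paper's LPC filter, so this only illustrates the mechanism, including the exact-recovery property for stretching; it is not the paper's algorithm.

```python
import numpy as np
from scipy.linalg import toeplitz

def svd_basis(length: int, a: float = 0.9) -> np.ndarray:
    """Orthonormal basis: right singular vectors of a lower-triangular convolution
    matrix built from the impulse response h[n] = a**n (a stand-in for the LPC
    filter impulse response used in the cited paper)."""
    h = a ** np.arange(length)
    H = np.tril(toeplitz(h))
    _, _, Vt = np.linalg.svd(H)
    return Vt.T                                   # columns ordered by singular value

def renormalize(beat: np.ndarray, target_len: int) -> np.ndarray:
    """Stretch (zero-pad coefficients) or shrink (truncate them) a heartbeat."""
    coeffs = svd_basis(len(beat)).T @ beat        # transform into the SVD domain
    if target_len >= len(beat):
        coeffs = np.pad(coeffs, (0, target_len - len(beat)))
    else:
        coeffs = coeffs[:target_len]              # shrinking may discard some energy
    return svd_basis(target_len) @ coeffs

# Stretching and then shrinking back recovers the original beat exactly.
beat = np.sin(np.linspace(0.0, np.pi, 180)) ** 3
stretched = renormalize(beat, 220)
print(np.allclose(renormalize(stretched, 180), beat))   # True
```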

4 citations


Cites methods from "ECG data compression using truncate..."

  • ...Alternatively, a widely used technique for period normalization was reported by [2] and applied to the ECG by Wei et al. [3]....


References
Book
01 Jan 1983

34,729 citations


"ECG data compression using truncate..." refers background in this paper

  • ...Therefore, the SVD of the matrix A can be performed as A = UΣV^T [20], where the columns of U and V are the left and right singular vectors, respectively....


Journal ArticleDOI
TL;DR: The theoretical bases behind the direct ECG data compression schemes are presented and classified into three categories: tolerance-comparison compression, DPCM, and entropy coding methods and a framework for evaluation and comparison of ECG compression schemes is presented.
Abstract: Electrocardiogram (ECG) compression techniques are compared, and a unified view of these techniques is established. ECG data compression schemes are presented in two major groups: direct data compression and transformation methods. The direct data compression techniques are ECG differential pulse code modulation (DPCM) and entropy coding, AZTEC, Turning-point, CORTES, Fan and SAPA algorithms, peak-picking, and cycle-to-cycle compression methods. The transformation methods include Fourier, Walsh, and Karhunen-Loeve transforms. The theoretical bases behind the direct ECG data compression schemes are presented and classified into three categories: tolerance-comparison compression, DPCM, and entropy coding methods. A framework for evaluation and comparison of ECG compression schemes is presented.
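
To make the "DPCM plus entropy coding" category concrete, here is a small sketch that quantizes first differences and estimates the achievable bits per sample from the zeroth-order entropy of the resulting symbols; the step size and the previous-sample predictor are illustrative assumptions, not any particular scheme from the survey.

```python
import numpy as np

def dpcm_bits_per_sample(signal: np.ndarray, step: float) -> float:
    """First-difference DPCM followed by an entropy-coding bound (bits/sample)."""
    residual = np.diff(signal, prepend=signal[0])     # predictor = previous sample
    symbols = np.round(residual / step).astype(int)   # uniform quantization
    _, counts = np.unique(symbols, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())             # zeroth-order entropy

# Example: a smooth signal has small differences, hence few bits per sample.
ecg_like = np.sin(np.linspace(0.0, 20.0 * np.pi, 5000))
print(dpcm_bits_per_sample(ecg_like, step=0.01))
```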

690 citations


"ECG data compression using truncate..." refers methods in this paper

  • ...The compression techniques for an ECG have been extensively discussed [1] and can be classified into the following three major categories....


Journal ArticleDOI
TL;DR: A wavelet electrocardiogram (ECG) data codec based on the set partitioning in hierarchical trees (SPIHT) compression algorithm is proposed and is significantly more efficient in compression and in computation than previously proposed ECG compression schemes.
Abstract: A wavelet electrocardiogram (ECG) data codec based on the set partitioning in hierarchical trees (SPIHT) compression algorithm is proposed in this paper. The SPIHT algorithm (A. Said and W. A. Pearlman, IEEE Trans. Circuits Syst. Video Technol., vol. 6, pp. 243-250, 1996) has achieved notable success in still image coding. The authors modified the algorithm for the one-dimensional case and applied it to compression of ECG data. Experiments on selected records from the MIT-BIH arrhythmia database revealed that the proposed codec is significantly more efficient in compression and in computation than previously proposed ECG compression schemes. The coder also attains exact bit rate control and generates a bit stream progressive in quality or rate.

521 citations

Journal ArticleDOI
TL;DR: Pilot data from a blind evaluation of compressed ECGs by cardiologists suggest that the clinically useful information present in original ECG signals is preserved by 8:1 compression, and in most cases 16:1 compressed ECGs are clinically useful.
Abstract: Wavelets and wavelet packets have recently emerged as powerful tools for signal compression. Wavelet and wavelet packet-based compression algorithms based on embedded zerotree wavelet (EZW) coding are developed for electrocardiogram (ECG) signals, and eight different wavelets are evaluated for their ability to compress Holter ECG data. Pilot data from a blind evaluation of compressed ECGs by cardiologists suggest that the clinically useful information present in original ECG signals is preserved by 8:1 compression, and in most cases 16:1 compressed ECGs are clinically useful.

445 citations


"ECG data compression using truncate..." refers methods in this paper

  • ...[23]) provides a better performance than previous wavelet-based methods (Hilton [22] and Djohan et al....


Journal ArticleDOI
TL;DR: A preprocessing program developed for real-time monitoring of the electrocardiogram by digital computer has proved useful for rhythm analysis.
Abstract: A preprocessing program developed for real-time monitoring of the electrocardiogram by digital computer has proved useful for rhythm analysis. The program suppresses low amplitude signals, reduces the data rate by a factor of about 10, and codes the result in a form convenient for analysis.
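
A heavily simplified sketch of the plateau (zero-order interpolation) pass behind AZTEC's roughly 10:1 data reduction is shown below; real AZTEC also emits slope segments and uses specific thresholds, so treat this purely as an illustration of the idea, with the threshold eps as an assumed parameter.

```python
def aztec_plateaus(signal, eps):
    """Greedy zero-order pass: merge consecutive samples spanning less than eps."""
    pairs, i = [], 0
    while i < len(signal):
        lo = hi = signal[i]
        j = i + 1
        while j < len(signal) and max(hi, signal[j]) - min(lo, signal[j]) <= eps:
            lo, hi = min(lo, signal[j]), max(hi, signal[j])
            j += 1
        pairs.append((j - i, (lo + hi) / 2.0))   # (duration, plateau amplitude)
        i = j
    return pairs                                  # far fewer entries than samples

def aztec_expand(pairs):
    """Reconstruct a stepwise approximation from (duration, amplitude) pairs."""
    return [amp for dur, amp in pairs for _ in range(dur)]
```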

374 citations


"ECG data compression using truncate..." refers methods in this paper

  • ...2) Direct time-domain techniques: including amplitude zone time epoch coding (AZTEC), delta coding, and entropy coding [2]–[4]....
