Journal ArticleDOI

A novel compression algorithm for electrocardiogram signals based on the linear prediction of the wavelet coefficients

01 Oct 2003 - Digital Signal Processing (Academic Press) - Vol. 13, Iss. 4, pp. 604-622



Citations
Journal ArticleDOI


TL;DR: Because the proposed algorithm can compress and transmit data in real time, it can serve as an optimal biosignal data transmission method for limited-bandwidth communication between e-health devices.
Abstract: This paper introduces a real-time data compression and transmission algorithm between e-health terminals for a periodic ECG signal. The proposed algorithm consists of five compression procedures and four reconstruction procedures. In order to evaluate the performance of the proposed algorithm, it was applied to all 48 recordings of the MIT-BIH arrhythmia database, and the compression ratio (CR), percent root mean square difference (PRD), normalized percent root mean square difference (PRDN), RMS, SNR, and quality score (QS) values were obtained. The results showed that the CR was 27.9:1 and the PRD was 2.93 on average across all 48 recordings with a 15% window size. In addition, the performance of the algorithm was compared to those of similar algorithms introduced recently by others. The proposed algorithm showed clearly superior performance on all 48 recordings at compression ratios lower than 15:1, whereas it showed similar or slightly inferior PRD performance at compression ratios higher than 20:1. Given that similarity with the original data becomes meaningless when the PRD is higher than 2, the proposed algorithm performs significantly better than the other algorithms. Moreover, because the algorithm can compress and transmit data in real time, it can serve as an optimal biosignal data transmission method for limited-bandwidth communication between e-health devices.
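The distortion and efficiency figures quoted above (CR, PRD, PRDN, RMS, SNR, QS) have standard definitions in the ECG-compression literature. The sketch below computes them as commonly defined; the function names are illustrative and not taken from the paper.

```python
import numpy as np

def compression_ratio(original_bits: int, compressed_bits: int) -> float:
    """CR: size of the original bitstream over the compressed one."""
    return original_bits / compressed_bits

def prd(x: np.ndarray, x_rec: np.ndarray) -> float:
    """Percent root-mean-square difference (non-normalized form)."""
    return 100.0 * np.sqrt(np.sum((x - x_rec) ** 2) / np.sum(x ** 2))

def prdn(x: np.ndarray, x_rec: np.ndarray) -> float:
    """PRD with the signal mean removed, so it is baseline-independent."""
    return 100.0 * np.sqrt(np.sum((x - x_rec) ** 2) / np.sum((x - x.mean()) ** 2))

def rms_error(x: np.ndarray, x_rec: np.ndarray) -> float:
    """Root-mean-square reconstruction error."""
    return np.sqrt(np.mean((x - x_rec) ** 2))

def snr_db(x: np.ndarray, x_rec: np.ndarray) -> float:
    """Signal-to-noise ratio of the reconstruction, in dB."""
    return 10.0 * np.log10(np.sum(x ** 2) / np.sum((x - x_rec) ** 2))

def quality_score(cr: float, prd_value: float) -> float:
    """QS = CR / PRD: rewards high compression and low distortion."""
    return cr / prd_value
```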

151 citations


Cites methods from "A novel compression algorithm for electrocardiogram signals based on the linear prediction of the wavelet coefficients"

Journal ArticleDOI


TL;DR: A comprehensive review of up-to-date requirements in hardware, communication, and computing for next-generation u-Health systems is presented, along with the new technological trends and design challenges that designers of such systems must cope with.
Abstract: With an ageing population and the increase of chronic diseases, society is becoming more health conscious and patients are becoming "health consumers" looking for better health management. People's perception is shifting towards patient-centered, rather than the classical hospital-centered, health services, which has been propelling the evolution of telemedicine research from the classic e-Health to m-Health and now to ubiquitous healthcare (u-Health). It is expected that mobile and ubiquitous telemedicine, integrated with Wireless Body Area Networks (WBANs), has great potential in fostering the provision of next-generation u-Health. Despite recent efforts and achievements, currently proposed u-Health solutions still suffer from shortcomings hampering their adoption today. This paper presents a comprehensive review of up-to-date requirements in hardware, communication, and computing for next-generation u-Health systems. It compares new technological and technical trends and discusses how they address expected u-Health requirements. A thorough survey of various recent system implementations worldwide is presented in an attempt to identify shortcomings in state-of-the-art solutions. In particular, challenges in WBANs and ubiquitous computing are emphasized. The purpose of this survey is not only to help beginners with a holistic approach toward understanding u-Health systems but also to present to researchers the new technological trends and design challenges they have to cope with while designing such systems.

143 citations

Journal ArticleDOI


TL;DR: This paper presents a new algorithm for electrocardiogram (ECG) signal compression based on local extreme extraction, adaptive hysteretic filtering and Lempel-Ziv-Welch (LZW) coding, together with a new compression measure, the "quality score," which takes into account both the reconstruction errors and the compression ratio.
Abstract: This paper presents a new algorithm for electrocardiogram (ECG) signal compression based on local extreme extraction, adaptive hysteretic filtering and Lempel-Ziv-Welch (LZW) coding. The algorithm has been verified using eight of the most frequent normal and pathological types of cardiac beats and a multi-layer perceptron (MLP) neural network trained with original cardiac patterns and tested with reconstructed ones. Aspects regarding the possibility of using principal component analysis (PCA) for cardiac pattern classification have been investigated as well. A new compression measure called "quality score," which takes into account both the reconstruction errors and the compression ratio, is proposed.
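As a rough illustration of the extrema-plus-hysteresis idea described above (one reading of the general technique, not the authors' exact algorithm), the sketch below keeps only turning points whose excursion from the last retained sample exceeds a hysteresis band, then reconstructs by linear interpolation; the LZW coding stage is omitted.

```python
import numpy as np

def extrema_with_hysteresis(x: np.ndarray, h: float) -> list[int]:
    """Indices of local extrema whose amplitude differs from the last
    retained sample by more than the hysteresis band h."""
    kept = [0]                       # always keep the first sample
    for i in range(1, len(x) - 1):
        is_max = x[i - 1] < x[i] >= x[i + 1]
        is_min = x[i - 1] > x[i] <= x[i + 1]
        if (is_max or is_min) and abs(x[i] - x[kept[-1]]) > h:
            kept.append(i)
    kept.append(len(x) - 1)          # and the last one, for reconstruction
    return kept

# Reconstruction: linear interpolation between the retained samples.
x = np.sin(np.linspace(0, 8 * np.pi, 1000)) + 0.01 * np.random.randn(1000)
idx = extrema_with_hysteresis(x, h=0.05)
x_rec = np.interp(np.arange(len(x)), idx, x[idx])
```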

129 citations

Journal ArticleDOI


TL;DR: A prospective review of wavelet-based ECG compression methods and their performances based upon findings obtained from various experiments conducted using both clean and noisy ECG signals is presented.
Abstract: Cardiovascular disease (CVD) is one of the most widespread health problems with unpredictable and life-threatening consequences. The electrocardiogram (ECG) is commonly recorded for computer-aided CVD diagnosis, human emotion recognition and person authentication systems. For effective detection and diagnosis of cardiac diseases, the ECG signals are continuously recorded, processed, stored, and transmitted via wired/wireless communication networks. But a long-term continuous cardiac monitoring system generates a huge volume of ECG data daily. Therefore, a reliable and efficient ECG signal compression method is in high demand to meet the real-time constraints, including the limited channel capacity, memory and battery power of remote cardiac monitoring, ECG record management and telecardiology systems. In such scenarios, the main objective of ECG signal compression is to reduce the data rate for effective transmission and/or storage without significantly distorting the clinical features of the different kinds of PQRST morphologies contained in the recorded ECG signal. Numerous ECG compression methods have been proposed by exploiting the intra-beat, inter-beat and intra-channel correlations of ECG signals. This paper presents a prospective review of wavelet-based ECG compression methods and their performances, based upon findings obtained from various experiments conducted using both clean and noisy ECG signals. The paper briefly describes the different kinds of compression techniques used in one-dimensional wavelet-based ECG compression methods. Then, the performance of each of the wavelet-based compression methods is tested and validated using the standard MIT-BIH arrhythmia database and performance metrics. The pros and cons of the different wavelet-based compression methods are demonstrated based upon the experimental results. Finally, various practical issues involved in the validation procedures, reconstructed signal quality assessment, and performance comparisons are highlighted, with a view to future research based on recent powerful digital signal processing techniques and computing platforms.
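The common core of the wavelet-based methods this review surveys is: transform, discard small coefficients, encode, reconstruct. Below is a minimal sketch of that pipeline, assuming the PyWavelets package (pywt); the wavelet, level and retention fraction are arbitrary illustrative choices, and the entropy-coding stage is left out.

```python
import numpy as np
import pywt

def wavelet_compress(x, wavelet="db4", level=5, keep=0.10):
    """Keep only the largest `keep` fraction of wavelet coefficients,
    zero the rest, and reconstruct the signal."""
    coeffs = pywt.wavedec(x, wavelet, level=level)
    flat, slices = pywt.coeffs_to_array(coeffs)
    thresh = np.quantile(np.abs(flat), 1.0 - keep)   # magnitude cutoff
    flat[np.abs(flat) < thresh] = 0.0                # discard small coeffs
    coeffs_t = pywt.array_to_coeffs(flat, slices, output_format="wavedec")
    return pywt.waverec(coeffs_t, wavelet)[: len(x)]
```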

88 citations

Journal ArticleDOI


TL;DR: In order to increase the performance of heart sound classification, an incremental neural network is proposed in this study, and it is observed that the ISOM successfully classifies heart sounds even in a noisy environment.
Abstract: Determination of heart condition by heart auscultation is a difficult task and requires special training of medical staff. Computerized techniques offer objective and more accurate results in a fast and easy manner. Hence, this study aims to perform computer-aided heart sound analysis to support medical doctors in decision making. A novel method is presented for the classification of heart sounds (HSs). The discrete wavelet transform is applied to a windowed single cycle of the HS. The wavelet transform is used both for the segmentation of the S1-S2 sounds and for the determination of the features. Based on the third-, fourth- and fifth-decomposition-level detail coefficients, the timings of the S1-S2 sounds are determined by an adaptive peak detector. For the feature extraction, the powers of the detail coefficients in all five sub-bands are utilized. In the classification stage, Kohonen's SOM network and an incremental self-organizing map (ISOM) are examined comparatively. In order to increase the performance of heart sound classification, an incremental neural network is proposed. It is observed that the ISOM successfully classifies the HSs even in a noisy environment.
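A minimal sketch of the feature-extraction step described above, assuming PyWavelets (pywt): a five-level DWT of one windowed heart-sound cycle, followed by the power of the detail coefficients in each sub-band. The wavelet choice is illustrative, and the peak detector and SOM/ISOM classifier are not shown.

```python
import numpy as np
import pywt

def subband_powers(cycle: np.ndarray, wavelet: str = "db6") -> np.ndarray:
    """Mean power of the detail coefficients in each of the five
    decomposition levels of one heart-sound cycle."""
    coeffs = pywt.wavedec(cycle, wavelet, level=5)
    details = coeffs[1:]                   # [cD5, cD4, cD3, cD2, cD1]
    return np.array([np.mean(d ** 2) for d in details])
```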

79 citations


References
Journal ArticleDOI


John Makhoul
01 Apr 1975
TL;DR: This paper gives an exposition of linear prediction in the analysis of discrete signals, where the signal is modeled as a linear combination of its past values and of present and past values of a hypothetical input to a system whose output is the given signal.
Abstract: This paper gives an exposition of linear prediction in the analysis of discrete signals. The signal is modeled as a linear combination of its past values and present and past values of a hypothetical input to a system whose output is the given signal. In the frequency domain, this is equivalent to modeling the signal spectrum by a pole-zero spectrum. The major part of the paper is devoted to all-pole models. The model parameters are obtained by a least squares analysis in the time domain. Two methods result, depending on whether the signal is assumed to be stationary or nonstationary. The same results are then derived in the frequency domain. The resulting spectral matching formulation allows for the modeling of selected portions of a spectrum, for arbitrary spectral shaping in the frequency domain, and for the modeling of continuous as well as discrete spectra. This also leads to a discussion of the advantages and disadvantages of the least squares error criterion. A spectral interpretation is given to the normalized minimum prediction error. Applications of the normalized error are given, including the determination of an "optimal" number of poles. The use of linear prediction in data compression is reviewed. For purposes of transmission, particular attention is given to the quantization and encoding of the reflection (or partial correlation) coefficients. Finally, a brief introduction to pole-zero modeling is given.
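The stationary (autocorrelation) method described in this paper leads to Toeplitz normal equations, classically solved by the Levinson-Durbin recursion, which also yields the reflection (PARCOR) coefficients the paper recommends quantizing for transmission. A compact sketch follows; the implementation details are mine rather than the paper's.

```python
import numpy as np

def levinson_durbin(x: np.ndarray, order: int):
    """Autocorrelation-method LPC. Returns (a, k, E) where a holds the
    error-filter coefficients A(z) = 1 + a[1]z^-1 + ... + a[p]z^-p,
    k holds the reflection (PARCOR) coefficients, and E is the final
    prediction error energy."""
    # Autocorrelation r[0..p] of the (windowed) frame x.
    r = np.array([np.dot(x[: len(x) - i], x[i:]) for i in range(order + 1)])
    a = np.zeros(order + 1)
    a[0] = 1.0
    E = r[0]
    k = np.zeros(order)
    for m in range(1, order + 1):
        # Partial correlation of the order-(m-1) forward/backward errors.
        acc = r[m] + np.dot(a[1:m], r[m - 1:0:-1])
        km = -acc / E
        a_prev = a.copy()
        for i in range(1, m):
            a[i] = a_prev[i] + km * a_prev[m - i]
        a[m] = km
        k[m - 1] = km
        E *= 1.0 - km * km          # error energy is non-increasing
    return a, k, E

# Example: fit a 10th-order all-pole predictor to one frame.
frame = np.random.randn(400)
a, k, E = levinson_durbin(frame, order=10)
```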

4,096 citations

Book


05 Sep 1978
TL;DR: A foundational textbook on digital speech processing, covering digital models of the speech signal, time-domain analysis, digital waveform representations, short-time Fourier analysis, homomorphic processing, and linear predictive coding.
Abstract: 1. Introduction. 2. Fundamentals of Digital Speech Processing. 3. Digital Models for the Speech Signal. 4. Time-Domain Models for Speech Processing. 5. Digital Representation of the Speech Waveform. 6. Short-Time Fourier Analysis. 7. Homomorphic Speech Processing. 8. Linear Predictive Coding of Speech. 9. Digital Speech Processing for Man-Machine Communication by Voice.

3,101 citations

Journal ArticleDOI


TL;DR: The perfect reconstruction condition is posed as a Bezout identity, and it is shown how it is possible to find all higher-degree complementary filters based on an analogy with the theory of Diophantine equations.
Abstract: The wavelet transform is compared with the more classical short-time Fourier transform approach to signal analysis. Then the relations between wavelets, filter banks, and multiresolution signal processing are explored. A brief review is given of perfect reconstruction filter banks, which can be used both for computing the discrete wavelet transform, and for deriving continuous wavelet bases, provided that the filters meet a constraint known as regularity. Given a low-pass filter, necessary and sufficient conditions for the existence of a complementary high-pass filter that will permit perfect reconstruction are derived. The perfect reconstruction condition is posed as a Bezout identity, and it is shown how it is possible to find all higher-degree complementary filters based on an analogy with the theory of Diophantine equations. An alternative approach based on the theory of continued fractions is also given. These results are used to design highly regular filter banks, which generate biorthogonal continuous wavelet bases with symmetries.
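The two-channel perfect-reconstruction condition discussed here can be checked numerically: with analysis filters H0, H1 and synthesis filters G0, G1, one needs G0(z)H0(z) + G1(z)H1(z) = 2z^-l (a pure delay) plus alias cancellation. The sketch below verifies both for the Haar pair, the simplest case; the filter choice is mine, just to make the identity concrete.

```python
import numpy as np

s = 1.0 / np.sqrt(2.0)
h0 = np.array([s,  s])     # analysis low-pass   H0(z)
h1 = np.array([s, -s])     # analysis high-pass  H1(z)
g0 = np.array([s,  s])     # synthesis low-pass  G0(z) = H1(-z)
g1 = np.array([-s, s])     # synthesis high-pass G1(z) = -H0(-z)

# Perfect reconstruction: G0(z)H0(z) + G1(z)H1(z) = 2*z^-1
pr = np.polymul(g0, h0) + np.polymul(g1, h1)
print(pr)                  # -> [0. 2. 0.], i.e. a pure one-sample delay

# Alias cancellation: G0(z)H0(-z) + G1(z)H1(-z) = 0
alt = np.array([1.0, -1.0])            # multiply coefficients by (-1)^n
assert np.allclose(np.polymul(g0, h0 * alt) + np.polymul(g1, h1 * alt), 0)
```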

1,748 citations

Journal ArticleDOI


TL;DR: The theoretical bases behind the direct ECG data compression schemes are presented and classified into three categories: tolerance-comparison compression, DPCM, and entropy coding methods; a framework for the evaluation and comparison of ECG compression schemes is also presented.
Abstract: Electrocardiogram (ECG) compression techniques are compared, and a unified view of these techniques is established. ECG data compression schemes are presented in two major groups: direct data compression and transformation methods. The direct data compression techniques are ECG differential pulse code modulation (DPCM) and entropy coding, AZTEC, Turning-point, CORTES, Fan and SAPA algorithms, peak-picking, and cycle-to-cycle compression methods. The transformation methods include Fourier, Walsh, and Karhunen-Loeve transforms. The theoretical bases behind the direct ECG data compression schemes are presented and classified into three categories: tolerance-comparison compression, DPCM, and entropy coding methods. A framework for evaluation and comparison of ECG compression schemes is presented.
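To make one of the direct methods listed above concrete, here is a sketch of the Turning-point algorithm as it is usually described: a fixed 2:1 reduction that, for each incoming pair of samples, keeps whichever sample preserves a slope reversal. Details are illustrative.

```python
import numpy as np

def turning_point(x: np.ndarray) -> np.ndarray:
    """Fixed 2:1 Turning-point compression of a 1-D signal."""
    out = [x[0]]
    x0 = x[0]                       # most recently retained sample
    i = 1
    while i + 1 < len(x):
        x1, x2 = x[i], x[i + 1]
        # Keep x1 if the slope changes sign there (a turning point),
        # otherwise keep x2.
        if (x1 - x0) * (x2 - x1) < 0:
            out.append(x1)
        else:
            out.append(x2)
        x0 = out[-1]
        i += 2
    return np.array(out)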

649 citations

Journal ArticleDOI


TL;DR: Pilot data from a blind evaluation of compressed ECGs by cardiologists suggest that the clinically useful information present in original ECG signals is preserved by 8:1 compression, and in most cases 16:1 compressed ECGs are clinically useful.
Abstract: Wavelets and wavelet packets have recently emerged as powerful tools for signal compression. Wavelet and wavelet packet-based compression algorithms based on embedded zerotree wavelet (EZW) coding are developed for electrocardiogram (ECG) signals, and eight different wavelets are evaluated for their ability to compress Holter ECG data. Pilot data from a blind evaluation of compressed ECGs by cardiologists suggest that the clinically useful information present in original ECG signals is preserved by 8:1 compression, and in most cases 16:1 compressed ECGs are clinically useful.
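EZW's key property, implicit in the ratios quoted above, is that the bitstream is embedded: coding proceeds through successively halved significance thresholds, so truncating the stream earlier or later yields 8:1, 16:1, and so on from the same encoding. The toy sketch below illustrates only that successive-approximation principle; it omits the zerotree structure that gives EZW its coding efficiency.

```python
import numpy as np

def embedded_passes(coeffs: np.ndarray, n_passes: int = 6):
    """Yield (threshold, indices newly significant at that threshold),
    halving the threshold each pass. Assumes coeffs is not all zero."""
    t = 2.0 ** np.floor(np.log2(np.max(np.abs(coeffs))))
    significant = np.zeros(len(coeffs), dtype=bool)
    for _ in range(n_passes):
        new = (~significant) & (np.abs(coeffs) >= t)
        yield t, np.flatnonzero(new)
        significant |= new
        t /= 2.0          # each pass refines the approximation by one bit

# Toy usage on synthetic "wavelet coefficients" with decaying magnitudes.
coeffs = np.random.randn(64) * np.logspace(2, -2, 64)
for t, idx in embedded_passes(coeffs):
    print(f"threshold {t:g}: {len(idx)} coefficients become significant")
```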

434 citations