Journal ArticleDOI

A novel compression algorithm for electrocardiogram signals based on the linear prediction of the wavelet coefficients

TL;DR: A new algorithm for electrocardiogram (ECG) compression based on coding the linear-prediction residuals of the signal's wavelet coefficients, which reduces the bit rate while keeping the reconstructed-signal distortion at a clinically acceptable level.
About: This article is published in Digital Signal Processing. The article was published on 2003-10-01 and has received 97 citations to date. The article focuses on the topics: Wavelet transform & Stationary wavelet transform.
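The core idea summarized above (predict each wavelet coefficient from its predecessors and encode only the small residuals) can be sketched in a few lines. This is a minimal illustration assuming a Haar DWT and an order-2 least-squares predictor; the paper's actual filter bank and predictor design are not specified here.

```python
import numpy as np

def haar_dwt(x):
    """One analysis level of the Haar DWT: approximation and detail bands."""
    x = np.asarray(x, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return approx, detail

def lp_residual(coeffs, order=2):
    """Least-squares linear predictor over a coefficient band; returns the
    predictor taps and the prediction residual (the part that gets encoded)."""
    c = np.asarray(coeffs, dtype=float)
    # Design matrix: row n holds c[n-1], ..., c[n-order].
    X = np.column_stack([c[order - 1 - k : len(c) - 1 - k] for k in range(order)])
    y = c[order:]
    taps, *_ = np.linalg.lstsq(X, y, rcond=None)
    return taps, y - X @ taps
```

For a smooth coefficient band the residual carries far less energy than the coefficients themselves, which is what makes the residual cheaper to code than the original coefficients.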
Citations
Journal ArticleDOI
TL;DR: Because the proposed real-time data compression and transmission algorithm can compress and transmit data in real time, it can serve as an optimal biosignal data transmission method for limited-bandwidth communication between e-health devices.
Abstract: This paper introduces a real-time data compression and transmission algorithm between e-health terminals for a periodic ECG signal. The proposed algorithm consists of five compression procedures and four reconstruction procedures. In order to evaluate its performance, the algorithm was applied to all 48 recordings of the MIT-BIH arrhythmia database, and the compression ratio (CR), percentage root mean square difference (PRD), normalized PRD (PRDN), RMS error, SNR, and quality score (QS) values were obtained. The results showed that the CR was 27.9:1 and the PRD was 2.93 on average over all 48 records with a 15% window size. In addition, the performance of the algorithm was compared to those of similar algorithms introduced recently by others. The proposed algorithm showed clearly superior performance on all 48 records at compression ratios lower than 15:1, whereas it showed similar or slightly inferior PRD performance at compression ratios higher than 20:1. Given that similarity with the original data becomes meaningless when the PRD is higher than 2, the proposed algorithm performs significantly better than the other algorithms. Moreover, because the algorithm can compress and transmit data in real time, it can serve as an optimal biosignal data transmission method for limited-bandwidth communication between e-health devices.
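The figures of merit used in this abstract (CR, PRD, PRDN, QS) have compact definitions. A minimal sketch, assuming the formulas commonly used in the ECG-compression literature:

```python
import numpy as np

def prd(original, reconstructed):
    """Percentage root-mean-square difference (PRD)."""
    x = np.asarray(original, dtype=float)
    y = np.asarray(reconstructed, dtype=float)
    return 100.0 * np.sqrt(np.sum((x - y) ** 2) / np.sum(x ** 2))

def prdn(original, reconstructed):
    """Normalized PRD (PRDN): the signal mean is removed from the
    denominator so a DC offset in the recording does not mask distortion."""
    x = np.asarray(original, dtype=float)
    y = np.asarray(reconstructed, dtype=float)
    return 100.0 * np.sqrt(np.sum((x - y) ** 2) / np.sum((x - x.mean()) ** 2))

def quality_score(cr, prd_value):
    """Quality score (QS): compression ratio divided by PRD; higher is better."""
    return cr / prd_value
```

For example, the reported averages CR = 27.9 and PRD = 2.93 correspond to QS ≈ 9.5.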

173 citations


Cites methods from "A novel compression algorithm for e..."

  • ...Examples include the peak picking method, the linear prediction method [10], the syntactic method, or the neural network method [11]....


Journal ArticleDOI
TL;DR: A comprehensive review of up-to-date requirements in hardware, communication, and computing for next-generation u-Health systems is presented, together with the new technological trends and design challenges that designers of such systems must cope with.
Abstract: With an ageing population and the rise of chronic diseases, society is becoming more health conscious and patients are becoming "health consumers" looking for better health management. People's perception is shifting towards patient-centered, rather than the classical hospital-centered, health services, which has been propelling the evolution of telemedicine research from classic e-Health to m-Health and now to ubiquitous healthcare (u-Health). Mobile and ubiquitous telemedicine, integrated with Wireless Body Area Networks (WBANs), is expected to have great potential in fostering the provision of next-generation u-Health. Despite recent efforts and achievements, currently proposed u-Health solutions still suffer from shortcomings hampering their adoption today. This paper presents a comprehensive review of up-to-date requirements in hardware, communication, and computing for next-generation u-Health systems. It compares new technological and technical trends and discusses how they address expected u-Health requirements. A thorough survey of various recent system implementations worldwide is presented in an attempt to identify shortcomings in state-of-the-art solutions. In particular, challenges in WBAN and ubiquitous computing are emphasized. The purpose of this survey is not only to help beginners with a holistic approach toward understanding u-Health systems but also to present to researchers the new technological trends and design challenges they have to cope with while designing such systems.

152 citations

Journal ArticleDOI
TL;DR: This paper presents a new algorithm for electrocardiogram (ECG) signal compression based on local extreme extraction, adaptive hysteretic filtering and Lempel-Ziv-Welch (LZW) coding, together with a new compression measure, the "quality score," which takes into account both the reconstruction errors and the compression ratio.
Abstract: This paper presents a new algorithm for electrocardiogram (ECG) signal compression based on local extreme extraction, adaptive hysteretic filtering and Lempel-Ziv-Welch (LZW) coding. The algorithm has been verified using eight of the most frequent normal and pathological types of cardiac beats and a multi-layer perceptron (MLP) neural network trained with original cardiac patterns and tested with reconstructed ones. Aspects regarding the possibility of using principal component analysis (PCA) for cardiac pattern classification have been investigated as well. A new compression measure called the "quality score," which takes into account both the reconstruction errors and the compression ratio, is proposed.
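The final LZW stage in this pipeline is a standard dictionary coder. A minimal textbook LZW encoder over a byte stream, for illustration only; the actual symbol alphabet the paper feeds into LZW (e.g. quantized extrema positions and amplitudes) is not specified here.

```python
def lzw_encode(symbols):
    """Textbook LZW over a byte sequence; returns a list of integer codes.
    The dictionary starts with all 256 single-byte strings and grows with
    each phrase that has not been seen before."""
    table = {bytes([i]): i for i in range(256)}
    next_code = 256
    codes = []
    w = b""
    for b in bytes(symbols):
        wb = w + bytes([b])
        if wb in table:
            w = wb                      # extend the current phrase
        else:
            codes.append(table[w])      # emit the longest known phrase
            table[wb] = next_code       # learn the new phrase
            next_code += 1
            w = bytes([b])
    if w:
        codes.append(table[w])
    return codes
```

On repetitive input the output shrinks quickly: `lzw_encode(b"ABABABA")` emits four codes, `[65, 66, 256, 258]`, for seven input bytes.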

144 citations

Journal ArticleDOI
TL;DR: A prospective review of wavelet-based ECG compression methods and their performances based upon findings obtained from various experiments conducted using both clean and noisy ECG signals is presented.

110 citations

Journal ArticleDOI
TL;DR: In order to increase the performance of heart sound (HS) classification, an incremental neural network is proposed in this study, and it is observed that ISOM successfully classifies the HSs even in noisy environments.

93 citations

References
Journal ArticleDOI
TL;DR: The proposed technique yields the lowest PRD compared to the other two algorithms. For a compression ratio below 10, the optimal transform can be obtained from only one ECG period; for higher compression ratios, however, the PRD is smaller for long signals.

60 citations

Journal ArticleDOI
TL;DR: The concept of sequential ranking is developed, which can be seen as a generalization of sequential prediction, and it is shown that a combined scheme may result in faster convergence rate to the source entropy.
Abstract: Most state-of-the-art lossless image compression schemes use prediction followed by some form of context modeling. This might seem redundant at first, as the contextual information used for prediction is also available for building the compression model, and a universal coder will eventually learn the "predictive" patterns of the data. In this correspondence, we provide a formal justification for the combination of these two modeling tools, by showing that a combined scheme may result in a faster convergence rate to the source entropy. This is achieved via a reduction in the model cost of universal coding. In deriving the main result, we develop the concept of sequential ranking, which can be seen as a generalization of sequential prediction, and we study its combinatorial and probabilistic properties.

34 citations

Journal ArticleDOI
TL;DR: A study of ECG compression using an upper bound on the percentage root mean square difference (PRD) is presented, which could be specified by the clinician after correlating the quality of the compressed versions of the ECG and the resulting PRD.
Abstract: The main goal of any electrocardiogram (ECG) compression algorithm is to reduce the bit rate while keeping the signal distortion at a clinically acceptable level. Percentage root mean square difference (PRD), the commonly used figure of merit, does not directly reveal whether the clinically significant ECG waveform information is preserved or not. We present the results of a study of ECG compression using an upper bound on the PRD. This bound is based on the initial performance of the algorithm and could be specified by the clinician after correlating the quality of the compressed versions of the ECG and the resulting PRD.

19 citations

Proceedings ArticleDOI
09 May 1995
TL;DR: The paper presents an ECG data compression technique using multiscale peak analysis, defined as a wavelet maxima representation whose basic wavelet is the second derivative of a symmetric smoothing function.
Abstract: The paper presents an ECG data compression technique using multiscale peak analysis. The authors define multiscale peak analysis as a wavelet maxima representation whose basic wavelet is the second derivative of a symmetric smoothing function. The wavelet transform of an ECG shows maxima at the start, peak and stop points of the five transient waves P through T. The number of wavelet maxima is expected to be smaller than the number of original data samples, yet the maxima are sufficient to reconstruct the original signal precisely. The wavelet maxima representation can therefore lead to ECG data compression and analysis. The compressed data still preserve the peaks of the QRS waves, so searching for abnormal behavior remains feasible in practice. The compression results show that normal ECG data are compressed by a factor of 10.
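The wavelet maxima representation described above can be sketched with a Mexican-hat wavelet (the second derivative of a Gaussian smoothing function) and a modulus-maxima search. The scale, relative threshold, and kernel support below are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def mexican_hat(scale, width=None):
    """Second derivative of a Gaussian smoothing function (Mexican-hat wavelet)."""
    if width is None:
        width = int(8 * scale) | 1          # odd support, about 8 scales wide
    t = np.arange(width) - width // 2
    psi = (1.0 - (t / scale) ** 2) * np.exp(-(t ** 2) / (2.0 * scale ** 2))
    return psi - psi.mean()                 # enforce zero mean

def wavelet_maxima(x, scale, rel_threshold=1e-3):
    """Positions and values of significant modulus maxima of the transform at
    one scale; keeping only these (index, value) pairs is the compressed form."""
    w = np.convolve(np.asarray(x, dtype=float), mexican_hat(scale), mode="same")
    m = np.abs(w)
    is_peak = (m[1:-1] > m[:-2]) & (m[1:-1] >= m[2:])   # local modulus maxima
    is_big = m[1:-1] > rel_threshold * m.max()          # drop negligible ones
    idx = np.where(is_peak & is_big)[0] + 1
    return idx, w[idx]
```

On a signal with a few isolated bumps the maxima list is far shorter than the signal itself, which is the source of the roughly factor-10 compression reported for normal ECG data.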

9 citations