Journal ArticleDOI

ECG data compression techniques-a unified approach

TL;DR: The theoretical bases behind the direct ECG data compression schemes are presented and classified into three categories: tolerance-comparison compression, DPCM, and entropy coding methods; a framework for evaluation and comparison of ECG compression schemes is also presented.
Abstract: Electrocardiogram (ECG) compression techniques are compared, and a unified view of these techniques is established. ECG data compression schemes are presented in two major groups: direct data compression and transformation methods. The direct data compression techniques are ECG differential pulse code modulation (DPCM) and entropy coding, AZTEC, Turning-point, CORTES, Fan and SAPA algorithms, peak-picking, and cycle-to-cycle compression methods. The transformation methods include Fourier, Walsh, and Karhunen-Loeve transforms. The theoretical bases behind the direct ECG data compression schemes are presented and classified into three categories: tolerance-comparison compression, DPCM, and entropy coding methods. A framework for evaluation and comparison of ECG compression schemes is presented.
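Several of the direct methods surveyed admit very compact implementations. As a minimal illustration (a generic sketch, not the paper's own pseudocode), the Turning-point algorithm achieves a fixed 2:1 compression ratio by examining samples in pairs and retaining whichever sample preserves a local slope reversal:

```python
def sign(x):
    return (x > 0) - (x < 0)

def turning_point(samples):
    """Turning-point compression: fixed 2:1 ratio.

    For each pair (x1, x2) after the last retained sample x0, keep x1
    when the slope changes sign there (a turning point), else keep x2.
    """
    if len(samples) < 3:
        return list(samples)
    out = [samples[0]]
    x0 = samples[0]
    i = 1
    while i + 1 < len(samples):
        x1, x2 = samples[i], samples[i + 1]
        s1, s2 = sign(x1 - x0), sign(x2 - x1)
        # A sign change between consecutive slopes marks a turning point.
        if s1 * s2 < 0:
            out.append(x1)
            x0 = x1
        else:
            out.append(x2)
            x0 = x2
        i += 2
    return out
```

On a monotone ramp the even samples are kept; on a peak such as [0, 2, 5, 3, 1] the peak value 5 survives, which is the property that makes the method attractive for QRS complexes.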
Citations
Journal ArticleDOI
TL;DR: In this review, the emerging role of the wavelet transform in the interrogation of the ECG is discussed in detail, where both the continuous and the discrete transform are considered in turn.
Abstract: The wavelet transform has emerged over recent years as a powerful time-frequency analysis and signal coding tool favoured for the interrogation of complex nonstationary signals. Its application to biosignal processing has been at the forefront of these developments where it has been found particularly useful in the study of these, often problematic, signals: none more so than the ECG. In this review, the emerging role of the wavelet transform in the interrogation of the ECG is discussed in detail, where both the continuous and the discrete transform are considered in turn.

794 citations


Cites methods from "ECG data compression techniques-a u..."

  • ...Transform methods, as their name implies, operate by first transforming the ECG signal into another domain including Fourier, Walsh, Karhunen-Loeve, discrete cosine transforms and more recently the wavelet transform (Jalaleddine et al 1990)....


Journal ArticleDOI
TL;DR: This paper quantifies the potential of the emerging compressed sensing (CS) signal acquisition/compression paradigm for low-complexity energy-efficient ECG compression on the state-of-the-art Shimmer WBSN mote and shows that CS represents a competitive alternative to state-of-the-art digital wavelet transform (DWT)-based ECG compression solutions in the context of WBSN-based ECG monitoring systems.
Abstract: Wireless body sensor networks (WBSN) hold the promise to be a key enabling information and communications technology for next-generation patient-centric telecardiology or mobile cardiology solutions. Through enabling continuous remote cardiac monitoring, they have the potential to achieve improved personalization and quality of care, increased ability of prevention and early diagnosis, and enhanced patient autonomy, mobility, and safety. However, state-of-the-art WBSN-enabled ECG monitors still fall short of the required functionality, miniaturization, and energy efficiency. Among others, energy efficiency can be improved through embedded ECG compression, in order to reduce airtime over energy-hungry wireless links. In this paper, we quantify the potential of the emerging compressed sensing (CS) signal acquisition/compression paradigm for low-complexity energy-efficient ECG compression on the state-of-the-art Shimmer WBSN mote. Interestingly, our results show that CS represents a competitive alternative to state-of-the-art digital wavelet transform (DWT)-based ECG compression solutions in the context of WBSN-based ECG monitoring systems. More specifically, while expectedly exhibiting inferior compression performance than its DWT-based counterpart for a given reconstructed signal quality, its substantially lower complexity and CPU execution time enables it to ultimately outperform DWT-based ECG compression in terms of overall energy efficiency. CS-based ECG compression is accordingly shown to achieve a 37.1% extension in node lifetime relative to its DWT-based counterpart for “good” reconstruction quality.
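The low encoder complexity that drives the energy savings comes from the CS measurement step being nothing more than a random projection of the signal. A minimal sketch of the sensor-side encoder (the function name and the ±1 Bernoulli sensing matrix are illustrative assumptions, not the Shimmer implementation):

```python
import random

def cs_encode(x, m, seed=0):
    """Compressed-sensing encoder: m random projections y = Phi @ x.

    Phi has +/-1 Bernoulli entries, so encoding needs only additions and
    subtractions -- the low-complexity property exploited on the node.
    The seed must be shared with the receiver so Phi can be regenerated.
    """
    rng = random.Random(seed)
    n = len(x)
    phi = [[rng.choice((-1, 1)) for _ in range(n)] for _ in range(m)]
    return [sum(p * xi for p, xi in zip(row, x)) for row in phi]
```

Reconstruction is the expensive part (an l1-minimization solve such as basis pursuit, exploiting the ECG's sparsity in a wavelet basis), but it runs at the receiver, not on the energy-constrained mote.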

680 citations

Journal ArticleDOI
TL;DR: This statement examines the relation of the resting ECG to its technology in order to establish standards that will improve the accuracy and usefulness of the ECG in practice, and to make recommendations for ECG standards.

649 citations

Journal ArticleDOI
TL;DR: A wavelet electrocardiogram (ECG) data codec based on the set partitioning in hierarchical trees (SPIHT) compression algorithm is proposed and is significantly more efficient in compression and in computation than previously proposed ECG compression schemes.
Abstract: A wavelet electrocardiogram (ECG) data codec based on the set partitioning in hierarchical trees (SPIHT) compression algorithm is proposed in this paper. The SPIHT algorithm (A. Said and W.A. Pearlman, IEEE Trans. Ccts. Syst. II, vol. 6, p. 243-50, 1996) has achieved notable success in still image coding. The authors modified the algorithm for the one-dimensional case and applied it to compression of ECG data. Experiments on selected records from the MIT-BIH arrhythmia database revealed that the proposed codec is significantly more efficient in compression and in computation than previously proposed ECG compression schemes. The coder also attains exact bit rate control and generates a bit stream progressive in quality or rate.
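SPIHT itself involves sorted bit-plane coding of coefficient trees and is too long to reproduce here; the sketch below shows only the underlying idea it builds on, using a one-level Haar transform and hard thresholding (an illustrative stand-in, not the authors' codec):

```python
def haar_dwt(x):
    """One level of the (unnormalized) Haar transform: averages + details."""
    approx = [(a + b) / 2 for a, b in zip(x[0::2], x[1::2])]
    detail = [(a - b) / 2 for a, b in zip(x[0::2], x[1::2])]
    return approx, detail

def haar_idwt(approx, detail):
    """Exact inverse of haar_dwt."""
    out = []
    for a, d in zip(approx, detail):
        out.extend((a + d, a - d))
    return out

def threshold_compress(x, thresh):
    """Zero out small detail coefficients. The zeros are where the bit
    savings come from; SPIHT then codes the surviving coefficients
    efficiently by exploiting their cross-scale structure."""
    approx, detail = haar_dwt(x)
    detail = [d if abs(d) >= thresh else 0.0 for d in detail]
    return approx, detail
```

In a real codec the transform is applied recursively to the approximation band, and the threshold is driven by the target bit rate rather than fixed, which is how SPIHT attains exact rate control.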

521 citations

References
Journal ArticleDOI
01 Mar 1967
TL;DR: Methods of achieving greater flexibility by combining the redundancy reduction techniques with an electronically programmable telemetry system are discussed and application of these techniques to other data management problems is also described.
Abstract: Acceptance of redundancy reduction as a practical method of bandwidth and/or power conservation is a significant trend in aerospace communication and telemetry. Techniques for accomplishing data compression are discussed, examples of performance are given, and a machine design for aerospace vehicle use is described. Methods of achieving greater flexibility by combining the redundancy reduction techniques with an electronically programmable telemetry system are discussed. Application of these techniques to other data management problems is also described. A list of relevant publications is provided.

73 citations

Journal ArticleDOI
TL;DR: The feasibility of using a fast Walsh transform algorithm to implement a real-time microprocessor-based e.c.g. data-compression system was studied.
Abstract: The feasibility of using a fast Walsh transform algorithm to implement a real-time microprocessor-based e.c.g. data-compression system was studied. Using the mean square error between the original and reconstructed e.c.g. signals as a measure of the utility of the reconstructed signals, the limit to which an e.c.g. signal could be compressed and still yield an acceptable reconstruction was determined. The possibility of enhancing the quality of the reconstructed signals using linear filtering techniques was also investigated.
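The fast Walsh transform referred to here needs only O(n log n) additions and subtractions, which is what made a real-time microprocessor implementation plausible at the time. A minimal fast Walsh-Hadamard transform sketch (unnormalized, in natural/Hadamard ordering; assumes a power-of-two length):

```python
def fwht(x):
    """Fast Walsh-Hadamard transform via butterflies; len(x) must be a
    power of two. Unnormalized, so applying it twice scales by n."""
    x = list(x)
    n = len(x)
    h = 1
    while h < n:
        for i in range(0, n, 2 * h):
            for j in range(i, i + h):
                a, b = x[j], x[j + h]
                x[j], x[j + h] = a + b, a - b   # butterfly: sum and difference
        h *= 2
    return x
```

Compression then amounts to keeping only the largest-magnitude Walsh coefficients and inverting (another `fwht` call divided by n) at the receiver; the discarded coefficients are the source of the mean square error the paper measures.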

72 citations

Journal ArticleDOI
C.A. Andrews1, J.M. Davies1, G.R. Schwarz1
01 Mar 1967
TL;DR: The geometric aperture techniques give results comparable to or better than the more "exotic" methods and are more economical to implement at the present state-of-the-art.
Abstract: Data compression techniques are classified into four categories which describe the effect a compression method has on the form of the signal transmitted. Each category is defined and examples of techniques in each category are given. Compression methods which have received previous investigation, such as the geometric aperture methods, as well as techniques which have not received much attention, such as Fourier filter, optimum discrete filter, and variable sampling rate compression, are described. Results of computer simulations with real data are presented for each method in terms of rms and peak errors versus compression ratio. It is shown that, in general, the geometric aperture techniques give results comparable to or better than the more "exotic" methods and are more economical to implement at the present state-of-the-art. In addition, the aperture compression methods provide bounded peak error which is not readily obtainable with other methods. A general system design is given for a stored-logic data compression system with adaptive buffer control to prevent loss of data and to provide efficient transmission of multiplexed channels with compressed data. An adaptive buffer design is described which is shown to be practical, based on computer simulations with five different types of representative data.
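A zero-order ("floating") aperture compressor of the kind the paper finds so competitive can be stated in a few lines; the sketch below is a generic illustration, not the authors' stored-logic design:

```python
def zero_order_aperture(samples, aperture):
    """Zero-order aperture compression: transmit a sample only when it
    leaves a +/-aperture band around the last transmitted value.

    Peak reconstruction error is bounded by `aperture` by construction --
    the bounded-peak-error property the paper highlights.
    """
    if not samples:
        return []
    kept = [(0, samples[0])]          # (index, value) pairs to transmit
    last = samples[0]
    for i, s in enumerate(samples[1:], start=1):
        if abs(s - last) > aperture:  # sample escaped the aperture band
            kept.append((i, s))
            last = s
    return kept
```

The receiver reconstructs by holding each transmitted value until the next one arrives, which is why the method is "zero-order": the implicit predictor is a constant.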

71 citations

Journal ArticleDOI
TL;DR: The criterion to be used for predictors for the purpose of predictive coding is defined: that predictor is optimum in the information theory (IT) sense which minimizes the entropy of the average error-term distribution.
Abstract: In Part I predictive coding was defined and messages, prediction, entropy, and ideal coding were discussed. In the present paper the criterion to be used for predictors for the purpose of predictive coding is defined: that predictor is optimum in the information theory (IT) sense which minimizes the entropy of the average error-term distribution. Ordered averages of distributions are defined and it is shown that if a predictor gives an ordered average error term distribution it will be a best IT predictor. Special classes of messages are considered for which a best IT predictor can easily be found, and some examples are given. The error terms which are transmitted in predictive coding are treated as if they were statistically independent. If this is indeed the case, or a good approximation, then it is still necessary to show that sequences of message terms which are statistically independent may always be coded efficiently, without impractically large memory requirements, in order to show that predictive coding may be practical and efficient in such cases. This is done in the final section of this paper.
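The criterion, minimizing the entropy of the error-term distribution, is easy to demonstrate numerically: for a slowly varying signal, even a previous-sample predictor yields error terms with far smaller empirical entropy than the raw samples, so they code in fewer bits. A minimal sketch (the function names are illustrative):

```python
from collections import Counter
from math import log2

def entropy(seq):
    """Empirical first-order entropy of a symbol sequence, in bits/symbol."""
    counts = Counter(seq)
    n = len(seq)
    return -sum(c / n * log2(c / n) for c in counts.values())

def prediction_errors(samples):
    """Previous-sample predictor: transmit e[n] = x[n] - x[n-1]."""
    return [b - a for a, b in zip(samples, samples[1:])]
```

For a triangle wave the errors collapse to {+1, -1} (about 1 bit/symbol) while the raw samples spread over many values; a better predictor, in the paper's IT sense, is simply one that concentrates this error distribution further.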

71 citations

Journal ArticleDOI
TL;DR: A real-time compression algorithm that represents a modification of the amplitude zone time epoch coding (AZTEC) technique extended with several statistical parameters used to calculate the variable threshold has been developed and applied in the design of a pacemaker followup system.
Abstract: A real-time compression algorithm has been developed which is suitable for both real-time ECG (electrocardiogram) transmission and ECG data storing. The algorithm represents a modification of the amplitude zone time epoch coding (AZTEC) technique extended with several statistical parameters used to calculate the variable threshold. The proposed algorithm has been applied in the design of a pacemaker follow-up system for the online ECG data transmission from the pacemaker implanted in a human being to the computer system located at the clinic.
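At its core, AZTEC replaces runs of samples that stay within a threshold band by (duration, amplitude) plateaus; the modification described here derives that threshold adaptively from signal statistics. A fixed-threshold sketch of the plateau step (illustrative only, not the pacemaker implementation, which also encodes slopes between plateaus):

```python
def aztec_plateaus(samples, thresh):
    """Simplified AZTEC encoder: collapse each maximal run of samples whose
    min-max spread stays within `thresh` into one (duration, value) plateau.
    `thresh` is fixed here; the cited modification adapts it on the fly."""
    plateaus = []
    i = 0
    n = len(samples)
    while i < n:
        lo = hi = samples[i]
        j = i + 1
        while j < n:
            new_lo = min(lo, samples[j])
            new_hi = max(hi, samples[j])
            if new_hi - new_lo > thresh:
                break                  # next sample would burst the band
            lo, hi = new_lo, new_hi
            j += 1
        plateaus.append((j - i, (lo + hi) / 2))  # duration, plateau value
        i = j
    return plateaus
```

Long isoelectric segments compress to a single pair while steep QRS edges produce many short plateaus, which is why plain AZTEC distorts QRS morphology and why a variable threshold helps.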

67 citations