Journal ArticleDOI

Wavelet and wavelet packet compression of electrocardiograms

01 May 1997-IEEE Transactions on Biomedical Engineering (IEEE Trans Biomed Eng)-Vol. 44, Iss: 5, pp 394-402
TL;DR: Pilot data from a blind evaluation of compressed ECGs by cardiologists suggest that the clinically useful information present in original ECG signals is preserved by 8:1 compression, and in most cases 16:1 compressed ECGs are clinically useful.
Abstract: Wavelets and wavelet packets have recently emerged as powerful tools for signal compression. Wavelet and wavelet packet compression algorithms based on embedded zerotree wavelet (EZW) coding are developed for electrocardiogram (ECG) signals, and eight different wavelets are evaluated for their ability to compress Holter ECG data. Pilot data from a blind evaluation of compressed ECGs by cardiologists suggest that the clinically useful information present in original ECG signals is preserved by 8:1 compression, and in most cases 16:1 compressed ECGs are clinically useful.
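The transform-and-threshold principle underlying wavelet compression can be sketched as follows. This is a minimal illustration using a single-level Haar transform and plain magnitude thresholding, not the EZW coder developed in the paper; keeping 1/8 of the coefficients corresponds only loosely to 8:1 compression, since a real coder must also encode coefficient positions.

```python
import numpy as np

def haar_dwt(x):
    """One level of the Haar discrete wavelet transform."""
    x = np.asarray(x, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return approx, detail

def haar_idwt(approx, detail):
    """Invert one Haar level."""
    x = np.empty(2 * len(approx))
    x[0::2] = (approx + detail) / np.sqrt(2.0)
    x[1::2] = (approx - detail) / np.sqrt(2.0)
    return x

def threshold_compress(x, keep_ratio=0.125):
    """Keep only the largest-magnitude coefficients, zero the
    rest, and reconstruct the signal from what survives."""
    a, d = haar_dwt(x)
    coeffs = np.concatenate([a, d])
    k = max(1, int(len(coeffs) * keep_ratio))
    thresh = np.sort(np.abs(coeffs))[-k]
    coeffs[np.abs(coeffs) < thresh] = 0.0
    return haar_idwt(coeffs[:len(a)], coeffs[len(a):])
```

Because the ECG's energy concentrates in few wavelet coefficients, most coefficients can be discarded with little visible distortion, which is what the cardiologists' blind evaluation probes.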
Citations
Journal ArticleDOI
TL;DR: A comprehensive review of up-to-date requirements in hardware, communication, and computing for next-generation u-Health systems is presented, along with the new technological trends and design challenges researchers must cope with while designing such systems.
Abstract: With an ageing population and the increase of chronic diseases, society is becoming more health conscious and patients are becoming "health consumers" looking for better health management. People's perception is shifting towards patient-centered, rather than the classical hospital-centered, health services, which has propelled the evolution of telemedicine research from classic e-Health to m-Health and now to ubiquitous healthcare (u-Health). Mobile and ubiquitous telemedicine, integrated with Wireless Body Area Networks (WBANs), is expected to have great potential in fostering the provision of next-generation u-Health. Despite recent efforts and achievements, currently proposed u-Health solutions still suffer from shortcomings hampering their adoption today. This paper presents a comprehensive review of up-to-date requirements in hardware, communication, and computing for next-generation u-Health systems. It compares new technological and technical trends and discusses how they address expected u-Health requirements. A thorough survey of various recent system implementations worldwide is presented in an attempt to identify shortcomings in state-of-the-art solutions. In particular, challenges in WBAN and ubiquitous computing are emphasized. The purpose of this survey is not only to help beginners with a holistic approach toward understanding u-Health systems but also to present researchers with the new technological trends and design challenges they must cope with while designing such systems.

152 citations

Journal ArticleDOI
01 Jan 2006
TL;DR: Comparative results with existing quality measures show that the new measure is insensitive to error variation, is accurate, and correlates very well with subjective tests.
Abstract: Electrocardiogram (ECG) compression techniques are gaining momentum due to the huge database requirements and the wide-band communication channels needed to maintain high-quality ECG transmission. Advances in computer software and hardware enable the birth of new techniques in ECG compression, aiming at high compression rates. In general, most of the introduced ECG compression techniques base their performance evaluation on either inaccurate measures or measures targeting random behavior of error. In this paper, a new wavelet-based quality measure is proposed. The new approach is based on decomposing the segment of interest into frequency bands, where a weighted score is given to each band depending on its dynamic range and its diagnostic significance. A performance evaluation of the measure is conducted quantitatively and qualitatively. Comparative results with existing quality measures show that the new measure is insensitive to error variation, is accurate, and correlates very well with subjective tests.
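The band-decomposition idea behind such a measure can be sketched as follows. The Haar filter bank and the band weights below are illustrative assumptions, not the decomposition or weights actually proposed in the paper:

```python
import numpy as np

def haar_level(x):
    """One Haar analysis step: approximation and detail halves."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return a, d

def band_decompose(x, levels=3):
    """Split x into [approx, detail_L, ..., detail_1] subbands, coarsest first."""
    bands, a = [], np.asarray(x, dtype=float)
    for _ in range(levels):
        a, d = haar_level(a)
        bands.append(d)
    bands.append(a)
    return bands[::-1]

def weighted_band_score(x, y, weights=(0.4, 0.3, 0.2, 0.1)):
    """Weighted per-band distortion between original x and reconstruction y.
    The weights stand in for 'diagnostic significance' and are hypothetical."""
    score = 0.0
    for w, u, v in zip(weights, band_decompose(x), band_decompose(y)):
        denom = np.sum(u ** 2)
        if denom > 0:
            score += w * 100.0 * np.sqrt(np.sum((u - v) ** 2) / denom)
    return score
```

Weighting per band lets distortion in diagnostically critical frequency ranges dominate the score, which is how such a measure can correlate with subjective clinical assessment better than a flat error norm.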

152 citations

Journal ArticleDOI
TL;DR: Best basis selection and optimization of the mother wavelet through parameterization led to substantial improvement of performance in signal compression with respect to DWT and random selection of the mother wavelet.
Abstract: We propose a novel scheme for signal compression based on the discrete wavelet packet transform (DWPT) decomposition. The mother wavelet and the basis of wavelet packets were optimized and the wavelet coefficients were encoded with a modified version of the embedded zerotree algorithm. This signal-dependent compression scheme was designed by a two-step process. The first (internal optimization) was the best basis selection, performed for a given mother wavelet. For this purpose, three additive cost functions were applied and compared. The second (external optimization) was the selection of the mother wavelet based on the minimal distortion of the decoded signal given a fixed compression ratio. The mother wavelet was parameterized in the multiresolution analysis framework by the scaling filter, which is sufficient to define the entire decomposition in the orthogonal case. The method was tested on two sets of ten electromyographic (EMG) and ten electrocardiographic (ECG) signals that were compressed with compression ratios in the range of 50%-90%. For a 90% compression ratio of EMG (ECG) signals, the percent residual difference after compression decreased from (mean ± SD) 48.6 ± 9.9% (21.5 ± 8.4%) with the discrete wavelet transform (DWT) using the wavelet leading to poorest performance to 28.4 ± 3.0% (6.7 ± 1.9%) with the DWPT, with optimal basis selection and wavelet optimization. In conclusion, best basis selection and optimization of the mother wavelet through parameterization led to substantial improvement of performance in signal compression with respect to the DWT and random selection of the mother wavelet. The method provides an adaptive approach for optimal signal representation for compression and can thus be applied to any type of biomedical signal.
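The "external optimization" step — choosing the mother wavelet that minimizes distortion at a fixed compression ratio — can be sketched as follows. This is an illustration with two fixed candidate scaling filters (Haar and the 4-tap Daubechies filter) and simple coefficient discarding, not the paper's parameterized filter search with zerotree coding. Because the periodized transform is orthogonal, Parseval's relation lets the percent residual difference (PRD) be computed directly in the coefficient domain, with no inverse transform needed:

```python
import numpy as np

SQRT3 = np.sqrt(3.0)
FILTERS = {
    "haar": np.array([1.0, 1.0]) / np.sqrt(2.0),
    "db2":  np.array([1 + SQRT3, 3 + SQRT3, 3 - SQRT3, 1 - SQRT3]) / (4 * np.sqrt(2.0)),
}

def dwt_periodic(x, h, levels=3):
    """Orthogonal DWT by circular convolution with scaling filter h;
    the wavelet filter is the quadrature mirror of h."""
    h = np.asarray(h)
    g = ((-1) ** np.arange(len(h))) * h[::-1]
    coeffs, a = [], np.asarray(x, dtype=float)
    for _ in range(levels):
        n = len(a) // 2
        idx = (2 * np.arange(n)[:, None] + np.arange(len(h))) % len(a)
        coeffs.append(a[idx] @ g)      # detail band
        a = a[idx] @ h                 # approximation band
    coeffs.append(a)
    return np.concatenate(coeffs)

def prd_at_ratio(coeffs, keep_ratio):
    """PRD from zeroing the smallest coefficients: for an orthogonal
    transform, the discarded coefficient energy IS the error energy."""
    c = np.sort(np.abs(coeffs))[::-1]
    k = max(1, int(len(c) * keep_ratio))
    return 100.0 * np.sqrt(np.sum(c[k:] ** 2) / np.sum(c ** 2))

def best_wavelet(x, keep_ratio=0.1):
    """Pick the candidate mother wavelet giving minimal PRD at fixed ratio."""
    scores = {name: prd_at_ratio(dwt_periodic(x, h), keep_ratio)
              for name, h in FILTERS.items()}
    return min(scores, key=scores.get), scores
```

The paper optimizes over a continuous parameterization of the scaling filter rather than a fixed dictionary, but the selection criterion — minimal decoded distortion at a fixed compression ratio — is the same.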

148 citations


Cites methods from "Wavelet and wavelet packet compress..."

  • ...Among these methods, wavelet transform has shown promising results in various areas [5], [19]....

    [...]

Journal ArticleDOI
TL;DR: This paper presents a new algorithm for electrocardiogram (ECG) signal compression based on local extreme extraction, adaptive hysteretic filtering and Lempel-Ziv-Welch (LZW) coding, which takes into account both the reconstruction errors and the compression ratio.
Abstract: This paper presents a new algorithm for electrocardiogram (ECG) signal compression based on local extreme extraction, adaptive hysteretic filtering and Lempel-Ziv-Welch (LZW) coding. The algorithm has been verified using eight of the most frequent normal and pathological types of cardiac beats and a multi-layer perceptron (MLP) neural network trained with original cardiac patterns and tested with reconstructed ones. Aspects regarding the possibility of using principal component analysis (PCA) for cardiac pattern classification have been investigated as well. A new compression measure called "quality score," which takes into account both the reconstruction errors and the compression ratio, is proposed.
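A quality score that accounts for both reconstruction error and compression ratio can be read as the ratio of the two; this reading and the variable names are assumptions (the paper's exact formula may differ). Alongside it, a minimal LZW encoder of the kind used as the algorithm's coding stage:

```python
def quality_score(original_bits, compressed_bits, prd_percent):
    """Illustrative 'quality score': compression ratio divided by PRD,
    so higher is better. Hypothetical form, not the paper's exact formula."""
    cr = original_bits / compressed_bits
    return cr / prd_percent

def lzw_encode(data):
    """Minimal LZW encoder over a byte string: grow a phrase dictionary
    and emit a code each time the current phrase leaves the dictionary."""
    table = {bytes([i]): i for i in range(256)}
    w, out = b"", []
    for byte in data:
        wc = w + bytes([byte])
        if wc in table:
            w = wc
        else:
            out.append(table[w])
            table[wc] = len(table)
            w = bytes([byte])
    if w:
        out.append(table[w])
    return out
```

LZW exploits the repeated symbol patterns that remain after extrema extraction and hysteretic filtering, which is why it serves as the final lossless stage.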

144 citations


Cites methods from "Wavelet and wavelet packet compress..."

  • ...The methods based on linear transformations use various linear transformations (Fourier, Walsh, Cosine, Karhunen–Loeve, wavelet, etc. [6]–[9]) to code a signal through the most significant coefficients of its representation with respect to a particular basis chosen by means of an error criterion....

    [...]

Journal ArticleDOI
TL;DR: A two-dimensional wavelet-based electrocardiogram (ECG) data compression method that employs a modified set partitioning in hierarchical trees (SPIHT) algorithm; it achieves a high compression ratio with relatively low distortion and is effective for various kinds of ECG morphologies.
Abstract: A two-dimensional (2-D) wavelet-based electrocardiogram (ECG) data compression method is presented that employs a modified set partitioning in hierarchical trees (SPIHT) algorithm. This modified SPIHT algorithm further exploits the redundancy among medium- and high-frequency subbands of the wavelet coefficients, and the proposed 2-D approach exploits the fact that ECG signals generally show redundancy between adjacent beats and between adjacent samples. An ECG signal is cut and aligned to form a 2-D data array, to which the 2-D wavelet transform and the modified SPIHT can then be applied. Records selected from the MIT-BIH arrhythmia database are tested. The experimental results show that the proposed method achieves a high compression ratio with relatively low distortion and is effective for various kinds of ECG morphologies.
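The cut-and-align step can be sketched as follows; the function and its inputs (R-peak locations from any QRS detector, a fixed window width) are illustrative assumptions, not the paper's exact procedure:

```python
import numpy as np

def beats_to_matrix(ecg, r_peaks, width):
    """Cut a window around each R peak and stack the beats as rows of a
    2-D array, exposing beat-to-beat redundancy along the column axis
    for a subsequent separable 2-D wavelet transform."""
    half = width // 2
    rows = []
    for r in r_peaks:
        if r - half >= 0 and r + half <= len(ecg):
            rows.append(ecg[r - half : r + half])
    return np.vstack(rows)
```

A separable 2-D wavelet transform then applies the 1-D transform first along rows (intra-beat, sample-to-sample redundancy) and then along columns (inter-beat redundancy), which is what lets the 2-D approach outperform 1-D coding of the same signal.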

141 citations


Cites background or methods from "Wavelet and wavelet packet compress..."

  • ...Hilton [4] reported the PRD value of 2.94% for the same record and CR, which is considered better than the coders in [1], [4], and [7]....

    [...]

  • ...In the fifth experiment, the proposed method was compared with other wavelet-based algorithms [1], [4], [7], [9]....

    [...]

References
Journal ArticleDOI
TL;DR: In this paper, it is shown that the difference of information between the approximation of a signal at the resolutions 2/sup j+1/ and 2 /sup j/ (where j is an integer) can be extracted by decomposing this signal on a wavelet orthonormal basis of L/sup 2/(R/sup n/), the vector space of measurable, square-integrable n-dimensional functions.
Abstract: Multiresolution representations are effective for analyzing the information content of images. The properties of the operator which approximates a signal at a given resolution were studied. It is shown that the difference of information between the approximation of a signal at the resolutions 2/sup j+1/ and 2/sup j/ (where j is an integer) can be extracted by decomposing this signal on a wavelet orthonormal basis of L/sup 2/(R/sup n/), the vector space of measurable, square-integrable n-dimensional functions. In L/sup 2/(R), a wavelet orthonormal basis is a family of functions which is built by dilating and translating a unique function psi (x). This decomposition defines an orthogonal multiresolution representation called a wavelet representation. It is computed with a pyramidal algorithm based on convolutions with quadrature mirror filters. Wavelet representation lies between the spatial and Fourier domains. For images, the wavelet representation differentiates several spatial orientations. The application of this representation to data compression in image coding, texture discrimination and fractal analysis is discussed.
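The pyramidal algorithm can be sketched for the Haar case, the simplest quadrature mirror filter pair: at each level the detail band carries exactly the information lost between resolutions 2^(j+1) and 2^j, so the original signal is recoverable from the coarsest approximation plus all details.

```python
import numpy as np

def pyramid(x, levels):
    """Mallat's pyramidal decomposition (Haar case): repeatedly split the
    approximation into a coarser approximation and a detail band."""
    details, a = [], np.asarray(x, dtype=float)
    for _ in range(levels):
        d = (a[0::2] - a[1::2]) / np.sqrt(2.0)
        a = (a[0::2] + a[1::2]) / np.sqrt(2.0)
        details.append(d)
    return a, details

def reconstruct(a, details):
    """Invert the pyramid: merge detail bands back, finest last."""
    for d in reversed(details):
        x = np.empty(2 * len(a))
        x[0::2] = (a + d) / np.sqrt(2.0)
        x[1::2] = (a - d) / np.sqrt(2.0)
        a = x
    return a
```

Each level costs O(n) filter operations on a halved signal, so the whole pyramid runs in O(n) total, one of the practical attractions of the wavelet representation.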

20,028 citations

Book
01 May 1992
TL;DR: This book develops the continuous and discrete wavelet transforms, multiresolution analysis, and orthonormal bases of compactly supported wavelets, including their regularity and symmetry properties.
Abstract: Introduction Preliminaries and notation The what, why, and how of wavelets The continuous wavelet transform Discrete wavelet transforms: Frames Time-frequency density and orthonormal bases Orthonormal bases of wavelets and multiresolutional analysis Orthonormal bases of compactly supported wavelets More about the regularity of compactly supported wavelets Symmetry for compactly supported wavelet bases Characterization of functional spaces by means of wavelets Generalizations and tricks for orthonormal wavelet bases References Indexes.

16,073 citations

Journal ArticleDOI
TL;DR: In this article, the regularity of compactly supported wavelets and the symmetry of wavelet bases are discussed, with a focus on orthonormal bases of wavelets rather than the continuous wavelet transform.
Abstract: Introduction Preliminaries and notation The what, why, and how of wavelets The continuous wavelet transform Discrete wavelet transforms: Frames Time-frequency density and orthonormal bases Orthonormal bases of wavelets and multiresolutional analysis Orthonormal bases of compactly supported wavelets More about the regularity of compactly supported wavelets Symmetry for compactly supported wavelet bases Characterization of functional spaces by means of wavelets Generalizations and tricks for orthonormal wavelet bases References Indexes.

14,157 citations

Journal ArticleDOI
Ingrid Daubechies1
TL;DR: This work constructs orthonormal bases of compactly supported wavelets, with arbitrarily high regularity, by reviewing the concept of multiresolution analysis as well as several algorithms in vision decomposition and reconstruction.
Abstract: We construct orthonormal bases of compactly supported wavelets, with arbitrarily high regularity. The order of regularity increases linearly with the support width. We start by reviewing the concept of multiresolution analysis as well as several algorithms in vision decomposition and reconstruction. The construction then follows from a synthesis of these different approaches.
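The compression paper uses the 6-tap member of this family (W6). As a smaller closed-form illustration, the 4-tap Daubechies scaling filter (db2/W4) and the conditions an orthonormal compactly supported wavelet filter must satisfy can be checked numerically; the filter values are standard, the variable names are ours:

```python
import numpy as np

s3 = np.sqrt(3.0)
# 4-tap Daubechies scaling filter (db2); the paper's W6 is the 6-tap analogue
h = np.array([1 + s3, 3 + s3, 3 - s3, 1 - s3]) / (4 * np.sqrt(2.0))
g = np.array([h[3], -h[2], h[1], -h[0]])  # quadrature mirror (wavelet) filter

assert np.isclose(h.sum(), np.sqrt(2.0))          # normalization
assert np.isclose((h ** 2).sum(), 1.0)            # unit energy
assert np.isclose(h[0]*h[2] + h[1]*h[3], 0.0)     # orthogonality of even shifts
assert np.isclose(g.sum(), 0.0)                   # 1st vanishing moment
assert np.isclose((np.arange(4) * g).sum(), 0.0)  # 2nd vanishing moment
```

The vanishing moments are what make smooth signal segments produce near-zero detail coefficients, and regularity grows linearly with support width, which is the trade-off behind choosing a 6-tap filter for ECG data.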

8,588 citations


"Wavelet and wavelet packet compress..." refers methods in this paper

  • ...In the work described in this paper, was chosen to be Daubechies' W6 wavelet [10], which is illustrated in Figure 1....

    [...]

Journal ArticleDOI
TL;DR: The image coding results, calculated from actual file sizes and images reconstructed by the decoding algorithm, are either comparable to or surpass previous results obtained through much more sophisticated and computationally complex methods.
Abstract: Embedded zerotree wavelet (EZW) coding, introduced by Shapiro (see IEEE Trans. Signal Processing, vol.41, no.12, p.3445, 1993), is a very effective and computationally simple technique for image compression. We offer an alternative explanation of the principles of its operation, so that the reasons for its excellent performance can be better understood. These principles are partial ordering by magnitude with a set partitioning sorting algorithm, ordered bit plane transmission, and exploitation of self-similarity across different scales of an image wavelet transform. Moreover, we present a new and different implementation based on set partitioning in hierarchical trees (SPIHT), which provides even better performance than our previously reported extension of EZW that surpassed the performance of the original EZW. The image coding results, calculated from actual file sizes and images reconstructed by the decoding algorithm, are either comparable to or surpass previous results obtained through much more sophisticated and computationally complex methods. In addition, the new coding and decoding procedures are extremely fast, and they can be made even faster, with only small loss in performance, by omitting entropy coding of the bit stream by the arithmetic code.
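The magnitude-ordering and bit-plane ideas can be sketched as follows; this toy decoder-side reconstruction omits the zerotree/set-partitioning bookkeeping that makes EZW and SPIHT efficient, and is only a sketch of the embedding principle:

```python
import numpy as np

def bitplane_reconstruct(coeffs, passes=4):
    """Ordered bit-plane approximation of a coefficient array: each
    coefficient becomes significant at the plane matching its magnitude
    and is then refined plane by plane, so truncating after any pass
    still yields the best approximation for that budget (an embedded code)."""
    c = np.asarray(coeffs, dtype=float)
    T = 2.0 ** np.floor(np.log2(np.max(np.abs(c))))
    recon = np.zeros_like(c)
    for _ in range(passes):
        newly = (np.abs(c) >= T) & (recon == 0.0)        # significance pass
        recon[newly] = np.sign(c[newly]) * 1.5 * T
        refine = (recon != 0.0) & ~newly                 # refinement pass
        step = np.where(np.abs(c[refine]) >= np.abs(recon[refine]),
                        0.5 * T, -0.5 * T)
        recon[refine] += np.sign(recon[refine]) * step
        T /= 2.0
    return recon
```

Each extra pass halves the uncertainty interval of every significant coefficient, which is why the bit stream can be cut at any point to meet an exact bit budget, the property both EZW and SPIHT exploit.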

5,890 citations


Additional excerpts

  • ...algorithm was inspired by that in [28]....

    [...]