Journal ArticleDOI

Wavelet and wavelet packet compression of electrocardiograms

01 May 1997-IEEE Transactions on Biomedical Engineering (IEEE Trans Biomed Eng)-Vol. 44, Iss: 5, pp 394-402
TL;DR: Pilot data from a blind evaluation of compressed ECG's by cardiologists suggest that the clinically useful information present in original ECG signals is preserved by 8:1 compression, and in most cases 16:1 compressed ECGs are clinically useful.
Abstract: Wavelets and wavelet packets have recently emerged as powerful tools for signal compression. Wavelet and wavelet packet-based compression algorithms based on embedded zerotree wavelet (EZW) coding are developed for electrocardiogram (ECG) signals, and eight different wavelets are evaluated for their ability to compress Holter ECG data. Pilot data from a blind evaluation of compressed ECG's by cardiologists suggest that the clinically useful information present in original ECG signals is preserved by 8:1 compression, and in most cases 16:1 compressed ECG's are clinically useful.
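The transform-and-threshold principle behind such coders (not the paper's EZW coder itself) can be sketched with a one-level Haar transform; the signal values below are synthetic, not Holter data:

```python
import math

def haar_forward(x):
    # Pairwise sums and differences, scaled for orthonormality.
    a = [(x[2*i] + x[2*i+1]) / math.sqrt(2) for i in range(len(x) // 2)]
    d = [(x[2*i] - x[2*i+1]) / math.sqrt(2) for i in range(len(x) // 2)]
    return a, d

def haar_inverse(a, d):
    x = []
    for ai, di in zip(a, d):
        x.append((ai + di) / math.sqrt(2))
        x.append((ai - di) / math.sqrt(2))
    return x

def compress(x, threshold):
    # Zero out small detail coefficients; the surviving coefficients
    # are what an entropy coder such as EZW would have to transmit.
    a, d = haar_forward(x)
    d = [c if abs(c) >= threshold else 0.0 for c in d]
    kept = sum(1 for c in a + d if c != 0.0)
    return haar_inverse(a, d), kept

signal = [0.0, 0.1, 0.9, 1.0, 0.2, 0.1, 0.0, 0.0]  # toy "beat", even length
reconstructed, kept = compress(signal, threshold=0.05)
```

With the threshold at zero the transform is perfectly invertible; raising it trades reconstruction error for a sparser (more compressible) coefficient set.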
Citations
Book ChapterDOI
01 Jan 2006
TL;DR: The optimization of the parameters of a uniform scalar dead zone quantizer used in a wavelet-based ECG data compression scheme is presented and experiment results show that the optimized quantizer produces improved compression performance.
Abstract: The optimization of the parameters of a uniform scalar dead zone quantizer used in a wavelet-based ECG data compression scheme is presented. Two quantization parameters: a threshold T and a step size Δ are optimized for a target bit rate through the particle swarm optimization algorithm. Experiment results on several records from the MIT-BIH arrhythmia database show that the optimized quantizer produces improved compression performance.
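A minimal sketch of a uniform scalar dead-zone quantizer with the two parameters the chapter optimizes (threshold T and step size Δ); the midpoint reconstruction rule and the values used here are illustrative assumptions, not the PSO-optimized settings:

```python
def deadzone_quantize(c, T, delta):
    # Coefficients with |c| < T fall in the dead zone and map to 0;
    # the rest are uniformly quantized with step size delta.
    if abs(c) < T:
        return 0
    sign = 1 if c > 0 else -1
    return sign * int((abs(c) - T) / delta + 1)

def deadzone_dequantize(q, T, delta):
    # Reconstruct at the midpoint of the quantization bin.
    if q == 0:
        return 0.0
    sign = 1 if q > 0 else -1
    return sign * (T + (abs(q) - 0.5) * delta)
```

Widening T or Δ lowers the bit rate at the price of distortion, which is exactly the trade-off a target-bit-rate optimizer has to navigate.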

8 citations


Cites background or methods from "Wavelet and wavelet packet compress..."

  • ...However, the width of the dead zone was either determined by a threshold (in [3], [4]) or fixed to 2 times the width of the other quantization bins (in [1], [2])....

    [...]

  • ...In [1], [2], both the embedded zerotree wavelet (EZW) and the set partitioning in hierarchical trees (SPIHT) algorithms, which have shown very good results in image coding, were applied to ECG signals....

    [...]

Journal ArticleDOI
TL;DR: In this paper, the authors propose a block sparse M-lEC (BlS M-lEC) method that exploits between-lead correlations to compress the signals more efficiently.
Abstract: Multi-lead ECG compression (M-lEC) has attracted tremendous attention in long-term monitoring of the patient's heart behaviour. This study proposes a method denoted by block sparse M-lEC (BlS M-lEC) in order to exploit between-lead correlations to compress the signals in a more efficient way. This is because multi-lead electrocardiography signals are multiple observations of the same source (the heart) from different locations; consequently, they are highly correlated in terms of the support sets of their sparse models, which leads them to share a dominant common structure. To obtain the block sparse model, the collaborative version of the lasso estimator is applied. In addition, it is shown that the raised cosine kernel has advantages over the conventional Gaussian and wavelet (Daubechies family) kernels due to its specific properties. It is demonstrated that using the raised cosine kernel in constructing the sparsifying basis matrix gives a sparser model, which results in a higher compression ratio and lower reconstruction error. The simulation results show average improvements of 37%, 88%, and 90-97% for BlS M-lEC compared with the non-collaborative case with the raised cosine kernel, the Gaussian kernel, and the collaborative case with Daubechies wavelet kernels, respectively, in terms of reconstruction error at a fixed compression ratio.
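The raised cosine kernel mentioned above can be sketched with the standard raised-cosine pulse; the roll-off factor beta, the period T, and this exact parametrization are assumptions for illustration, not the study's settings:

```python
import math

def raised_cosine(t, T=1.0, beta=0.5):
    # Standard raised-cosine pulse: sinc(t/T) shaped by a cosine
    # roll-off controlled by beta in [0, 1].
    if abs(t) < 1e-12:
        return 1.0
    # Removable singularity of the denominator at |t| = T / (2*beta).
    if beta > 0 and abs(abs(t) - T / (2 * beta)) < 1e-12:
        x = math.pi / (2 * beta)
        return (math.pi / 4) * math.sin(x) / x
    s = math.sin(math.pi * t / T) / (math.pi * t / T)  # sinc term
    return s * math.cos(math.pi * beta * t / T) / (1 - (2 * beta * t / T) ** 2)
```

Shifted copies of such a pulse can serve as columns of a sparsifying basis matrix; its smooth, rapidly decaying shape is what makes it attractive as a kernel.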

8 citations

Proceedings ArticleDOI
26 Oct 2011
TL;DR: An algorithm for ECG signal compression, based on the combination of run-length encoding and the discrete wavelet transform, intended for simulated transmission via the IEEE 802.11b WLAN channel, is presented in this work.
Abstract: An algorithm for ECG signal compression, based on the combination of run-length encoding and the discrete wavelet transform, intended for simulated transmission via the IEEE 802.11b WLAN channel, is presented in this work. The algorithm consists of two basic phases: ECG signal compression and transmission via the IEEE 802.11b WLAN channel. The algorithm is based on applying run-length coding to the thresholded discrete wavelet transform of the real ECG signal. In terms of compression efficiency, applying the compression procedure to several ECG records presenting diverse cardiac statuses, selected from the MIT-BIH arrhythmia database, achieves a compression ratio of around 10:1, a normalized root mean squared error (NRMSE) of 4%, and a mean ± standard deviation of the difference between the restituted ECG signal and the original one of around 3×10⁻⁶ ± 0.03. The end point of this work is to simulate transmission of the compressed ECG signal via the IEEE 802.11b WLAN channel. The unavoidable distortion introduced by the transmission channel reduces the compression ratio to about 6.7:1 as the price of preserving the ECG signal fidelity.
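The run-length step of such a scheme can be sketched as follows; the coefficient values are illustrative, not the output of an actual DWT:

```python
def rle_encode(values):
    # Encode a sequence as (value, run_length) pairs; thresholded
    # wavelet coefficients contain long zero runs, which is what
    # makes this step pay off.
    if not values:
        return []
    out = []
    current, count = values[0], 1
    for v in values[1:]:
        if v == current:
            count += 1
        else:
            out.append((current, count))
            current, count = v, 1
    out.append((current, count))
    return out

def rle_decode(pairs):
    return [v for v, n in pairs for _ in range(n)]

coeffs = [0, 0, 0, 5, 0, 0, 0, 0, -3, 0]  # toy thresholded coefficients
encoded = rle_encode(coeffs)
```

The encoding is lossless; all the loss in such a scheme comes from the thresholding stage and, here, from the transmission channel.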

8 citations


Cites methods from "Wavelet and wavelet packet compress..."

  • ...[17] R. S. H. Istepanian, L. J. Hadjileontiadis, and S. M. Panas, "ECG data compression using wavelets and higher order statistics methods," IEEE Transactions on Information Technology in Biomedicine, vol. 5, no. 2, pp. 108-115, June 2001. [18] M. Hilton, "Wavelet and wavelet packet compression of electrocardiograms," IEEE Transactions on Biomedical Engineering, vol. 44, no. 5, pp. 394-402, May 1997....

    [...]

Journal ArticleDOI
TL;DR: In this paper, a lightweight blockchain-based and fog-enabled remote patient monitoring system is proposed to provide a high level of security and efficient response time in remote healthcare applications, and the proposed lightweight blockchain architecture is shown to fit resource-constrained IoT devices well and to be secure against attacks.
Abstract: IoT has enabled the rapid growth of smart remote healthcare applications. These IoT-based remote healthcare applications deliver fast and preventive medical services to patients at risk or with chronic diseases. However, ensuring data security and patient privacy while exchanging sensitive medical data among medical IoT devices is still a significant concern in remote healthcare applications. Altered or corrupted medical data may cause wrong treatment and create grave health issues for patients. Moreover, current remote medical applications' efficiency and response time need to be addressed and improved. Considering the need for secure and efficient patient care, this paper proposes a lightweight Blockchain-based and Fog-enabled remote patient monitoring system that provides a high level of security and efficient response time. Simulation results and security analysis show that the proposed lightweight blockchain architecture fits the resource-constrained IoT devices well and is secure against attacks. Moreover, the augmentation of Fog computing improved the responsiveness of the remote patient monitoring system by 40%.

8 citations

Proceedings ArticleDOI
14 Oct 2008
TL;DR: An improved wavelet-based 2-D ECG compression method is presented which employs the set partitioning in hierarchical trees (SPIHT) algorithm and run-length (RL) coding; results show that the biorthogonal-6.8 wavelet with five levels of decomposition has better performance than the others.
Abstract: An improved wavelet-based 2-D ECG compression method is presented which employs the set partitioning in hierarchical trees (SPIHT) algorithm and run-length (RL) coding. The proposed 2-D approach utilizes the fact that the ECG signal shows redundancy between adjacent beats and also between adjacent samples. The results of several experiments show that the biorthogonal-6.8 wavelet function with five levels of decomposition has better performance than the others. In period normalization, repeating each beat instead of zero-padding is more efficient. The initialization of the list of insignificant pixels (LIP) is also done in a different way. Results of applying the proposed algorithm to several records of the MIT/BIH database show a lower percent root mean square difference (PRD) than other 1-D and several 2-D methods at the same compression ratio.
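One common definition of the PRD fidelity measure referred to above can be sketched as follows (some variants subtract the signal mean in the denominator; the values here are toy numbers, not MIT/BIH records):

```python
import math

def prd(original, reconstructed):
    # Percent root-mean-square difference: reconstruction error
    # energy relative to the original signal energy, as a percentage.
    num = sum((o - r) ** 2 for o, r in zip(original, reconstructed))
    den = sum(o ** 2 for o in original)
    return 100.0 * math.sqrt(num / den)
```

Comparing methods "at the same compression ratio" then amounts to fixing the bit budget and reporting which coder yields the smaller PRD.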

8 citations


Cites methods from "Wavelet and wavelet packet compress..."

  • ...In digital signal processing the fast forward and inverse wavelet transforms are implemented as tree-structured, perfect-reconstruction filter banks [10]....

    [...]

References
Journal ArticleDOI
TL;DR: In this paper, it is shown that the difference of information between the approximation of a signal at the resolutions 2^(j+1) and 2^j (where j is an integer) can be extracted by decomposing this signal on a wavelet orthonormal basis of L²(Rⁿ), the vector space of measurable, square-integrable n-dimensional functions.
Abstract: Multiresolution representations are effective for analyzing the information content of images. The properties of the operator which approximates a signal at a given resolution were studied. It is shown that the difference of information between the approximation of a signal at the resolutions 2^(j+1) and 2^j (where j is an integer) can be extracted by decomposing this signal on a wavelet orthonormal basis of L²(Rⁿ), the vector space of measurable, square-integrable n-dimensional functions. In L²(R), a wavelet orthonormal basis is a family of functions which is built by dilating and translating a unique function ψ(x). This decomposition defines an orthogonal multiresolution representation called a wavelet representation. It is computed with a pyramidal algorithm based on convolutions with quadrature mirror filters. The wavelet representation lies between the spatial and Fourier domains. For images, the wavelet representation differentiates several spatial orientations. The application of this representation to data compression in image coding, texture discrimination and fractal analysis is discussed.
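The pyramidal algorithm described above can be sketched with Haar quadrature mirror filters, one decomposition level per resolution step (the Haar pair stands in for a general QMF pair):

```python
import math

def haar_step(x):
    # One analysis step: lowpass/highpass filtering followed by
    # downsampling by 2, using orthonormal Haar filters.
    r2 = math.sqrt(2)
    approx = [(x[2*i] + x[2*i+1]) / r2 for i in range(len(x) // 2)]
    detail = [(x[2*i] - x[2*i+1]) / r2 for i in range(len(x) // 2)]
    return approx, detail

def pyramid(x, levels):
    # Returns [detail_1, detail_2, ..., coarsest approximation]:
    # each pass re-decomposes the approximation band, extracting the
    # information lost between resolutions 2^(j+1) and 2^j.
    bands = []
    approx = x
    for _ in range(levels):
        approx, detail = haar_step(approx)
        bands.append(detail)
    bands.append(approx)
    return bands

bands = pyramid([4.0, 2.0, 6.0, 8.0, 1.0, 3.0, 5.0, 7.0], levels=3)
```

Because the transform is orthonormal, the total coefficient count matches the signal length and the signal energy is preserved across the bands.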

20,028 citations

Book
01 May 1992
TL;DR: This book presents the theory of wavelets, covering the continuous wavelet transform, discrete wavelet transforms and frames, time-frequency density, multiresolution analysis, and orthonormal bases of compactly supported wavelets together with their regularity and symmetry.
Abstract: Introduction Preliminaries and notation The what, why, and how of wavelets The continuous wavelet transform Discrete wavelet transforms: Frames Time-frequency density and orthonormal bases Orthonormal bases of wavelets and multiresolutional analysis Orthonormal bases of compactly supported wavelets More about the regularity of compactly supported wavelets Symmetry for compactly supported wavelet bases Characterization of functional spaces by means of wavelets Generalizations and tricks for orthonormal wavelet bases References Indexes.

16,073 citations

Journal ArticleDOI
TL;DR: In this article, the regularity of compactly supported wavelets and the symmetry of wavelet bases are discussed, with the focus on orthonormal bases of wavelets rather than the continuous wavelet transform.
Abstract: Introduction Preliminaries and notation The what, why, and how of wavelets The continuous wavelet transform Discrete wavelet transforms: Frames Time-frequency density and orthonormal bases Orthonormal bases of wavelets and multiresolutional analysis Orthonormal bases of compactly supported wavelets More about the regularity of compactly supported wavelets Symmetry for compactly supported wavelet bases Characterization of functional spaces by means of wavelets Generalizations and tricks for orthonormal wavelet bases References Indexes.

14,157 citations

Journal ArticleDOI
Ingrid Daubechies1
TL;DR: This work constructs orthonormal bases of compactly supported wavelets, with arbitrarily high regularity, by reviewing the concept of multiresolution analysis as well as several algorithms in vision decomposition and reconstruction.
Abstract: We construct orthonormal bases of compactly supported wavelets, with arbitrarily high regularity. The order of regularity increases linearly with the support width. We start by reviewing the concept of multiresolution analysis as well as several algorithms in vision decomposition and reconstruction. The construction then follows from a synthesis of these different approaches.
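Two defining properties of such compactly supported filters can be checked numerically with the 4-tap Daubechies D4 coefficients: unit energy, and vanishing moments (the highpass filter annihilates constants and linear ramps). The alternating-flip highpass construction below is standard, but stated here as an assumption rather than taken from this paper:

```python
import math

s3 = math.sqrt(3)
r2 = math.sqrt(2)
# Daubechies D4 lowpass (scaling) filter coefficients.
h = [(1 + s3) / (4 * r2), (3 + s3) / (4 * r2),
     (3 - s3) / (4 * r2), (1 - s3) / (4 * r2)]
# Highpass (wavelet) filter via the alternating-flip construction.
g = [h[3], -h[2], h[1], -h[0]]

energy = sum(c * c for c in h)             # orthonormality: should be 1
moment0 = sum(g)                           # kills constants: should be 0
moment1 = sum(k * g[k] for k in range(4))  # kills linears: should be 0
```

The two vanishing moments are what let D4 represent locally smooth signal segments with near-zero detail coefficients, the property compression schemes exploit.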

8,588 citations


"Wavelet and wavelet packet compress..." refers methods in this paper

  • ...In the work described in this paper, was chosen to be Daubechies' W6 wavelet [10], which is illustrated in Figure 1....

    [...]

Journal ArticleDOI
TL;DR: The image coding results, calculated from actual file sizes and images reconstructed by the decoding algorithm, are either comparable to or surpass previous results obtained through much more sophisticated and computationally complex methods.
Abstract: Embedded zerotree wavelet (EZW) coding, introduced by Shapiro (see IEEE Trans. Signal Processing, vol.41, no.12, p.3445, 1993), is a very effective and computationally simple technique for image compression. We offer an alternative explanation of the principles of its operation, so that the reasons for its excellent performance can be better understood. These principles are partial ordering by magnitude with a set partitioning sorting algorithm, ordered bit plane transmission, and exploitation of self-similarity across different scales of an image wavelet transform. Moreover, we present a new and different implementation based on set partitioning in hierarchical trees (SPIHT), which provides even better performance than our previously reported extension of EZW that surpassed the performance of the original EZW. The image coding results, calculated from actual file sizes and images reconstructed by the decoding algorithm, are either comparable to or surpass previous results obtained through much more sophisticated and computationally complex methods. In addition, the new coding and decoding procedures are extremely fast, and they can be made even faster, with only small loss in performance, by omitting entropy coding of the bit stream by the arithmetic code.
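The embedded, magnitude-ordered idea described above can be sketched as follows; this is a toy significance scan over integer coefficients, not the SPIHT set-partitioning algorithm itself:

```python
def significance_passes(coeffs):
    # For thresholds 2^n, 2^(n-1), ..., 1, report which coefficients
    # first become significant (|c| >= threshold) at each pass.
    # Larger magnitudes surface earlier, so truncating the resulting
    # stream anywhere still yields the best coefficients seen so far.
    n = max(abs(c) for c in coeffs).bit_length() - 1
    found = set()
    passes = []
    for t in [2 ** k for k in range(n, -1, -1)]:
        new = [i for i, c in enumerate(coeffs)
               if abs(c) >= t and i not in found]
        found.update(new)
        passes.append((t, new))
    return passes

passes = significance_passes([9, -2, 5, 1, 0, -7])
```

This partial ordering by magnitude is the first of the three principles listed above; EZW and SPIHT additionally encode the positions cheaply by exploiting cross-scale self-similarity.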

5,890 citations


Additional excerpts

  • ...algorithm was inspired by that in [28]....

    [...]