Journal ArticleDOI

Wavelet and wavelet packet compression of electrocardiograms

01 May 1997-IEEE Transactions on Biomedical Engineering (IEEE Trans Biomed Eng)-Vol. 44, Iss: 5, pp 394-402
TL;DR: Pilot data from a blind evaluation of compressed ECGs by cardiologists suggest that the clinically useful information present in original ECG signals is preserved by 8:1 compression, and in most cases 16:1 compressed ECGs are clinically useful.
Abstract: Wavelets and wavelet packets have recently emerged as powerful tools for signal compression. Wavelet and wavelet packet-based compression algorithms based on embedded zerotree wavelet (EZW) coding are developed for electrocardiogram (ECG) signals, and eight different wavelets are evaluated for their ability to compress Holter ECG data. Pilot data from a blind evaluation of compressed ECGs by cardiologists suggest that the clinically useful information present in original ECG signals is preserved by 8:1 compression, and in most cases 16:1 compressed ECGs are clinically useful.
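For readers who want a feel for transform-based compression of this kind, the toy sketch below applies a one-level Haar wavelet transform and zeroes small detail coefficients before reconstructing. It is a hypothetical, much-simplified stand-in for the paper's actual pipeline (Daubechies W6 filters with embedded zerotree coding); all function names are illustrative.

```python
def haar_dwt(x):
    """One level of the orthonormal Haar wavelet transform."""
    s = 0.5 ** 0.5
    approx = [s * (x[i] + x[i + 1]) for i in range(0, len(x), 2)]
    detail = [s * (x[i] - x[i + 1]) for i in range(0, len(x), 2)]
    return approx, detail

def haar_idwt(approx, detail):
    """Inverse of haar_dwt."""
    s = 0.5 ** 0.5
    x = []
    for a, d in zip(approx, detail):
        x.append(s * (a + d))
        x.append(s * (a - d))
    return x

def compress_reconstruct(x, threshold):
    """Zero out small detail coefficients, then reconstruct.

    Discarded coefficients are what a real coder would not spend
    bits on; the threshold controls the fidelity/ratio trade-off.
    """
    approx, detail = haar_dwt(x)
    detail = [d if abs(d) >= threshold else 0.0 for d in detail]
    return haar_idwt(approx, detail)
```

With a threshold of zero the transform is perfectly invertible; raising the threshold discards detail and mimics lossy compression.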
Citations
Proceedings ArticleDOI
01 Dec 2019
TL;DR: This work demonstrates how one can leverage the correlation across several related time series streams to both drastically improve the compression efficiency and reduce the accuracy loss, and introduces a method to threshold the information loss of the compression.
Abstract: Time series streams are ubiquitous in many application domains, e.g., transportation, network monitoring, autonomous vehicles, or the Internet of Things (IoT). Transmitting and storing large amounts of such fine-grained data is however expensive, which makes compression schemes necessary in practice. Time series streams that are transmitted together often share properties or evolve together, making them significantly correlated. Despite the rich literature on compression methods, state-of-the-art approaches typically do not exploit correlation information when compressing time series. In this work, we demonstrate how one can leverage the correlation across several related time series streams to both drastically improve the compression efficiency and reduce the accuracy loss. We present a novel compression algorithm for time series streams called CORAD (CORelation-Aware compression of time series streams based on sparse Dictionary coding). Based on sparse dictionary learning, CORAD has the unique ability to exploit the correlation across multiple related time series to eliminate redundancy and perform a more efficient compression. To ensure the accuracy of the compressed time series, we further introduce a method to threshold the information loss of the compression. Extensive validation on real-world datasets shows that CORAD drastically outperforms state-of-the-art approaches, achieving up to 40:1 compression ratios while minimizing the information loss.

13 citations

Journal ArticleDOI
TL;DR: This paper presents a combined wavelet and a modified runlength encoding scheme for the compression of electrocardiogram (ECG) signals that has been tested using ECG signals obtained from the MIT-BIH Compression Database.
Abstract: This paper presents a combined wavelet and a modified run-length encoding scheme for the compression of electrocardiogram (ECG) signals. First, a discrete wavelet transform is applied to the ECG signal. The resulting coefficients are classified into significant and insignificant ones based on the required PRD (percent root mean square difference). Second, both sets of coefficients are encoded using a modified run-length coding method. The scheme has been tested using ECG signals obtained from the MIT-BIH Compression Database. A compression ratio of 20:1 (equivalent to 150 bits per second) is achieved with a PRD of less than 10%.
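The PRD used as the fidelity criterion above has the standard definition PRD = 100 · sqrt(Σ(x − x̂)² / Σx²). A minimal helper for computing it (the function name is ours, not from the paper):

```python
import math

def prd(original, reconstructed):
    """Percent root-mean-square difference between two signals."""
    num = sum((o - r) ** 2 for o, r in zip(original, reconstructed))
    den = sum(o ** 2 for o in original)
    return 100.0 * math.sqrt(num / den)
```

A perfect reconstruction gives PRD = 0; a reconstruction of all zeros gives PRD = 100.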

13 citations


Cites methods or result from "Wavelet and wavelet packet compress..."

  • ...It should be mentioned that wavelet 2 in this paper is wavelet 4 in [3] which is the same one used in the FBI fingerprint image coding standard....

    [...]

  • ...We noticed that these results outperformed the results shown in figure 6 in [3] and figure 2 in [4]....

    [...]

  • ...The transform-based technique removes redundancy by the application of some kind of transform such as one of the following: the cosine transform, the Walsh transform and the Karhunen-Loève transform [2, 3]....

    [...]

  • ...The proposed scheme gives better performance than the schemes in [3, 4]....

    [...]

  • ...Simulation runs show that this method performs better than the algorithms proposed by [3, 4]....

    [...]

01 Jan 2009
TL;DR: An improved wavelet compression algorithm for electrocardiogram (ECG) signals that combines the lifting wavelet transform (WT) with dynamic multi-stage vector quantization (MS-VQ); preliminary results indicate that the proposed method excels over previous techniques for high-fidelity compression.
Abstract: In this paper, an improved wavelet compression algorithm for electrocardiogram (ECG) signals is proposed that combines the lifting wavelet transform (WT) with dynamic multi-stage vector quantization (MS-VQ). The lifting wavelet transform coefficients, taken in hierarchical tree order, form the components of a vector called a tree vector (TV). Because the wavelet coefficients give emphasis to the approximation coefficients, the components of the target vector for the two VQ stages are extracted differently from the different WT sub-bands. In the first stage, a 32-dimensional TV for crude quantization is extracted in hierarchical tree order from all WT sub-bands except the last; in the second stage, 64-dimensional code vectors are drawn from all WT sub-bands. Each codebook is adaptively updated by a distortion-constrained codebook replenishment (DCCR) mechanism. The combination of the lifting WT and dynamic multi-stage VQ retains the feature integrity of the ECG at high compression ratios. Preliminary results indicate that the proposed method excels over previous techniques for high-fidelity compression.
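The lifting formulation of the WT mentioned above factors the transform into in-place split/predict/update steps instead of convolutions. A minimal sketch for the simplest (Haar) case, not the specific lifting filters used by the authors:

```python
def lifting_haar_forward(x):
    """Haar transform via lifting: split into even/odd, predict, update."""
    even = x[0::2]
    odd = x[1::2]
    detail = [o - e for o, e in zip(odd, even)]           # predict step
    approx = [e + d / 2.0 for e, d in zip(even, detail)]  # update step
    return approx, detail

def lifting_haar_inverse(approx, detail):
    """Undo the update and predict steps, then interleave."""
    even = [a - d / 2.0 for a, d in zip(approx, detail)]
    odd = [d + e for d, e in zip(detail, even)]
    out = []
    for e, o in zip(even, odd):
        out.extend([e, o])
    return out
```

Here each approximation coefficient is the mean of a sample pair and each detail coefficient is the pairwise difference, and the inverse reverses the steps exactly, which is what makes lifting attractive for low-cost implementations.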

13 citations

Proceedings ArticleDOI
03 Aug 2008
TL;DR: A novel customer demand forecasting model for e-business enterprises based on least squares support vector machines (LS-SVM), which shows outstanding performance against linear, RBF, and BP neural network predictors in both simulated and practical results.
Abstract: This paper introduces a novel customer demand forecasting model based on least squares support vector machines (LS-SVM) for e-business enterprises. First, the paper presents the current state of e-business and discusses some factors that hinder its advance in China. Then, some common forecasting techniques are briefly reviewed together with their shortcomings. To address these shortcomings, the paper reviews the fundamental theory of least squares support vector machines for regression and analyses its merits. Finally, based on this theory, the paper proposes a model to forecast one week of pure-water demand for an e-business website. Compared with linear, RBF, and BP neural network predictors, the LS-SVM forecasting model shows outstanding performance in both simulated and practical results.
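A key practical point of LS-SVM regression is that training reduces to solving a single linear system in the dual variables rather than a quadratic program. The sketch below follows common textbook conventions (RBF kernel, dual system [[0, 1ᵀ], [1, K + I/γ]]·[b; α] = [0; y]); all names and parameter values are ours, not from the paper.

```python
import math

def rbf(a, b, sigma=1.0):
    """Gaussian (RBF) kernel on scalars."""
    return math.exp(-((a - b) ** 2) / (2.0 * sigma ** 2))

def solve(A, rhs):
    """Gaussian elimination with partial pivoting for a dense system."""
    n = len(A)
    M = [row[:] + [rhs[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def lssvm_fit(xs, ys, gamma=100.0, sigma=1.0):
    """Fit LS-SVM regression by solving the dual linear system."""
    n = len(xs)
    A = [[0.0] * (n + 1) for _ in range(n + 1)]
    rhs = [0.0] * (n + 1)
    for i in range(n):
        A[0][i + 1] = 1.0          # top row: sum of alphas = 0
        A[i + 1][0] = 1.0          # bias column
        rhs[i + 1] = ys[i]
        for j in range(n):
            A[i + 1][j + 1] = rbf(xs[i], xs[j], sigma)
        A[i + 1][i + 1] += 1.0 / gamma   # ridge term from the squared-error loss
    sol = solve(A, rhs)
    b, alpha = sol[0], sol[1:]
    def predict(x):
        return b + sum(a * rbf(x, xi, sigma) for a, xi in zip(alpha, xs))
    return predict, alpha, b
```

The regularization constant gamma trades training error against smoothness; prediction is f(x) = b + Σ αᵢ K(x, xᵢ).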

13 citations


Additional excerpts

  • ...[14] M. Hilton (1997)....

    [...]

Journal ArticleDOI
TL;DR: Analysis of the impact of a cognitive task during gait on cerebral blood flow velocity, blood flow signal features, and the correlation of gait and blood flow features through a dual-task methodology shows that cognitive processing has an impact on cerebral activity during walking.
Abstract: Gait is a complex process involving both cognitive and sensory ability and is strongly impacted by the environment. In this paper, we propose to study the impact of a cognitive task during gait on cerebral blood flow velocity, blood flow signal features, and the correlation of gait and blood flow features through a dual-task methodology. Both cerebral blood flow velocity and gait characteristics of eleven participants with no history of brain or gait conditions were recorded using transcranial Doppler on the mid-cerebral artery while on a treadmill. The cognitive task was induced by backward counting from 10,000 in decrements of 7. Raw and envelope features of the central blood flow velocity were extracted in the time, frequency, and time-scale domains; information-theoretic metrics were also extracted and statistical significance was inspected. A similar feature extraction was performed on the stride interval signal. Statistical differences between the cognitive and baseline trials and between the left and right mid-cerebral artery signals, as well as the impact of anthropometric variables, were studied using linear mixed models. No statistical differences were found between the left and right mid-cerebral artery flows or between the baseline and cognitive-state gait features, while statistical differences for specific features were measured between cognitive and baseline states. These differences between the baseline and cognitive states show that cognitive processing has an impact on cerebral activity during walking. The state was also found to have an impact on the correlation between the gait and blood flow features.

13 citations


Cites methods from "Wavelet and wavelet packet compress..."

  • ...(where a10 is the approximation coefficient and dk represents detail coefficient at the kth level [58]) using a discrete wavelet transform approach and the Meyer wavelet as a mother wavelet....

    [...]

References
Journal ArticleDOI
TL;DR: In this paper, it is shown that the difference of information between the approximation of a signal at the resolutions 2/sup j+1/ and 2 /sup j/ (where j is an integer) can be extracted by decomposing this signal on a wavelet orthonormal basis of L/sup 2/(R/sup n/), the vector space of measurable, square-integrable n-dimensional functions.
Abstract: Multiresolution representations are effective for analyzing the information content of images. The properties of the operator which approximates a signal at a given resolution were studied. It is shown that the difference of information between the approximation of a signal at the resolutions 2^(j+1) and 2^j (where j is an integer) can be extracted by decomposing this signal on a wavelet orthonormal basis of L^2(R^n), the vector space of measurable, square-integrable n-dimensional functions. In L^2(R), a wavelet orthonormal basis is a family of functions which is built by dilating and translating a unique function psi(x). This decomposition defines an orthogonal multiresolution representation called a wavelet representation. It is computed with a pyramidal algorithm based on convolutions with quadrature mirror filters. The wavelet representation lies between the spatial and Fourier domains. For images, the wavelet representation differentiates several spatial orientations. The application of this representation to data compression in image coding, texture discrimination and fractal analysis is discussed.

20,028 citations

Book
01 May 1992
TL;DR: This book presents a systematic treatment of wavelet theory, covering the continuous and discrete wavelet transforms, frames, time-frequency density, multiresolution analysis, and orthonormal bases of compactly supported wavelets.
Abstract: Introduction Preliminaries and notation The what, why, and how of wavelets The continuous wavelet transform Discrete wavelet transforms: Frames Time-frequency density and orthonormal bases Orthonormal bases of wavelets and multiresolutional analysis Orthonormal bases of compactly supported wavelets More about the regularity of compactly supported wavelets Symmetry for compactly supported wavelet bases Characterization of functional spaces by means of wavelets Generalizations and tricks for orthonormal wavelet bases References Indexes.

16,073 citations

Journal ArticleDOI
TL;DR: This work treats the regularity of compactly supported wavelets and the symmetry of wavelet bases, focusing on orthonormal bases of wavelets rather than the continuous wavelet transform.
Abstract: Introduction Preliminaries and notation The what, why, and how of wavelets The continuous wavelet transform Discrete wavelet transforms: Frames Time-frequency density and orthonormal bases Orthonormal bases of wavelets and multiresolutional analysis Orthonormal bases of compactly supported wavelets More about the regularity of compactly supported wavelets Symmetry for compactly supported wavelet bases Characterization of functional spaces by means of wavelets Generalizations and tricks for orthonormal wavelet bases References Indexes.

14,157 citations

Journal ArticleDOI
Ingrid Daubechies1
TL;DR: This work constructs orthonormal bases of compactly supported wavelets, with arbitrarily high regularity, by reviewing the concept of multiresolution analysis as well as several algorithms in vision decomposition and reconstruction.
Abstract: We construct orthonormal bases of compactly supported wavelets, with arbitrarily high regularity. The order of regularity increases linearly with the support width. We start by reviewing the concept of multiresolution analysis as well as several algorithms in vision decomposition and reconstruction. The construction then follows from a synthesis of these different approaches.

8,588 citations


"Wavelet and wavelet packet compress..." refers methods in this paper

  • ...In the work described in this paper, the wavelet was chosen to be Daubechies' W6 wavelet [10], which is illustrated in Figure 1....

    [...]

Journal ArticleDOI
TL;DR: The image coding results, calculated from actual file sizes and images reconstructed by the decoding algorithm, are either comparable to or surpass previous results obtained through much more sophisticated and computationally complex methods.
Abstract: Embedded zerotree wavelet (EZW) coding, introduced by Shapiro (see IEEE Trans. Signal Processing, vol.41, no.12, p.3445, 1993), is a very effective and computationally simple technique for image compression. We offer an alternative explanation of the principles of its operation, so that the reasons for its excellent performance can be better understood. These principles are partial ordering by magnitude with a set partitioning sorting algorithm, ordered bit plane transmission, and exploitation of self-similarity across different scales of an image wavelet transform. Moreover, we present a new and different implementation based on set partitioning in hierarchical trees (SPIHT), which provides even better performance than our previously reported extension of EZW that surpassed the performance of the original EZW. The image coding results, calculated from actual file sizes and images reconstructed by the decoding algorithm, are either comparable to or surpass previous results obtained through much more sophisticated and computationally complex methods. In addition, the new coding and decoding procedures are extremely fast, and they can be made even faster, with only small loss in performance, by omitting entropy coding of the bit stream by the arithmetic code.
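The "ordered bit plane transmission" principle named above can be illustrated in miniature: transmit coefficient magnitudes one power-of-two plane at a time, so that truncating the stream after any pass still yields a coarser but valid reconstruction. The toy below handles only nonnegative integers and omits signs and the zerotree/set-partitioning structure that gives EZW and SPIHT their real coding gain.

```python
def bitplane_passes(coeffs, passes):
    """Return the decoder's reconstruction after `passes` bit-plane passes.

    Starts from the largest power-of-two threshold not exceeding the
    maximum coefficient, and refines each coefficient plane by plane.
    """
    t = 1
    while t * 2 <= max(coeffs):
        t *= 2                      # largest power-of-two threshold
    recon = [0] * len(coeffs)
    for _ in range(passes):
        for i, c in enumerate(coeffs):
            if c - recon[i] >= t:   # significance test at this plane
                recon[i] += t
        t //= 2
        if t == 0:
            break
    return recon
```

After one pass only the largest coefficient is resolved to its top bit; after enough passes the reconstruction is exact, which is the embedded property that lets a single bitstream serve any target rate.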

5,890 citations


Additional excerpts

  • ...algorithm was inspired by that in [28]....

    [...]