Journal ArticleDOI

Wavelet and wavelet packet compression of electrocardiograms

01 May 1997-IEEE Transactions on Biomedical Engineering (IEEE Trans Biomed Eng)-Vol. 44, Iss: 5, pp 394-402
TL;DR: Pilot data from a blind evaluation of compressed ECGs by cardiologists suggest that the clinically useful information present in original ECG signals is preserved by 8:1 compression, and in most cases 16:1 compressed ECGs are clinically useful.
Abstract: Wavelets and wavelet packets have recently emerged as powerful tools for signal compression. Wavelet and wavelet packet-based compression algorithms based on embedded zerotree wavelet (EZW) coding are developed for electrocardiogram (ECG) signals, and eight different wavelets are evaluated for their ability to compress Holter ECG data. Pilot data from a blind evaluation of compressed ECGs by cardiologists suggest that the clinically useful information present in original ECG signals is preserved by 8:1 compression, and in most cases 16:1 compressed ECGs are clinically useful.
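The coder in this paper is EZW-based; as a rough illustration of the underlying idea (discarding most wavelet coefficients of an ECG and reconstructing from the rest), here is a minimal threshold-and-reconstruct sketch using PyWavelets. The synthetic test signal, the 'db6' wavelet, and the 8:1 coefficient-retention rate are illustrative assumptions, not the paper's actual settings.

```python
# Minimal sketch of wavelet compression by coefficient thresholding.
# This is NOT the paper's EZW coder; it only shows that an ECG-like
# signal survives discarding most of its wavelet coefficients.
# Assumes numpy and PyWavelets (pip install numpy PyWavelets).
import numpy as np
import pywt

rng = np.random.default_rng(0)
t = np.linspace(0, 4, 4096)
# Crude ECG-like test signal: periodic spikes + baseline wander + noise.
x = (np.exp(-((t % 1.0 - 0.5) ** 2) / 0.001)
     + 0.1 * np.sin(2 * np.pi * t)
     + 0.01 * rng.standard_normal(t.size))

coeffs = pywt.wavedec(x, 'db6', level=5)            # multilevel DWT
flat, slices = pywt.coeffs_to_array(coeffs)         # flatten for thresholding
keep = flat.size // 8                               # ~8:1 coefficient retention
thresh = np.sort(np.abs(flat))[-keep]
flat[np.abs(flat) < thresh] = 0.0                   # zero the small coefficients
rec = pywt.waverec(pywt.array_to_coeffs(flat, slices, output_format='wavedec'),
                   'db6')[:x.size]

prd = 100 * np.linalg.norm(x - rec) / np.linalg.norm(x)
print(f"kept {keep}/{flat.size} coefficients, PRD = {prd:.2f}%")
```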
Citations
Journal ArticleDOI
05 Nov 2019
TL;DR: The performance of an internet of things (IoT) based, real-time, smart ECG signal compression and transmission protocol is investigated through both simulation and hardware implementation.
Abstract: In this paper, the performance of an internet of things (IoT) based, real-time, smart ECG signal compression and transmission protocol is investigated through both simulation and hardware implementation. The protocol, built on a combined fog and cloud computing architecture, has a four-layered structure. The first layer comprises a wearable ECG sensor with a noise filter embedded in the device. The second layer is an encoder whose algorithm is subdivided into two schemes: a geometry-based method (GBM) and wavelet-based iterative thresholding (WTIT). The algorithm exploits the fact that ECG signals can be approximated by a linear combination of a few coefficients taken from a wavelet basis. The third layer is the wireless transmission medium from the wearable to a private cloud accessed by the hospital servers. The data arrives at the receiver, the final layer, where signal reconstruction is performed using inverse transforms.
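The WTIT scheme itself is not detailed in the abstract; the sketch below only illustrates the generic idea of wavelet-domain iterative thresholding: raise a threshold step by step until a target sparsity is met, then reconstruct. The wavelet name, decomposition level, and target fraction are assumptions.

```python
# Generic wavelet-domain iterative thresholding (NOT the cited paper's
# WTIT): increase the threshold until only a target fraction of the
# coefficients survives, then reconstruct from the sparse set.
import numpy as np
import pywt

def iterative_threshold_compress(x, wavelet='db4', level=4, target_fraction=0.05):
    coeffs = pywt.wavedec(x, wavelet, level=level)
    flat, slices = pywt.coeffs_to_array(coeffs)
    thresh = 0.0
    step = np.max(np.abs(flat)) / 100.0
    # Iterate until the sparsity target is met.
    while np.count_nonzero(np.abs(flat) > thresh) > target_fraction * flat.size:
        thresh += step
    flat[np.abs(flat) <= thresh] = 0.0
    rec = pywt.waverec(pywt.array_to_coeffs(flat, slices, output_format='wavedec'),
                       wavelet)
    return rec[:x.size], flat  # reconstruction and sparse coefficient vector
```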
Proceedings ArticleDOI
27 Jan 2023
TL;DR: In this article, a comparative study of 1D and 2D ECG data compression methods is drawn from the existing literature to provide an update in this regard and to survey the latest algorithms for assessing ECG signal quality and compression efficiency after compression.
Abstract: Assessing ECG signal quality after compression is an important part of the compression process. Compression facilitates signal archiving and transmission and reduces energy consumption; lossy compression approaches, however, must be judged by both compression efficiency and signal quality. This paper surveys objective algorithms for assessing ECG signal quality after compression and the resulting compression efficiency; an extensive review of this kind has been lacking. Electrocardiogram (ECG) data compression plays a significant role in localized digital storage and in efficient communication-channel utilization in telemedicine applications. Depending on the domain, ECG data compression can be performed in one dimension (1D) or two dimensions (2D), exploiting inter-beat correlation alone or inter- and intra-beat correlation together, respectively. In this paper, a comparative study of 1D and 2D ECG data compression methods drawn from the existing literature provides an update in this regard. ECG compression techniques and algorithms in the 1D and 2D domains have their own merits and limitations. Recently, numerous techniques for 1D ECG data compression have been developed, in both the direct and transform domains, and 2D ECG compression based on period normalization and complexity sorting has also been reported.
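For reference, the two metrics this survey leans on have standard definitions, sketched below in Python. One caveat: some papers subtract the signal mean in the PRD denominator (often called PRDN), so conventions should be checked before comparing numbers across studies.

```python
# Standard metric definitions used throughout the ECG-compression literature.
import numpy as np

def prd(x, x_rec):
    """Percentage root-mean-square difference between original and reconstruction."""
    return 100.0 * np.sqrt(np.sum((x - x_rec) ** 2) / np.sum(x ** 2))

def compression_ratio(original_bits, compressed_bits):
    """CR: size of the original bit stream over the compressed one."""
    return original_bits / compressed_bits
```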
Proceedings ArticleDOI
28 Jul 2020
TL;DR: A modified adaptive dictionary framework that utilizes sparsity enhancing pre-processing techniques, viz Kronecker product and signal filtering, which improves the reconstruction accuracy in comparison to more advanced CS based compression techniques is proposed.
Abstract: Achieving effective compression of electrocardiogram (ECG) signals is a vital step for most telecardiography applications. Compressed sensing (CS) provides a low-energy alternative to other compression techniques, making it a subject of extensive research in the field of telemedicine. This paper proposes a modified adaptive dictionary framework that uses sparsity-enhancing pre-processing techniques, viz. the Kronecker product and signal filtering, and improves reconstruction accuracy in comparison with more advanced CS-based compression techniques. The framework uses multiple dictionaries, constructed with dictionary learning (DL) techniques, for signal reconstruction. Compression ratio (CR) and percentage root-mean-square difference (PRD) are the metrics used for comparison with existing CS-based approaches. The training and test records used in the experiments are taken from the MIT-BIH Arrhythmia database. Results indicate that the modified reconstruction scheme provides an 18% reduction in PRD relative to the compared method across all CRs.
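The adaptive-dictionary method above is not reproduced here; the sketch below only illustrates the basic CS pipeline it builds on: sensing a sparse vector with a random matrix and recovering it with orthogonal matching pursuit (scikit-learn's implementation). All dimensions are illustrative assumptions.

```python
# Basic compressed-sensing pipeline (NOT the paper's adaptive-dictionary
# method): sense a k-sparse vector with a random Gaussian matrix and
# recover it with orthogonal matching pursuit.
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(1)
n, m, k = 256, 80, 8                             # length, measurements, sparsity
x = np.zeros(n)
x[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
Phi = rng.standard_normal((m, n)) / np.sqrt(m)   # sensing matrix
y = Phi @ x                                      # compressed measurements

omp = OrthogonalMatchingPursuit(n_nonzero_coefs=k, fit_intercept=False)
omp.fit(Phi, y)
print("max reconstruction error:", np.max(np.abs(omp.coef_ - x)))
```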

Cites background or methods from "Wavelet and wavelet packet compress..."

  • ...PRD comparison between the proposed modified AD-Q6, AD-Q6 [1], SD-Q7 [2], MMB-IHT [5], MMB-CoSAMP [5], and BSBL [20] approaches in the literature and SPIHT [3]....


  • ...Embedded Zerotree Wavelet (EZW) [2] and Set Partitioning in Hierarchical Trees (SPIHT) [3] are examples of algorithms which use Nyquist rate sampling....


Proceedings ArticleDOI
26 Jul 2011
TL;DR: The experimental results show that the algorithm can significantly reduce signal distortion by selecting an optimal wavelet basis for each electrocardiogram segment.
Abstract: This paper addresses the choice of wavelet bases in electrocardiogram compression. The algorithm uses an integer-to-integer wavelet transform, zero-tree coding, and adaptive arithmetic coding. The wavelet library is composed of nine wavelets, and the MIT-BIH Arrhythmia database is used to test the algorithm. The experimental results show that selecting an optimal wavelet basis for each electrocardiogram segment significantly reduces signal distortion in the compression algorithm.
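A minimal sketch of the per-segment selection idea: for each segment, try every wavelet in a small library and keep the one whose thresholded reconstruction has the lowest distortion. The nine-wavelet library, retention fraction, and decomposition level below are assumptions, not the paper's choices.

```python
# Per-segment wavelet selection: pick the wavelet whose thresholded
# reconstruction of this segment has the lowest distortion.
import numpy as np
import pywt

LIBRARY = ['haar', 'db2', 'db4', 'db6', 'db8', 'sym4', 'sym8', 'coif2', 'coif4']

def best_wavelet(segment, keep_fraction=0.1, level=4):
    best_name, best_err = None, np.inf
    for name in LIBRARY:
        coeffs = pywt.wavedec(segment, name, level=level)
        flat, slices = pywt.coeffs_to_array(coeffs)
        k = max(1, int(keep_fraction * flat.size))
        flat[np.abs(flat) < np.sort(np.abs(flat))[-k]] = 0.0
        rec = pywt.waverec(pywt.array_to_coeffs(flat, slices, output_format='wavedec'),
                           name)[:segment.size]
        err = np.linalg.norm(segment - rec)
        if err < best_err:
            best_name, best_err = name, err
    return best_name
```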
19 Sep 2014
TL;DR: In this article, the consequences of upper spinal cord injury on cerebral blood flow velocity (CBFV) were examined using transcranial Doppler (TCD) in a male participant.
Abstract: Spinal cord injury (SCI) is one of the most common neurological disorders. In this paper, we examined the consequences of upper SCI on cerebral blood flow velocity (CBFV) in a male participant. In particular, transcranial Doppler (TCD) was used to study these effects through the middle cerebral arteries (MCA) during resting-state periods and during cognitive challenges (non-verbal word-generation tasks and geometric-rotation tasks). Signal characteristics were analyzed from raw signals and envelope signals (maximum velocity) in the time, frequency, and time-frequency domains. Frequency features revealed an increase in peak frequency in the L-MCA and R-MCA raw signals, indicating stronger cerebral blood flow during geometric and verbal processes respectively. This suggested a slight dominance of the right hemisphere during word-generation periods and a slight dominance of the left hemisphere during geometric processes, a finding confirmed by cross-correlation in the time domain and by entropy rate in the information-theoretic domain. Comparing our results to other neurological disorders (Alzheimer’s disease, Parkinson’s disease, autism, epilepsy, traumatic brain injury) showed that SCI had similar effects, such as a generally decreased cerebral blood flow, and similar hemispheric dominance in a few cases.
References
Journal ArticleDOI
TL;DR: In this paper, it is shown that the difference of information between the approximation of a signal at the resolutions 2^(j+1) and 2^j (where j is an integer) can be extracted by decomposing this signal on a wavelet orthonormal basis of L^2(R^n), the vector space of measurable, square-integrable n-dimensional functions.
Abstract: Multiresolution representations are effective for analyzing the information content of images. The properties of the operator which approximates a signal at a given resolution were studied. It is shown that the difference of information between the approximation of a signal at the resolutions 2^(j+1) and 2^j (where j is an integer) can be extracted by decomposing this signal on a wavelet orthonormal basis of L^2(R^n), the vector space of measurable, square-integrable n-dimensional functions. In L^2(R), a wavelet orthonormal basis is a family of functions which is built by dilating and translating a unique function psi(x). This decomposition defines an orthogonal multiresolution representation called a wavelet representation. It is computed with a pyramidal algorithm based on convolutions with quadrature mirror filters. The wavelet representation lies between the spatial and Fourier domains. For images, the wavelet representation differentiates several spatial orientations. The application of this representation to data compression in image coding, texture discrimination and fractal analysis is discussed.

20,028 citations
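Mallat's "difference of information" between two resolutions is exactly the detail band of one DWT level; the following minimal PyWavelets sketch (the wavelet choice is an arbitrary assumption) shows that the approximation and detail together reconstruct the signal exactly.

```python
# One DWT level splits a signal into a coarser approximation plus the
# detail coefficients (the "difference of information"); the two parts
# reconstruct the signal exactly.
import numpy as np
import pywt

x = np.cumsum(np.random.default_rng(2).standard_normal(512))  # any test signal
cA, cD = pywt.dwt(x, 'db2')        # approximation and detail coefficients
x_rec = pywt.idwt(cA, cD, 'db2')   # inverse transform
print("perfect reconstruction:", np.allclose(x, x_rec[:x.size]))
```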

Book
01 May 1992
TL;DR: This book introduces the continuous and discrete wavelet transforms, time-frequency analysis, multiresolution analysis, and the construction, regularity, and symmetry of orthonormal bases of compactly supported wavelets.
Abstract: Contents: Introduction; Preliminaries and notation; The what, why, and how of wavelets; The continuous wavelet transform; Discrete wavelet transforms: frames; Time-frequency density and orthonormal bases; Orthonormal bases of wavelets and multiresolution analysis; Orthonormal bases of compactly supported wavelets; More about the regularity of compactly supported wavelets; Symmetry for compactly supported wavelet bases; Characterization of functional spaces by means of wavelets; Generalizations and tricks for orthonormal wavelet bases; References; Indexes.

16,073 citations


Journal ArticleDOI
Ingrid Daubechies
TL;DR: This work constructs orthonormal bases of compactly supported wavelets with arbitrarily high regularity, reviewing the concept of multiresolution analysis as well as several algorithms in vision decomposition and reconstruction.
Abstract: We construct orthonormal bases of compactly supported wavelets, with arbitrarily high regularity. The order of regularity increases linearly with the support width. We start by reviewing the concept of multiresolution analysis as well as several algorithms in vision decomposition and reconstruction. The construction then follows from a synthesis of these different approaches.

8,588 citations


"Wavelet and wavelet packet compress..." refers methods in this paper

  • ...In the work described in this paper, [the wavelet] was chosen to be Daubechies' W6 wavelet [10], which is illustrated in Figure 1....

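For readers who want to inspect such a filter, a small PyWavelets sketch follows. Note a naming caveat: whether the paper's W6 (a six-coefficient filter) corresponds to pywt's 'db3' or to 'db6' depends on convention, so 'db6' below is purely illustrative.

```python
# Inspecting a Daubechies filter with PyWavelets; 'db6' is an illustrative
# choice, not necessarily the paper's W6 (see naming caveat above).
import numpy as np
import pywt

w = pywt.Wavelet('db6')
h = np.asarray(w.dec_lo)                    # analysis lowpass filter taps
print("filter length:", len(h))             # 12 taps for db6
print("unit energy:", np.isclose(np.sum(h ** 2), 1.0))
print("sums to sqrt(2):", np.isclose(np.sum(h), np.sqrt(2.0)))
```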

Journal ArticleDOI
TL;DR: The image coding results, calculated from actual file sizes and images reconstructed by the decoding algorithm, are either comparable to or surpass previous results obtained through much more sophisticated and computationally complex methods.
Abstract: Embedded zerotree wavelet (EZW) coding, introduced by Shapiro (see IEEE Trans. Signal Processing, vol.41, no.12, p.3445, 1993), is a very effective and computationally simple technique for image compression. We offer an alternative explanation of the principles of its operation, so that the reasons for its excellent performance can be better understood. These principles are partial ordering by magnitude with a set partitioning sorting algorithm, ordered bit plane transmission, and exploitation of self-similarity across different scales of an image wavelet transform. Moreover, we present a new and different implementation based on set partitioning in hierarchical trees (SPIHT), which provides even better performance than our previously reported extension of EZW that surpassed the performance of the original EZW. The image coding results, calculated from actual file sizes and images reconstructed by the decoding algorithm, are either comparable to or surpass previous results obtained through much more sophisticated and computationally complex methods. In addition, the new coding and decoding procedures are extremely fast, and they can be made even faster, with only small loss in performance, by omitting entropy coding of the bit stream by the arithmetic code.

5,890 citations
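SPIHT's set-partitioning machinery is beyond a short example, but its ordered bit-plane transmission can be sketched: coefficients are scanned against a halving threshold, so the bit stream can be truncated anywhere and still decode to the best reconstruction available at that budget. A minimal sketch (the coefficient values are arbitrary):

```python
# Bare sketch of the ordered bit-plane idea behind EZW/SPIHT (not SPIHT's
# set-partitioning itself): scan coefficients against a halving threshold,
# emitting significance information from coarse planes to fine.
import numpy as np

def significance_passes(coeffs, n_planes=4):
    """Yield (threshold, significance map) from the coarsest plane down."""
    T = 2.0 ** np.floor(np.log2(np.max(np.abs(coeffs))))
    for _ in range(n_planes):
        yield T, np.abs(coeffs) >= T
        T /= 2.0

c = np.array([63.0, -34.0, 10.0, -3.0, 2.0, 0.5])
for T, sig in significance_passes(c):
    print(f"threshold {T:5.1f}: significant = {sig.astype(int)}")
```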


Additional excerpts

  • ...algorithm was inspired by that in [28]....
