Journal ArticleDOI

Wavelet and wavelet packet compression of electrocardiograms

TLDR
Pilot data from a blind evaluation of compressed ECGs by cardiologists suggest that the clinically useful information present in original ECG signals is preserved by 8:1 compression, and in most cases 16:1 compressed ECGs are clinically useful.
Abstract
Wavelets and wavelet packets have recently emerged as powerful tools for signal compression. Wavelet and wavelet packet-based compression algorithms based on embedded zerotree wavelet (EZW) coding are developed for electrocardiogram (ECG) signals, and eight different wavelets are evaluated for their ability to compress Holter ECG data. Pilot data from a blind evaluation of compressed ECGs by cardiologists suggest that the clinically useful information present in original ECG signals is preserved by 8:1 compression, and in most cases 16:1 compressed ECGs are clinically useful.
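As a rough illustration of the pipeline the abstract describes, the sketch below runs a discrete wavelet decomposition of an ECG trace and discards small coefficients. It is a minimal stand-in, not the paper's method: the actual coder is embedded zerotree wavelet (EZW) coding, and the wavelet name, decomposition level, and retention ratio used here are illustrative assumptions.

```python
# Minimal sketch: wavelet decomposition + hard thresholding of small coefficients.
# Hard thresholding stands in for the paper's EZW entropy-coding stage.
import numpy as np
import pywt

def compress_ecg(signal, wavelet="bior4.4", level=5, keep_ratio=1 / 8):
    """Keep only the largest `keep_ratio` fraction of wavelet coefficients."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    flat, slices = pywt.coeffs_to_array(coeffs)
    n_keep = max(1, int(keep_ratio * flat.size))
    threshold = np.sort(np.abs(flat))[-n_keep]   # magnitude of the n_keep-th largest
    flat[np.abs(flat) < threshold] = 0.0
    return flat, slices, wavelet

def reconstruct_ecg(flat, slices, wavelet):
    coeffs = pywt.array_to_coeffs(flat, slices, output_format="wavedec")
    return pywt.waverec(coeffs, wavelet)

# Usage with a synthetic ECG-like trace (random stand-in data).
t = np.linspace(0, 10, 3600)
ecg = np.sin(2 * np.pi * 1.2 * t) + 0.1 * np.random.randn(t.size)
flat, slices, wav = compress_ecg(ecg)
recon = reconstruct_ecg(flat, slices, wav)
prd = 100 * np.linalg.norm(ecg - recon[: ecg.size]) / np.linalg.norm(ecg)
print(f"PRD: {prd:.2f}%")   # percentage RMS difference as a simple distortion measure
```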


Citations
Journal ArticleDOI

Effective compression and classification of ECG arrhythmia by singular value decomposition

TL;DR: In this paper, a simple but efficient method is proposed that utilizes singular value decomposition (SVD) to compress ECG signals and then feeds the decompressed data to a convolutional neural network (CNN) and a support vector machine (SVM) for classification.
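A minimal sketch of the compression half of this idea, assuming beats have already been segmented and aligned into a (beats x samples) matrix; the rank and the random input are placeholders, and the CNN/SVM classification stage is not reproduced.

```python
# Minimal sketch: low-rank SVD compression of an aligned ECG beat matrix.
import numpy as np

def svd_compress(beat_matrix, rank=8):
    """Store only U[:, :r], s[:r], Vt[:r, :] of the SVD."""
    U, s, Vt = np.linalg.svd(beat_matrix, full_matrices=False)
    return U[:, :rank], s[:rank], Vt[:rank, :]

def svd_reconstruct(U_r, s_r, Vt_r):
    return (U_r * s_r) @ Vt_r

# Usage: 100 aligned beats of 256 samples each (random stand-in data).
beats = np.random.randn(100, 256)
U_r, s_r, Vt_r = svd_compress(beats, rank=8)
recon = svd_reconstruct(U_r, s_r, Vt_r)
stored = U_r.size + s_r.size + Vt_r.size
print(f"compression ratio ~ {beats.size / stored:.1f}:1")
```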
Journal ArticleDOI

Hilbert Transform-Based ECG Modeling

TL;DR: The shape of the ECG beat can dynamically change and is highly correlated with the type of pathology; the mathematical model can therefore be used in different ways.
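As background for the modeling idea, the sketch below computes the analytic signal of a synthetic trace with SciPy's Hilbert transform and extracts its envelope and instantaneous phase. The input signal and sampling rate are illustrative assumptions, not the authors' model.

```python
# Minimal sketch: analytic signal via the Hilbert transform.
import numpy as np
from scipy.signal import hilbert

fs = 360                                   # sampling rate (Hz), assumed
t = np.arange(0, 10, 1 / fs)
ecg = np.sin(2 * np.pi * 1.0 * t) ** 21    # crude beat-like spikes, stand-in data

analytic = hilbert(ecg)                    # ecg + j * H{ecg}
envelope = np.abs(analytic)                # instantaneous amplitude
phase = np.unwrap(np.angle(analytic))      # instantaneous phase

print(envelope.max(), phase[-1])
```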
Journal ArticleDOI

Cognitive tasks and cerebral blood flow through anterior cerebral arteries: a study via functional transcranial Doppler ultrasound recordings

TL;DR: Understanding CBFV in the ACA during cognitive tasks could complement information extracted from cerebral blood flow in the middle cerebral arteries during similar cognitive tasks (i.e., sex effects), even if no lateralization effects were noticed during resting-state, verbal, and geometric tasks.
Proceedings ArticleDOI

Wavelet-based compression of power disturbances using the minimum description length criterion

TL;DR: In this article, a compression technique for power disturbance data via the discrete wavelet transform (DWT) and the wavelet packet transform (WPT) is introduced; compression is performed through signal decomposition up to a certain level, thresholding of wavelet coefficients, and signal reconstruction.
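The sketch below follows that decompose/threshold/reconstruct recipe with a wavelet packet transform. The cited work selects the retained coefficients with a minimum description length (MDL) criterion; a fixed retention fraction stands in for that selection step here, and the wavelet, depth, and test signal are assumptions.

```python
# Minimal sketch: wavelet-packet compression by hard thresholding of leaf coefficients.
import numpy as np
import pywt

def wpt_compress(signal, wavelet="db4", maxlevel=4, keep_ratio=0.1):
    wp = pywt.WaveletPacket(data=signal, wavelet=wavelet, maxlevel=maxlevel)
    nodes = wp.get_level(maxlevel, order="natural")
    coeffs = np.concatenate([n.data for n in nodes])
    # Threshold chosen so that roughly keep_ratio of the coefficients survive
    # (a placeholder for the MDL-based selection in the cited work).
    thr = np.quantile(np.abs(coeffs), 1 - keep_ratio)
    for n in nodes:
        n.data = pywt.threshold(n.data, thr, mode="hard")
    return wp

def wpt_reconstruct(wp):
    return wp.reconstruct(update=False)

# Usage with a synthetic disturbance: a 60 Hz tone plus a short transient.
fs = 3840
t = np.arange(0, 0.5, 1 / fs)
x = np.sin(2 * np.pi * 60 * t)
x[900:960] += 0.8 * np.sin(2 * np.pi * 600 * t[900:960])
recon = wpt_reconstruct(wpt_compress(x))
print(np.linalg.norm(x - recon[: x.size]) / np.linalg.norm(x))  # relative error
```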
Journal ArticleDOI

Generalized Rational Variable Projection With Application in ECG Compression

TL;DR: In this article, the authors developed an adaptive transform-domain technique based on rational function systems for electrocardiogram (ECG) signal compression, which is designed especially for the rational transforms in question.
References
Journal ArticleDOI

A theory for multiresolution signal decomposition: the wavelet representation

TL;DR: In this paper, it is shown that the difference of information between the approximation of a signal at the resolutions 2^(j+1) and 2^j (where j is an integer) can be extracted by decomposing this signal on a wavelet orthonormal basis of L^2(R^n), the vector space of measurable, square-integrable n-dimensional functions.
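In standard multiresolution notation (one-dimensional case, not quoted from the paper), this difference of information is the projection onto the detail space:

```latex
% V_{j+1} = V_j \oplus W_j: the detail space W_j carries exactly the information
% lost when passing from resolution 2^{j+1} to 2^j.
\[
  V_{j+1} = V_j \oplus W_j,
  \qquad
  P_{V_{j+1}} f = P_{V_j} f + \sum_{n \in \mathbb{Z}} \langle f, \psi_{j,n} \rangle\, \psi_{j,n},
\]
\[
  \text{where } \psi_{j,n}(x) = 2^{j/2}\, \psi(2^{j}x - n) \text{ is an orthonormal basis of } W_j .
\]
```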
Book

Ten lectures on wavelets

TL;DR: This paper presents a meta-analysis of the wavelet transforms of Coxeter’s inequality and its applications to multiresolution analysis and orthonormal bases.
Journal ArticleDOI

Ten Lectures on Wavelets

TL;DR: In this article, the regularity of compactly supported wavelets and the symmetry of wavelet bases are discussed, with the focus on orthonormal bases of wavelets rather than the continuous wavelet transform.
Journal ArticleDOI

Orthonormal bases of compactly supported wavelets

TL;DR: This work constructs orthonormal bases of compactly supported wavelets with arbitrarily high regularity, starting from a review of the concept of multiresolution analysis as well as several algorithms in vision decomposition and reconstruction.
Journal ArticleDOI

A new, fast, and efficient image codec based on set partitioning in hierarchical trees

TL;DR: The image coding results, calculated from actual file sizes and images reconstructed by the decoding algorithm, are either comparable to or surpass previous results obtained through much more sophisticated and computationally complex methods.