Journal ArticleDOI
Wavelet and wavelet packet compression of electrocardiograms
TLDR
Pilot data from a blind evaluation of compressed ECGs by cardiologists suggest that the clinically useful information present in original ECG signals is preserved by 8:1 compression, and in most cases 16:1 compressed ECGs are clinically useful.
Abstract
Wavelets and wavelet packets have recently emerged as powerful tools for signal compression. Wavelet and wavelet packet compression algorithms based on embedded zerotree wavelet (EZW) coding are developed for electrocardiogram (ECG) signals, and eight different wavelets are evaluated for their ability to compress Holter ECG data. Pilot data from a blind evaluation of compressed ECGs by cardiologists suggest that the clinically useful information present in original ECG signals is preserved by 8:1 compression, and in most cases 16:1 compressed ECGs are clinically useful.
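The abstract's core idea can be illustrated in miniature. The following is a minimal sketch only, not the authors' EZW coder: it uses a single-level Haar transform (the simplest wavelet) and a hypothetical threshold value to show how a wavelet transform concentrates signal energy into few coefficients, which is what makes aggressive coding stages like EZW effective.

```python
# Illustrative sketch only: one level of the Haar wavelet transform with
# coefficient thresholding. The paper's actual method uses embedded
# zerotree wavelet (EZW) coding, which is far more elaborate; this just
# shows why a wavelet domain is a good place to discard information.

def haar_forward(x):
    """One Haar level: pairwise averages (approximation) and differences (detail)."""
    avg = [(x[2*i] + x[2*i + 1]) / 2 for i in range(len(x) // 2)]
    diff = [(x[2*i] - x[2*i + 1]) / 2 for i in range(len(x) // 2)]
    return avg, diff

def haar_inverse(avg, diff):
    """Invert the one-level Haar transform exactly."""
    x = []
    for a, d in zip(avg, diff):
        x.extend([a + d, a - d])
    return x

def compress(x, threshold):
    """Zero out small detail coefficients, then reconstruct."""
    avg, diff = haar_forward(x)
    diff = [d if abs(d) > threshold else 0.0 for d in diff]
    return haar_inverse(avg, diff)

signal = [1.0, 1.1, 1.0, 5.0, 5.1, 1.2, 1.0, 1.0]  # toy samples, not real ECG data
approx = compress(signal, threshold=0.2)
```

Slowly varying stretches produce near-zero detail coefficients that can be dropped, while the sharp transition (analogous to a QRS complex) survives thresholding intact.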
Citations
Proceedings ArticleDOI
Diagnostic quality driven physiological data collection for personal healthcare
TL;DR: A diagnostic-quality-driven mechanism for remote ECG monitoring is presented; it encodes a notion of priority into the wave segments, providing accurate inference results while effectively compressing the data.
Proceedings Article
On a compression algorithm for ECG signals
Monica Negoita, Liviu Goras, et al.
TL;DR: The paper presents a new algorithm for ECG signal compression based on local extrema extraction, adaptive hysteretic filtering, and LZW coding; it is robust to noise, has rather small computational complexity, and provides good compression ratios with excellent reconstruction quality.
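A rough sketch of the extrema-extraction step this TL;DR describes: keep only the local minima and maxima (plus endpoints) and rebuild the signal by linear interpolation between them. The function names and the toy signal are illustrative assumptions; the actual algorithm also applies adaptive hysteretic filtering and an LZW coding stage, which are omitted here.

```python
# Illustrative sketch of local extrema extraction for compression:
# only turning points (and the endpoints) are kept, and the signal is
# reconstructed by piecewise-linear interpolation between them.

def local_extrema(x):
    """Indices of samples that are strict local minima or maxima, plus endpoints."""
    idx = [0]
    for i in range(1, len(x) - 1):
        if (x[i] - x[i - 1]) * (x[i + 1] - x[i]) < 0:  # slope changes sign
            idx.append(i)
    idx.append(len(x) - 1)
    return idx

def reconstruct(x, idx):
    """Piecewise-linear interpolation between the kept samples."""
    y = [0.0] * len(x)
    for i0, i1 in zip(idx, idx[1:]):
        for i in range(i0, i1 + 1):
            t = (i - i0) / (i1 - i0)
            y[i] = (1 - t) * x[i0] + t * x[i1]
    return y

signal = [0.0, 1.0, 2.0, 1.0, 0.0, -1.0, 0.0, 1.0]  # toy piecewise-linear signal
kept = local_extrema(signal)   # only the turning points survive
approx = reconstruct(signal, kept)
```

For a piecewise-linear toy signal the reconstruction is exact; for real ECG data the quality depends on how aggressively intermediate samples are discarded.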
Patent
Hybrid 2-D ECG data compression based on wavelet transforms
TL;DR: In this article, a method for hybrid 2D ECG data compression based on wavelet transforms is described. The method is, however, not suitable for high-frequency ECG signals.
Journal ArticleDOI
Classification of Cardiac Vascular Disease from ECG Signals for Enhancing Modern Health Care Scenario
TL;DR: In this paper, the abnormalities found in ECG signals are analyzed by identifying Normal, Bradycardia Arrhythmia, Tachycardia Arrhythmia, and Ischemia signals using a neuro-fuzzy classifier.
Proceedings ArticleDOI
A novel method to represent ECG signals via predefined personalized signature and envelope functions
TL;DR: It is shown that the new modeling method provides significant data compression: transmission of ECG signals reduces to transmission of the indexes "R" and "K" of the (α_r(t), φ_k(t)) pairs and the coefficients C_i, which also results in considerable savings in the transmission band.
References
Journal ArticleDOI
A theory for multiresolution signal decomposition: the wavelet representation
TL;DR: In this paper, it is shown that the difference of information between the approximation of a signal at the resolutions 2^(j+1) and 2^j (where j is an integer) can be extracted by decomposing the signal on a wavelet orthonormal basis of L²(Rⁿ), the vector space of measurable, square-integrable n-dimensional functions.
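The multiresolution claim above can be demonstrated in the simplest (Haar) case, for one-dimensional signals of power-of-two length. This is a didactic sketch under those assumptions, not Mallat's general construction: each pass halves the resolution and stores the detail lost, so the stored coefficients carry exactly the information difference between successive resolutions.

```python
# Haar multiresolution pyramid: repeatedly split a signal into a coarser
# approximation plus the detail coefficients discarded at that level.
# Keeping all details makes the decomposition perfectly invertible.

def pyramid(x):
    """Decompose into a single coarse average plus per-level detail lists."""
    details = []
    while len(x) > 1:
        details.append([(x[2*i] - x[2*i + 1]) / 2 for i in range(len(x) // 2)])
        x = [(x[2*i] + x[2*i + 1]) / 2 for i in range(len(x) // 2)]
    return x, details  # x is now the coarsest approximation

def unpyramid(x, details):
    """Rebuild the full-resolution signal from approximation + details."""
    for d in reversed(details):
        finer = []
        for a, di in zip(x, d):
            finer.extend([a + di, a - di])
        x = finer
    return x

sig = [2.0, 4.0, 6.0, 8.0, 10.0, 12.0, 14.0, 16.0]
coarse, details = pyramid(sig)   # 8 samples -> 3 levels of detail + 1 average
```

The detail list at each level is precisely the information separating resolution 2^(j+1) from resolution 2^j; discard it and only the coarser view remains.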
Book
Ten lectures on wavelets
TL;DR: This book develops the theory of the wavelet transform and its applications to multiresolution analysis and orthonormal bases.
Journal ArticleDOI
Ten Lectures on Wavelets
TL;DR: In this article, the regularity of compactly supported wavelets and the symmetry of wavelet bases are discussed, with the focus on orthonormal wavelet bases rather than the continuous wavelet transform.
Journal ArticleDOI
Orthonormal bases of compactly supported wavelets
TL;DR: This work constructs orthonormal bases of compactly supported wavelets, with arbitrarily high regularity, by reviewing the concept of multiresolution analysis as well as several algorithms in vision decomposition and reconstruction.
Journal ArticleDOI
A new, fast, and efficient image codec based on set partitioning in hierarchical trees
Amir Said, William A. Pearlman
TL;DR: The image coding results, calculated from actual file sizes and images reconstructed by the decoding algorithm, are either comparable to or surpass previous results obtained through much more sophisticated and computationally complex methods.