Journal ArticleDOI
Wavelet and wavelet packet compression of electrocardiograms
TLDR
Pilot data from a blind evaluation of compressed ECGs by cardiologists suggest that the clinically useful information present in the original ECG signals is preserved by 8:1 compression, and in most cases 16:1 compressed ECGs are clinically useful.
Abstract
Wavelets and wavelet packets have recently emerged as powerful tools for signal compression. Wavelet and wavelet packet-based compression algorithms using embedded zerotree wavelet (EZW) coding are developed for electrocardiogram (ECG) signals, and eight different wavelets are evaluated for their ability to compress Holter ECG data. Pilot data from a blind evaluation of compressed ECGs by cardiologists suggest that the clinically useful information present in the original ECG signals is preserved by 8:1 compression, and in most cases 16:1 compressed ECGs are clinically useful.
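The pipeline the abstract describes (wavelet transform followed by coefficient coding) can be sketched in miniature with a Haar transform and simple hard thresholding. This is an illustrative stand-in for the paper's EZW coder, not a reimplementation of it; the toy signal, the level count, and the keep-1-in-8 ratio (mimicking 8:1 compression) are assumptions:

```python
import numpy as np

def haar_dwt(x, levels):
    """Multi-level orthonormal Haar wavelet decomposition."""
    coeffs, a = [], np.asarray(x, dtype=float)
    for _ in range(levels):
        approx = (a[0::2] + a[1::2]) / np.sqrt(2)
        detail = (a[0::2] - a[1::2]) / np.sqrt(2)
        coeffs.append(detail)
        a = approx
    coeffs.append(a)
    return coeffs[::-1]  # coarsest approximation first

def haar_idwt(coeffs):
    """Exact inverse of haar_dwt."""
    a = coeffs[0]
    for detail in coeffs[1:]:
        out = np.empty(2 * len(detail))
        out[0::2] = (a + detail) / np.sqrt(2)
        out[1::2] = (a - detail) / np.sqrt(2)
        a = out
    return a

# Toy "ECG" signal: a slow baseline plus a sharp QRS-like spike.
n = 256
t = np.arange(n)
signal = np.sin(2 * np.pi * t / n) + 5.0 * np.exp(-0.5 * ((t - 128) / 3.0) ** 2)

coeffs = haar_dwt(signal, levels=4)
flat = np.concatenate(coeffs)

# Keep only the largest 1/8 of coefficients (roughly 8:1 compression).
k = len(flat) // 8
threshold = np.sort(np.abs(flat))[-k]
flat[np.abs(flat) < threshold] = 0.0

# Rebuild the coefficient list and reconstruct the signal.
rebuilt, start = [], 0
for c in coeffs:
    rebuilt.append(flat[start:start + len(c)])
    start += len(c)
recon = haar_idwt(rebuilt)

# Percentage RMS difference (PRD), a standard ECG distortion measure.
prd = 100 * np.linalg.norm(signal - recon) / np.linalg.norm(signal)
print(f"nonzero coefficients kept: {np.count_nonzero(flat)}/{n}")
print(f"PRD: {prd:.2f}%")
```

The wavelet concentrates the signal's energy into a few large coefficients, so discarding the many small ones costs little distortion; EZW then spends its bits encoding the positions and values of the survivors far more cleverly than this flat threshold does.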
Citations
Proceedings ArticleDOI
ECG Monitoring over Bluetooth: Data Compression and Transmission
TL;DR: A low-complexity ECG compression method is proposed based on the special considerations of (processing and transmission) power consumption at wireless ECG sensors; comparisons with existing approaches confirm the superior performance of the method.
Proceedings ArticleDOI
Wavelet-Based 2-D ECG Data Compression Method Using SPIHT and VQ Coding
TL;DR: An improved wavelet-based 2-D ECG data compression method is presented which employs a double stage compression and utilizes both inter-beat and inter-sample redundancies in the ECG signal.
Journal ArticleDOI
Fault Diagnosis and Tolerance Control of Five-Level Nested NPP Converter Using Wavelet Packet and LSTM
TL;DR: A fault diagnosis and tolerance solution for a five-level nested NPP converter is proposed and a deep learning method integrating the wavelet packet transform and long short-term memory is presented, proving the effectiveness of the fault-tolerant strategy.
Journal ArticleDOI
Extraction of extended X-ray absorption fine structure information from the experimental data using the wavelet transform
TL;DR: In this article, a novel method is proposed for extracting the extended X-ray absorption fine structure (EXAFS) information from a measured absorption spectrum; the EXAFS signal can be easily retrieved from the experimental spectrum by means of wavelet decomposition.
Proceedings ArticleDOI
ECG Signal Compression using Discrete Sinc Interpolation
TL;DR: It is observed that the DSI algorithm achieves a higher compression ratio (CR) at a relatively lower percentage RMS difference (PRD) than the AZTEC and FAN algorithms.
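The basic idea behind interpolation-based compression can be sketched as follows: transmit only every M-th sample and reconstruct the full-rate signal by sinc interpolation. This is a generic illustration, not the paper's DSI algorithm; the toy signal and the 4:1 decimation factor are assumptions:

```python
import numpy as np

# Toy bandlimited signal sampled at "full rate".
n = 200
t = np.arange(n)
x = np.sin(2 * np.pi * 3 * t / n) + 0.5 * np.sin(2 * np.pi * 7 * t / n)

M = 4              # keep every 4th sample -> 4:1 compression ratio
kept = x[::M]      # the samples that would actually be stored/transmitted
k = np.arange(len(kept))

# Whittaker-Shannon sinc interpolation: rebuild the full-rate signal
# from the kept samples (np.sinc is the normalized sinc, sin(pi x)/(pi x)).
recon = np.array([np.sum(kept * np.sinc((ti - k * M) / M)) for ti in t])

# Percentage RMS difference between original and reconstruction.
prd = 100 * np.linalg.norm(x - recon) / np.linalg.norm(x)
print(f"CR = {M}:1, PRD = {prd:.2f}%")
```

At the kept sample positions the sinc kernel reduces to a unit impulse, so those samples are reproduced exactly; the error comes from the in-between samples and from truncating the sinc sum at the signal's edges.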
References
Journal ArticleDOI
A theory for multiresolution signal decomposition: the wavelet representation
TL;DR: In this paper, it is shown that the difference of information between the approximation of a signal at the resolutions 2^(j+1) and 2^j (where j is an integer) can be extracted by decomposing this signal on a wavelet orthonormal basis of L^2(R^n), the vector space of measurable, square-integrable n-dimensional functions.
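The "difference of information" between two successive resolutions can be illustrated with one analysis step of the simplest orthonormal wavelet, the Haar basis (a sketch under that assumption, not Mallat's general construction):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(64)  # approximation coefficients at resolution 2^(j+1)

# One Haar analysis step: split the fine approximation into a coarser
# approximation (resolution 2^j) plus the detail lost between the two.
approx = (x[0::2] + x[1::2]) / np.sqrt(2)
detail = (x[0::2] - x[1::2]) / np.sqrt(2)

# Because the basis is orthonormal, the detail coefficients carry exactly
# the information difference between the two resolutions: energies add up.
print(np.allclose(np.sum(x**2), np.sum(approx**2) + np.sum(detail**2)))  # True
```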
Book
Ten lectures on wavelets
TL;DR: This book presents the wavelet transform and its applications to multiresolution analysis and orthonormal bases.
Journal ArticleDOI
Ten Lectures on Wavelets
TL;DR: In this article, the regularity of compactly supported wavelets and the symmetry of wavelet bases are discussed; the focus is on orthonormal wavelet bases rather than the continuous wavelet transform.
Journal ArticleDOI
Orthonormal bases of compactly supported wavelets
TL;DR: This work constructs orthonormal bases of compactly supported wavelets, with arbitrarily high regularity, after reviewing the concept of multiresolution analysis as well as several decomposition and reconstruction algorithms.
Journal ArticleDOI
A new, fast, and efficient image codec based on set partitioning in hierarchical trees
Amir Said, William A. Pearlman
TL;DR: The image coding results, calculated from actual file sizes and images reconstructed by the decoding algorithm, are either comparable to or surpass previous results obtained through much more sophisticated and computationally complex methods.