Journal Article (DOI)
Wavelet and wavelet packet compression of electrocardiograms
TL;DR: Pilot data from a blind evaluation of compressed ECGs by cardiologists suggest that the clinically useful information present in original ECG signals is preserved by 8:1 compression, and in most cases 16:1 compressed ECGs are clinically useful.
Abstract: Wavelets and wavelet packets have recently emerged as powerful tools for signal compression. Wavelet and wavelet packet-based compression algorithms based on embedded zerotree wavelet (EZW) coding are developed for electrocardiogram (ECG) signals, and eight different wavelets are evaluated for their ability to compress Holter ECG data. Pilot data from a blind evaluation of compressed ECGs by cardiologists suggest that the clinically useful information present in original ECG signals is preserved by 8:1 compression, and in most cases 16:1 compressed ECGs are clinically useful.
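The core idea behind transform-based ECG compression is that a wavelet transform concentrates the signal's energy into a few large coefficients, so the many small ones can be discarded or coarsely coded. The paper's actual coder is EZW, which is more involved; the sketch below only illustrates the transform-and-threshold idea with a single-level Haar transform on hypothetical sample values, not the authors' method.

```python
# Illustrative sketch only: a one-level Haar wavelet transform with hard
# thresholding of detail coefficients, standing in for the idea of
# energy compaction that EZW coding exploits. Signal values are made up.
import math

def haar_step(x):
    """One level of the orthonormal Haar wavelet transform (len(x) even)."""
    s = math.sqrt(2.0)
    approx = [(x[2 * i] + x[2 * i + 1]) / s for i in range(len(x) // 2)]
    detail = [(x[2 * i] - x[2 * i + 1]) / s for i in range(len(x) // 2)]
    return approx, detail

def inverse_haar_step(approx, detail):
    """Exact inverse of haar_step."""
    s = math.sqrt(2.0)
    x = []
    for a, d in zip(approx, detail):
        x.append((a + d) / s)
        x.append((a - d) / s)
    return x

signal = [2.0, 4.0, 6.0, 8.0, 10.0, 12.0, 14.0, 16.0]
approx, detail = haar_step(signal)
# "Compression": zero out small detail coefficients, then reconstruct.
detail_kept = [c if abs(c) > 2.0 else 0.0 for c in detail]
reconstructed = inverse_haar_step(approx, detail_kept)
```

With no thresholding the transform is perfectly invertible; after zeroing the small details, the reconstruction is a locally averaged version of the original, which is where the rate-distortion trade-off comes from.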
Citations
Journal Article (DOI)
High performance data compression method with pattern matching for biomedical ECG and arterial pulse waveforms.
TL;DR: Simulation results show that the proposed compression algorithms achieve a very significant improvement in compression ratio and error measures for both ECG and arterial pulse waveforms, compared with several other compression methods.
Journal Article (DOI)
A Review of ECG Data Compression Techniques
TL;DR: This paper focuses on providing a comparison of the major techniques of ECG data compression which are intended to attain a lossless compressed data with relatively high compression ratio (CR) and low percent root mean square difference (PRD).
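The two figures of merit named in this review, compression ratio (CR) and percent root-mean-square difference (PRD), have standard definitions: CR is the ratio of original to compressed data size, and PRD is the RMS reconstruction error normalized by the signal's energy. The snippet below computes both on hypothetical numbers (note that some papers use a variant of PRD with the signal mean subtracted in the denominator).

```python
# Standard CR and PRD definitions, illustrated on made-up values.
import math

def compression_ratio(original_bits, compressed_bits):
    """CR = size of original data / size of compressed data."""
    return original_bits / compressed_bits

def prd(original, reconstructed):
    """PRD = 100 * sqrt(sum((x - x_hat)^2) / sum(x^2))."""
    num = sum((x - y) ** 2 for x, y in zip(original, reconstructed))
    den = sum(x ** 2 for x in original)
    return 100.0 * math.sqrt(num / den)

x = [1.0, 2.0, 3.0, 4.0]        # hypothetical original samples
x_hat = [1.1, 1.9, 3.0, 4.2]    # hypothetical reconstruction
```

A lossless method has PRD = 0 by definition, so lossless techniques are compared on CR alone, while lossy techniques trade CR against PRD.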
Journal Article (DOI)
Determination of the component number in overlapping multicomponent chromatogram using wavelet transform
TL;DR: The wavelet transform is shown to be an easy and convenient method for determining the number of components in overlapping multicomponent chromatograms.
Journal Article (DOI)
Authenticity verification of audio signals based on fragile watermarking for audio forensics
TL;DR: A new fragile watermarking method for verifying digital audio authenticity in audio forensics, which embeds a text encoded with OVSF (Orthogonal Variable Spreading Factor) codes and spread across the entire signal using automatic adjustment.
Journal Article (DOI)
A novel scheme for simultaneous image compression and encryption based on wavelet packet transform and multi-chaotic systems
Xiupin Lv, Xiaofeng Liao, Bo Yang, +2 more
TL;DR: The proposed scheme solves a long-standing contradiction that a signal should be first compressed or encrypted, which often appears in traditional image compression and encryption systems.
References
Journal Article (DOI)
A theory for multiresolution signal decomposition: the wavelet representation
TL;DR: In this paper, it is shown that the difference of information between the approximation of a signal at the resolutions 2^(j+1) and 2^j (where j is an integer) can be extracted by decomposing this signal on a wavelet orthonormal basis of L^2(R^n), the vector space of measurable, square-integrable n-dimensional functions.
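In the multiresolution framework this TL;DR describes, the detail lost between successive resolutions is carried by the wavelet coefficients, and the signal is recovered from them; schematically, in standard notation (not quoted verbatim from the cited paper):

```latex
% Orthonormal wavelet expansion underlying the multiresolution
% decomposition; \psi_{j,k} are dilates and translates of one wavelet:
%   \psi_{j,k}(t) = 2^{j/2}\,\psi(2^{j}t - k)
f(t) = \sum_{j \in \mathbb{Z}} \sum_{k \in \mathbb{Z}}
       \langle f, \psi_{j,k} \rangle \, \psi_{j,k}(t)
```

Each fixed j collects exactly the detail added when passing from resolution 2^j to 2^(j+1), which is the decomposition the paper formalizes.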
Book
Ten lectures on wavelets
TL;DR: This book develops the theory of wavelet transforms and their applications to multiresolution analysis and orthonormal bases.
Journal Article (DOI)
Ten Lectures on Wavelets
TL;DR: In this article, the regularity of compactly supported wavelets and the symmetry of wavelet bases are discussed, with a focus on orthonormal bases of wavelets rather than the continuous wavelet transform.
Journal Article (DOI)
Orthonormal bases of compactly supported wavelets
TL;DR: This work constructs orthonormal bases of compactly supported wavelets with arbitrarily high regularity, after reviewing the concept of multiresolution analysis and several decomposition and reconstruction algorithms used in vision.
Journal Article (DOI)
A new, fast, and efficient image codec based on set partitioning in hierarchical trees
Amir Said, William A. Pearlman, +1 more
TL;DR: The image coding results, calculated from actual file sizes and images reconstructed by the decoding algorithm, are comparable to, or surpass, previous results obtained through much more sophisticated and computationally complex methods.