Journal ArticleDOI

A wavelet transform-based ECG compression method guaranteeing desired signal quality

01 Dec 1998-IEEE Transactions on Biomedical Engineering (IEEE)-Vol. 45, Iss: 12, pp 1414-1419
TL;DR: A new electrocardiogram compression method is presented, based on an orthonormal wavelet transform and an adaptive quantization strategy, by which a predetermined percent root mean square difference (PRD) can be guaranteed with a high compression ratio and low implementation complexity.
Abstract: This paper presents a new electrocardiogram (ECG) compression method based on orthonormal wavelet transform and an adaptive quantization strategy, by which a predetermined percent root mean square difference (PRD) can be guaranteed with high compression ratio and low implementation complexity.
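To make the guaranteed-PRD idea concrete, here is a minimal sketch (not the authors' exact algorithm): the wavelet coefficients are quantized with a step size found by bisection so that the reconstruction meets a target PRD. The use of PyWavelets, the db5 wavelet, and the bisection search are assumptions made for this example.

# Minimal sketch (not the paper's exact algorithm): quantize wavelet
# coefficients with a step chosen so the reconstruction meets a target PRD.
import numpy as np
import pywt  # assumed available; any orthonormal wavelet library would do

def prd(x, y):
    """Percent root-mean-square difference between original and reconstruction."""
    return 100.0 * np.sqrt(np.sum((x - y) ** 2) / np.sum(x ** 2))

def compress_to_prd(x, target_prd=5.0, wavelet="db5", level=5, iters=30):
    coeffs = pywt.wavedec(x, wavelet, level=level)
    flat, slices = pywt.coeffs_to_array(coeffs)
    lo, hi = 1e-6, np.max(np.abs(flat))        # bracket the quantization step
    for _ in range(iters):                     # bisection on the step size
        step = 0.5 * (lo + hi)
        q = np.round(flat / step)              # uniform quantization
        rec = pywt.waverec(pywt.array_to_coeffs(q * step, slices, output_format="wavedec"),
                           wavelet)[: len(x)]
        if prd(x, rec) > target_prd:
            hi = step                          # too coarse -> shrink the step
        else:
            lo = step                          # still within budget -> try coarser
    return np.round(flat / lo).astype(np.int32), slices, lo

# Example on a synthetic "ECG-like" signal
x = np.sin(np.linspace(0, 20 * np.pi, 2048)) + 0.05 * np.random.randn(2048)
q, slices, step = compress_to_prd(x, target_prd=2.0)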
Citations
Journal ArticleDOI
01 Apr 2012
TL;DR: An FPGA design for ECG compression using the Discrete Wavelet Transform (DWT) and a lossless encoding method is presented; a model is developed for a fixed-point convolution scheme with good performance in terms of throughput, latency, maximum operating frequency, and quality of the compressed signal.
Abstract: This paper presents an FPGA design for ECG compression using the Discrete Wavelet Transform (DWT) and a lossless encoding method. Unlike classical works based on off-line mode, the current work allows real-time processing of the ECG signal by reducing its redundant information. A model is developed for a fixed-point convolution scheme which performs well in terms of throughput, latency, maximum operating frequency, and quality of the compressed signal. The quantization of the filter coefficients and the selected fixed threshold give an error low enough for clinical applications.
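As a rough software illustration of the fixed-point convolution idea (not the FPGA datapath described above), the snippet below convolves integer samples with Q15-scaled wavelet filter taps and compares the result against floating point; the Q15 format and the db3 filter are assumptions.

# Minimal sketch of fixed-point (Q15) convolution, as one might prototype
# an FPGA datapath in software; the format and filter choice are assumptions.
import numpy as np
import pywt

FRAC_BITS = 15                                   # Q15 fixed-point format
h = np.array(pywt.Wavelet("db3").dec_lo)         # low-pass analysis filter
h_fx = np.round(h * (1 << FRAC_BITS)).astype(np.int32)

def fxp_convolve(x_int, taps_fx):
    """Convolve integer samples with Q15 taps, accumulating in 64-bit integers."""
    acc = np.convolve(x_int.astype(np.int64), taps_fx.astype(np.int64))
    return (acc >> FRAC_BITS).astype(np.int32)   # rescale back to sample units

# Compare against floating point to estimate the quantization error
x = (np.random.randn(512) * 1000).astype(np.int16)
y_fx = fxp_convolve(x, h_fx)
y_fl = np.convolve(x.astype(float), h)
err = np.max(np.abs(y_fx - y_fl))                # error stays within a few LSBs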

6 citations


Cites background from "A wavelet transform-based ECG compr..."

  • ...The limit of the threshold is related to the desired Percent-Root-Mean-Square-Difference (PRD) of the compressed (or filtered) signal [2]....


Dissertation
15 Nov 2007
TL;DR: This dissertation develops ECG signal compression methods based on orthogonal polynomials; Legendre and Chebyshev polynomials give good reconstructions, whereas Laguerre and Hermite polynomials alone do not.
Abstract: ECG signal compression has become even more important with the development of telemedicine. Indeed, compression considerably reduces the cost of transmitting medical information over telecommunication channels. The objective of this thesis is to develop new ECG compression methods based on orthogonal polynomials. We first study the characteristics of ECG signals, as well as the various processing operations commonly applied to this signal. We also give an exhaustive, comparative description of existing ECG compression algorithms, with emphasis on those based on polynomial approximation and interpolation. We then address the theoretical foundations of orthogonal polynomials, successively studying their mathematical nature, their many useful properties, and the characteristics of some of these polynomials. Polynomial modeling of the ECG signal consists first in segmenting the signal into cardiac cycles after QRS detection, and then in decomposing the signal windows obtained after segmentation over polynomial bases. The coefficients produced by the decomposition are used to synthesize the signal segments in the reconstruction phase. Compression amounts to using a small number of coefficients to represent a signal segment made up of a large number of samples. Our experiments established that Laguerre polynomials and Hermite polynomials do not lead to a good reconstruction of the ECG signal, whereas Legendre and Chebyshev polynomials gave interesting results. Consequently, we designed our first ECG compression algorithm using Jacobi polynomials. When this algorithm is optimized by removing edge effects, it becomes universal and is no longer dedicated solely to ECG signals. Although neither Laguerre polynomials nor Hermite functions individually allow a good modeling of ECG segments, we combined the two function systems to represent a cardiac cycle. In this case, the ECG segment corresponding to one cardiac cycle is split into two parts: the isoelectric line, decomposed as a series of Laguerre polynomials, and the P-QRS-T waves, modeled by Hermite functions. This yields a second, robust and efficient ECG compression algorithm.
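A toy version of the per-cycle polynomial modeling described above is sketched below (QRS detection, the exact basis, and the order used in the thesis are not reproduced): each segment is projected onto a low-order Legendre basis and reconstructed from the few retained coefficients.

# Toy sketch: represent an ECG-like segment by a few Legendre coefficients.
# Segment boundaries, polynomial order, and the test signal are assumptions.
import numpy as np
from numpy.polynomial import legendre as L

def compress_segment(seg, order=12):
    t = np.linspace(-1.0, 1.0, len(seg))         # Legendre domain
    return L.legfit(t, seg, order)               # only order+1 coefficients kept

def reconstruct_segment(coeffs, length):
    t = np.linspace(-1.0, 1.0, length)
    return L.legval(t, coeffs)

seg = np.exp(-np.linspace(-3, 3, 300) ** 2)      # stand-in for one cardiac cycle
c = compress_segment(seg, order=12)
rec = reconstruct_segment(c, len(seg))
cr = len(seg) / len(c)                           # samples stored per coefficient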

6 citations

Journal ArticleDOI
TL;DR: The test results and performance indices show that the EBP-NN method is very efficient for data compression and helps in the management of ECG data in both offline and real-time applications.
Abstract: This paper deals with an efficient algorithm which has been developed for electrocardiogram (ECG) data compression using error back propagation neural networks (EBP-NN). Four EBP-NNs have been trained to retrieve all 12 standard leads of the ECG signal. The combination of leads and the network topologies were finalized after an extensive study of the correlation between the ECG leads using the CSE database. Each network has a topology of N-4-4-N, where N represents the number of samples in one cycle of any particular lead. It has been observed that this method compresses the data as well as improves the quality of the retrieved signal, due to the elimination of high-frequency noise. The compression ratio (CR) in the EBP-NN method increases with the number of ECG cycles. This method is best suited for data compression in Holter monitoring, ambulatory care and telemedicine. The performance of the algorithm has been evaluated by comparing the vital reference points like onsets, offsets, amplitud...
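For orientation only, here is a structural sketch of an N-4-4-N autoencoder trained to reproduce its input, in which the 4-unit layers act as the compressed code; it is not the authors' trained EBP networks, and the beat length, training data, and use of scikit-learn are assumptions.

# Structural sketch of an N-4-4-N autoencoder for beat-by-beat compression
# (untuned; the authors' trained EBP networks and lead grouping are not reproduced).
import numpy as np
from sklearn.neural_network import MLPRegressor  # assumed available

N = 200                                           # samples per cardiac cycle (assumption)
beats = np.array([np.exp(-np.linspace(-3, 3, N) ** 2) +
                  0.01 * np.random.randn(N) for _ in range(50)])

# Back-propagation network with an N-4-4-N topology, trained to reproduce its
# input; the 4-unit hidden layers form the bottleneck that yields compression.
net = MLPRegressor(hidden_layer_sizes=(4, 4), max_iter=5000, random_state=0)
net.fit(beats, beats)
recon = net.predict(beats)
prd = 100 * np.sqrt(np.sum((beats - recon) ** 2) / np.sum(beats ** 2))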

6 citations

Proceedings ArticleDOI
01 Dec 2007
TL;DR: A compression method, based on the choice of a wavelet that matches the electrocardiogram signal to be compressed, is proposed in this paper and compared with the compression using the classical wavelet Db3.
Abstract: A compression method, based on the choice of a wavelet that matches the electrocardiogram signal to be compressed, is proposed in this paper. The coefficients of the scaling filter that minimize the distortion of the compressed signal are used to determine the wavelet. The choice of the scaling filter is done by the parametrization of the scaling coefficients in a way that all the constraints are satisfied for any set of parameters. A genetic algorithm is used to determine the parameters that minimize the distortion of the compressed signal. The performance of the proposed algorithm is analyzed and compared with the compression using the classical wavelet Db3.
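The sketch below illustrates the general approach with stand-ins: a one-parameter Pollen parametrization of 4-tap orthonormal scaling filters (the paper parametrizes longer filters in its own way) and a very small genetic-style search that minimizes the PRD of a threshold-compressed signal; PyWavelets and the toy fitness function are assumptions.

# Illustrative sketch only: every theta below yields a valid orthonormal 4-tap
# scaling filter (theta = pi/3 reproduces db2); a tiny evolutionary search
# picks the theta that minimizes the PRD of a threshold-compressed signal.
import numpy as np
import pywt

def scaling_filter(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([1 - c + s, 1 + c + s, 1 + c - s, 1 - c - s]) / (2 * np.sqrt(2))

def make_wavelet(theta):
    rec_lo = scaling_filter(theta)
    rec_hi = np.array([(-1) ** k * rec_lo[::-1][k] for k in range(rec_lo.size)])
    fb = [rec_lo[::-1].tolist(), rec_hi[::-1].tolist(), rec_lo.tolist(), rec_hi.tolist()]
    return pywt.Wavelet("param4", filter_bank=fb)

def fitness(theta, x, keep=0.10):
    w = make_wavelet(theta)
    arr, sl = pywt.coeffs_to_array(pywt.wavedec(x, w, level=4))
    thr = np.quantile(np.abs(arr), 1 - keep)          # keep the largest 10 %
    kept = np.where(np.abs(arr) >= thr, arr, 0.0)
    rec = pywt.waverec(pywt.array_to_coeffs(kept, sl, output_format="wavedec"), w)[: len(x)]
    return 100 * np.sqrt(np.sum((x - rec) ** 2) / np.sum(x ** 2))

x = np.sin(np.linspace(0, 16 * np.pi, 1024)) ** 3     # stand-in for an ECG record
rng = np.random.default_rng(0)
pop = rng.uniform(0, 2 * np.pi, 20)                   # initial population of angles
for _ in range(30):                                   # elitist selection + Gaussian mutation
    scores = np.array([fitness(t, x) for t in pop])
    parents = pop[np.argsort(scores)[:5]]
    pop = np.concatenate([parents, rng.choice(parents, 15) + rng.normal(0, 0.1, 15)])
best = pop[np.argmin([fitness(t, x) for t in pop])]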

6 citations

Journal ArticleDOI
TL;DR: In this paper, a one-dimensional complex Discrete Anamorphic Stretch Transform (DAST) is proposed for precompression of the ECG signal for real-time transmission using channels with limited bandwidth.

5 citations

References
Journal ArticleDOI
Ingrid Daubechies1
TL;DR: This work constructs orthonormal bases of compactly supported wavelets with arbitrarily high regularity, by reviewing the concept of multiresolution analysis as well as several algorithms in vision decomposition and reconstruction.
Abstract: We construct orthonormal bases of compactly supported wavelets, with arbitrarily high regularity. The order of regularity increases linearly with the support width. We start by reviewing the concept of multiresolution analysis as well as several algorithms in vision decomposition and reconstruction. The construction then follows from a synthesis of these different approaches.
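For readers who simply want to see such a wavelet in use, the snippet below decomposes and reconstructs a signal with the 10-tap 'db5' filter via PyWavelets; the library and the specific wavelet are assumptions, not part of the original construction.

# Minimal usage sketch: multiresolution analysis with a compactly supported
# Daubechies wavelet (10-tap 'db5'); PyWavelets is an assumed dependency.
import numpy as np
import pywt

x = np.cumsum(np.random.randn(1024))          # arbitrary test signal
coeffs = pywt.wavedec(x, "db5", level=5)      # analysis: approximation + 5 detail bands
x_rec = pywt.waverec(coeffs, "db5")           # synthesis
assert np.allclose(x, x_rec[: len(x)])        # orthonormal basis -> perfect reconstruction
print(len(pywt.Wavelet("db5").dec_lo))        # 10 filter taps, compact support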

8,588 citations


"A wavelet transform-based ECG compr..." refers methods in this paper

  • ...Since detailed mathematical aspects of wavelet theory can be found elsewhere [16], here, we shall merely describe the structure of a DOWT-based coding system shown in Fig....


  • ...The proposed algorithm was implemented on a SparcStation 2 computer, where the wavelet-based filters with 10-taps were designed by Daubechies’s algorithm [16], the layer was set to , the buffer size for segmenting input ECG signals was set to , and the Lempel–Ziv–Welch (LZW) encoder [20] was chosen as the entropy encoder for simplicity....


Journal ArticleDOI
TL;DR: A new compression algorithm is introduced, based on principles not found in existing commercial methods, that dynamically adapts to the redundancy characteristics of the data being compressed; an investigation of its possible applications also serves to illustrate system problems inherent in using any compression scheme.
Abstract: Data stored on disks and tapes or transferred over communications links in commercial computer systems generally contains significant redundancy. A mechanism or procedure which recodes the data to lessen the redundancy could possibly double or triple the effective data densities in stored or communicated data. Moreover, if compression is automatic, it can also aid in the reduction of software development costs. A transparent compression mechanism could permit the use of "sloppy" data structures, in that empty space or sparse encoding of data would not greatly expand the use of storage space or transfer time; however, that requires a good compression procedure. Several problems encountered when common compression methods are integrated into computer systems have prevented the widespread use of automatic data compression. For example: (1) poor runtime execution speeds interfere in the attainment of very high data rates; (2) most compression techniques are not flexible enough to process different types of redundancy; (3) blocks of compressed data that have unpredictable lengths present storage space management problems. Each compression strategy poses a different set of these problems and, consequently, the use of each strategy is restricted to applications where its inherent weaknesses present no critical problems. This article introduces a new compression algorithm that is based on principles not found in existing commercial methods. This algorithm avoids many of the problems associated with older methods in that it dynamically adapts to the redundancy characteristics of the data being compressed. An investigation into possible application of this algorithm yields insight into the compressibility of various types of data and serves to illustrate system problems inherent in using any compression scheme. For readers interested in simple but subtle procedures, some details of this algorithm and its implementations are also described. The focus throughout this article will be on transparent compression, in which the computer programmer is not aware of the existence of compression except in system performance. This form of compression is "noiseless": the decompressed data is an exact replica of the input data, and the compression apparatus is given no special program information, such as data type or usage statistics. Transparency is perceived to be important because putting an extra burden on the application programmer would cause
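A minimal illustrative LZW encoder over bytes is sketched below (not Welch's optimized implementation); it shows the dynamically growing string table that lets the method adapt to the redundancy of the data.

# Minimal LZW encoder sketch over bytes; dictionary codes grow as longer
# repeated substrings are seen, which is how the method adapts to redundancy.
def lzw_encode(data: bytes) -> list[int]:
    table = {bytes([i]): i for i in range(256)}   # initial single-byte entries
    w, out = b"", []
    for byte in data:
        wc = w + bytes([byte])
        if wc in table:
            w = wc                                # extend the current match
        else:
            out.append(table[w])                  # emit code for the longest match
            table[wc] = len(table)                # add the new string to the table
            w = bytes([byte])
    if w:
        out.append(table[w])
    return out

codes = lzw_encode(b"TOBEORNOTTOBEORTOBEORNOT")
print(len(codes), "codes emitted for 24 input bytes")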

2,426 citations


"A wavelet transform-based ECG compr..." refers methods in this paper

  • ...The proposed algorithm was implemented on a SparcStation 2 computer, where the wavelet-based filters with 10-taps were designed by Daubechies’s algorithm [16], the layer was set to , the buffer size for segmenting input ECG signals was set to , and the Lempel–Ziv–Welch (LZW) encoder [20] was chosen as the entropy encoder for simplicity....


Journal ArticleDOI
TL;DR: The theoretical bases behind the direct ECG data compression schemes are presented and classified into three categories: tolerance-comparison compression, DPCM, and entropy coding methods; a framework for evaluation and comparison of ECG compression schemes is also presented.
Abstract: Electrocardiogram (ECG) compression techniques are compared, and a unified view of these techniques is established. ECG data compression schemes are presented in two major groups: direct data compression and transformation methods. The direct data compression techniques are ECG differential pulse code modulation (DPCM) and entropy coding, AZTEC, Turning-point, CORTES, Fan and SAPA algorithms, peak-picking, and cycle-to-cycle compression methods. The transformation methods include Fourier, Walsh, and Karhunen-Loeve transforms. The theoretical bases behind the direct ECG data compression schemes are presented and classified into three categories: tolerance-comparison compression, DPCM, and entropy coding methods. A framework for evaluation and comparison of ECG compression schemes is presented.
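As a concrete example from the "direct" family (first-difference DPCM followed by an empirical entropy estimate, rather than any specific algorithm from the survey), the sketch below shows why differencing a slowly varying ECG lowers the bits needed per sample.

# Sketch of a direct method: first-difference DPCM plus an entropy estimate,
# illustrating why differencing reduces the bits needed per sample.
import numpy as np

def entropy_bits(symbols):
    _, counts = np.unique(symbols, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())         # empirical bits per symbol

x = np.round(200 * np.sin(np.linspace(0, 8 * np.pi, 4000))).astype(int)  # toy ECG-like data
d = np.diff(x, prepend=0)                         # DPCM residual (predictor = previous sample)
print(entropy_bits(x), ">", entropy_bits(d))      # residuals need far fewer bits per sample
x_rec = np.cumsum(d)                              # lossless reconstruction
assert np.array_equal(x, x_rec)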

690 citations


"A wavelet transform-based ECG compr..." refers methods in this paper

  • ...In most cases, direct methods are superior to transform methods with respect to system complexity and the error control mechanism, however, transform methods usually achieve higher compression ratios and are insensitive to the noise contained in original ECG signals [1]....


  • ...In direct methods, the compression is done directly on the ECG samples; examples include the amplitude zone time epoch coding (AZTEC), the turning point (TP), the coordinate reduction time encoding system (CORTES), the scan-along polygonal approximation (SAPA), peak-picking, cycle-to-cycle, and differential pulse code modulation (DPCM) [1]–[4]....


Journal ArticleDOI
TL;DR: Pilot data from a blind evaluation of compressed ECG's by cardiologists suggest that the clinically useful information present in original ECG signals is preserved by 8:1 compression, and in most cases 16:1 compressed ECGs are clinically useful.
Abstract: Wavelets and wavelet packets have recently emerged as powerful tools for signal compression. Wavelet and wavelet packet-based compression algorithms based on embedded zerotree wavelet (EZW) coding are developed for electrocardiogram (ECG) signals, and eight different wavelets are evaluated for their ability to compress Holter ECG data. Pilot data from a blind evaluation of compressed ECG's by cardiologists suggest that the clinically useful information present in original ECG signals is preserved by 8:1 compression, and in most cases 16:1 compressed ECG's are clinically useful.
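The sketch below is a crude stand-in for such wavelet compression at a fixed 8:1 ratio (it is not EZW and ignores the cost of coding coefficient positions): only the largest eighth of the coefficients are kept, and the distortion of the reconstruction is measured.

# Crude stand-in for wavelet compression at a fixed ratio (not EZW itself,
# and index overhead is ignored): keep only the largest 1/8 of the wavelet
# coefficients, zero the rest, and measure the distortion at "8:1".
import numpy as np
import pywt

def compress_at_ratio(x, ratio=8, wavelet="db4", level=5):
    arr, sl = pywt.coeffs_to_array(pywt.wavedec(x, wavelet, level=level))
    k = max(1, arr.size // ratio)                       # coefficients to keep
    thr = np.sort(np.abs(arr))[-k]
    kept = np.where(np.abs(arr) >= thr, arr, 0.0)
    return pywt.waverec(pywt.array_to_coeffs(kept, sl, output_format="wavedec"),
                        wavelet)[: len(x)]

x = np.sin(np.linspace(0, 24 * np.pi, 4096)) ** 5       # toy Holter-like waveform
rec = compress_at_ratio(x, ratio=8)
prd = 100 * np.sqrt(np.sum((x - rec) ** 2) / np.sum(x ** 2))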

445 citations