Analysis of ECG Data Compression Techniques

01 Jan 2013
TL;DR: A comparative study of the Fast Fourier Transform (FFT), Discrete Cosine Transform (DCT), Discrete Sine Transform (DST) and Discrete Cosine Transform-II (DCT-II) is presented.
Abstract: ECG (electrocardiogram) is a test that measures the electrical activity of the heart. The heart is a muscular organ that beats in rhythm to pump blood through the body. A large amount of signal data needs to be stored and transmitted, so it is necessary to compress ECG signal data efficiently. Over the past decades, many ECG compression methods have been proposed; they can be roughly classified into three categories: direct methods, parameter extraction methods and transform methods. In this paper, a comparative study of the Fast Fourier Transform (FFT), Discrete Cosine Transform (DCT), Discrete Sine Transform (DST) and Discrete Cosine Transform-II (DCT-II) is presented. Records selected from the MIT-BIH arrhythmia database are tested. Compression Ratio (CR) and Percent Root-mean-square Difference (PRD) are used for performance evaluation.
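
For reference, the two figures of merit used in the paper can be computed as below. This is a minimal Python sketch assuming the standard textbook definitions of CR and PRD; the paper's exact bit-accounting conventions are not shown on this page.

```python
import numpy as np

def compression_ratio(original_bits: int, compressed_bits: int) -> float:
    """CR: size of the original record divided by size of the compressed record."""
    return original_bits / compressed_bits

def prd(original, reconstructed) -> float:
    """PRD (%): 100 * sqrt(sum((x - x_hat)^2) / sum(x^2))."""
    x = np.asarray(original, dtype=float)
    x_hat = np.asarray(reconstructed, dtype=float)
    return 100.0 * np.sqrt(np.sum((x - x_hat) ** 2) / np.sum(x ** 2))
```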


Citations
Journal ArticleDOI
TL;DR: The results show that this method can be used efficiently for compression of ECG signals from multiple leads and that it performs better than techniques based on SVD and Huffman Encoding.
Abstract: ECG (Electrocardiogram) is a test that analyzes the electrical behaviour of the heart and is used in diagnosing most cardiac diseases. A large amount of ECG data from multiple leads needs to be stored and transmitted, which requires compression for effective data storage and retrieval. The proposed work combines Singular Value Decomposition (SVD) followed by Run Length Encoding (RLE) with Huffman Encoding (HE) and with Arithmetic Encoding (AE) individually. The ECG signal is first preprocessed. SVD is used to factorize the signal into three smaller sets of values, which preserve the significant features of the ECG. Finally, Run Length Encoding combined with Huffman encoding (RLE-HE) and with Arithmetic encoding (RLE-AE) is employed, and the compression performance metrics are compared. The proposed method is evaluated on the PTB Diagnostic database. Performance measures such as Compression Ratio (CR), Percentage Root-mean-square Difference (PRD) and Signal-to-Noise Ratio (SNR) of the reconstructed signal are used to evaluate the technique. The proposed method performs better than techniques based on SVD and Huffman Encoding alone, and the results show that it can be used efficiently for compression of ECG signals from multiple leads.
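
As a rough illustration of the pipeline above, here is a minimal Python sketch of a rank-k SVD stage followed by run-length encoding. The beat-aligned reshaping of the signal into a matrix, the rank k, and the quantization that precedes RLE are assumptions; the entropy coders (HE/AE) are elided, since any standard implementation can consume the (value, run) pairs.

```python
import numpy as np

def svd_truncate(ecg_matrix: np.ndarray, k: int):
    """Factorize the (beats x samples) matrix and keep the k largest singular values."""
    U, s, Vt = np.linalg.svd(ecg_matrix, full_matrices=False)
    return U[:, :k], s[:k], Vt[:k, :]

def run_length_encode(symbols):
    """Classic RLE: emit (value, run-length) pairs for the entropy coder."""
    runs = []
    for v in symbols:
        if runs and runs[-1][0] == v:
            runs[-1][1] += 1
        else:
            runs.append([v, 1])
    return runs
```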

5 citations

Proceedings ArticleDOI
Zhe Liu, Wenbin Yu, Cailian Chen, Bo Yang, Xinping Guan
27 Jul 2016
TL;DR: An adaptive compression ratio estimation technique for ECG signals is illustrated and a common model of the relationship between compression ratio and sparsity is proposed, which keeps the reconstruction quality stable and improves compression performance by 18.85% compared with traditional CS-based methods.
Abstract: Real-time electrocardiogram (ECG) monitoring systems have sprung up owing to the considerable interest attracted by Wireless Body Area Networks (WBANs). Commonly, the ECG data must be compressed for higher energy efficiency, and Compressive Sensing (CS) has proved to be an effective way to do so. However, for real-time ECG monitoring, the length of the data frame must be strictly limited to keep data latency short, which unavoidably causes variation in data sparsity and fluctuation in reconstruction quality. Furthermore, the compression ratio is well worth considering together with the corresponding energy cost in WBANs. To balance reconstruction quality and compression ratio, this paper illustrates an adaptive compression ratio estimation technique for the ECG signal and proposes a common model of the relationship between compression ratio and sparsity. The correlated-sparsity compression ratio (CoCR) is defined to reflect the influence of sparsity on compression performance. Moreover, a two-dimensional clustering algorithm is designed to accelerate operation and improve the precision of classification without prior knowledge. Finally, simulation results verify that the proposed method keeps the reconstruction quality stable and improves compression performance by 18.85% compared with traditional CS-based methods.
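
A minimal sketch of a compressive-sensing front end with a toy sparsity-driven choice of measurement count is shown below. The DCT sparsifying basis, the 1% threshold, and the 4x oversampling rule are illustrative assumptions; they stand in for, and are not, the paper's CoCR model and clustering algorithm.

```python
import numpy as np
from scipy.fft import dct

def estimate_sparsity(frame: np.ndarray, rel_thresh: float = 0.01) -> int:
    """Count DCT coefficients that carry non-negligible energy."""
    c = dct(frame, norm='ortho')
    return int(np.sum(np.abs(c) > rel_thresh * np.max(np.abs(c))))

def cs_measure(frame: np.ndarray, rng: np.random.Generator):
    """y = Phi @ x, with m adapted to the estimated sparsity s (m ~ 4*s)."""
    n = len(frame)
    s = estimate_sparsity(frame)
    m = min(n, max(1, 4 * s))                     # rule-of-thumb oversampling
    phi = rng.standard_normal((m, n)) / np.sqrt(m)
    return phi @ frame, phi                       # transmit y; share Phi via a seed
```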

3 citations

Journal ArticleDOI
TL;DR: ECG signal data compression techniques are analyzed using the MIT-BIH database and then compared on the Apnea-ECG and Challenge 2017 Training databases; DCT and SAPA/FAN show the best compression performance among the methods tested.
Abstract: Technological advances have introduced different methods for telecardiology, one of the long-standing medical fields, which includes many applications and has grown very well. In telecardiology a very large amount of ECG data is recorded, so an efficient, lossless technique is needed to compress electrocardiogram (ECG) data. Compressing ECG data reduces storage needs, giving a more efficient cardiological system for analyzing and diagnosing the condition of the heart. In this paper, ECG signal data compression techniques are analyzed using the MIT-BIH database and then compared on the Apnea-ECG and Challenge 2017 Training databases. Several widely used time-, amplitude- and frequency-analysis techniques are studied, such as run-length coding, AZTEC, Spin-Coding, FFT, DCT, DST, SAPA/FAN and DCT-II, of which DCT and SAPA/FAN have the best compression performance compared to the other methods.
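
Since DCT is one of the two best performers reported above, a minimal transform-compression round trip may help fix ideas. The keep-the-largest-coefficients rule is an illustrative assumption, and the coefficient-index bookkeeping needed for a real bitstream is omitted.

```python
import numpy as np
from scipy.fft import dct, idct

def dct_compress(x: np.ndarray, keep: int) -> np.ndarray:
    """Zero all but the `keep` largest-magnitude DCT coefficients."""
    c = dct(x, norm='ortho')
    smallest = np.argsort(np.abs(c))[:len(c) - keep]
    c[smallest] = 0.0
    return c

def dct_reconstruct(c: np.ndarray) -> np.ndarray:
    """Inverse transform of the thinned coefficient vector."""
    return idct(c, norm='ortho')
```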

3 citations

Journal ArticleDOI
TL;DR: Signal processing techniques incorporated with data compression enrich the signals and boost storage efficiency and transmission reliability; compression based on these techniques is efficient since the compressed signal is reconstructed to match the original exactly.
Abstract: Signal processing techniques incorporated with data compression processes enrich the signals and boost storage efficiency and transmission reliability. Transmitting uncompressed original data consumes wide bandwidth, which increases transmission time and leads to data hammering. These limitations force a search for strategic data compression techniques. Lossless compression techniques are requisite where the original and the decompressed data must be identical, or where deviations from the original data would lead to catastrophic consequences, especially in biomedical signal analysis and diagnostics. To this end, the input signal is preprocessed with differential pulse code modulation (DPCM), which reduces the interchannel dependencies to obtain the desired output. An array of compression techniques is utilized in the compression process: combinations of clustering and coding (k-means clustering with arithmetic encoding [AE] and with Huffman encoding [HE]) are analyzed using electrocardiogram (ECG) and electroencephalogram (EEG) signals. The proposed method employs k-means clustering combined with Huffman encoding (DiKHE) and k-means clustering combined with arithmetic encoding (DiKAE) individually. The compression ratio (CR) is analyzed for these combinations over various cluster sizes K (K = 2, 3, 4, 5, 6). Maximum CRs of 6.03144 and 4.54126 are obtained for ECG and EEG signals, respectively. Compression based on these techniques is efficient since the compressed signal is reconstructed perfectly, matching the original signal exactly.
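
The DPCM-plus-clustering chain can be sketched as follows. Note that this is a simplified, lossy illustration: the bookkeeping that lets the paper claim exact reconstruction is not reproduced here, and the use of scikit-learn's KMeans is an assumption; the Huffman/arithmetic coders that compress the index stream are elided.

```python
import numpy as np
from sklearn.cluster import KMeans

def dpcm(x) -> np.ndarray:
    """First-order DPCM: code differences, which cluster far more tightly than raw samples."""
    x = np.asarray(x, dtype=float)
    d = np.empty_like(x)
    d[0] = x[0]
    d[1:] = np.diff(x)
    return d

def cluster_indices(d: np.ndarray, K: int = 4):
    """Map each difference to its nearest of K centroids; the small index
    alphabet is what the Huffman or arithmetic coder then compresses."""
    km = KMeans(n_clusters=K, n_init=10).fit(d.reshape(-1, 1))
    return km.labels_, km.cluster_centers_.ravel()
```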

2 citations


Cites methods from "Analysis of ECG Data Compression Te..."

  • ...It is measured with 3 leads or 12 leads with the electrodes connected to the surface of the skin.(1,2) The detected ECG signal is amplified by an amplifier, filtered, converted to a digital signal and stored....


Proceedings ArticleDOI
01 Apr 2017
TL;DR: A comparative study of the turning point (TP) compression technique and the Fan compression technique is presented; the main comparison has been made morphologically.
Abstract: The electrocardiogram (ECG) is utilized in the diagnosis and treatment of various heart diseases. ECG data is compressed so that it can be used effectively in telemedicine, where huge quantities of signal data must be stored and sent to different places, so it is essential to compress ECG signal data in a resourceful way. ECG signals are mainly compressed for two reasons: on-line data transmission, and effective and economical data storage. Over the last five decades, several data compression techniques have been developed for ECG signals. These techniques can be classified into three categories: direct data compression (DDC), transformation compression (TC) and parameter extraction compression (PEC). In this paper, a comparative study of the turning point (TP) compression technique and the Fan compression technique is presented. The comparison is based on parameters such as compression ratio (CR), percentage root-mean-square difference (PRD) and quality score (QS), and the main comparison has been made morphologically.
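
For orientation, the turning point technique compared above is a fixed 2:1 reducer; a minimal sketch follows (the Fan half of the comparison is sketched under the SAPA-2 reference at the end of this page). The handling of odd-length input is an implementation assumption.

```python
import numpy as np

def _sign(v) -> int:
    return (v > 0) - (v < 0)

def turning_point(x: np.ndarray) -> np.ndarray:
    """Classic TP: from each pair (x1, x2) after reference x0, keep x1 when
    the slope reverses at x1 (a turning point), otherwise keep x2."""
    out = [x[0]]
    x0 = x[0]
    for i in range(1, len(x) - 1, 2):
        x1, x2 = x[i], x[i + 1]
        if _sign(x1 - x0) * _sign(x2 - x1) < 0:   # slope changes direction at x1
            out.append(x1)
            x0 = x1
        else:
            out.append(x2)
            x0 = x2
    return np.asarray(out)
```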

2 citations

References
Journal ArticleDOI
TL;DR: A preprocessing program developed for real-time monitoring of the electrocardiogram by digital computer has proved useful for rhythm analysis.
Abstract: A preprocessing program developed for real-time monitoring of the electrocardiogram by digital computer has proved useful for rhythm analysis. The program suppresses low amplitude signals, reduces the data rate by a factor of about 10, and codes the result in a form convenient for analysis.
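
This reference is the classic AZTEC preprocessor; the flavor of its zero-order "plateau" stage can be sketched as below. The aperture eps is an assumption, and the slope-handling stage of the full algorithm is omitted.

```python
import numpy as np

def aztec_plateaus(x: np.ndarray, eps: float):
    """Greedy zero-order interpolation: one (amplitude, duration) line per run
    of samples whose peak-to-peak spread stays within the aperture eps."""
    lines, start = [], 0
    lo = hi = x[0]
    for i in range(1, len(x)):
        new_lo, new_hi = min(lo, x[i]), max(hi, x[i])
        if new_hi - new_lo > eps:                 # aperture exceeded: close the line
            lines.append(((lo + hi) / 2.0, i - start))
            start, lo, hi = i, x[i], x[i]
        else:
            lo, hi = new_lo, new_hi
    lines.append(((lo + hi) / 2.0, len(x) - start))
    return lines
```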

374 citations

Journal ArticleDOI
TL;DR: The method of Fourier descriptors (FDs) is presented for ECG data compression; it is resistant to noisy signals and simple, requiring only implementation of the forward and inverse FFT.
Abstract: The method of Fourier descriptors (FD's) is presented for ECG data compression. The two-lead ECG data are segmented into QRS complexes and S-Q intervals, expressed as a complex sequence, and are Fourier transformed to obtain the FD's. A few lower order descriptors symmetrically situated with respect to the dc coefficient represent the data in the Fourier (compressed) domain. While compression ratios of 10:1 are feasible for the S-Q interval, the clinical information requirements limit this ratio to 3:1 for the QRS complex. With an overall compression ratio greater than 7, the quality of the reconstructed signal is well suited for morphological studies. The method is resistant to noisy signals and is simple, requiring implementation of forward and inverse FFT. The results of compression of ECG data obtained from more than 50 subjects with rhythm and morphological abnormalities are presented.
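
A minimal sketch of the descriptor selection described above: the two leads form one complex sequence, and only a few low-order FFT coefficients symmetric about the dc term are kept. The per-side count m is a free parameter here, and the QRS/S-Q segmentation step is omitted.

```python
import numpy as np

def fourier_descriptors(lead1, lead2, m: int):
    """Keep 2*m + 1 descriptors: dc plus m coefficients on each side of it."""
    z = np.asarray(lead1) + 1j * np.asarray(lead2)
    Z = np.fft.fft(z)
    return np.concatenate([Z[:m + 1], Z[-m:]])    # dc, +1..+m, then -m..-1

def reconstruct_leads(descriptors, n: int, m: int):
    """Zero-pad the retained descriptors back to length n and invert."""
    Z = np.zeros(n, dtype=complex)
    Z[:m + 1] = descriptors[:m + 1]
    Z[-m:] = descriptors[m + 1:]
    z = np.fft.ifft(Z)
    return z.real, z.imag                         # the two reconstructed leads
```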

183 citations

Journal ArticleDOI
TL;DR: Because the proposed real-time data compression and transmission algorithm can compress and transmit data in real time, it can serve as an optimal biosignal data transmission method for limited-bandwidth communication between e-health devices.
Abstract: This paper introduces a real-time data compression and transmission algorithm between e-health terminals for a periodic ECG signal. The proposed algorithm consists of five compression procedures and four reconstruction procedures. In order to evaluate its performance, the algorithm was applied to all 48 recordings of the MIT-BIH arrhythmia database, and the compression ratio (CR), percent root-mean-square difference (PRD), normalized PRD (PRDN), RMS, SNR, and quality score (QS) values were obtained. The results showed that the CR was 27.9:1 and the PRD was 2.93 on average over all 48 records with a 15% window size. In addition, the performance of the algorithm was compared to those of similar algorithms introduced recently by others. The proposed algorithm showed clearly superior performance on all 48 records at compression ratios lower than 15:1, whereas it showed similar or slightly inferior PRD performance at compression ratios higher than 20:1. In light of the fact that similarity with the original data becomes meaningless when the PRD is higher than 2, the proposed algorithm performs significantly better than the other algorithms. Moreover, because the algorithm can compress and transmit data in real time, it can serve as an optimal biosignal data transmission method for limited-bandwidth communication between e-health devices.

173 citations

Journal ArticleDOI
TL;DR: The algorithm SAPA-2, presented recently as a method for representing electrocardiographic waveforms as a series of straight-line segments, appears to be equivalent to an older algorithm, the Fan.
Abstract: The algorithm SAPA-2, presented recently as a method for representing electrocardiographic waveforms as a series of straight-line segments, appears to be equivalent to an older algorithm, the Fan.
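
A minimal sketch of the Fan idea (per this note, equivalent to SAPA-2): retain a sample only when no straight line from the last retained sample can stay within +/- eps of every skipped sample. The tolerance eps is an assumption.

```python
import numpy as np

def fan(x: np.ndarray, eps: float):
    """Greedy first-order segmenter; returns the indices of retained samples."""
    kept, origin = [0], 0
    slope_hi, slope_lo = np.inf, -np.inf          # the converging "fan" of slopes
    for i in range(1, len(x)):
        dt = i - origin
        slope_hi = min(slope_hi, (x[i] + eps - x[origin]) / dt)
        slope_lo = max(slope_lo, (x[i] - eps - x[origin]) / dt)
        if slope_lo > slope_hi:                   # fan collapsed: keep previous sample
            kept.append(i - 1)
            origin = i - 1
            slope_hi = x[i] + eps - x[origin]     # re-open the fan from the new origin
            slope_lo = x[i] - eps - x[origin]
    kept.append(len(x) - 1)
    return kept
```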

41 citations