
Int. J. Telemedicine and Clinical Practices, Vol. 2, No. 1, 2017
Copyright © 2017 Inderscience Enterprises Ltd.
ECG data compression algorithm for tele-monitoring
of cardiac patients
Chandan Kumar Jha* and
Maheshkumar H. Kolekar
Department of Electrical Engineering,
Indian Institute of Technology Patna,
Bihta – 801103, India
Email: ckjha.pee15@iitp.ac.in
Email: mahesh@iitp.ac.in
*Corresponding author
Abstract: This paper reports an efficient electrocardiogram (ECG) data
compression algorithm for tele-monitoring of cardiac patients from rural area,
based on combination of two encoding techniques with discrete cosine
transform. The proposed technique provides good compression ratio (CR) with
low percent root-mean-square difference (PRD) values. For performance
evaluation of the proposed algorithm 48 records of ECG signals are taken from
MIT-BIH arrhythmia database. Each record of ECG signal is of duration
1 minute and sampled at sampling frequency of 360 Hz. Noise of the ECG
signal has been removed using Savitzky-Golay filter. To transform the signal
from time domain to frequency domain, discrete cosine transform has been
used which compacts energy of the signal to lower order of frequency
coefficients. After normalisation and rounding of transform coefficients, signals
are encoded using dual encoding technique which consists of run length
encoding and Huffman encoding. The dual encoding technique compresses data
significantly without any loss of information. The proposed algorithm offers
average values of CR, PRD, quality score, percent root mean square difference
normalised, RMS error and SNR of 11.49, 3.43, 3.82, 5.51, 0.012 and 60.11 dB
respectively.
Keywords: compression; transmission; eHealth; discrete cosine transform;
Huffman encoding; run length encoding.
Reference to this paper should be made as follows: Jha, C.K. and
Kolekar, M.H. (2017) ‘ECG data compression algorithm for tele-monitoring of
cardiac patients’, Int. J. Telemedicine and Clinical Practices, Vol. 2, No. 1,
pp.31–41.
Biographical notes: Chandan Kumar Jha is currently pursuing PhD from
Indian Institute of Technology Patna in the Department of Electrical
Engineering. He worked as Assistant Professor of Electronics and
Communication Engineering at Institute of Information Technology and
Management, Gwalior from March 2013 to December 2014. He received his
Master's degree with distinction in Electronics and Communication Engineering
from Birla Institute of Technology Mesra in 2012. He received his Bachelor's
degree in Electronics and Communication Engineering from West Bengal
University of Technology Kolkata in 2010. His research interests include
biomedical signal processing, ECG signal analysis and telemedicine.

Maheshkumar H. Kolekar has been working as an Assistant Professor in the
Department of Electrical Engineering, Indian Institute of Technology Patna,
since March 2010. At IIT Patna, he served as Head of the Department of
Electrical Engineering during 2013 and has been Head of the Centre for
Advanced Systems Engineering since January 2014. He received his PhD in Electronics
and Electrical Communication Engineering from Indian Institute of Technology
Kharagpur in 2007. During 2008–2009, he was a Post-doctoral Research Fellow
in the Department of Computer Science, University of Missouri, Columbia,
USA. His research interests include digital video processing, video surveillance
and biomedical image and signal processing.
1 Introduction
Electrocardiogram is an electrophysiological signal which plays a vital role in the
diagnosis of heart diseases. It is a recording of the electrical activity of the heart
muscles using electrodes placed on the patient's body. Over a long monitoring period,
recorded electrocardiogram (ECG) data may occupy a large amount of memory space.
Compressed ECG data requires less memory space for storage and also enables efficient
data transmission in telemedicine (Dutta, 2015). Compression also reduces the
transmission time of the signal, since the number of bits to be transmitted becomes
smaller. Using telemedicine, expert cardiologists can receive ECG data in their own
premises and, after analysing the data, send the required medical suggestions to the
patient side. Thus compression of ECG data is very helpful for remote health monitoring
of heart patients in rural areas (Chaudhari and Karule, 2014), which is a great challenge
in India. ECG data compression algorithms can also be helpful from the big data
perspective of biomedical signals (Sharma et al., 2014, 2015; Sharma, 2016). In this
paper, an efficient ECG signal compression algorithm is proposed which offers a
comparatively high compression ratio and low percent root-mean-square difference.
There are many techniques for ECG data compression. Generally, these techniques can
be categorised (Mamaghanian et al., 2011) as:
1 direct time domain
2 transform domain
3 parameter extraction.
In direct time domain techniques, data samples of original signals are directly processed
for the compression. Examples of direct data compression techniques are: turning point
(TP) (Mueller, 1978), amplitude zone time epoch coding (AZTEC) (Cox et al., 1968),
Fan/SAPA (Ishijima et al., 1983), coordinate reduction time encoding system (CORTES)
(Abenstein and Tompkins, 1982) and entropy coding (Huffman, 1952). In transform
domain methods, the ECG signal is converted from the time domain to the frequency
domain and compression is performed by eliminating the insignificant spectral
components. These methods include the discrete cosine transform (Narasimha and
Peterson, 1978), discrete wavelet transform (DWT) (Rajoub, 2002; Kolekar et al., 2013),
Hermite transform (Sandryhaila et al., 2012), etc. Parameter extraction methods use a
combination of direct time domain and transform domain techniques. In these methods,
particular features of the signal are extracted, after which compression is performed.
These methods are primarily based on

linear prediction and long-term prediction method (Nave and Cohen, 1993). These
methods include feature extraction using Hilbert transform (Bolton and Westphal, 1985),
curvature-based ECG signal compression (Kim et al., 2012) etc. Lai et al. (2013)
proposed an algorithm which is based on normalisation, DCT-IV calculation, amplitude
and sign bit separation, hybrid differential computation, non-uniform quantisation and
entropy coding. The algorithm uses 64 samples in each frame and processes them
one by one. Alam and Gupta (2014) proposed a differential pulse code modulation
(DPCM)-based ECG coder which compresses the ECG signal without compressing the
QRS region. Sahoo et al. (2015) proposed an algorithm which uses empirical mode
decomposition, downsampling, DCT, window filtering and Huffman encoding.
Sadhukhan et al. (2015) proposed a compression algorithm based on adaptive bit
encoding of DFT coefficients. Real and imaginary parts of DFT coefficients are separated
and rounded, after that adaptive bit encoding is used to encode them. Mitra et al. (2012)
proposed an ECG data compression technique for GSM-based offline telecardiology
which was based on downsampling, single side difference generation, encoding and zero
sequence compression.
This paper proposes an efficient ECG signal compression and transmission algorithm
based on DCT and a dual encoding technique. DCT converts the ECG signal from the
time domain to the frequency domain and compacts the energy of the signal into
lower-order frequency coefficients. Using thresholding, the least significant coefficients
are set to zero. Normalisation and rounding operations are then performed to create many
repeated data points. To improve compression further, a dual encoding technique has
been used, comprising run length encoding followed by Huffman encoding. Both
encoding techniques encode data without any loss of information. The rest of this paper
is organised as follows: Section 2 explains the proposed compression and decompression
methodology of the ECG signal, Section 3 elaborates the results and discussion, and
Section 4 presents the conclusion and future scope.
2 Proposed methodology
2.1 Compression and decompression procedures
An overview of the proposed algorithm is shown in Figure 1 and Figure 2, which
elaborate the compression and decompression procedures using block diagrams. To
implement the algorithm, 48 records of ECG signals have been used, taken from the
MIT-BIH database (Moody and Mark, 2001). All these records have 1 channel, a
sampling frequency of 360 Hz and 11-bit resolution. Each record of the ECG signal is of
1 minute duration and contains 21,600 sample points. Before compression, each record
of the ECG signal occupies 45 kb of memory space; after compression, the average
space occupied by each record is 3.92 kb. The compression procedure is followed step
by step at the transmitter side of the tele-monitoring system and the decompression
procedure at the receiver side. The first step of the compression process is
Savitzky-Golay filtering (Schafer, 2011), which smooths the noisy ECG signal while
preserving the high-frequency components very well. A Savitzky-Golay filter of order 3
and window size 19 has been used for filtering. After filtering, downsampling is
performed to reduce the data. To transform the downsampled ECG data from the time
domain to the frequency domain, the discrete cosine transform (Narasimha and Peterson,
1978) has been used. A discrete cosine transform represents a finite amount of data as a
sum of cosine functions oscillating at different frequencies. DCT (Kolekar and Sengupta,
2004) concentrates the energy of the signal in the lower-order frequency coefficients. After
implementation of DCT, thresholding operation is applied to DCT coefficients to convert
least significant coefficients to zero. Normalisation to a scale of 0–999 and rounding
operations (Mitra et al., 2012) produce long strings of repeated data points, which can be
easily encoded by run length encoding. For the normalisation and rounding operation, a
normalisation constant has been defined, which can be represented as
Normalisation Constant, k = 999 / max(abs{c(i)}) (1)
where c(i) is the array of thresholded transform coefficients.
The data array generated by the normalisation and rounding operation can be represented
as

d(i) = round(k * c(i)) (2)
Further, the normalised and rounded data array is encoded by run length encoding and
Huffman encoding. Both encoding techniques improve compression without any loss
of information. To decompress the ECG signal, inverse Huffman encoding, inverse run
length encoding, denormalisation and inverse DCT have been used. Spline interpolation
(Sun et al., 2007) is used as the inverse process of downsampling. After spline
interpolation, the reconstructed ECG signal is obtained.
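The steps above can be sketched end to end in Python. This is a minimal illustration, not the authors' implementation: the filter order (3) and window size (19) follow the paper, while the downsampling factor (2) and the fraction of DCT coefficients retained after thresholding (25%) are assumptions chosen for the example.

```python
import numpy as np
from scipy.fft import dct, idct
from scipy.interpolate import CubicSpline
from scipy.signal import savgol_filter

def compress(ecg, down_factor=2, keep_fraction=0.25):
    """Savitzky-Golay filtering -> downsampling -> DCT -> thresholding
    -> normalisation (equation (1)) and rounding (equation (2))."""
    smooth = savgol_filter(ecg, window_length=19, polyorder=3)
    ds = smooth[::down_factor]             # downsample to reduce the data
    c = dct(ds, norm='ortho')              # energy compacts into low-order coefficients
    c[int(len(c) * keep_fraction):] = 0.0  # threshold least significant coefficients
    k = 999.0 / np.max(np.abs(c))          # normalisation constant, equation (1)
    d = np.round(k * c).astype(int)        # rounded integer array, equation (2)
    return d, k

def decompress(d, k, down_factor=2, n_samples=None):
    """Denormalisation -> inverse DCT -> spline interpolation."""
    ds = idct(d / k, norm='ortho')
    t = np.arange(len(ds)) * down_factor   # sample positions of the kept samples
    spline = CubicSpline(t, ds)
    n = n_samples if n_samples is not None else len(ds) * down_factor
    return spline(np.arange(n))
```

The rounded integer array d is what the dual encoding stage (run length encoding followed by Huffman encoding) then compresses losslessly; the only lossy steps in the chain are filtering, downsampling, thresholding and rounding.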
2.2 Run length encoding
Run length encoding (Jha and Kolekar, 2016) has been used to encode normalised and
rounded transform coefficients of the ECG signal. The normalised and rounded
coefficients form long runs of the same data points and can be efficiently stored using
run length encoding. For example,

X = [2 2 2 2 0 0 0 0 0 4 4 4 3 3]

can be represented using run length encoding as

Y = [2 4 0 5 4 3 3 2]

X contains 14 elements and the encoded signal Y contains 8 elements. Thus run length
encoding exploits the repetition of data and provides improved compression performance.
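A minimal sketch of this (value, run-length) scheme, matching the X/Y example above:

```python
def rle_encode(data):
    # Store each run as a (value, count) pair, flattened into one list.
    encoded = []
    i = 0
    while i < len(data):
        j = i
        while j < len(data) and data[j] == data[i]:
            j += 1
        encoded.extend([data[i], j - i])
        i = j
    return encoded

def rle_decode(encoded):
    # Expand the (value, count) pairs back into the original sequence.
    decoded = []
    for value, count in zip(encoded[::2], encoded[1::2]):
        decoded.extend([value] * count)
    return decoded
```

For the example above, rle_encode([2, 2, 2, 2, 0, 0, 0, 0, 0, 4, 4, 4, 3, 3]) yields [2, 4, 0, 5, 4, 3, 3, 2], and rle_decode restores the original 14 elements exactly, confirming that the step is lossless.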
2.3 Huffman encoding
Run length encoded data can be further compressed by Huffman encoding (Lee et al.,
2011) which is a probability-based encoding technique. It is a lossless encoding
technique which exploits the repetition of data. In Huffman encoding, a probability-based
dictionary is generated and the data is encoded using that dictionary. For reconstruction
of the ECG signal at the receiver side, both the encoded data and the dictionary must be
transmitted
over the communication network. Huffman encoding improves compression performance
significantly.
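A standard heap-based construction of such a probability-based dictionary can be sketched as follows; this is an illustrative textbook implementation, not the authors' code:

```python
import heapq
from collections import Counter

def huffman_dictionary(data):
    """Build a prefix-free code in which frequent symbols get short codewords."""
    freq = Counter(data)
    if len(freq) == 1:                         # degenerate single-symbol input
        return {next(iter(freq)): '0'}
    # Heap entries: [weight, tie-breaker, [symbol, code], [symbol, code], ...]
    heap = [[w, i, [sym, '']] for i, (sym, w) in enumerate(freq.items())]
    heapq.heapify(heap)
    counter = len(heap)                        # tie-breaker for equal weights
    while len(heap) > 1:
        lo = heapq.heappop(heap)               # two least probable subtrees
        hi = heapq.heappop(heap)
        for pair in lo[2:]:
            pair[1] = '0' + pair[1]            # extend codes on the 0-branch
        for pair in hi[2:]:
            pair[1] = '1' + pair[1]            # extend codes on the 1-branch
        heapq.heappush(heap, [lo[0] + hi[0], counter] + lo[2:] + hi[2:])
        counter += 1
    return dict(heapq.heappop(heap)[2:])

def huffman_encode(data, table):
    # Concatenate the codeword of each symbol into one bit string.
    return ''.join(table[sym] for sym in data)
```

As the text notes, both the encoded bit string and the dictionary would be transmitted so that the receiver can invert the mapping.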

Figure 1 Block diagram of compression process: ECG signal → Savitzky-Golay filtering
→ Downsampling → Discrete cosine transform → Thresholding → Normalisation and
rounding → Run length encoding → Huffman encoding → Compressed ECG signal
Figure 2 Block diagram of decompression process: Compressed ECG signal → Inverse
Huffman encoding → Inverse run length encoding → Denormalisation → Inverse DCT
→ Spline interpolation → Reconstructed ECG signal
2.4 Performance parameters
In the literature, many algorithms have been used for ECG data compression. The
compression performance of these algorithms can be evaluated using the following
performance parameters (Lee et al., 2011):
1 Compression ratio (CR):

CR = Size of Original ECG Signal (in bytes) / Size of Compressed ECG Signal (in bytes) (3)
2 Percent root-mean-square difference (PRD):

PRD = 100 × sqrt( Σ_{n=0}^{N−1} [x(n) − r(n)]² / Σ_{n=0}^{N−1} [x(n)]² ) (4)

where x(n) is the original ECG signal and r(n) is the reconstructed ECG signal.
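Both parameters follow directly from equations (3) and (4); a small sketch (the helper names are illustrative):

```python
import numpy as np

def compression_ratio(original_bytes, compressed_bytes):
    # Equation (3): size of the original over size of the compressed signal.
    return original_bytes / compressed_bytes

def prd(x, r):
    # Equation (4): percent root-mean-square difference between the
    # original signal x(n) and the reconstructed signal r(n).
    x = np.asarray(x, dtype=float)
    r = np.asarray(r, dtype=float)
    return 100.0 * np.sqrt(np.sum((x - r) ** 2) / np.sum(x ** 2))
```

With the sizes quoted in Section 2.1 (45 kb before and 3.92 kb on average after compression), compression_ratio(45.0, 3.92) gives about 11.48, consistent with the reported average CR of 11.49.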

References

Cox, J.R. et al. (1968) 'AZTEC, a preprocessing program for real-time ECG rhythm
analysis', IEEE Transactions on Biomedical Engineering.

Huffman, D.A. (1952) 'A method for the construction of minimum-redundancy codes',
Proceedings of the IRE.

Mamaghanian, H. et al. (2011) 'Compressed sensing for real-time energy-efficient ECG
compression on wireless body sensor nodes', IEEE Transactions on Biomedical
Engineering.

Moody, G.B. and Mark, R.G. (2001) 'The impact of the MIT-BIH Arrhythmia Database',
IEEE Engineering in Medicine and Biology Magazine.

Narasimha, M.J. and Peterson, A.M. (1978) 'On the computation of the discrete cosine
transform', IEEE Transactions on Communications.