Author
E. Kresch
Bio: E. Kresch is an academic researcher from Villanova University. The author has contributed to research in the topic of data compression, has an h-index of 1, and has co-authored 1 publication receiving 19 citations.
Topics: Data compression
Papers
TL;DR: A study of ECG compression using an upper bound on the percentage root mean square difference (PRD) is presented, which could be specified by the clinician after correlating the quality of the compressed versions of the ECG and the resulting PRD.
Abstract: The main goal of any electrocardiogram (ECG) compression algorithm is to reduce the bit rate while keeping the signal distortion at a clinically acceptable level. Percentage root mean square difference (PRD), the commonly used figure of merit, does not directly reveal whether the clinically significant ECG waveform information is preserved or not. We present the results of a study of ECG compression using an upper bound on the PRD. This bound is based on the initial performance of the algorithm and could be specified by the clinician after correlating the quality of the compressed versions of the ECG and the resulting PRD.
19 citations
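The PRD figure of merit discussed above can be sketched in a few lines. This is a minimal illustration, assuming the standard (non-mean-subtracted) PRD definition; the acceptance threshold in the demo is hypothetical, standing in for a clinician-specified upper bound.

```python
import numpy as np

def prd(original, reconstructed):
    """Percentage root mean square difference between an ECG segment
    and its reconstruction (lower is better)."""
    original = np.asarray(original, dtype=float)
    reconstructed = np.asarray(reconstructed, dtype=float)
    return 100.0 * np.sqrt(
        np.sum((original - reconstructed) ** 2) / np.sum(original ** 2)
    )

# Illustrative check against a clinician-specified upper bound on the
# PRD (the threshold value of 5.0 here is hypothetical).
x = np.sin(np.linspace(0, 2 * np.pi, 200))      # stand-in for an ECG segment
x_hat = x + 0.01 * np.random.default_rng(0).standard_normal(200)
assert prd(x, x_hat) < 5.0  # accept only if distortion stays below the bound
```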
Cited by
01 Dec 2001
TL;DR: The results showed that the truncated SVD method can provide efficient coding with high compression ratios and demonstrated the method as an effective technique for ECG data storage or signal transmission.
Abstract: The method of truncated singular value decomposition (SVD) is proposed for electrocardiogram (ECG) data compression. The signal decomposition capability of SVD is exploited to extract the significant feature components of the ECG by decomposing it into a set of basic patterns with associated scaling factors. Owing to the strong interbeat correlation among ECG cycles, the signal information is mostly concentrated within a small number of singular values and their related singular vectors. Therefore, only these significant singular triplets need to be retained as the compressed data from which the original signals are retrieved; the insignificant components can be truncated to eliminate redundancy in the ECG data. The Massachusetts Institute of Technology-Beth Israel Hospital (MIT-BIH) arrhythmia database was used to evaluate the compression performance and the recoverability of the retrieved ECG signals. The method achieved an average data rate of 143.2 b/s with a relatively low reconstruction error. These results showed that the truncated SVD method can provide efficient coding with high compression ratios, and its computational efficiency compared with other techniques demonstrates that it is effective for ECG data storage or signal transmission.
182 citations
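The truncated-SVD idea described in the abstract above can be sketched directly with numpy: stack beat-aligned ECG cycles into a matrix, keep only the k largest singular triplets, and reconstruct from those. This is an illustrative sketch, not the paper's implementation; the data below is synthetic and the function names are assumed.

```python
import numpy as np

def truncated_svd_compress(beats, k):
    """Keep only the k largest singular triplets of a beat-aligned
    ECG matrix (rows = heartbeat cycles, columns = samples per cycle)."""
    U, s, Vt = np.linalg.svd(beats, full_matrices=False)
    return U[:, :k], s[:k], Vt[:k, :]          # compressed representation

def truncated_svd_reconstruct(U_k, s_k, Vt_k):
    """Rebuild the beat matrix from the retained singular triplets."""
    return (U_k * s_k) @ Vt_k

# Illustrative data: 32 strongly correlated "beats" of 128 samples each.
rng = np.random.default_rng(1)
template = np.sin(np.linspace(0, 2 * np.pi, 128))
beats = template + 0.01 * rng.standard_normal((32, 128))

U_k, s_k, Vt_k = truncated_svd_compress(beats, k=3)
approx = truncated_svd_reconstruct(U_k, s_k, Vt_k)
# Strong interbeat correlation means a few triplets capture almost all
# of the signal energy, which is what makes the truncation cheap.
```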
TL;DR: A two-dimensional wavelet-based electrocardiogram (ECG) data compression method which employs a modified set partitioning in hierarchical trees (SPIHT) algorithm and achieves high compression ratio with relatively low distortion and is effective for various kinds of ECG morphologies.
Abstract: A two-dimensional (2-D) wavelet-based electrocardiogram (ECG) data compression method is presented which employs a modified set partitioning in hierarchical trees (SPIHT) algorithm. The modified SPIHT algorithm further exploits the redundancy among medium- and high-frequency subbands of the wavelet coefficients, and the proposed 2-D approach exploits the fact that ECG signals generally show redundancy between adjacent beats and between adjacent samples. An ECG signal is cut and aligned to form a 2-D data array, to which the 2-D wavelet transform and the modified SPIHT are then applied. Records selected from the MIT-BIH arrhythmia database are tested. The experimental results show that the proposed method achieves a high compression ratio with relatively low distortion and is effective for various kinds of ECG morphologies.
128 citations
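The cut-and-align step in the abstract above is simple to sketch: slice the 1-D ECG at beat boundaries and stack fixed-length beats as rows of a 2-D array, then apply a 2-D wavelet transform. In this illustrative sketch, a one-level 2-D Haar transform stands in for the paper's wavelet, the SPIHT coding stage is omitted, and the peak positions and function names are assumptions.

```python
import numpy as np

def cut_and_align(ecg, r_peaks, beat_len):
    """Cut a 1-D ECG at (assumed, pre-detected) R-peak positions and
    stack fixed-length beats into the rows of a 2-D array; the paper's
    actual alignment procedure differs in detail."""
    rows = [ecg[p:p + beat_len] for p in r_peaks if p + beat_len <= len(ecg)]
    return np.vstack(rows)

def haar2d_level1(block):
    """One level of a 2-D Haar transform (a simple stand-in for the
    wavelet transform applied before SPIHT coding)."""
    a = (block[:, 0::2] + block[:, 1::2]) / 2.0   # row-wise averages
    d = (block[:, 0::2] - block[:, 1::2]) / 2.0   # row-wise details
    rowpass = np.hstack([a, d])
    a2 = (rowpass[0::2, :] + rowpass[1::2, :]) / 2.0  # column-wise averages
    d2 = (rowpass[0::2, :] - rowpass[1::2, :]) / 2.0  # column-wise details
    return np.vstack([a2, d2])
```

Because adjacent beats are similar, the rows of the aligned array are highly correlated, so the transform concentrates energy in the low-frequency subband and leaves the detail subbands nearly zero.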
TL;DR: A new deep convolutional autoencoder (CAE) model for compressing ECG signals that can learn to adapt to different ECG records automatically and allows secure data transfer in a low-dimensional form to remote medical centers.
Abstract: Background and objective Advances in information technology have facilitated the retrieval and processing of biomedical data. Especially with wearable technologies and mobile platforms, we are able to follow our healthcare data, such as electrocardiograms (ECG), in real time. However, the hardware resources of these technologies are limited. For this reason, the optimal storage and safe transmission of personal health data are critical. This study proposes a new deep convolutional autoencoder (CAE) model for compressing ECG signals. Methods In this paper, a deep network structure of 27 layers consisting of encoder and decoder parts is designed. In the encoder section of this model, the signals are reduced to low-dimensional vectors; in the decoder section, the signals are reconstructed. The deep learning approach provides representations of the low and high levels of the signals in the hidden layers of the model. Hence, the original signal can be reconstructed with minimal loss. Unlike traditional linear transformation methods, the deep compression approach can learn to adapt to different ECG records automatically. Results The performance was evaluated on an experimental data set comprising 4800 ECG fragments from 48 unique clinical patients. The compression rate (CR) of the proposed model was 32.25, and the average PRD value was 2.73%. These favourable observations suggest that our deep model can allow secure data transfer in a low-dimensional form to remote medical centers. We present an effective compression approach that can potentially be used in wearable devices, e-health applications, telemetry and Holter systems.
102 citations
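The dimensionality reduction in the CAE encoder described above comes from strided convolutions: each layer roughly halves the signal length. The sketch below illustrates only that downsampling mechanism with random filters; it is not the paper's 27-layer model, and a real CAE learns its filters from ECG records.

```python
import numpy as np

def conv1d_strided(x, kernel, stride=2):
    """Valid 1-D convolution with stride: the basic operation by which
    each encoder layer of a convolutional autoencoder shortens the signal."""
    out_len = (len(x) - len(kernel)) // stride + 1
    return np.array([np.dot(x[i * stride:i * stride + len(kernel)], kernel)
                     for i in range(out_len)])

# A 512-sample ECG window passed through three strided layers
# (random filters here; a trained model learns them from data).
rng = np.random.default_rng(0)
x = rng.standard_normal(512)
for _ in range(3):
    x = conv1d_strided(x, rng.standard_normal(4), stride=2)
# Length shrinks roughly by half per layer: 512 -> 255 -> 126 -> 62.
```

The decoder reverses the process with upsampling (transposed) convolutions, reconstructing the full-length signal from the low-dimensional vector.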
TL;DR: A new algorithm for electrocardiogram (ECG) compression based on the compression of the linearly predicted residuals of the wavelet coefficients of the signal, which reduces the bit rate while keeping the reconstructed signal distortion at a clinically acceptable level.
Abstract: This paper describes a new algorithm for electrocardiogram (ECG) compression. The main goal of the algorithm is to reduce the bit rate while keeping the reconstructed signal distortion at a clinically acceptable level. It is based on the compression of the linearly predicted residuals of the wavelet coefficients of the signal. In this algorithm, the input signal is divided into blocks and each block goes through a discrete wavelet transform; the resulting wavelet coefficients are then linearly predicted. In this way, a set of uncorrelated transform domain signals is obtained. These signals are compressed using various coding methods, including modified run-length and Huffman coding techniques. The error corresponding to the difference between the wavelet coefficients and the predicted coefficients is minimized in order to obtain the best predictor. The method is assessed through the use of percentage root mean square difference (PRD) and visual inspection measures. This compression method achieves a small PRD and a high compression ratio with low implementation complexity. Finally, we compare the performance of the ECG compression algorithm using data from the MIT-BIH database.
91 citations
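The core pipeline in the abstract above, wavelet transform followed by linear prediction of the coefficients, can be sketched compactly. In this illustrative sketch a one-level Haar DWT stands in for the paper's wavelet, the predictor order is an assumption, and the predictor is fitted by least squares, which minimizes the prediction error as the abstract describes; the entropy-coding stage is omitted.

```python
import numpy as np

def haar_dwt(x):
    """One-level Haar DWT (a simple stand-in for the paper's wavelet)."""
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)
    return np.concatenate([approx, detail])

def lp_residuals(c, order=2):
    """Least-squares linear predictor of the wavelet coefficients; the
    small residuals are what get run-length/Huffman coded (the order
    here is an assumed value)."""
    A = np.column_stack([c[order - 1 - j:len(c) - 1 - j] for j in range(order)])
    w, *_ = np.linalg.lstsq(A, c[order:], rcond=None)
    return c[order:] - A @ w, w

# Illustrative run on a smooth, strongly correlated test signal.
x = np.sin(np.linspace(0, 4 * np.pi, 256))
c = haar_dwt(x)
residuals, w = lp_residuals(c)
# For correlated signals the residual energy is much smaller than the
# coefficient energy, which is what makes the entropy coding effective.
```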