Journal Article

Prediction coding of ECG by nonlinear autoregressive model

TL;DR
This paper shows that the proposed nonlinear-model method is especially effective in improving ECG coding efficiency, since the improvement is at most 0.1 bit with ECG coding methods based on linear transforms.
Abstract
In linear prediction coding methods for ECGs based on linear autoregressive models, the prediction accuracy for QRS waves is poor and is not improved even when the prediction order is raised beyond the second or third order. In this paper, this is attributed to the fact that the QRS wave is produced by a nonlinear generating mechanism, so that ECGs contain nonlinear components that cannot be predicted by linear models. A nonlinear prediction coding method for ECGs is proposed in which the nonlinear autoregressive model is realized by a layered neural network or a Volterra functional series, both of which are frequently used to identify nonlinear systems. With the nonlinear model, the prediction accuracy for the QRS complex is improved, the average code length in the bit-rate region above 3 bits is reduced by about 0.1 to 0.3 bit, and superior coding efficiency is realized. This paper shows that the proposed nonlinear-model method is especially effective in improving ECG coding efficiency, since the improvement in coding efficiency is at most 0.1 bit for ECG coding methods based on linear transforms, such as linear prediction, orthogonal wavelet transforms, and the like. © 2000 Scripta Technica, Syst Comp Jpn, 31(7): 66–74, 2000
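To make the idea concrete, here is a minimal numpy sketch of prediction coding with a linear AR predictor versus a small one-hidden-layer sigmoid network used as a nonlinear autoregressive predictor. The synthetic spiky signal, the model sizes, and the tiny training loop are assumptions for illustration only (not the authors' network or Volterra model), and the Gaussian-entropy figure is only a crude proxy for the coder's bit rate.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for an ECG: a smooth baseline plus sharp QRS-like pulses.
t = np.arange(4000)
x = 0.3 * np.sin(2 * np.pi * t / 250)
for c in range(125, 4000, 250):                      # one "beat" every 250 samples
    x += 1.5 * np.exp(-0.5 * ((t - c) / 4.0) ** 2)   # narrow spike standing in for QRS
x += 0.02 * rng.standard_normal(len(t))

def lagged(x, p):
    """X[t] = [x[t-1], ..., x[t-p]], y[t] = x[t] for t = p .. N-1."""
    X = np.column_stack([x[p - k - 1:len(x) - k - 1] for k in range(p)])
    return X, x[p:]

p = 4
X, y = lagged(x, p)

# --- Linear AR(p) predictor (least squares) ---
w, *_ = np.linalg.lstsq(X, y, rcond=None)
res_lin = y - X @ w

# --- Nonlinear AR predictor: one hidden layer of sigmoids, trained by backprop ---
H, lr = 8, 0.05
W1 = 0.1 * rng.standard_normal((p, H)); b1 = np.zeros(H)
W2 = 0.1 * rng.standard_normal(H);      b2 = 0.0
sig = lambda z: 1.0 / (1.0 + np.exp(-z))
for _ in range(3000):
    h = sig(X @ W1 + b1)
    pred = h @ W2 + b2
    g = 2.0 * (pred - y) / len(y)          # d(MSE)/d(pred)
    gz = np.outer(g, W2) * h * (1 - h)     # back through the sigmoid layer
    W2 -= lr * h.T @ g;  b2 -= lr * g.sum()
    W1 -= lr * X.T @ gz; b1 -= lr * gz.sum(axis=0)
res_nl = y - (sig(X @ W1 + b1) @ W2 + b2)

# Crude bits-per-sample proxy: differential entropy of a Gaussian with the residual variance.
bits = lambda r: 0.5 * np.log2(2 * np.pi * np.e * np.var(r))
print(f"linear AR residual:    var={np.var(res_lin):.5f}, ~{bits(res_lin):.2f} bit")
print(f"nonlinear AR residual: var={np.var(res_nl):.5f}, ~{bits(res_nl):.2f} bit")
```

The residual sequence is what a prediction coder would quantize and entropy-code, so a smaller residual variance around the spikes is what would translate into a bit-rate gain of the kind reported in the abstract.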


Citations
Proceedings Article

Data reconstruction for missing electrocardiogram using linear predictive coding

TL;DR: The proposed electrocardiogram reconstruction method, based on linear predictive coding, performs well in deriving heart rate variability (HRV) measures, giving time-domain HRV measures that are very close to the ground truth.
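As a rough illustration of this kind of reconstruction (a sketch under an assumed signal, model order, and gap length, not necessarily the cited paper's exact procedure), one can fit an AR model on the samples before a gap and run the predictor forward across the missing stretch:

```python
import numpy as np

def fit_ar(x, p):
    """Least-squares AR(p) fit: x[t] ≈ a1*x[t-1] + ... + ap*x[t-p]."""
    X = np.column_stack([x[p - k - 1:len(x) - k - 1] for k in range(p)])
    coeffs, *_ = np.linalg.lstsq(X, x[p:], rcond=None)
    return coeffs

def fill_gap(history, coeffs, gap_len):
    """Run the fitted predictor forward to synthesize the missing samples."""
    p = len(coeffs)
    buf = list(history[-p:])
    out = []
    for _ in range(gap_len):
        nxt = sum(c * buf[-(k + 1)] for k, c in enumerate(coeffs))
        out.append(nxt)
        buf.append(nxt)
    return np.array(out)

# Toy usage: drop 40 samples from a noisy sine and reconstruct them.
rng = np.random.default_rng(0)
t = np.arange(1200)
sig = np.sin(2 * np.pi * t / 60) + 0.01 * rng.standard_normal(len(t))
start, gap = 800, 40
coeffs = fit_ar(sig[:start], p=8)
recon = fill_gap(sig[:start], coeffs, gap)
print("max abs error over the gap:", np.abs(recon - sig[start:start + gap]).max())
```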
References
Journal Article

Learning representations by back-propagating errors

TL;DR: Back-propagation repeatedly adjusts the weights of the connections in the network so as to minimize a measure of the difference between the actual output vector of the net and the desired output vector; as a result, the hidden units come to represent important features of the task domain.
Journal Article

Orthonormal bases of compactly supported wavelets

TL;DR: This work constructs orthonormal bases of compactly supported wavelets with arbitrarily high regularity, reviewing the concept of multiresolution analysis as well as several decomposition and reconstruction algorithms.
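For orientation, the Haar wavelet is the simplest compactly supported orthonormal case (the paper's contribution is the construction of smoother families, such as the Daubechies wavelets); the numpy sketch below only illustrates one analysis/synthesis level and the perfect-reconstruction property that wavelet-based ECG coders rely on.

```python
import numpy as np

def haar_analysis(x):
    """One level of the orthonormal Haar wavelet transform (x must have even length)."""
    x = np.asarray(x, dtype=float)
    s = np.sqrt(2.0)
    approx = (x[0::2] + x[1::2]) / s   # low-pass (scaling) coefficients
    detail = (x[0::2] - x[1::2]) / s   # high-pass (wavelet) coefficients
    return approx, detail

def haar_synthesis(approx, detail):
    """Inverse transform; perfect reconstruction because the basis is orthonormal."""
    s = np.sqrt(2.0)
    x = np.empty(2 * len(approx))
    x[0::2] = (approx + detail) / s
    x[1::2] = (approx - detail) / s
    return x

x = np.random.default_rng(1).standard_normal(16)
a, d = haar_analysis(x)
assert np.allclose(haar_synthesis(a, d), x)   # invertible and energy-preserving
```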
Journal Article

On the approximate realization of continuous mappings by neural networks

K. Funahashi
01 May 1989
TL;DR: It is proved that any continuous mapping can be approximately realized by Rumelhart-Hinton-Williams' multilayer neural networks with at least one hidden layer whose output functions are sigmoid functions.
Journal Article

Testing for nonlinearity in time series: the method of surrogate data

TL;DR: In this article, a statistical approach for identifying nonlinearity in time series is described: first, some linear process is specified as a null hypothesis; then surrogate data sets consistent with this null hypothesis are generated; finally, a discriminating statistic is computed for the original series and for each of the surrogate sets.
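A minimal numpy sketch of that procedure, using phase-randomized (Fourier-transform) surrogates for the linear null hypothesis and a time-reversal-asymmetry statistic as the discriminating statistic; both specific choices are common examples assumed here for illustration, not necessarily those of the cited article.

```python
import numpy as np

rng = np.random.default_rng(0)

def ft_surrogate(x, rng):
    """Surrogate with the same power spectrum (linear null) but randomized Fourier phases."""
    n = len(x)
    spec = np.fft.rfft(x)
    phases = rng.uniform(0.0, 2.0 * np.pi, len(spec))
    phases[0] = 0.0                    # keep the mean
    if n % 2 == 0:
        phases[-1] = 0.0               # keep the Nyquist component real
    return np.fft.irfft(np.abs(spec) * np.exp(1j * phases), n=n)

def trev(x, lag=1):
    """Time-reversal asymmetry: a discriminating statistic that is ~0 for linear Gaussian series."""
    d = x[lag:] - x[:-lag]
    return np.mean(d ** 3) / np.mean(d ** 2) ** 1.5

# Toy nonlinear series (logistic map) versus 39 surrogates consistent with the linear null.
x = np.empty(2000); x[0] = 0.4
for i in range(1, len(x)):
    x[i] = 3.9 * x[i - 1] * (1.0 - x[i - 1])
stat0 = trev(x)
stats = np.array([trev(ft_surrogate(x, rng)) for _ in range(39)])
# Reject the linear null if the original statistic lies outside the surrogate range.
print(stat0, stats.min(), stats.max())
```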
