Book ChapterDOI

Morphological Analysis of ECG Holter Recordings by Support Vector Machines

TL;DR: The results shown in the paper prove that the method can classify pathologies observed not only in the QRS alterations but also in P (or F), S and T waves of electrocardiograms.
Abstract: A new method for automatic shape recognition of heartbeats in ECG Holter recordings is presented. The mathematical basis of the method is the theory of support vector machines, a new paradigm of learning machines. The method consists of the following steps: signal preprocessing by digital filters, segmentation of the Holter recording into a series of heartbeats by a wavelet technique, support vector approximation of each heartbeat using Gaussian kernels, and support vector classification of the heartbeats. The learning sets for classification are prepared by a physician. Hence, we offer a learning machine as a computer-aided tool for medical diagnosis. The tool is flexible and may be tailored to the interests of physicians by setting up the learning samples. The results shown in the paper prove that our method can classify pathologies observed not only in QRS alterations but also in the P (or F), S and T waves of electrocardiograms. The advantages of our method are numerical efficiency and a very high rate of successful classification.
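The segmentation step described above can be sketched in miniature. The snippet below uses a crude amplitude-threshold R-peak detector on a synthetic ECG as a toy stand-in for the paper's digital filtering and wavelet segmentation; all function names, parameters, and the synthetic signal are illustrative, not the authors' implementation.

```python
import numpy as np

def detect_beats(ecg, fs, thresh_ratio=0.6, refractory=0.3):
    """Crude R-peak detector: local maxima above a threshold, with a
    refractory period -- a toy stand-in for wavelet segmentation."""
    thresh = thresh_ratio * np.max(ecg)
    min_gap = int(refractory * fs)
    peaks, last = [], -min_gap
    for i in range(1, len(ecg) - 1):
        if (ecg[i] > thresh and ecg[i] >= ecg[i - 1]
                and ecg[i] > ecg[i + 1] and i - last >= min_gap):
            peaks.append(i)
            last = i
    return peaks

fs = 250                                   # sampling rate in Hz
t = np.arange(0, 5, 1 / fs)
# synthetic ECG: five narrow Gaussian "R waves", one per second
ecg = sum(np.exp(-((t - c) ** 2) / 0.0008) for c in (0.5, 1.5, 2.5, 3.5, 4.5))
beats = detect_beats(ecg, fs)              # sample indices of the 5 beats
```

Once beat boundaries are known, each segment can be passed on to the approximation and classification stages.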
Citations
Journal ArticleDOI
TL;DR: This paper describes the automatic extraction of the P, Q, R, S and T waves of electrocardiographic recordings (ECGs) through the combined use of a new machine-learning algorithm termed generalized orthogonal forward regression (GOFR) and of a specific parameterized function termed Gaussian mesa function (GMF).

51 citations


Cites background from "Morphological Analysis of ECG Holte..."

  • ...[19] S. Jankowski, J. Tijink, G. Vumbaca, B. Marco, G. Karpinski, Morphological Analysis of ECG Holter Recordings by Support Vector Machines, in: Proceedings of the Third International Symposium on Medical Data Analysis, Springer-Verlag, 2002....


  • ...Markov Models [16–18] or Support Vector Machines [19]....


  • ...[46] N. Cristianini, J. Shawe-Taylor, An Introduction to Support Vector Machines, Cambridge University Press, Cambridge, 2000....


Proceedings ArticleDOI
01 Jan 2003
TL;DR: A new approach to computer-aided analysis of ECG Holter recordings that is a learning system: the pertinent features of the signal shape are automatically discovered from examples carefully selected and annotated by cardiologists.
Abstract: The paper presents a new approach to computer-aided analysis of ECG Holter recordings. In contrast to existing tools, it is a learning system: the pertinent features of the signal shape are automatically discovered from examples carefully selected and annotated by cardiologists. The mathematical basis of our system is the theory of support vector machines, applied to two tasks: signal approximation and pattern classification. The numerical procedures implement the sequential minimal optimisation algorithm. The computer program is developed in the Borland C++ Builder environment. The excellent performance of our approach, a high rate of successful pattern recognition combined with computational efficiency, makes our tools usable in clinical practice. The system is tested at the Chair and Department of Internal Medicine and Cardiology, Central Teaching Hospital in Warsaw, Poland.
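The first of the two SVM tasks named above, signal approximation with Gaussian kernels, can be illustrated with a simplified stand-in: the sketch below fits a Gaussian-kernel expansion to a synthetic beat by regularized least squares rather than the epsilon-SVR the paper uses, but with the same kernel. All parameters and the synthetic beat are illustrative.

```python
import numpy as np

def gaussian_kernel_fit(t, y, sigma=0.05, lam=1e-6):
    """Approximate a sampled beat by a Gaussian-kernel expansion.
    Regularized least squares is used here as a simplified stand-in
    for support vector approximation; the kernel is the same."""
    K = np.exp(-((t[:, None] - t[None, :]) ** 2) / (2 * sigma ** 2))
    coef = np.linalg.solve(K + lam * np.eye(len(t)), y)
    def model(s):
        Ks = np.exp(-((s[:, None] - t[None, :]) ** 2) / (2 * sigma ** 2))
        return Ks @ coef
    return model

t = np.linspace(0, 1, 120)
# synthetic beat: a tall R-like peak followed by a shallow inverted wave
beat = np.exp(-((t - 0.35) ** 2) / 0.002) - 0.3 * np.exp(-((t - 0.55) ** 2) / 0.01)
approx = gaussian_kernel_fit(t, beat)(t)
```

The resulting kernel expansion gives a compact, smooth parameterization of each beat that downstream classification can operate on.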

24 citations

Journal Article
TL;DR: The approach improved risk stratification up to 95% based on SAECG, due to the application of an FIR filter, 6 new parameters, and an efficient statistical classifier, the support vector machine.
Abstract: OBJECTIVE We present an improved method for recognition of sustained ventricular tachycardia (SVT) based on a new filtering technique (FIR), an extended signal-averaged electrocardiography (SAECG) description by 9 parameters, and the application of a support vector machine (SVM) classifier. METHODS The dataset consisted of 376 patients (100 patients with sustained ventricular tachycardia after myocardial infarction (MI), labelled as class SVT+, 176 patients without sustained ventricular tachycardia after MI, and 77 healthy persons); 50% of the data were left for validation. The analysis of SAECG was performed with 2 types of filtration: a low-pass four-pole IIR Butterworth filter and an FIR filter with a Kaiser window. We calculated 3 commonly used SAECG parameters: hfQRS (ms), RMS40 (microV), LAS<40 microV (ms), and 6 new parameters: LAS<25 microV (ms) - duration of the low-amplitude <25 microV signals at the end of the QRS complex; RMS QRS (microV) - root mean square voltage of the filtered QRS complex; pRMS (microV) - root mean square voltage of the first 40 ms of the filtered QRS complex; pLAS (ms) - duration of the low-amplitude <40 microV signals in front of the QRS complex; RMS t1 (microV) - root mean square voltage of the last 10 ms of the filtered QRS complex; RMS t2 (microV) - root mean square voltage of the last 20 ms of the filtered QRS complex. For the recognition of SVT+ class patients we used the SVM with a Gaussian kernel. RESULTS The results confirmed good generalization of the obtained models. The recognition score (calculated as correct classifications/total number of patients) for SVT+ patients on the data set containing the 3 standard parameters (Butterworth filter) is 92.55%. The same score was obtained for the data set containing 9 parameters (Butterworth filter). The best score (95.21%) was obtained for the data set based on 9 parameters and the FIR filter.
CONCLUSION Our approach improved risk stratification up to 95% based on SAECG, due to the application of the FIR filter, 6 new parameters, and an efficient statistical classifier, the support vector machine.
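Two of the standard SAECG parameters named above, RMS40 and LAS<40 microV, follow directly from their definitions. The sketch below is a hypothetical illustration assuming a filtered QRS complex sampled at 1000 Hz; the function name, dictionary keys, and toy signal are our own, not the paper's code.

```python
import numpy as np

def saecg_features(qrs, fs=1000):
    """qrs: filtered QRS complex in microvolts; fs: sampling rate (Hz).
    Computes RMS40 (RMS voltage of the terminal 40 ms) and LAS<40 uV
    (duration of the terminal low-amplitude signal) from their
    definitions; names and units are illustrative."""
    n40 = int(0.040 * fs)                        # samples in 40 ms
    rms40 = float(np.sqrt(np.mean(qrs[-n40:] ** 2)))
    las_samples = 0                              # terminal run below 40 uV
    for v in np.abs(qrs)[::-1]:
        if v >= 40.0:
            break
        las_samples += 1
    return {"RMS40_uV": rms40, "LAS40_ms": las_samples * 1000.0 / fs}

# toy QRS: 80 ms at 200 uV, then a 20 ms low-amplitude tail at 10 uV
qrs = np.concatenate([np.full(80, 200.0), np.full(20, 10.0)])
feats = saecg_features(qrs)
```

The remaining parameters (pRMS, pLAS, RMS t1, RMS t2) are computed analogously over the leading 40 ms or the last 10 and 20 ms of the complex.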

12 citations


Cites methods from "Morphological Analysis of ECG Holte..."

  • ...The SVM classifiers were successfully applied to other recognition tasks in cardiology (9, 10)....


Journal ArticleDOI
TL;DR: This research attacked the mode of action of EMTs by focusing on the response of the immune system to EMM, and found that a single dose of EMM can have an important effect on the severity of the disease.
Abstract: 1 Institute of Energy Problems of Chemical Physics, Russian Academy of Sciences Russia, 119334 Moscow, Leninsky prosp. 28, building 2 2 Institute of Biology and Chemistry, Moscow Pedagogical State University Russia, 129164, Moscow, Kibalchicha str. 6, building 1 3 Bakulev Scientific Center of Cardiovascular Surgery Russia, 121552, Moscow, Rublevskoe sh. 135 * Corresponding author: phone: +7 (915) 492-29-43 , e-mail: neurobiophys@gmail.com

8 citations


Cites background from "Morphological Analysis of ECG Holte..."

  • ...In contrast, the power fluctuations are more significant (see Part III of this paper), so it could not be applied for the correct morphological analysis of electrocardiograms [28,29]....


Book ChapterDOI
10 Nov 2005
TL;DR: A modified version of principal component analysis of 3-channel Holter recordings that enables the construction of a single linear SVM classifier for a selected group of patients with arrhythmias; the classifier has perfect generalization properties.
Abstract: The paper presents a modified version of principal component analysis of 3-channel Holter recordings that enables the construction of a single linear SVM classifier for a selected group of patients with arrhythmias. Our classifier has perfect generalization properties. We studied the discrimination of premature ventricular excitations from normal ones. The high score of correct classification (95%) is due to orienting the coordinate system along the largest eigenvector of each patient's normal heart action.
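The coordinate-orientation step described above can be sketched as follows: a hypothetical illustration that rotates beat vectors onto the eigenvectors of the covariance of a patient's normal beats, largest eigenvalue first. The function name and the synthetic "3-channel" data are illustrative assumptions.

```python
import numpy as np

def align_to_normal_axis(normal_beats, beats):
    """Rotate beat vectors so that coordinate 0 lies along the largest
    eigenvector of the covariance of the patient's normal beats."""
    centered = normal_beats - normal_beats.mean(axis=0)
    cov = centered.T @ centered / len(centered)
    eigvals, eigvecs = np.linalg.eigh(cov)     # eigenvalues ascending
    order = np.argsort(eigvals)[::-1]          # largest eigenvector first
    return beats @ eigvecs[:, order]

rng = np.random.default_rng(1)
# synthetic "3-channel" normal beats, elongated along a known direction
direction = np.array([1.0, 2.0, 2.0]) / 3.0    # unit vector
normals = rng.normal(size=(200, 1)) * direction * 5.0 \
          + rng.normal(scale=0.3, size=(200, 3))
rotated = align_to_normal_axis(normals, normals)
variances = rotated.var(axis=0)                # dominated by coordinate 0
```

After this per-patient rotation, normal and ectopic beats from different patients land in a comparable coordinate frame, which is what makes a single linear classifier feasible.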

5 citations

References
01 Jan 1998
TL;DR: Presenting a method for determining the necessary and sufficient conditions for consistency of the learning process, the author covers function estimation from small data pools, the application of these estimates to real-life problems, and much more.
Abstract: A comprehensive look at learning and generalization theory. The statistical theory of learning and generalization concerns the problem of choosing desired functions on the basis of empirical data. Highly applicable to a variety of computer science and robotics fields, this book offers lucid coverage of the theory as a whole. Presenting a method for determining the necessary and sufficient conditions for consistency of the learning process, the author covers function estimation from small data pools, the application of these estimates to real-life problems, and much more.

26,531 citations

Proceedings ArticleDOI
08 Feb 1999
TL;DR: An edited volume on support vector learning covering theory, implementations (including sequential minimal optimization), applications such as time series prediction and pairwise classification, and extensions of the algorithm.
Abstract: Introduction to support vector learning roadmap.
Part 1, Theory: three remarks on the support vector method of function estimation (Vladimir Vapnik); generalization performance of support vector machines and other pattern classifiers (Peter Bartlett and John Shawe-Taylor); Bayesian voting schemes and large margin classifiers (Nello Cristianini and John Shawe-Taylor); support vector machines, reproducing kernel Hilbert spaces, and randomized GACV (Grace Wahba); geometry and invariance in kernel based methods (Christopher J.C. Burges); on the annealed VC entropy for margin classifiers: a statistical mechanics study (Manfred Opper); entropy numbers, operators and support vector kernels (Robert C. Williamson et al.).
Part 2, Implementations: solving the quadratic programming problem arising in support vector classification (Linda Kaufman); making large-scale support vector machine learning practical (Thorsten Joachims); fast training of support vector machines using sequential minimal optimization (John C. Platt).
Part 3, Applications: support vector machines for dynamic reconstruction of a chaotic system (Davide Mattera and Simon Haykin); using support vector machines for time series prediction (Klaus-Robert Muller et al.); pairwise classification and support vector machines (Ulrich Kressel).
Part 4, Extensions of the algorithm: reducing the run-time complexity in support vector machines (Edgar E. Osuna and Federico Girosi); support vector regression with ANOVA decomposition kernels (Mark O. Stitson et al.); support vector density estimation (Jason Weston et al.); combining support vector and mathematical programming methods for classification (Bernhard Scholkopf et al.).

5,506 citations

Book
01 Jan 1996
TL;DR: A comprehensive treatment of probabilistic pattern recognition, covering the Bayes error, Vapnik-Chervonenkis theory, nearest neighbor and kernel rules, and error estimation as guides for empirical classifier selection.
Abstract: Preface * Introduction * The Bayes Error * Inequalities and alternate distance measures * Linear discrimination * Nearest neighbor rules * Consistency * Slow rates of convergence * Error estimation * The regular histogram rule * Kernel rules * Consistency of the k-nearest neighbor rule * Vapnik-Chervonenkis theory * Combinatorial aspects of Vapnik-Chervonenkis theory * Lower bounds for empirical classifier selection * The maximum likelihood principle * Parametric classification * Generalized linear discrimination * Complexity regularization * Condensed and edited nearest neighbor rules * Tree classifiers * Data-dependent partitioning * Splitting the data * The resubstitution estimate * Deleted estimates of the error probability * Automatic kernel rules * Automatic nearest neighbor rules * Hypercubes and discrete spaces * Epsilon entropy and totally bounded sets * Uniform laws of large numbers * Neural networks * Other error estimates * Feature extraction * Appendix * Notation * References * Index

3,598 citations

Journal Article
John Platt1
TL;DR: The sequential minimal optimization (SMO) algorithm solves the large QP problem arising in SVM training as a series of smallest possible QP subproblems, which avoids using a time-consuming numerical QP optimization as an inner loop.
Abstract: This paper proposes a new algorithm for training support vector machines: Sequential Minimal Optimization, or SMO. Training a support vector machine requires the solution of a very large quadratic programming (QP) optimization problem. SMO breaks this large QP problem into a series of smallest possible QP problems. These small QP problems are solved analytically, which avoids using a time-consuming numerical QP optimization as an inner loop. The amount of memory required for SMO is linear in the training set size, which allows SMO to handle very large training sets. Because matrix computation is avoided, SMO scales somewhere between linear and quadratic in the training set size for various test problems, while the standard chunking SVM algorithm scales somewhere between linear and cubic in the training set size. SMO's computation time is dominated by SVM evaluation, hence SMO is fastest for linear SVMs and sparse data sets. On real-world sparse data sets, SMO can be more than 1000 times faster than the chunking algorithm.
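The core of SMO, the analytic optimization of two Lagrange multipliers at a time, can be sketched with a simplified variant that picks the second multiplier at random and uses a linear kernel. This is a toy illustration of the two-variable update, not Platt's full algorithm with its choice heuristics.

```python
import numpy as np

def smo_train(X, y, C=1.0, tol=1e-3, max_passes=10, seed=0):
    """Simplified SMO for a linear-kernel SVM: repeatedly pick a pair of
    multipliers violating the KKT conditions, solve the two-variable QP
    analytically, and update the threshold b."""
    rng = np.random.default_rng(seed)
    n = len(y)
    K = X @ X.T                                # precomputed linear kernel
    alpha, b, passes = np.zeros(n), 0.0, 0
    while passes < max_passes:
        changed = 0
        for i in range(n):
            Ei = (alpha * y) @ K[:, i] + b - y[i]
            if (y[i] * Ei < -tol and alpha[i] < C) or (y[i] * Ei > tol and alpha[i] > 0):
                j = int(rng.integers(n - 1))
                j = j + 1 if j >= i else j     # random j != i
                Ej = (alpha * y) @ K[:, j] + b - y[j]
                ai, aj = alpha[i], alpha[j]
                if y[i] != y[j]:               # box constraints for alpha[j]
                    L, H = max(0.0, aj - ai), min(C, C + aj - ai)
                else:
                    L, H = max(0.0, ai + aj - C), min(C, ai + aj)
                eta = 2 * K[i, j] - K[i, i] - K[j, j]
                if L == H or eta >= 0:
                    continue
                aj_new = float(np.clip(aj - y[j] * (Ei - Ej) / eta, L, H))
                if abs(aj_new - aj) < 1e-5:
                    continue
                alpha[j] = aj_new
                alpha[i] = ai + y[i] * y[j] * (aj - aj_new)
                b1 = b - Ei - y[i] * (alpha[i] - ai) * K[i, i] \
                            - y[j] * (alpha[j] - aj) * K[i, j]
                b2 = b - Ej - y[i] * (alpha[i] - ai) * K[i, j] \
                            - y[j] * (alpha[j] - aj) * K[j, j]
                b = b1 if 0 < alpha[i] < C else b2 if 0 < alpha[j] < C else (b1 + b2) / 2
                changed += 1
        passes = passes + 1 if changed == 0 else 0
    w = (alpha * y) @ X                        # primal weights (linear kernel)
    return w, b

# trivially separable toy data
X = np.array([[2., 2.], [3., 2.], [2., 3.], [-2., -2.], [-3., -2.], [-2., -3.]])
y = np.array([1., 1., 1., -1., -1., -1.])
w, b = smo_train(X, y)
pred = np.sign(X @ w + b)
```

Because each two-variable subproblem is solved in closed form, no QP solver or kernel matrix factorization is needed, which is the source of SMO's linear memory footprint.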

2,856 citations

Journal ArticleDOI
TL;DR: Support vector machines are a family of machine learning methods originally introduced for the problem of classification and later generalized to various other situations, and are currently used in various domains of application, including bioinformatics, text categorization, and computer vision.
Abstract: Support vector machines (SVMs) are a family of machine learning methods, originally introduced for the problem of classification and later generalized to various other situations. They are based on principles of statistical learning theory and convex optimization, and are currently used in various domains of application, including bioinformatics, text categorization, and computer vision.

323 citations