Proceedings ArticleDOI

Dissimilarity factor based classification of inferior myocardial infarction ECG

01 Jan 2016-pp 229-233
TL;DR: A statistical index, the dissimilarity factor (D), is used to classify normal and Inferior Myocardial Infarction (IMI) ECG data without any direct clinical feature extraction, demonstrating the promise of descriptive statistical tools as an alternative for medical signal analysis.
Abstract: Electrocardiography (ECG) is a popular non-invasive technique for preliminary-level investigation in cardiovascular assessment. Computerized analysis of ECG can significantly contribute towards assisted diagnosis and early detection of many cardiac diseases. Conventional automated ECG classifiers employing soft computing tools may suffer from inaccuracies introduced at the various clinical feature extraction stages. In this paper, we propose the use of a statistical index, namely, the dissimilarity factor (D), for classification of normal and Inferior Myocardial Infarction (IMI) data, without the need for any direct clinical feature extraction. Time-aligned ECG beats were obtained through filtering and wavelet decomposition, followed by PCA-based beat enhancement to generate multivariate time series data. The T wave and QRS segments of IMI datasets from Leads II, III and aVF were extracted and compared with the corresponding segments of healthy subjects using Physionet ptbdb data. With 35 IMI datasets, the average composite dissimilarity factor Dc between normal datasets was found to be 0.39, and that between normal and abnormal data was found to be 0.65. This paper shows the promise of descriptive statistical tools as an alternative for medical signal analysis.
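The abstract does not state the formula behind the dissimilarity factor. A common choice in the multivariate-statistics literature for comparing two multivariate time series windows is Krzanowski's PCA similarity factor, sketched below as an assumption (not necessarily the authors' exact definition): subspace agreement close to 1 gives a dissimilarity near 0.

```python
import numpy as np

def pca_similarity(X, Y, k=2):
    """Krzanowski-style PCA similarity between two multivariate
    windows X and Y, each shaped (samples, channels).
    NOTE: assumed definition -- the paper's exact formula for D
    is not given in the abstract."""
    # Principal directions of each dataset (rows of Vh from the SVD)
    _, _, Vx = np.linalg.svd(X - X.mean(0), full_matrices=False)
    _, _, Vy = np.linalg.svd(Y - Y.mean(0), full_matrices=False)
    L, M = Vx[:k].T, Vy[:k].T          # top-k loading vectors as columns
    # Sum of squared cosines between the two k-dim subspaces, scaled to [0, 1]
    return np.trace(L.T @ M @ M.T @ L) / k

def dissimilarity_factor(X, Y, k=2):
    return 1.0 - pca_similarity(X, Y, k)

rng = np.random.default_rng(0)
A = rng.standard_normal((200, 3))
B = A + 0.05 * rng.standard_normal((200, 3))   # near-identical data
print(dissimilarity_factor(A, B))              # small: subspaces nearly coincide
```

Under this reading, a low composite value between normal records (0.39) versus a higher value between normal and IMI records (0.65) is what makes threshold-based classification possible.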
Citations
Journal ArticleDOI
TL;DR: An overview of the methods proposed for automatic detection of ischemia and myocardial infarction using computer algorithms focuses on their historical evolution, the publicly available datasets that they have used to evaluate their performance, and the details of their algorithms for ECG and EHR analysis.
Abstract: There is a growing body of research focusing on automatic detection of ischemia and myocardial infarction (MI) using computer algorithms. In clinical settings, ischemia and MI are diagnosed using electrocardiogram (ECG) recordings as well as medical context including patient symptoms, medical history, and risk factors—information that is often stored in the electronic health records. The ECG signal is inspected to identify changes in the morphology such as ST-segment deviation and T-wave changes. Some of the proposed methods compute similar features automatically while others use nonconventional features such as wavelet coefficients. This review provides an overview of the methods that have been proposed in this area, focusing on their historical evolution, the publicly available datasets that they have used to evaluate their performance, and the details of their algorithms for ECG and EHR analysis. The validation strategies that have been used to evaluate the performance of the proposed methods are also presented. Finally, the paper provides recommendations for future research to address the shortcomings of the currently existing methods and practical considerations to make the proposed technical solutions applicable in clinical practice.

68 citations


Cites background or methods from "Dissimilarity factor based classifi..."

  • ...Gupta and Kundu [110] also used a fourth-order low-pass Butterworth filter with a cutoff frequency of 90 Hz prior to DWT filtering....



  • ...In [110], Gupta and Kundu thresholded the combination of D2 and D3 wavelet coefficients using db6 mother wavelet to find the QRS peaks....


  • ...This method has been employed in [17], [82], [110]–[113]....


  • ...Gupta and Kundu [110] examined the absolute slope of the ECG wave in a window from 600 ms after the R wave to the end of the beat to find the T wave....

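The first preprocessing step the review attributes to Gupta and Kundu — a fourth-order low-pass Butterworth filter with a 90 Hz cutoff, applied before DWT filtering — can be sketched with SciPy as follows. The sampling rate and the zero-phase `filtfilt` application are illustrative assumptions; the cited paper may use a causal filter.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def lowpass_ecg(x, fs=1000.0, cutoff=90.0, order=4):
    """Fourth-order low-pass Butterworth at 90 Hz, as described in the
    citation snippets above. fs=1000 Hz matches PTB-style recordings
    but is an assumption here; filtfilt gives zero phase distortion."""
    b, a = butter(order, cutoff / (fs / 2), btype="low")
    return filtfilt(b, a, x)

# Demo: a 10 Hz in-band tone plus a 200 Hz out-of-band tone
fs = 1000.0
t = np.arange(0, 1, 1 / fs)
sig = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 200 * t)
clean = lowpass_ecg(sig, fs)   # 200 Hz component strongly attenuated
```

The DWT thresholding stage (db6 mother wavelet, combining the D2 and D3 detail coefficients for QRS peak detection) would follow this filter; it is omitted here since it requires a wavelet library.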

Journal ArticleDOI
01 Oct 2022-Sensors
TL;DR: A systematic review of time series classification models and interpretation methods for biomedical applications found that engineered features computed with time series methods and fed into widely used machine learning classifiers were the most commonly used technique, and also most frequently achieved the best performance metrics.
Abstract: Background: Digital clinical measures collected via various digital sensing technologies such as smartphones, smartwatches, wearables, and ingestible and implantable sensors are increasingly used by individuals and clinicians to capture the health outcomes or behavioral and physiological characteristics of individuals. Time series classification (TSC) is very commonly used for modeling digital clinical measures. While deep learning models for TSC are very common and powerful, there exist some fundamental challenges. This review presents the non-deep learning models that are commonly used for time series classification in biomedical applications that can achieve high performance. Objective: We performed a systematic review to characterize the techniques that are used in time series classification of digital clinical measures throughout all the stages of data processing and model building. Methods: We conducted a literature search on PubMed, as well as the Institute of Electrical and Electronics Engineers (IEEE), Web of Science, and SCOPUS databases using a range of search terms to retrieve peer-reviewed articles that report on the academic research about digital clinical measures from a five-year period between June 2016 and June 2021. We identified and categorized the research studies based on the types of classification algorithms and sensor input types. Results: We found 452 papers in total from four different databases: PubMed, IEEE, Web of Science Database, and SCOPUS. After removing duplicates and irrelevant papers, 135 articles remained for detailed review and data extraction. Among these, engineered features using time series methods that were subsequently fed into widely used machine learning classifiers were the most commonly used technique, and also most frequently achieved the best performance metrics (77 out of 135 articles). 
Statistical modeling algorithms (24 out of 135 articles) were the second most common and second-best classification technique. Conclusions: In this review paper, time series classification models and interpretation methods for biomedical applications are summarized and categorized. While high time series classification performance has been achieved on digital clinical, physiological, or biomedical measures, no standard benchmark datasets, modeling methods, or reporting methodology exist. There is no single widely used method for time series model development or feature interpretation; however, many different methods have proven successful.

1 citation

Journal ArticleDOI
TL;DR: This paper aims at the development of an efficient ECG diagnosis system for detection of MI within a small time span, using a novel filtering technique to remove the external noise present in the ECG signal.
Abstract: Objectives: This paper aims at the development of an efficient ECG diagnosis system for detection of MI within a small time span, using a novel filtering technique to remove the external noise present in the ECG signal. Methods/Statistical Analysis: Medical experts study the electrical activity of the human heart in order to detect heart disease from the electrocardiogram (ECG) of heart patients. A Myocardial Infarction (MI), or heart attack, is a heart disease that occurs due to a block (blood clot) in the pathway of one or more coronary blood vessels (arteries) supplying blood to the heart muscle. Abnormalities in the heart can be identified by changes in the ECG signal. Conventional approaches are time consuming and require too much time for the analysis of the ECG signal. In this paper a new filtering technique is introduced for removing the external noise present in the ECG signal. Findings: The proposed approach evaluates the power spectral density of the noise-filtered bands, and classification is then performed by a classifier. The classifier compares the features of the query database against the features of the sample database and reveals the type of heart disease. Using the proposed technique, the diagnostic accuracy is increased up to 96.82%. Application/Improvements: The proposed technique is well suited for modern ECG instruments for better accuracy and analysis of the ECG signal.
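The PSD feature stage this abstract describes can be sketched generically with Welch's method. The cited paper does not specify its band selection, sampling rate, or classifier, so the parameters below are illustrative assumptions; normalizing the spectrum makes records of different gain comparable before they reach a classifier.

```python
import numpy as np
from scipy.signal import welch

def psd_features(x, fs=360.0, nperseg=256):
    """Normalized Welch power spectral density as a feature vector.
    A generic sketch of a PSD feature stage; fs and nperseg are
    illustrative choices, not values from the cited paper."""
    f, p = welch(x, fs=fs, nperseg=nperseg)
    return f, p / p.sum()   # unit-sum spectrum, comparable across records

rng = np.random.default_rng(1)
f, p = psd_features(rng.standard_normal(2048))
```

A downstream classifier would then compare such feature vectors between a query record and labeled reference records, as the abstract outlines.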

1 citation

Proceedings ArticleDOI
16 May 2016
TL;DR: This study presents a novel approach to defining the formal information of the heartbeat and detecting the R wave in electrocardiogram (ECG) recordings, using a recursive algorithm based on partitioning and intensity.
Abstract: This study presents a novel approach to defining the formal information of the heartbeat and detecting the R wave in electrocardiogram (ECG) recordings. Analysis of the QRS wave location makes it possible to identify heart abnormalities. A few methods based on classical classification techniques have been developed for the detection of rhythm disorders. The method proposed in this study is a recursive algorithm based on partitioning and intensity, evaluated on the MIT-BIH arrhythmia database. The positions of R waves detected by this new method have high reliability, and the proposed method is able to evaluate heartbeats with high accuracy. Detection of the QRS wave location was achieved with a 95% accuracy rate.
References
Journal ArticleDOI
TL;DR: An algorithm based on wavelet transforms (WT's) has been developed for detecting ECG characteristic points and the relation between the characteristic points of ECG signal and those of modulus maximum pairs of its WT's is illustrated.
Abstract: An algorithm based on wavelet transforms (WT's) has been developed for detecting ECG characteristic points. With the multiscale feature of WT's, the QRS complex can be distinguished from high P or T waves, noise, baseline drift, and artifacts. The relation between the characteristic points of ECG signal and those of modulus maximum pairs of its WT's is illustrated. By using this method, the detection rate of QRS complexes is above 99.8% for the MIT/BIH database and the P and T waves can also be detected, even with serious base line drift and noise.

1,637 citations


"Dissimilarity factor based classifi..." refers background in this paper

  • ...A high volume of publication is available on various ECG feature extraction methodologies [6-8], among which, transform domain methods have gained popularity in the last decade due to their ability to magnify specific events in the data in a different domain and to suppress unwanted information....


Journal ArticleDOI
TL;DR: Several ECG applications are reviewed where PCA techniques have been successfully employed, including data compression, ST-T segment analysis for the detection of myocardial ischemia and abnormalities in ventricular repolarization, extraction of atrial fibrillatory waves for detailed characterization of atrium fibrillation, and analysis of body surface potential maps.
Abstract: This paper reviews the current status of principal component analysis in the area of ECG signal processing. The fundamentals of PCA are briefly described and the relationship between PCA and the Karhunen-Loeve transform is explained. Aspects of PCA related to data with temporal and spatial correlations are considered, as is adaptive estimation of principal components. Several ECG applications are reviewed where PCA techniques have been successfully employed, including data compression, ST-T segment analysis for the detection of myocardial ischemia and abnormalities in ventricular repolarization, extraction of atrial fibrillatory waves for detailed characterization of atrial fibrillation, and analysis of body surface potential maps.
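One of the PCA uses this review covers, and the one the citing paper applies for beat enhancement, can be sketched as follows: stack time-aligned beats, keep only the leading principal components, and reconstruct. This is a minimal illustration, not the review's or the citing paper's exact procedure; the choice k=1 (keep only the dominant common morphology) is an assumption.

```python
import numpy as np

def pca_enhance(beats, k=1):
    """Denoise an ensemble of time-aligned ECG beats by rank-k PCA
    reconstruction. `beats` has shape (n_beats, n_samples)."""
    mu = beats.mean(axis=0)
    U, s, Vt = np.linalg.svd(beats - mu, full_matrices=False)
    # Keep the k leading principal components, discard the rest as noise
    return (U[:, :k] * s[:k]) @ Vt[:k] + mu

# Demo on synthetic beats: a fixed QRS-like template plus noise
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 300)
template = np.sin(2 * np.pi * 3 * t) * np.exp(-((t - 0.5) ** 2) / 0.02)
raw = template + 0.3 * rng.standard_normal((40, 300))
enh = pca_enhance(raw, k=1)   # closer to the template than the raw beats
```

Averaging information across beats this way is what lets PCA serve both enhancement and compression, as the review notes.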

322 citations


"Dissimilarity factor based classifi..." refers methods in this paper

  • ...PCA has found wide applications in medical signal processing, namely, ECG enhancement, classification and compression [14-16]....


Journal ArticleDOI
TL;DR: It is shown that the application of the generalized eigenvalue decomposition is an improved extension of conventional source separation techniques, specifically customized for ECG signals.
Abstract: In this letter, we propose the application of the generalized eigenvalue decomposition for the decomposition of multichannel electrocardiogram (ECG) recordings. The proposed method uses a modified version of a previously presented measure of periodicity and a phase-wrapping of the RR-interval, for extracting the "most periodic" linear mixtures of a recorded dataset. It is shown that the method is an improved extension of conventional source separation techniques, specifically customized for ECG signals. The method is therefore of special interest for the decomposition and compression of multichannel ECG, and for the removal of maternal ECG artifacts from fetal ECG recordings.
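The core generalized-eigenvalue idea can be sketched with a simplified periodicity measure: find the channel mixture whose autocovariance at the beat period is largest relative to its total variance. This is a stand-in for the letter's RR-phase-wrapped measure (an assumption, not the authors' exact formulation).

```python
import numpy as np
from scipy.linalg import eigh

def most_periodic_component(X, period):
    """Extract the linear mixture w'X maximizing the generalized
    Rayleigh quotient (w'Aw)/(w'Cw), where A is the symmetrized
    lagged covariance at the given period and C the covariance.
    X has shape (channels, samples)."""
    Xc = X - X.mean(axis=1, keepdims=True)
    n = Xc.shape[1]
    C = Xc @ Xc.T / n                                   # total covariance
    A = Xc[:, :-period] @ Xc[:, period:].T / (n - period)
    A = (A + A.T) / 2                                   # symmetrize lagged covariance
    w, V = eigh(A, C)                                   # generalized EVD, ascending
    return V[:, -1] @ Xc                                # most periodic mixture

# Demo: recover a periodic source from a 2-channel mixture with noise
rng = np.random.default_rng(0)
n = 4000
s_periodic = np.sin(2 * np.pi * np.arange(n) / 50)      # period = 50 samples
s_noise = rng.standard_normal(n)
X = np.array([[1.0, 0.6], [0.4, 1.0]]) @ np.vstack([s_periodic, s_noise])
y = most_periodic_component(X, period=50)               # correlates with s_periodic
```

Because the eigenvectors jointly diagonalize the lagged and total covariances, the remaining columns of V rank the mixtures from most to least periodic, which is what makes the method useful for decomposition and for separating maternal from fetal ECG.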

208 citations

Journal ArticleDOI
TL;DR: The NLPCA techniques are used to classify each segment into one of two classes, normal and abnormal (ST+, ST-, or artifact), and test results show that using only two nonlinear components and a training set of 1000 normal samples from each file produces a correct classification rate of approximately 80% for normal beats and higher than 90% for ischemic beats.
Abstract: The detection of ischemic cardiac beats from a patient's electrocardiogram (ECG) signal is based on the characteristics of a specific part of the beat called the ST segment. The correct classification of the beats relies heavily on the efficient and accurate extraction of the ST segment features. An algorithm is developed for this feature extraction based on nonlinear principal component analysis (NLPCA). NLPCA is a method for nonlinear feature extraction that is usually implemented by a multilayer neural network. It has been observed to have better performance, compared with linear principal component analysis (PCA), in complex problems where the relationships between the variables are not linear. In this paper, the NLPCA techniques are used to classify each segment into one of two classes: normal and abnormal (ST+, ST-, or artifact). During the algorithm training phase, only normal patterns are used, and for classification purposes, we use only two nonlinear features for each ST segment. The distribution of these features is modeled using a radial basis function network (RBFN). Test results using the European ST-T database show that using only two nonlinear components and a training set of 1000 normal samples from each file produce a correct classification rate of approximately 80% for the normal beats and higher than 90% for the ischemic beats.

174 citations


"Dissimilarity factor based classifi..." refers background in this paper

  • ...or a suitable combination of these [9-11] or, statistical pattern recognition techniques [12-13]....


Journal ArticleDOI
TL;DR: A modified combined wavelet transform technique that has been developed to analyse multilead electrocardiogram signals for cardiac disease diagnostics and two alternate diagnostic criteria have been used to check the diagnostic authenticity of the test results.
Abstract: This paper deals with a modified combined wavelet transform technique that has been developed to analyse multilead electrocardiogram signals for cardiac disease diagnostics. Two wavelets have been used, i.e. a quadratic spline wavelet (QSWT) for QRS detection and the Daubechies six coefficient (DU6) wavelet for P and T detection. After detecting the fundamental electrocardiogram waves, the desired electrocardiogram parameters for disease diagnostics are extracted. The software has been validated by extensive testing using the CSE DS-3 database and the MIT/BIH database. A procedure has been evolved using electrocardiogram parameters with a point scoring system for diagnosis of cardiac diseases, namely tachycardia, bradycardia, left ventricular hypertrophy, and right ventricular hypertrophy. As the diagnostic results are not yet disclosed by the CSE group, two alternate diagnostic criteria have been used to check the diagnostic authenticity of the test results. The consistency and reliability of the identifi...

169 citations


"Dissimilarity factor based classifi..." refers background in this paper

  • ...A high volume of publication is available on various ECG feature extraction methodologies [6-8], among which, transform domain methods have gained popularity in the last decade due to their ability to magnify specific events in the data in a different domain and to suppress unwanted information....
