
Showing papers in "Computing in Cardiology in 2011"


Journal Article
TL;DR: The aim of the PhysioNet/Computing in Cardiology Challenge 2011 was to develop an efficient algorithm, able to run on a mobile phone, that can provide useful feedback during the acquisition of a diagnostically useful 12-lead ECG recording.
Abstract: The aim of the PhysioNet/Computing in Cardiology Challenge 2011 was to develop an efficient algorithm, able to run on a mobile phone, that can provide useful feedback during the acquisition of a diagnostically useful 12-lead ECG recording. PhysioNet provided a large set of ECG records for use in the Challenge, together with an open-source sample application that runs on an Android phone and classifies ECGs as acceptable or unacceptable. A total of 49 teams and individuals participated in the challenge, which entailed three events. In event 1, participants developed algorithms for classifying ECGs with respect to quality, and submitted their algorithms' classifications of 500 ECGs, obtaining 89–93% accuracy using a variety of methods. In event 2, participants submitted Java implementations of their algorithms to be used in the sample mobile application; we tested these on two reference mobile phones using the same data set and scoring method as in event 1, obtaining 80–91% accuracy. Event 3 was similar to event 2, but was conducted using a set of ECGs not available for study by the participants, and the scoring was a function of both accuracy and mobile phone processing speed; in this event, similar levels of accuracy were achieved with average execution times of less than 2 seconds on the reference phones.

138 citations


Journal Article
A H Chen, S Y Huang, P S Hong, C H Cheng, E J Lin
TL;DR: The HDPS system developed in this study is a novel approach that can be used in the classification of heart disease and includes an artificial neural network algorithm for classifying heart disease based on clinical features.
Abstract: The diagnosis of heart disease in most cases depends on a complex combination of clinical and pathological data. Because of this complexity, there is significant interest among clinical professionals and researchers in the efficient and accurate prediction of heart disease. In this paper, we develop a heart disease prediction system that can assist medical professionals in predicting heart disease status based on the clinical data of patients. Our approach comprises three steps. First, we select 13 important clinical features, i.e., age, sex, chest pain type, trestbps, cholesterol, fasting blood sugar, resting ECG, max heart rate, exercise-induced angina, old peak, slope, number of vessels colored, and thal. Second, we develop an artificial neural network algorithm for classifying heart disease based on these clinical features. The accuracy of prediction is near 80%. Finally, we develop a user-friendly heart disease prediction system (HDPS). The HDPS consists of multiple features, including an input clinical data section, an ROC curve display section, and a prediction performance display section (execution time, accuracy, sensitivity, specificity, and prediction result). Our approach is effective in predicting the heart disease of a patient. The HDPS developed in this study is a novel approach that can be used in the classification of heart disease.

94 citations


Journal Article
TL;DR: Investigation of various heart rate variability measures for detecting mental stress by using ultra short term HRV analysis revealed that mRR, mHR, normalized LF, difference between normalized LF and normalized HF, and SVI were effective measures for mental stress state and normal state classification.
Abstract: Mental stress is one of the well-known major risk factors for many diseases such as hypertension, coronary artery disease, and heart attack. Conventionally, detecting mental stress in an individual is performed by interviews and/or questionnaires. In this study, we have investigated various heart rate variability (HRV) measures for detecting mental stress by using ultra-short-term HRV analysis. A number of HRV measures were investigated, e.g., mean of heart rates (mHR), mean of RR intervals (mRR), power spectra in the very low (VLF), low (LF), and high (HF) frequency ranges, sympathovagal balance index (SVI), etc. Experiments involved 60 segments of RR interval time series signals during mental stress state and normal state. Results revealed that the following HRV measures were effective for classifying mental stress state versus normal state: mRR, mHR, normalized LF, the difference between normalized LF and normalized HF, and SVI.
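As a rough illustration of how such measures are computed from a short RR-interval series, the sketch below derives mRR, mHR, normalized LF/HF power, and an LF/HF balance index. This is an illustrative reconstruction, not the authors' code; the 4 Hz resampling rate and band limits (LF 0.04–0.15 Hz, HF 0.15–0.4 Hz) are the usual HRV conventions, assumed here rather than taken from the paper.

```python
import numpy as np

def hrv_measures(rr_ms, fs_resample=4.0):
    """Time- and frequency-domain HRV measures from an RR-interval series (ms)."""
    rr = np.asarray(rr_ms, dtype=float)
    m_rr = rr.mean()                      # mRR (ms)
    m_hr = 60000.0 / m_rr                 # mHR (beats/min)

    # Evenly resample the RR tachogram so an FFT-based spectrum is meaningful.
    t = np.cumsum(rr) / 1000.0            # beat times (s)
    t_even = np.arange(t[0], t[-1], 1.0 / fs_resample)
    rr_even = np.interp(t_even, t, rr)
    rr_even -= rr_even.mean()

    psd = np.abs(np.fft.rfft(rr_even)) ** 2
    freqs = np.fft.rfftfreq(len(rr_even), d=1.0 / fs_resample)
    lf = psd[(freqs >= 0.04) & (freqs < 0.15)].sum()
    hf = psd[(freqs >= 0.15) & (freqs < 0.40)].sum()
    lf_n = lf / (lf + hf)                 # normalized LF
    hf_n = hf / (lf + hf)                 # normalized HF
    return {"mRR": m_rr, "mHR": m_hr, "LFn": lf_n, "HFn": hf_n,
            "LF-HF": lf_n - hf_n, "SVI": lf / hf}
```

A classifier would then threshold or combine these values per segment; the paper's finding is that mRR, mHR, normalized LF, LF−HF, and SVI carry most of the discriminative information.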

91 citations


Journal Article
TL;DR: Nine AF detection algorithms were selected from the literature and evaluated with the same protocol in order to study their performance under different conditions; the highest sensitivity and specificity were achieved with methods based on analysis of the irregularity of RR intervals.
Abstract: Automatic detection of Atrial Fibrillation (AF) is necessary for the long-term monitoring of patients who are suspected to have AF. Several methods for AF detection exist in the literature. These methods are mainly based on two different characteristics of AF ECGs: the irregularity of RR intervals (RRI) and the fibrillatory electrical Atrial Activity (AA). The electrical AA is characterized by the absence of the P-wave (PWA) and special frequency properties (FSA). Nine AF detection algorithms were selected from the literature and evaluated with the same protocol in order to study their performance under different conditions. Results showed that the highest sensitivity (Se=97.64%) and specificity (Sp=96.08%) were achieved with methods based on analysis of the irregularity of RR intervals, while combining RR and atrial activity analysis gave the highest positive predictive value (PPV=92.75%). Algorithms based on RR irregularity were also the most robust against noise (Se=85.79% and Sp=81.90% for SNR=0dB; and Se=82.52% and Sp=40.47% for SNR=−5dB).
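The RR-irregularity idea that these best-performing detectors share can be sketched very simply: AF rhythms show high beat-to-beat variability, captured here by the coefficient of variation of an RR window. This is a minimal sketch of the principle only; the 10% threshold is an illustrative value, not taken from any of the nine evaluated algorithms.

```python
import statistics

def af_suspected(rr_ms, cv_threshold=0.10):
    """Flag a window of RR intervals (ms) as possible AF from RR irregularity.

    cv_threshold is a hypothetical cut-off for illustration.
    """
    mean_rr = statistics.fmean(rr_ms)
    cv = statistics.pstdev(rr_ms) / mean_rr   # coefficient of variation
    return cv > cv_threshold
```

Real detectors refine this with ectopy rejection, longer histories, and Markov or density-based models of the RR distribution, but the core statistic is of this form.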

91 citations


Journal Article
TL;DR: An algorithm to detect poor-quality ECGs collected in low-resource environments is described; it was entered in the PhysioNet/Computing in Cardiology Challenge 2011 ‘Improving the quality of ECGs collected using mobile phones’.
Abstract: An algorithm to detect poor-quality ECGs collected in low-resource environments is described (and was entered in the PhysioNet/Computing in Cardiology Challenge 2011 ‘Improving the quality of ECGs collected using mobile phones’). The algorithm is based on previously published signal quality metrics, with some novel additions, designed for intensive care monitoring. The algorithms have been adapted for use on short (10 s) 12-lead ECGs. The metrics quantify spectral energy distribution, higher-order moments, and inter-channel and inter-algorithm agreement. Six metrics are produced for each channel (72 features in all) and presented to machine learning algorithms for training on the provided labeled data (Set-a) for the challenge. (Binary labels were available, indicating whether the data were acceptable or unacceptable for clinical interpretation.) We re-annotated all the data in Set-a as well as those in Set-b (the test data) using two independent annotators, and a third for adjudication of differences. Events were balanced and the 1000 subjects in Set-a were used to train the classifiers. We compared four classifiers: Linear Discriminant Analysis, Naïve Bayes, a Support Vector Machine (SVM), and a Multi-Layer Perceptron (MLP) artificial neural network. The SVM and MLP provided the best (and almost equivalent) classification accuracies of 99% on the training data (Set-a) and 95% on the test data (Set-b). The binary classification results (acceptable or unacceptable) were then submitted as an entry to the PhysioNet/Computing in Cardiology Challenge 2011. Before the competition deadline, we scored 92.6% on the unseen test data (0.6% less than the winning entry). After correcting labelling inconsistencies and errors, we achieved 94.0%, the highest overall score of all competition entrants.

68 citations


Journal Article
TL;DR: This work developed an online detector of driver drowsiness based on HRV analysis that identified drowsy minutes with a sensitivity of 0.85 and a positive predictive value of 0.93, using 25 features.
Abstract: It is estimated that 10–30% of road fatalities are related to drowsy driving or driver fatigue. Driver drowsiness detection based on biological and vehicle signals is being studied in preventive car safety. Autonomic Nervous System (ANS) activity, which can be measured non-invasively from the Heart Rate Variability (HRV) signal obtained from the surface ECG, presents alterations during stress, extreme fatigue, and drowsiness episodes. Our hypothesis is that these alterations manifest on HRV. In this work we developed an online detector of driver drowsiness based on HRV analysis. Two databases have been analyzed: one of driving simulation in which subjects were sleep deprived, and the other of a real situation with no sleep deprivation. An external observer annotated each minute of the recordings as drowsy or awake, and these annotations constitute our reference. The proposed detector classified drowsy minutes with a sensitivity of 0.85 and a positive predictive value of 0.93, using 25 features.

56 citations


Journal Article
TL;DR: ECG signals are watermarked with patient biomedical information in order to confirm patient/ECG linkage integrity; it is found that a marginal amount of signal distortion, sufficient to hold the patient information, does not affect the overall quality of the ECG.
Abstract: In wireless telecardiology applications, an ECG signal is often transmitted without any patient details, which are often supplied separately as clear text. This allows the possibility of confusing the link between signal and identity (for example, with wireless signal collision attacks). ECG data transmission can be more robustly tied to either patient identity or other patient meta-data if this meta-data is embedded within the ECG signal itself when sent. In this paper, ECG signals are watermarked with patient biomedical information in order to confirm patient/ECG linkage integrity. Several cases have been tested with different degrees of signal modification due to watermarking. These show its effect on the diagnostic value of the signal (for example, using the PRD as an error measure). It is found that a marginal amount of signal distortion, sufficient to hold the patient information, does not affect the overall quality of the ECG. The proposed system does not increase the size of host signals, nor change their scaling or bandwidth. In addition, its low complexity makes it suitable for power-limited wearable computing and sensor-net applications.
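The PRD (percentage root-mean-square difference) mentioned above is the standard distortion measure for checking that a watermark leaves the ECG diagnostically usable; it is computed as follows (a direct sketch of the standard formula, not the paper's embedding scheme):

```python
import math

def prd(original, watermarked):
    """Percentage root-mean-square difference between an ECG and a
    modified (e.g. watermarked) version of it: 100 * sqrt(sum of squared
    errors / signal energy). Lower is better; 0 means identical signals.
    """
    num = sum((o - w) ** 2 for o, w in zip(original, watermarked))
    den = sum(o ** 2 for o in original)
    return 100.0 * math.sqrt(num / den)
```

A watermarking scheme is then judged acceptable when the embedded payload keeps the PRD below some clinically motivated bound.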

55 citations


Journal Article
Chen, Huang, Hong, Cheng, Lin 

50 citations


Journal Article
TL;DR: A number of heuristic rules that can be used to detect the most common problems in ECG recordings are explored, designed to be simple enough that they can easily be tested in real time on a mobile phone.
Abstract: It is possible, using a smart phone or similar device, to collect ECGs from patients in remote locations, storing the results to be analyzed later. In this situation, however, the person collecting the ECG may not have the time or the necessary training to evaluate the quality of the recording at the time it is collected. It is useful for the device itself to analyze the recorded signals and provide feedback to the user about their quality. This paper explores a number of heuristic rules that can be used to detect the most common problems in ECG recordings. These rules are designed to be simple enough that they can easily be tested in real time on a mobile phone. A combination of several of these rules is able to correctly detect a majority of poor-quality ECGs, as demonstrated using the PhysioNet/CinC 2011 Challenge database.
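Two of the kinds of simple per-lead heuristics described above can be sketched as follows; the specific thresholds (80% identical consecutive samples for a flat line, 5 mV peak-to-peak as an implausible range) are hypothetical values for illustration, not the paper's tuned rules.

```python
def lead_looks_bad(samples, adc_mv=1.0, flat_fraction=0.8, max_mv=5.0):
    """Reject a lead if most consecutive samples are identical (disconnected
    electrode / flat line) or if the peak-to-peak range is implausibly large
    (saturation or severe motion artifact). Thresholds are illustrative.
    """
    flat = sum(1 for a, b in zip(samples, samples[1:]) if a == b)
    if flat >= flat_fraction * (len(samples) - 1):
        return True   # flat-line rule
    if (max(samples) - min(samples)) * adc_mv > max_mv:
        return True   # amplitude-range rule
    return False
```

Rules of this shape cost only one pass over the samples, which is what makes real-time evaluation on a phone feasible.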

48 citations


Journal Article
Chengyu Liu, Peng Li, Lina Zhao, Feifei Liu, Ruxiang Wang
TL;DR: A real-time signal quality assessment method for ECGs collected using mobile phones is proposed; two indices, Sensitivity and Specificity, are defined to evaluate the method's validity, and the results are 90.67% and 89.78%, respectively.
Abstract: Considering that uncertain noise degrades the quality of ECGs, this paper proposes a real-time signal quality assessment method for ECGs collected using mobile phones. The method defines four “flags” to denote different types of problems due to poor ECG quality: flag1 detects whether there is a misplaced electrode; flag2 detects whether there is a huge impulse; flag3 denotes whether there is strong Gaussian noise; and flag4 denotes whether there is a detection error of R-wave peaks by the template matching method. Then, based on the values of the four flags, we calculate the single signal quality index (SSQI) for each ECG and the integrative signal quality index (ISQI) for the twelve-lead ECG. The range of ISQI is between 0 and 12 inclusive; a high value of ISQI indicates good ECG quality. Each ECG record is then assigned to one of two groups according to ISQI: acceptable or unacceptable. We define two indices, Sensitivity and Specificity, to evaluate the validity of the method; the results are 90.67% and 89.78% respectively.
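The flag-combination step of this scheme can be sketched as below, assuming (as the 0–12 range suggests) one SSQI per lead summed over 12 leads. The actual flag detectors (electrode misplacement, impulse, Gaussian noise, R-peak template mismatch) are not reproduced here; this shows only how the indices combine.

```python
def ssqi(misplaced, impulse, gauss_noise, r_peak_error):
    """Single signal quality index for one lead: 1 only if none of the
    four quality flags fired, else 0."""
    return 0 if (misplaced or impulse or gauss_noise or r_peak_error) else 1

def isqi(per_lead_flags):
    """Integrative index over the 12 leads: 0 (all bad) to 12 (all clean).
    per_lead_flags is a list of 12 four-flag tuples."""
    return sum(ssqi(*flags) for flags in per_lead_flags)
```

A record is then accepted or rejected by thresholding the ISQI value.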

47 citations


Journal Article
Inaki Romero
TL;DR: The performance of PCA and ICA in the context of cleaning noisy ECGs in ambulatory conditions was investigated in this article; to evaluate performance, a beat detection algorithm was applied to the signal after PCA/ICA filtering and its detections were compared to those in the signal before filtering.
Abstract: The performance of PCA and ICA in the context of cleaning noisy ECGs in ambulatory conditions was investigated. With this aim, ECGs with artificial motion artifacts were generated by combining clean 8-channel ECGs with 8-channel noise signals at SNR values ranging from 10 down to −10 dB. For each SNR, 600 different simulated ECGs of 10-second length were selected. 8-channel PCA and ICA were applied and then inverted after selecting a subset of components. In order to evaluate the performance of the PCA and ICA algorithms, a beat detection algorithm was applied to the output signal after PCA/ICA filtering and its detections were compared to those in the signal before filtering. Applying both PCA and ICA and retaining the optimal component subset yielded a sensitivity (Se) of 100% for all SNR values studied. In terms of positive predictivity (+P), applying PCA yielded an improvement for all SNR values as compared to no cleaning (+P=95.45% vs. 83.09% for SNR=0dB; +P=56.87% vs. 48.81% for SNR=−10dB). However, ICA filtering gave a higher improvement in +P for all SNR values (+P=100.00% for SNR=0dB; +P=61.38% for SNR=−10dB). An automatic method for selecting the components was proposed. By using this method, both PCA and ICA gave an improvement as compared to no filtering over all SNR values. ICA had a better performance (at SNR=−5dB, an improvement in +P of 8.33% for PCA and 22.92% for ICA).
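The PCA arm of this approach (decompose, keep a component subset, invert) can be sketched as below. Component selection here is simply "keep the largest components", not the automatic criterion the authors propose, and the block is an illustrative reconstruction rather than their code.

```python
import numpy as np

def pca_clean(ecg, n_keep=3):
    """Denoise a multichannel ECG by projecting onto the first n_keep
    principal components and transforming back.

    ecg: array of shape (n_channels, n_samples).
    """
    mean = ecg.mean(axis=1, keepdims=True)
    x = ecg - mean
    # Principal directions from the channel covariance matrix.
    cov = x @ x.T / x.shape[1]
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
    w = eigvecs[:, -n_keep:]                 # top n_keep components
    return w @ (w.T @ x) + mean              # project, invert, restore mean
```

Because cardiac activity concentrates in a few components while broadband artifact spreads across all of them, discarding the low-variance components removes mostly noise.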

Journal Article
TL;DR: This contribution invokes EDTs to assess the usability of ECGs; the classification relies on simple spectral features derived directly from individual ECG channels.
Abstract: For various biomedical applications, automated quality assessment is an essential but also complex task. Ensembles of decision trees (EDTs) have proven to be a suitable choice for such classification tasks. Within this contribution we invoke EDTs to assess the usability of ECGs. Our classification relies on simple spectral features derived directly from individual ECG channels. EDTs are generated by bootstrap aggregating while invoking the concept of random forests. Despite their simplicity, the trained ensemble classifiers turned out to be a very robust choice, yielding an accuracy of 90.4%. The proposed method thus offers a good trade-off between accuracy and computational simplicity. Further improving the accuracy, however, turns out to be hardly feasible given the chosen feature space.

Journal Article
TL;DR: A measure for mobile ECG quality assessment is developed based on a) basic signal quality properties, b) the number of crossing points between different leads, and c) the QRS-amplitude vs. noise-amplitude ratio.
Abstract: State-of-the-art mobile ECG recorders are usually not intended to be used by untrained personnel or by patients themselves. For that purpose, a suitable graphical user interface that provides real-time feedback concerning the signal quality is required. We have developed a measure for mobile ECG quality assessment based on a) basic signal quality properties (amplitude, spikes, constant signal portions), b) the number of crossing points between different leads, and c) the QRS-amplitude vs. noise-amplitude ratio. An advanced algorithm and a simplified Android algorithm were implemented and evaluated by taking part in the Computing in Cardiology Challenge 2011. Our advanced algorithm achieved a score of 0.916 (4th place) in Event 1 of the challenge. The simplified Android algorithm achieved a score of 0.834 (6th place) in Event 2 and a score of 0.873 (1st place) in Event 3.


Journal Article
TL;DR: In response to the PhysioNet 2011 challenge, various time series techniques are explored for their potential in evaluating the quality of an ECG, including time domain analysis, frequency domain analysis, joint time-frequency analysis, self correlation, cross correlation, and entropy analysis.
Abstract: The ECG, measuring body-surface electrical waves generated in the heart, is the gold standard for diagnosis of various cardiovascular diseases. The 2011 PhysioNet challenge envisions mobile phones that can be used to collect and analyse ECG records. Such devices are particularly useful in underdeveloped regions, which have a large population but lack adequate primary care capacity. Signals collected using mobile phones can be sent via the mobile network to experienced doctors for further diagnosis. In response to the challenge, we explore various time series techniques for their potential in evaluating the quality of an ECG, including time domain analysis, frequency domain analysis, joint time-frequency analysis, self correlation, cross correlation, and entropy analysis. Two algorithms are developed based on these techniques. The first algorithm consists of multi-stage tests; a record that passes all tests is regarded as of acceptable quality. In the second algorithm, results from various analyses are assembled into a matrix that measures the regularity of the ECG. The quality of the ECG is then measured by the spectral radius of this Matrix of Regularity. Since the spectral radius is continuous, the results can lead to continuous grades of ECGs. The algorithms are tested using training data from PhysioNet, and the influence of various parameters is examined.

Journal Article
TL;DR: An algorithm, designed for an Android-based platform, is proposed that can assess the quality of an ECG; it discriminates between ECGs of good and bad quality, which could help diagnose patients earlier and reduce associated treatment costs.
Abstract: An algorithm to determine the quality of ECGs can enable inexperienced nurses and paramedics to record ECGs of sufficient diagnostic quality. We propose an algorithm, designed for an Android-based platform, that can assess the quality of an ECG. The algorithm is based on previously established metrics for quantifying ECG quality, but is designed to run efficiently on a mobile platform. Using the training data set, the proposed algorithm obtained a sensitivity of 91% and a specificity of 85%. Testing against the test data sets resulted in scores of 0.88 (events 1 and 2) and 0.79 (event 3). The proposed algorithm discriminates between ECGs of good and bad quality, which could help diagnose patients earlier and reduce associated treatment costs.

Journal Article
TL;DR: With modifications to the algorithm to cater to a very short length of data, this method is able to accurately differentiate the usability of ECG data in training set A and test set B.
Abstract: The PhysioNet Challenge [1] focused on discerning between usable and unusable electrocardiography (ECG) data collected tele-medically from mobile embedded devices. Based on our publications [2,3,4], we have designed a method to determine the quality of ECG data and its usability using an adaptation of the Tompkins et al. [5] real-time QRS detection algorithm. With our modifications to the algorithm to cater to a very short length of data, our method is able to accurately differentiate the usability of ECG data in training set A as well as test set B.

Journal Article
TL;DR: It turns out that elaborately computed features provide only a small information gain; therefore only time-lagged covariance matrix elements, which provide useful information about the temporal structure of the signal, are used for the SVM classifier.
Abstract: In response to the PhysioNet/CinC Challenge 2011: Improving the quality of ECGs collected using mobile phones, we have developed an algorithm based on a decision support system. It combines a couple of simple rules, used to discard recordings of obviously low quality (i.e. high-amplitude noise, detached electrodes), with a more sophisticated support vector machine (SVM) classification that deals with more difficult cases where simple rules are inefficient. It turns out that elaborately computed features provide only a small information gain; therefore we used only time-lagged covariance matrix elements, which provide useful information about the temporal structure of the signal, as input to the SVM classifier. Our final score was 0.836.
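The time-lagged covariance features feeding the SVM can be sketched as below: for each lag τ, compute the covariance between channel i at time t and channel j at time t+τ, then keep the upper-triangle elements. The lag values are illustrative assumptions, not the paper's choices.

```python
import numpy as np

def time_lagged_cov_features(ecg, lags=(1, 2, 5)):
    """Feature vector of time-lagged covariance matrix elements.

    ecg: array of shape (n_channels, n_samples). For each lag tau the
    cross-covariance C_ij(tau) = cov(x_i(t), x_j(t+tau)) is estimated and
    its upper triangle (diagonal included) is concatenated.
    """
    x = ecg - ecg.mean(axis=1, keepdims=True)
    iu = np.triu_indices(x.shape[0])
    feats = []
    for tau in lags:
        c = x[:, :-tau] @ x[:, tau:].T / (x.shape[1] - tau)
        c = (c + c.T) / 2.0   # symmetrize before taking the triangle
        feats.append(c[iu])
    return np.concatenate(feats)
```

These features are cheap to compute and encode how signal structure persists over time, which is exactly what distinguishes rhythmic ECG content from unstructured noise.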

Journal Article
TL;DR: A method for respiratory signal estimation from the pulse photoplethysmographic (PPG) signal based on combination of three parameters present in this signal: pulse rate variability, pulse amplitude variability and pulse width variability is presented.
Abstract: A method for respiratory signal estimation from the pulse photoplethysmographic (PPG) signal is presented. The method is based on a combination of three parameters present in this signal: pulse rate variability, pulse amplitude variability, and pulse width variability. Evaluation is performed over a database containing electrocardiographic (ECG), PPG, and respiratory signals simultaneously recorded in 17 subjects during a tilt table test, obtaining a respiratory rate estimation error of −0.26±7.30% (−2.11±14.49 mHz). These results are comparable to or outperform those obtained from other methods which involve the ECG, so it is possible to obtain reliable respiration estimates from the PPG alone.

Journal Article
TL;DR: An automatic coronary tree labeling algorithm is developed for labeling the extracted branches with their anatomical names for CCTA datasets by means of a statistical coronary tree model.
Abstract: In this paper, an automatic coronary tree labeling algorithm is developed to label the extracted branches with their anatomical names for CCTA datasets. A two-step matching algorithm is implemented by means of a statistical coronary tree model. The main branches are first identified in a registration step. Then all the segments, including the proximal, middle and distal parts of the main branches and all side-branches in the coronary tree, are labeled. Additional clinical criteria are used to generate the final result. Fifty-eight CCTA datasets with right-dominant coronary trees were used for the evaluation. Compared with manually corrected results by an expert, 37 labels (4.76%) in the automatic results needed to be changed or removed. For the remaining 741 labels obtained by the automatic method, the average overlap between the expert reference and the automatic results was 91.41%.

Journal Article
TL;DR: FuzzyMEn uses the membership degree of a fuzzy function instead of the 0–1 judgment of the Heaviside function used in ApEn and SampEn, improving the stability of traditional entropy measures by introducing the concept of fuzzy set theory.
Abstract: Traditional entropy measures, such as Approximate Entropy (ApEn) and Sample Entropy (SampEn), are widely used for analyzing heart rate variability (HRV) signals in clinical cardiovascular disease studies. Nevertheless, traditional entropy measures have poor statistical stability due to the 0–1 judgment of the Heaviside function. The objective of this study is to introduce a new entropy measure, Fuzzy Measure Entropy (FuzzyMEn), in order to improve the stability of traditional entropy measures by introducing the concept of fuzzy set theory. Drawing on Chen et al.'s research on fuzzy entropy (FuzzyEn), FuzzyMEn uses the membership degree of a fuzzy function instead of the 0–1 judgment of the Heaviside function used in ApEn and SampEn. At the same time, FuzzyMEn utilizes both fuzzy local and fuzzy global measure entropy to reflect the whole complexity implied in physiological signals, overcoming the limitation of FuzzyEn, which focuses only on local complexity. A detailed comparative analysis and discussion of ApEn, SampEn, FuzzyEn and FuzzyMEn is also given in this study.
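The core substitution, replacing the Heaviside 0–1 template match of SampEn with a graded exponential membership, can be sketched as below. This shows only the fuzzy "local" construction (templates with their own baselines removed); FuzzyMEn additionally adds a global measure entropy term, and the parameter values here are common defaults, not the paper's.

```python
import math

def fuzzy_entropy(x, m=2, r=0.2, n=2):
    """Fuzzy entropy of a 1-D series: the SampEn construction with the
    0-1 Heaviside match replaced by the membership exp(-(d/r)**n)."""
    def phi(m):
        # m-length templates with their own mean removed (local baseline).
        templ = []
        for i in range(len(x) - m + 1):
            seg = x[i:i + m]
            mu = sum(seg) / m
            templ.append([v - mu for v in seg])
        total, count = 0.0, 0
        for i in range(len(templ)):
            for j in range(i + 1, len(templ)):
                d = max(abs(a - b) for a, b in zip(templ[i], templ[j]))
                total += math.exp(-((d / r) ** n))   # fuzzy membership, not 0/1
                count += 1
        return total / count
    return -math.log(phi(m + 1) / phi(m))
```

Because the membership degrades smoothly as templates move apart, a small change in r or in the data shifts the estimate gradually instead of flipping discrete match counts, which is the stability gain the paper is after.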


Journal Article
TL;DR: This work describes a data-driven approach, using a combination of machine learning algorithms, to solve the 2011 PhysioNet/Computing in Cardiology (CinC) challenge of identifying data collection problems in 12-lead electrocardiography (ECG).
Abstract: We describe a data-driven approach, using a combination of machine learning algorithms, to solve the 2011 PhysioNet/Computing in Cardiology (CinC) challenge of identifying data collection problems in 12-lead electrocardiography (ECG). Our data-driven approach reaches an internal (cross-validation) accuracy of almost 93% on the training set, and an accuracy of 91.2% on the test set.

Journal Article
TL;DR: An algorithm using five simple rules, detecting the most common distortions of the ECG signal in the out-of-hospital environment, is developed; using five if-then rules allows easy implementation and reasonably swift code on a mobile device.
Abstract: The work presented in this paper was undertaken in response to the PhysioNet/CinC Challenge 2011: Improving the quality of ECGs collected using mobile phones. For the purpose of this challenge we developed an algorithm that uses five simple rules, detecting the most common distortions of the ECG signal in the out-of-hospital environment. Using five if-then rules allows easy implementation and reasonably swift code on a mobile device. Our results on test set B were well outside the top ten algorithms (best score: 0.932; our score: 0.828). Nevertheless, our algorithm placed second among those providing open-source code for evaluation on data set C, where neither data nor scores were released to the participants before the end of the challenge. The difference in the scores of the top two algorithms was minimal (best score: 0.873; our score: 0.872). The relative success of a simple algorithm on the undisclosed set C thus raises a question about the over-fitting of more sophisticated algorithms, a question that hangs over many recently published results of automated methods for medical applications.

Journal Article
TL;DR: Of all the quality estimators studied, one novel parameter, kurtosis, gave the best performance over all the tests and correlated highly with the signal SNR, although it did not have a high dynamic range.
Abstract: During the process of measurement, the ECG signal suffers from several noises, artifacts and interferences, which reduce its quality. Automatic signal quality estimation could permit identifying when the level of noise is high, to avoid wrong ECG interpretation. With this aim, several methods have been proposed in the literature. This work assesses eleven different methods for estimation of ECG signal quality available in the literature. In addition, three new methods are proposed. These methods were evaluated on a simulated database containing ECGs with different types and levels of noise, with SNR values ranging from −20 to 20 dB. Of all the quality estimators studied, one novel parameter, kurtosis, gave the best performance over all the tests. It showed high correlation with the signal SNR (0.95±0.00), high correlation with the output of a beat detector (positive predictivity=0.97±0.00), and high resolution in time (10 seconds of signal length). However, kurtosis did not have a high dynamic range. Some methods require some knowledge about the ECG signal (such as the position of the R peak) and are therefore not suitable for applications with high levels of noise.
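Kurtosis works as a quality estimator because a clean ECG is strongly super-Gaussian (the spiky QRS complexes push kurtosis well above the Gaussian value of 3), so added Gaussian noise pulls the statistic down toward 3. A minimal sketch of the (non-excess) sample kurtosis:

```python
def kurtosis(x):
    """Sample kurtosis m4/m2^2 (non-excess): 3 for Gaussian data,
    much larger for spiky, QRS-like signals."""
    n = len(x)
    mu = sum(x) / n
    m2 = sum((v - mu) ** 2 for v in x) / n   # second central moment
    m4 = sum((v - mu) ** 4 for v in x) / n   # fourth central moment
    return m4 / (m2 ** 2)
```

Thresholding this value over 10 s windows gives a noise indicator that needs no R-peak detection, which is presumably why it remained usable at the low SNRs where fiducial-point-based methods fail.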

Journal Article
TL;DR: The AR spectra of the band-pass filtered diastolic heart sound showed that the frequency distribution shifted towards lower frequencies in the case of CAD; these changes might be due to variations in ventricular filling patterns.
Abstract: The aim of the current study was to investigate the low-frequency power distribution of diastolic heart sounds in patients with coronary artery disease (CAD). Heart sound recordings were made from the 4th intercostal space in 132 patients referred for elective coronary angiography. CAD patients were defined as subjects with at least one stenosis with a diameter reduction of at least 50% as identified by quantitative coronary angiography. The diastolic heart sounds were analyzed using the short-time Fourier transform (STFT) and autoregressive (AR) models. The STFT analyses showed that the energy below 100 Hz was increased approximately 150 ms after the second heart sound in CAD patients. The AR spectra of the band-pass filtered (20–100 Hz) diastolic heart sound showed that the frequency distribution shifted towards lower frequencies in the case of CAD. These changes might be due to variations in ventricular filling patterns.

Journal Article
TL;DR: A wavelet transform technique is used to determine the beginning and end points of each cardiac cycle from the ECG; then, the first and second heart sounds within the cycles are identified on the PCG signal by attending to the spectral properties of the sounds.
Abstract: In this paper, we present a novel algorithm for pediatric heart sound segmentation, incorporated into a graphical user interface. The algorithm employs both the electrocardiogram (ECG) and phonocardiogram (PCG) signals for efficient segmentation under pathological circumstances. First, the ECG signal is used to determine the beginning and end points of each cardiac cycle via a wavelet transform technique. Then, the first and second heart sounds within the cycles are identified on the PCG signal by attending to the spectral properties of the sounds. The algorithm is applied to 120 recordings of normal and pathological children, containing 1976 cardiac cycles in total. The accuracy of the segmentation algorithm is 97% for S1 and 94% for S2 identification, while all the cardiac cycles are correctly determined.

Journal Article
TL;DR: The aim of the research is to propose a prototype of a wearable wireless monitoring device optimized for supervising the patient and examining the influence of movement on the heart rate during normal daily activities.
Abstract: Health monitoring and body area network (BAN) applications require wireless intelligent monitoring devices and information systems. The aim of our research is to propose a prototype of a wearable wireless monitoring device optimized for supervising the patient and examining the influence of movement on the heart rate during normal daily activities. The main purpose of the proposed system is the simultaneous acquisition and automatic analysis of two bipolar ECG and three-axis acceleration (ACC) signals measured by means of a wireless, battery-operated prototype of the Revitus ECG module. The processing of the ECG and ACC data is performed by custom-developed software installed on a PC. All recorded information is uploaded to a purposely designed medical web server for storage and display as a web page for authorized doctors or the patient's family. The system was tested on 10 healthy volunteers, each of whom was monitored during common daily activities.

Journal Article
TL;DR: An automatic ECG quality inspection method is devised based on the conversion of an ECG into a VCG and back again into a reconstructed ECG, giving a correct interpretation of the quality of the ECGs of 92.2%, which corresponded to a sensitivity of 97.0% and a specificity of 75.1%.
Abstract: We estimate that as much as 5% of all recorded ECGs worldwide may, to some degree, suffer from poor signal quality or incorrect electrode positioning, which often interferes with correct interpretation of the ECG. Proper training of ECG technicians and regular inspection of signal quality are necessary to achieve a high standard. Due to the large number of ECGs recorded daily, we devised an automatic ECG quality inspection method based on the conversion of an ECG into a VCG and back again into a reconstructed ECG. Incorrectly placed electrodes as well as different types of noise can be detected with a high level of accuracy. We used this method to assess the quality of the ECGs in the learning set of the PhysioNet/Computing in Cardiology Challenge 2011, giving a correct interpretation of the quality of the ECGs of 92.2%, which corresponded to a sensitivity of 97.0% and a specificity of 75.1%.
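The intuition behind the ECG→VCG→ECG round trip is that the 8 independent leads of a 12-lead ECG are close to linear combinations of 3 vectorcardiographic leads, so a clean recording is well approximated by a rank-3 model while broadband noise is not. The sketch below illustrates that intuition with a generic SVD truncation instead of the fixed Dower transform pair the paper's method implies; note that a fixed transform can additionally catch swapped electrodes, which a data-driven SVD cannot.

```python
import numpy as np

def rank3_reconstruction_error(ecg8):
    """Relative RMS residual of the best rank-3 fit to an 8-lead ECG
    (shape (8, n_samples)). Near 0 for clean, VCG-explainable signals;
    large when noise breaks the rank-3 structure.
    """
    x = ecg8 - ecg8.mean(axis=1, keepdims=True)
    u, s, vt = np.linalg.svd(x, full_matrices=False)
    recon = (u[:, :3] * s[:3]) @ vt[:3]      # keep the 3 strongest components
    return np.linalg.norm(x - recon) / np.linalg.norm(x)
```

Thresholding this residual gives a simple acceptability flag in the spirit of the reconstruction-based inspection described above.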