Journal ArticleDOI

Photoplethysmograph Signal Reconstruction based on a Novel Motion Artifact Detection-Reduction Approach. Part II: Motion and Noise Artifact Removal

TL;DR: It is shown that the proposed IMAR approach can reliably reconstruct MNA corrupted data segments, as the estimated HR and SpO2 values do not significantly deviate from the uncorrupted reference measurements.
Abstract: We introduce a new method to reconstruct motion and noise artifact (MNA) contaminated photoplethysmogram (PPG) data. A method to detect MNA corrupted data is provided in a companion paper. Our reconstruction algorithm is based on an iterative motion artifact removal (IMAR) approach, which utilizes the singular spectral analysis algorithm to remove MNA so that the most accurate estimates of uncorrupted heart rates (HRs) and arterial oxygen saturation (SpO2) values recorded by a pulse oximeter can be derived. Using both computer simulations and three different experimental data sets, we show that the proposed IMAR approach can reliably reconstruct MNA corrupted data segments, as the estimated HR and SpO2 values do not significantly deviate from the uncorrupted reference measurements. The accuracy of reconstruction of the MNA corrupted data segments is compared between our IMAR approach and time-domain independent component analysis (TD-ICA) for all data sets, as the latter method has been shown to provide good performance. For simulated data, there were no significant differences in the reconstructed HR and SpO2 values from 10 dB down to −15 dB for both white and colored noise contaminated PPG data using IMAR; for TD-ICA, significant differences were observed starting at 10 dB. Two experimental PPG data sets with contrived MNA were created by having subjects perform random forehead movements and rapid side-to-side finger movements; on these data sets the IMAR approach was quite accurate, as the reconstructed HR and SpO2 did not differ significantly from non-contaminated reference values in most subjects. In comparison, the accuracy of TD-ICA was poor, as there were significant differences in reconstructed HR and SpO2 values in most subjects. For non-contrived MNA corrupted PPG data, which were collected while subjects performed walking and stair-climbing tasks, IMAR significantly outperformed TD-ICA, as the former method provided HR and SpO2 values that were not significantly different from MNA-free reference values.
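The core operation in IMAR is singular spectrum analysis (SSA): a PPG segment is embedded into a trajectory (Hankel) matrix, decomposed by singular value decomposition, and reconstructed from a subset of components before iterating. The Python sketch below illustrates one such decomposition/reconstruction pass; the function name ssa_reconstruct, the window length, and the simple "keep the leading components" rule are illustrative assumptions, not the authors' iterative component-selection procedure.

```python
import numpy as np

def ssa_reconstruct(x, window_len=125, n_components=4):
    """One SSA pass: embed -> SVD -> truncate -> diagonally average.

    Simplified illustration of SSA-based signal reconstruction; the actual
    IMAR algorithm iterates this with a data-driven component selection.
    """
    n = len(x)
    k = n - window_len + 1
    # Trajectory (Hankel) matrix: each column is a lagged window of x.
    traj = np.column_stack([x[i:i + window_len] for i in range(k)])
    # Singular value decomposition of the trajectory matrix.
    u, s, vt = np.linalg.svd(traj, full_matrices=False)
    # Keep only the leading components (assumed to carry the pulsatile signal).
    rank = min(n_components, len(s))
    traj_hat = (u[:, :rank] * s[:rank]) @ vt[:rank, :]
    # Diagonal averaging (Hankelization) maps the matrix back to a 1-D signal.
    recon = np.zeros(n)
    counts = np.zeros(n)
    for col in range(k):
        recon[col:col + window_len] += traj_hat[:, col]
        counts[col:col + window_len] += 1
    return recon / counts

# Example: denoise a synthetic 1 Hz "pulse" buried in noise, sampled at 125 Hz.
t = np.arange(0, 8, 1 / 125)
ppg = np.sin(2 * np.pi * 1.0 * t) + 0.5 * np.random.randn(t.size)
clean = ssa_reconstruct(ppg, window_len=125, n_components=2)
```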
Citations
Journal ArticleDOI
23 Dec 2015-Sensors
TL;DR: The results show that the SpaMA method has a potential for PPG-based HR monitoring in wearable devices for fitness tracking and health monitoring during intense physical activities and dynamics of heart rate variability can be accurately captured.
Abstract: Accurate estimation of heart rates from photoplethysmogram (PPG) signals during intense physical activity is a very challenging problem. This is because strenuous and high intensity exercise can result in severe motion artifacts in PPG signals, making accurate heart rate (HR) estimation difficult. In this study we investigated a novel technique to accurately reconstruct motion-corrupted PPG signals and HR based on time-varying spectral analysis. The algorithm is called the Spectral filter algorithm for Motion Artifacts and heart rate reconstruction (SpaMA). The idea is to calculate the power spectral density of both PPG and accelerometer signals for each time shift of a windowed data segment. By comparing time-varying spectra of PPG and accelerometer data, frequency peaks resulting from motion artifacts can be distinguished from the PPG spectrum. The SpaMA approach was applied to three different datasets and four types of activities: (1) training datasets from the 2015 IEEE Signal Process. Cup Database recorded from 12 subjects while performing treadmill exercise from 1 km/h to 15 km/h; (2) test datasets from the 2015 IEEE Signal Process. Cup Database recorded from 11 subjects while performing forearm and upper arm exercise; and (3) a Chon Lab dataset including 10 min recordings from 10 subjects during treadmill exercise. The ECG signals from all three datasets provided the reference HRs, which were used to determine the accuracy of our SpaMA algorithm. The performance of the SpaMA approach was calculated by computing the mean absolute error between the estimated HR from the PPG and the reference HR from the ECG. The average estimation errors using our method on the first, second and third datasets are 0.89, 1.93 and 1.38 beats/min respectively, while the overall error on all 33 subjects is 1.86 beats/min and the performance on only the treadmill experiment datasets (22 subjects) is 1.11 beats/min. Moreover, it was found that dynamics of heart rate variability can be accurately captured using the algorithm, where the mean Pearson’s correlation coefficient between the power spectral densities of the reference and the reconstructed heart rate time series was found to be 0.98. These results show that the SpaMA method has potential for PPG-based HR monitoring in wearable devices for fitness tracking and health monitoring during intense physical activities.
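The spectral-comparison step can be pictured with a short sketch: for one windowed segment, estimate the power spectral density of the PPG and of an accelerometer channel, drop PPG peaks whose frequencies coincide with dominant motion peaks, and read the HR from the strongest remaining peak. This is a minimal, hedged illustration in Python; the function name, tolerance, and peak-selection heuristics are assumptions, and the published SpaMA algorithm additionally tracks spectra over time.

```python
import numpy as np
from scipy.signal import welch, find_peaks

def hr_from_window(ppg_win, acc_win, fs, tol_hz=0.1):
    """Estimate HR (beats/min) from one window by removing PPG spectral
    peaks that coincide with accelerometer (motion) peaks."""
    f_ppg, p_ppg = welch(ppg_win, fs=fs, nperseg=len(ppg_win))
    f_acc, p_acc = welch(acc_win, fs=fs, nperseg=len(acc_win))

    # Dominant motion frequencies from the accelerometer spectrum.
    acc_peaks, _ = find_peaks(p_acc, height=0.2 * p_acc.max())
    motion_freqs = f_acc[acc_peaks]

    # Candidate HR peaks in the physiological band (0.5-3 Hz, i.e. 30-180 bpm).
    band = (f_ppg >= 0.5) & (f_ppg <= 3.0)
    ppg_peaks, _ = find_peaks(p_ppg * band)

    # Discard PPG peaks that lie close to a motion frequency.
    clean = [i for i in ppg_peaks
             if not np.any(np.abs(f_ppg[i] - motion_freqs) < tol_hz)]
    if not clean:
        return None  # no usable cardiac peak in this window
    best = max(clean, key=lambda i: p_ppg[i])
    return 60.0 * f_ppg[best]
```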

147 citations


Cites background or methods from "Photoplethysmograph Signal Reconstr..."

  • ...HR monitoring using PPG signals has many advantages compared to using traditional ECG sensors, such as simpler hardware implementation, lower cost, and no need for daily application of electrodes [4]....

    [...]

  • ...[4] proposed a motion artifact removal algorithm using SSA....

    [...]

  • ...Some of the popular BSS techniques are independent component analysis (ICA) [17], canonical correlation analysis (CCA) [18], principle component analysis (PCA) [19], and singular spectrum analysis (SSA) [4,20]....

    [...]

  • ...Salehizadeh et al. [4] proposed a motion artifact removal algorithm using SSA....

    [...]

  • ...TROIKA has two extra stages of signal decomposition and reconstruction using singular spectrum analysis (SSA) and it then applies temporal difference operations on the SSA-reconstructed PPG. SSA components are compared to the accelerometer signals and those components with close frequencies to the accelerometer signals are discarded and the rest are used to reconstruct the signal....

    [...]

Journal ArticleDOI
13 Oct 2017-Sensors
TL;DR: The performance of two wearable devices, based on electrocardiography (ECG) and photoplethysmography (PPG), is compared with hospital ECG using an existing seizure detection algorithm; seizure detection performance using the wrist-worn PPG device was considerably lower.
Abstract: Electrocardiography has added value for automatically detecting seizures in temporal lobe epilepsy (TLE) patients. The wired hospital system is not suited for a long-term seizure detection system at home. To address this need, the performance of two wearable devices, based on electrocardiography (ECG) and photoplethysmography (PPG), is compared with hospital ECG using an existing seizure detection algorithm. This algorithm classifies seizures on the basis of heart rate features extracted from the heart rate increase. The algorithm was applied to recordings of 11 patients in a hospital setting, comprising 701 h and capturing 47 (fronto-)temporal lobe seizures. The sensitivities of the hospital system, the wearable ECG device and the wearable PPG device were respectively 57%, 70% and 32%, with corresponding false alarm rates of 1.92, 2.11 and 1.80 per hour. Whereas seizure detection performance using the wrist-worn PPG device was considerably lower, the performance using the wearable ECG was shown to be similar to that of the hospital ECG.
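The sensitivity and false-alarm figures above follow from an event-based scoring of detections against annotated seizure onsets. The sketch below shows one plausible way to compute such metrics; the matching tolerance and scoring rules are assumptions, not the study's exact evaluation protocol.

```python
def detection_metrics(detections, seizures, record_hours, tol_s=60.0):
    """Event-based sensitivity and false-alarm rate for seizure detection.

    `detections` and `seizures` are lists of onset times in seconds;
    a detection within `tol_s` of a seizure onset counts as a hit.
    """
    hits = sum(any(abs(d - s) <= tol_s for d in detections) for s in seizures)
    false_alarms = sum(not any(abs(d - s) <= tol_s for s in seizures)
                       for d in detections)
    sensitivity = hits / len(seizures) if seizures else float("nan")
    fa_per_hour = false_alarms / record_hours
    return sensitivity, fa_per_hour

# e.g. scoring a detector over 701 h of recordings containing 47 seizures,
# as in the cited study (numbers here only echo the reported dataset size).
```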

122 citations


Cites methods from "Photoplethysmograph Signal Reconstr..."

  • ...Noise removal algorithms should be tested on the PPG data to reconstruct the HRV information [26]....

    [...]

Journal ArticleDOI
01 Apr 2017
TL;DR: Various services, applications, and systems that have been developed based on WMSs are discussed, and a list of desirable design goals that WMS-based systems should satisfy is suggested.
Abstract: Wearable medical sensors (WMSs) are garnering ever-increasing attention from both the scientific community and the industry. Driven by technological advances in sensing, wireless communication, and machine learning, WMS-based systems have begun transforming our daily lives. Although WMSs were initially developed to enable low-cost solutions for continuous health monitoring, the applications of WMS-based systems now range far beyond health care. Several research efforts have proposed the use of such systems in diverse application domains, e.g., education, human-computer interaction, and security. Even though the number of such research studies has grown drastically in the last few years, the potential challenges associated with their design, development, and implementation are neither well-studied nor well-recognized. This article discusses various services, applications, and systems that have been developed based on WMSs and sheds light on their design goals and challenges. We first provide a brief history of WMSs and discuss how their market is growing. We then discuss the scope of applications of WMS-based systems. Next, we describe the architecture of a typical WMS-based system and the components that constitute such a system, and their limitations. Thereafter, we suggest a list of desirable design goals that WMS-based systems should satisfy. Finally, we discuss various research directions related to WMSs and how previous research studies have attempted to address the limitations of the components used in WMS-based systems and satisfy the desirable design goals.

112 citations

Journal ArticleDOI
TL;DR: The PWF analysis seems to be a suitable method for PPG signal quality determination, real-time annotation, data compression, and calculation of additional pulse wave metrics such as amplitude, duration, and rise time.
Abstract: Photoplethysmography has been used in a wide range of medical devices for measuring oxygen saturation, cardiac output, assessing autonomic function, and detecting peripheral vascular disease. Artifacts can render the photoplethysmogram (PPG) useless. Thus, algorithms capable of identifying artifacts are critically important. However, the published PPG algorithms are limited in algorithm and study design. Therefore, the authors developed a novel embedded algorithm for real-time pulse waveform (PWF) segmentation and artifact detection based on a contour analysis in the time domain. This paper provides an overview of PWF and artifact classifications, presents the developed PWF analysis, and demonstrates the implementation on a 32-bit ARM core microcontroller. The PWF analysis was validated with data records from 63 subjects acquired in a sleep laboratory, ergometry laboratory, and intensive care unit in equal parts. The output of the algorithm was compared with harmonized experts’ annotations of the PPG with a total duration of 31.5 h. The algorithm achieved a beat-to-beat comparison sensitivity of 99.6%, specificity of 90.5%, precision of 98.5%, and accuracy of 98.3%. The interrater agreement expressed as Cohen's kappa coefficient was 0.927 and as F-measure was 0.990. In conclusion, the PWF analysis seems to be a suitable method for PPG signal quality determination, real-time annotation, data compression, and calculation of additional pulse wave metrics such as amplitude, duration, and rise time.
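The beat-to-beat agreement figures reported above (sensitivity, specificity, precision, accuracy, Cohen's kappa, F-measure) all derive from the same confusion-matrix counts. A minimal sketch of those formulas, assuming counts of true/false positives and negatives from the beat-to-beat comparison (illustrative only, not the authors' evaluation code):

```python
def agreement_metrics(tp, fp, fn, tn):
    """Beat-to-beat agreement metrics computed from confusion-matrix counts."""
    total = tp + fp + fn + tn
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    precision = tp / (tp + fp)
    accuracy = (tp + tn) / total
    f_measure = 2 * precision * sensitivity / (precision + sensitivity)
    # Cohen's kappa: observed agreement corrected for chance agreement
    # between the algorithm and the expert annotations.
    p_chance = ((tp + fp) / total) * ((tp + fn) / total) \
             + ((tn + fn) / total) * ((tn + fp) / total)
    kappa = (accuracy - p_chance) / (1 - p_chance)
    return dict(sensitivity=sensitivity, specificity=specificity,
                precision=precision, accuracy=accuracy,
                f_measure=f_measure, kappa=kappa)
```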

70 citations

Journal ArticleDOI
TL;DR: An approach based on the time–frequency spectrum of PPG is presented to first detect MNA-corrupted data and then discard the nonusable part of the corrupted data; it consistently provided higher detection rates than the other three methods, with accuracies greater than 95% for all data.
Abstract: Motion and noise artifacts (MNAs) impose limits on the usability of the photoplethysmogram (PPG), particularly in the context of ambulatory monitoring. MNAs can distort PPG, causing erroneous estimation of physiological parameters such as heart rate (HR) and arterial oxygen saturation (SpO2). In this study, we present a novel approach, “TifMA,” based on using the time–frequency spectrum of PPG to first detect the MNA-corrupted data and next discard the nonusable part of the corrupted data. The term “nonusable” refers to segments of PPG data from which the HR signal cannot be recovered accurately. Two sequential classification procedures were included in the TifMA algorithm. The first classifier distinguishes between MNA-corrupted and MNA-free PPG data. Once a segment of data is deemed MNA-corrupted, the next classifier determines whether the HR can be recovered from the corrupted segment or not. A support vector machine (SVM) classifier was used to build a decision boundary for the first classification task using data segments from a training dataset. Features from time–frequency spectra of PPG were extracted to build the detection model. Five datasets were considered for evaluating TifMA performance: (1) and (2) were laboratory-controlled PPG recordings from forehead and finger pulse oximeter sensors with subjects making random movements, (3) and (4) were actual patient PPG recordings from UMass Memorial Medical Center with random free movements and (5) was a laboratory-controlled PPG recording dataset measured at the forehead while the subjects ran on a treadmill. The first dataset was used to analyze the noise sensitivity of the algorithm. Datasets 2-4 were used to evaluate the MNA detection phase of the algorithm. The results from the first phase of the algorithm (MNA detection) were compared to results from three existing MNA detection algorithms: the Hjorth, kurtosis-Shannon entropy, and time-domain variability-SVM approaches. This last is an approach recently developed in our laboratory. The proposed TifMA algorithm consistently provided higher detection rates than the other three methods, with accuracies greater than 95% for all data. Moreover, our algorithm was able to pinpoint the start and end times of the MNA with an error of less than 1 s in duration, whereas the next-best algorithm had a detection error of more than 2.2 s. The final, most challenging, dataset was collected to verify the performance of the algorithm in discriminating between corrupted data that were usable for accurate HR estimations and data that were nonusable. It was found that on average 48% of the data segments were found to have MNA, and of these, 38% could be used to provide reliable HR estimation.
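The first TifMA classification stage rests on features extracted from the PPG's time–frequency spectrum and fed to an SVM. Below is a hedged Python sketch of such a pipeline; the specific features (spectral entropy, dominant-frequency variability, cardiac-band power ratio), the window settings, and the use of scikit-learn are illustrative assumptions rather than the paper's exact feature set.

```python
import numpy as np
from scipy.signal import spectrogram

def tf_features(ppg_segment, fs):
    """Simple time-frequency features of a PPG segment (assumed feature set)."""
    f, t, sxx = spectrogram(ppg_segment, fs=fs, nperseg=int(2 * fs))
    # Normalize each time slice of the spectrogram to a probability distribution.
    p = sxx / (sxx.sum(axis=0, keepdims=True) + 1e-12)
    spectral_entropy = float(np.mean(-np.sum(p * np.log(p + 1e-12), axis=0)))
    # Variability of the dominant frequency over time (high under motion).
    dom_freq = f[np.argmax(sxx, axis=0)]
    dom_freq_var = float(np.var(dom_freq))
    # Fraction of power in the cardiac band (0.5-3 Hz).
    cardiac = (f >= 0.5) & (f <= 3.0)
    band_ratio = float(sxx[cardiac].sum() / (sxx.sum() + 1e-12))
    return [spectral_entropy, dom_freq_var, band_ratio]

# A binary MNA detector can then be trained on labeled segments, e.g. with
# scikit-learn: SVC(kernel="rbf").fit(X_train, y_train), where each row of
# X_train is tf_features(segment, fs) and y_train marks clean (0) vs. MNA (1).
```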

67 citations

References
Book
01 May 1986
TL;DR: In this book, the author presents principal component analysis (PCA), including the graphical representation of data using principal components, PCA for time series and other non-independent data, and generalizations and adaptations of principal component analysis.
Abstract: Introduction * Properties of Population Principal Components * Properties of Sample Principal Components * Interpreting Principal Components: Examples * Graphical Representation of Data Using Principal Components * Choosing a Subset of Principal Components or Variables * Principal Component Analysis and Factor Analysis * Principal Components in Regression Analysis * Principal Components Used with Other Multivariate Techniques * Outlier Detection, Influential Observations and Robust Estimation * Rotation and Interpretation of Principal Components * Principal Component Analysis for Time Series and Other Non-Independent Data * Principal Component Analysis for Special Types of Data * Generalizations and Adaptations of Principal Component Analysis

17,446 citations

Book ChapterDOI
01 Jan 2001
TL;DR: In this paper, the classical filtering and prediction problem is re-examined using the Bode-Shannon representation of random processes and the "state-transition" method of analysis of dynamic systems.
Abstract: The classical filtering and prediction problem is re-examined using the Bode-Shannon representation of random processes and the "state-transition" method of analysis of dynamic systems. New results are: (1) The formulation and methods of solution of the problem apply, without modification, to stationary and nonstationary statistics and to growing-memory and infinite-memory filters. (2) A nonlinear difference (or differential) equation is derived for the covariance matrix of the optimal estimation error. From the solution of this equation the coefficients of the difference (or differential) equation of the optimal linear filter are obtained without further calculations. (3) The filtering problem is shown to be the dual of the noise-free regulator problem. The new method developed here is applied to two well-known problems, confirming and extending earlier results. The discussion is largely self-contained and proceeds from first principles; basic concepts of the theory of random processes are reviewed in the Appendix.
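The covariance recursion described in point (2) is what drives the familiar predict/update cycle of the discrete-time filter. A minimal sketch, assuming a linear state-space model with known matrices A, H, Q, R (names chosen here for illustration):

```python
import numpy as np

def kalman_step(x_est, P, z, A, H, Q, R):
    """One predict/update cycle of the discrete Kalman filter.

    x_est, P : prior state estimate and error covariance
    z        : new measurement
    A, H     : state-transition and observation matrices
    Q, R     : process and measurement noise covariances
    """
    # Predict: propagate the estimate and its error covariance.
    x_pred = A @ x_est
    P_pred = A @ P @ A.T + Q
    # Update: the covariance recursion yields the optimal gain.
    K = P_pred @ H.T @ np.linalg.inv(H @ P_pred @ H.T + R)
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(P.shape[0]) - K @ H) @ P_pred
    return x_new, P_new
```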

15,391 citations

Journal ArticleDOI
TL;DR: An efficient algorithm is proposed, which allows the computation of the ICA of a data matrix in polynomial time and may be seen as an extension of principal component analysis (PCA).
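For context, independent component analysis of this kind underlies the TD-ICA baseline compared against in the main paper. A small, hedged demonstration of ICA-based source separation using scikit-learn's FastICA (a stand-in implementation, not necessarily the algorithm of the cited work):

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 8, 2000)
s1 = np.sin(2 * np.pi * 1.2 * t)             # "cardiac-like" component
s2 = np.sign(np.sin(2 * np.pi * 0.3 * t))    # "motion-like" component
S = np.c_[s1, s2] + 0.05 * rng.standard_normal((t.size, 2))

# Observed signals are linear mixtures of the independent sources.
A = np.array([[1.0, 0.5], [0.4, 1.0]])
X = S @ A.T

ica = FastICA(n_components=2, random_state=0)
S_hat = ica.fit_transform(X)  # recovered independent components (up to scale/order)
```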

8,522 citations

Book
01 Jun 1992
TL;DR: In this book, time-discrete approximations of stochastic differential equations are developed from stochastic Taylor expansions, covering strong and weak Taylor approximations.
Abstract: 1 Probability and Statistics- 2 Probability and Stochastic Processes- 3 Ito Stochastic Calculus- 4 Stochastic Differential Equations- 5 Stochastic Taylor Expansions- 6 Modelling with Stochastic Differential Equations- 7 Applications of Stochastic Differential Equations- 8 Time Discrete Approximation of Deterministic Differential Equations- 9 Introduction to Stochastic Time Discrete Approximation- 10 Strong Taylor Approximations- 11 Explicit Strong Approximations- 12 Implicit Strong Approximations- 13 Selected Applications of Strong Approximations- 14 Weak Taylor Approximations- 15 Explicit and Implicit Weak Approximations- 16 Variance Reduction Methods- 17 Selected Applications of Weak Approximations- Solutions of Exercises- Bibliographical Notes

6,284 citations