
Showing papers in "Physiological Measurement in 2016"


Journal ArticleDOI
TL;DR: This paper describes a public heart sound database, assembled for an international competition, the PhysioNet/Computing in Cardiology (CinC) Challenge 2016, which comprises nine different heart sound databases sourced from multiple research groups around the world.
Abstract: In the past few decades, analysis of heart sound signals (i.e. the phonocardiogram or PCG), especially for automated heart sound segmentation and classification, has been widely studied and has been reported to have the potential value to detect pathology accurately in clinical applications. However, comparative analyses of algorithms in the literature have been hindered by the lack of high-quality, rigorously validated, and standardized open databases of heart sound recordings. This paper describes a public heart sound database, assembled for an international competition, the PhysioNet/Computing in Cardiology (CinC) Challenge 2016. The archive comprises nine different heart sound databases sourced from multiple research groups around the world. It includes 2435 heart sound recordings in total collected from 1297 healthy subjects and patients with a variety of conditions, including heart valve disease and coronary artery disease. The recordings were collected from a variety of clinical or nonclinical (such as in-home visits) environments and equipment. The length of recording varied from several seconds to several minutes. This article reports detailed information about the subjects/patients including demographics (number, age, gender), recordings (number, location, state and time length), associated synchronously recorded signals, sampling frequency and sensor type used. We also provide a brief summary of the commonly used heart sound segmentation and classification methods, including open source code provided concurrently for the Challenge. A description of the PhysioNet/CinC Challenge 2016, including the main aims, the training and test sets, the hand corrected annotations for different heart sound states, the scoring mechanism, and associated open source code are provided. In addition, several potential benefits from the public heart sound database are discussed.
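A common first step in the segmentation methods summarised above is to band-limit the PCG and compute a smoothed energy envelope before any state estimation. The sketch below shows only that generic pre-processing, assuming a mono WAV recording; the file name, cut-off frequencies and window length are illustrative, and this is not the Challenge's reference segmentation code.

```python
# Minimal PCG pre-processing sketch: band-pass filtering plus a Shannon-energy
# envelope, a common front end for heart sound segmentation. The file name,
# cut-off frequencies and window length are illustrative only.
import numpy as np
from scipy.io import wavfile
from scipy.signal import butter, filtfilt

def shannon_energy_envelope(pcg, fs, lo=25.0, hi=400.0, win_s=0.02):
    b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    x = filtfilt(b, a, pcg.astype(float))
    x = x / (np.max(np.abs(x)) + 1e-12)            # normalise to [-1, 1]
    se = -(x ** 2) * np.log(x ** 2 + 1e-12)        # Shannon energy per sample
    win = np.ones(int(win_s * fs)) / int(win_s * fs)
    env = np.convolve(se, win, mode="same")        # moving-average smoothing
    return (env - env.mean()) / (env.std() + 1e-12)

if __name__ == "__main__":
    fs, pcg = wavfile.read("a0001.wav")            # placeholder file name
    print(shannon_energy_envelope(pcg, fs)[:10])
```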

477 citations


Journal ArticleDOI
TL;DR: The primary aim was to determine how closely algorithms agreed with a gold standard RR measure when operating under ideal conditions, and to provide a toolbox of algorithms and data to allow future researchers to conduct reproducible comparisons of algorithms.
Abstract: Over 100 algorithms have been proposed to estimate respiratory rate (RR) from the electrocardiogram (ECG) and photoplethysmogram (PPG). As they have never been compared systematically it is unclear which algorithm performs the best. Our primary aim was to determine how closely algorithms agreed with a gold standard RR measure when operating under ideal conditions. Secondary aims were: (i) to compare algorithm performance with IP, the clinical standard for continuous respiratory rate measurement in spontaneously breathing patients; (ii) to compare algorithm performance when using ECG and PPG; and (iii) to provide a toolbox of algorithms and data to allow future researchers to conduct reproducible comparisons of algorithms. Algorithms were divided into three stages: extraction of respiratory signals, estimation of RR, and fusion of estimates. Several interchangeable techniques were implemented for each stage. Algorithms were assembled using all possible combinations of techniques, many of which were novel. After verification on simulated data, algorithms were tested on data from healthy participants. RRs derived from ECG, PPG and IP were compared to reference RRs obtained using a nasal-oral pressure sensor using the limits of agreement (LOA) technique. 314 algorithms were assessed. Of these, 270 could operate on either ECG or PPG, and 44 on only ECG. The best algorithm had 95% LOAs of -4.7 to 4.7 bpm and a bias of 0.0 bpm when using the ECG, and -5.1 to 7.2 bpm and 1.0 bpm when using PPG. IP had 95% LOAs of -5.6 to 5.2 bpm and a bias of -0.2 bpm. Four algorithms operating on ECG performed better than IP. All high-performing algorithms consisted of novel combinations of time domain RR estimation and modulation fusion techniques. Algorithms performed better when using ECG than PPG. The toolbox of algorithms and data used in this study are publicly available.
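The three-stage structure described above (extraction of a respiratory signal, estimation of RR, fusion of estimates) can be illustrated with a deliberately simple sketch: an amplitude-modulation respiratory signal from detected R-peaks, a Fourier-based RR estimate per window, and median fusion across windows. The peak-detection settings, resampling rate and breathing band are assumptions for illustration; the authors' toolbox implements many interchangeable techniques for each stage.

```python
# Toy three-stage RR estimator: (1) respiratory signal via amplitude modulation
# of detected R-peaks, (2) spectral RR estimate per window, (3) median fusion.
# Thresholds, resampling rate and the breathing band are illustrative only.
import numpy as np
from scipy.signal import find_peaks

def am_respiratory_signal(ecg, fs):
    # Stage 1: assumes an amplitude-normalised ECG so a fixed prominence works.
    peaks, _ = find_peaks(ecg, distance=int(0.4 * fs), prominence=0.5)
    t, amp = peaks / fs, ecg[peaks]
    t_uniform = np.arange(t[0], t[-1], 0.25)       # resample AM series at 4 Hz
    return np.interp(t_uniform, t, amp)

def rr_from_spectrum(resp, fs_resp=4.0):
    # Stage 2: dominant frequency within a plausible breathing band.
    resp = resp - resp.mean()
    f = np.fft.rfftfreq(len(resp), d=1.0 / fs_resp)
    p = np.abs(np.fft.rfft(resp)) ** 2
    band = (f >= 0.07) & (f <= 0.7)                # ~4 to 42 breaths per minute
    return 60.0 * f[band][np.argmax(p[band])]

def fused_rr(ecg, fs, win_s=32.0):
    # Stage 3: median fusion of per-window estimates.
    resp = am_respiratory_signal(np.asarray(ecg, dtype=float), fs)
    n = int(win_s * 4)
    ests = [rr_from_spectrum(resp[i:i + n]) for i in range(0, len(resp) - n + 1, n)]
    return float(np.median(ests)) if ests else float("nan")
```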

252 citations


Journal ArticleDOI
TL;DR: An automated approach, based on a simple algorithm for use with 24 h wear protocols in adults, was developed to classify activity bouts recorded in activPAL 'Events' files as 'sleep'/non-wear (or not) and as occurring on a valid day (or not).
Abstract: The activPAL monitor, often worn 24 h d−1, provides accurate classification of sitting/reclining posture. Without validated automated methods, diaries—burdensome to participants and researchers—are commonly used to ensure measures of sedentary behaviour exclude sleep and monitor non-wear. We developed, for use with 24 h wear protocols in adults, an automated approach to classify activity bouts recorded in activPAL 'Events' files as 'sleep'/non-wear (or not) and on a valid day (or not). The approach excludes long periods without posture change/movement, adjacent low-active periods, and days with minimal movement and wear based on a simple algorithm. The algorithm was developed in one population (STAND study; overweight/obese adults 18–40 years) then evaluated in AusDiab 2011/12 participants (n = 741, 44% men, aged >35 years, mean ± SD 58.5 ± 10.4 years) who wore the activPAL3™ (7 d, 24 h d−1 protocol). Algorithm agreement with a monitor-corrected diary method (usual practice) was tested in terms of the classification of each second as waking wear (Kappa; κ) and the average daily waking wear time, on valid days. The algorithm showed 'almost perfect' agreement (κ > 0.8) for 88% of participants, with a median kappa of 0.94. Agreement varied significantly (p < 0.05, two-tailed) by age (worsens with age) but not by gender. On average, estimated wear time was approximately 0.5 h d−1 higher than by the diary method, with 95% limits of agreement of approximately this amount ±2 h d−1. In free-living data from Australian adults, a simple algorithm developed in a different population showed 'almost perfect' agreement with the diary method for most individuals (88%). For several purposes (e.g. with wear standardisation), adopting a low burden, automated approach would be expected to have little impact on data quality. The accuracy for total waking wear time was less and algorithm thresholds may require adjustments for older populations.
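As a loose illustration of the kind of rule-based bout classification described above (not the validated algorithm or its actual thresholds), the sketch below flags candidate sleep/non-wear periods in an events table as long bouts without posture change, using a hypothetical pandas layout and illustrative thresholds.

```python
# Toy rule-based flagging of sleep/non-wear bouts in an events table.
# The column layout and the 2 h / 30 min thresholds are hypothetical.
import pandas as pd

def flag_sleep_nonwear(events: pd.DataFrame, long_bout_h: float = 2.0) -> pd.DataFrame:
    """events: one row per bout with 'duration_s' and 'activity' in {'sit', 'stand', 'step'}."""
    out = events.copy()
    # Rule 1: a single sitting/lying bout without posture change above the threshold.
    out["sleep_nonwear"] = (out["activity"] == "sit") & (out["duration_s"] >= long_bout_h * 3600)
    # Rule 2 (illustrative): extend the flag to adjacent long, low-active bouts.
    prev_flag = out["sleep_nonwear"].shift(fill_value=False)
    out.loc[prev_flag & (out["duration_s"] >= 1800) & (out["activity"] != "step"),
            "sleep_nonwear"] = True
    return out

demo = pd.DataFrame({"duration_s": [9000, 2400, 600],
                     "activity": ["sit", "stand", "step"]})
print(flag_sleep_nonwear(demo))
```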

171 citations


Journal ArticleDOI
TL;DR: This article presents a standardised methodology for stress testing NI-FECG algorithms, including absolute data as well as extraction and evaluation routines, and shows that both FQRS detection and morphological parameters heavily depend on the extraction technique and signal-to-noise ratio.
Abstract: Over the past decades, many studies have been published on the extraction of non-invasive foetal electrocardiogram (NI-FECG) from abdominal recordings. Most of these contributions claim to obtain excellent results in detecting foetal QRS (FQRS) complexes in terms of location. A small subset of authors have investigated the extraction of morphological features from the NI-FECG. However, due to the shortage of available public databases, the large variety of performance measures employed and the lack of open-source reference algorithms, most contributions cannot be meaningfully assessed. This article attempts to address these issues by presenting a standardised methodology for stress testing NI-FECG algorithms, including absolute data, as well as extraction and evaluation routines. To that end, a large database of realistic artificial signals was created, totaling 145.8 h of multichannel data and over one million FQRS complexes. An important characteristic of this dataset is the inclusion of several non-stationary events (e.g. foetal movements, uterine contractions and heart rate fluctuations) that are critical for evaluating extraction routines. To demonstrate our testing methodology, three classes of NI-FECG extraction algorithms were evaluated: blind source separation (BSS), template subtraction (TS) and adaptive methods (AM). Experiments were conducted to benchmark the performance of eight NI-FECG extraction algorithms on the artificial database focusing on: FQRS detection and morphological analysis (foetal QT and T/QRS ratio). The overall median FQRS detection accuracies (i.e. considering all non-stationary events) for the best performing methods in each group were 99.9% for BSS, 97.9% for AM and 96.0% for TS. Both FQRS detections and morphological parameters were shown to heavily depend on the extraction techniques and signal-to-noise ratio. Particularly, it is shown that their evaluation in the source domain, obtained after using a BSS technique, should be avoided. Data, extraction algorithms and evaluation routines were released as part of the fecgsyn toolbox on Physionet under a GNU GPL open-source license. This contribution provides a standard framework for benchmarking and regulatory testing of NI-FECG extraction algorithms.
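FQRS detection accuracy of the kind reported above is typically computed by matching detections to reference annotations within a fixed tolerance and deriving sensitivity, positive predictive value and F1. The sketch below implements that generic scoring with an assumed 50 ms matching window; it is not the scoring code shipped with the fecgsyn toolbox.

```python
# Generic FQRS detection scoring: match detections to reference annotations
# within a fixed tolerance (assumed 50 ms here) and report Se, PPV and F1.
import numpy as np

def fqrs_scores(ref_s, det_s, tol=0.05):
    ref, det = np.sort(np.asarray(ref_s, float)), np.sort(np.asarray(det_s, float))
    used, tp = np.zeros(len(det), dtype=bool), 0
    for r in ref:
        idx = np.where(~used & (np.abs(det - r) <= tol))[0]
        if idx.size:                                  # take the closest free detection
            used[idx[np.argmin(np.abs(det[idx] - r))]] = True
            tp += 1
    fn, fp = len(ref) - tp, len(det) - tp
    se = tp / (tp + fn) if tp + fn else float("nan")
    ppv = tp / (tp + fp) if tp + fp else float("nan")
    f1 = 2 * se * ppv / (se + ppv) if se + ppv else float("nan")
    return {"Se": se, "PPV": ppv, "F1": f1}

print(fqrs_scores([0.40, 0.82, 1.25, 1.66], [0.41, 0.83, 1.30, 1.95]))
```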

134 citations


Journal ArticleDOI
TL;DR: The need for, and specifications of, a new open reference database of NI-FECG signals are emphasised, together with the need for new algorithms to be benchmarked on the same database employing the same evaluation statistics.
Abstract: Non-invasive foetal electrocardiography (NI-FECG) represents an alternative foetal monitoring technique to traditional Doppler ultrasound approaches that is non-invasive and has the potential to provide additional clinical information. However, despite the significant advances in the field of adult ECG signal processing over the past decades, the analysis of NI-FECG remains challenging and largely unexplored. This is mainly due to the relatively low signal-to-noise ratio of the FECG compared to the maternal ECG, which overlaps in both time and frequency. This article is intended to be used by researchers as a practical guide to NI-FECG signal processing, in the context of the above issues. It reviews recent advances in NI-FECG research including: publicly available databases, NI-FECG extraction techniques for foetal heart rate evaluation and morphological analysis, NI-FECG simulators and the methodology and statistics for assessing the performance of the extraction algorithms. Reference to the most recent work is given, recent findings are highlighted in the form of intermediate summaries, references to open source code and publicly available databases are provided and promising directions for future research are motivated. In particular, we emphasise the need and specifications for building a new open reference database of NI-FECG signals, and the need for new algorithms to be benchmarked on the same database, employing the same evaluation statistics. Finally, we motivate the need for research in NI-FECG to address morphological analysis, since this represents one of the most promising avenues for this foetal monitoring modality.

105 citations


Journal ArticleDOI
TL;DR: A video-based respiration monitoring method that automatically detects a respiratory region of interest (RoI) and signal using a camera seems to provide a valid alternative to ECG in confined motion scenarios, though precision may be reduced for neonates.
Abstract: Vital signs monitoring is ubiquitous in clinical environments and emerging in home-based healthcare applications. Still, since current monitoring methods require uncomfortable sensors, respiration rate remains the least measured vital sign. In this paper, we propose a video-based respiration monitoring method that automatically detects a respiratory region of interest (RoI) and signal using a camera. Based on the observation that respiration-induced chest/abdomen motion is an independent motion system in a video, our basic idea is to exploit the intrinsic properties of respiration to find the respiratory RoI and extract the respiratory signal via motion factorization. We created a benchmark dataset containing 148 video sequences obtained on adults under challenging conditions and also neonates in the neonatal intensive care unit (NICU). The measurements obtained by the proposed video respiration monitoring (VRM) method are not significantly different from the reference methods (guided breathing or contact-based ECG; p-value = 0.6), and explain more than 99% of the variance of the reference values with low limits of agreement (-2.67 to 2.81 bpm). VRM seems to provide a valid alternative to ECG in confined motion scenarios, though precision may be reduced for neonates. More studies are needed to validate VRM under challenging recording conditions, including upper-body motion types.
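The agreement figures quoted above (limits of agreement of -2.67 to 2.81 bpm) come from a Bland-Altman style analysis; a minimal version of that computation is sketched below on hypothetical paired rate estimates.

```python
# Bland-Altman bias and 95% limits of agreement for paired rate estimates.
import numpy as np

def limits_of_agreement(method_a, method_b):
    d = np.asarray(method_a, float) - np.asarray(method_b, float)
    bias, sd = d.mean(), d.std(ddof=1)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical camera-derived vs reference respiration rates (breaths/min).
vrm = [14.8, 16.2, 20.1, 33.0, 41.5]
ref = [15.0, 16.0, 19.5, 34.1, 40.8]
bias, lo, hi = limits_of_agreement(vrm, ref)
print(f"bias = {bias:.2f} bpm, 95% LoA = [{lo:.2f}, {hi:.2f}] bpm")
```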

89 citations


Journal ArticleDOI
TL;DR: The Hilbert adaptive beat identification technique can be used for real-time continuous unobtrusive cardiac monitoring, smartphone cardiography, and in wearable devices aimed at health and well-being applications.
Abstract: Heart rate monitoring helps in assessing the functionality and condition of the cardiovascular system. We present a new real-time applicable approach for estimating beat-to-beat time intervals and heart rate in seismocardiograms acquired from a tri-axial microelectromechanical accelerometer. Seismocardiography (SCG) is a non-invasive method for heart monitoring which measures the mechanical activity of the heart. Measuring true beat-to-beat time intervals from SCG could be used for monitoring of the heart rhythm, for heart rate variability analysis and for many other clinical applications. In this paper we present the Hilbert adaptive beat identification technique for the detection of heartbeat timings and inter-beat time intervals in SCG from healthy volunteers in three different positions, i.e. supine, left and right recumbent. Our method is electrocardiogram (ECG) independent, as it does not require any ECG fiducial points to estimate the beat-to-beat intervals. The performance of the algorithm was tested against standard ECG measurements. The average true positive rate, positive prediction value and detection error rate for the different positions were, respectively, supine (95.8%, 96.0% and ≃0.6%), left (99.3%, 98.8% and ≃0.001%) and right (99.53%, 99.3% and ≃0.01%). High correlation and agreement was observed between SCG and ECG inter-beat intervals (r > 0.99) for all positions, which highlights the capability of the algorithm for SCG heart monitoring from different positions. Additionally, we demonstrate the applicability of the proposed method in smartphone based SCG. In conclusion, the proposed algorithm can be used for real-time continuous unobtrusive cardiac monitoring, smartphone cardiography, and in wearable devices aimed at health and well-being applications.
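A stripped-down, ECG-independent illustration of envelope-based beat timing in SCG is sketched below: band-pass the acceleration signal, take the Hilbert envelope, and pick envelope peaks as beat fiducials from which inter-beat intervals follow. The filter band and peak constraints are assumptions, and the authors' Hilbert adaptive beat identification technique includes adaptive steps not reproduced here.

```python
# Simplified, ECG-independent SCG beat timing: band-pass -> Hilbert envelope ->
# peak picking. Filter band and peak constraints are illustrative only.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert, find_peaks

def scg_beat_times(scg, fs, lo=5.0, hi=25.0, min_rr_s=0.4):
    b, a = butter(3, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    x = filtfilt(b, a, np.asarray(scg, dtype=float))
    env = np.abs(hilbert(x))                               # amplitude envelope
    smooth = max(int(0.05 * fs), 1)
    env = np.convolve(env, np.ones(smooth) / smooth, mode="same")
    peaks, _ = find_peaks(env, distance=int(min_rr_s * fs),
                          height=0.3 * np.max(env))        # illustrative threshold
    return peaks / fs

def inter_beat_intervals(beat_times):
    return np.diff(beat_times)                             # seconds between beats
```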

80 citations


Journal ArticleDOI
TL;DR: The application of a new hyperspectral camera system for the quick and robust recording of remission spectra in the combined VIS and NIR spectral range with high spectral and spatial resolution is introduced.
Abstract: The monitoring of free flaps, free transplants or organs for transplantation still poses a problem in medicine. Available systems for the measurement of perfusion and oxygenation can only perform localized measurements and usually need contact with the tissue. Contact-free hyperspectral imaging and near-infrared spectroscopy (NIRS) for the analysis of tissue oxygenation and perfusion have been used in many scientific studies with good results. But up to now, the clinical and scientific application of this technology has been hindered by the lack of hyperspectral measurement systems usable in clinical practice. We will introduce the application of a new hyperspectral camera system for the quick and robust recording of remission spectra in the combined VIS and NIR spectral range with high spectral and spatial resolution. This new system can be applied for the clinical monitoring of free flaps and organs, providing high-quality oxygenation and perfusion images.

77 citations


Journal ArticleDOI
TL;DR: A stacked contractive denoising auto-encoder (CDAE) is developed to build a deep neural network (DNN) for noise reduction, which can significantly improve the expression of ECG signals through multi-level feature extraction.
Abstract: As a primary diagnostic tool for cardiac diseases, electrocardiogram (ECG) signals are often contaminated by various kinds of noise, such as baseline wander, electrode contact noise and motion artifacts. In this paper, we propose a contractive denoising technique to improve the performance of current denoising auto-encoders (DAEs) for ECG signal denoising. Based on the Frobenius norm of the Jacobian matrix for the learned features with respect to the input, we develop a stacked contractive denoising auto-encoder (CDAE) to build a deep neural network (DNN) for noise reduction, which can significantly improve the expression of ECG signals through multi-level feature extraction. The proposed method is evaluated on ECG signals from the benchmark MIT-BIH Arrhythmia Database, and the noise comes from the MIT-BIH noise stress test database. The experimental results show that the new CDAE algorithm performs better than the conventional ECG denoising method, specifically with more than 2.40 dB improvement in the signal-to-noise ratio (SNR) and improvements of approximately 0.075 to 0.350 in the root mean square error (RMSE).
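A minimal single-layer version of the idea, Gaussian-corrupted inputs plus a penalty on the Frobenius norm of the encoder Jacobian, is sketched below in PyTorch. Layer sizes, noise level and the penalty weight are illustrative, and the paper stacks several such layers into a deep network rather than using a single one.

```python
# Minimal single-layer contractive denoising auto-encoder in PyTorch:
# Gaussian-corrupted inputs plus a penalty on the Frobenius norm of the
# encoder Jacobian. Sizes, noise level and lambda are illustrative only.
import torch
import torch.nn as nn

class ContractiveDAE(nn.Module):
    def __init__(self, n_in=256, n_hidden=64):
        super().__init__()
        self.enc = nn.Linear(n_in, n_hidden)
        self.dec = nn.Linear(n_hidden, n_in)

    def forward(self, x):
        h = torch.sigmoid(self.enc(x))
        return self.dec(h), h

def cdae_loss(model, clean, noisy, lam=1e-4):
    recon, h = model(noisy)
    mse = ((recon - clean) ** 2).mean()
    # ||J_f(x)||_F^2 for sigmoid units: sum_j (h_j(1-h_j))^2 * sum_i W_ji^2
    w_sq = (model.enc.weight ** 2).sum(dim=1)
    jac = ((h * (1 - h)) ** 2 * w_sq).sum(dim=1).mean()
    return mse + lam * jac

model = ContractiveDAE()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
clean = torch.randn(32, 256)                      # stand-in for clean ECG segments
noisy = clean + 0.1 * torch.randn_like(clean)     # denoising: corrupt the input
opt.zero_grad()
loss = cdae_loss(model, clean, noisy)
loss.backward()
opt.step()
print(float(loss))
```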

66 citations


Journal ArticleDOI
TL;DR: In this paper, time- and frequency-domain variability indexes obtained from pulse rate variability (PRV) series extracted from video-photoplethysmography signals (vPPG) were compared with HRV parameters extracted from ECG signals.
Abstract: In this paper, classical time– and frequency-domain variability indexes obtained by pulse rate variability (PRV) series extracted from video-photoplethysmography signals (vPPG) were compared with heart rate variability (HRV) parameters extracted from ECG signals. The study focuses on the analysis of the changes observed during a rest-to-stand manoeuvre (a mild sympathetic stimulus) performed on 60 young, normal subjects (age: years). The objective is to evaluate if video-derived PRV indexes may replace HRV in the assessment of autonomic responses to external stimulation. Video recordings were performed with a GigE Sony XCG-C30C camera and analyzed offline to extract the vPPG signal. A new method based on zero-phase component analysis (ZCA) was employed in combination with a fully-automatic method for detection and tracking of region of interest (ROI) located on the forehead, the cheek and the nose. Results show an overall agreement between time and frequency domain indexes computed on HRV and PRV series. However, some differences exist between resting and standing conditions. During rest, all the indexes computed on HRV and PRV series were not statistically significantly different (p > 0.05), and showed high correlation (Pearson's r > 0.90). The agreement decreases during standing, especially for the high-frequency, respiration-related parameters such as RMSSD (r = 0.75), pNN50 (r = 0.68) and HF power (r = 0.76). Finally, the power in the LF band (n.u.) was observed to increase significantly during standing by both HRV ( versus (n.u.); rest versus standing) and PRV ( versus (n.u.); rest versus standing) analysis, but such an increase was lower in PRV parameters than that observed by HRV indexes. These results provide evidence that some differences exist between variability indexes extracted from HRV and video-derived PRV, mainly in the HF band during standing. However, despite these differences video-derived PRV indexes were able to evince the autonomic responses expected by the sympathetic stimulation induced by the rest-to-stand manoeuvre.
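The time-domain indexes compared above (SDNN, RMSSD, pNN50) have standard definitions on an inter-beat or pulse-to-pulse interval series; a small sketch of those definitions is given below on synthetic interval data, with a Pearson correlation between the two synthetic series shown purely for illustration (the study correlates indexes across subjects, not raw series).

```python
# Standard time-domain variability indexes on an inter-beat interval series (ms),
# applied to synthetic HRV and PRV stand-ins.
import numpy as np

def time_domain_indexes(ibi_ms):
    ibi = np.asarray(ibi_ms, dtype=float)
    d = np.diff(ibi)
    return {"SDNN": ibi.std(ddof=1),                      # overall variability
            "RMSSD": np.sqrt(np.mean(d ** 2)),            # short-term variability
            "pNN50": 100.0 * np.mean(np.abs(d) > 50.0)}   # % successive diffs > 50 ms

rng = np.random.default_rng(0)
hrv_ibi = 800 + 40 * rng.standard_normal(300)             # ECG-derived intervals
prv_ibi = hrv_ibi + 5 * rng.standard_normal(300)          # video-derived intervals
print(time_domain_indexes(hrv_ibi))
print(time_domain_indexes(prv_ibi))
print("Pearson r between the two series:", np.corrcoef(hrv_ibi, prv_ibi)[0, 1])
```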

59 citations


Journal ArticleDOI
TL;DR: Simulation data suggest that stable estimates of group-level means can be obtained from as few as one randomly selected monitoring day from a sampled week; on the other hand, estimates using non-random selection of weekend days may be significantly biased.
Abstract: To determine the number and distribution of days required to produce stable group-level estimates of a 7 d mean for common accelerometer-derived activity measures. Data from the 2003–2006 NHANES were used in this analysis. The sample included 986 youth (6–19 year) and 2532 adults (≥20 year) with 7 d of ≥10 h of wear. Accelerometer measures included minutes of inactive, light physical activity, moderate-to-vigorous physical activity (MVPA); and total activity counts/d. Twenty-five alternative protocols were bootstrapped with 50 000 samples drawn for each protocol. Alternative protocols included: 1–6 random days, Saturday plus 1–5 random weekdays (WD), Sunday plus 1–5 random WD, 1 random weekend day (WE) plus 1–5 WD, and both WE plus 1–4 random WD. Relative difference was calculated between the 7 d mean and alternative protocol mean (((alternative protocol mean – 7 d mean)/7 d mean) * 100). Adult MVPA is used as an example; however, similar trends were observed across age groups and variables except adult inactive time, which was stable across protocols. The 7 d mean for adult MVPA was 44.1(0.9) min d−1. The mean bias for any 1–6 random days ranged from −0.0(0.3) to 0.0(0.2) min d−1 with a relative difference of −0.1 to 0.0%. For protocols with non-random components, bias ranged from −1.4(0.2) to 0.6(0.1) min d−1 with relative difference ranging from −7.2 to 3.1%. Simulation data suggest that stable estimates of group-level means can be obtained from as few as one randomly selected monitoring day from a sampled week. On the other hand, estimates using non-random selection of weekend days may be significantly biased. Purposeful sampling that disproportionally forces inclusion of weekend data in analyses should be discouraged.
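The bootstrapping logic described above, comparing an alternative protocol's group-level mean against the full 7 d mean as a relative difference, reduces to a few lines; the sketch below uses synthetic per-day MVPA values and a one-random-day protocol, with replicate counts kept small for illustration.

```python
# Bootstrap bias of a sampled-day protocol relative to the 7-day mean,
# on synthetic per-day MVPA values (min/day).
import numpy as np

rng = np.random.default_rng(1)
mvpa = np.clip(rng.normal(44, 15, size=(1000, 7)), 0, None)   # participants x days
seven_day_mean = mvpa.mean()

def protocol_relative_difference(days_per_person=1, n_boot=2000):
    diffs = np.empty(n_boot)
    for b in range(n_boot):
        cols = rng.integers(0, 7, size=(mvpa.shape[0], days_per_person))
        sampled_mean = np.take_along_axis(mvpa, cols, axis=1).mean()
        diffs[b] = 100.0 * (sampled_mean - seven_day_mean) / seven_day_mean
    return diffs.mean(), diffs.std(ddof=1)

print("1 random day: relative difference %.2f%% (SD %.2f%%)"
      % protocol_relative_difference(1))
```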

Journal ArticleDOI
TL;DR: The results show that the inclusion of PAT reduced the standard deviation of the difference from 14.07 to 13.52 mmHg, and two groups of features were identified that contributed more to SBP estimation compared to other features.
Abstract: In this work, a model to estimate systolic blood pressure (SBP) using photoplethysmography (PPG) and electrocardiography (ECG) is proposed. Data from 19 subjects performing a 40 min exercise protocol were analyzed. Reference SBP was measured at the finger based on the volume-clamp principle. PPG signals were measured at the finger and forehead. After an initialization process for each subject at rest, the model estimated SBP every 30 s for the whole period of exercise. In order to build this model, 18 features were extracted from PPG signals by means of their waveform, first derivative, second derivative, and frequency spectrum. In addition, pulse arrival time (PAT) was derived as a feature from the combination of PPG and ECG. After evaluating four regression models, we chose multiple linear regression (MLR) to combine all derived features to estimate SBP. The contribution of each feature was quantified using its normalized weight in the MLR. To evaluate the performance of the model, we used a leave-one-subject-out cross validation. With the aim of exploring the potential of the model, we investigated the influences of the inclusion of PAT, regression models, measurement sites (finger and forehead), and posture change (lying, sitting, and standing). The results show that the inclusion of PAT reduced the standard deviation (SD) of the difference from 14.07 to 13.52 mmHg. There was no significant difference in the estimation performance between the model using finger- and forehead-derived PPG signals. Separate models are necessary for different postures. The optimized model using finger-derived PPG signals during physical exercise had a performance with a mean difference of 0.43 mmHg, an SD of difference of 13.52 mmHg, and median correlation coefficients of 0.86. Furthermore, we identified two groups of features that contributed more to SBP estimation compared to other features. One group consists of our proposed features depicting beat morphology. The other comprises existing features depicting the dicrotic notch. The present work demonstrates promising results of the SBP estimation model during physical exercise.
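The evaluation scheme described, multiple linear regression over pulse-wave features with leave-one-subject-out cross-validation, maps directly onto scikit-learn primitives. In the sketch below the feature matrix, reference SBP values and subject labels are all synthetic placeholders.

```python
# Leave-one-subject-out evaluation of a multiple-linear-regression SBP model
# on synthetic features (placeholders for the PPG/ECG-derived features and PAT).
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneGroupOut

rng = np.random.default_rng(2)
n_subjects, n_windows, n_features = 19, 80, 19
subjects = np.repeat(np.arange(n_subjects), n_windows)
X = rng.standard_normal((n_subjects * n_windows, n_features))
y = 120 + 8 * X[:, 0] + rng.normal(0, 5, size=len(subjects))   # synthetic SBP (mmHg)

errors = []
for train, test in LeaveOneGroupOut().split(X, y, groups=subjects):
    model = LinearRegression().fit(X[train], y[train])
    errors.append(y[test] - model.predict(X[test]))
errors = np.concatenate(errors)
print(f"mean difference {errors.mean():.2f} mmHg, SD {errors.std(ddof=1):.2f} mmHg")
```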

Journal ArticleDOI
TL;DR: Examination of the effect of accelerometer-based wearable (accW) location, walking speed, age and algorithm on gait characteristics showed that algorithm adjustment did not influence agreement between results obtained at different locations.
Abstract: Wearables such as accelerometers are emerging as powerful tools for quantifying gait in various environments. Flexibility in wearable location may improve ease of use and data acquisition during instrumented testing. However, change of location may impact algorithm functionality when evaluating associated gait characteristics. Furthermore, this may be exacerbated by testing protocol (different walking speed) and age. Therefore, the aim of this study was to examine the effect of accelerometer-based wearable (accW) location, walking speed, age and algorithm on gait characteristics. Forty younger (YA) and 40 older adults (OA) were recruited. Participants wore accW positioned at the chest, waist and lower back (L5, gold standard) and were asked to walk continuously for 2 min at preferred and fast speeds. Two algorithms, previously validated for accW located on L5, were used to quantify step time and step length. Mean, variability and asymmetry gait characteristics were estimated for each location with reference to L5. To examine the impact of location and speed on algorithm-dependent characteristic evaluation, adjustments were made to the temporal algorithm. Absolute, relative agreement and difference between measurements at different locations and L5 were assessed. Mean step time and length evaluated from the chest showed excellent agreement compared to L5 for both age groups and speeds. Agreement between waist and L5 was excellent for mean step time for both speeds and age groups, good for mean step length at both speeds for YA and at preferred speed for OA. Step time and length asymmetry evaluated from the chest showed moderate agreement for YA only. Lastly, results showed that algorithm adjustment did not influence agreement between results obtained at different locations. Mean spatiotemporal characteristics can be robustly quantified from accW at the locations used in this study irrespective of speed and age; this is not true when estimating variability and asymmetry characteristics.

Journal ArticleDOI
TL;DR: Investigation of regional lung function using EIT enables the assessment of spatial and temporal heterogeneity of ventilation distribution during bronchodilator reversibility testing, allowing the estimation of the natural disease progression and therapy effects on the regional and not only global level.
Abstract: The measurement of rapid regional lung volume changes by electrical impedance tomography (EIT) could determine regional lung function in patients with obstructive lung diseases during pulmonary function testing (PFT). EIT examinations carried out before and after bronchodilator reversibility testing could detect the presence of spatial and temporal ventilation heterogeneities and analyse their changes in response to inhaled bronchodilator on the regional level. We examined seven patients suffering from chronic asthma (49 ± 19 years, mean age ± SD) using EIT at a scan rate of 33 images s−1 during tidal breathing and PFT with forced full expiration. The patients were studied before and 5, 10 and 20 min after bronchodilator inhalation. Seven age- and sex-matched human subjects with no lung disease history served as a control study group. The spatial heterogeneity of lung function measures was quantified by the global inhomogeneity indices calculated from the pixel values of tidal volume, forced expiratory volume in one second (FEV1), forced vital capacity (FVC), peak flow and forced expiratory flow between 25% and 75% of FVC as well as histograms of pixel FEV1/FVC values. Temporal heterogeneity was assessed using the pixel values of expiration times needed to exhale 75% and 90% of pixel FVC. Regional lung function was more homogeneous in the healthy subjects than in the patients with asthma. Spatial and temporal ventilation distribution improved in the patients with asthma after the bronchodilator administration as evidenced mainly by the histograms of pixel FEV1/FVC values and pixel expiration times. The examination of regional lung function using EIT enables the assessment of spatial and temporal heterogeneity of ventilation distribution during bronchodilator reversibility testing. EIT may become a new tool in PFT, allowing the estimation of the natural disease progression and therapy effects on the regional and not only global level.
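One widely used form of the global inhomogeneity (GI) index is the sum of absolute deviations of pixel values from the median over lung pixels, normalised by the pixel sum; whether this exact variant matches every index used in the study is an assumption, and the sketch below shows only the generic computation on a toy image.

```python
# Global inhomogeneity (GI) index of a pixel-wise EIT functional image:
# sum of absolute deviations from the lung-pixel median, normalised by the sum.
import numpy as np

def gi_index(pixel_values, lung_mask):
    lung = pixel_values[lung_mask].astype(float)
    return np.sum(np.abs(lung - np.median(lung))) / np.sum(lung)

rng = np.random.default_rng(3)
mask = np.ones((32, 32), dtype=bool)
homogeneous = 10.0 + rng.normal(0, 0.2, (32, 32))
patchy = homogeneous.copy()
patchy[:16] *= 0.4                    # simulate a poorly ventilated region
print(gi_index(homogeneous, mask), gi_index(patchy, mask))   # patchy image -> larger GI
```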

Journal ArticleDOI
TL;DR: An algorithm that classifies whether a generated cardiac arrhythmia alarm is true or false is proposed, trained and evaluated with the PhysioNet/Computing in Cardiology Challenge 2015 data set.
Abstract: In this paper, we propose an algorithm that classifies whether a generated cardiac arrhythmia alarm is true or false. The large number of false alarms in intensive care is a severe issue. The noise peaks caused by alarms can be high and in a noisy environment nurses can experience stress and fatigue. In addition, patient safety is compromised because reaction time of the caregivers to true alarms is reduced. The data for the algorithm development consisted of records of electrocardiogram (ECG), arterial blood pressure, and photoplethysmogram signals in which an alarm for either asystole, extreme bradycardia, extreme tachycardia, ventricular fibrillation or flutter, or ventricular tachycardia occurs. First, heart beats are extracted from every signal. Next, the algorithm selects the most reliable signal pair from the available signals by comparing how well the detected beats match between different signals, based on a beat-matching score, and selecting the best match. From the selected signal pair, arrhythmia specific features, such as heart rate features and signal purity index are computed for the alarm classification. The classification is performed with five separate Random Forest models. In addition, information on the local noise level of the selected ECG lead is added to the classification. The algorithm was trained and evaluated with the PhysioNet/Computing in Cardiology Challenge 2015 data set. In the test set the overall true positive rates were 93 and 95% and true negative rates 80 and 83%, respectively for events with no information and events with information after the alarm. The overall challenge scores were 77.39 and 81.58.
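The final classification stage described, separate Random Forest models per arrhythmia type operating on heart-rate and signal-quality features, is illustrated generically below with scikit-learn and synthetic feature vectors; the feature set and model settings are placeholders rather than the authors' configuration.

```python
# Generic per-alarm-type Random Forest true/false alarm classification,
# with synthetic placeholder features (e.g. HR extrema, signal purity, noise level).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(4)
alarm_types = ["asystole", "brady", "tachy", "vfib", "vtach"]
models = {}
for atype in alarm_types:
    X = rng.standard_normal((300, 4))
    y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(0, 0.5, 300) > 0).astype(int)  # 1 = true alarm
    models[atype] = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

new_alarm = rng.standard_normal((1, 4))
print("P(true alarm | vtach):", models["vtach"].predict_proba(new_alarm)[0, 1])
```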

Journal ArticleDOI
TL;DR: FD analysis could be used as a complementary variable providing further information on central mechanisms, with respect to CV, in fatiguing contractions; the steeper FD rate of change observed in the elderly suggests a greater increase in motor unit synchronization with ageing.
Abstract: Over the past decade, linear and nonlinear surface electromyography (EMG) variables highlighting different components of fatigue have been developed. In this study, we tested fractal dimension (FD) and conduction velocity (CV) rate of changes as descriptors, respectively, of motor unit synchronization and peripheral manifestations of fatigue. Sixteen elderly (69 ± 4 years) and seventeen young (23 ± 2 years) physically active men (almost 3-5 h of physical activity per week) executed one knee extensor contraction at 70% of a maximal voluntary contraction for 30 s. Muscle fiber CV and FD were calculated from the multichannel surface EMG signal recorded from the vastus lateralis and medialis muscles. The main findings were that the two groups showed a similar rate of change of CV, whereas FD rate of change was higher in the young than in the elderly group. The trends were the same for both muscles. CV findings highlighted a non-different extent of peripheral manifestations of fatigue between groups. Nevertheless, FD rate of change was found to be steeper in the elderly than in the young, suggesting a greater increase in motor unit synchronization with ageing. These findings suggest that FD analysis could be used as a complementary variable providing further information on central mechanisms with respect to CV in fatiguing contractions.
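Fractal dimension of a surface EMG epoch can be estimated in several ways; the sketch below uses Higuchi's method as one standard estimator, without implying that it is the exact FD estimator employed in the study.

```python
# Higuchi fractal dimension of a 1-D epoch -- one standard FD estimator.
import numpy as np

def higuchi_fd(x, k_max=10):
    x = np.asarray(x, dtype=float)
    n = len(x)
    log_len, log_inv_k = [], []
    for k in range(1, k_max + 1):
        lengths = []
        for m in range(k):
            idx = np.arange(m, n, k)
            if len(idx) < 2:
                continue
            # Normalised curve length of the sub-sampled series (Higuchi, 1988).
            lengths.append(np.sum(np.abs(np.diff(x[idx]))) * (n - 1)
                           / ((len(idx) - 1) * k * k))
        log_len.append(np.log(np.mean(lengths)))
        log_inv_k.append(np.log(1.0 / k))
    slope, _ = np.polyfit(log_inv_k, log_len, 1)
    return slope

rng = np.random.default_rng(5)
print(higuchi_fd(rng.standard_normal(2048)))   # white noise gives an FD close to 2
```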

Journal ArticleDOI
TL;DR: The main motivation was to supply investigators with a tool allowing fast and interactive work with large multichannel records produced by EEG, electrocardiograph and similar devices; rendering latency was compared with EEGLAB, with SignalPlant proving significantly faster when displaying an image from a large number of samples.
Abstract: The growing technical standard of acquisition systems allows the acquisition of large records, often reaching gigabytes or more in size as is the case with whole-day electroencephalograph (EEG) recordings, for example. Although current 64-bit software for signal processing is able to process (e.g. filter, analyze, etc) such data, visual inspection and labeling will probably suffer from rather long latency during the rendering of large portions of recorded signals. For this reason, we have developed SignalPlant, a stand-alone application for signal inspection, labeling and processing. The main motivation was to supply investigators with a tool allowing fast and interactive work with large multichannel records produced by EEG, electrocardiograph and similar devices. Rendering latency was compared with EEGLAB, and SignalPlant proves significantly faster when displaying an image from a large number of samples (e.g. 163-times faster for 75 × 10⁶ samples). The presented SignalPlant software is available free and does not depend on any other computation software. Furthermore, it can be extended with plugins by third parties ensuring its adaptability to future research tasks and new data formats.

Journal ArticleDOI
TL;DR: A large database containing uniform recordings was constructed to allow more robust estimates of normative ranges and also to assess the influence of age and sex; age was found to influence baseline haemodynamic and transfer function analysis parameters.
Abstract: Normative values of physiological parameters hold significance in modern day clinical decision-making. Lack of such normative values has been a major hurdle in the translation of research into clinical practice. A large database containing uniform recordings was constructed to allow more robust estimates of normative ranges and also assess the influence of age and sex. Doppler recordings were performed on healthy volunteers in the same laboratory, using similar protocols and equipment. Beat-to-beat blood pressure, heart-rate, electrocardiogram, and end-tidal CO2 were measured continuously. Bilateral insonation of the middle cerebral arteries (MCAs) was performed using transcranial Doppler (TCD) following a 15 min stabilisation, and a 5 min baseline recording. Good quality Doppler recordings for both MCAs were obtained in 129 participants (57 female) with a median age of 57 years (range 20-82). Age was found to influence baseline haemodynamic and transfer function analysis parameters. Cerebral blood flow velocity and critical closing pressure showed the only sex-related differences found, being significantly higher in females than in males. Normative values for cerebral haemodynamic parameters have been defined in a large, healthy population. Such age/sex-defined normal values can be used to reduce the burden of collecting additional control data in future studies, as well as to identify disease-associated changes.

Journal ArticleDOI
TL;DR: This work will introduce, explain and review some of the most important coupling measures and classify them according to their origin and capabilities in the light of physiological analyses, and address time dependent couplings as identified using a recent approach employing ensembles of time series.
Abstract: Health is one of the most important non-material assets and thus also has an enormous influence on material values, since treating and preventing diseases is expensive. The number one cause of death worldwide today originates in cardiovascular diseases. For these reasons the aim of understanding the functions and the interactions of the cardiovascular system is and has been a major research topic throughout various disciplines for more than a hundred years. The purpose of most of today's research is to get as much information as possible with the lowest possible effort and the least discomfort for the subject or patient, e.g. via non-invasive measurements. A family of tools whose importance has been growing during the last years is known under the headline of coupling measures. The rationale for this kind of analysis is to identify the structure of interactions in a system of multiple components. Important information lies for example in the coupling direction, the coupling strength, and occurring time lags. In this work, we will, after a brief general introduction covering the development of cardiovascular time series analysis, introduce, explain and review some of the most important coupling measures and classify them according to their origin and capabilities in the light of physiological analyses. We will begin with classical correlation measures, go via Granger-causality-based tools, entropy-based techniques (e.g. momentary information transfer), nonlinear prediction measures (e.g. mutual prediction) to symbolic dynamics (e.g. symbolic coupling traces). All these methods have contributed important insights into physiological interactions like cardiorespiratory coupling, neuro-cardio-coupling and many more. Furthermore, we will cover tools to detect and analyze synchronization and coordination (e.g. synchrogram and coordigram). As a last point we will address time dependent couplings as identified using a recent approach employing ensembles of time series. The scope of this review, as opposed to various other excellent reviews like (Hlavackova-Schindler et al Phys. Rep. 441 1-46, Kramer et al 2004 Phys. Rev. E 70 1-10, Lombardi 2000 Circulation 101 8-10, Porta et al 2000 Am. J. Physiol.: Heart and Circulatory Physiol. 279 H2558-67, Schelter et al 2006 J. Neurosci. Methods 152 210-9), is to give a broader overview over existing coupling measures and where to look to find the most appropriate tool for a given situation. The review will comprise a test of one representative of the most important coupling measure groups using a simple toy model to illustrate some essential features of the tools. At the end we will summarise the performance of each measure and offer some advice on when to use which method.

Journal ArticleDOI
TL;DR: The findings demonstrate that both heart disease and the calibration interval can influence the accuracy of PTT-based BP estimation and should be taken into consideration to improve estimation accuracy.
Abstract: Continuous blood pressure (BP) measurement without a cuff is advantageous for the early detection and prevention of hypertension. The pulse transit time (PTT) method has proven to be promising for continuous cuffless BP measurement. However, the problem of accuracy is one of the most challenging aspects before the large-scale clinical application of this method. Since PTT-based BP estimation relies primarily on the relationship between PTT and BP under certain assumptions, estimation accuracy will be affected by cardiovascular disorders that impair this relationship and by the calibration frequency, which may violate these assumptions. This study sought to examine the impact of heart disease and the calibration interval on the accuracy of PTT-based BP estimation. The accuracy of a PTT-BP algorithm was investigated in 37 healthy subjects and 48 patients with heart disease at different calibration intervals, namely 15 min, 2 weeks, and 1 month after initial calibration. The results showed that the overall accuracy of systolic BP estimation was significantly lower in subjects with heart disease than in healthy subjects, but diastolic BP estimation was more accurate in patients than in healthy subjects. The accuracy of systolic and diastolic BP estimation becomes less reliable with longer calibration intervals. These findings demonstrate that both heart disease and the calibration interval can influence the accuracy of PTT-based BP estimation and should be taken into consideration to improve estimation accuracy.

Journal ArticleDOI
TL;DR: The 2015 PhysioNet/Computing in Cardiology Challenge provides a set of 1250 multi-parameter ICU data segments associated with critical arrhythmia alarms, and challenges the general research community to address the issue of false alarm suppression using all available signals.
Abstract: High false alarm rates in the ICU decrease quality of care by slowing staff response times while increasing patient delirium through noise pollution. The 2015 PhysioNet/Computing in Cardiology Challenge provides a set of 1,250 multi-parameter ICU data segments associated with critical arrhythmia alarms, and challenges the general research community to address the issue of false alarm suppression using all available signals. Each data segment was 5 minutes long (for real time analysis), ending at the time of the alarm. For retrospective analysis, we provided a further 30 seconds of data after the alarm was triggered. A total of 750 data segments were made available for training and 500 were held back for testing. Each alarm was reviewed by expert annotators, at least two of whom agreed that the alarm was either true or false. Challenge participants were invited to submit a complete, working algorithm to distinguish true from false alarms, and received a score based on their program's performance on the hidden test set. This score was based on the percentage of alarms correct, but with a penalty that weights the suppression of true alarms five times more heavily than acceptance of false alarms. We provided three example entries based on well-known, open source signal processing algorithms, to serve as a basis for comparison and as a starting point for participants to develop their own code. A total of 38 teams submitted a total of 215 entries in this year's Challenge. This editorial reviews the background issues for this Challenge, the design of the Challenge itself, the key achievements, and the follow-up research generated as a result of the Challenge, published in the concurrent special issue of Physiological Measurement. Additionally we make some recommendations for future changes in the field of patient monitoring as a result of the Challenge.
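The scoring rule described, the percentage of correctly handled alarms with suppressed true alarms penalised five times as heavily as accepted false alarms, can be written directly from the confusion-matrix counts; the sketch below follows that description.

```python
# Challenge-style score: percentage correct with missed true alarms (fn)
# weighted five times more heavily than accepted false alarms (fp).
def challenge_score(tp, fp, fn, tn):
    """tp: true alarms kept, tn: false alarms suppressed,
    fp: false alarms kept, fn: true alarms wrongly suppressed."""
    return 100.0 * (tp + tn) / (tp + tn + fp + 5 * fn)

# Suppressing more false alarms helps only if few true alarms are lost.
print(challenge_score(tp=285, fp=40, fn=15, tn=160))   # ~79.5
print(challenge_score(tp=270, fp=20, fn=30, tn=180))   # ~72.6
```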

Journal ArticleDOI
TL;DR: Bandwidth estimations based on the instantaneous frequency (IF) and amplitude (IA) spectra of the modes indicate that the proposed VMD-based features have sufficient class discrimination capability regarding ECG beats.
Abstract: It is a difficult process to detect abnormal heart beats, known as arrhythmia, in long-term ECG recording. Thus, computer-aided diagnosis systems have become a supportive tool for helping physicians improve the diagnostic accuracy of heartbeat detection. This paper explores the bandwidth properties of the modes obtained using variational mode decomposition (VMD) to classify arrhythmia electrocardiogram (ECG) beats. VMD is an enhanced version of the empirical mode decomposition (EMD) algorithm for analyzing non-linear and non-stationary signals. It decomposes the signal into a set of band-limited oscillations called modes. ECG signals from the MIT-BIH arrhythmia database are decomposed using VMD, and the amplitude modulation bandwidth B_AM, the frequency modulation bandwidth B_FM and the total bandwidth B of the modes are used as feature vectors to detect heartbeats such as normal (N), premature ventricular contraction (V), left bundle branch block (L), right bundle branch block (R), paced beat (P) and atrial premature beat (A). Bandwidth estimations based on the instantaneous frequency (IF) and amplitude (IA) spectra of the modes indicate that the proposed VMD-based features have sufficient class discrimination capability regarding ECG beats. Moreover, the extracted features using the bandwidths (B_AM, B_FM and B) of four modes are used to evaluate the diagnostic accuracy rates of several classifiers such as the k-nearest neighbor classifier (k-NN), the decision tree (DT), the artificial neural network (ANN), the bagged decision tree (BDT), the AdaBoost decision tree (ABDT) and random sub-spaced k-NN (RSNN) for N, R, L, V, P, and A beats. The performance of the proposed VMD-based feature extraction with a BDT classifier has accuracy rates of 99.06%, 99.00%, 99.40%, 99.51%, 98.72%, 98.71%, and 99.02% for overall, N-, R-, L-, V-, P-, and A-type ECG beats, respectively.
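The bandwidth features of a mode can be computed from its analytic signal; the sketch below uses one common spectral-moment-style definition of B_AM, B_FM and the total bandwidth B, which may differ in constants from the authors' exact estimator, and it takes a toy AM-FM signal as input rather than an actual VMD mode.

```python
# Bandwidth features of a band-limited mode from its analytic signal, using a
# spectral-moment-style definition; the input is a toy AM-FM signal, not a VMD mode.
import numpy as np
from scipy.signal import hilbert

def mode_bandwidths(mode, fs):
    z = hilbert(np.asarray(mode, dtype=float))
    ia = np.abs(z)                                    # instantaneous amplitude
    i_f = np.gradient(np.unwrap(np.angle(z))) * fs / (2 * np.pi)   # inst. frequency (Hz)
    energy = np.sum(ia ** 2)
    b_am2 = np.sum((np.gradient(ia) * fs / (2 * np.pi)) ** 2) / energy
    f_mean = np.sum(i_f * ia ** 2) / energy           # energy-weighted mean frequency
    b_fm2 = np.sum((i_f - f_mean) ** 2 * ia ** 2) / energy
    return np.sqrt(b_am2), np.sqrt(b_fm2), np.sqrt(b_am2 + b_fm2)

fs = 360.0
t = np.arange(0, 2, 1 / fs)
toy_mode = (1 + 0.3 * np.sin(2 * np.pi * 1.2 * t)) * np.sin(2 * np.pi * 12 * t)
print(mode_bandwidths(toy_mode, fs))
```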

Journal ArticleDOI
TL;DR: It is shown that it is feasible to reduce sound pressure levels in the intensive care unit (ICU) setting using architectural modifications, achieving both an overall noise reduction and clearer day-night variations in sound level.
Abstract: Noise is a proven cause of wakefulness and qualitative sleep disturbance in critically ill patients. A sound pressure level reduction can improve sleep quality, but there are no studies showing the feasibility of such a noise reduction in the intensive care unit (ICU) setting. Considering all available evidence, we redesigned two ICU rooms with the aim of investigating the physiological and clinical impact of a healing environment, including a noise reduction and day-night variations of sound level. Within an experimental design, we recorded 96 h of sound-pressure levels in standard ICU rooms and the modified ICU rooms. In addition, we performed a sound source observation by human observers. Our results show that we reduced A-weighted equivalent sound pressure levels and maximum sound pressure levels with our architectural interventions. During night-time, the modification led to a significant decrease in 50 dB threshold overruns from 65.5% to 39.9% (door side) and from 50% to 10.5% (window side). Sound peaks of more than 60 decibels were significantly reduced from 62.0% to 26.7% (door side) and 59.3% to 30.3% (window side). Time-series analysis of linear trends revealed a significantly more distinct day-night pattern in the modified rooms with lower sound levels during night-times. Observed sound sources during night revealed four times as many talking events in the standard room compared to the modified room. In summary, we show that it is feasible to reduce sound pressure levels using architectural modifications.

Journal ArticleDOI
TL;DR: An extension of the Graz consensus reconstruction algorithm for EIT (GREIT) to 3D and open-source tools are developed to evaluate its performance as a function of the choice of stimulation and measurement pattern.
Abstract: Most applications of thoracic EIT use a single plane of electrodes on the chest from which a transverse image 'slice' is calculated. However, interpretation of EIT images is made difficult by the large region above and below the electrode plane to which EIT is sensitive. Volumetric EIT images using two (or more) electrode planes should help compensate, but are little used currently. The Graz consensus reconstruction algorithm for EIT (GREIT) has become popular in lung EIT. One shortcoming of the original formulation of GREIT is its restriction to reconstruction onto a 2D planar image. We present an extension of the GREIT algorithm to 3D and develop open-source tools to evaluate its performance as a function of the choice of stimulation and measurement pattern. Results show 3D GREIT using two electrode layers has significantly more uniform sensitivity profiles through the chest region. Overall, the advantages of 3D EIT are compelling.

Journal ArticleDOI
TL;DR: The orientation dependent difference in SNR in EEG and MEG gradually changed as the sources were located deeper, where the interictal spikes generated higher SNRs in EEG compared to those in MEG for all source orientations.
Abstract: Simultaneous electroencephalography (EEG) and magnetoencephalography (MEG) recordings of neuronal activity from epileptic patients reveal situations in which either EEG or MEG or both modalities show visible interictal spikes. While different signal-to-noise ratios (SNRs) of the spikes in EEG and MEG have been reported, a quantitative relation of spike source orientation and depth as well as the background brain activity to the SNR has not been established. We investigated this quantitative relationship for both dipole and patch sources in an anatomically realistic cortex model. Altogether, 5600 dipole and 3300 patch sources were distributed on the segmented cortical surfaces of two volunteers. The sources were classified according to their quantified depths and orientations, ranging from 20 mm to 60 mm below the skin surface and radial and tangential, respectively. The source time-courses mimicked an interictal spike, and the simulated background activity emulated resting activity. Simulations were conducted with individual three-compartment boundary element models. The SNR was evaluated for 128 EEG, 102 MEG magnetometer, and 204 MEG gradiometer channels. For superficial dipole and superficial patch sources, EEG showed higher SNRs for dominantly radial orientations, and MEG showed higher values for dominantly tangential orientations. Gradiometers provided higher SNR than magnetometers for superficial sources, particularly for those with dominantly tangential orientations. The orientation dependent difference in SNR in EEG and MEG gradually changed as the sources were located deeper, where the interictal spikes generated higher SNRs in EEG compared to those in MEG for all source orientations. With deep sources, the SNRs in gradiometers and magnetometers were of the same order. To better detect spikes, both EEG and MEG should be used.

Journal ArticleDOI
TL;DR: Transfer function analysis was used and two-step criteria were adopted to accept estimates of ARI as valid, showing their ability to identify conditions where ARI estimates should be rejected, for example due to CBFV step responses lacking physiological plausibility.
Abstract: The autoregulation index (ARI) can reflect the effectiveness of cerebral blood flow (CBF) control in response to dynamic changes in arterial blood pressure (BP), but objective criteria for its validation have not been proposed. Monte Carlo simulations were performed by generating 5 min long random input/output signals that mimic the properties of mean beat-to-beat BP and CBF velocity (CBFV) as usually obtained by non-invasive measurements in the finger (Finometer) and middle cerebral artery (transcranial Doppler ultrasound), respectively. Transfer function analysis (TFA) was used to estimate values of ARI by optimal fitting of template curves to the output (or CBFV) response to a step change in input (or BP). Two-step criteria were adopted to accept estimates of ARI as valid. The 95% confidence limit of the mean coherence function (0.15-0.25 Hz) was estimated from 15 000 runs, resulting in a critical value of 0.190 when using five segments of data, each with 102.4 s (512 samples) duration (Welch's method). This threshold for acceptance was dependent on the TFA settings and increased when using segments with shorter duration (51.2 s). For signals with mean coherence above the critical value, the 5% confidence limit of the normalised mean square error (NMSEcrit) for fitting the step response to Tiecks' model was found to be approximately 0.30 and independent of the TFA settings. Application of these criteria to physiological and clinical sets of data showed their ability to identify conditions where ARI estimates should be rejected, for example due to CBFV step responses lacking physiological plausibility. A larger number of recordings were rejected from acute ischaemic stroke patients than for healthy volunteers. More work is needed to validate this procedure with different physiological conditions and/or patient groups. The influence of non-stationarity in BP and CBFV signals should also be investigated.

Journal ArticleDOI
TL;DR: Evidence is provided that NI-FECG technology enables accurate extraction of the fetal QT interval in term laboring women, with a root mean square error lower than the lowest observed in related adult QT studies.
Abstract: Non-invasive fetal electrocardiography (NI-FECG) is a promising alternative continuous fetal monitoring method that has the potential to allow morphological analysis of the FECG. However, there are a number of challenges associated with the evaluation of morphological parameters from the NI-FECG, including low signal to noise ratio of the NI-FECG and methodological challenges for getting reference annotations and evaluating the accuracy of segmentation algorithms. This work aims to validate the measurement of the fetal QT interval in term laboring women using a NI-FECG electrocardiogram monitor. Fetal electrocardiogram data were recorded from 22 laboring women at term using the NI-FECG and an invasive fetal scalp electrode simultaneously. A total of 105 one-minute epochs were selected for analysis. Three pediatric electrophysiologists independently annotated individual waveforms and averaged waveforms from each epoch. The intervals measured on the averaged cycles taken from the NI-FECG and the fetal scalp electrode showed a close agreement; the root mean square error between all corresponding averaged NI-FECG and fetal scalp electrode beats was 13.6 ms, which is lower than the lowest adult root mean square error of 16.1 ms observed in related adult QT studies. These results provide evidence that NI-FECG technology enables accurate extraction of the fetal QT interval.

Journal ArticleDOI
TL;DR: The design, construction, and testing of a multi-channel fetal electrocardiogram (fECG) signal generator based on LabVIEW, which enables manifestations of hypoxic states to be monitored while complying with clinical recommendations for classifications in cardiotocography (CTG) and fECG ST segment analysis (STAN).
Abstract: This paper describes the design, construction, and testing of a multi-channel fetal electrocardiogram (fECG) signal generator based on LabVIEW. Special attention is paid to the fetal heart development in relation to the fetus' anatomy, physiology, and pathology. The non-invasive signal generator enables many parameters to be set, including fetal heart rate (FHR), maternal heart rate (MHR), gestational age (GA), fECG interferences (biological and technical artifacts), as well as other fECG signal characteristics. Furthermore, based on the change in the FHR and in the T wave-to-QRS complex ratio (T/QRS), the generator enables manifestations of hypoxic states (hypoxemia, hypoxia, and asphyxia) to be monitored while complying with clinical recommendations for classifications in cardiotocography (CTG) and fECG ST segment analysis (STAN). The generator can also produce synthetic signals with defined properties for 6 input leads (4 abdominal and 2 thoracic). Such signals are well suited to the testing of new and existing methods of fECG processing and are effective in suppressing maternal ECG while non-invasively monitoring abdominal fECG. They may also contribute to the development of a new diagnostic method, which may be referred to as non-invasive trans-abdominal CTG + STAN. The functional prototype is based on virtual instrumentation using the LabVIEW developmental environment and its associated data acquisition measurement cards (DAQmx). The generator also makes it possible to create synthetic signals and measure actual fetal and maternal ECGs by means of bioelectrodes.

Journal ArticleDOI
TL;DR: Traditional and novel approaches to HRV analysis are applied and compared in order to evaluate their relative merits in the assessment of ANS activity and the influence of sleep position; significant differences were observed as a function of sleep position.
Abstract: Autonomic nervous system (ANS) balance is a key factor in homeostatic control of cardiac activity, breathing and certain reflex reactions such as coughing, sneezing and swallowing and thus plays a crucial role for survival. ANS impairment has been related to many neonatal pathologies, including sudden infant death syndrome (SIDS). Moreover, some conditions have been identified as risk factors for SIDS, such as prone sleep position. There is an urgent need for timely and non-invasive assessment of ANS function in at-risk infants. Systematic measurement of heart rate variability (HRV) offers an optimal approach to access indirectly both sympathetic and parasympathetic influences on ANS functioning. In this paper, data from premature infants collected in a sleep physiology laboratory in the NICU are presented: traditional and novel approaches to HRV analyses are applied and compared in order to evaluate their relative merits in the assessment of ANS activity and the influence of sleep position. Indices from time domain and nonlinear approaches contributed as markers of physiological development in premature infants. Moreover, significant differences were observed as a function of sleep position.

Journal ArticleDOI
TL;DR: One limitation of non-contact PPG is discussed here, specifically looking at the topology and optical variations of the skin and how this impacts upon the ability to extract a photoplethysmogram when a subject moves horizontally across the field of view of the detector.
Abstract: Non-contact photoplethysmography (PPG) provides multiple benefits over in-contact methods, but is not as tolerant to motion due to the lack of mechanical coupling between the subject and sensor. One limitation of non-contact photoplethysmography is discussed here, specifically looking at the topology and optical variations of the skin and how this impacts upon the ability to extract a photoplethysmogram when a subject moves horizontally across the field of view of the detector (a panning motion). When this occurs it is shown that whilst the general relationships between the speed of traversal, detection area and resultant signal quality can be found, the quality of signal in each individual case is determined by the properties of the area of skin chosen.