Open Access Journal Article

Noise and complexity in human postural control: Interpreting the different estimations of entropy

TL;DR: Examination of how noise, sampling frequency and time series length influence various measures of entropy, applied to human center of pressure (CoP) data as well as to synthetic signals with known properties, suggests that long-range correlations should be removed from CoP data prior to calculating entropy.
Abstract
BACKGROUND: Over the last two decades, various measures of entropy have been used to examine the complexity of human postural control. In general, entropy measures provide information regarding the health, stability and adaptability of the postural system that is not captured by more traditional analytical techniques. The purpose of this study was to examine how noise, sampling frequency and time series length influence various measures of entropy when applied to human center of pressure (CoP) data, as well as to synthetic signals with known properties. Such a comparison is necessary for interpreting data between and within studies that use different entropy measures, equipment, sampling frequencies or data collection durations.

METHODS AND FINDINGS: The complexity of synthetic signals with known properties and of standing CoP data was calculated using Approximate Entropy (ApEn), Sample Entropy (SampEn) and Recurrence Quantification Analysis Entropy (RQAEn). All signals were examined at varying sampling frequencies and with varying amounts of added noise. Additionally, an increment time series of the original CoP data was examined to remove long-range correlations. Of the three measures examined, ApEn was the least robust to sampling frequency and noise manipulations. Increased noise led to an increase in SampEn but a decrease in RQAEn; noise can therefore yield inconsistent results across the various entropy measures. Finally, the differences between the entropy measures were minimized in the increment CoP data, suggesting that long-range correlations should be removed from CoP data prior to calculating entropy.

CONCLUSIONS: The various algorithms typically used to quantify the complexity (entropy) of CoP may yield very different results, particularly when sampling frequency and noise differ. The results of this study are discussed within the context of the neural noise and loss of complexity hypotheses.
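As a concrete illustration of the methods described in the abstract, the sketch below computes Sample Entropy and the increment (first-difference) series in Python. This is a minimal illustrative implementation, not the authors' code; the parameters m = 2 and r = 0.2 × SD are common defaults assumed here, and `cop` is a synthetic stand-in for a recorded CoP trace.

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """SampEn = -ln(A/B): B counts pairs of length-m templates matching
    within tolerance r*std(x) (Chebyshev distance), A counts matching
    length-(m+1) templates. Self-matches are excluded, unlike ApEn."""
    x = np.asarray(x, dtype=float)
    tol = r * np.std(x)
    n = len(x)
    # Use n - m templates of each length so the two counts are comparable.
    tm = np.array([x[i:i + m] for i in range(n - m)])
    tm1 = np.array([x[i:i + m + 1] for i in range(n - m)])

    def matches(t):
        c = 0
        for i in range(len(t) - 1):
            d = np.max(np.abs(t[i + 1:] - t[i]), axis=1)  # Chebyshev distance
            c += np.sum(d <= tol)
        return c

    return -np.log(matches(tm1) / matches(tm))

# Increment series (first difference), the standard way to remove
# long-range correlations; `cop` is a synthetic placeholder signal.
rng = np.random.default_rng(0)
cop = np.cumsum(rng.standard_normal(1000))  # stand-in for a CoP trace
increments = np.diff(cop)
```

On such a stand-in, SampEn of the strongly correlated `cop` series comes out lower than SampEn of its increments, mirroring the study's point that long-range correlations dominate the entropy estimate unless they are removed first.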



Citations
Journal Article

The Appropriate Use of Approximate Entropy and Sample Entropy with Short Data Sets

TL;DR: The results demonstrate that both ApEn and SampEn are extremely sensitive to parameter choices, especially for very short data sets (N ≤ 200), and that parameters for experimental studies using either algorithm should therefore be chosen with extreme caution.
Journal Article

Entropy measures, entropy estimators, and their performance in quantifying complex dynamics: Effects of artifacts, nonstationarity, and long-range correlations.

TL;DR: It is found that entropy measures can only differentiate changes of specific types in cardiac dynamics and that appropriate preprocessing is vital for correct estimation and interpretation.
Journal Article

Task-Dependent Postural Control Throughout the Lifespan

TL;DR: Routine activities performed while standing and walking require the ability to appropriately and continuously modulate postural movements as a function of a concurrent task.
Journal Article

Recurrence Quantification Analysis of Human Postural Fluctuations in Older Fallers and Non-fallers

TL;DR: The results showed that RQA outputs quantifying the predictability of COP fluctuations, and the Shannon entropy of the recurrence-plot diagonal line length distribution, were significantly higher in fallers, but only in the ML direction.
Journal Article

The complexity of daily life walking in older adult community-dwelling fallers and non-fallers

TL;DR: Higher complexity was found in the vertical and mediolateral directions in the non-fallers for both entropy metrics, suggesting that RCME and RMPE can be used to improve the assessment of fall risk in older people.
References
Journal Article

Physiological time-series analysis using approximate entropy and sample entropy

TL;DR: A new and related complexity measure, sample entropy (SampEn), is developed; ApEn and SampEn are compared by using them to analyze sets of random numbers with known probabilistic character, and SampEn agreed with theory much more closely than ApEn over a broad range of conditions.
Journal Article

Approximate entropy as a measure of system complexity.

TL;DR: Analysis of a recently developed family of formulas and statistics, approximate entropy (ApEn), suggests that ApEn can classify complex systems, given at least 1000 data values in diverse settings that include both deterministic chaotic and stochastic processes.
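The self-match bias that distinguishes ApEn from SampEn can be seen in a minimal sketch (illustrative only, not Pincus's original implementation; m = 2 and r = 0.2 × SD are assumed defaults):

```python
import numpy as np

def approximate_entropy(x, m=2, r=0.2):
    """ApEn = Phi(m) - Phi(m+1). Each template is compared against all
    templates *including itself*, so no count is ever zero; this
    self-match biases ApEn toward regularity, especially on short series."""
    x = np.asarray(x, dtype=float)
    tol = r * np.std(x)
    n = len(x)

    def phi(mm):
        t = np.array([x[i:i + mm] for i in range(n - mm + 1)])
        # C_i: fraction of templates within tol of template i (self included).
        c = np.array([np.mean(np.max(np.abs(t - row), axis=1) <= tol)
                      for row in t])
        return np.mean(np.log(c))

    return phi(m) - phi(m + 1)
```

Because of the self-match term, ApEn of a short random series underestimates its irregularity, which is one reason the ≥ 1000-sample guideline above is given.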
Journal Article

Quantification of scaling exponents and crossover phenomena in nonstationary heartbeat time series

TL;DR: A new method--detrended fluctuation analysis (DFA)--for quantifying this correlation property in non-stationary physiological time series is described and application of this technique shows evidence for a crossover phenomenon associated with a change in short and long-range scaling exponents.
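The DFA procedure summarized above can be sketched in a few lines (an illustrative version of the method, not the authors' code; the window sizes are an arbitrary choice here):

```python
import numpy as np

def dfa_alpha(x, scales=(4, 8, 16, 32, 64)):
    """Detrended fluctuation analysis: integrate the series, split the
    profile into windows of each scale, remove a linear trend from each
    window, and fit log F(n) vs log n. The slope alpha is the scaling
    exponent (~0.5 white noise, ~1.0 1/f noise, ~1.5 Brownian motion)."""
    y = np.cumsum(x - np.mean(x))              # integrated profile
    fluct = []
    for n in scales:
        nseg = len(y) // n
        rms = []
        for s in range(nseg):
            seg = y[s * n:(s + 1) * n]
            t = np.arange(n)
            coef = np.polyfit(t, seg, 1)       # local linear trend
            rms.append(np.sqrt(np.mean((seg - np.polyval(coef, t)) ** 2)))
        fluct.append(np.mean(rms))
    alpha, _ = np.polyfit(np.log(scales), np.log(fluct), 1)
    return alpha
```

A crossover of the kind the paper describes would appear as different slopes when the fit is restricted to small versus large scales.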
Journal Article

Determining embedding dimension for phase-space reconstruction using a geometrical construction

TL;DR: The issue of determining an acceptable minimum embedding dimension is examined by looking at the behavior of near neighbors under changes in the embedding dimension from d to d + 1, and by examining the manner in which noise changes the determination of the embedding dimension d_E.
Book

Dynamic Patterns: The Self-Organization of Brain and Behavior

TL;DR: In this article, the authors present a theory of self-organization of behaviour in the human brain, focusing on the brain's ability to learn and adapt to the external world.