Journal ArticleDOI
Standardized measurement error: A universal metric of data quality for averaged event-related potentials.
TL;DR: In this paper, the authors propose the standardized measurement error (SME), which is a special case of the standard error of measurement and can be applied to virtually any value that is derived from averaged ERP waveforms.
Abstract:
Event-related potentials (ERPs) can be very noisy, and yet there is no widely accepted metric of ERP data quality. Here, we propose a universal measure of data quality for ERP research, the standardized measurement error (SME), which is a special case of the standard error of measurement. Whereas some existing metrics provide a generic quantification of the noise level, the SME quantifies the data quality (precision) for the specific amplitude or latency value being measured in a given study (e.g., the peak latency of the P3 wave). It can be applied to virtually any value that is derived from averaged ERP waveforms, making it a universal measure of data quality. In addition, the SME quantifies the data quality for each individual participant, making it possible to identify participants with low-quality data and "bad" channels. When appropriately aggregated across individuals, SME values can be used to quantify the combined impact of the single-trial EEG noise and the number of trials being averaged together on the effect size and statistical power in a given experiment. If SME values were regularly included in published articles, researchers could identify the recording and analysis procedures that produce the highest data quality, which could ultimately lead to increased effect sizes and greater replicability across the field.
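The abstract's recipe can be sketched in Python. For a linear score such as the mean amplitude of the averaged waveform, the SME has a simple analytic form (the standard error of the single-trial scores); for nonlinear scores such as peak latency, a bootstrap across trials is used. This is a minimal illustrative sketch: the simulated data, measurement window, and score functions are assumptions for demonstration, not the paper's own code or datasets.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical single-trial EEG data: 60 trials x 50 time points,
# a weak sinusoidal "component" buried in Gaussian noise (illustrative only).
trials = rng.normal(0.0, 5.0, size=(60, 50)) + np.sin(np.linspace(0, np.pi, 50))

# Analytic SME for a mean-amplitude score: the mean amplitude of the averaged
# waveform is a linear function of the single-trial values, so its standard
# error is SD(single-trial scores) / sqrt(N).
window = slice(20, 30)                       # hypothetical measurement window
scores = trials[:, window].mean(axis=1)      # mean amplitude on each trial
analytic_sme = scores.std(ddof=1) / np.sqrt(len(scores))

def bootstrap_sme(trials, score_fn, n_boot=1000, rng=rng):
    """Bootstrap SME for a score that is a nonlinear function of the average:
    resample trials with replacement, re-average, re-score, take the SD."""
    n = trials.shape[0]
    boot_scores = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, n, size=n)
        boot_scores[b] = score_fn(trials[idx].mean(axis=0))
    return boot_scores.std(ddof=1)

# SME of the peak latency (in samples) of the averaged waveform.
peak_latency = lambda avg: np.argmax(avg)
boot_sme = bootstrap_sme(trials, peak_latency)
```

The same `bootstrap_sme` helper works for any score derived from the averaged waveform, which is what makes the SME "universal" in the paper's sense.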
Citations
Journal ArticleDOI
Data quality and reliability metrics for event-related potentials (ERPs): The utility of subject-level reliability.
TL;DR: In this paper, the authors review three types of measurement metrics: data quality, group-level internal consistency, and subject-level internal consistency, and demonstrate how failing to consider data quality and internal consistency can undermine statistical inferences.
Journal ArticleDOI
The Data-Processing Multiverse of Event-Related Potentials (ERPs): A Roadmap for the Optimization and Standardization of ERP Processing and Reduction Pipelines
TL;DR: In this paper, a multiverse analysis of a data processing pipeline examines the impact of a large set of different reasonable choices to determine the robustness of effects, such as the effect of different decisions on between-trial standard deviations and between-condition differences (i.e., experimental effects).
Journal ArticleDOI
Using generalizability theory and the ERP Reliability Analysis (ERA) Toolbox for assessing test-retest reliability of ERP scores part 1: Algorithms, framework, and implementation.
TL;DR: The ERP Reliability Analysis (ERA) toolbox as discussed by the authors is designed for estimating ERP score reliability using generalizability (G) theory, which is well suited for ERPs.
Posted ContentDOI
Introducing RELAX (the Reduction of Electroencephalographic Artifacts): A fully automated pre-processing pipeline for cleaning EEG data - Part 1: Algorithm and Application to Oscillations
N. Bailey,Mana Biabani,A. Hill,Aleksandra Miljevic,N. Rogasch,Brooke McQueen,O. Murphy,Pb. Fitzgerald +7 more
TL;DR: RELAX (the Reduction of Electroencephalographic Artifacts), an automated EEG cleaning pipeline implemented within EEGLAB that reduces all artifact types, is developed and recommended for data cleaning across EEG studies.
Journal ArticleDOI
Automated Pipeline for Infants Continuous EEG (APICE): A flexible pipeline for developmental cognitive studies
TL;DR: In this paper, the Automated Pipeline for Infants Continuous EEG (APICE) is proposed; it is fully automated, flexible, and modular for artifact detection and data preprocessing on continuous EEG data.
References
Journal ArticleDOI
Why Most Published Research Findings Are False
TL;DR: In this paper, the authors discuss the implications of these problems for the conduct and interpretation of research and conclude that the probability that a research claim is true may depend on study power and bias, the number of other studies on the same question, and the ratio of true to no relationships among the relationships probed in each scientific field.
Journal ArticleDOI
ERPLAB: an open-source toolbox for the analysis of event-related potentials
TL;DR: ERPLAB adds to EEGLAB’s EEG processing functions, providing additional tools for filtering, artifact detection, re-referencing, and sorting of events, among others.
Book
Bootstrap Methods: A Guide for Practitioners and Researchers
TL;DR: This chapter discusses bootstrapping in the context of clinical trial analysis, and discusses the Bootstrap Percentile method, which was used in the case of Tendril DX lead Clinical Trial Analysis.
Journal ArticleDOI
Measurement of ERP latency differences: a comparison of single-participant and jackknife-based scoring methods.
TL;DR: Computer simulations were used to evaluate different procedures for measuring changes in the onset latency of a representative range of event-related components; the jackknife-based approach provided the most accurate measurements and the greatest statistical power, with no inflation of the Type I error rate.
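The jackknife-based scoring idea can be sketched briefly: latencies are measured on leave-one-out grand averages (which are far less noisy than single-participant averages), and per-participant estimates are then recovered with the standard jackknife retrieval formula. This is an illustrative sketch under assumed data; the simulated waveforms and the fractional-peak onset criterion are demonstration choices, not the cited paper's exact simulation code.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical per-participant averaged waveforms: 12 participants x 100 samples,
# a noisy linear ramp beginning at sample 40 (illustrative only).
n_sub, n_pts, onset_true = 12, 100, 40
waves = np.zeros((n_sub, n_pts))
waves[:, onset_true:] = np.linspace(0.0, 1.0, n_pts - onset_true)
waves += rng.normal(0.0, 0.2, size=waves.shape)

def fractional_onset(wave, frac=0.5):
    """Latency (in samples) at which the wave first reaches frac of its peak."""
    above = np.nonzero(wave >= frac * wave.max())[0]
    return above[0]

# Onset measured on the full grand average...
L_all = fractional_onset(waves.mean(axis=0))

# ...and on each leave-one-out grand average.
J = np.array([fractional_onset(np.delete(waves, i, axis=0).mean(axis=0))
              for i in range(n_sub)])

# Jackknife retrieval of per-participant estimates: l_i = n*L - (n-1)*J_i.
per_subject = n_sub * L_all - (n_sub - 1) * J
```

The retrieved `per_subject` values can then be entered into ordinary statistical tests, which is what gives the jackknife approach its power advantage for noisy onset measures.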
Book
Human auditory evoked potentials
TL;DR: A chapter-by-chapter survey of the field: Introduction: Past, Present and Potential; Recording Evoked Potentials: Means to an End; Frequency-Domain: Music of the Hemispheres; Finding Sources: Forwards and Backwards; Sounds to Charm the Brain; Interpreting the Waveforms: Time and Uncertainty; Electrocochleography: From Song to Synapse; Auditory Brainstem Responses: Peaks Along the Way; Middle Latency Responses: The Brain and the Brawn.
Related Papers (5)
EEGLAB: an open source toolbox for analysis of single-trial EEG dynamics including independent component analysis.
Arnaud Delorme,Scott Makeig +1 more
How to get statistically significant effects in any ERP experiment (and why you shouldn't)
Steven J. Luck,Nicholas Gaspelin +1 more