Peter E. Clayson

Researcher at University of South Florida

Publications: 58
Citations: 2404

Peter E. Clayson is an academic researcher at the University of South Florida. He has contributed to research on topics including Cognition and the Eriksen flanker task. He has an h-index of 24 and has co-authored 50 publications receiving 1,866 citations. His previous affiliations include the University of California, Los Angeles and Brigham Young University.

Papers
Journal ArticleDOI

Making sense of all the conflict: a theoretical review and critique of conflict-related ERPs.

TL;DR: There is considerable evidence that the amplitude of the ERN is sensitive to the degree of response conflict, consistent with a role in conflict monitoring. It remains unclear, however, to what degree contextual, individual, affective, and motivational factors influence ERN amplitude and how ERN amplitude is related to regulative changes in behavior.
Journal ArticleDOI

How does noise affect amplitude and latency measurement of event‐related potentials (ERPs)? A methodological critique and simulation study

TL;DR: Results indicated that mean amplitude was the most robust measure against increases in background noise; the adaptive mean measure was more biased but represented an efficient estimator of the true ERP signal, particularly under individual-subject latency variability.
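
As a rough illustration of the two scoring approaches contrasted in this abstract (not the paper's simulation code), the sketch below scores a single toy epoch with a fixed-window mean amplitude and an adaptive mean centered on the detected peak. The measurement window, sampling rate, peak polarity, and half-width are illustrative assumptions.

```python
import numpy as np

def mean_amplitude(epoch, times, window):
    """Average voltage over a fixed measurement window (e.g., 300-500 ms)."""
    mask = (times >= window[0]) & (times <= window[1])
    return epoch[mask].mean()

def adaptive_mean(epoch, times, search_window, half_width=0.025):
    """Average voltage in a short window centered on the local (positive) peak
    found inside a broader search window: an 'adaptive mean' score."""
    mask = (times >= search_window[0]) & (times <= search_window[1])
    peak_idx = np.where(mask)[0][np.argmax(epoch[mask])]
    peak_time = times[peak_idx]
    return mean_amplitude(epoch, times, (peak_time - half_width, peak_time + half_width))

# Toy single-trial epoch: a positive deflection around 400 ms plus background noise
rng = np.random.default_rng(0)
times = np.arange(-0.2, 0.8, 0.002)                      # seconds, 500 Hz
signal = 5.0 * np.exp(-((times - 0.4) ** 2) / (2 * 0.05 ** 2))
epoch = signal + rng.normal(0, 2.0, times.size)          # microvolts

print(mean_amplitude(epoch, times, (0.3, 0.5)))
print(adaptive_mean(epoch, times, (0.3, 0.5)))
```

Because the adaptive mean recenters its window on whatever local maximum the noise produces, it tends to overestimate amplitude as noise grows, which is consistent with the bias the abstract describes.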
Journal ArticleDOI

Conflict adaptation and sequential trial effects: support for the conflict monitoring theory.

TL;DR: Results indicate that RTs and ERP measures are sensitive to modulations of cognitive control associated with conflict across multiple congruent and incongruent trials.
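
For readers unfamiliar with how conflict adaptation is typically quantified from behavior, here is a minimal sketch (an illustration, not this paper's analysis pipeline) that computes the congruency sequence effect from trial-level reaction times, assuming a simple list of per-trial congruency labels and RTs.

```python
import numpy as np

def conflict_adaptation(congruency, rt):
    """Congruency sequence effect: the interference effect (I - C) following
    congruent trials minus the interference effect following incongruent trials.
    A positive value indicates conflict adaptation (reduced interference after
    conflict)."""
    congruency = np.asarray(congruency)   # 'C' or 'I' per trial
    rt = np.asarray(rt, dtype=float)      # reaction time per trial (ms)
    prev, curr, curr_rt = congruency[:-1], congruency[1:], rt[1:]

    def cell_mean(p, c):
        return curr_rt[(prev == p) & (curr == c)].mean()

    after_congruent = cell_mean('C', 'I') - cell_mean('C', 'C')    # cI - cC
    after_incongruent = cell_mean('I', 'I') - cell_mean('I', 'C')  # iI - iC
    return after_congruent - after_incongruent

# Toy sequence of flanker trials
cong = ['C', 'I', 'I', 'C', 'I', 'C', 'C', 'I', 'I', 'C']
rts  = [420, 510, 470, 415, 500, 430, 425, 505, 465, 410]
print(conflict_adaptation(cong, rts))
```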
Journal ArticleDOI

ERP Reliability Analysis (ERA) Toolbox: An open-source toolbox for analyzing the reliability of event-related brain potentials.

TL;DR: A detailed description of the conceptual framework of G theory is provided using examples relevant to ERP researchers, the algorithms needed to estimate ERP score reliability are presented, and a detailed walkthrough is given of newly developed software, the ERP Reliability Analysis (ERA) Toolbox, which calculates score reliability using G theory.
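
To illustrate the kind of estimate G theory yields, the sketch below implements a classical ANOVA-based single-facet (persons x trials) G study and returns a dependability coefficient. The function name, toy data, and estimation approach are assumptions for exposition only; the ERA Toolbox uses its own estimation routines.

```python
import numpy as np

def dependability(scores):
    """Single-facet G-study sketch: scores is a persons x trials matrix of
    single-trial ERP amplitudes. Estimates variance components from mean
    squares and returns the dependability (absolute) coefficient
    var_p / (var_p + (var_t + var_res) / k) for the k-trial average."""
    scores = np.asarray(scores, dtype=float)
    n, k = scores.shape
    grand = scores.mean()
    person_means = scores.mean(axis=1)
    trial_means = scores.mean(axis=0)

    ms_person = k * np.sum((person_means - grand) ** 2) / (n - 1)
    ms_trial = n * np.sum((trial_means - grand) ** 2) / (k - 1)
    resid = scores - person_means[:, None] - trial_means[None, :] + grand
    ms_resid = np.sum(resid ** 2) / ((n - 1) * (k - 1))

    var_res = ms_resid
    var_p = max((ms_person - ms_resid) / k, 0.0)   # person variance component
    var_t = max((ms_trial - ms_resid) / n, 0.0)    # trial variance component
    return var_p / (var_p + (var_t + var_res) / k)

# Toy data: 20 participants x 30 trials with a stable person effect plus noise
rng = np.random.default_rng(1)
true_scores = rng.normal(5, 2, size=(20, 1))
data = true_scores + rng.normal(0, 4, size=(20, 30))
print(dependability(data))
```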
Journal ArticleDOI

Psychometric considerations in the measurement of event-related brain potentials: Guidelines for measurement and reporting.

TL;DR: The present review considers the contextual factors that influence ERP score reliability and the downstream effects that reliability has on statistical analyses. It advocates the use of generalizability theory for estimating score dependability as an improvement on classical test theory reliability estimates, which are less well suited to ERP research.