Journal ArticleDOI

Making ERP research more transparent: Guidelines for preregistration

TL;DR: In this article, the authors present an overview of the problems associated with undisclosed analytic flexibility, discuss why and how EEG researchers would benefit from adopting preregistration, provide guidelines and examples on how to preregister data preprocessing and analysis steps in typical ERP studies, and conclude by discussing possibilities and limitations of this open science practice.
About: This article is published in the International Journal of Psychophysiology. The article was published on 2021-03-04 and is currently open access. It has received 26 citations to date. The article focuses on the topic: Hindsight bias.
Citations
Journal ArticleDOI
TL;DR: In this paper, the authors review three types of measurement metrics: data quality, group-level internal consistency, and subject-level internal consistency, and demonstrate how failing to consider data quality and internal consistency can undermine statistical inferences.

32 citations

Journal ArticleDOI
TL;DR: In this article, the authors argue that most methodological reform attempts suffer from similar mistakes and over-generalizations to the ones they aim to address, and they argue that this can be attributed in part to lack of formalism and first principles.
Abstract: Current attempts at methodological reform in sciences come in response to an overall lack of rigor in methodological and scientific practices in experimental sciences. However, most methodological reform attempts suffer from similar mistakes and over-generalizations to the ones they aim to address. We argue that this can be attributed in part to lack of formalism and first principles. Considering the costs of allowing false claims to become canonized, we argue for formal statistical rigor and scientific nuance in methodological reform. To attain this rigor and nuance, we propose a five-step formal approach for solving methodological problems. To illustrate the use and benefits of such formalism, we present a formal statistical analysis of three popular claims in the metascientific literature: (i) that reproducibility is the cornerstone of science; (ii) that data must not be used twice in any analysis; and (iii) that exploratory projects imply poor statistical practice. We show how our formal approach can inform and shape debates about such methodological claims.

25 citations

Journal ArticleDOI
TL;DR: In this article, the authors provide an integrated overview of community-developed resources that can support collaborative, open, reproducible, replicable, robust and generalizable neuroimaging throughout the entire research cycle from inception to publication.

24 citations

Journal ArticleDOI
TL;DR: In this paper, the authors provide recommendations for the use of these methods, with a focus on foundational aspects of frequency domain and time-frequency analyses, and provide publication guidelines, which aim to foster replication and scientific rigor, assist new researchers who wish to enter the field of brain oscillations, and facilitate communication among authors, reviewers, and editors.
Abstract: Since its beginnings in the early 20th century, the psychophysiological study of human brain function has included research into the spectral properties of electrical and magnetic brain signals. Now, dramatic advances in digital signal processing, biophysics, and computer science have enabled increasingly sophisticated methodology for neural time series analysis. Innovations in hardware and recording techniques have further expanded the range of tools available to researchers interested in measuring, quantifying, modeling, and altering the spectral properties of neural time series. These tools are increasingly used in the field, by a growing number of researchers who vary in their training, background, and research interests. Implementation and reporting standards also vary greatly in the published literature, causing challenges for authors, readers, reviewers, and editors alike. The present report addresses this issue by providing recommendations for the use of these methods, with a focus on foundational aspects of frequency domain and time-frequency analyses. It also provides publication guidelines, which aim to (1) foster replication and scientific rigor, (2) assist new researchers who wish to enter the field of brain oscillations, and (3) facilitate communication among authors, reviewers, and editors.

24 citations
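The time-frequency analyses these guidelines address can be illustrated with a minimal sketch. The example below is illustrative only (the function name, parameter choices, and toy signal are assumptions, not from the article): it estimates time-frequency power by convolving a signal with complex Morlet wavelets, one common approach in this literature.

```python
import numpy as np

def morlet_tfr(signal, sfreq, freqs, n_cycles=7.0):
    """Time-frequency power via convolution with complex Morlet wavelets.

    A minimal illustration of one common time-frequency method; a real
    analysis should follow the reporting recommendations (baselining,
    padding, taper/wavelet parameters) discussed in the article.
    """
    power = np.empty((len(freqs), len(signal)))
    for i, f in enumerate(freqs):
        sigma_t = n_cycles / (2.0 * np.pi * f)            # wavelet temporal SD
        t = np.arange(-3 * sigma_t, 3 * sigma_t, 1.0 / sfreq)
        wavelet = np.exp(2j * np.pi * f * t) * np.exp(-t**2 / (2 * sigma_t**2))
        wavelet /= np.sqrt(np.sum(np.abs(wavelet) ** 2))  # unit energy
        analytic = np.convolve(signal, wavelet, mode="same")
        power[i] = np.abs(analytic) ** 2
    return power

# Toy example: a 10 Hz oscillation should show peak power near 10 Hz.
sfreq = 250.0
t = np.arange(0, 2, 1 / sfreq)
sig = np.sin(2 * np.pi * 10 * t)
freqs = np.arange(4, 31, 2)
tfr = morlet_tfr(sig, sfreq, freqs)
peak_freq = freqs[np.argmax(tfr.mean(axis=1))]
```

The `n_cycles` parameter controls the usual time-frequency trade-off: more cycles give finer frequency resolution at the cost of temporal smearing, which is exactly the kind of analytic choice the guidelines recommend reporting.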

Journal ArticleDOI
TL;DR: In this paper, the authors present a critical review of iEEG research practices in a didactic framework for newcomers, as well as addressing issues encountered by proficient researchers, and suggest potential guidelines for working with the data and answer frequently asked questions based on the most widespread practices.

23 citations

References
Journal ArticleDOI
TL;DR: FieldTrip is an open source software package that is implemented as a MATLAB toolbox and includes a complete set of consistent and user-friendly high-level functions that allow experimental neuroscientists to analyze experimental data.
Abstract: This paper describes FieldTrip, an open source software package that we developed for the analysis of MEG, EEG, and other electrophysiological data. The software is implemented as a MATLAB toolbox and includes a complete set of consistent and user-friendly high-level functions that allow experimental neuroscientists to analyze experimental data. It includes algorithms for simple and advanced analysis, such as time-frequency analysis using multitapers, source reconstruction using dipoles, distributed sources and beamformers, connectivity analysis, and nonparametric statistical permutation tests at the channel and source level. The implementation as toolbox allows the user to perform elaborate and structured analyses of large data sets using the MATLAB command line and batch scripting. Furthermore, users and developers can easily extend the functionality and implement new algorithms. The modular design facilitates the reuse in other software packages.

7,963 citations
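One of the analyses the FieldTrip abstract mentions, multitaper spectral estimation, can be sketched briefly. The example below is not FieldTrip code (FieldTrip is a MATLAB toolbox); it is a hedged Python illustration of the multitaper idea, using SciPy's DPSS (Slepian) tapers, with all names and parameters chosen for the demo.

```python
import numpy as np
from scipy.signal import windows

def multitaper_psd(x, sfreq, nw=4):
    """Multitaper power spectral density: average the periodograms of a
    signal windowed by orthogonal DPSS (Slepian) tapers, which reduces
    the variance of the spectral estimate relative to a single taper.
    """
    n = len(x)
    n_tapers = int(2 * nw - 1)                   # common taper-count rule
    tapers = windows.dpss(n, nw, Kmax=n_tapers)  # shape (n_tapers, n)
    spectra = np.abs(np.fft.rfft(tapers * x, axis=1)) ** 2
    psd = spectra.mean(axis=0) / sfreq
    freqs = np.fft.rfftfreq(n, d=1.0 / sfreq)
    return freqs, psd

# A 12 Hz sinusoid in noise should produce a spectral peak near 12 Hz.
rng = np.random.default_rng(0)
sfreq = 200.0
t = np.arange(0, 5, 1 / sfreq)
x = np.sin(2 * np.pi * 12 * t) + 0.5 * rng.standard_normal(t.size)
freqs, psd = multitaper_psd(x, sfreq)
peak = freqs[np.argmax(psd)]
```

The time-bandwidth product `nw` trades variance reduction against spectral smoothing (roughly ±nw/T Hz), another analytic choice worth preregistering.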

Journal ArticleDOI
TL;DR: Quantitative procedures for computing the tolerance for filed and future null results are reported and illustrated, and the implications are discussed.
Abstract: For any given research area, one cannot tell how many studies have been conducted but never reported. The extreme view of the "file drawer problem" is that journals are filled with the 5% of the studies that show Type I errors, while the file drawers are filled with the 95% of the studies that show nonsignificant results. Quantitative procedures for computing the tolerance for filed and future null results are reported and illustrated, and the implications are discussed.

7,159 citations
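The quantitative procedure this abstract refers to is often summarized as Rosenthal's "fail-safe N": the number of unpublished null-result studies that would have to sit in file drawers to pull the combined (Stouffer) z below the one-tailed .05 criterion. A minimal sketch (function name and example values are illustrative):

```python
import math

def fail_safe_n(z_scores, alpha_z=1.645):
    """Rosenthal's fail-safe N.

    With k studies and combined Stouffer z = sum(z) / sqrt(k), adding X
    null (z = 0) studies gives sum(z) / sqrt(k + X). Solving for the X
    that drops this to alpha_z yields X = (sum z)^2 / alpha_z^2 - k,
    i.e. (sum z)^2 / 2.706 - k at the one-tailed .05 level.
    """
    s = sum(z_scores)
    k = len(z_scores)
    return max(0, math.floor(s**2 / alpha_z**2 - k))

# Five studies each with z = 2.0 tolerate this many hidden null results
# before the combined evidence loses one-tailed significance.
tolerance = fail_safe_n([2.0] * 5)
```

A large fail-safe N relative to k suggests the result is robust to plausible amounts of unpublished null findings; a small one suggests the file drawer could plausibly explain it.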

Journal ArticleDOI
Ziva Kunda
TL;DR: It is proposed that motivation may affect reasoning through reliance on a biased set of cognitive processes--that is, strategies for accessing, constructing, and evaluating beliefs--that are considered most likely to yield the desired conclusion.
Abstract: It is proposed that motivation may affect reasoning through reliance on a biased set of cognitive processes—that is, strategies for accessing, constructing, and evaluating beliefs. The motivation to be accurate enhances use of those beliefs and strategies that are considered most appropriate, whereas the motivation to arrive at particular conclusions enhances use of those that are considered most likely to yield the desired conclusion. There is considerable evidence that people are more likely to arrive at conclusions that they want to arrive at, but their ability to do so is constrained by their ability to construct seemingly reasonable justifications for these conclusions. These ideas can account for a wide variety of research concerned with motivated reasoning. The notion that goals or motives affect reasoning has a long and controversial history in social psychology. The propositions that motives may affect perceptions (Erdelyi, 1974), attitudes (Festinger, 1957), and attributions (Heider, 1958) have been put forth by some psychologists and challenged by others. Although early researchers and theorists took it for granted that motivation may cause people to make self-serving attributions and permit them to believe what they want to believe because they want to believe it, this view, and the research used to uphold it, came under concentrated criticism in the 1970s. The major and most damaging criticism of the motivational view was that all research purported to demonstrate motivated reasoning could be reinterpreted in entirely cognitive, nonmotivational terms (Miller & Ross, 1975; Nisbett & Ross, 1980). Thus people could draw self-serving conclusions not because they wanted to but because these conclusions seemed more plausible, given their prior beliefs and expectancies. 
Because both cognitive and motivational accounts could be generated for any empirical study, some theorists argued that the hot versus cold cognition controversy could not be solved, at least in the attribution paradigm (Ross & Fletcher, 1985; Tetlock & Levi, 1982). One reason for the persistence of this controversy lies in the failure of researchers to explore the mechanisms underlying motivated reasoning. Recently, several authors have attempted to rectify this neglect (Kruglanski & Freund, 1983; Kunda, 1987; Pyszczynski & Greenberg, 1987; Sorrentino & Higgins, 1986). All these authors share a view of motivation as having its effects through cognitive processes: People rely on cognitive processes and representations to arrive at their desired conclusions, but motivation plays a role in determining which of these will be used on a given occasion.

6,643 citations

Journal ArticleDOI
TL;DR: This paper formalizes a null hypothesis and shows that the nonparametric permutation test controls the false alarm rate under that null hypothesis, enabling neuroscientists to construct their own statistical tests that maximize sensitivity to the expected effect.

6,502 citations
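The logic of the nonparametric test described in this TL;DR can be sketched in its simplest form: under the null hypothesis that condition labels are exchangeable, the sign of each subject's condition difference is arbitrary, so the observed statistic can be compared against a sign-flip null distribution. This is a minimal illustration (names and data are invented); the cited paper extends the same logic with cluster statistics to handle multichannel time series.

```python
import numpy as np

def sign_flip_permutation_test(diffs, n_perm=5000, seed=0):
    """One-sample (paired-difference) permutation test via random sign
    flips. Exchangeability under the null means each difference's sign
    is arbitrary, so the observed |mean| is referred to the sign-flip
    null distribution. Returns a Monte Carlo p-value.
    """
    rng = np.random.default_rng(seed)
    diffs = np.asarray(diffs, dtype=float)
    observed = abs(diffs.mean())
    signs = rng.choice([-1.0, 1.0], size=(n_perm, diffs.size))
    null = np.abs((signs * diffs).mean(axis=1))
    # +1 correction keeps the estimated p strictly positive
    return (1 + np.sum(null >= observed)) / (n_perm + 1)

# A consistent positive effect across 8 subjects yields a small p-value.
p_effect = sign_flip_permutation_test(
    np.array([0.8, 1.1, 0.9, 1.3, 0.7, 1.0, 0.9, 1.2])
)
```

Because the null distribution is built from the data themselves, the false alarm rate is controlled without distributional assumptions, which is the property the TL;DR highlights.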

Journal ArticleDOI
TL;DR: Using maximum entropy approximations of differential entropy, the authors introduce a family of new contrast (objective) functions for ICA that enables both estimation of the whole decomposition by minimizing mutual information and estimation of individual independent components as projection pursuit directions.
Abstract: Independent component analysis (ICA) is a statistical method for transforming an observed multidimensional random vector into components that are statistically as independent from each other as possible. We use a combination of two different approaches for linear ICA: Comon's information theoretic approach and the projection pursuit approach. Using maximum entropy approximations of differential entropy, we introduce a family of new contrast functions for ICA. These contrast functions enable both the estimation of the whole decomposition by minimizing mutual information, and estimation of individual independent components as projection pursuit directions. The statistical properties of the estimators based on such contrast functions are analyzed under the assumption of the linear mixture model, and it is shown how to choose contrast functions that are robust and/or of minimum variance. Finally, we introduce simple fixed-point algorithms for practical optimization of the contrast functions.

6,144 citations
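The fixed-point algorithm this abstract introduces (FastICA) can be sketched in its one-unit form with the tanh nonlinearity, one of the robust contrast functions derived from the maximum entropy approximations. This is a hedged toy implementation, not the authors' code; production work should use a vetted implementation such as scikit-learn's FastICA, and all names and toy data below are assumptions.

```python
import numpy as np

def fastica_one_unit(X, n_iter=200, seed=0):
    """One-unit FastICA fixed-point iteration with the tanh nonlinearity.

    X must be whitened (zero mean, identity covariance), with shape
    (n_channels, n_samples). Each update w <- E[X g(w.X)] - E[g'(w.X)] w
    followed by renormalization converges to one independent component.
    """
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(X.shape[0])
    w /= np.linalg.norm(w)
    for _ in range(n_iter):
        wx = w @ X
        g, g_prime = np.tanh(wx), 1.0 - np.tanh(wx) ** 2
        w_new = (X * g).mean(axis=1) - g_prime.mean() * w  # fixed-point update
        w_new /= np.linalg.norm(w_new)
        converged = abs(abs(w_new @ w) - 1.0) < 1e-8       # up to sign flip
        w = w_new
        if converged:
            break
    return w

# Toy demo: recover one source from a whitened mixture of two
# independent non-Gaussian (uniform) sources.
rng = np.random.default_rng(1)
S = rng.uniform(-1, 1, size=(2, 20000))
A = np.array([[1.0, 0.5], [0.3, 1.0]])
X = A @ S
X -= X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(X))
Xw = (E / np.sqrt(d)).T @ X        # whiten: diag(d^-1/2) E^T X
w_est = fastica_one_unit(Xw)
y = w_est @ Xw                     # recovered component (up to sign/scale)
```

Estimating components one at a time like this is the projection pursuit view the abstract mentions; estimating all of them jointly corresponds to minimizing mutual information.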