
Marley W. Watkins

Researcher at Baylor University

Publications -  174
Citations -  7920

Marley W. Watkins is an academic researcher from Baylor University. The author has contributed to research in topics including the Wechsler Intelligence Scale for Children & the Wechsler Adult Intelligence Scale. The author has an h-index of 46 and has co-authored 173 publications receiving 6935 citations. Previous affiliations of Marley W. Watkins include Pennsylvania State University & Arizona State University.

Papers
Journal ArticleDOI

ADHD and Achievement Meta-Analysis of the Child, Adolescent, and Adult Literatures and a Concomitant Study With College Students

TL;DR: A meta-analysis of the literature published since 1990 determined the magnitude of achievement problems associated with attention-deficit/hyperactivity disorder (ADHD) and the impact of moderator variables; in the concomitant study with college students, student ratings were as predictive as parent ratings.
Journal ArticleDOI

Exploratory Factor Analysis: A Guide to Best Practice

TL;DR: Exploratory factor analysis (EFA) is a multivariate statistical method that has become a fundamental tool in the development and validation of psychological theories and measurements; this article provides a guide to best practices in its application.
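As a minimal illustrative sketch (not the article's own procedure), the core idea of extracting a factor from a correlation matrix can be shown with power iteration: the first factor's loadings are proportional to the dominant eigenvector of the matrix. The correlation values below are hypothetical.

```python
# One-factor sketch: loadings from a correlation matrix via power iteration.
# Hypothetical data; illustrative only, not the article's recommended method.

import math

def first_factor_loadings(R, iters=200):
    """Loadings of the first factor: sqrt(lambda_1) * v_1 of matrix R."""
    n = len(R)
    v = [1.0] * n
    for _ in range(iters):
        # Multiply R by the current vector and renormalize.
        w = [sum(R[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    # Rayleigh quotient gives the dominant eigenvalue.
    lam = sum(v[i] * sum(R[i][j] * v[j] for j in range(n)) for i in range(n))
    return [math.sqrt(lam) * x for x in v]

# Correlations among three hypothetical test scores sharing one factor.
R = [[1.00, 0.60, 0.48],
     [0.60, 1.00, 0.45],
     [0.48, 0.45, 1.00]]
print([round(x, 2) for x in first_factor_loadings(R)])
```

Each returned loading is the correlation between a variable and the extracted factor, so all values fall between 0 and 1 here; real EFA practice adds decisions about communalities, the number of factors, and rotation, which this sketch omits.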
Journal ArticleDOI

Parent and teacher ratings of attention-deficit/hyperactivity disorder symptoms: Factor structure and normative data.

TL;DR: The fit of a correlated, 2-factor structure of ADHD was examined and confirmed for both parent and teacher ratings, and the structure was invariant across child gender, age, informant, informant gender, and language.
Journal ArticleDOI

Long-Term Stability of the Wechsler Intelligence Scale for Children - Fourth Edition

TL;DR: Long-term stability of the Wechsler Intelligence Scale for Children-Fourth Edition (WISC-IV) was investigated with a sample of 344 students from 2 school districts twice evaluated for special education eligibility at an average interval of 2.84 years.
Journal ArticleDOI

Interobserver Agreement in Behavioral Research: Importance and Calculation

TL;DR: Despite repeated admonitions and empirical evidence, percentage agreement remains widely used to assess interobserver agreement even though it fails to take chance agreement into account; Cohen's kappa has been proposed as the more psychometrically sound statistic for this purpose.
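The contrast the summary draws, raw percentage agreement versus chance-corrected kappa, can be sketched in a few lines. The observation labels and data below are hypothetical.

```python
# Percentage agreement vs. Cohen's kappa for two observers coding the same
# trials. Hypothetical data; a minimal sketch of the two statistics.

from collections import Counter

def percent_agreement(a, b):
    """Proportion of trials on which the two observers agree."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a, b):
    """Chance-corrected agreement: (p_o - p_e) / (1 - p_e)."""
    n = len(a)
    p_o = percent_agreement(a, b)
    ca, cb = Counter(a), Counter(b)
    # Expected chance agreement from each observer's marginal rates.
    p_e = sum(ca[k] * cb[k] for k in set(a) | set(b)) / (n * n)
    return (p_o - p_e) / (1 - p_e)

obs1 = ["on", "on", "off", "on", "off", "on", "on", "off"]
obs2 = ["on", "off", "off", "on", "off", "on", "on", "on"]
print(percent_agreement(obs1, obs2))  # 0.75
print(cohens_kappa(obs1, obs2))       # ≈ 0.467
```

Here the observers agree on 75% of trials, but once the agreement expected by chance (about 53% given these marginal rates) is removed, kappa drops to roughly 0.47, which is the correction the summary describes.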