Open Access Journal Article (DOI)

Conceptual, methodological, and measurement factors that disqualify use of measurement invariance techniques to detect informant discrepancies in youth mental health assessments

TL;DR
In this article, the authors provide an overview of conceptual, methodological, and measurement factors that should prevent researchers from applying measurement invariance techniques to detect informant discrepancies in youth mental health assessments.
Abstract
On page 1 of his classic text, Millsap (2011) states, “Measurement invariance is built on the notion that a measuring device should function the same way across varied conditions, so long as those varied conditions are irrelevant [emphasis added] to the attribute being measured.” By construction, measurement invariance techniques require not only detecting varied conditions but also ruling out that these conditions inform our understanding of measured domains (i.e., conditions that do not contain domain-relevant information). In fact, measurement invariance techniques possess great utility when theory and research inform their application to specific, varied conditions (e.g., cultural, ethnic, or racial background of test respondents) that, if not detected, introduce measurement biases, and, thus, depress measurement validity (e.g., academic achievement and intelligence). Yet, we see emerging bodies of work where scholars have “put the cart before the horse” when it comes to measurement invariance, and they apply these techniques to varied conditions that, in fact, may reflect domain-relevant information. These bodies of work highlight a larger problem in measurement that likely cuts across many areas of scholarship. In one such area, youth mental health, researchers commonly encounter a set of conditions that nullify the use of measurement invariance, namely discrepancies between survey reports completed by multiple informants, such as parents, teachers, and youth themselves (i.e., informant discrepancies). In this paper, we provide an overview of conceptual, methodological, and measurement factors that should prevent researchers from applying measurement invariance techniques to detect informant discrepancies. Along the way, we cite evidence from the last 15 years indicating that informant discrepancies reflect domain-relevant information. We also apply this evidence to recent uses of measurement invariance techniques in youth mental health. Based on prior evidence, we highlight the implications of applying these techniques to multi-informant data, when the informant discrepancies observed within these data might reflect domain-relevant information. We close by calling for a moratorium on applying measurement invariance techniques to detect informant discrepancies in youth mental health assessments. In doing so, we describe how the state of the science would need to fundamentally “flip” to justify applying these techniques to detect informant discrepancies in this area of work.
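To make the core argument concrete, the following is a minimal, hypothetical simulation (not drawn from the paper; all variable names and parameter values are illustrative) in which a parent and a teacher rate the same youths but observe them in different contexts. Under these assumptions, the parent-teacher discrepancy tracks context-specific behavior, i.e., it carries domain-relevant information rather than mere measurement error, which is precisely the kind of varied condition the authors argue disqualifies measurement invariance testing.

```python
# Minimal, hypothetical simulation: a parent and a teacher rate the same youths,
# but each observes behavior in a different context. The parent-teacher
# discrepancy then carries real situational (domain-relevant) variance, not just
# measurement error. All parameter values below are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n = 1000

trait = rng.normal(size=n)        # behavior that manifests across contexts
home_only = rng.normal(size=n)    # behavior specific to the home context
school_only = rng.normal(size=n)  # behavior specific to the school context

parent = trait + home_only + rng.normal(scale=0.5, size=n)     # observes at home
teacher = trait + school_only + rng.normal(scale=0.5, size=n)  # observes at school

discrepancy = parent - teacher

print(f"parent-teacher r: {np.corrcoef(parent, teacher)[0, 1]:.2f}")
print(f"discrepancy vs. home-specific behavior r: "
      f"{np.corrcoef(discrepancy, home_only)[0, 1]:.2f}")
```

In this toy setup the two reports correlate only modestly, while the discrepancy correlates substantially with home-specific behavior, so treating the parent-teacher difference as nuisance variance to be "ruled out" would discard domain-relevant information.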


Citations
Journal Article (DOI)

The Operations Triad Model and Youth Mental Health Assessments: Catalyzing a Paradigm Shift in Measurement Validation

TL;DR: In this paper, the authors propose a paradigm (Classifying Observations Necessitates Theory, Epistemology, and Testing) that addresses problems with using the Multi-Trait Multi-Method Matrix (MTMM) in youth mental health research.
Journal Article (DOI)

Introduction to the Special Issue. A Dozen Years of Demonstrating That Informant Discrepancies are More Than Measurement Error: Toward Guidelines for Integrating Data from Multi-Informant Assessments of Youth Mental Health

TL;DR: This review of the last 12 years of research and theory on informant discrepancies highlights limitations inherent to the most commonly used strategies for integrating multi-informant data in youth mental health.
Journal Article (DOI)

Integrating multi-informant reports of youth mental health: A construct validation test of Kraemer and colleagues’ (2003) Satellite Model

TL;DR: In this article, the authors present the first construct validation test of the Satellite Model, which leverages principal components analysis (PCA) and strategic selection of informants to instantiate situational specificity in measurement, namely components reflecting variance attributable to the context in which informants observe behavior, the perspective from which they observe behavior (e.g., self/other), and behavior that manifests across contexts and perspectives (i.e., trait).
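For readers unfamiliar with this kind of decomposition, the sketch below is a rough, hypothetical illustration of applying PCA to simulated standardized reports from three informants; it is not the authors' implementation of the Satellite Model or their data, and all variable names and parameter values are assumed for illustration.

```python
# Minimal, hypothetical sketch of a PCA-based decomposition of standardized
# reports from three informants (parent, teacher, youth self-report). This is a
# toy example under simple assumptions, not the Satellite Model authors' code
# or data.
import numpy as np

rng = np.random.default_rng(1)
n = 500

trait = rng.normal(size=n)        # shared across contexts and perspectives
context = rng.normal(size=n)      # home vs. school (situational specificity)
perspective = rng.normal(size=n)  # self vs. other viewpoint

parent = trait + context + rng.normal(scale=0.5, size=n)     # other, home
teacher = trait - context + rng.normal(scale=0.5, size=n)    # other, school
youth = trait + perspective + rng.normal(scale=0.5, size=n)  # self

X = np.column_stack([parent, teacher, youth])
X = (X - X.mean(axis=0)) / X.std(axis=0)  # standardize each report

# PCA via the eigendecomposition of the correlation matrix.
eigvals, eigvecs = np.linalg.eigh(np.corrcoef(X, rowvar=False))
order = np.argsort(eigvals)[::-1]
loadings = eigvecs[:, order]

# With this toy structure: the first component loads on all three reports with
# the same sign (trait-like), the second contrasts parent vs. teacher (context),
# and the third contrasts the youth against the other informants (perspective).
print(np.round(loadings, 2))
```

Under these assumptions the printed loadings separate shared (trait-like), context-driven, and perspective-driven variance, mirroring the three components described above.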
Journal Article (DOI)

Editorial Statement About JCCAP’s 2023 Special Issue on Informant Discrepancies in Youth Mental Health Assessments: Observations, Guidelines, and Future Directions Grounded in 60 Years of Research

TL;DR: De Los Reyes et al. focus on the most common outcome of these approaches, namely the significant discrepancies that arise when comparing estimates from any two informants' reports (i.e., informant discrepancies).
Journal Article (DOI)

Evidence-Based Assessment in Special Education Research: Advancing the Use of Evidence in Assessment Tools and Empirical Processes

TL;DR: In this paper, an empirically grounded framework, the Operations Triad Model (OTM), is proposed to support evidence-based assessment in the articulation of relevant educational theory.
References
Book

Statistical Power Analysis for the Behavioral Sciences

TL;DR: This book presents the concepts of statistical power analysis, with applications including chi-square tests for goodness of fit and contingency tables, the t test for means, and the sign test.
Journal Article (DOI)

Estimating the reproducibility of psychological science

Alexander A. Aarts, +290 more · 28 Aug 2015
TL;DR: A large-scale assessment suggests that experimental reproducibility in psychology leaves a lot to be desired, and correlational tests suggest that replication success was better predicted by the strength of original evidence than by characteristics of the original and replication teams.
Journal Article (DOI)

Child/adolescent behavioral and emotional problems: implications of cross-informant correlations for situational specificity.

TL;DR: A study of the consistency among different informant sources (269 samples used in 119 studies) concerning ratings of the emotional and behavioral problems of children and adolescents aged 1½ to 19 years.
Journal Article (DOI)

Measurement Invariance, Factor Analysis and Factorial Invariance.

TL;DR: In this article, structural bias, weak measurement invariance, strong factorial invariance (SFI), and factorial robustness are defined and discussed in the context of employment/admissions testing and salary equity.