Journal ArticleDOI

Do programs designed to train working memory, other executive functions, and attention benefit children with ADHD? A meta-analytic review of cognitive, academic, and behavioral outcomes.

01 Dec 2013 - Clinical Psychology Review (Clin Psychol Rev) - Vol. 33, Iss. 8, pp. 1237-1252
TL;DR: Meta-analytic results indicate that claims regarding the academic, behavioral, and cognitive benefits associated with extant cognitive training programs are unsupported in ADHD, but leave open the possibility that cognitive training techniques designed to improve empirically documented executive function deficits may benefit children with ADHD.
About: This article was published in Clinical Psychology Review on 2013-12-01 and has received 436 citations to date. The article focuses on the topics: Cognitive remediation therapy & Executive functions.
Citations
Journal ArticleDOI
TL;DR: The review finds extensive evidence that brain-training interventions improve performance on the trained tasks, less evidence that such interventions improve performance on closely related tasks, and little evidence that training enhances performance on distantly related tasks or improves everyday cognitive performance.
Abstract: In 2014, two groups of scientists published open letters on the efficacy of brain-training interventions, or "brain games," for improving cognition. The first letter, a consensus statement from an international group of more than 70 scientists, claimed that brain games do not provide a scientifically grounded way to improve cognitive functioning or to stave off cognitive decline. Several months later, an international group of 133 scientists and practitioners countered that the literature is replete with demonstrations of the benefits of brain training for a wide variety of cognitive and everyday activities. How could two teams of scientists examine the same literature and come to conflicting "consensus" views about the effectiveness of brain training? In part, the disagreement might result from different standards used when evaluating the evidence. To date, the field has lacked a comprehensive review of the brain-training literature, one that examines both the quantity and the quality of the evidence according to a well-defined set of best practices. This article provides such a review, focusing exclusively on the use of cognitive tasks or games as a means to enhance performance on other tasks. We specify and justify a set of best practices for such brain-training interventions and then use those standards to evaluate all of the published peer-reviewed intervention studies cited on the websites of leading brain-training companies listed on Cognitive Training Data (www.cognitivetrainingdata.org), the site hosting the open letter from brain-training proponents. These citations presumably represent the evidence that best supports the claims of effectiveness. Based on this examination, we find extensive evidence that brain-training interventions improve performance on the trained tasks, less evidence that such interventions improve performance on closely related tasks, and little evidence that training enhances performance on distantly related tasks or that training improves everyday cognitive performance. We also find that many of the published intervention studies had major shortcomings in design or analysis that preclude definitive conclusions about the efficacy of training, and that none of the cited studies conformed to all of the best practices we identify as essential to drawing clear conclusions about the benefits of brain training for everyday activities. We conclude with detailed recommendations for scientists, funding agencies, and policymakers that, if adopted, would lead to better evidence regarding the efficacy of brain-training interventions.

754 citations


Cites result from "Do programs designed to train worki..."

  • ...This finding motivated many replication attempts, but several recent meta-analyses found little evidence that Cogmed reduced these symptoms when raters were blind to condition (Cortese et al., 2015; Rapport et al., 2013; Sonuga-Barke et al., 2013)....


Journal ArticleDOI
TL;DR: It is concluded that working memory training programs appear to produce short-term, specific training effects that do not generalize to measures of “real-world” cognitive skills.
Abstract: It has been claimed that working memory training programs produce diverse beneficial effects. This article presents a meta-analysis of working memory training studies (with a pretest-posttest design and a control group) that have examined transfer to other measures (nonverbal ability, verbal ability, word decoding, reading comprehension, or arithmetic; 87 publications with 145 experimental comparisons). Immediately following training there were reliable improvements on measures of intermediate transfer (verbal and visuospatial working memory). For measures of far transfer (nonverbal ability, verbal ability, word decoding, reading comprehension, arithmetic) there was no convincing evidence of any reliable improvements when working memory training was compared with a treated control condition. Furthermore, mediation analyses indicated that across studies, the degree of improvement on working memory measures was not related to the magnitude of far-transfer effects found. Finally, analysis of publication bias shows that there is no evidential value from the studies of working memory training using treated controls. The authors conclude that working memory training programs appear to produce short-term, specific training effects that do not generalize to measures of “real-world” cognitive skills. These results seriously question the practical and theoretical importance of current computerized working memory programs as methods of training working memory skills.
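Meta-analyses of this design typically convert each pretest-posttest-control comparison into a standardized gain-difference effect size before pooling. As an illustration only (the review itself may use a different estimator, and the function name below is hypothetical), the sketch computes a Morris-style d_ppc2, which standardizes the difference in pre-to-post gains by the pooled pretest standard deviation and applies a small-sample bias correction.

```python
import numpy as np

def pretest_posttest_control_d(m_pre_t, m_post_t, sd_pre_t, n_t,
                               m_pre_c, m_post_c, sd_pre_c, n_c):
    """Hypothetical helper: Morris-style d_ppc2 effect size for a
    pretest-posttest design with a control group (illustrative sketch)."""
    # pooled pretest standard deviation (unaffected by any treatment effect)
    sd_pool = np.sqrt(((n_t - 1) * sd_pre_t**2 + (n_c - 1) * sd_pre_c**2)
                      / (n_t + n_c - 2))
    # small-sample bias correction factor
    cp = 1.0 - 3.0 / (4.0 * (n_t + n_c - 2) - 1.0)
    # difference in pre-to-post gains, standardized by the pooled pretest SD
    return cp * ((m_post_t - m_pre_t) - (m_post_c - m_pre_c)) / sd_pool
```

Standardizing by pretest rather than posttest dispersion keeps any treatment-induced change in score variability out of the denominator.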

584 citations

Journal ArticleDOI
TL;DR: This work reviews the current state of knowledge of EF impairments associated with psychopathology and the limitations of previous research in light of recent advances in understanding and measuring EF, and offers concrete suggestions for improving EF assessment.
Abstract: Executive function (EF) is essential for successfully navigating nearly all of our daily activities. Of critical importance for clinical psychological science, EF impairments are associated with most forms of psychopathology. However, despite the proliferation of research on EF in clinical populations, with notable exceptions clinical and cognitive approaches to EF have remained largely independent, leading to failures to apply theoretical and methodological advances in one field to the other field and hindering progress. First, we review the current state of knowledge of EF impairments associated with psychopathology and limitations to the previous research in light of recent advances in understanding and measuring EF. Next, we offer concrete suggestions for improving EF assessment. Last, we suggest future directions, including integrating modern models of EF with state of the art, hierarchical models of dimensional psychopathology as well as translational implications of EF-informed research on clinical science.

575 citations

Journal ArticleDOI
TL;DR: Cognitive training had limited effects on ADHD symptoms according to assessments based on blinded measures, and approaches targeting multiple neuropsychological processes may optimize the transfer of effects from cognitive deficits to clinical symptoms.
Abstract: Objective The authors performed meta-analyses of randomized controlled trials to examine the effects of cognitive training on attention-deficit/hyperactivity disorder (ADHD) symptoms, neuropsychological deficits, and academic skills in children/adolescents with ADHD. Method The authors searched PubMed, Ovid, Web of Science, ERIC, and CINAHL databases through May 18, 2014. Data were aggregated using random-effects models. Studies were evaluated with the Cochrane risk of bias tool. Results Sixteen of 695 nonduplicate records were analyzed (759 children with ADHD). When all types of training were considered together, there were significant effects on total ADHD (standardized mean difference [SMD] = 0.37, 95% CI = 0.09–0.66) and inattentive symptoms (SMD = 0.47, 95% CI = 0.14–0.80) for reports by raters most proximal to the treatment setting (i.e., typically unblinded). These figures decreased substantially when the outcomes were provided by probably blinded raters (ADHD total: SMD = 0.20, 95% CI = 0.01–0.40; inattention: SMD = 0.32, 95% CI = −0.01 to 0.66). Effects on hyperactivity/impulsivity symptoms were not significant. There were significant effects on laboratory tests of working memory (verbal: SMD = 0.52, 95% CI = 0.24–0.80; visual: SMD = 0.47, 95% CI = 0.23–0.70) and parent ratings of executive function (SMD = 0.35, 95% CI = 0.08–0.61). Effects on academic performance were not statistically significant. There were no effects of working memory training specifically on ADHD symptoms. Interventions targeting multiple neuropsychological deficits had large effects on ADHD symptoms rated by most proximal assessors (SMD = 0.79, 95% CI = 0.46–1.12). Conclusion Despite improving working memory performance, cognitive training had limited effects on ADHD symptoms according to assessments based on blinded measures. Approaches targeting multiple neuropsychological processes may optimize the transfer of effects from cognitive deficits to clinical symptoms.
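The abstract above pools standardized mean differences with random-effects models. The sketch below shows one widely used estimator for that step, the DerSimonian-Laird method; it is a minimal illustration under that assumption (the function name is hypothetical, and the paper's exact model specification may differ).

```python
import numpy as np
from scipy.stats import norm

def dersimonian_laird(effects, variances):
    """Hypothetical helper: pool effect sizes (e.g., SMDs) with a
    DerSimonian-Laird random-effects model (illustrative sketch)."""
    y = np.asarray(effects, dtype=float)
    v = np.asarray(variances, dtype=float)
    w = 1.0 / v                                   # fixed-effect (inverse-variance) weights
    theta_fe = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - theta_fe) ** 2)           # Cochran's Q heterogeneity statistic
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)       # between-study variance estimate
    w_re = 1.0 / (v + tau2)                       # random-effects weights
    theta = np.sum(w_re * y) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    z = norm.ppf(0.975)
    return theta, se, (theta - z * se, theta + z * se), tau2
```

The tau² term absorbs between-study heterogeneity, widening the confidence interval relative to a fixed-effect analysis when study results disagree by more than sampling error alone would predict.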

434 citations


Cites background or methods from "Do programs designed to train worki..."

  • ...A second meta-analysis by Rapport et al.,9 published more recently and exploring a wider range of outcomes, found similar effects....


  • ...In recent years, cognitive training has been investigated as a potential ADHD treatment.(9) Building on evidence of brain plasticity from rehabilitation science and contemporary developmental neuroscience, cognitive training is premised on the notion that key brain networks implicated in ADHD can be strengthened, and the cognitive processes they subserve improved, through controlled exposures to information processing tasks....


  • ...Moreover, to increase statistical power, Rapport et al.(9) also included non-RCTs and pooled across design types, making effect size estimates of the effects of cognitive training on ADHD core symptoms and related neuropsychological impairment difficult to interpret....


Journal ArticleDOI
TL;DR: A systematic review and meta-analysis finds that CCT is associated with improvement in depressive symptoms and everyday functioning, though it produces inconsistent effects on cognition.

215 citations


Cites background from "Do programs designed to train worki..."

  • ..., 2006) as well as in a variety of diagnostic conditions including attention-deficit hyperactivity disorder (Rapport et al., 2013), schizophrenia (Wykes et al....


References
Book
01 Dec 1969
TL;DR: The concepts of statistical power analysis are laid out in this book, with power and sample-size procedures for the t test for means, chi-square tests for goodness of fit and contingency tables, the sign test, and other common statistical tests.
Abstract: Contents: Prefaces. The Concepts of Power Analysis. The t-Test for Means. The Significance of a Product Moment r_s. Differences Between Correlation Coefficients. The Test That a Proportion is .50 and the Sign Test. Differences Between Proportions. Chi-Square Tests for Goodness of Fit and Contingency Tables. The Analysis of Variance and Covariance. Multiple Regression and Correlation Analysis. Set Correlation and Multivariate Methods. Some Issues in Power Analysis. Computational Procedures.
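To make the book's central idea concrete, the sketch below computes the power of a two-sided, two-sample t test for means from a standardized effect size d using the noncentral t distribution. It is a minimal illustration (the function name is hypothetical), not a reproduction of Cohen's tables, though it recovers familiar benchmark values.

```python
import numpy as np
from scipy.stats import t, nct

def power_two_sample_t(d, n_per_group, alpha=0.05):
    """Hypothetical helper: power of a two-sided, two-sample t test
    for means, given standardized effect size d (illustrative sketch)."""
    df = 2 * n_per_group - 2
    ncp = d * np.sqrt(n_per_group / 2.0)          # noncentrality parameter
    t_crit = t.ppf(1.0 - alpha / 2.0, df)         # two-sided critical value
    # probability of falling in either rejection region under the noncentral t
    return (1.0 - nct.cdf(t_crit, df, ncp)) + nct.cdf(-t_crit, df, ncp)

# A "medium" effect (d = 0.5) with 64 participants per group gives power of about 0.80.
print(round(power_two_sample_t(0.5, 64), 2))
```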

115,069 citations

Journal ArticleDOI

49,129 citations


"Do programs designed to train worki..." refers result in this paper

  • ...This enhanced performance, albeit marginal and detectable only by statistical analysis (Cohen, 1988), warrants scrutiny given that nearly three-fourths of the studies reporting far transfer cognitive performance outcomes either failed to incorporate near transfer measures (27%) or reported far transfer effects (46%) that were similar to or of greater magnitude than their near transfer effects....


Journal ArticleDOI
TL;DR: In this paper, an adjusted rank correlation test is proposed as a technique for identifying publication bias in a meta-analysis; its operating characteristics are evaluated via simulations, and the test statistic is a direct statistical analogue of the popular funnel-graph.
Abstract: An adjusted rank correlation test is proposed as a technique for identifying publication bias in a meta-analysis, and its operating characteristics are evaluated via simulations. The test statistic is a direct statistical analogue of the popular "funnel-graph." The number of component studies in the meta-analysis, the nature of the selection mechanism, the range of variances of the effect size estimates, and the true underlying effect size are all observed to be influential in determining the power of the test. The test is fairly powerful for large meta-analyses with 75 component studies, but has only moderate power for meta-analyses with 25 component studies. However, in many of the configurations in which there is low power, there is also relatively little bias in the summary effect size estimate. Nonetheless, the test must be interpreted with caution in small meta-analyses. In particular, bias cannot be ruled out if the test is not significant. The proposed technique has potential utility as an exploratory tool for meta-analysts, as a formal procedure to complement the funnel-graph.
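A minimal sketch of the adjusted rank correlation idea described above, under the usual fixed-effect setup: each effect size is centered on the pooled estimate, standardized with a variance adjusted for the pooling, and then rank-correlated with its sampling variance via Kendall's tau. The function name is hypothetical, and the p-value here comes from scipy's Kendall tau routine rather than the normalized statistic in the original paper.

```python
import numpy as np
from scipy.stats import kendalltau

def begg_rank_correlation(effects, variances):
    """Hypothetical helper: adjusted rank correlation test for funnel-plot
    asymmetry / publication bias (illustrative sketch)."""
    y = np.asarray(effects, dtype=float)
    v = np.asarray(variances, dtype=float)
    w = 1.0 / v
    theta_fe = np.sum(w * y) / np.sum(w)          # fixed-effect pooled estimate
    v_star = v - 1.0 / np.sum(w)                  # variance of (y_i - theta_fe)
    t_star = (y - theta_fe) / np.sqrt(v_star)     # standardized deviates
    tau, p_value = kendalltau(t_star, v)          # association between deviates and variances
    return tau, p_value
```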

13,373 citations

Journal ArticleDOI
TL;DR: The results suggest that it is important to recognize both the unity and diversity of executive functions and that latent variable analysis is a useful approach to studying the organization and roles of executive functions.

12,182 citations


"Do programs designed to train worki..." refers background in this paper

  • ...Longitudinal developmental research reveals three primary executive functions – working memory, inhibition, and set shifting (Garon et al., 2008; Miyake et al., 2000) – which are identified consistently in meta-analytic (Dickstein et al., 2006; Willcutt et al., 2005) and factor analytic reviews (Miyake et al., 2000), supported by a strong genetic basis (Friedman et al., 2008), and shown to be developmentally contiguous (Huizinga, Dolan, &…...

  • ...These anatomical structures play a critical role in supporting executive functions (EF), an umbrella term for higher-order cognitive processes such as working memory, set shifting, and inhibitory control that enable goal directed behavior and novel problem solving (Garon, Bryson, & Smith, 2008; Miyake et al., 2000)....

Journal ArticleDOI
TL;DR: In this paper, a rank-based data augmentation technique is proposed for estimating the number of missing studies that might exist in a meta-analysis and the effect that these studies might have had on its outcome.
Abstract: We study recently developed nonparametric methods for estimating the number of missing studies that might exist in a meta-analysis and the effect that these studies might have had on its outcome. These are simple rank-based data augmentation techniques, which formalize the use of funnel plots. We show that they provide effective and relatively powerful tests for evaluating the existence of such publication bias. After adjusting for missing studies, we find that the point estimate of the overall effect size is approximately correct and coverage of the effect size confidence intervals is substantially improved, in many cases recovering the nominal confidence levels entirely. We illustrate the trim and fill method on existing meta-analyses of studies in clinical trials and psychometrics.
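A simplified sketch of the trim-and-fill idea described above, assuming suppression on a single side of the funnel, fixed-effect pooling, and only the L0 estimator of the number of missing studies (the published method also considers an R0 estimator and random-effects pooling). The function name and defaults are hypothetical.

```python
import numpy as np

def trim_and_fill(effects, variances, missing_side="left", max_iter=50):
    """Hypothetical helper: simplified trim-and-fill adjustment
    (L0 estimator, fixed-effect pooling; illustrative sketch)."""
    y = np.asarray(effects, dtype=float)
    v = np.asarray(variances, dtype=float)
    sign = 1.0 if missing_side == "left" else -1.0   # mirror so missing studies are always "low"
    y = sign * y

    n = len(y)
    k0, theta = 0, 0.0
    for _ in range(max_iter):
        keep = np.argsort(y)[: n - k0]               # trim the k0 largest (most extreme) studies
        w = 1.0 / v[keep]
        theta = np.sum(w * y[keep]) / np.sum(w)      # fixed-effect estimate on the trimmed set

        d = y - theta                                # centered effects, all observed studies
        ranks = np.argsort(np.argsort(np.abs(d))) + 1
        t_pos = ranks[d > 0].sum()                   # rank sum for positive deviations
        l0 = (4.0 * t_pos - n * (n + 1.0)) / (2.0 * n - 1.0)
        new_k0 = max(0, int(round(l0)))              # estimated number of missing studies
        if new_k0 == k0:
            break
        k0 = new_k0

    if k0 > 0:                                       # "fill": mirror the trimmed studies around theta
        extreme = np.argsort(y)[-k0:]
        y = np.concatenate([y, 2.0 * theta - y[extreme]])
        v = np.concatenate([v, v[extreme]])

    w = 1.0 / v
    return sign * (np.sum(w * y) / np.sum(w)), k0    # adjusted pooled estimate, imputed-study count
```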

9,163 citations