Journal Article

Does Research Design Affect Study Outcomes in Criminal Justice?

TL;DR
In this article, the authors examined the relationship between research design and study outcomes in a broad review of research evidence on crime and justice commissioned by the National Institute of Justice and found that design does have a systematic effect on outcomes in criminal justice studies.
Abstract
Does the type of research design used in a crime and justice study influence its conclusions? Scholars agree in theory that randomized experimental studies have higher internal validity than do nonrandomized studies. But there is no consensus regarding the costs of using nonrandomized studies in coming to conclusions regarding criminal justice interventions. To examine these issues, the authors look at the relationship between research design and study outcomes in a broad review of research evidence on crime and justice commissioned by the National Institute of Justice. Their findings suggest that design does have a systematic effect on outcomes in criminal justice studies. The weaker a design, indicated by internal validity, the more likely a study is to report a result in favor of treatment and the less likely it is to report a harmful effect of treatment. Even when comparing randomized studies with strong quasi-experimental research designs, systematic and statistically significant differences are observed.
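
As an illustration only, and not code or data from the article, the sketch below shows one way a design-versus-outcome comparison of this kind can be set up: each study receives an internal-validity score (here a hypothetical 1–5 scale, in the spirit of the Maryland Scientific Methods Scale) and a standardized effect size where positive values favor treatment, and the analysis summarizes outcomes by score and computes a simple correlation. All study values are invented for the example.

```python
# Illustrative sketch (not the authors' code or data): relating design quality
# to reported outcomes across a collection of evaluation studies.
# Assumes each study is scored for internal validity on a hypothetical 1-5
# scale (weak to strong) and has a standardized effect size where positive
# values favor treatment.
from collections import defaultdict
from statistics import mean

# Hypothetical data: (internal_validity_score, standardized_effect_size)
studies = [
    (1, 0.42), (1, 0.31), (2, 0.28), (2, 0.35),
    (3, 0.22), (3, 0.10), (4, 0.12), (4, -0.05),
    (5, 0.04), (5, -0.08),
]

# Mean effect and share of treatment-favoring results at each design score.
by_score = defaultdict(list)
for score, effect in studies:
    by_score[score].append(effect)

for score in sorted(by_score):
    effects = by_score[score]
    favorable = sum(e > 0 for e in effects) / len(effects)
    print(f"score {score}: mean effect {mean(effects):+.2f}, "
          f"{favorable:.0%} favor treatment")

# Simple Pearson correlation between design score and effect size.
xs = [s for s, _ in studies]
ys = [e for _, e in studies]
mx, my = mean(xs), mean(ys)
cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
var_x = sum((x - mx) ** 2 for x in xs)
var_y = sum((y - my) ** 2 for y in ys)
r = cov / (var_x * var_y) ** 0.5
print(f"correlation(design score, effect size) = {r:+.2f}")
```

A negative correlation in a setup like this would correspond to the pattern the authors describe, with weaker designs tending to report results more favorable to treatment.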



Citations
Journal Article

Instrumental Variables and the Search for Identification: From Supply and Demand to Natural Experiments

TL;DR: The authors discuss the mechanics of instrumental variables and the qualities that make for a good instrument, devoting particular attention to instruments derived from "natural experiments" that can be used to identify causal relationships.
Book

The Handbook of Research Synthesis and Meta-Analysis

TL;DR: A meta-analysis is literally an analysis of analyses; as discussed by the authors, the term conventionally refers to the statistical synthesis of results from a collection of individual studies in order to integrate their findings.
Journal Article

Effectiveness of school-based programs to reduce bullying: a systematic and meta-analytic review

TL;DR: The meta-analysis showed that, overall, school-based anti-bullying programs are effective: on average, bullying decreased by 20–23% and victimization decreased by 17–20%, and the time is ripe to mount a new program of research on the effectiveness of anti-bullying programs based on these findings.
Journal Article

The positive effects of cognitive–behavioral programs for offenders: A meta-analysis of factors associated with effective treatment

TL;DR: As discussed by the authors, a meta-analysis of 58 experimental and quasi-experimental studies of the effects of cognitive-behavioral therapy on the recidivism of adult and juvenile offenders confirmed prior positive findings and explored a range of potential moderators to identify factors associated with variation in treatment effects.
References
Journal Article

Statistical Methods in Psychology Journals: Guidelines and Explanations

TL;DR: As discussed by the authors, the Task Force on Statistical Inference (TFSI) of the American Psychological Association (APA) was formed to discuss the application of significance testing in psychology journals and its alternatives, including alternative underlying models and data transformations.
Posted Content

Evaluating the Econometric Evaluations of Training Programs with Experimental Data

TL;DR: The National Supported Work Program was run as a field experiment that randomly assigned some participants to a treatment group receiving training and the rest to a control group receiving none, so the difference in post-training earnings between the two groups provides an unbiased estimate of the program's impact; the authors compare this experimental benchmark to the estimates that would have been produced by the econometric procedures commonly used in the program evaluation literature.
Journal Article

The efficacy of psychological, educational, and behavioral treatment: Confirmation from meta-analysis

TL;DR: Meta-analytic reviews show a strong, dramatic pattern of positive overall treatment effects that cannot readily be explained as artifacts of meta-analytic technique or generalized placebo effects.