Assessing and Strengthening Evidence-Based Program Registries’ Usefulness for Social Service Program Replication and Adaptation

Christopher S. Horne

Evaluation Review, 2017, vol. 41, issue 5, 407-435

Abstract: Background: Government and private funders increasingly require social service providers to adopt program models deemed “evidence based,” particularly as defined by evidence-based program registries, such as What Works Clearinghouse and National Registry of Evidence-Based Programs and Practices. These registries summarize the evidence about programs’ effectiveness, giving near-exclusive priority to evidence from experimental-design evaluations. The registries’ goal is to aid decision making about program replication, but critics suspect the emphasis on evidence from experimental-design evaluations, while ensuring strong internal validity, may inadvertently undermine that goal, which requires strong external validity as well. Objective: The objective of this study is to determine the extent to which the registries’ reports provide information about context-specific program implementation factors that affect program outcomes and would thus support decision making about program replication and adaptation. Method: A research-derived rubric was used to rate the extent of context-specific reporting in the population of seven major registries’ evidence summaries (N = 55) for youth development programs. Findings: Nearly all (91%) of the reports provide context-specific information about program participants, but far fewer provide context-specific information about implementation fidelity and other variations in program implementation (55%), the program’s environment (37%), costs (27%), quality assurance measures (22%), implementing agencies (19%), or staff (15%). Conclusion: Evidence-based program registries provide insufficient information to guide context-sensitive decision making about program replication and adaptation. Registries should supplement their evidence base with nonexperimental evaluations and revise their methodological screens and synthesis-writing protocols to prioritize reporting—by both evaluators and the registries themselves—of context-specific implementation factors that affect program outcomes.

Keywords: evidence based; registry; program evaluation; program replication; program adaptation; external validity; evaluation use; evaluation utilization; contextual evaluation; social services; youth development (search for similar items in EconPapers)
Date: 2017
References: View complete reference list from CitEc

Downloads: (external link)
https://journals.sagepub.com/doi/10.1177/0193841X15625014 (text/html)

Related works:
This item may be available elsewhere in EconPapers: Search for items with the same title.

Persistent link: https://EconPapers.repec.org/RePEc:sae:evarev:v:41:y:2017:i:5:p:407-435

DOI: 10.1177/0193841X15625014

More articles in Evaluation Review
Bibliographic data for series maintained by SAGE Publications.

Handle: RePEc:sae:evarev:v:41:y:2017:i:5:p:407-435