EconPapers    

Informing Estimates of Program Effects for Studies of Mathematics Professional Development Using Teacher Content Knowledge Outcomes

Geoffrey Phelps, Benjamin Kelcey, Nathan Jones and Shuangshuang Liu

Evaluation Review, 2016, vol. 40, issue 5, 383-409

Abstract: Mathematics professional development is widely offered, typically with the goal of improving teachers’ content knowledge, the quality of teaching, and ultimately students’ achievement. Recently, new assessments focused on mathematical knowledge for teaching (MKT) have been developed to assist in the evaluation and improvement of mathematics professional development. This study presents empirical estimates of average program change in MKT and its variation, with the goal of supporting the design of experimental trials that are adequately powered to detect a specified program effect. The study drew on a large database representing five different assessments of MKT and, collectively, 326 professional development programs and 9,365 teachers. Results from cross-classified hierarchical growth models showed that standardized average change estimates across the five assessments ranged from a low of 0.16 standard deviations (SDs) to a high of 0.26 SDs. Power analyses based on the estimated pre- and posttest changes indicated that hundreds of teachers are needed to detect changes in knowledge at the lower end of the distribution. Even studies powered to detect effects at the higher end of the distribution will require substantial resources to conduct rigorous experimental trials. Empirical benchmarks that describe average program change and its variation provide a useful preliminary resource for interpreting the relative magnitude of effect sizes associated with professional development programs and for designing adequately powered trials.
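As a rough illustration of the sample sizes such benchmarks imply (the study itself used cross-classified hierarchical growth models and group-randomized designs, so its power calculations are multilevel), a single-level paired pre-post approximation can be sketched in Python with statsmodels. The 0.16 and 0.26 SD change estimates come from the abstract; the alpha = 0.05, power = 0.80, and two-sided-test settings are assumptions made here for illustration only.

    from statsmodels.stats.power import TTestPower

    # Illustrative only: a single-level paired t-test approximation, not the
    # study's multilevel (cross-classified hierarchical growth model) analysis.
    # Effect sizes 0.16 and 0.26 SD are the abstract's low and high change
    # estimates; alpha = 0.05 and power = 0.80 are assumed conventions.
    analysis = TTestPower()
    for d in (0.16, 0.26):
        n = analysis.solve_power(effect_size=d, alpha=0.05, power=0.80,
                                 alternative='two-sided')
        print(f"standardized change {d}: about {round(n)} teachers")
    # Roughly 300+ teachers at d = 0.16 and about 120 at d = 0.26, consistent
    # with the abstract's point that detecting effects at the lower end of the
    # range requires hundreds of teachers.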

Keywords: mathematics; professional development; mathematical knowledge for teaching; program evaluation; group randomized trial; program effects
Date: 2016

Full text: https://journals.sagepub.com/doi/10.1177/0193841X16665024 (text/html)

Persistent link: https://EconPapers.repec.org/RePEc:sae:evarev:v:40:y:2016:i:5:p:383-409

DOI: 10.1177/0193841X16665024

Handle: RePEc:sae:evarev:v:40:y:2016:i:5:p:383-409