CENTER FOR STATISTICAL CONSULTING & COACHING ON THE DESIGN OF RANDOMIZED TRIALS IN EDUCATION


 

Successful education is without doubt a fundamental pillar of a society’s prosperity. Across the globe, policymakers, practitioners, and researchers increasingly prioritize evidence-based education to improve learning and teaching (Organisation for Economic Co-operation and Development, 2007; Pellegrini & Vivanet, 2021; Slavin, 2002). This also applies to Germany (Kultusministerkonferenz, 2016).

The overarching goal of CoCoaPower is to advance these efforts by strengthening our national capacities to generate robust empirical evidence, derived from strong randomized trials (RTs), on the impacts of educational and psychological interventions as well as innovative programs, products, and services.

“Strong” means well-designed, well-implemented, and well-analyzed (Spybrook, 2013). CoCoaPower focuses on laying the groundwork for the first aspect of a strong RT, namely a good design. After all, valid and reliable causal inferences about the actual effects of interventions presuppose RTs that demonstrate sufficient statistical power and precision—or, put differently, adequate design sensitivity (Lipsey, 1990).

 

CoCoaPower seeks to enhance the methodological quality of individually randomized trials (IRTs), multisite randomized trials (MSRTs), and cluster randomized trials (CRTs). Specifically, CoCoaPower pools and provides resources and expertise on design decisions in general and power analysis in particular.

Contact us!

CoCoaPower is organized around two main strands:

STATISTICAL CONSULTING SERVICE

We directly support researchers in planning rigorous RTs.

We provide…

  • concrete power-analysis results tailored to researchers’ target study designs.
  • advice on general design decisions (e.g., regarding the choice of a specific design or the selection of covariates).

STATISTICAL COACHING INFRASTRUCTURE

We train researchers to develop well-designed RTs.

We provide…

  • specialized workshops on power analysis for various RT designs.
  • web-based tutorials on power analysis that guide researchers through the steps necessary to determine the relevant quantities of a robust study design.

BACKGROUND

CoCoaPower builds on previous and ongoing work in the context of MULTI-DES, a project series funded by the German Research Foundation (Grant 392108331). MULTI-DES is dedicated to compiling an extensive database of single- and multilevel design parameters and effect size benchmarks for K-12 students’ cognitive and socio-emotional outcomes in the German education system. These design parameters and effect size benchmarks are key to planning sufficiently powered and precise RTs and interpreting their results.
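In practice, planning with such design parameters often comes down to computing a minimum detectable effect size (MDES). The sketch below is a minimal illustration, not part of CoCoaPower’s or MULTI-DES’s tooling: it implements the standard large-sample MDES approximation for a two-level cluster randomized trial, where the intraclass correlation and the covariate R² values are exactly the kinds of design parameters the MULTI-DES database compiles. The function name, its defaults, and the example values (40 schools of 25 students, ICC = .20) are illustrative assumptions.

```python
from statistics import NormalDist
import math

def crt_mdes(J, n, rho, R2_2=0.0, R2_1=0.0, P=0.5, alpha=0.05, power=0.80):
    """Approximate MDES (in standard deviation units) for a two-level CRT.

    J: number of clusters (e.g., schools); n: students per cluster;
    rho: intraclass correlation; R2_2 / R2_1: outcome variance explained by
    covariates at the cluster / student level; P: share of clusters treated.
    Uses the large-sample normal approximation to the t multiplier.
    """
    z = NormalDist()
    # Multiplier ~ z_(1 - alpha/2) + z_power, about 2.80 for alpha=.05, power=.80
    M = z.inv_cdf(1 - alpha / 2) + z.inv_cdf(power)
    var = (rho * (1 - R2_2)) / (P * (1 - P) * J) \
        + ((1 - rho) * (1 - R2_1)) / (P * (1 - P) * J * n)
    return M * math.sqrt(var)

# Illustrative scenario: 40 schools of 25 students, ICC = .20, no covariates.
print(round(crt_mdes(J=40, n=25, rho=0.20), 2))  # MDES of roughly 0.43 SD
```

As the R2_2 default hints, a strong cluster-level covariate (such as a pretest aggregated to the school mean) shrinks the MDES substantially, which is precisely why empirically grounded design parameters matter for planning.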

CoCoaPower is funded through a grant awarded to Larry V. Hedges by the Yidan Prize Foundation in 2018.

 

 

Send us your message. We’ll get in touch with you.

 

References

Kultusministerkonferenz. (2016). Gesamtstrategie der Kultusministerkonferenz zum Bildungsmonitoring [Overall strategy of the Standing Conference of the Ministers of Education and Cultural Affairs for educational monitoring]. Wolters Kluwer. https://www.kmk.org/fileadmin/Dateien/veroeffentlichungen_beschluesse/2015/2015_06_11-Gesamtstrategie-Bildungsmonitoring.pdf
Lipsey, M. W. (1990). Design sensitivity: Statistical power for experimental research. SAGE Publications.
Organisation for Economic Co-operation and Development. (2007). Evidence in education: Linking research and policy. OECD Publishing. https://doi.org/10.1787/9789264033672-en
Pellegrini, M., & Vivanet, G. (2021). Evidence-based policies in education: Initiatives and challenges in Europe. ECNU Review of Education, 4(1), 25–45. https://doi.org/10.1177/2096531120924670
Slavin, R. E. (2002). Evidence-based education policies: Transforming educational practice and research. Educational Researcher, 31(7), 15–21. https://doi.org/10.3102/0013189X031007015
Spybrook, J. (2013). Introduction to special issue on design parameters for cluster randomized trials in education. Evaluation Review, 37(6), 435–444. https://doi.org/10.1177/0193841X14527758