2025-26 Seminars and Recordings
October 3, 2025, 11 am CST
A Preliminary Data Analysis Workflow for Meta-Analysis of Dependent Effect Sizes
-
Speaker: Dr. James Pustejovsky, University of Wisconsin-Madison
-
Description: In education and other fields, meta-analyses routinely involve dependent effect size estimates and hierarchical data structures. Statistical methods for analyzing dependent effect sizes are now well developed, but there has been less attention to the initial stages of data analysis, prior to formal modeling. In this talk, I will describe a generic workflow for preliminary, exploratory analyses of meta-analytic databases, which focuses on validating the integrity of the input data and informing decisions about subsequent statistical modeling. The workflow entails creating summaries and visualizations of features of the primary studies included in the meta-analysis in order to understand the structure and distribution of the data, especially with respect to between- and within-study variation. I will illustrate the workflow using data from a previously published meta-analysis and discuss connections between preliminary analysis and subsequent statistical modeling strategies. This talk is based on joint work with Jingru Zhang and Elizabeth Tipton, with a pre-print available at https://osf.io/preprints/metaarxiv/vfsqx_v1.
-
Video Recording
November 14, 2025, 11 am CST
A Meta-Review on the Methodological Quality of Education Intervention Meta-Analyses
-
Speakers: Marta Pellegrini, University of Cagliari; Dr. Terri Pigott, Georgia State University; Hannah Scarbrough, Georgia State University; Natalie Pruitt, Georgia State University; and Caroline Chubb, Georgia State University
-
Description: This meta-review explored the current practices of education meta-analyses in terms of systematic review procedures and meta-analysis methods. We reviewed 247 meta-analyses on the effects of K–12 school-based academic interventions on student academic achievement published after 2011. We found that many reviews were largely consistent with several best practice recommendations for the review stage, including problem formulation, selection, and coding procedures. However, reviews rarely preregistered their protocols or shared data, which reduces the transparency and reproducibility of the process. Best practice meta-analysis methods with robust consensus among methodologists were seldom used in our review sample. Recommendations for generating more credible and reproducible findings are provided. We also identify areas in need of more research and guidance, including how to conduct critical appraisal, how to deal with outliers and missing covariate data, and disciplined strategies for building meta-regression models.
-
Video Recording