#TBED2019 Prof Larry Hedges - purely statistical issues have had the lion's share of attention - partly because they're easy to do.
But a new role of expert reviewer is emerging. Not all ed researchers understand what doing good reviewing involves. New institutions too.
#TBED2019 @dylanwiliam lists four important questions for "critical consumers" to ask of education research:
- Does it solve a problem we have?
- How much improvement will we get?
- How much will it cost?
- Will it work here?
.@dylanwiliam refers to the ‘file drawer problem’: it’s easier to get experiments with a statistically significant result published, but the statistical power of most social science experiments is low, so only ‘lucky’ experiments tend to get published! #TBED2019
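A minimal simulation can make the file drawer problem concrete. The numbers below (a true effect of 0.2, 20 pupils per group, a two-sided z-test at p < .05) are hypothetical choices for illustration, not figures from the talk: when only significant results are "published", the average published effect size is far larger than the true one.

```python
import random
import statistics

random.seed(42)

TRUE_EFFECT = 0.2   # true standardized effect (hypothetical)
N = 20              # per-group sample size -> low statistical power
STUDIES = 10_000

def run_study():
    # Population SD assumed known and equal to 1, so each group's
    # sample mean is ~ Normal(mu, 1/sqrt(N)).
    control = random.gauss(0.0, 1 / N**0.5)
    treat = random.gauss(TRUE_EFFECT, 1 / N**0.5)
    d_hat = treat - control            # estimated effect size
    z = d_hat / (2 / N) ** 0.5         # z statistic for the difference
    significant = abs(z) > 1.96        # two-sided p < .05
    return d_hat, significant

results = [run_study() for _ in range(STUDIES)]
all_effects = [d for d, _ in results]
published = [d for d, sig in results if sig]   # the 'file drawer' survivors

print(f"share of studies reaching significance: {len(published) / STUDIES:.2f}")
print(f"mean effect, all studies:       {statistics.mean(all_effects):.2f}")
print(f"mean effect, significant only:  {statistics.mean(published):.2f}")
```

With these settings only around one study in ten reaches significance, and those "lucky" studies report an average effect several times the true 0.2 - exactly the inflation a meta-analysis of published studies would inherit.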
#TBED2019 @dylanwiliam great overview of the impact of how closely the assessment matches the approach to T&L. Researchers have an incentive to use sensitive assessments - but this limits applicability, and in the real world sensitive assessments are very hard to design well.
“Effect size can be influenced by the outcome measures used... assessments that test things that are very close to what has been taught in an intervention will tend to result in larger effect sizes than those that are more remote, like standardized tests” @dylanwiliam #TBED2019
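The quote above is just the arithmetic of Cohen's d (standardized mean difference). The point values below are invented for illustration: the same intervention shifts a closely aligned test a lot relative to its spread, but a broad standardized test only a little relative to a larger spread.

```python
# Cohen's d: mean difference between treatment and control,
# divided by the pooled standard deviation of the measure.
def cohens_d(mean_treat: float, mean_control: float, pooled_sd: float) -> float:
    return (mean_treat - mean_control) / pooled_sd

# Hypothetical numbers: a proximal test (aligned with what was taught)
# moves 5 points against a spread of 10; a distal standardized test
# moves only 2 points against a spread of 15.
proximal = cohens_d(55, 50, pooled_sd=10)   # d = 0.50
distal = cohens_d(52, 50, pooled_sd=15)     # d ≈ 0.13

print(f"proximal (aligned) measure:    d = {proximal:.2f}")
print(f"distal (standardized) measure: d = {distal:.2f}")
```

Same teaching, same pupils - yet the reported effect size is nearly four times larger on the aligned measure, which is why comparing effect sizes across studies with different outcome measures is hazardous.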
Another issue with meta-analysis: interventions vary in their:
- Duration
- Intensity
- Collateral effects, e.g. assignment of teachers [teacher quality]
Dylan Wiliam #TBED2019
@CEMatDurham @jpembroke A good blog, James and CEM! The impact is little or low because the vast majority of ‘tracking data’ is pure fiction. Pseudoscience at its worst - the pretence of a mathematically precise line from an unwarranted assumption to a foregone conclusion!
@jpembroke @CEMatDurham As ever, great blog. Totally agree with what you are saying. First task in my new headship was to scrap the existing complicated tracking system, put in a simple one and use @CEMatDurham Incas+ assessments to cross-reference judgments.