Session 3: Analysis and reporting
Collecting data for cost estimates
Jack Worth (NFER)
Panel on EEF reporting and data archiving
Peter Henderson, Camilla Nevill, Steve Higgins and Andrew Bibby
Data collection for cost estimates
Jack Worth, NFER
EEF London Evaluators Workshop
6th June 2014
Summary
• Reporting costs is an important part of evaluations
• Collecting the right information can be theoretically and practically challenging
• Evaluators should be sharing experiences and best practice
Effectiveness
Cost effectiveness
Costs to consider collecting
• Direct costs
  – How much would it cost a school or parent to buy the intervention?
  – What it actually cost may differ
• Indirect costs
  – Do staff have to put in more hours than usual?
  – Always think: what is the counterfactual?
• Cost information needs to be quantitative
Collecting cost information
• Impact or process evaluation?
  – Cost effectiveness relates to impact...
  – ...but process methods (e.g. surveys, case studies) are often better suited to collecting cost data
• Planning and communication
  – Across the evaluation project team
  – With the development partner
Reporting cost information
• Present the average cost to compare with average effectiveness
• Cost per pupil or per school?
  – May depend on the specific intervention
  – Cost per pupil is comparable across interventions
• Present the assumptions made and data sources used
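The averaging logic above can be sketched with hypothetical figures: an average total cost (direct purchase cost plus indirect staff time relative to the counterfactual) divided by the number of pupils reached. All inputs here are illustrative assumptions, not figures from any real EEF evaluation.

```python
# Sketch of a per-pupil cost estimate with hypothetical figures.
# All inputs are illustrative assumptions, not real intervention data.

def cost_per_pupil(direct_cost: float, extra_staff_hours: float,
                   hourly_rate: float, n_pupils: int) -> float:
    """Average total cost (direct + indirect staff time) per pupil."""
    # Indirect cost: staff hours beyond what the counterfactual would require
    indirect_cost = extra_staff_hours * hourly_rate
    return (direct_cost + indirect_cost) / n_pupils

# Example: GBP 5,000 of materials, 40 extra staff hours at GBP 25/hour,
# delivered to 200 pupils
print(cost_per_pupil(5000, 40, 25, 200))  # 30.0 per pupil
```

Presenting the calculation this explicitly also makes it easy to report the assumptions and data sources alongside the headline figure.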
Sharing best practice
• Recommend agreeing principles for a common approach among evaluators
• Questions?
NFER provides evidence for excellence through its independence and insights, the breadth of its work, its connections, and a focus on outcomes.
National Foundation for Educational Research
The Mere, Upton Park
Slough, Berks SL1 2DQ
T: 01753 574123
F: 01753 691632
www.nfer.ac.uk
EEF reporting and data archiving
Peter Henderson (EEF)
Camilla Nevill (EEF)
Steve Higgins (Durham) - Chair
Andrew Bibby (FFT)
The reporting process and publication of results on EEF’s website
Peter Henderson (EEF)
Reporting process
Evaluation team
Dissemination team
Classifying the security of findings from EEF evaluations
Camilla Nevill (EEF)
Group                 | Number of pupils | Effect size       | Estimated months’ progress | Evidence strength
Literacy intervention | 550              | 0.10 (0.03, 0.18) | +2                         |
www.educationendowmentfoundation.org.uk/evaluation
Example Appendix: Chatterbooks
Rating | 1. Design                                      | 2. Power (MDES) | 3. Attrition | 4. Balance                   | 5. Threats to validity
5      | Fair and clear experimental design (RCT)       | < 0.2           | < 10%        | Well-balanced on observables | No threats to validity
4      | Fair and clear experimental design (RCT, RDD)  | < 0.3           | < 20%        |                              |
3      | Well-matched comparison (quasi-experiment)     | < 0.4           | < 30%        |                              | Some threats
2      | Matched comparison (quasi-experiment)          | < 0.5           | < 40%        |                              |
1      | Comparison group with poor or no matching      | < 0.6           | < 50%        |                              |
0      | No comparator                                  | > 0.6           | > 50%        | Imbalanced on observables    | Significant threats
Combining the results of evaluations with the meta-analysis in the Teaching and Learning Toolkit
Steve Higgins (Durham)
Andrew Bibby
Archiving EEF project data
Prior to archiving…
1. Include permission for linking and archiving in consent forms
2. Retain pupil identifiers
3. Label values and variables
4. Save syntax or do files
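Steps 2 and 3 of the checklist above can be sketched as follows, assuming a simple CSV-plus-JSON-codebook convention (the file names, variable names, and labels are hypothetical illustrations, not an EEF archiving specification):

```python
# Sketch: retain identifiers and label values/variables before archiving.
# File names and variables are hypothetical, for illustration only.
import csv
import json

rows = [
    {"pupil_id": "P001", "group": 1, "ks2_score": 104},
    {"pupil_id": "P002", "group": 0, "ks2_score": 98},
]

# Codebook: a label for every variable, plus value labels where coded
codebook = {
    "pupil_id": {"label": "Pupil identifier (retained for data linking)"},
    "group": {"label": "Trial arm",
              "values": {"0": "control", "1": "intervention"}},
    "ks2_score": {"label": "Key Stage 2 test score"},
}

with open("trial_data.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=list(rows[0]))
    writer.writeheader()
    writer.writerows(rows)

with open("trial_codebook.json", "w") as f:
    json.dump(codebook, f, indent=2)
```

In practice the same labeling would live in the analysis syntax or do files (step 4), so that the archived dataset and the code that produced it stay self-documenting.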