A Close Examination of Policy-Relevant Education Evaluations: Criteria for Judging Quality
Matthew Linick & Diane Fuselier-Thompson


Page 1:

A Close Examination of Policy-Relevant Education Evaluations: Criteria for Judging Quality

Matthew Linick & Diane Fuselier-Thompson

Page 2:

Explicit criteria for judging program quality:

Can be clearly discerned in the text.

“A successful program will display the following characteristics…”

Implicit criteria for judging program quality:

Can be inferred from the research questions.

“We will measure various aspects of the program…”

Implicit vs. Explicit Criteria for Judgments of Program Quality

Page 3:

Types of Studies:

Impact and Outcome reports (14)

Implementation reports (12)

Methodologies:

Mixed Method: Interview and Survey (17)

Qualitative: Interview and/or focus group (6), Observation (5)

Quantitative: RCT (4), Quasi-experiment (6), Comparative Statistical Analysis (4)

Methodologies of Research Reports

Page 4:

Explicit criteria were primarily included in implementation and outcome evaluations

When reports included explicit criteria, program quality was judged against methodological standards

Statistical significance in quantitative studies

Logic model often used as a rubric in implementation studies

Most evaluation reports refrain from making actual judgments of program quality

Authors tend to be uncritical of the evaluated program

Evaluations tend to report findings in lieu of making judgments

Explicit Statements of Criteria for judging program quality

Page 5:

Implementation evaluation: ‘Ending Violence in Schools: A Study of Morton North High School’

Logic model used as a rubric: evaluators constructed a logic model based on relevant research and used this model to evaluate the implementation of the violence prevention approaches used by the school

Impact evaluation: ‘Start Reading: Impact Study’

Statistical significance used as the explicit criterion for judging the program

Statistically detectable differences between treatment and control schools, estimated using a regression discontinuity design, in:

student reading achievement

classroom reading instructional practices

student time engaged with print

Examples of Explicit Criteria used during Program Evaluations

Page 6:

Few explicit statements of criteria:

9 of 31 reports have explicit statements

Explicit criteria are stated more often when the program is deemed to be successful

In 6 of the 9 reports with explicit statements, the program was found to be successful

Explicit Statements of Criteria for judging program quality

Page 7:

Frequently provided as a basis for judging program quality:

Statistical significance was often set as the goal of a research model attempting to estimate the positive impact of a program.

Research questions were used to establish the goal of the study, but the questions often did not contain criteria for making judgments.

Program goals were often referenced as the desired outcomes of the stakeholders or clients, but evaluators usually avoided such statements.

Implicit Statements of Criteria for judging program quality

Page 8:

Outcomes evaluation: ‘Extended School Day Program’

Evaluators framed evaluation questions as research questions:

What are the outcomes for students, teachers, and schools in this program?

What were the effects on test scores, attendance, teacher attitudes, etc.?

Example of Implicit Criteria used during Program Evaluations

Page 9:

Many of the reports imply that stakeholder expectations are a guiding principle for program ‘quality’.

Implicit Criteria for Program Quality in each of the 31 reviewed reports.

• Not easily discernible (found in discussion of results in 20/31 reports)

• Implied criteria tied to stakeholder expectations (25/31 reports)

Implicit Statements of Criteria for judging program quality

Page 10:

Examining the Program/Quality Criteria/Methodology.

• Implicit criteria reflect stakeholders’ desired outcomes.

• Desired outcomes influence methodological choices.

• Methodological choices influence the criteria used to judge program quality.

Implicit Statements of Criteria in Reports

Stakeholder Expectations → Methodology → Program Quality

Page 11:

How do Statements of Criteria relate to Methodology Used?

Methodology | Criteria (“For this program to be successful…”) | Stakeholder expectation

Interview/Focus Groups | Interviews and discussions with program recipients will demonstrate that these stakeholders have benefitted from the program in a certain manner. | Program recipients experience the intervention in a certain way.

Mixed Methods: Interview/Survey | The survey and interview data will display that the implementation of the program is congruent with the established goals of the program design. | Large-scale implementation of the intervention occurs in a certain manner.

RCT | The program will have a positive effect of at least a minimum level on the desired outcome. | The program will have an effect on a certain outcome.

Page 12:

Contact Information:

Matt Linick

[email protected]

Diane R. Fuselier-Thompson

[email protected]

Questions, Comments, or Praise?