Evaluation Planning II: Setting Boundaries and Analyzing the Evaluation Context
Dr. Suzan Ayers, Western Michigan University
(courtesy of Dr. Mary Schutten)


Four Considerations

Identifying evaluation audiences
Setting boundaries on whatever is evaluated
Analyzing evaluation resources
Analyzing the political context

1. Audience Identification

An evaluation is adequate only if it collects information from, and reports to, all legitimate evaluation audiences
Primary audience: sponsor and client
Secondary audiences: depend on how the evaluator defines constituents
It is common to limit the evaluation to too narrow an audience (see Figure 11.1, p. 202)
Return to the list of audiences periodically
Who will use the results, and how, is key to outlining the study

Potential Secondary Audiences

Policy makers
Managers
Program funders
Representatives of program employees
Community members
Students and their parents (or other program clients)
Retirees
Representatives of influence groups

2. Setting the Boundaries

Starting point: a detailed description of the program being evaluated
Program description: covers the critical elements of the program (goals, objectives, activities, target audiences, physical setting, context, personnel)
The description needs to be thorough enough to convey the program's essence

Characterizing the Evaluand

What problem was the program designed to correct?
Of what does the program consist?
What is the program's setting and context?
Who participates in the program?
What is the program's history? Duration?

When and under what conditions is the program implemented?
Are there unique contextual events (contract negotiations, budget, elections…) that may distort the evaluation?
What resources (human, materials, time) are consumed by the program?
Has there been a previous evaluation?

Program Theory

Specification of what must be done to achieve the desired goals, what other important impacts may be anticipated, and how these goals and impacts would be generated (Chen, 1990)
Serves as a tool for:
  Understanding the program
  Guiding the evaluation
Evaluators must understand the assumptions that link the problem to be resolved with the program's actions and characteristics, and that link those actions and characteristics with the desired outcomes

Helpful in developing program theory (Rossi, 1971):

1. Causal hypothesis: links the problem to a cause
2. Intervention hypothesis: links program actions to that cause
3. Action hypothesis: links the program activities with reduction of the original problem

Sample problem: declining fitness levels in children (one hypothetical way to fill these in is sketched below)
Causal hypothesis?
Intervention hypothesis?
Action hypothesis?
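The slide leaves the three hypotheses as open questions. As a rough sketch only, here is one hypothetical way the linked hypotheses for the sample problem could be written down and kept together; the specific cause, intervention, and outcome statements are illustrative assumptions, not content from the slides.

```python
# Illustrative sketch: Rossi's (1971) three linked hypotheses captured as a small
# data structure, filled in with hypothetical answers to the sample problem.
from dataclasses import dataclass


@dataclass
class ProgramTheory:
    problem: str
    causal_hypothesis: str        # links the problem to a cause
    intervention_hypothesis: str  # links program actions to that cause
    action_hypothesis: str        # links program activities to reduction of the problem


fitness_example = ProgramTheory(
    problem="Declining fitness levels in children",
    causal_hypothesis="Fitness is declining because children's daily physical activity has decreased",
    intervention_hypothesis="A structured in-school activity program increases children's daily physical activity",
    action_hypothesis="The added activity produced by the program raises children's fitness levels",
)

for name, value in vars(fitness_example).items():
    print(f"{name}: {value}")
```

Writing the chain out this way makes it easy to check that each hypothesis actually connects to the next before the evaluation is designed around it.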

Methods for Describing the Evaluand

Descriptive documents
  Program documents, proposals for funding, publications, minutes of meetings, etc.
Interviews
  Stakeholders, all relevant audiences
Observations
  Observe the program in action; get a "feel" for what really is going on
  Observations often reveal the difference between how the program runs and how it is supposed to run

Challenge of balancing different perspectives
  Minor differences may reflect stakeholder values or positions and can be informative
  Major differences require that the evaluator attempt to achieve some consensus description of the program before initiating the evaluation
Redescribing the evaluand as it changes
  Changes may be due to:
    Responsiveness to feedback
    Implementation not quite aligned with the designers' vision
    Natural historical evolution of an evaluand

3. Analyzing Evaluation Resources: $

Cost-free evaluation: cost savings realized via the evaluation may pay for the evaluation over time (see the sketch after this list)
If budget limits are set before the evaluation process begins, they will affect the planning decisions that follow
Often the evaluator has no input into the budget
Offer 2-3 levels of service (Chevy vs. BMW)
Budgets should remain somewhat flexible to allow the evaluation to focus on new insights that emerge during the process
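To make the "cost-free evaluation" idea concrete, a back-of-the-envelope calculation with entirely hypothetical figures might look like this; the numbers are assumptions for illustration, not from the slides.

```python
# Hypothetical break-even arithmetic for a "cost-free" evaluation:
# recurring savings identified by the evaluation eventually repay its one-time cost.
evaluation_cost = 30_000            # one-time cost of the evaluation ($), assumed
annual_savings_identified = 12_000  # recurring savings traced to its findings ($/yr), assumed

years_to_break_even = evaluation_cost / annual_savings_identified
print(f"The evaluation pays for itself in about {years_to_break_even:.1f} years")  # ~2.5 years
```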

Analyzing Resources: Personnel

Can the evaluator use 'free' staff on site?
  Program staff could collect data
  Secretaries could type and search records
  Graduate students could contribute through internships or course-related work
  PTA members could volunteer
It is key that the evaluator ORIENT, TRAIN, and QC (quality-check) such volunteers to maintain the evaluation's integrity
  Supervision and spot-checking are useful practices
  Task selection is essential to maintain the study's validity/credibility

Analyzing Resources: Technology, Others, Constraints

The more information that must be generated by the evaluator, the costlier the evaluation
Are existing data, records, evaluations, and other documents available?
Using newer technology, less expensive means of data collection can be employed
  Web-based surveys, e-mails, conference calls, posting final reports on websites
Time (avoid setting unrealistic timelines)

4. Analyzing the Political Context

Politics begin with the decision to evaluate and influence the entire evaluation process
  Who stands to gain/lose the most from different evaluation scenarios?
  Who has the power in this setting?
  How is the evaluator expected to relate to different groups?
  From which stakeholders will cooperation be required? Are they willing to cooperate?
  Who has a vested interest in the outcomes?
  Who will need to be informed along the way?
  What safeguards need to be formalized (e.g., IRB approval)?

Variations Caused by the Evaluation Approach Used

Variations in the evaluation plan will occur based on the approach the evaluator takes
Each approach has strengths and limitations
Review Table 9.1 for the characteristics of each approach
Relying on a single approach tends to be limiting

To Proceed or Not?

Based on information about the context, program, stakeholders, and resources, decide 'go/no-go'
Chapter 10 describes conditions under which evaluation is inappropriate:
  The evaluation would produce trivial information
  The evaluation results will not be used
  The evaluation cannot yield useful, valid information
  The evaluation is premature for the stage of the program
  The motives for the evaluation are improper
  Ethical considerations (utility, feasibility, propriety, accuracy)