Principles of Program Evaluation: A Workshop for the Southwest Virginia Professional Education Consortium
Christopher R. Gareis, Ed.D., The College of William & Mary, crgare@wm.edu


Page 1

Principles of Program Evaluation
A Workshop for the Southwest Virginia Professional Education Consortium

Christopher R. Gareis, Ed.D.
The College of William & Mary
crgare@wm.edu

Page 2

Pre-Session Understandings

Page 3

Perspective on Our Time Together Today

• The Profession of Education: The regular and expert exercise of learned judgment and skills in service of one of an individual's life needs and one of society's greater purposes.

• Zen and the Art of Professional Development

• The Principle of Collective Wisdom

Page 4

Let’s graph ‘em

Page 5

Shared Understandings

• What is a "program"? A planned, multi-faceted series of activities leading to intended outcomes for individuals, groups, and/or a community.

• What is "evaluation"? The use of valid and reliable information/data to make judgments.

Page 6

Definitions of “Program Evaluation”

• Judging the worth of a program. (Scriven, 1967)

• The process of systematically determining the quality of a school program and how the program can be improved. (Sanders, 2000)

• The deliberate process of making judgments and decisions about the effectiveness, direction, and value of educational programs in our schools. (me)

Page 7

DO YOU KNOW OF…?

• …a program that you thought was effective but was discontinued without explanation?

• …a program that you believe is likely a waste of time, money, and effort, and yet it continues?

Page 8

Random Acts of Improvement

Our Intentions

Our Efforts

Page 9

Aligned Acts of Improvement

Our Intentions

Our Efforts

Page 10

Perspective

Planning Evaluation

Page 11

The Role of Evaluative Thinking

Planning → Implementing → Redesigning, a cycle with evaluative thinking at its center

(Frechtling, 2007)

Page 12

• More than 20 major models of program evaluation exist.
• Seminal model: Daniel Stufflebeam's CIPP model (1971)

Our focus, the heart of evaluative thinking:
• Creating a logic model
• Focusing the evaluation
• Identifying performance indicators

Page 13

Our Objectives Today

1. Understand basic concepts and principles of program evaluation at the school level
2. Create a logic model that accurately depicts the essential components of and relationships among input, process, and outcome variables of a school-based educational program
3. Select the appropriate focus for the evaluation of a program
4. Identify appropriate data sources aligned to the intended educational outcomes of a selected school-level program
5. Appreciate the importance of engaging in intentional program evaluation as a teacher leader

Page 14

Talk About Your Program

1. What is your program?
2. What does your program do?
3. Who participates in your program?
4. Whom does your program serve?
5. How do they benefit?
6. How do you know, or how would you know, that your program is a success?

Page 15

Think Visually

• Sketch a visual metaphor of your program.

Page 16

Every educational program:
• Uses resources, or INPUTS, such as…
  – Materials
  – Funds
  – Teachers
  – Students
• Engages in activities, or PROCESSES, such as…
  – Instruction
  – Interventions
• Intends to produce certain results, or OUTCOMES, such as…
  – Learning
  – Achievement
  – Engendering of certain attitudes or values
  – Eligibility for a next step in life
  – Bringing about some change

Page 17

Visualizing an Educational Program: The Logic Model

Inputs → Processes → Outcomes

Page 18

What is a logic model?

A diagram with text that illustrates and describes the reasoned relationships among program elements and intended outcomes to be attained. A visual theory of change.

We use these resources… for these activities… so that these students/teachers… can produce these results… leading to these changes for the better.

Read toward the outcomes, the chain answers "Why?"; read back toward the resources, it answers "How?"

Modified from a guest lecture by John A. McLaughlin (November 16, 2009) at The College of William & Mary
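For readers who think in code, the chain structure of a logic model can be made concrete. Below is a minimal, hypothetical sketch in Python; the stage names follow the slides, while the class and method names are invented for illustration.

```python
# A minimal sketch of a logic model as a data structure.
# Stage names follow the slides; everything else is hypothetical.
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    program: str
    inputs: list[str] = field(default_factory=list)
    processes: list[str] = field(default_factory=list)
    initial_outcomes: list[str] = field(default_factory=list)
    intermediate_outcomes: list[str] = field(default_factory=list)
    ultimate_outcomes: list[str] = field(default_factory=list)

    def chain(self) -> str:
        """Render the model as a left-to-right chain of reasoned relationships."""
        stages = [self.inputs, self.processes, self.initial_outcomes,
                  self.intermediate_outcomes, self.ultimate_outcomes]
        return " -> ".join("; ".join(s) for s in stages if s)

# The deck's "Everyday Logic" example (Page 20), encoded:
feel_better = LogicModel(
    program="Treating allergy symptoms",
    inputs=["Get an antihistamine"],
    processes=["Take the antihistamine"],
    ultimate_outcomes=["Feel better"],
)
print(feel_better.chain())
# Get an antihistamine -> Take the antihistamine -> Feel better
```

Walking chain() toward the outcomes traces the "why"; walking it back toward the inputs traces the "how."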

Page 19

Simple Logic Model

Inputs → Processes → Initial Outcomes → Intermediate Outcomes → Ultimate Outcomes

Page 20

Everyday Logic

Get an antihistamine → Take the antihistamine → Feel better

Page 21

Try This

• You're hungry.
• Sketch a logic model to address your need.
• Oh, but one twist: You have no food in the house.

Key concept: Assumptions

Page 22

Greenlawn Middle School: Common Math Planning Initiative

Inputs:
• Core teachers: Math
• Resource teachers: Special Education, Gifted Education

Processes:
• Common planning by teachers (frequency, enthusiasm)

Initial Outcomes:
• Student affect for instruction
• Student engagement in instruction

Intermediate Outcomes:
• Student achievement: quarterly class grades in math

Ultimate Outcomes:
• Student achievement: 6th grade Math SOL, 7th grade Math SOL

Page 23

The Reach of Intended Outcomes

Time
• Initial Outcomes: Immediately or very closely following program activity
• Intermediate Outcomes: End of a term or year, typically
• Ultimate Outcomes: End of a year, several years, or beyond formal K-12 schooling

People
• Initial Outcomes: Targeted individual students and/or aggregate groups
• Intermediate Outcomes: Targeted individual students and/or aggregate groups
• Ultimate Outcomes: Targeted individual students and/or aggregate groups; may have an organizational or societal impact

Indicators
• Initial Outcomes: Discrete/finite; measurable and/or observable
• Intermediate Outcomes: May be sets of discrete indicators; measurable and/or observable
• Ultimate Outcomes: May be sets of discrete indicators; may or may not be readily measured or observed

Page 24

Now Try This

• The Great Recession has had a broad and, in many places, deep impact on Americans.
• One "solution" to our economic morass has been a call from policymakers for improved financial literacy of our citizens.
• K-12 schools are seen as the logical place to teach young citizens financial literacy.
• Sketch a logic model that depicts this "theory of change."

Key concept: Causality

Page 25

Source: STEM Education Symposium (retrieved 9/16/12).

Page 26

Page 27

A Simple Logic Model: 9th Grade Transition Program

Inputs:
• 9th grade teachers
• Guidance counselors
• Student mentors
• New students

Processes:
• Mentor training
• Summer orientation
• Mentor activities
• Milestone activities

Initial Outcomes:
• Sense of belonging of 9th graders

Intermediate Outcomes:
• Academic success of 9th graders
• Promotion to 10th grade

Ultimate Outcomes:
• Reduced dropout rates
• Graduation rate of 9th grade cohorts

Page 28

Page 29

Millett, S. M., & Zelman, S. T. (2005). Scenario analysis and a logic model of public education in Ohio. Strategy & Leadership, 33(2), 33-40.

Page 30

Source: SUNY NGLC Grant Proposal (retrieved 9/16/12)

Page 31

A Model for Improving Assessment Literacy

Context & Inputs:
• Experience & expertise of professional developers
• Experience & expertise of participants: pre-service teachers, in-service teachers, instructional leaders, administrators
• Psychometric principles, translated into practical terms & skills for teachers
• State standardized assessments (de facto curriculum?)
• State and/or school district curriculum
• District goals, initiatives, or imperatives

Processes of Professional Development:
• Explore alignment (C=Ia=A)
• Unpack curriculum: content, cognitive levels
• Create a Table of Specifications (ToS), or "blueprint"
• Critique an assessment for validity & reliability
• Explore uses of a ToS: create an assessment; critique & improve an assessment; create a unit plan assessment; plan instruction; analyze assessment results
• Create good assessment items, prompts, assignments, & rubrics

Outcomes for Teachers:
• Understand the role of C=Ia=A alignment
• Understand the relationship between assessment & evaluation
• Understand & apply concepts of validity & reliability
• Use a ToS to create an assessment, critique & improve an assessment, create a unit plan assessment, plan instruction, and analyze assessment results
• Apply principles to the spectrum of assessment types & practices
• Create assessments that yield valid & reliable data
• Use assessment data to make decisions: about student learning, for student learning, about assessments, about instruction
• Contribute to the design & development of common assessments
• Provide "opportunity to learn" through aligned instruction

Impact on Student Learning:
• Exposure to the full curriculum (e.g., depth & skill development)
• Increased student achievement
• Deeper, more meaningful learning

Page 32

TRY THIS

Task: Create a logic model for your program

Materials:
• Chart paper (in "landscape layout")
• Post-It notes
• Markers
• Answers to the earlier questions about your program

Page 33

Gallery Walk (with Docents)

• Can you read the logic model (without the docent's assistance)?
• What questions does the logic model raise for you?
• What feature(s) do you want to "borrow"?
• What suggestions would strengthen the logic model?

Page 34

Why bother with logic modeling?

• Seems like a lot of work.
• Logic models are too complex. How could I realistically ever know all the variables at play in a complex program?
• I don't "think" this way.
• Even if we created a logic model, what would we do with it?

Page 35

Limits of Logic Models

• They represent intentions, not reality.
• They focus on expected outcomes, which may mean missing beneficial unintended outcomes.
• Causal relationships are challenging to know.
• They do not address whether what we're doing is right.

Page 36

Benefits of Logic Modeling
McLaughlin, J. A., & Jordan, G. B. (1999). Logic models: A tool for telling your program's performance story. Evaluation and Program Planning, 22, 65-72.

1. Builds a shared understanding of the program and expectations for inputs, processes, and outcomes
2. Helpful for program design (e.g., identifying critical elements for goal attainment and plausible linkages among elements)
3. Points to a balanced set of key performance indicators

Page 37

Are you going to New York or by train?

Page 38

Your question IS your FOCUS

1. Implementation Fidelity
Question: "Are we implementing the program as designed?"
Focus: Inputs & Processes

2. Goal Attainment
Question: "Are we seeing evidence that we are achieving our intended outcomes?"
Focus: Initial, Intermediate, and/or Ultimate Outcomes

Page 39

ESL Dual-endorsement Program

Inputs:
• Approved teacher preparation programs (Elem/Sec/SPED program)
• MDLL TESOL/ESL courses
• VDOE regulations for ESL prep
• ESL summer school in LEAs

Processes:
• Recruitment of candidates
• Program approval
• Scheduling MDLL courses
• Advising candidates
• Locating field sites
• Arranging field experiences: orientation, transportation, supervision
• Coursework
• Field experiences

Outcomes:
• Dually-endorsed teachers
• Satisfaction with preparation
• Impact on student learning

Page 40

Once you have a logic model and performance indicators, what do you do?

1. Determine the intent (and "audience"):
  – Formative evaluation (e.g., implementation fidelity and improvement, with an assumption of continuation)
  – Summative evaluation (e.g., to continue or discontinue)
2. Articulate a limited number of relevant evaluation questions
3. Identify who is needed to conduct the evaluation
4. Determine how the evaluation will be conducted (time, money, data collection & analysis, compilation & reporting)

Where would Implementation Fidelity (IF) evaluation occur, and where would Goal Attainment (GA) evaluation occur?

Page 41

What is the focus of the program evaluation that would answer each of the following questions?

a. Do methods used by teachers at Smith High School correspond to the principles of block scheduling introduced last fall?

b. Did the new reading curriculum result in improved reading abilities among 2nd graders? Increased interest in reading?

c. How consistently are Instructional Teams developing standards-aligned units of instruction for each subject and grade level (IIA01)?

d. Which recruitment strategies (of the three that we used) were most effective in involving fathers in Early Head Start?

e. How effective was the character education program in our middle school?

f. Did our new guidance program help new ESL students transition socially to our school?

g. How many AVID teachers have we trained during the past three years?

Adapted from Fitzpatrick, Sanders, & Worthen. (2004). Program evaluation: Alternative approaches and practical guidelines (3rd ed.). Boston: Pearson (pp. 373-374).

i.e., "Implementation Fidelity" or "Goal Attainment"

Page 42

What Information Do You Use to Answer Your Questions?

Page 43

Examples of Assessment Sources in Schools

• Student attendance rates
• Teacher attendance
• PTA membership
• Achievement gap among subgroups
• Inequity of class enrollment (e.g., proportionally fewer minority students in AP classes)
• Special education referral rates
• Behavior referrals
• Reading levels
• SOL scores
• PALS scores
• DIBELS scores
• Benchmark test scores
• SOL recovery data
• SAT scores
• AP scores
• Surveys
• Class grades
• Grade distributions
• Algebra I enrollment / completion
• College acceptances
• Summer school rates
• Dropout rates
• Retention rates
• Acceleration rates
• Identification for gifted services
• Athletic championships
• Debate team competitions
• Student demographics
• Staff demographics (e.g., years of experience, licensure status)
• Family demographics (e.g., income, educational levels)
• Financial resources (budget)
• Per pupil costs

Page 44

What assessment sources could you use to gather relevant, accurate, dependable information/data to answer each question?

a. Do methods used by teachers at Smith High School correspond to the principles of block scheduling introduced last fall?

b. Did the new reading curriculum result in improved reading abilities among 2nd graders? Increased interest in reading?

c. How consistently are Instructional Teams developing standards-aligned units of instruction for each subject and grade level (IIA01)?

d. Which recruitment strategies (of the three that we used) were most effective in involving fathers in Early Head Start?

e. How effective was the character education program in our middle school?

f. Did our new guidance program help new ESL students transition socially to our school?

g. How many AVID teachers have we trained during the past three years?

Adapted from Fitzpatrick, Sanders, & Worthen. (2004). Program evaluation: Alternative approaches and practical guidelines (3rd ed.). Boston: Pearson (pp. 373-374).

Page 45

Avoid DRIP (Data Rich, Information Poor):
Align Your Data Sources to Your Logic Model

For each element of the logic model, identify the data source(s) you will use to assess attainment or completion (see the sketch below):
• Inputs
• Processes
• Initial Outcomes
• Long-term Outcomes
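As a quick sketch of this alignment check, the mapping from logic-model elements to data sources can be written out explicitly; any element left without a source is a gap in the evaluation plan. The elements follow the slide, but the sources below are hypothetical.

```python
# Hypothetical alignment of logic-model elements to data sources.
# An empty list flags an element we cannot yet assess (a DRIP risk).
alignment = {
    "Inputs": ["budget records", "staffing rosters"],
    "Processes": ["session attendance logs", "lesson plan reviews"],
    "Initial Outcomes": ["student engagement observations"],
    "Long-term Outcomes": [],  # no aligned data source yet
}

for element, sources in alignment.items():
    if sources:
        print(f"{element}: {', '.join(sources)}")
    else:
        print(f"{element}: NO ALIGNED DATA SOURCE -- gap in the evaluation plan")
```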

Page 46

Aligning Focus and Assessment Sources:
Process (P) or Outcome (O) Indicator?

☐ Using a classroom observation protocol to determine % of student engagement
☐ Advanced pass rate on high school end-of-course SOL test
☐ Questionnaire of 4th grade math students to determine degree of "mathphobia"
☐ Review of teacher-made lesson plans over the course of a 9-week period to determine % including explicit phonics instruction
☐ Graduation rates
☐ Number of "leveled" books available in the media center tracked over a 3-year period
☐ Survey of students' attitudes about participating in an after-school tutorial program (e.g., "On a scale of 1-to-5, how much did you enjoy attending the after-school program? How helpful was your after-school tutor?")
☐ 3rd grade SOL pass rates
☐ % enrollment of minority students in foreign language courses in the high school
☐ Implementation of an advisory period at the middle school level
☐ Change from a standard 7-period bell schedule to an alternating-day block schedule
☐ Average AP scores
☐ Student attendance rates
☐ Teacher attendance rates
☐ Review of committee meeting agendas to determine % that focus on discussion of achievement data
☐ Budget allocation per student
☐ Grade Point Averages

Page 47

AVID: Advancement Via Individual Determination

Page 48

(Retrieved 9/16/12)

Page 49

TRY THIS

Tasks:
1. Imagine undertaking a Goal Attainment evaluation of your program.
  a. Articulate at least 2 evaluation questions.
  b. Identify 1-2 performance indicators (aka data sources) that could help answer each question.
2. If time allows, try doing the same for an Implementation Fidelity evaluation.

Materials:
• Your logic model

Page 50

Sage Advice

Remember that not everything that matters can be measured, and not everything that can be measured matters.

Know your means from your ends.

Page 51

To make "evidence-based decisions," you must have data that are:

• Valid: necessary for drawing appropriate inferences
• Reliable: necessary for avoiding erroneous judgments
• Triangulated: necessary to confirm data
• Longitudinal: necessary to determine trends and patterns
• Disaggregated: necessary to provide "fine-grain" analysis, rather than over-generalizing

How do SOL results hold up as a data source?
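As an illustration of the "longitudinal" and "disaggregated" criteria, here is a hypothetical sketch using pandas: the same pass/fail records viewed first as a trend over years, then broken out by subgroup. The column names and numbers are invented, not actual SOL data.

```python
# Hypothetical pass/fail records: one row per student-test result.
import pandas as pd

records = pd.DataFrame({
    "year":     [2011, 2011, 2011, 2011, 2012, 2012, 2012, 2012],
    "subgroup": ["A",  "A",  "B",  "B",  "A",  "A",  "B",  "B"],
    "passed":   [1,    1,    1,    0,    1,    1,    0,    0],
})

# Longitudinal: pass rate by year reveals trends and patterns.
print(records.groupby("year")["passed"].mean())

# Disaggregated: pass rate by subgroup within each year gives the
# "fine-grain" view that a single overall pass rate over-generalizes.
print(records.groupby(["year", "subgroup"])["passed"].mean())
```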

Page 52

“Without evaluation, change is blind and must be taken on faith”

-- Sanders (2000)

Page 53

Managing Complex Change
Adapted from Knoster, T. (1991). Presentation at TASH Conference, Washington, DC.

Vision + Skills + Incentives + Resources + Action Plan = CHANGE
Skills + Incentives + Resources + Action Plan (no Vision) = CONFUSION
Vision + Incentives + Resources + Action Plan (no Skills) = ANXIETY
Vision + Skills + Resources + Action Plan (no Incentives) = RESISTANCE
Vision + Skills + Incentives + Action Plan (no Resources) = FRUSTRATION
Vision + Skills + Incentives + Resources (no Action Plan) = TREADMILL
Vision + Skills + Incentives + Resources + Action Plan + Evaluative Feedback = CHANGE for the better
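The matrix lends itself to a simple lookup: remove exactly one element and Knoster's model predicts a specific failure mode. A hypothetical encoding of the table above:

```python
# The Knoster matrix as a lookup table (an illustrative encoding,
# mirroring the reconstructed table above).
ELEMENTS = ("Vision", "Skills", "Incentives", "Resources", "Action Plan")
MISSING_TO_RESULT = {
    "Vision": "CONFUSION",
    "Skills": "ANXIETY",
    "Incentives": "RESISTANCE",
    "Resources": "FRUSTRATION",
    "Action Plan": "TREADMILL",
}

def predict(present: set[str]) -> str:
    """Predict the outcome of a change effort from the elements in place."""
    missing = [e for e in ELEMENTS if e not in present]
    if not missing:
        return "CHANGE"
    if len(missing) == 1:
        return MISSING_TO_RESULT[missing[0]]
    return "multiple gaps: " + ", ".join(MISSING_TO_RESULT[e] for e in missing)

print(predict(set(ELEMENTS)))                   # CHANGE
print(predict(set(ELEMENTS) - {"Incentives"}))  # RESISTANCE
```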

Page 54

Time to Evaluate

1. Understand basic concepts and principles of program evaluation at the school level
2. Create a logic model that accurately depicts the essential components of and relationships among input, process, and outcome variables of a school-based educational program
3. Select the appropriate focus for the evaluation of a program
4. Identify appropriate data sources aligned to the intended educational outcomes of a selected school-level program
5. Appreciate the importance of engaging in intentional program evaluation as a teacher leader

Page 55

Empirical Research ≠ Program Evaluation

How program evaluation differs from empirical research:
• Identifying focus: driven by organizational need
• Hypothesis testing: replaced by comparison of outcomes to intended program outcomes
• Value judgments: contextually driven
• Replication of results: low likelihood
• Data collection: broad and heavily dependent upon feasibility
• Control of variables: little to none
• Generalizability of results: little to none
• Focus: on dependent variables (i.e., results) more so than on independent variables (i.e., the cause)

Page 56

Investigating the Efficacy of a Clinical Faculty Program

Christopher R. Gareis, Ed.D.
Leslie W. Grant, Ph.D.

The College of William & Mary

Page 57

The W&M Clinical Faculty Program: A Logic Model

Context & Inputs:
• Virginia Standards for Mentoring/Clinical Faculty
• Partnership divisions/schools
• Experienced classroom teachers
• School of Education faculty/staff
• School of Education teacher preparation program competencies and expectations

Processes / Program Activity:
• Recruitment
• Application
• Selection
• 2 + 1 credits: 2-week summer course; 4 follow-up meetings (fall and spring)

Initial Outcomes:
• Knowledge, skills, and dispositions of mentoring: relating, assessing, coaching, guiding (Portner, 2003)
• Knowledge, skills, and dispositions of SOE teacher preparation program competencies and expectations
• Sense of efficacy for CF role (during and upon completion of training)

Intermediate Outcomes:
• Sense of efficacy for CF role (after ≥ 1 year as CF)
• Formative feedback for teacher candidates during student teaching
• Mid-term evaluations (CF/CT, TC, US)
• Teacher candidate competence by conclusion of student teaching: final evaluations (CF/CT, TC, US), administrator evaluations, portfolio evaluations
• Impact on student learning by teacher candidate
• Evaluation of field experience by teacher candidates

Long-term Outcomes:
• Evaluation of field experience by new teachers ≥ 1 year post-graduation
• Impact on student learning as new teachers (perceived)
• Employer satisfaction with new teacher performance
• Sense of self-efficacy for teaching (after ≥ 1 year post-graduation)
• Intent to continue in the profession (after ≥ 1 year post-graduation)

Ultimate Outcomes:
• Impact on student learning: classroom measures, school/division measures, state measures
• Growth of Clinical Faculty network

Page 58

What We Already Knew

Page 59

[The Clinical Faculty Program logic model (Page 57) is shown again.]

Page 60

Indicators of Improved Effectiveness of Recruitment, Selection, & Training

[Line chart, 1998-2010, 0-50 teachers: "Total Trained" vs. "Currently Active," annotated with "Targeted Recruiting," "Restructured Program," and "Level-specific cohorts begin."]

Page 61

Percentage of Placements with CT vs CF

Page 62

Indicators of Perceived Effectiveness of Training

Coherence between field and university:
• "I have had student teachers in the past with absolutely no guidance or direction from the college. I'm not sure how effective I was. This class has been so beneficial and I feel I am more confident in mentoring this time! Thank you!"

Skill development as a mentor:
• "I gained awareness in many of the philosophies behind coaching. As I learned techniques, I grew as a mentor."
• "This program is essential for bringing professionalism to the training of student teachers. Mentoring does not always come naturally and having techniques and strategies to address problems and guide students is invaluable."

Outcome orientation:
• "I am prepared to mentor aspiring teachers. I am prepared to help them become independent and effective teachers."

Professional growth as a teacher:
• "I always feel very good about what I'm doing when I leave class. I like having the professional growth and being able to take a good look at my teaching. I have really enjoyed it. What a great program."
• "I didn't realize how much I didn't know about being a CT."

Page 63

Overall Evaluation of Experiences in the Clinical Faculty Training

1=Poor, 5=Excellent

Page 64

What We Want to Know:
Cooperating Teachers vs. Clinical Faculty

Page 65

[The Clinical Faculty Program logic model (Page 57) is shown again.]

Page 66

Research Questions

1. To what degree do CF differ from CTs in their sense of self-efficacy for the roles of a cooperating teacher?
2. To what degree do mid-term evaluations of student teachers placed with CF differ from those placed with CTs?
3. To what degree do final evaluations of student teachers placed with CF differ from those placed with CTs?
4. To what degree do new teachers who had been placed with CF differ from those who had been placed with CTs with regard to sense of efficacy for teaching, perceived impact on student learning, and intent to remain in the profession?

Page 67

[The Clinical Faculty Program logic model (Page 57) is shown again.]

Page 68

Research Methods: Data Collection

Surveys
• Clinical Faculty & Cooperating Teachers (RQ1): 1998-2011; n = 101; response rate 37.0%
• Graduates of the School of Education, i.e., new teachers (RQ4): 2005-2010; n = 94; response rate 44.76%

Mid-term & Final Student Teaching Evaluations (RQ2 & RQ3): 2008-2011; n = 319
• Student teacher self-evaluations
• CT/CF evaluations
• University supervisor evaluations

Page 69

Research Methods: Data Analysis

Unit of analysis: cooperating teacher designation
• Clinical Faculty (CF): trained through the School of Education's Clinical Faculty Program
• Cooperating Teacher (CT): not trained through the School of Education's Clinical Faculty Program

Statistical analyses
• t-tests for significant differences according to CT/CF designation
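A sketch of what this analysis could look like in practice: independent-samples t-tests by CF/CT designation, run both with and without the equal-variance assumption, matching the two t-test columns in the efficacy table on Page 80. The ratings below are invented placeholders, not study data.

```python
# Hypothetical sketch of the t-test analysis by CF/CT designation.
from scipy import stats

cf_ratings = [4.6, 4.5, 4.7, 4.4, 4.6, 4.3]  # Clinical Faculty (trained)
ct_ratings = [4.0, 3.9, 4.2, 4.1, 3.8, 4.3]  # Cooperating Teachers (non-trained)

# Student's t-test (equal variances assumed)
t_eq, p_eq = stats.ttest_ind(cf_ratings, ct_ratings, equal_var=True)

# Welch's t-test (equal variances not assumed)
t_w, p_w = stats.ttest_ind(cf_ratings, ct_ratings, equal_var=False)

print(f"Equal variances assumed:     t = {t_eq:.2f}, p = {p_eq:.3f}")
print(f"Equal variances not assumed: t = {t_w:.2f}, p = {p_w:.3f}")
```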

Page 70

[The Clinical Faculty Program logic model (Page 57) is shown again, highlighting the outcomes addressed by RQ4.]

Page 71

Research Questions 2 & 3

2. To what degree do mid-term evaluations of student teachers placed with CF differ from those placed with CTs?

3. To what degree do final evaluations of student teachers placed with CF differ from those placed with CTs?

Page 72

[The Clinical Faculty Program logic model (Page 57) is shown again.]

Page 73

Statistically Significant Differences in Mid-Term Ratings (RQ2)

Competency | CF Rating | CT Rating
1. Understands subject matter and pedagogy… | 2.30 | 2.44
17. Implements assessments for learning | 2.20 | 2.38
26. Participates in and applies professional development | 2.41 | 2.60
30. Demonstrates potential for teacher leadership | 2.34 | 2.51

4 out of 30 competencies

Page 74

Differences in Mid-Term Ratings (RQ2)

[Line chart: CT vs. CF mean ratings across all 30 competencies, roughly 2.0-2.7 on the rating scale.]

Page 75

Statistically Significant Differences in Final Ratings (RQ3)

Competency | CF Rating | CT Rating
16. Creates and selects appropriate assessments for learning | 2.36 | 2.52

1 out of 30 competencies

Page 76

Differences in Final Ratings (RQ3)

[Line chart: CT vs. CF mean ratings across all 30 competencies, roughly 2.0-2.9 on the rating scale.]

Page 77

"Overall Teaching Effectiveness" Ratings (RQ2 & RQ3)

Rater | Mid-Term CF | Mid-Term CT | Final CF | Final CT
Student teacher self-evaluation | 2.10 | 2.12 | 2.37 | 2.48
Cooperating teacher evaluation | 2.29 | 2.39 | 2.63 | 2.69
University supervisor evaluation | 2.10 | 2.07 | 2.69 | 2.64

Page 78

Overall Rating Comparison

[Chart comparing overall ratings given by student teachers, CF/CT, and university supervisors.]

Page 79

Research Question 1 (RQ1)

1. To what degree do CF differ from CTs in their sense of self-efficacy for the roles of a cooperating teacher?

Page 80

Sense of Efficacy for the Role of CF/CT (RQ1)

Question | Clinical Faculty (trained), n=76 | Cooperating Teachers (non-trained), n=25 | t-test (equal variances assumed) | t-test (equal variances not assumed)
Convey role as CF/CT* | 4.51 | 4.04 | 0.007 | 0.038
Four fundamental roles of mentoring* | 4.24 | 3.65 | 0.002 | 0.015
Foster relationship | 4.57 | 4.62 | 0.768 | 0.772
Effectively observe* | 4.59 | 3.92 | 0.000 | 0.001
Use a variety of supervision strategies** | 4.51 | 4.19 | 0.066 | 0.111
Effectively conference | 4.47 | 4.23 | 0.169 | 0.198
Summatively evaluate ST* | 4.55 | 4.04 | 0.001 | 0.006
Impact professional abilities of ST | 4.49 | 4.38 | 0.423 | 0.447
Provide high-quality field experience | 4.57 | 4.54 | 0.784 | 0.772
Likelihood ST positively affects pupil learning | 4.37 | 4.38 | 0.941 | 0.939
Likelihood ST enters teaching | 4.37 | 4.31 | 0.663 | 0.687
Likelihood ST remains in teaching | 3.99 | 4.04 | 0.774 | 0.796
Likelihood ST positively impacts pupil learning in first year | 4.41 | 4.31 | 0.454 | 0.510
Likelihood ST emerges as a teacher leader | 4.08 | 4.00 | 0.662 | 0.686

* significance < .05
** significance < .10

Page 81

To what experiences do CF and CT attribute their mentoring acumen?

Group | Experience | % Indicating Most Important Experience
Cooperating Teachers | Experience as a classroom teacher | 28.0%
Clinical Faculty | Clinical Faculty training* | 38.7%

* 20% of CF selected "experience as a classroom teacher" as most important.

Page 82

[The Clinical Faculty Program logic model (Page 57) is shown again, highlighting the outcomes addressed by RQ1, RQ2, and RQ3.]

Page 83

What could explain…

• …differences in student teaching outcomes associated with Clinical Faculty?
• …the lack of differences between CF and CTs in both intermediate and long-term outcomes?

Page 84

[The Clinical Faculty Program logic model (Page 57) is shown again.]

Page 85

Should we continue, discontinue, or change the W&M Clinical Faculty Program?

Page 86

[The Clinical Faculty Program logic model (Page 57) is shown again.]

Page 87

Program Evaluation in Use

Berkeley Middle School: SOL Jam & Cram

Spring 1999

Page 88

SOL Jam & Cram: Logic Model

Inputs:
• Volunteer teachers (but paid)
• Selected students: C or B average; LPT 140-160; 7th grade SOL 350-400

Processes:
• Analyzed SOL results and class performance
• Invited 175 "in the middle" students (motivational "recruitment")
• Analyzed SOL test blueprints
• Targeted specific SOLs for focused review
• Collaboratively planned lessons
• Lunch & door prizes provided

Outputs:
• Review & enrichment activities
• High-activity, novel instructional strategies
• 8 sessions (1/2 hour each)
• Timely (week prior to SOLs)

Initial Outcomes:
• Sense of "efficacy" for SOL success (survey of students)

Intermediate Outcomes:
• SOL assessment results

Ultimate Outcomes:
• Academically prepared for success in high school
• Independent thinker
• Responsible citizen
• Lifelong learner

Page 89

Why Did Students Attend?

7th | 8th | Reason
50% | 32% | It is important to me to do my best in school.
33% | 65% | I want to avoid having to go to summer school.
10% | 3% | My parents made me sign up.
7% | -- | Some of my friends signed up, so I did, too.

Page 90

Why Did Students NOT Attend?

7th | 8th | Reason
4% | -- | I do not enjoy learning.
-- | 6% | I already know that I will have to attend summer school.
4% | -- | I don't believe that Jam & Cram would help me on any of the SOL tests.
4% | 6% | I do not feel that I fit into the group of students who were selected to attend.
52% | 56% | I have a conflict on the day of Jam & Cram, so I wouldn't be able to attend.
26% | 25% | I forgot to return the slip on time.
9% | -- | I can study fine on my own.
-- | 6% | I didn't want to go. ("It was really stupid, dumb.")
-- | 3% | I don't like school.

Page 91

Post-Survey

Participants
To what degree do you agree or disagree with the following statement: "Jam & Cram helped me to be better prepared for the SOL tests." [4-pt. scale]
• 7th grade: 3.68
• 8th grade: 3.25

Non-Participants
7th | 8th | Response
35% | 22% | I wish I had attended.
45% | 56% | I'm glad that I didn't attend because I think I did well anyway.
21% | 22% | Other (mostly explanations of why they didn't attend)

Page 92

Percentage of Students Passing SOL Tests, Berkeley Middle School, Grade 8

[Bar chart: pass rates (0-100%) for 1998-99 vs. 1999-00 on the CT-08, EngRLR-08, EngW-08, HSS-08, Math-08, and Sci-08 tests; HSS-08 was taken in 7th grade.]

Page 93

[Bar chart comparing Jam & Cram teams to matched control groups: Panthers (2.75 GPA) vs. Control-P (2.67 GPA); Jaguars (2.74 GPA) vs. Control-J (2.67 GPA).]

Page 94

[Bar chart of history pass percentages: Chows (2.69 GPA) vs. Control-CC (2.47 GPA); Greyhounds (3.13 GPA) vs. Control-G (3.03 GPA).]

Page 95

An Analysis of SOL Data: What We Can Learn

1. We did great last year, with gains of 11-17 percentage points in all areas!
2. Science is our strongest area: highest pass rate, highest advanced pass rate, and relatively low disparity.
3. Writing is a strong area: high pass rate and relatively low disparity…but very few advanced passes.
4. English/Reading is a relatively strong area with a high pass rate and a very high advanced pass rate.
5. Computer Technology is a consistently strong area…but has the second-highest disparity.
6. Math 8 is conquerable…but the disparity is wide and our overall pass rate is still in shaky territory.
7. History/Social Studies remains our greatest challenge in terms of overall pass rate and disparity by both ethnicity and gender…BUT we had whopping gains last year!
8. Our year 2002 class (last year's 6th grade) is academically strong. Our 2003 class (this year's 6th grade) is even stronger!
9. Jam & Cram was a successful strategy by all anecdotal accounts, but it was not effective statistically speaking. We may want to repeat the program, but with definitive subgroups of students and not in a single cram session.
10. SOL Resource class appears to have been a success based on pass rates alone (nearly 100%), although it has not been compared to control data.

Page 96