
Program Measure Review – Considerations for the APR

Jennifer Coffey, PhD, OSEP Program Lead

“Continuous improvement is better than delayed perfection.” – Mark Twain

Roll Call

Mute: *6   Unmute: #6


SPDG National Meeting Follow-up

Resources and materials found at:
  Website: http://signetwork.org/content_pages/239-3rd-annual-spdg-national-meeting
  Dropbox folder: https://www.dropbox.com/home/SPDG%20National%20Meeting_Nov2013
  Archived presentation recordings:
    Allison Metz's Use of Data presentation
    Jennifer's Program Measure presentation


The External Evaluation

Pilot year – next year will be baseline
The Data Quality Initiative (Westat):
  2 reviewers evaluated APRs and the procedures/guidance provided to projects
OMB review
Overall we are doing well:
  Meaningful measures
  Some concern about Program Measure 2
Need to hear from you about how we can help


Directors’ Webinars Schedule:

Apr 3      Organization Driver: Use of Data; Program Measure Exemplars
May 1      Family Engagement
Jun 1      Organizational Driver: Facilitated Administration & Systems (Dean Fixsen)
Jul 21     Project Directors’ Conference: Program Area Meeting (DC)
Sep 4      Leadership Driver
Oct 21–23  SPDG National Meeting (DC)


Rubric B


Considerations for your APR writing

Match numbers (e.g., targets) in different sections of your APR
Give names of all fidelity measures (status chart & description)
Describe it as a fidelity measure (e.g., “---- measure assesses the presence or absence of the core components of ---- intervention”)
Describe the 20% reliability check by an outside observer in the “Explanation of Progress” section (after the status chart); a sampling sketch follows this list
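One way to operationalize the 20% check is to draw a random subsample of the year's fidelity observations for independent re-rating by the outside observer. A minimal sketch, assuming the observations are simply listed by name; the list, seed, and variable names here are illustrative, not OSEP guidance:

    import random

    # Hypothetical list of scored fidelity observations for the reporting year.
    observations = [f"obs_{i:02d}" for i in range(1, 31)]  # 30 observations

    random.seed(2014)  # fix the seed so the sample can be reproduced for the APR
    k = max(1, round(0.20 * len(observations)))  # 20% of observations, at least one
    reliability_sample = random.sample(observations, k)

    print(f"Outside observer re-rates {k} of {len(observations)} observations:")
    print(sorted(reliability_sample))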


Further considerations…

Choose (working with your PO) 1 fidelity measure to follow for each initiative
Create the target for that fidelity measure
Follow each initiative separately with its own data


Program Measure 2 Exemplar

North Carolina’s APR


Things not to worry about

For program measures 3 and 4 – having an exact dollar/participant target. The target percentage is critical, however.


Guidance for each measure


Summary of the numbers: Program Measure 1

                           Met Target
                           Yes   No
Year 2 (13 initiatives)      4    9
Year 3 (13 initiatives)      6    7
Year 4 (7 initiatives)       5    2
Total (33 initiatives)      15   18
%                          45%  55%
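The Total and % rows follow directly from the yearly counts. A minimal sketch of that arithmetic, with the counts copied from the table above:

    # (met, missed) target counts for Program Measure 1, by project year
    counts = {"Year 2": (4, 9), "Year 3": (6, 7), "Year 4": (5, 2)}

    met = sum(yes for yes, no in counts.values())    # 15
    missed = sum(no for yes, no in counts.values())  # 18
    total = met + missed                             # 33 initiatives

    print(f"Met target:    {met}/{total} = {met / total:.0%}")       # 45%
    print(f"Missed target: {missed}/{total} = {missed / total:.0%}") # 55%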


Measure 2

                          Met Target
                          Yes   No
Year 2 (2 initiatives)      1    1
Year 3 (5 initiatives)      4    1
Year 4 (5 initiatives)      1    4
Total (12 initiatives)      6    6
%                         50%  50%


Measure 3

                          ---------- Project Costs ----------      Met Target
                          Cost for TA  Cost for all PD  % for TA   Yes   No
Year 2 (8 initiatives)     $2,057,004       $2,791,357     74%       6    2
Year 3 (10 initiatives)    $3,010,015       $4,078,198     74%      10    0
Year 4 (7 initiatives)     $1,511,883       $1,808,396     84%       6    1
Total (25 initiatives)     $6,578,902       $8,677,951     76%      22    3
%                                                                  88%  12%
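The “% for TA” column is just the TA cost divided by the cost of all professional development. A minimal sketch using the Year 2 and Total rows above (the helper name is illustrative):

    def ta_share(cost_ta, cost_all_pd):
        """Fraction of all professional development spending devoted to TA."""
        return cost_ta / cost_all_pd

    print(f"Year 2: {ta_share(2_057_004, 2_791_357):.0%}")  # 74%
    print(f"Total:  {ta_share(6_578_902, 8_677_951):.0%}")  # 76%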


Inter-rater reliability: Measure 1

When the 2 raters differed, they used one of several methods to determine a final rating:
1. They identified characteristics of the description that were similar to characteristics of descriptions that they rated previously, and gave it the same rating as the previously rated descriptions.
2. They each identified description elements that influenced the rating (e.g., identified information that was lacking from the description, identified critical components that were included in the description) and came to agreement on the most appropriate rating.
3. They identified description elements that were essential to the PD component and came to an agreement on how to rate descriptions that were missing one or more of the critical elements.
4. They reviewed supporting documentation cited in the rubric and discussed key aspects of the PD component that should be included in the grantee’s description, and came to agreement on the most appropriate rating.
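The slide describes how the two raters resolved disagreements, not how agreement was scored. Purely as an illustration (not the evaluators' documented procedure), here is a sketch that computes simple percent agreement and flags the descriptions that would go to one of the four consensus methods above; the ratings and the 1–4 scale are hypothetical:

    # Hypothetical ratings of the same six descriptions by two raters
    # (scale assumed: 1 = inadequate ... 4 = exemplary).
    rater_a = [3, 2, 4, 1, 3, 2]
    rater_b = [3, 2, 3, 1, 3, 1]

    pairs = list(zip(rater_a, rater_b))
    agreement = sum(a == b for a, b in pairs) / len(pairs)
    print(f"Percent agreement: {agreement:.0%}")  # 67%

    # Disagreements are resolved by consensus (methods 1-4 above).
    to_reconcile = [i for i, (a, b) in enumerate(pairs) if a != b]
    print(f"Descriptions needing a consensus rating: {to_reconcile}")  # [2, 5]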


Next steps

The external evaluators will modify Rubric A (Program Measure 1 template/guidance):
  DQI recommends specifying where to find information relevant to each component (e.g., insert a footnote with a link to the NIRN website for information specific to expectations for trainers for domain A(2), and insert a different NIRN link for information specific to adult learning principles for domain B(2)).
  DQI also recommends refining descriptions of the domains and adding information about the components, particularly when a substantial number of descriptions received ratings of inadequate or barely adequate, to improve the quality of descriptions grantees provide on professional development components.


Next steps

Learn from SPDGs that earned good ratings for their program measures:
  April webinar (Measures 1 & 2)
  Evaluator Q & A session
Feedback from you via email or phone call (jennifer.coffey@ed.gov; 202-245-6673)
