
Performance Accountability

Improving Data Accuracy and Reporting

Washington State Web-Ex, August 22, 2014

Objective

Mutual understanding of data collection, entry, and reporting accountability from local areas to the state to DOL.

Encourage discussion of data collection and reporting requirements, procedures, and guidance.

Establish open forums for communication and technical assistance.

Background

Oversight agencies such as GAO and OIG have cited data quality issues with ETA's data (2002).

Guidance is issued annually containing report submission deadlines and source documentation requirements.
◦ Guidance for PY12 included TEN 4-13 (8/28/13) and TEGL 28-11 for PY11/FY12 Reporting and Data Validation.

OIG Audit of Federal Monitoring

OIG conducted a follow-up audit in 2008.
◦ One of five audit questions: Does ETA have an effective monitoring process?

Policies/Procedures and Training

Data management and the resultant quality of reported data are derived from and influenced by the policies, procedures, and protocols utilized at the state and/or local levels.

Grantees should develop guidance for staff and sub-grantees involved in the collection of data:

Definitions of data elements
Sources of information
Participant record and documentation requirements
Procedures for collecting, entering, and reporting data, and associated "business rules" that cover timeliness and completeness
Procedures for entering data into an automated database
Procedures for correcting data

Training and Monitoring

Data collection and data entry:

◦ Routine training should be provided on data management guidance

◦ All staff involved in the collection or entry of data should be trained in the procedures

◦ The data entry process should include steps for verifying entered data against original sources, either on a sample basis or for the entire population of records (see the sketch below)
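As a rough illustration of sample-based verification, here is a minimal Python sketch that draws a reproducible random sample of MIS record IDs to check against original source documents. The record IDs and the 5% sample rate are hypothetical assumptions, not program requirements.

```python
# Minimal sketch: draw a reproducible random sample of MIS records to
# verify against original source documents. IDs and rate are illustrative.
import random

def draw_verification_sample(record_ids, sample_rate=0.05, seed=2014):
    """Return a sorted random sample of record IDs for source verification."""
    rng = random.Random(seed)  # fixed seed keeps the sample repeatable for audit
    sample_size = max(1, int(len(record_ids) * sample_rate))
    return sorted(rng.sample(record_ids, sample_size))

# Example: verify 5% of a quarter's records (IDs are hypothetical).
quarterly_records = list(range(1000, 1600))
print(draw_verification_sample(quarterly_records))
```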

Myth: Data Management is Purely Technical

Reality: In the end, data management is really about providing the best possible services

◦ Reporting and validation are there to support effective service provision

Accurate reporting is a requirement, but it's also a tool

◦ Staff should develop a holistic understanding of both the data and the real participants and services the data represent

Myth: Data Management is Easy

Reality: Data management is HARD!

◦ Business rules are complex and multi-layered

◦ Data sets are large and hard to visualize

◦ Specs are complex and evolving

◦ Circumstances change

…and sometimes the closer you look, the less clear things become, which is another challenge

What's Involved in Analyzing Results?

• Review reports summarizing performance on the common measures, other federal measures, and/or project-specific measures

• Talk to staff and others with first-hand knowledge of the program and its operation

• Generate questions related to the logic of the program design and current environment

• Develop a list of performance issues

Fishbone Diagram

Challenge: Low Performance on the Attainment of a Diploma or Certificate Rate

[Fishbone diagram linking the challenge to potential contributing factors, each tied to a data source: increased number of youth with multiple barriers; certificates not recorded; high training drop-out rate; late project start-up; assignment to the wrong services; poor quality services. Data sources cited include the MIS (participant characteristics, participant file documentation, assessments, reasons for dropping out) and monitoring reports.]

Performance Accountability

WIA (Workforce Investment Act), effective through PY15:
'Services'-based participation and exit
Data validation required
Reporting cohort primarily 1st to 3rd quarter after exit
Nine common measures
Reporting participant information
Sequence of services: core, intensive, training

WIOA (Workforce Innovation and Opportunity Act), effective PY16 (unless the state is an 'early implementer'):
'Services'-based participation and exit
Data validation codified
Reporting cohort extended to the 2nd to 4th quarter after exit
Twelve primary indicators of performance
Expanded reporting participant information
'Career services' and training

Reporting Requirements

Common Measures
◦ Aggregate counts

Individual Records
◦ Demographics
◦ Outcomes
◦ Services and activities: types and dates

Reporting "Most Recent" Activities

Most Recent Date Received Staff-Assisted Services
Most Recent Date Received Intensive Services
Most Recent Date Received Rapid Response Services
Most Recent Date Received Educational Achievement Services
Most Recent Date Participated in Alternative School
Most Recent Date Participated in Work Experience
Most Recent Date Received Leadership Development Opportunities

Participant

• An individual determined eligible to participate in the program who receives a service funded by the program, either in a physical location (e.g., a One-Stop Center) or remotely through electronic technologies.

• Three components:
1. Determined eligible to participate in the program
2. Receives a funded service
3. In either a physical location or through electronic technologies

Components of Participant

1. Individual determined eligible to participate
• Depends on program/funding; doesn't apply in the case of W-P, which is based on universal access

2. Receives a service
• Not all services trigger participation; it's important to understand the distinction between those that do and those that don't

3. In a physical location or remotely
• Many substantial services are remotely accessed; this needs to be captured

Multiple Program Participation

Counting participants in multiple programs
Participation begins on the earliest date of service
An individual can participate in several programs simultaneously
◦ Counted as a participant in each of those programs (see the sketch below)
◦ The participant won't exit from the program unless there is a gap of no service for 90 days
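A minimal sketch of the counting rule above, assuming hypothetical service records with program and service_date fields: the same individual yields one participation record per program, keyed to the earliest date of service in that program.

```python
# Minimal sketch: one individual counted as a participant in each program
# served, with the participation date set to the earliest service date per
# program. Field names and dates are hypothetical.
from collections import defaultdict
from datetime import date

services = [
    {"program": "WIA Adult",     "service_date": date(2014, 3, 10)},
    {"program": "WIA Adult",     "service_date": date(2014, 4, 2)},
    {"program": "Wagner-Peyser", "service_date": date(2014, 3, 15)},
]

dates_by_program = defaultdict(list)
for s in services:
    dates_by_program[s["program"]].append(s["service_date"])

# One participation record per program, dated to the earliest service.
for program, dates in dates_by_program.items():
    print(program, "participation date:", min(dates))
```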

A Service Is:

Any core, intensive, or training activity made available to eligible participants that allows them to benefit from specific programs in the workforce system.

Services that Do Not Begin or Extend Participation

Eligibility determination

Case management administrative activities to obtain information regarding employment status, educational progress, need for additional services, etc.

Income maintenance or support payments

Visits to One-Stop Centers, etc., for reasons other than their intended purposes

Follow-up services

Participation Cycles and Dates of Service

Although there are clear issues around exit, there are also issues around participation cycles and dates of service in general:

◦ Service provision prior to formal participation

◦ Staff unclear about which services commence participation

◦ Dates of service inconsistent between the file and the MIS, within the MIS, within the file, and within documents

When Is a Service Included in Performance?

Eligible participants who receive core, intensive, or training services that require significant staff involvement, and who exit the program:
◦ These individuals are included in the performance measures

◦ WIA Adult and DW program participants who receive only self-service or informational activities are excluded from performance

Date of Exit

The participant has not received a service funded by the program or by a partner program for 90 consecutive calendar days.

Exiter

• A participant who hasn't received a program- or partner-funded service for 90 consecutive days and has no future services scheduled

• Three components:
1. Hasn't received a service
2. For 90 consecutive days
3. No future services scheduled

Components of Exiter

1. The participant hasn't received a service
• Could be program- or partner-funded

2. For 90 consecutive calendar days
• A gap in service can stop the 90-day clock if based on specific/allowable circumstances

3. No future services scheduled
• Specific services and activities as allowable
• Does not include any follow-up services or circumstances where the participant voluntarily withdraws or drops out of the program

Extending the Exit Date

• Services provided by partner programs can extend the point of exit

• Services provided to the participant during the initial days following the end of activities, prior to exit, can also extend it

• Excluding follow-up services

When To Exit

• Services and Activities should be closed when the service plan or service strategy is complete

◦ The service plan is a "living document," with additions and changes possible

◦ Co-enrollment in different funding streams, additional partner services, and a valid gap in service can extend the exit date

Illustration: Participation and Exit

[Timeline: Participation Date (eligible and receives a service) → participation → Last Service → 90 days of no services or only follow-up services → End of 90-Day Period. The Exit Date is the date of the last service.]

Further Clarification of DATES

• Participation and exit dates are always dates of service
• The participation date reflects the first funded service
• The exit date reflects the last funded service

• Translation: no more 'hard exit'
• Not intended to take responsibility away from case managers; case managers do not have to wait 90 days to begin providing follow-up services
• Although federal guidance states that an exit cannot be officially recorded until the 90 days have elapsed, it is possible to use a 'case closure' MIS code or an 'exit' form, for example (see the sketch below)
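To make the date rules concrete, here is a minimal Python sketch using hypothetical dates: the participation date is the first funded service, the exit date is the last funded service, and the exit is only recorded once 90 consecutive calendar days have passed with no program- or partner-funded service.

```python
# Minimal sketch of the date rules above; dates are hypothetical.
from datetime import date, timedelta

def participation_and_exit(service_dates, as_of):
    """Return (participation_date, exit_date). exit_date is None while the
    participant is still inside the 90-day no-service window."""
    ordered = sorted(service_dates)
    participation_date = ordered[0]   # first funded service
    last_service = ordered[-1]        # candidate exit date
    if as_of - last_service >= timedelta(days=90):
        return participation_date, last_service
    return participation_date, None

services = [date(2014, 1, 6), date(2014, 2, 14), date(2014, 3, 3)]
print(participation_and_exit(services, as_of=date(2014, 8, 22)))
# -> (datetime.date(2014, 1, 6), datetime.date(2014, 3, 3)): exit recorded,
#    dated to the last service, not to the end of the 90-day window.
```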

Non-Compliance with EXIT Requirements

Exit dates not reflective of dates of last service

'Case management' used to extend exit date

Hard exits utilized
◦ Date of last contact = exit date
◦ Date of employment = exit date

Services provided within 90 days

Lack of common exit date (across core workforce programs)

Exit dates not consistent with dates in MIS

Follow Up Services/Retention

Do NOT extend participation

Twelve months: required for youth participants; available for Adult and DW

Post-employment services to ensure:
◦ Entered employment
◦ Employment retention
◦ Earnings
◦ Career progress

Follow Up Services

Follow-up begins after the expected last service

Youth are required to receive at least 12 months of follow-up services, which are triggered at exit (the only exclusion is for summer youth employment)

Not intended to take responsibility away from case managers for WIA. Case managers do not have to wait 90 days, for instance, to begin providing follow-up services.

Source Documentation

Whether scanned, paper, or a system cross-match, the purpose of source documentation is to have an auditable trail that documents the participant, services delivered, and outcomes received.

Discussion

Thoughts? Observations?

WIOA Performance Accountability Overview

Core Programs' Performance Measures (except WIOA Youth)

1. Entered Employment
◦ 2nd quarter after exit

2. Employment Retention
◦ 4th quarter after exit

3. Earnings
◦ Median earnings in the 2nd quarter after exit (see the sketch below)

4. Credential Rate
◦ New; up to one year after exit; doesn't apply to WP

5. In-Program Skills Gain
◦ New; achieving measurable skills gains; doesn't apply to WP

6. Employer Effectiveness
◦ New; before PY16
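As a rough illustration of how the second-quarter indicators above might be tallied for an exit cohort, here is a minimal Python sketch; the exiter records and field names are hypothetical, not the official reporting specification.

```python
# Minimal sketch: employment rate and median earnings in the 2nd quarter
# after exit for a small, hypothetical cohort of exiters.
from statistics import median

exiters = [
    {"employed_q2_after_exit": True,  "q2_earnings": 6500},
    {"employed_q2_after_exit": True,  "q2_earnings": 4800},
    {"employed_q2_after_exit": False, "q2_earnings": 0},
]

employment_rate = sum(e["employed_q2_after_exit"] for e in exiters) / len(exiters)
q2_earnings = [e["q2_earnings"] for e in exiters if e["employed_q2_after_exit"]]

print(f"Employment, 2nd quarter after exit: {employment_rate:.0%}")
print(f"Median earnings, 2nd quarter after exit: ${median(q2_earnings):,.0f}")
```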

WIOA Youth Performance Measures

1. Placement Rate (Education, Employment)
◦ 2nd quarter after exit

2. Retention (Education, Employment)
◦ New; 4th quarter after exit

3. Earnings
◦ New; median earnings in the 2nd quarter after exit

4. Credential Rate
◦ Up to one year after exit

5. In-Program Skills Gain
◦ New; achieving measurable skills gains

6. Employer Effectiveness
◦ New; before PY16

What’s Eliminated

Literacy/Numeracy indicator for youth

◦ Although utilized in development of the Skills Gain measure

Customer Satisfaction as a statutory measure

State Incentive Funds

◦ But the Governor's reserve may be used for local incentives

Additional Provisions

State Targets
◦ Must use a statistical adjustment model; its use is now codified (Sec. 116(b)(3)(A)(viii))
◦ Targets for the first two years are included in State Plans

Additional information required in Annual Reports
◦ Example: amount of funds spent on each type of service
◦ Data Validation now codified (Sec. 116(d)(5))

Additional Provisions: Sanctions

◦ State Level
If a state fails performance, the Secretaries shall provide TA (the previous language said they will provide TA upon request)

If a state fails for a 2nd consecutive year or fails to submit its Annual Report, this can lead to a reduction in statewide funds (stronger language)

◦ Local Level
If failure continues for a 3rd consecutive year, the Governor must take corrective action, which shall include development of a reorganization plan (and a new local board)

Questions

Final questions or comments?