SCCE Higher Education Conference
Finding the Meaning in Research Compliance Metrics
Luanna K. Putney
Director of Research Compliance
University of California
Rachelle Jeppson
Controller
Lawrence Berkeley National Lab
Presentation Goals
• Outline challenges of developing research compliance metrics that appropriately measure the effectiveness of compliance efforts
• Identify strategies for developing meaningful systemwide metrics in a decentralized environment with multiple locations
• Provide an example of metrics development in the area of Effort Reporting
Challenges of Meaningful Metrics Development
‘Meaningful’ Metrics
• Measurable
  – # of FTEs per compliance program vs. reporting structures within programs
• Reasonably Correlate with Compliance Elements
  – # of trainings offered in a research compliance area and # of attendees vs. # of trainings offered in research administration
• Serve to Inform About Compliance Risks
  – Types of reports of non-compliance to the Office for Human Research Protections (OHRP) and % of corrective actions completed within X days vs. # of reports of non-compliance
• Correlate with Compliance Efforts
  – % of clinical trial billing errors after launching a training program vs. # of complaints related to leadership in the clinical trial billing unit
Learner Participation: Examples of ‘meaningful’ research compliance metrics
Challenges
• Not all meaningful metrics are measurable
  – Example: Effectiveness of key leadership in compliance programs
• Compliance units are decentralized; communication, buy-in, and data collection are challenging
  – Example: UC’s 10 campuses + 1 national lab with different cultures and approaches to compliance
• Lack of access to data
  – Example: Central compliance units often do not have access to the data needed for collection of measurable metrics from decentralized units
• Lack of leadership support
  – Example: When leadership doesn’t understand the importance of metrics and/or does not want to memorialize compliance weaknesses
• Lack of resources to increase effectiveness of compliance programs
  – Example: Cannot show correlation of program efforts with risk mitigation and/or compliance improvement
Learner Participation: Other Challenges?
Strategies for Developing Meaningful Metrics
Strategies
• Form or leverage existing leadership groups to develop & own metrics in functional areas of greatest risk to the organization/unit
  – Often need a top-down approach for this to work
  – Engaging leadership early in the process is key
• Obtain direct access to enterprise-wide metrics data
  – Example: University of California Learning Management System (systemwide training initiatives)
• Leverage existing data that compliance units may already be collecting
  – Example: IRBs’ & IACUCs’ internal annual reports to Institutional Officials
• Utilize surveys/assessments that are easy to use, ask only the most important metrics questions, and are consistently administered
• ‘Work’ your Annual Plan with activities likely to improve compliance, aligned with proposed research compliance metrics
  – Throughout the year, note unexpected compliance program activities and/or instances of non-compliance that may spur new metrics
Learner Participation: Examples of Strategies That Work
University of California Metrics Development:
Effort Reporting to Federal Sponsors
NSF Audit Activity
Semiannual Report to Congress – 9/2009
“The OIG has been conducting a series of audits to evaluate whether universities’ internal controls are adequate to properly manage, account for, and monitor salary and wage costs; and to determine whether these costs are allowable in accordance with federal cost principles. It is critical for these systems to be sound because NSF annually provides more than $1.2 billion, approximately one-third of all NSF funds, to universities for salaries and wages. Further, this figure is expected to grow as the ARRA increases NSF’s funding of grants.

“Four audits of labor-effort reporting systems, part of a series of audits addressing this critical grants management issue at large universities, found that those systems lacked adequate controls to ensure that time claimed on the NSF awards was actually incurred and reported accurately. As a result, the universities and NSF had little assurance that the $92 million of labor charged to those awards represented actual work performed on NSF research, and those funds remain at risk for improper and unallowable charges.”
http://www.nsf.gov/pubs/2010/oig10001/oig10001.pdf
These audits identified key weaknesses in the following areas:
• Time and effort documentation
• Transfers of labor costs between awards without explanation or approval
• Certification and accuracy of labor reports
Recent NSF Audit Activity
[Timeline: NSF audit report dates from September 2008 through September 2009 for the University of Michigan, Arizona State University, Purdue University, Cornell University, University of Arizona, and Vanderbilt University]
Excerpts from NSF Audit Reports

“Although the mischarged amounts for the sample are not material individually … it constitutes an internal control weakness that could result in more substantive mischarges for the remaining $14 million in labor charges to NSF awards in 2007, as well as the labor portion of the $159 million of other Federal awards.”

“The significant nature of these control weaknesses raises concerns about the reasonableness and reliability of the remaining $11.7 million of FY 2007 labor charges to NSF grants.”

“Specifically, of the $1.07 million NSF salary charges sampled, over 19% were improperly certified. The nature of this control weakness, coupled with the University administration delaying acting on internal audit recommendations, raises concerns about the reasonableness and allowability of the remaining $38 million of FY 2007 labor charges to NSF grants, and could affect the reliability of the salary portion of Cornell’s other $262 million of Federal awards.”
Yale Settlement
Investigation
• Subpoenaed by 5 federal agencies initially
• Allegations
  – Cost transfers not allocable to the grants charged
  – Spending down funding at end of award
  – Summer salaries wrongfully charged
• Investigation covered a 7-year period
  – $3B in funding / 6,000 awards / 30 agencies

Settlement
• $3.8M in damages + $3.8M in penalties

Compliance Improvements Implemented in Wake of Investigation
• Cost transfer improvements
• New effort reporting policies, procedures, and forms
• New research administration policies
• Campus-wide training and communication
• Business process re-design for research administration
• Staffing/skills assessment review for research administration
Sources: FBI Press Release; Yale Press Release
Themes in Audit Report Findings
Audit reports consistently cite deficiencies in the following areas:
• Written procedures and guidance that clearly define
  – Roles and responsibilities
  – Terminology and requirements
• Timely certifications by individuals with suitable means of verification
• Tracking of committed cost sharing
• Tracking of PI committed effort against actual effort performed on the grant
• Training programs that address all effort reporting requirements, are kept up to date, and are taken by all officials involved in the process on a periodic basis
• Senior management officials’ accountability for timely certifications
• An escalation process when past-due reports are not being addressed
• Appropriate and timely cost transfers
Approach to Metric Development
A Simple Approach to Metric Development
1. Determine What to Measure
   • Gather data – internal and external to the organization
   • Develop preliminary metrics
2. Determine How to Measure
   • Accumulate initial data set to “test” the usefulness of preliminary metrics
   • Make adjustments to metrics, as appropriate
3. Implement
   • Implement ongoing metrics reporting and analysis process
4. Reassess
   • Re-evaluate and refine metrics to ensure they are useful and relevant
Management buy-in and support underpin all four steps.
Metrics Must Be Both Flexible and Stable
• Flexible: ability to make adjustments to metrics so they are useful and relevant
• Stable: need for consistency in metrics so that meaningful comparisons across operating units and time periods can be made
Effort Reporting Metric Development at the University of California (UC)
Effort Reporting Process
[Diagram: the University of California Effort Reporting System (ERS) rests on three elements – Data, People, and Systems – supported by training and ongoing communication and by appropriate oversight and monitoring, and shaped by user perceptions & cultural influences:
• Data: accurate data inputs – payroll data, salary & effort %, cost sharing commitments, contract & grants project data, and effort commitments
• People: well trained, knowledgeable staff who understand their roles and responsibilities – reviewers, ERS coordinators, certifiers, and department/campus leadership
• Systems: the Effort Reporting System (ERS) itself]
Effort Reporting at UC
Collaborative
• 5 campuses (Berkeley, Davis, Los Angeles, San Diego, San Francisco) and the Office of the President (OP) worked together
• Strong interest in a common solution
• Clear scope and objectives
• Strong commitment to success
• Commitment of resources

Campus-Driven
• Campus-driven initiative for a common UC-wide business solution
• Campuses funding the initiative
• Single version of software in use for multiple campus installations

Comprehensive
• Project members met with other institutions to identify best practices
• Management group identified and addressed operational, policy, cultural, and technical issues
• Developed communications strategy and online training modules
Most campuses utilize the Effort Reporting System (ERS), developed internally beginning in 2003. A few campuses remain on the Personnel Activity Report, a manual, paper-based process developed in the early 1980s.
Status of UC Effort Metric Development
1. Determine What to Measure
   • Gather data – internal and external to the organization
   • Develop preliminary metrics
2. Determine How to Measure
   • Accumulate initial data set to “test” the usefulness of preliminary metrics
   • Make adjustments to metrics, as appropriate
3. Implement
   • Implement ongoing metrics reporting and analysis process
4. Re-assess
   • Re-evaluate and refine metrics to ensure they are useful and relevant
(The “UC Process” marker on the original slide indicates UC’s current position in this cycle.)
UC Effort Reporting Metric Development
Step 1: Do Your Research
Fact Finding Purpose: Gather Data
1. Reviewed applicable background materials
2. Reviewed internal audit reports
3. Partnered with Financial Management
4. Follow-up discussions with selected Internal Audit staff
5. Developed campus interview and survey questions
6. Conducted campus interviews
7. Limited campus survey (UCD)
8. Presentation of fact finding to the Compliance Committee
Data gathered included:
• NSF audit reports of other colleges and universities
• System-wide internal audit reports
• Fact finding activities at 5 campuses
Step 2: Obtain Senior/Executive Management Buy-in for Metric Development
• Approval and endorsement of the President’s Compliance Committee
• Collaboration with Vice Chancellors for Finance/Controllers and Vice Chancellors for Research
UC Effort Reporting Metric Development
Step 3: Develop Metrics
• Partnered with the Effort Reporting Management Working Group (Vice Chancellors for Finance/Controllers)
• Looked to NSF audit reports and findings per the NSF OIG website
• Identified broad categories of metrics:
  – Timeliness
  – Completion
  – Accuracy
  – Individuals with suitable means of verification
  – Cost sharing
  – Committed effort compared to actual effort
• Developed specific metrics for each category, taking into consideration data available in the system
Step 4: Obtain Concurrence of Senior/Executive Management on Metrics
• President’s Compliance Committee
• Vice Chancellors for Finance/Controllers and Vice Chancellors for Research
Proposed UC Effort Reporting Metrics

Outcome and associated metric:
• Timeliness: % of effort reports certified by established due dates
• Completion: % of effort reports certified for a given reporting period
• Accuracy: % of recertified effort reports at version 2.0 (no prior recertification)
• Accuracy: % of recertified effort reports at version 3.1 or higher (prior recertification)
• Individual knowledgeable of work performed / suitable means of verification: % of PIs who self certify*, where the % is calculated as the number of PI/faculty member effort reports self certified divided by the number of effort reports eligible for self certification
• Individual knowledgeable of work performed / suitable means of verification: # of persons certifying ≤25 effort reports; 26–50; 51–75; 76–100; and ≥101 effort reports

Other Process Effectiveness Indicators
• Quality of data inputs to enhance accuracy: committed cost sharing input to ERS via a system interface, rated on a scale of 1 to 3, where 1 indicates a system interface and 3 indicates that the campus does not include committed cost sharing in the certified effort reports
• Quality of data inputs to enhance accuracy: committed effort compared to actual effort via a system interface, rated on a scale of 1 to 3, where 1 indicates committed effort is compared to actual effort via an automated process and 3 indicates that the campus does not review committed versus actual effort
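As a rough illustration of how such metrics could be computed, the sketch below derives the timeliness, completion, recertification, self-certification, and certifier-workload figures from a list of the hypothetical EffortReport records sketched earlier. The function name, bucket boundaries, and denominators are illustrative assumptions, not UC’s actual reporting logic.

```python
from collections import Counter

def effort_metrics(reports: list[EffortReport]) -> dict:
    """Compute illustrative versions of the proposed effort reporting metrics."""
    total = len(reports)
    certified = [r for r in reports if r.certified_date is not None]

    # Timeliness: certified on or before the established due date,
    # taken here over all reports due in the period (an assumption).
    on_time = sum(1 for r in certified if r.certified_date <= r.due_date)

    # Accuracy proxies: recertification rates inferred from report versions.
    recertified_once = sum(1 for r in certified if r.version == 2.0)
    recertified_multi = sum(1 for r in certified if r.version >= 3.1)

    # Suitable means of verification: PI self-certification rate over
    # the reports eligible for self-certification.
    eligible = [r for r in reports if r.eligible_for_self_cert]
    self_certified = sum(1 for r in eligible if r.is_pi_self_certified)

    # Certifier workload: bucket each certifier by how many reports they certify.
    per_certifier = Counter(r.certifier_id for r in certified)
    buckets: Counter = Counter()
    for n in per_certifier.values():
        if n <= 25:
            buckets["<=25"] += 1
        elif n <= 50:
            buckets["26-50"] += 1
        elif n <= 75:
            buckets["51-75"] += 1
        elif n <= 100:
            buckets["76-100"] += 1
        else:
            buckets[">=101"] += 1

    return {
        "timeliness_pct": 100 * on_time / total if total else 0.0,
        "completion_pct": 100 * len(certified) / total if total else 0.0,
        "recertified_v2_pct": 100 * recertified_once / total if total else 0.0,
        "recertified_v3plus_pct": 100 * recertified_multi / total if total else 0.0,
        "pi_self_cert_pct": 100 * self_certified / len(eligible) if eligible else 0.0,
        "certifier_load_buckets": dict(buckets),
    }
```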
Effort Metrics Reporting – Example Mock-up
Lessons Learned Based on the Development of Initial Effort Reporting Metrics
Lessons Learned
• Engage stakeholders in the process so they have buy-in to and ownership of the metrics
  – Metrics should not be “owned” by the Compliance organization
• Identify initial measures and get started
  – It’s more important to have metrics than to spend years developing the “perfect” metrics
• Leverage data that already exists or can be easily created by existing systems
  – If the process for gathering metric data is too manual, time-consuming, and burdensome, it will probably not be done
Questions?
Luanna K. Putney
Director of Research Compliance
University of California
510-987-0028
Rachelle Jeppson
Controller
Lawrence Berkeley National Lab
510-486-7558