
WSRC-RP-2002-00252

SAVANNAH RIVER SITE

PERFORMANCE METRIC MANUAL


TABLE OF CONTENTS

OVERVIEW
BACKGROUND AND HISTORY
BENEFITS
READING THE CHARTS
READING THE SUMMARY SHEETS
CALCULATING SUMMARY SHEETS
COLOR VALUES AND DEFINITIONS
DOE/CONTRACTOR INVOLVEMENT
SIX SIGMA AND SRS PERFORMANCE METRICS
MANAGING METRICS
COMMUNICATION
LIST OF CURRENT SRS PERFORMANCE METRICS
DEFINITIONS, GOALS AND CONTACTS FOR SRS PERFORMANCE METRICS
WASHINGTON GOVERNMENT GROUP INITIAL DIRECTIONS FOR PERFORMANCE METRICS
APPENDIX A: SRS PERFORMANCE METRIC PRESENTATION
APPENDIX B: NUCLEAR SAFETY


OVERVIEW

This manual provides the “nuts and bolts” that the Savannah River Site (SRS) used in developing the site’s performance metrics. The Background and History section explains how the site began the system, including where the concepts were first established. The Benefits section describes the major benefits the site feels it has derived from the system. The next sections, Reading the Charts, Reading the Summary Sheets, and Calculating Summary Sheets, were taken from an SRS newspaper article in which the charts were explained to site personnel. These sections explain how to read the information provided. To be successful, these metrics require the involvement of Westinghouse Savannah River Company’s (WSRC) customer, the Department of Energy (DOE). The DOE/Contractor Involvement section explains how the two have worked together at the Savannah River Site. Six Sigma is now being used throughout major companies in the United States, and WSRC is adopting the tools from Six Sigma to analyze the underlying processes of the performance metrics. The Six Sigma section explains how WSRC is using these tools and what is expected from this effort. The Managing Metrics section describes specific steps used by the WSRC point of contact to manage over 60 key performance indicators. These metrics required a communication strategy to inform site employees about the system, which is described in the Communications section.

To further help the reader in developing a set of metrics for another DOE site, a list of SRS metrics, the definitions, goals and SRS subject matter experts for each metric, and the initial directions from the Washington Government Group are provided. Appendix A is a presentation used by the WSRC Executive Vice President, Dave Amerine, with notes. Nuclear Safety performance metric definitions used at SRS are explained in Appendix B.

It is hoped that this manual will assist other sites in the development of performance metrics. Comments and questions can be directed to Gail Jernigan, WSRC, Building 703-A, Room A-129, Aiken, South Carolina. She can also be reached by calling 803-725-774 or by email at [email protected].


BACKGROUND AND HISTORY

During the Executive Safety Conference in December 2001, Greg Rudy of the Department of Energy Savannah River Operations Office (DOE-SR) and Bob Pedde, Westinghouse Savannah River Company (WSRC) President, volunteered for the Savannah River Site (SRS) to become a pilot for the DOE Complex, using a performance metric system discussed at the conference.

This system uses key performance indicators (KPIs) as its basis. The color rollup scheme was established by the Institute of Nuclear Power Operations (INPO) after Three Mile Island. It provides a quick status summary, which can be modified to suit various customers. (For the commercial world, this might include INPO, the Nuclear Regulatory Commission, the Board of Directors, and other stakeholders.) The underlying principle behind each metric is to use objectivity to assess SRS performance. This system provides key information at a glance but allows “drill down” to identify issues and actions. Instead of focusing on events, it provides an easy assessment of trends. It also encourages the sharing of expertise and knowledge and allows benchmarking of successes as well as analyzing problem areas.

While this metric system was discussed with the DOE at the Safety Conference, for several years SRS had been using performance metrics based on the DOE-SR Site Manager’s Focus Areas of Safety and Security; Technical Capability; Community, State and Regulator Relationships; Cost Effectiveness; and Corporate Perspective. The process of defining and measuring performance in these Focus Areas heightened the awareness of the SRS mission and objectives outlined in the SRS Strategic Plan and helped focus management attention on achieving results. Using these metrics helped SRS continuously improve operations by clearly targeting areas for improvement. The results of analyses were available in summary format on a quarterly basis in a newsletter and displayed on posters around the site. Detailed files and backup material for the ratings were posted on the site’s intranet so that all organizations and employees could have access to the information.

In addition, WSRC has been using the KPI color rollup system since January 2001. Washington Government Group (WGG) (now Washington Energy and Environment), a division of Washington Group International, had initiated the development of performance metrics, using the same INPO format. Dave Amerine, WGG Executive Vice President, led the effort to develop a set of meaningful performance metrics for all companies within the division. During the previous year, he had met with crosscutting subject matter experts to determine a set of metrics and their definitions. A standard description and roll-up algorithms were developed. Because of the diversity of business units, the group was challenged to develop a set of consistent and useful parameters. This diversity of business units is analogous to the DOE Complex.

After becoming the pilot for the DOE Complex, SRS staff members drew on both the Focus Area metrics and the WSRC Corporate metrics to develop the list of site metrics. They combined the various metrics into the five Focus Areas. The list of metrics is being refined as DOE-SR and WSRC continue to monitor them.

As a further refinement, WSRC is also using the same format and list of metrics for each division within the company. Not all metrics will be applicable to a division, and divisions may have additional metrics that management wishes to track to assess the division’s performance. Facilities will also track their performance metrics using this same format and the list of metrics. WSRC anticipates implementation of facility metrics by the end of 2002.


BENEFITS

Listed below are some of the benefits of this system.

The focus is on performance and results through key performance indicators (KPIs). As a result of the Government Performance and Results Act and other Congressional actions, it is imperative that the site demonstrates results. This system gives a visual display of the site’s results, as well as performance.

Color rollup allows quick “drill down” to the source of a problem area. By looking at colors on the summary charts, management can quickly determine the areas of concern. A look at the next level of detail can provide information as to which particular area needs attention. Additional details can be found in the individual KPI. The site can also determine if too many resources are being used with little benefit.

KPIs as a basis are objective. Eliminating the subjectivity of performance metrics allows management to focus on results and focus resources on areas of concern.

The Analysis/Action section of each KPI allows assessment of problem status. A critical part of each KPI, the Analysis/Action section analyzes successes and failures. If the site is doing well in a particular area, others can learn from it. If the KPI is trending to Yellow or Red, this section will help management determine if appropriate actions are being taken to remedy the problems in the area being measured.

This system has the ability to accommodate various needs. The use of the summary panels is flexible to allow different groupings to satisfy the needs of DOE-HQ, the Defense Nuclear Facilities Safety Board, Congress, and other stakeholders. At the same time, the various KPIs do not have to be changed.

An analysis of the summary panels can promote the sharing and prioritizing of resources and ideas across the site. The site can use the Analysis/Action section to promote sharing of ideas across the DOE Complex to the benefit of all. In addition, if one focus area is doing extremely well, management can use this system to determine if resources should be reallocated to areas that need more attention.

The use of this system can promote consistency in approaches across the site. By using the same INPO format for all metrics, site personnel can quickly assess the performance of a particular KPI without having to learn different performance measurement systems.

This system will facilitate continuous improvement across the site and the DOE Complex. The use of these KPIs gives site personnel the ability to analyze their performance, leading to improved performance.


READING THE CHARTS

This box represents the Safety and Security box from the Summary Charts. The small boxes across the top are the scores (color-coded) for the past four quarters. The oldest quarter is on the left, and the most current quarter is on the right. Others have used this to show the last four months instead of quarters. However, using quarters allows the reader to see what a particular area has done over the past 12 months. The letter “G” in the box represents “Green” for those who may not see this printed in color. The larger box is the current month, which also has a “G” for “Green” in the bottom left corner.
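As a minimal illustration of the layout just described, the summary box can be thought of as a small data structure: a label, the four trailing-quarter colors (oldest first), and the current-month color. The Python sketch below is illustrative only; the names are hypothetical and not part of the SRS system.

    from dataclasses import dataclass
    from typing import List

    # Hypothetical representation of one summary box as described above.
    @dataclass
    class SummaryBox:
        label: str                # e.g., "Safety and Security"
        past_quarters: List[str]  # oldest to newest, e.g., ["G", "G", "G", "G"]
        current_month: str        # color letter shown in the large box

    box = SummaryBox("Safety and Security", ["G", "G", "G", "G"], "G")
    print(box.label, box.past_quarters, box.current_month)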

READING THE SUMMARY SHEETS

This is the Summary Chart for all of the performance metrics. The first column represents the five DOE Focus Areas: Safety and Security; Technical Capability and Performance; Community, State and Regulatory Relationships; Cost Effectiveness; and Corporate Perspective.

[Summary Chart (fictitious example): a color-coded matrix listing the five Focus Areas in the first column (Safety and Security; Technical Capability and Performance; Community, State and Regulatory Relationships; Cost Effectiveness; Corporate Perspective) and, across each row, their Level 1 indicators, for example Industrial Safety, Emergency Services and Fire Protection, Radiological Safety, Nuclear Safety and Physical Security under Safety and Security; Production, Infrastructure, Disciplined Operations, Engineering and Project Management under Technical Capability and Performance; Environmental Release Index, Environmental Compliance Index, Public Participation Program, Public Perception and Employee Relations under Community, State and Regulatory Relationships; Financial Forecasts, PBI Performance, Financial Performance and Feedback and Improvement under Cost Effectiveness; and Oversight, Nuclear Non-Proliferation and EM Integration under Corporate Perspective. Each box shows color scores for the past four quarters and the current month.]


Reading across provides the Level 1 performance for each focus area. For example, Safety and Security has the following Level 1 performance indicators: Industrial Safety; Emergency Services and Fire Protection; Radiological Safety; Nuclear Safety; and Physical Security. For more details, a reader would go to the next level. (See chart below.) At this level, the individual key performance indicators (KPIs) or the result of rolling up several KPIs are shown. This allows readers to quickly see where the site is excelling and where there might be problems. Using both charts and the individual performance metrics, managers, DOE-HQ, and others can “drill down” to an area of their particular interest.

This is the expanded version of the Safety and Security section shown above. Here the reader can see that Safety and Security is made up of Industrial Safety; Emergency Services and Fire Protection; Radiological Safety; Nuclear Safety; and Physical Security. Further study shows that Industrial Safety is composed of Total Recordable Case Rate, Days Away, Restricted or Transferred Rate, and Cost Index.

[Expanded Safety and Security chart (fictitious example, performance indicators through March 31, 2002): the Focus Area rolls up to its Level 1 indicators (Industrial Safety and Health; Emergency Services and Fire Protection; Radiological Safety; Nuclear Safety; Physical Security), each of which rolls up its Level 2 KPIs: Total Recordable Case Rate, Days Away, Restricted or Transferred Rate, and Cost Index under Industrial Safety and Health; Fire Protection Impairment Status, EPHA Annual Review/Revision, Corrective Actions, and Emergency Exercises/Drills under Emergency Services and Fire Protection; Reportable Dose Exceedances and Reportable Contamination under Radiological Safety; Nuclear Safety Issue Management Index and Significant Nuclear Safety Incidents Index under Nuclear Safety; and Security Incidents under Physical Security. Each box shows color scores for the past four quarters and the current month.]


This is a fictitious individual performance metric (or KPI) in the INPO format. Specifically, it shows the site’s performance on Contamination Events. The upper section provides a graph, where readers can see how the site’s performance has trended over the past 12 months. It includes a line for the goal. Underneath the graph is the actual data for this particular metric. Underneath that are four boxes. The first box is the “Definition,” which describes the metric being discussed by describing what is being measured. The box below that one is the “Goal,” which describes the site goal and the values for the various color-coded scores. To the right of the “Definition” box is the “Analysis/Action” section that describes why the site is where it is today. It answers questions such as “What are we doing to reach this level of performance, and what are we doing to sustain this level of performance?” If the color is green or white, this section describes how the site is able to attain this level of performance. If the color is yellow or red, this section describes specific actions the site is taking to improve in this area. Finally, in the lower right is a comment section with the Key Performance Indicator Owner and the subject matter expert. This section also includes any other comments about the metric.

SAFETY AND SECURITY
Radiological Safety
Reportable Contamination Events (fictitious example, through March 31, 2002)

[Graph: Number of Events per month, April through March, plotting Contamination Events against the Goal line.]

Data:
  Month:                Apr, May, Jun, Jul, Aug, Sep, Oct, Nov, Dec, Jan, Feb, Mar
  Contamination Events: 1, 2, 3, 3, 2, 4, 4, 3, 4, 5, 4, 6
  Goal:                 1 each month
  Score:                White, White, Yellow, White, White, Yellow, Yellow, Yellow, Yellow, Yellow, Yellow, Red

Definition: This chart reflects the number of personnel contamination events per month.

Goal:
  Green: ≤ 1 contamination event
  White: > 1 and ≤ 3 contamination events
  Yellow: > 3 and ≤ 5 contamination events
  Red: > 5 contamination events

Analysis / Action (WHITE):
  Analysis: Over the past year, this metric has been trending upward, with more contamination events each month. While management has been stressing PPE use to avoid additional contamination events, personnel still resist using the new suits, resulting in more events.
  Action: Staff will survey personnel to discover the reasons for the resistance to the new suits. The consequences of not using the suits will be explained to each employee, and additional training on the use of PPE will be provided to avoid future contamination events.

Comments: KPI Owner: Jim Stafford (803) 952-9888; SME: Athena Freeman (803) 952-9938
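As a minimal illustration of how a KPI such as this one maps a monthly value to a color, the Python sketch below applies the Goal thresholds from the fictitious example. The function name and data layout are illustrative assumptions, not part of the SRS system, and the fictitious score row above does not track these thresholds exactly.

    # Illustrative sketch: score monthly contamination-event counts against the
    # fictitious thresholds (Green <= 1, White <= 3, Yellow <= 5, Red > 5).
    def score_contamination_events(count: int) -> str:
        if count <= 1:
            return "Green"
        if count <= 3:
            return "White"
        if count <= 5:
            return "Yellow"
        return "Red"

    months = ["Apr", "May", "Jun", "Jul", "Aug", "Sep",
              "Oct", "Nov", "Dec", "Jan", "Feb", "Mar"]
    events = [1, 2, 3, 3, 2, 4, 4, 3, 4, 5, 4, 6]

    for month, count in zip(months, events):
        print(f"{month}: {count} event(s) -> {score_contamination_events(count)}")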


CALCULATING SUMMARY SHEETS

To summarize the metrics, each key performance indicator (KPI) is assigned a value based on the color of the performance for the month and/or quarter, as shown below.

Excellent (Green): 3 points
Satisfactory/Normal (White): 2 points
Tracking to Satisfactory (Yellow): 1 point
Unsatisfactory (Red): 0 points

To determine the value for a summary, the values for the individual metrics are added. (SRS KPIs are equally weighted at this time. A more sophisticated differential weighting may be added later.) For example, if a summary has 4 metrics with color values of green (3 points), white (2 points), white (2 points) and green (3 points), the total for that summary is 10 points (3+2+2+3=10). Using the table below, since there are 4 metrics and the total is 10 points, the value for the summary is Green.

Point totals by number of applicable KPIs to be evaluated:

          2     3     4     5     6     7     8
Green    ≥6    ≥8   ≥10   ≥13   ≥15   ≥18   ≥20
White    ≥4    ≥5    ≥7    ≥9   ≥11   ≥12   ≥14
Yellow   ≥2    ≥3    ≥5    ≥7   ≥10   ≥10   ≥12
Red      <2    <3    <5    <7   <10   <10   <12
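A minimal sketch of this rollup in Python, assuming equal weighting and reading the table thresholds as minimum point totals (the names are illustrative, not part of the SRS system):

    # Illustrative rollup sketch: convert KPI colors to points, sum them, and
    # look up the summary color from the threshold table for 2 to 8 KPIs.
    POINTS = {"Green": 3, "White": 2, "Yellow": 1, "Red": 0}

    # Minimum point totals, keyed by the number of applicable KPIs.
    THRESHOLDS = {
        2: {"Green": 6,  "White": 4,  "Yellow": 2},
        3: {"Green": 8,  "White": 5,  "Yellow": 3},
        4: {"Green": 10, "White": 7,  "Yellow": 5},
        5: {"Green": 13, "White": 9,  "Yellow": 7},
        6: {"Green": 15, "White": 11, "Yellow": 10},
        7: {"Green": 18, "White": 12, "Yellow": 10},
        8: {"Green": 20, "White": 14, "Yellow": 12},
    }

    def summary_color(kpi_colors):
        total = sum(POINTS[c] for c in kpi_colors)
        bounds = THRESHOLDS[len(kpi_colors)]
        for color in ("Green", "White", "Yellow"):
            if total >= bounds[color]:
                return color
        return "Red"

    # The worked example from the text: green, white, white, green = 10 points.
    print(summary_color(["Green", "White", "White", "Green"]))  # Green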

COLOR VALUES AND DEFINITIONS

G   Excellence - where performance exhibits a significant strength, such as achievement of long-term goals.

W   Satisfactory/Normal - satisfactory performance. Management should watch the performance to ensure that it improves.

Y   Less than Satisfactory - where performance needs improvement and management attention.

R   Unsatisfactory - Weakness: where performance is significantly below the goal or where the annual goal is not expected to be reached.

B   No Data - This performance measure did not have data to report. This is also used for charts that are under development and do not have data to report yet.

NA Not Applicable

Page 10: Performance Metric Workbook - EFCOG.org · READING THE CHARTS This box represents the Safety and Security box from the Summary Charts. The small boxes across the top are the scores

SRS Performance Metrics Manual, Rev. 0 WSRC-RP-2002-00252

10

DOE/CONTRACTOR INVOLVEMENT

Before beginning a performance metric system such as the one discussed in this manual, both DOE and WSRC worked together to develop a list of performance metrics. SRS found that senior management involvement is imperative to ensure that the site is measuring the “right” performance areas. SRS’s list of KPIs is an evolving one, with input from management and subject matter experts.

This manual includes the list currently being used at SRS and a generic list proposed by Washington Government Group (now Washington Energy and Environment) for use by the companies within its division. Either or both of these can be used in developing a list of metrics.

However, most sites probably already have numerous performance metrics. Existing metrics are recommended as a starting point, since site personnel are already familiar with these metrics. These are probably the items that management considers most important for tracking.

Once the list of metrics was determined, both DOE and WSRC worked together to determine the definition of each metric. What information should be measured? How should it be measured? Since these were metrics the site was already using, SRS management asked, “Are we measuring the ‘right’ information?”

Next, DOE and WSRC developed the goal and scoring levels for green, white, yellow, and red for each metric. Where should the “bar” be? Consideration was given to existing site resources and the current level of performance.

The most important section of a KPI is the Analysis/Action section. After WSRC provides this section to DOE-SR, DOE reviews it to determine if WSRC understands the data and the level of performance. This analysis section should explain why the site’s performance is where it is. Furthermore, the action section should provide activities and steps WSRC is planning to take to either improve or sustain performance. DOE-SR reviews this section to determine if WSRC is responding appropriately.

For individual KPIs, KPI owners and subject matter experts from DOE-SR and WSRC review the various sections of the appropriate KPI. The list of KPIs and groupings are reviewed and determined by senior management.


SIX SIGMA AND SRS PERFORMANCE METRICS

Overview

Many companies in the United States are now embracing Six Sigma. Six Sigma is a management tool that helps define and implement needed process improvements. The improvements are evaluated based on the sigma level of the process. The sigma level is a business metric used to indicate the performance of a process or service relative to the customer’s specification. WSRC is currently integrating the use of Six Sigma tools into the evaluation of the site’s performance metrics. As such, the Six Sigma tools will be used to statistically evaluate and improve the site processes that directly impact the site’s performance metrics. The following discussion provides an overview of Six Sigma and also presents the initial plan for Six Sigma integration with the business metrics that have been developed at SRS.

Six Sigma:

The Six Sigma tools are being applied by all types of companies worldwide in order to:

− Improve customer satisfaction;
− Improve profit margins and reduce costs;
− Reduce cycle times; and
− Gain greater predictability of the results.

The Six Sigma success strategy obtains results by reducing the cost of poor quality (COPQ), using a data-driven approach. The cost of poor quality can be lost contracts, cancelled projects, lost funding, injuries, as well as reviews, inspections, waste, defects, etc. The Six Sigma tools are used to identify the critical drivers/inputs for each process output. Actions and controls are then put in place to improve the overall process outputs (i.e., reduce the number of defects and process variation). By identifying and controlling these critical drivers/inputs, the COPQ will be reduced.

In order to measure the process capability relative to the customer specification, the average performance of the process, and the variation in the process, a sigma level is calculated for each process. This provides a standard means to compare processes and also provides a solid, data-driven foundation from which improvements to each process can be measured. The specific goal of the Six Sigma tools is to optimize each process to a sigma level that is cost effective for that specific process.
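The manual does not give the formula used to calculate a sigma level. One common convention, shown below only as an assumption for context, derives it from the observed defect rate (defects per million opportunities) using the standard normal quantile plus the customary 1.5-sigma long-term shift.

    # Illustrative only: a conventional textbook sigma-level calculation from
    # defects per million opportunities (DPMO). Not taken from the SRS manual.
    from statistics import NormalDist

    def sigma_level(defects: int, opportunities: int) -> float:
        dpmo = 1_000_000 * defects / opportunities
        process_yield = 1 - dpmo / 1_000_000
        # Z-value of the process yield plus the conventional 1.5-sigma shift.
        return NormalDist().inv_cdf(process_yield) + 1.5

    # Hypothetical example: 233 defects in 100,000 opportunities is roughly 4.3 sigma.
    print(round(sigma_level(233, 100_000), 2))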

The bottom-line goal is to utilize the Six Sigma tools to manage each process in an effective and efficient manner. This management approach has been successfully applied in both the manufacturing and transactional business worlds. As such, WSRC will be using the Six Sigma tools to help drive improvements in both its manufacturing and transactional processes in order to improve the overall site performance.

WSRC Integration Strategy:

WSRC will be integrating the Six Sigma tools into the management and evaluation of the SRS performance metrics. The primary goals for this integration are:

− Improve the ability to predict future site performance;
− Identify the critical actions or processes associated with each metric;
− Identify potential improvements for the critical actions or processes; and
− Implement controls on the critical actions or processes to ensure improvements are sustained.

In order to meet these goals, the following strategy utilizing Six Sigma tools is being developed and implemented:

1. Prioritize the Level 1 site metrics. This prioritization will be completed in order to begin evaluating the key site metrics. The key site metrics are those metrics that, if improved, would lead to positive impacts relative to the SRS Goals.

2. Evaluate the Level 2 metrics for each Level 1 metric. This evaluation will:

   a. Determine if the process measured by the metric is in control;
   b. Identify the customer specification limits;
   c. Determine the current capability for the process (determining the sigma level);
   d. Determine if the process is effective (Is the current defect rate acceptable? What is the cost associated with the current defect rate?);
   e. Determine if the process is efficient (Is the cycle time of the process acceptable? Is the cost to conduct the process acceptable?); and
   f. Establish measurable goals for improvement to each of the Level 2 metrics.

3. Determine critical processes for each Level 2 metric relative to the improvement needs.

4. Identify process improvement projects (yellow belt and black belt efforts) for critical processes.

5. Establish an overall scorecard and track process improvements.

Summary:

The current SRS site metrics provide a history of past performance and depict current site performance. The Six Sigma tools will allow WSRC to predict future performance based on this past history and will be used to improve the overall site performance. This will allow WSRC to manage in a proactive manner based on the predicted performance of the metrics. With the integration of Six Sigma tools into the overall SRS business, WSRC expects to improve the overall site performance while decreasing the overall site costs.


MANAGING METRICS

WSRC found that no one person can be a subject matter expert on all the various metrics; however, a point of contact was needed to collect and manage the various KPIs and sum them in the summary pages. A person was named to manage the performance metric system. This person has the responsibility to collect the data and individual KPIs, sum them using the algorithm, develop the summary charts, and perform quality assurance on each KPI. This manager issued an Excel spreadsheet template for each KPI and asked each KPI owner and subject matter expert to provide the information required for each KPI. The individual KPI owner and subject matter expert worked with their DOE customer in developing the definitions, goals, and scoring values.

At the beginning of each month, since SRS collects the information monthly, this manager sends out a reminder by electronic mail to each KPI owner and subject matter expert. KPIs are due to the manager by the seventh business day of each month; the manager then collects and reviews them. Quality assurance reviews are performed to ensure that each KPI is complete and that the Analysis/Action section is complete. The metrics are also reviewed to ensure that there is consistency among all metrics. Senior management decided early that the graphs should be similar in appearance. For example, the color bars of the information should be a consistent color. (WSRC chose blue for the color of the first bar and red and white stripes for the second bar.) Care is taken to ensure that KPI graphs are easily read if printed in black and white. The goal is graphed as a line in green. If there is only one y-axis and the KPI scoring permits, a separate color bar is established on the far right, showing the levels for green, white, yellow, and red so that the reader can quickly establish the scoring color without reading the text. This is simply a series of small color boxes stacked on top of one another.

Once the KPIs are submitted and quality assurance reviews are complete, the performance metric manager summarizes the KPIs by the various focus areas and then prepares the final overall summary sheet. Another quality assurance review is performed by another individual to ensure that the package is correct. A package of all summary sheets and KPIs is prepared in Adobe Acrobat, which is sent to both DOE-SR and WSRC senior managers and placed on the site’s Intranet. Summary pages are posted in or around the DOE-SR and WSRC senior management offices, conference rooms, and other administrative areas across the site. A copy is sent to all KPI owners and subject matter experts. This is all accomplished by the fifteenth of each month so that the posting is available to the site in a timely manner.


COMMUNICATION

For this performance metric system to be truly effective for the Savannah River Site, communication to all employees was essential. Newsletter articles, videos, the Intranet, employee communications and numerous meetings were used to describe the system and its impacts.

As site subject matter experts were developing individual key performance indicators (KPIs), other staff members were strategizing communication processes. Before the metrics were released, a short news article was placed in the site’s newspaper. The next month a more extensive center-page spread was used to provide the biggest impact to all employees. (In fact, much of the section describing the metric system came from the newspaper.)

In addition, the DOE-SR point of contact, Jeff Allison, Assistant Manager for the Health, Safety, and Technical Support Division, and the WSRC Executive Vice President, Dave Amerine, attended many site organization, council and committee meetings to teach site employees about the new system. Examples include the DOE-SR Senior Staff meeting, WSRC Senior Staff (all WSRC Vice Presidents and General Managers), the WSRC Deputy Managers Council, the WSRC Program Managers Committee, the WSRC performance metric KPI owners and subject matter experts meeting, and an all-managers meeting. Essentially the same presentation was used in these meetings, ensuring consistent communication across the site. (See Appendix A for a copy of the presentation.)

One of these presentations was videotaped and placed on the site’s Intranet, allowing all SRS employees to learn about the new metric system. Names and phone numbers of points of contact were published in numerous places so that staff members could call with any questions.

When the first set of metrics was issued, the summary sheets were placed on large poster boards and displayed around the site. Most prominently, the posters were placed outside both the DOE-SR Manager’s and the WSRC President’s offices. Other locations included various conference rooms and other administrative areas. These posters have been updated monthly to reflect the most current set of summary charts. As people become more familiar with these charts, specific KPIs will be placed on these posters and in other locations. To allow the posters to be updated monthly, each chart is laminated before being attached to the poster with rubber cement. (Rubber cement is easily removed each month while not damaging the posters.)

The entire set of metrics is placed on the site’s Intranet each month. To provide further historical perspective, past months’ metrics stay on the site’s Intranet.


LIST OF CURRENT SRS PERFORMANCE METRICS

To manage and find the key performance indicators, a numbering system, similar to an outline, is used. Under this system, each DOE Focus Area is numbered using Roman numerals: Safety and Security is I, Technical Capability and Performance is II, Community, State, and Regulator Relationships is III, and so on. For the next level, Level 1, an alphabetic numbering system is used. Under Safety and Security, Industrial Safety and Health is A, Emergency Services and Fire Protection is B, Radiation Safety is C, Nuclear Safety is D, and Physical Security is E. At the next level, Level 2, which for SRS is mainly individual KPIs, a number is assigned. For example, under Industrial Safety and Health, Total Recordable Case Rate is 1, Days Away, Restricted or Transferred Rate is 2, and Cost Index is 3.

This numbering system is used in naming each Excel file and serves as the page number in the set of charts. Doing this makes it easier to find a particular KPI, Level 1, or Focus Area either in the set of printed metrics or on a computer. For example, Total Recordable Case Rate is I-A-1, Days Away, Restricted or Transferred is I-A-2, and Cost Index is I-A-3.
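A minimal sketch of how such an identifier can be composed (illustrative Python; at SRS the naming is simply applied by hand to the Excel files):

    # Illustrative sketch: compose an outline-style KPI identifier such as "I-A-1"
    # from a Focus Area number, a Level 1 letter position, and a Level 2 KPI number.
    ROMAN = {1: "I", 2: "II", 3: "III", 4: "IV", 5: "V"}

    def kpi_id(focus_area: int, level1: int, level2: int) -> str:
        letter = chr(ord("A") + level1 - 1)  # 1 -> A, 2 -> B, ...
        return f"{ROMAN[focus_area]}-{letter}-{level2}"

    print(kpi_id(1, 1, 1))  # I-A-1, Total Recordable Case Rate
    print(kpi_id(1, 1, 3))  # I-A-3, Cost Index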

I. Safety and Security

  A. Industrial Safety and Health
     1. Total Recordable Case Rate
     2. Days Away, Restricted, or Transferred Rate
     3. Cost Index

  B. Emergency Services and Fire Protection
     1. Fire Protection Impairment Status
     2. Emergency Preparedness Hazards Assessments Annual Review/Revision
     3. Emergency Management Corrective Actions
     4. Emergency Exercises and Drills Conducted Versus Scheduled

  C. Radiation Safety
     1. Events Resulting in Reportable Dose Exceedances
     2. Reportable Contamination Events

  D. Nuclear Safety
     1. Nuclear Safety Issue Management Index
     2. Significant Nuclear Safety Incidents Index

  E. Physical Security
     1. Security Incidents


II. Technical Capability and Performance

  A. Production
     1. Materials Processed Towards Stabilization
     2. Environmental Release Sites Assessments and Completions
     3. Tritium Loading and Finishing Lead Time
     4. Defense Waste Processing Facility FY01-06 Canister Performance

  B. Infrastructure
     1. Preventive Maintenance/Predictive Maintenance
     2. Delinquent Predictive Maintenance
     3. Corrective Maintenance Backlog in Man-Hours

  C. Disciplined Operations
     1. Conduct of Operations
     2. Violation/Inadequate Procedures
     3. Pollution Prevention and Waste Reduction

  D. Engineering
     1. Design Engineering Changes
     2. Engineers Participating in IDEAS
     3. Engineering Productivity

  E. Project Management
     1. Project Management - Schedule
     2. Project Management - Cost
     3. Project Management Health

III. Community, State and Regulator Relationships

  A. Environmental Release Index
     1. Reportable Environmental Events
     2. Projected Dose to the Public

  B. Environmental Compliance Index
     1. Enforceable Agreement Milestones - Federal Facility Agreement
     2. Environmental Enforcement Actions (Notice of Violation)
     3. South Carolina Department of Health and Environmental Control and Environmental Protection Agency Index
     4. Environmental Commitments Index

  C. Public Participation Program
     1. Citizens Advisory Board Responsiveness
     2. Public Meeting Effectiveness
     3. Public Involvement Response to Inquiries

  D. Public Perception
     1. Tours/Visitors' Program Effectiveness
     2. Low Country Plan Implementation
     3. Education Outreach

  E. Employee Relations
     1. Number of Opened Employee Concerns per 100 Employees
     2. Average Number of Days to Closure of Employee Concerns
     3. Staffing to Target


IV. Cost Effectiveness

  A. Financial Forecasts
     1. Cumulative Budget Cost Performance for DP
     2. Cumulative Budget Cost Performance for EW02
     3. Cumulative Budget Cost Performance for EW04
     4. Comparison of Actual Overtime Cost to Budget

  B. PBI Performance
     1. Performance Based Incentives Completed
     2. Super Stretch Performance Based Incentives Earnings

  C. Financial Performance
     1. WSRC Encumbrance Summary (Operations)
     2. On-time Payments
     3. On-time Invoicing
     4. Cost Savings Performance (PACE)
     5. Employees Cost Effectiveness Performance (IDEAS)

  D. Feedback and Improvement
     1. Nonconformance Report Processing Status
     2. Problem Identification Report Status
     3. Self-Assessment Findings
     4. Facility Evaluation Board Grade Status

V. Corporate Perspective

  A. Oversight
     1. Defense Nuclear Facility Safety Board 94-1 and 2000-1 Recommendation Implementation

  B. Nuclear Non-Proliferation
     1. Schedule Performance
     2. Annual Operating Plan Milestone Performance
     3. Cost Performance

  C. EM Integration
     1. Receipts of Rocky Flats Plutonium
     2. Ship to Waste Isolation Pilot Plant
     3. Mound Receipts


DEFINITIONS, GOALS AND CONTACTS FOR SRS PERFORMANCE METRICS

The following section provides a brief definition and goal of each KPI used at SRS. In the Goal section, the color scoring is included. In addition, SRS subject matter owners and experts and their phone numbers are provided to encourage exchanges of ideas.

Each entry below lists the Definition, the Goals and Scores, and the KPI Owner and Subject Matter Expert (SME).

I-A-1 Total Recordable Case Rate and I-A-2 Days Away, Restricted, or Transferred Rate
Definition: The Occupational Safety and Health Act of 1970 requires covered employers to prepare and maintain records of occupational injuries and illnesses. WSRC measures safety performance in accordance with these guidelines. The incidence rates are calculated by multiplying the number of injuries, illnesses or lost workdays by 200,000 and dividing by the total number of work hours.
Goals and Scores: TRC Rate Goal = 0.93; DART Rate Goal = 0.30. Green: < 2% above goal; White: > 2% and < 7% above the goal; Yellow: > 7% and < 12% above the goal; Red: > 12% above the goal.
KPI Owner: Kevin Smith (803-952-9924); SME: Linda Blackston (803-952-9905)
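A minimal sketch of the incidence-rate calculation and percent-above-goal scoring described in this entry (illustrative Python with hypothetical example numbers; not an SRS tool):

    # Illustrative sketch: incidence rate per 200,000 work hours and the
    # percent-above-goal color scoring used for the TRC and DART rates.
    def incidence_rate(cases: int, total_work_hours: float) -> float:
        return cases * 200_000 / total_work_hours

    def score_against_goal(rate: float, goal: float) -> str:
        percent_above = (rate - goal) / goal * 100
        if percent_above < 2:
            return "Green"
        if percent_above < 7:
            return "White"
        if percent_above < 12:
            return "Yellow"
        return "Red"

    # Hypothetical example: 115 recordable cases in 24,000,000 work hours.
    trc = incidence_rate(115, 24_000_000)
    print(round(trc, 2), score_against_goal(trc, goal=0.93))  # 0.96 White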

I-A-3 Cost Index
Definition: The Cost Index metric measures the severity of the injury or illness. It is equal to the approximate dollar loss (direct and indirect) per 100 hours worked of all injuries and illnesses, calculated using weighting factors which were derived from a study of the direct and indirect dollar costs of injuries.
Goals and Scores: Cost Index Goal = 3.00. Green: < 2% above goal; White: > 2% and < 7% above the goal; Yellow: > 7% and < 12% above the goal; Red: > 12% above the goal.
KPI Owner: Kevin Smith (803-952-9924); SME: Linda Blackston (803-952-9905)

I-B-1 Fire Protection Systems Impairment Status
Definition: This indicator depicts the percent of unplanned impairments of fire protection suppression and detection systems on site. A system is considered available if it is not impaired. Problems are detected by operations, maintenance, or fire department personnel through observation or routine checks. Fire protection suppression and detection systems are combined. WSRC does not track operable vs. inoperable. This indicator will be revised effective with the March data to track both unplanned impairments and impairments that have exceeded their planned duration.
Goals and Scores: The goal is to maintain unplanned fire protection system impairments at less than 2%. Green: < 2% of fire protection systems impaired; White: > 2% and < 4% impaired; Yellow: > 4% and < 10% impaired; Red: > 10% impaired.
KPI Owner: Terri Bolton (803-725-5173); SME: Richard Lewis (803-725-5211)

I-B-2 Emergency Preparedness Hazard Assessment Annual Review/Revision
Definition: This indicator depicts the review and revision status of Emergency Preparedness Hazard Assessments.
Goals and Scores: Green = accomplish 84-100% within 12 months and 100% within 15 months; White = accomplish 95-100% within 15 months; Yellow = accomplish 75-94% within 15 months; Red = < 75% within 15 months.
KPI Owner: Chris Baker (803-725-5096); SME: Lynda Blystone (803-557-9254)

I-B-3 Emergency Management Corrective Actions
Definition: This indicator depicts the status of corrective actions for Functional Area 13 related deficiencies at the site and facility level.
Goals and Scores: Green: 5% or less overdue; White: 10% or less overdue; Yellow: 15% or less overdue; Red: > 15% overdue.
KPI Owner: Chris Baker (803-725-5096); SME: Lynda Blystone (803-557-9254)

I-B-4 Emergency Exercises and Drills Conducted vs. Scheduled
Definition: This indicator depicts the number of exercises and drills scheduled each month versus the number actually conducted.
Goals and Scores: Green: > 80% of scheduled drills conducted; White: 65-79% of scheduled drills conducted; Yellow: 50-64% of scheduled drills conducted; Red: < 50% of scheduled drills conducted.
KPI Owner: Chris Baker (803-725-5096); SME: Lynda Blystone (803-557-9254)

I-C-1 Events Resulting in Reportable Dose Exceedances
Definition: This chart reflects the number of internal exposure cases from radioactive materials resulting in ≥ 100 mrem committed effective dose equivalent (CEDE) or external doses greater than the site ACL (or the FB-Line ACL for FB-Line employees) in a single event. Committed Effective Dose Equivalent (CEDE) is the cumulative whole body dose a worker would receive over a 50 year period from internal exposure.
Goals and Scores: Green: number of events resulting in reportable dose exceedances = 0; Yellow: number = 1; Red: number > 1.
KPI Owner: Jim Stafford (803) 952-9888; SME: Athena Freeman (803) 952-9938

I-C-2 Reportable Contamination Events
Definition: This chart reflects the number of personnel contamination events per month.
Goals and Scores: Green: ≤ 1 contamination event; White: > 1 and ≤ 3 contamination events; Yellow: > 3 and ≤ 5 contamination events; Red: > 5 contamination events.
KPI Owner: Jim Stafford (803) 952-9888; SME: Athena Freeman (803) 952-9938

I-D-1 Nuclear Safety Issue Management Index
Definition: The Nuclear Safety Issue Management Index measures the number of unresolved safety issues (NIs and PISAs) over the previous three months compared to the last 12 month period. Additionally, the average time and weighted mean time of open reports factor into the measurement formula. Unresolved is defined to be those NIs for which a PISA declaration decision has not been made, or PISAs which have not been rolled up into a JCO or any other AB document. (Additional details on Nuclear Safety can be found in Appendix B.)
Goals and Scores: The goal for the Nuclear Safety Issue Management Index is to have a measurement less than 1 during the quarter. Green is less than 1, White is 1, Yellow is 2 and Red is 3.
KPI Owner: G. Clare (952-7222); SME: Andrew Vincent (952-7209)

I-D-2 Significant Nuclear Safety Incidents Index
Definition: This metric reviews and grades all Unusual Occurrences (UOs) and other events related to Nuclear Safety listed in SIRIM under Facility Condition-01 for a rolling 3 months. UOs relating to the most serious incidents, such as For Cause Shutdowns, OSR/TSR violations and Criticality Limit Challenges, are graded more severely in the formula. (Additional details on Nuclear Safety can be found in Appendix B.)
Goals and Scores: The goal for the site is less than 15. Color definitions are as follows: Green: less than 10; White: 10 to 19; Yellow: 20 to 29; Red: 30 or greater.
KPI Owner: George Clare (952-7222); SME: Andrew Vincent (952-7209)

I-E-1 Security Incidents
Definition: This indicator depicts trending data for security incidents. WSRC does not separate Security Events from Computer Security Events and Document Security Events; therefore, for this metric all security events are combined.
Goals and Scores: The goal is to minimize the number of security incidents across the site. Green = 0-8 events; White = 9-13 events; Yellow = 14-18 events; Red = 19 or more events, based on 15,000 WSRC employees.
KPI Owner: Chris Baker (803-725-5096); SME: Lynda Blystone (803-557-9254)

II-A-1 Materials Processed Towards Stabilization
Definition: Combined stabilization of (1) items converted to Pu metal from solution and packaged in Bagless Transfer, (2) RFETS Pu metal canned in Bagless Transfer, (3) MK 16/22 assemblies and SFO containers charged, (4) oxide containers processed, (5) residue containers dispositioned, (6) RBOF storage units moved to L-Basin, and (7) FRR/DRR casks unloaded and processed. Monthly and cumulative scheduled versus actual stabilization activity is shown. The sum total of each of the units of production in the chart definition section is the number needed to meet the WSRC Nuclear Material Management Vision 2006 Roadmap Base Case and the corresponding base fee for the applicable Performance Based Incentive in accordance with the WSRC/DOE Performance Baseline.
Goals and Scores: Monthly: Green = > 80% of scheduled; White = 65-79% of scheduled; Yellow = 50-64% of scheduled; Red = < 50% of scheduled.
KPI Owner: John Dickenson (803-952-4604); SME: Harry Pund (803-952-3684)

II-A-2 Environmental Release Sites Assessments and Completions
Definition: This metric measures ER progress in assessing and completing release sites against its commitment to aggressively remediate inactive waste sites. Specific annual goals are reported to Congress, and a plan to accomplish these pre-established numbers is reviewed on a monthly basis. Assessments are complete when the key document for a site is submitted to the final approval authority. This includes submission of a Rev. 0 Record of Decision (ROD), RCRA Permit Closure Plan or a Site Evaluation Report ending in a No Further Action (NFA). Release sites are complete when the physical cleanup is complete and a final report is submitted for approval. An SE request for NFA is considered cleanup complete. A No Action recommendation in a Rev. 0 ROD is complete upon regulatory approval.
Goals and Scores: ER's goal for FY02 is to assess a total of 15 sites and complete at least 13 release sites. A plan is identified to accomplish these goals based on commitments incorporated in ER's actual scope of work. (This schedule for actual accomplishment may vary in conjunction with valid change control.) Scoring: Green = exceeding both goals; White = at least meeting goals; Yellow = behind on one goal; Red = behind on both goals.
KPI Owner: Dean Hoffman (2-6837); SME: Bob Pride (2-6479)

II-A-3 Tritium Production: This Level 2 is composed of two KPIs, Reservoir Performance and Tritium Loading and Finishing Time.

II-A-3 Reservoir Performance
Definition: This indicator depicts the completion of Function Tests in support of Reservoir Surveillance Operations (RSO), Life Storage Program (LSP) and Production Sampling (PS). This represents the PBI goal and the stretch PBI goal related to Function Tests and DP's progress toward reaching these goals.
Goals and Scores: The FY01 PBI goal was 101.7 and the stretch PBI goal was 116.95. For FY02, the PBI goal is 117 and the stretch PBI goal is 129. Goals are: Green (10% over PBI stretch goal for month); White (0% to 9.99% over PBI stretch goal for month); Yellow (up to 5% below stretch goal for month); Red (more than 5% below stretch goal for month).
KPI Owner: Cheryl Cabbil (208-1234); SME: Henry King (208-1729)

II-A-3 Tritium Loading and Finishing
Definition: This indicator shows Defense Programs' progress toward meeting monthly DOE shipping requirements.
Goals and Scores: The goal is to ensure reservoirs are ready to meet shipping requirements. Average Loading Lead Time goals: Green (> 60 days), White (> 45 days), Yellow (> 30 days) and Red (less than 30 days). Average Finishing Lead Time goals: Green (> 30 days), White (> 20 days), Yellow (> 15 days) and Red (less than 15 days).
KPI Owner: Cheryl Cabbil (208-1234); SME: Henry King (208-1729)

II-B-1 Preventive Maintenance/Predictive Maintenance
Definition: This metric measures the effectiveness of the Preventive Maintenance (PM) and Predictive Maintenance (PdM) programs by indicating what percentage of Total Maintenance is being performed as PM/PdM. This indicator was revised to reflect man-hours versus item count.
Goals and Scores: The goal is to perform 50% of the Total Maintenance as PM/PdM. A monthly index scoring is applied to the goal as follows: Green: 50-100%; White: 40-49%; Yellow: 30-39%; Red: < 30%.
KPI Owner: Chuck Campbell (803-725-2726); SME: Rick Fleming (803-725-1460)

II-B-2 Delinquent Predictive Maintenance
Definition: This metric measures the total number of delinquent Preventive Maintenance (PM) work orders; these PMs are not field work complete, but have exceeded their scheduled date and any allowable grace period.
Goals and Scores: The goal is to reduce delinquent PMs relative to a previous average of 215. A monthly index scoring is applied to the 70% goal as follows: Green: < 70% of goal; White: 71-110% of goal; Yellow: 111-130% of goal; Red: > 131%.
KPI Owner: Chuck Campbell (803-725-2726); SME: Rick Fleming (803-725-1460)

II-B-3 Corrective Maintenance Backlog in Man-Hours
Definition: This metric measures the Total Corrective Maintenance (CM) backlog in man-hours. Corrective Maintenance excludes work associated with Preventive Maintenance and Plant Modifications.
Goals and Scores: The goal is to reduce the CM backlog (in hours) relative to a previous average of 103,871 hours. A monthly index scoring is applied as follows: Green: < 75% of average; White: 76-105% of average; Yellow: 106-130% of average; Red: > 131% of average.
KPI Owner: Chuck Campbell (803-725-2726); SME: Rick Fleming (803-725-1460)

II-C-1 Conduct of Operations
Definition: This indicator is composed of the normalized sitewide monthly total of ORPS/SIRIM reportable occurrences in 16 selected Nature of Occurrence categories as defined in DOE M 232.1-1A, which are deemed to be of ConOps significance. The value is derived by dividing the number of SIRIM recordables by headcount, which is normalized for 200,000 work hours.
Goals and Scores: The goal is a sitewide normalized cumulative value per month of 4.5. Green is < 4.5, White is 4.5-5.5, Yellow is 5.5-6.0, and Red is > 6.0.
KPI Owner: Steve Johnson (803-952-9886); SME: Joy Price (803-952-9657)

II-C-2 Violation/Inadequate Procedures
Definition: This indicator depicts the normalized sitewide monthly total of ORPS/SIRIM reportable occurrences in the Nature of Occurrence category 01F, Violation/Inadequate Procedures, as defined in DOE M 232.1-1A. The value is derived by dividing the number of SIRIM recordables by headcount, which is normalized for 200,000 work hours.
Goals and Scores: The goal is a sitewide normalized cumulative value per month of 1.5. Green is defined to be < 1.5, White is > 1.5 and < 2.0, Yellow is > 2.0 and < 2.5, and Red is > 2.5.
KPI Owner: Steve Johnson (803-952-9886); SME: Joy Price (803-952-9657)


II-C-3 Pollution Prevention and Waste Reduction
Definition: This indicator depicts WSRC's progress toward meeting a 20% reduction in newly generated radioactive and hazardous solid waste volumes based on the DOE-SR approved waste forecast. The objective is to implement changes to decrease the amount of waste generated, thereby reducing current and future operating cost and environmental and personnel risks.
Goals and Scores: Achieve an annualized disposal volume reduction of 20% (1,222 cubic meters) based on the DOE-SR approved solid waste forecast. WSRC documentation of implemented waste reduction initiatives' impacts determines both waste volume and operating cost reduction performance. Green is > 10% above the monthly prorated goal, White is 0-9% above the monthly prorated goal, Yellow is 1-9% below the monthly prorated goal, and Red is > 10% below the monthly prorated goal.
KPI Owner: Luke Reid (803-557-6309); SME: Tim Coffield (803-557-6316)

II-D-1 Design Engineering Changes
Definition: The purpose of this metric is to measure design changes caused by design errors. This is accomplished by tracking the fraction of design changes required because an error was made by engineering in providing requirements or in performing the design. The value charted is the design changes NOT caused by design error. This metric provides SRS and WGI with an indication of the quality of engineering design.
Goals and Scores: The goal is based on showing improvement over historical performance. Green: 95% or higher; White: 85-94%; Yellow: 75-84%; Red: < 75%.
KPI Owner: George Clare (803-952-7222); SME: Monica Jorque (803-952-9246)

II-D-2 Engineers Participating in IDEAS
Definition: The purpose of this metric is to measure participation by engineers in improvements to products and processes through the employee involvement program. This is accomplished by tracking submittals by engineers to the IDEAS program. This metric provides SRS and WGI an indicator as to the degree to which engineers are initiating recommendations for product/process improvements.
Goals and Scores: The monthly goal (40) is derived from an annual goal of 475 IDEAS submittals (averaging 0.3 submittals per engineer). Green: 90% of goal; White: 80-89% of goal; Yellow: 70-79% of goal; Red: < 70% of goal.
KPI Owner: George Clare (803-952-7222); SME: Brenda Kelly (803-725-0676)

II-D-3 Engineering Productivity: The purpose of this metric istrack engineering productivity by combining specific Divisionmeasurements, normalize for the goal of 1.00, and weightingeach measure by the ratio of Division Engineering organizationpopulation to the Site Engineering organization population. EachDivision metric is linked to deliverables, commitmentfulfillment, or equipment availability. As an example, HLW

The goal is to maintain a normalized ratio for the composite measure and the individual Division measures each month. Green is 1.25 or better, White is 1.24 to 0.85, Yellow is 0.84 to 0.65, and Red is <0.64.

KPI Owner: George Clare (803-952-7222) SME: Andrew Vincent (803-952-7209) or William Tucker (803-952-7182).


II-E-1, 2, and 3 Project Management Schedule, Cost, and Health: Applies to all active projects (i.e., Line Item, General Plant, Capital Equipment and Cost funded). The indicators are officially measured quarterly. Prior to the beginning of each quarter within each focus area, attributes and initiatives are developed and weighted relative to customers' expectations jointly by WSRC & DOE-SR for each project. Performance against these attributes and initiatives is scored jointly by WSRC & DOE-SR at the end of the quarter. For the months in between, an internal assessment and score will be generated based upon monthly review meetings with DOE-SR. This KPI includes an assessment of the monthly project Forecast At Completion (FAC) as part of the cost metric.

Goal is to achieve a score of 95 or greater for schedule, cost, technical and overall. Green is "Excellent" (95 to 100); White is "Very Good" (85-94); Yellow is "Good" (75-84); Red is "Poor" (score less than 75).

KPI Owner: Steve Morton (803-952-6620) SME: David M. Matos (803) 952-7294.

III-A-1 Reportable Environmental Events: This indicator depicts the number of Reportable Environmental Events (non-compliances as determined by receipt of regulatory notification from SCDHEC and/or EPA) and the number of National Response Center notifications of releases greater than the reportable quantity. The objective of this indicator is to measure continuous improvement in the SRS Environmental Management System. One point is assigned for each release. REEs include events (letters of Non-Compliance) and releases (greater than Reportable Quantity).

Achieve a reduction of 20% for REEs based on WSRC's current FY02 performance compared to the previous five-year performance average. (The FY01 goal was 6.) The FY02 goal was lowered from 6 to 4 or fewer points. Scoring for REE: Green is 4 or less, White is 5, Yellow is 6, and Red is >6.

KPI Owner: Patricia Allen (803-725-1728) SME: Dave Lester (803-725-2904)

III-A-2 Projected Dose to the Public: Operations at SRS, which are permitted for the release of radionuclides, operate far below regulatory standards. Consistent with the ALARA program, this indicator measures how effectively operations are performed to maintain permitted releases below site-determined ALARA Guides. This assures the public and site stakeholders that operations are not resulting in doses to the public near regulatory limits.

The goal of this indicator is to show that operations do not exceed 100% of the ALARA Release Guide for a composite of Liquid or Air Releases. Red is 100%-90%; Yellow is 89%-85%; White is 84%-80%; Green is <80%.

KPI Owner: Patricia M. Allen (803-725-1728) SME: Dave Lester (803-725-2904)


III-B-1 Enforceable Agreement Milestones – Federal Facility Agreement: This metric measures Environmental Restoration Division progress in completing regulatory milestone deliverables. These deliverables are contained in the Federal Facility Agreement and the RCRA permit. Milestones are negotiated annually with EPA and SCDHEC and are modified in accordance with the FFA.

ER's goal is to meet all milestones on or before the due date. These milestones are identified in Appendix D of the FFA, the RCRA permit, and other official correspondence and documents that set out regulatory expectations. Since a missed milestone can lead to regulatory enforcement actions, there is no acceptable number that can be missed. Scoring: G = no missed milestones plus significant, high-value negotiated regulatory outcomes; W = no missed milestones; R = 1 or more missed milestones.

KPI Owner: Kim Cauthen (2-6540) SME: Patty Burns (2-6750)

III-B-2 Environmental Enforcement Actions (Notices of Violation): This indicator depicts the number of Notices of Violation (NOVs) and fines issued to WSRC by SCDHEC and/or EPA. Total NOVs/fines paid are weighted and have been rescaled to reflect current practices for SCDHEC issuance of NOVs. NOV without fine = 0.5 point, NOV with fine <$10K = 1 point, NOV with fine between $10K and $50K = 3 points, NOV with fine >$50K = 6 points. The scale for scoring has been modified to reflect this weighting scheme.

The goal is 3 or fewer points. Scoring: Green <= 3 points, White = 4-6 points, Yellow = 7-9 points, Red > 9 points.

KPI Owner: Patricia Allen (803-725-1728) SME: David Lester (803-725-2904)
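As a worked illustration of the III-B-2 weighting and scoring, the sketch below totals points for a hypothetical set of NOVs; the helper names, fine amounts, and boundary handling (e.g., a fine of exactly $10K) are assumptions, not site data or procedure.

```python
from typing import Optional

def nov_points(fine: Optional[float]) -> float:
    """Weight a single Notice of Violation per the III-B-2 scheme."""
    if fine is None:
        return 0.5            # NOV without fine
    if fine < 10_000:
        return 1.0            # NOV with fine < $10K
    if fine <= 50_000:
        return 3.0            # NOV with fine between $10K and $50K
    return 6.0                # NOV with fine > $50K

def nov_color(total_points: float) -> str:
    if total_points <= 3:
        return "Green"
    if total_points <= 6:
        return "White"
    if total_points <= 9:
        return "Yellow"
    return "Red"

fines = [None, 4_000, 60_000]                # one unfined NOV, one small fine, one large fine
total = sum(nov_points(f) for f in fines)    # 0.5 + 1 + 6 = 7.5 points
print(total, nov_color(total))               # -> 7.5 Yellow
```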

III-B-3 South Carolina Department of Health and Environmental Control and Environmental Protection Agency Index: This metric measures responsiveness to SCDHEC and/or EPA environmental compliance assessments. SCDHEC conducts most of the assessments (CWA NPDES 3560, CAA, RCRA CME, SDWA, UST). EPA participates as its schedule allows and leads assessments for CERCLA, TSCA, and FIFRA. Points are assigned to findings and days delinquent to develop an index: 0 points for <3 findings, 5 points for 3-4 findings, 7 points for 5-10 findings, and 10 points for >10 findings; 0 points for 0 days delinquent, 10 points for 1-4 days, 15 points for 5-10 days, and 20 points for >10 days.

The goal is to show adherence to regulations and responsiveness to SCDHEC and/or EPA environmental compliance assessments. The goal is to have fewer than three findings per audit, or a numeric index total of no more than 5 points. Green is defined as <5 points; White is 6-15 points; Yellow is 16-22 points, and Red is >23 points.

KPI Owner: Patricia Allen (803-725-1728) SME: Dave Lester (803-725-2904)


III-B-4 Environmental Commitments Index: This indicator depicts WSRC progress toward meeting 100% of regulatory commitments. The objective of this indicator is to measure performance in meeting federal and state environmental commitments. This is a combined graph that includes both the Overdue Commitments Index and the Open Regulatory Items Index; federal and state regulatory agency commitments are applicable to WSRC.

Achieve 100% compliance in meeting regulatory commitments on schedule. Ratings for Percent on Schedule: Green is defined as 100%-95%, White is 94%-90%, Yellow is 89%-85%, and Red is <85%. Rating for meeting environmental compliance commitments: Green = 100%, Red < 100%.

KPI Owner: Patricia Allen (803-725-1728) SME: Dave Lester (803-725-2904)

III-C-1 Citizens Advisory Board Responsiveness: This metric depicts responsiveness to SRS Citizens Advisory Board recommendations by the Department of Energy. It measures the number of days to respond to a recommendation.

The goal is for DOE-SR to respond to CAB recommendations within 10 days from receipt. Green is all responses in less than 10 days. White is any response in 11-20 days. Yellow is any response in 20-30 days. Red is any response more than 30 days from the date of receipt.

KPI Owner: Teresa Haas, 5-4643 SME: Dawn Haygood, 5-9668

III-C-2 Public Meeting Effectiveness: This indicator measures the effectiveness of public meetings based on DOE and stakeholder feedback, meeting preparation/follow-up and attendance.

The goal is to maintain a score above 75 points. Points are assigned based on feedback forms, previous year averages and checklist completion. Green is a score of 75 or above, White is a score of 70-74, Yellow is a score of 65-69, and Red is a score of <65.

KPI Owner: Teresa Haas (803-725-4643) SME: Dawn Haygood (803-725-9668)

III-C-3 Public Involvement Response to Inquiries: This metric measures how quickly SRS Public Involvement staff respond to public inquiries made via phone, email, and meetings.

The goal is to respond within a 5-day period. Green = average response in less than two days, White = average response in 3 days, Yellow = average response in 4 days, and Red = response >5 days.

KPI Owner: Teresa Haas, 5-4643 SME: Dawn Haygood, 5-9668

III-D-1 Tours/Visitors Program Effectiveness: SRS tours are offered to the public to educate them on site activities/capabilities. A survey is given to each tour group of 5 or more touring SRS that contains the following questions: "Before the tour today, how did you feel about SRS?" "How do you feel about SRS now that your tour is complete?" The SRS Opinion Survey goal reflects the feelings of tour participants toward the site. It is important that those who participate leave the site with an opinion that is equal to or better than when they came.

The goal is to strive for positive, continuous improvement in the public's perception of SRS. Green: x > 3.75; White: 3.5 < x < 3.75; Yellow: 3.0 < x < 3.5; Red: x < 3.0.

KPI Owner: Teresa Haas (803) 725-4643 SME: Laurie Posey (803) 725-5505



III-D-2 Low Country Plan Implementation: This metric is designed to demonstrate WSRC's performance in meeting the milestone commitments identified in the WSRC 2002 Low Country Outreach Strategy and Plan. This tool will focus management attention to ensure that Overdue and Not Completed milestones receive the required level of attention through completion. The formula is YTD Completed divided by YTD plus Milestones Overdue. The due date is either the original target date identified in the plan or the revised date that has been approved via an informal change control process with our DOE-SR customer.

Green – 95-100%; White – 85-94%; Yellow – 80-84%; Red – <80%.

KPI Owner: Lessie B. Price (803) 502-9983 SME: Phillip T. Mottel (803) 502-9984
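The completion formula above can be read directly as a ratio; the short sketch below follows that reading. The function names and milestone counts are illustrative, and treating "YTD" as the year-to-date completed count in the denominator is an assumption about the intended arithmetic.

```python
def low_country_pct(ytd_completed: int, milestones_overdue: int) -> float:
    """Percent complete = YTD Completed / (YTD Completed + Milestones Overdue) x 100."""
    return 100.0 * ytd_completed / (ytd_completed + milestones_overdue)

def low_country_color(pct: float) -> str:
    if pct >= 95:
        return "Green"
    if pct >= 85:
        return "White"
    if pct >= 80:
        return "Yellow"
    return "Red"

pct = low_country_pct(ytd_completed=17, milestones_overdue=2)   # about 89.5%
print(round(pct, 1), low_country_color(pct))                    # -> 89.5 White
```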

III-D-3 Education Outreach is composed of the Traveling Science Program and Internships at SRS.

Traveling Science: This metric indicates the number of requests received from teachers in the CSRA for scientists and engineers at the Savannah River Site to demonstrate scientific, mathematics or engineering topics in the classroom. The metric also indicates the requests from the teachers that were completed during the quarter.

The goal of the traveling science program is for CSRA teachers to have access to Savannah River Site scientists and engineers. EOP strives to complete 100% of the requests from the teachers. Green = 100%; White = 70-99%; Yellow = 50-69%; Red = <50%.

KPI Owner: Barbara Smoak (725-7761) SME: Bonnie Toole (725-7473)

Internships at SRS: Internship programs offer opportunities to the education community to gain experience and knowledge in their field and build a pipeline of future skilled workers to support site missions. This metric measures the number of interns utilizing Site resources by participating in the Research Internship Program and the School-to-Work Program.

The goal is to provide at least 100 internship opportunities at SRS through the School-to-Work program and the Research Intern program. Green is >200 participants; White is 130-199; Yellow is 100-130; Red is <100.

KPI Owner: Barbara Smoak (5-7761) SME: Bonnie Toole (5-7473)

III-E-1 Number of Open Employee Concerns per 100 Employees: This indicator reflects the number of concerns per 100 employees. The information provides a common denominator by which WSRC can identify adverse trends. WSRC can use this data, internally and externally, for benchmarking with other organizations.

Goal: Maintain concerns per 100 employees at a factor of less than 2.00. Green is defined as <2.00; White is >2.00 and <3.00; Yellow is >3.00 and <4.00; and Red is >4.00.

KPI Owner: John Runnels (803-725-4919) SME: Larry Adkinson (803-725-1951)



III-E-2 Average Number of Days to Closure of Employee Concerns: This indicator reflects the average number of days required to internally resolve employee issues through the WSRC Employee Concerns Program. Timely resolution is directly related to the nature of the concern and the depth of investigation required. Thoroughness and accuracy of the investigation are often related to customer satisfaction, which is important to our employees, customers, and Board members and is also an important factor of risk management. However, the time factor is but one element of quality.

Goal: Resolve issues expeditiously such that the average days to closure remains less than 60 days. (This factor may vary widely and have less validity when small numbers of concerns are being reported.) Green is defined as <60 days; White is defined as 61-90 days; Yellow is defined as 91 to 120 days; and Red is defined as >121 days.

KPI Owner: John Runnels (803-725-4919) SME: Larry Adkinson (803-725-1951)

III-E-3 Staffing to Target: This indicator demonstrates how closely the current workforce is aligned with projected staffing targets. Staffing plans and workforce skills analyses are updated to coincide with the program planning cycle. Together, they highlight both shortages and excesses in terms of manpower resources and worker skill sets.

Goal is to maintain a staffing to target ratio of 97% or higher. Green is 98.6-100%; White is 97-98.5%; Yellow is 95-96.9%, and Red is less than 94.9%.

KPI Owner: Ted Myers (803-725-4691) SME: Willie Bell (803-725-4207)

IV-A-1 Cumulative Budget Cost Performance for DP, EW02 and EW04: This indicator shows fiscal year-to-date operating, capital equipment, and general plant project expenditures as compared to budget. Line items are excluded. Measurement will be done for the major B&R structures on the site: EW02, EW04 and DP.

The goal for this metric is for actual expenditures to be close to budgeted expenditures. Green is where the performance is >95% and <100%, White is 85-95% or 100-102%, Yellow is 80-85% or 102-105%, and Red is <80% or >105%.

KPI Owner: Dan Becker (803-725-1124) SME: Susan Crosby (803-725-7668)
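Because the IV-A-1 bands penalize both under-spending and over-spending, a small sketch may make the two-sided scoring easier to see. The function name, the example dollar figures, and the handling of exact band boundaries are assumptions for illustration only.

```python
def budget_color(actual: float, budget: float) -> str:
    """Two-sided IV-A-1 banding: far under or far over budget both degrade the score."""
    pct = 100.0 * actual / budget
    if 95 < pct < 100:
        return "Green"
    if 85 <= pct <= 95 or 100 <= pct <= 102:
        return "White"
    if 80 <= pct < 85 or 102 < pct <= 105:
        return "Yellow"
    return "Red"

print(budget_color(9.7e6, 10.0e6))    # 97% of budget  -> Green
print(budget_color(10.4e6, 10.0e6))   # 104% of budget -> Yellow
```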

IV-A-4 Comparison of Actual Overtime Cost to Budget: This metric compares FYTD monthly overtime actuals as a percent of the overtime budget.

The goal for this metric is for actual overtime to be less than or equal to 102% of budget. Green is where the actual is equal to or <102%, White is 103-106%, Yellow is 107-110%, and Red is >110%.

WSRC KPI Owner: Dan Becker (803-725-1124) SME: Paula Hardin (803-725-1767)

IV-B-1 Performance-Based Incentive Performance: This indicator reflects contract performance success as indicated by fee from the WSRC/Division PBI milestones completed. The percentage shown is the milestones completed versus the projected completions. Total fee invoiced (within 5 days of month end) is shown as the measurement. Adjustments have been/are made for WSRC/DOE approved baseline changes.

The ranges utilized reflect the higher expectations of WSRC and are in excess of those required in the Contract. Green: x > 95%; White: 85% < x < 95%; Yellow: 75% < x < 85%; Red: x < 75%.

KPI Owner: Clay Jones (803-725-4409) SME: Phil Croll (803-725-3158)



IV-B-2 Super Stretch Performance-Based Incentives Performance: This indicator reflects contract performance success as reflected by fee from Super Stretch (SS) PBIs completed. The percentage shown is the actual cash flow generated versus the projected cash flow for currently approved SS PBIs. Adjustments are made when new SS initiatives are approved or when operating circumstances require cash flow projection revision.

The goal is to maximize super stretch earnings throughout the fiscal year. Success is measured as follows: Green: x > 90%; White: 80% < x < 90%; Yellow: 70% < x < 80%; Red: x < 70%.

KPI Owner: Clay Jones (803-725-4409) SME: Phil Croll (803-725-3158)

IV-C-1 WSRC Encumbrance Summary (Operations): To measure the quarterly Operating Encumbrance Summary (exclusive of Capital Equipment and GPP) of the major B&Rs (DP09, EW02, and EW04) against the established target.

The goal of this metric is for the actual encumbrance to be close to the target encumbrance. Green is where the performance is >95% and <100%. White is 85-95% or 100-102%. Yellow is 80-85% or 102-105%. Red is <80% or >105%.

KPI Owner: John Cantwell 952-9236 SME: Carol Posey 952-8233

IV-C-2 On Time Payments: This indicator reflects actual monthly invoices submitted as a percentage of approvals from DOE within a 10-day time frame.

Success is measured as follows: Green: X > 90%; White: 80% < X < 90%; Yellow: 75% < X < 80%; Red: X < 75%.

KPI Owner: Dan Becker (803-725-1124) SME: Anthony Grant (803-952-8803)

IV-C-3 On Time Invoicing: This indicator reflects milestones completed versus milestones invoiced (within five working days). The measure is the fee associated with the milestones.

The goal is to maintain an on-time (within five working days of completion) invoicing ratio of greater than 90% for all PBI milestones/objectives completed. Success is measured as follows: Green: X > 95%; White: 85% < X < 95%; Yellow: 75% < X < 85%; Red: X < 75%.

KPI Owner: Clay Jones (803-725-4409) SME: Philip Croll (803-725-3158)


IV-C-4 Cost Savings Performance: This indicator depicts the site's progress toward achievement of the PACE/(CRIT) goal. (The CRIT goal and the PACE goal are the same.) A monthly interim goal has been determined to ensure sufficient progress is being made toward the final goal for the year. Monthly goals are a function of either (1) the total projected value of all current PACE initiatives with a projected implementation date of the current month or earlier, or (2) the total projected value needed by the indicated date to ensure that the site's goal is met by the end of the fiscal year. The actual savings to date vs. the goal to date equals the cost savings performance.

The Site PACE Goal for fiscal year 2002, as of 1/31/02, is $120,103,000. The FY '02 PACE goal may fluctuate according to the additional funds demand created by emerging scope. Green is >95% of the monthly goal; White is defined as >85% and <95% of the monthly goal; Yellow is defined as >80% and <85% of the monthly goal, and Red is defined as <80% of the monthly goal.

KPI Owner: Frank Iwuc (803-725-7199) SME: Mike Tarrant (803-725-0679)

IV-C-5 Employees Cost Effectiveness Performance (IDEAS): To show the number of approved IDEAS needed per month to accomplish the goal of 12 approved IDEAS per 100 employees for the year (or 0.12 IDEAS per employee).

1,500 approved IDEAS for FY02 (based on 12 approved IDEAS per 100 employees). Green is >90% of the monthly goal; White is 80-90% of the monthly goal; Yellow is 70-79% of the monthly goal, and Red is <70% of the monthly goal.

KPI Owner: Frank Iwuc (803-725-7199) SME: Brenda Kelly (803-725-0676)

IV-D-1 Nonconformance Report Processing Status: NCRs document item deficiencies as required by 10 CFR 830, Subpart A, "Quality Assurance Requirements." This indicator depicts the nonconformance report (NCR) population in three age categories: <60 days, between 60 and 120 days, and >120 days old. Suspended NCRs are not included in this metric because they cannot be dispositioned "due to a project cancellation, suspension, extended system shutdown, facility closure or abandonment-in-place" (WSRC 1Q, QAP 15-1).

Green: population of old NCRs is <=200. White: population of old NCRs is >200, but <=250. Yellow: population of old NCRs is >250, but <=300. Red: population of old NCRs is greater than 300. NOTE: Old NCRs are defined as those active NCRs >120 days old.

KPI Owner: Joe Yanek (803-952-9893). SME: Sun Hwang (803-952-9926).

IV-D-2 Problem Identification Report Status: This performance metric evaluates the company's ability to identify problems using the Problem Identification Report (PIR) program and effect corrective actions in the time established by the responsible manager. Both the estimated corrective action completion dates and the responsible manager(s) are identified on the PIR document. The data will take into account the Significance Categories of the PIRs.

Green: average PIR corrective actions completed within 5 days of the estimated completion date. White: average PIR corrective actions completed between 5 and 10 working days of the estimated completion date. Yellow: average PIR corrective actions completed between 10 and 15 working days of the estimated completion date. Red: average PIR corrective actions completed after 15 working days of the estimated completion date.

KPI Owner: Joe Yanek (803-952-9893). SME: Sun Hwang (803-952-9926).



IV-D-3 Self-Assessment Program Effectiveness: This indicator depicts the percent of self-assessment findings for the month that are identified and closed (activity) each month. Additionally, the problem recurrence rate is presented as a function of total findings identified: Recurrence Rate (%) = (Recurring Findings / Total Findings Identified) x 100.

No more than 10% of findings recur within the assessment unit. Green is <10%, White is >10% and <15%, Yellow is >15% and <20%, and Red is >20%.

KPI Owner Joe Yanek (803-952-9893) SME: Sun Hwang (803-952-9926)
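A minimal sketch of the recurrence-rate calculation and its scoring follows; the finding counts are illustrative, not reported data.

```python
def recurrence_rate(recurring_findings: int, total_findings: int) -> float:
    """Recurrence Rate (%) = Recurring Findings / Total Findings Identified x 100."""
    return 100.0 * recurring_findings / total_findings

def recurrence_color(rate_pct: float) -> str:
    if rate_pct < 10:
        return "Green"
    if rate_pct < 15:
        return "White"
    if rate_pct < 20:
        return "Yellow"
    return "Red"

rate = recurrence_rate(recurring_findings=3, total_findings=40)   # 7.5%
print(rate, recurrence_color(rate))                               # -> 7.5 Green
```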

IV-D-4 Facility Evaluation Board Grade Status: This indicator reflects a running average of the most recent grades of all evaluated facilities and programs. Grades are assigned the following values: Excellent - 5, Above Average - 4, Average - 3, Below Average - 2, Significantly Below Average - 1.

The goal is a running average value of 3.0. Green: >3.5; White: 2.5-3.5; Yellow: 2.0-2.5; Red: <2.0.

KPI Owner: Steve Johnson (803-952-9886) SME: Joy Price (803-952-9657)

V-A-1 Defense Nuclear Facility Safety Board 94-1 and 2000-1 Recommendation Implementation: This indicator depicts the year-to-date and cumulative status of SRS DNFSB Recommendation 94-1/2000-1 commitments. Beginning with FY01, there were 25 remaining SRS commitments. The remaining DNFSB 94-1/2000-1 milestone commitments are scheduled over the next 7 years with only a few due each year. It is more meaningful to monitor progress for the entire life of the SRS commitments.

Complete the remaining DNFSB Recommendation 2000-1 milestones by the due date in the DOE Implementation Plan for the Remediation of Nuclear Materials in the Defense Nuclear Facilities Complex (Revision 1), dated January 19, 2001. Green: all DNFSB 2000-1 commitments are on schedule; White: one commitment is likely to be late; Yellow: 2-3 commitments are likely to be late; Red: more than three commitments are likely to be late.

KPI Owner: John Dickenson (803-952-4604) SME: Harry Pund (803-952-3684)

V-B reflects schedule, cost and Annual Operating Plan Milestone Performance for three SRS projects: MOX Fuel Fabrication Facility (MFFF), Pit Disassembly and Conversion Facility (PDCF), and Highly Enriched Uranium Facility (HEU).

V-B-1 MFFF and PDCF Schedule Performance: This indicator shows performance against scheduled deliverables for the PDCF and MFFF project support as a percentage of deliverables to WGI, DCS, or NNSA made on schedule.

The goal is to deliver 85-90% of deliverables on schedule. Green denotes 95-100% delivered on schedule, White is 85-94%, Yellow is 80-84% and Red is <80% delivered on time.

KPI Owner: Jimmy Angelos (803-725-8593) SME: Dick Tansky (803-725-5303)


V-B-1 HEU Schedule Performance: This indicator shows the monthly Schedule Performance Index (SPI) for the HEU Blend Down Program. The cumulative SPI captures the SPI from inception to date for the HEU Blend Down Project.

The goal for this metric is for the monthly Budgeted Cost of Work Performed (BCWP) to exceed the Budgeted Cost of Work Scheduled (BCWS). Green is where the performance is >1.00. White is 0.99-0.90, Yellow is 0.89-0.80 and Red is <0.79.

KPI Owner: Sam Speight (803) 208-2796; SME: Perry Stanley (803) 208-3192

V-B-2 MFFF and PDCF AOP Milestone Performance: This indicator shows AOP milestone on-time completion for the fiscal year to date for the Pu Disposition Program against the milestones scheduled.

The goal for this metric is to complete all AOP milestones on time. Green is 100% of milestones met on time in the current month.

KPI Owner: Jimmy Angelos (803) 725-8593; SME: Dick Tansky (803-725-5303)

V-B-3 MFFF and PDCF Cost Performance: This indicator shows fiscal year-to-date OPC and TEC expenditures in support of the MFFF, PDCF and PDP Waste Building projects as compared to budget.

The goal for this metric is for actual expenditures to be close to budgeted expenditures. Green is where the performance is >95% and <100%, White is 85-95% or 100-102%, Yellow is 80-85% or 102-105%, and Red is <80% or >105%.

KPI Owner: Jimmy Angelos (803) 725-8593; SME: Dick Tansky (803-725-5303)

V-B-3 HEU Cost Performance: This indicator shows the monthly Cost Performance Index (CPI) for the HEU Blend Down Program. The cumulative CPI captures the CPI from inception to date for the HEU Blend Down Project.

The goal for this metric is for the monthly Budgeted Cost of Work Performed (BCWP) to exceed the Actual Cost of Work Performed (ACWP). Green is where the performance is >1.00. White is 0.99-0.90, Yellow is 0.89-0.80 and Red is <0.79.

KPI Owner: Sam Speight (803) 208-2796; SME: Perry Stanley (803) 208-3192
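The two HEU indices are standard earned-value ratios; the sketch below shows the arithmetic and the shared color banding. The BCWS/BCWP/ACWP figures are illustrative placeholders, and the treatment of a value of exactly 1.00 is an assumption.

```python
def spi(bcwp: float, bcws: float) -> float:
    """Schedule Performance Index: budgeted cost of work performed over work scheduled."""
    return bcwp / bcws

def cpi(bcwp: float, acwp: float) -> float:
    """Cost Performance Index: budgeted cost of work performed over actual cost."""
    return bcwp / acwp

def index_color(value: float) -> str:
    """V-B banding: >1.00 Green, 0.90-0.99 White, 0.80-0.89 Yellow, lower is Red."""
    if value > 1.00:
        return "Green"
    if value >= 0.90:
        return "White"
    if value >= 0.80:
        return "Yellow"
    return "Red"

bcws, bcwp, acwp = 4.0e6, 3.8e6, 4.1e6
print(index_color(spi(bcwp, bcws)))   # SPI = 0.95  -> White
print(index_color(cpi(bcwp, acwp)))   # CPI ~ 0.93  -> White
```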

V-C-1 Receipts of Rocky Flats Plutonium, V-C-2 Ship to WIPP, and V-C-3 Mound Receipts are currently being developed.


WASHINGTON GOVERNMENT GROUP INITIAL DIRECTIONS FOR PERFORMANCE METRICS

This section provides the general definitions and goals of KPIs as developed by the Washington Government Group after a year of study by a crosscutting set of subject matter experts. They are general guidance and should be used by sites in the development of their own metrics. Each site may want to "bin" the KPIs to suit its particular needs.

1. INDUSTRIAL SAFETY INDICATORS

A. Total Recordable Case Rate
Suggested Goal: 3% reduction in the average of the previous three years
Green = >3% reduction = 3 points
White = >1% reduction = 2 points
Yellow = >0% reduction = 1 point
Red = increase = 0 points

B. Lost Days (days away) Rate
Suggested Goal: >5% reduction of the average of the past 3 years' rates
Green = >5% reduction = 3 points
White = >3% reduction = 2 points
Yellow = >0% reduction = 1 point
Red = increase = 0 points

C. Lost and Restricted Work Day Case Rate
Suggested Goal: 3% reduction in the average of the previous three years
Green = >3% reduction = 3 points
White = >1% reduction = 2 points
Yellow = >0% reduction = 1 point
Red = increase = 0 points

2. ENVIRONMENTAL MANAGEMENT LEVEL 2 INDICES

A. Reportable Environmental Events
1.) Number of regulatory non-compliances (as determined by receipt of notification, e.g., Notice of Noncompliance from Regulatory Agency)
2.) Number of National Response Center notifications (CERCLA/EPCRA release > reportable quantity)

Suggested goal: x% lower than the previous four-year performance average to reflect continuous improvement.
Target for KPIs:
Green >10% below goal (continued performance at no reportable events is rated Green) = 3 points
White 0-9% below goal = 2 points
Yellow 1-9% above goal = 1 point
Red >10% above goal = 0 points

B. Environmental Enforcement Actions
1.) Number of NOVs
2.) Total Fines/Penalties paid (weighted – see below)

NOV w/o fine = 1
NOV w/ fine < $50K = 3
NOV w/ fine > $50K = 6

Suggested goal: x% lower than the previous four-year performance average to reflect continuous improvement.


Target for KPI #2:
Green >10% below goal (continued performance at no NOVs is rated Green)
White 0-9% below goal
Yellow 1-9% above goal
Red >10% above goal

C. Pollution Prevention/Waste Reduction
1.) Reduction in Volume of Waste Generated
- Hazardous
- Radioactive
- Sanitary (non-hazardous, non-radioactive)
2.) Sum of disposal costs saved

Suggested goal: x% (each organization is to decide its own goal) reduction in waste generation from the previous year while meeting all applicable regulatory requirements and/or goals set by EPA, DOE, DOD, etc.
Proposed measure: based on the yearly waste generation forecast from planned activities, achieve percentage reductions for each of the three waste stream types (each having its own goal). Hazardous and radioactive waste reduction efforts should be given more emphasis because of higher associated costs. Since unit disposal costs may vary depending on location, it is proposed that total cost savings be reported but not factored into the target calculation.
Target for KPI #3:
Green >10% above goal
White 0-9% above goal
Yellow 1-9% below goal
Red >10% below goal

3. EMERGENCY SERVICES AND FIRE PROTECTION
Each organization should set a goal (either a number or percent) of fire protection and fire detection systems it will allow in impaired or inoperable condition (with compensatory action) at any time.

Green = 0.8 x goal
White = goal
Yellow = 1.1 x goal
Red = 1.2 x goal

A similar approach is to be taken for drills and exercises scheduled versus performed and the percentage of overdue corrective actions and assessments tolerated.
A. Fire Protection Systems
B. Fire Detection Systems
C. Emergency Services

4. MAINTENANCE INDICATORS

A. PM/PdM as a % of Total Maintenance
This metric measures the effectiveness of PM and Predictive Maintenance (PdM) by indicating what percentage of Total Maintenance is being performed as PM/PdM. (Since this is a ratio, the measurement of PM and PdM can be in whatever units the organization determines most useful, i.e., hours, man-hours, events, actions, etc.)
Suggested Goal: 50%
Green = 56-100%
White = 40-55%
Yellow = 30-39%
Red = <30%


B. Maintenance CM Backlog in Man-hours
This metric measures the total Corrective Maintenance (CM) backlog in man-hours. It excludes work associated with Preventative Maintenance, Predictive Maintenance, and plant modifications.
Suggested Goal: The goal is equal to the average monthly backlog for the previous year. A monthly index scoring is applied to the goal as follows:
Green = <70% of goal
White = 71-105% of goal
Yellow = 106-130% of goal
Red = >131% of goal

C. Maintenance Delinquent PMs
This metric is intended to measure the total number of Delinquent Preventative Maintenance (PM) work orders – those PMs that are not field work complete, but have exceeded their scheduled completion date and any allowable grace period.
Suggested Goal: The goal is equal to the average monthly backlog for the previous year. A monthly index scoring is applied to the goal as follows:
Green = <70% of goal
White = 71-105% of goal
Yellow = 106-130% of goal
Red = >131% of goal

5. MANAGEMENT

A. NCR/PDR/Deficiency Reports Processing Status (age)
Each site would establish its own statistical baseline and then work toward an improvement goal. The coloring would be based on being within the goal or improvement curve.
Green = <95% of goal
White = 96-100% of goal
Yellow = 101-110% of goal
Red = >110% of goal

B. Corrective Action Reports/Items Completed on Time
Green = >95%
White = >90%
Yellow = >85%
Red = <85%

C. QA Surveillance Report Processing Status (age)
Each site would establish its own statistical baseline and then work toward an improvement goal. The coloring would be based on being within the goal or improvement curve.
Green = <95% of goal
White = 96-100% of goal
Yellow = 101-110% of goal
Red = >110% of goal

6. ENGINEERING INDICATORS

A. Engineering Changes (as a measure of quality)
Measure changes to product design/work plans/engineering documents after approval (e.g., approved-for-construction status):
• As a result of final internal (cross-discipline) review
• After approval due to field implementation problems attributed to engineering design


• As a result of testing problems attributed to engineering/design
In a construction/production environment, measure required rework. Record only changes/rework caused by the engineering unit. Total the % of initial products submitted for final internal Engineering review to the total processed from that point without "substantial" revision.
Green = <10%
White = <15%
Yellow = <20%
Red = >20%

B. Requests for Engineering Assistance (or whatever the analogous vehicle is called)
• Measure requests, resolutions, and backlog of REAs or construction site RFIs (Requests for Information) due to engineering shortcomings.
• Engineering involvement in plant start-up or product introduction to manufacturing (measured in man-hours or dollars).
• Number of non-conforming items during project execution (e.g., construction, initial manufacturing and testing).
• % responded to in the original allotted time
Green = 100%-96%
White = <95%
Yellow = <90%
Red = <85%

C. Employee participation in product/process improvements (a measure of improvement in how work is done efficiently). Examples include:
• Status of log of process/product improvement suggestions.
• Dollar value of projected savings due to accepted product/project improvements.
• Number of employee meetings/reviews where improvement/development topics are discussed.
• Level of funding of process improvement projects.
• Measure of success for a "Lessons Learned" Program.
• Number of team interface meetings held (e.g., process mapping meetings, constructibility reviews) during the design phase of a project.
• Measure volume of value engineering activities in net dollar savings.

Each site should determine its own scoring levels.

7. EMPLOYEES

A. Employee Concerns Program
1.) Percent of concerns resolved by internal management
2.) Number of concerns resolved internally + # concerns taken to external source
Green = 100% internal
White = >95% internal
Yellow = >90% internal
Red = <90% internal
3.) Employee Issue Resolution Trend
Green 100% < 30 days old
White >95% < 30 days old
Yellow >90% < 30 days old
Red <89% < 30 days old

B. Employee Performance
1.) Conduct of Operations Performance Index


Track & trend Disciplined Operations (DO). Goal: maintain performance at 4.5 (+/- 1.0) events/200k hours. Maintain a healthy balance of operations without suppressing reporting.
Green = <4.5
White = 4.5-4.8
Yellow = 4.8-5.0
Red = >5.0

2.) Violation/Inadequate Procedures
Track & trend Violation/Inadequate Procedure DO component events per 200,000 hours. Goal: maintain performance at 1.5 events/200k hours. Maintain a healthy balance of operations without suppressing reporting.
Green = <1.5
White = <2.0
Yellow = <2.5
Red = >2.5

C. Employees Cost Effectiveness Ideas Performance
Track & trend IDEAS submitted, approved & implemented per 100 employees. Goal: 15 ideas accepted/100 employees. Because people are our most important resource, it is imperative to encourage and reward participation for cost-effective continuous improvement.
Green = >15
White = 15-10
Yellow = 10-5
Red = <5

8. SAFEGUARDS AND SECURITY
(Note: Events are defined as violations, infractions, and incidents)
A. Security Events
B. Computer Events
C. Document Events
Suggested goal **:
Green = 0 events
White = 1 event
Yellow = 2 events
Red = >3 events

** Goal exceptions for sites with > 1000 employees and > 10,000 documents

9. FISCAL ACCOUNTABILITY*

A. Gross Profit – Award Fee less Unallowable Cost/Available Fee
Green ≥ 91%
White 80-90%
Yellow 75-79%
Red < 75%

B. Cash Flow – Date Collected less Date Earned
Green < 10 days
White 11-20 days
Yellow 21-30 days
Red > 30 days


C. Revenue Growth – Revenue recorded for the current time period in the year divided by revenue for the same time period of the previous year
Green >100%
White = 100%
Yellow
Red <100%

10. HUMAN RESOURCES

A. Staffing to Target
• % Staffing to Budget/Staffing Plan
• % Contract Staffing to Budget/Staffing Plan
KPI Target: % of goal attainment per site goal
Green .97-1.00
White .90-.97
Yellow .70-.89
Red <.70

B. Critical Skills Staffing
• % Critical Hires to Staffing Plan
KPI Target and scoring level is the same as for Staffing to Target

C. Workforce Diversity Hiring Index
• % Minority Technical Hires – Technical Hires = direct staff such as engineers, chemists, project controls, etc.
• % Female Technical Hires – Technical Hires = direct staff such as engineers, chemists, project controls, etc.
• Division Minority Goals/hired or placed
• Division Female Goals/hired or placed
Each site should develop a target for female or minority hiring that is consistent with its affirmative action plan (%). Each report will be an actual YTD hire (%).
KPI Target and scoring level is the same as for Staffing to Target

11. PROJECT MANAGEMENT

A. SPI = Actual Days / Baseline Days
Red > 1.06
Yellow = 1.059 to 1.0, or .94 with ETC > 1.0
White = 0.99 to 0.94 with ETC < 1.0
Green < 0.94

B. CPI = Actual Cost / Baseline Cost
Red > 1.06
Yellow = 1.059 to 1.0
White = 0.99 to 0.94
Green < 0.94

C. Health of project is a subjective assessment by the Project Manager of real and potential obstacles to finishing on schedule, on budget, or other problems.

Each site should determine its own scoring levels.

12. COST EFFECTIVENESS

A. Milestone Performance: Milestones Achieved/Milestones Committed
Green > 95%
White 85 to 95%
Yellow 80 to 85%
Red < 80%



B. Budget Cost Performance: Budget Performance/Total Budget
Green > 95%
White 85 to 95%
Yellow 80 to 85%
Red < 80%

C. Cost Savings Performance: Cost Savings/Cost Savings Committed
Green > 95%
White 85 to 95%
Yellow 80 to 85%
Red < 80%

D. PBI Performance: PBIs Achieved/PBIs Committed
Green > 95%
White 85 to 95%
Yellow 80 to 85%
Red < 80%

13. REGULATORY PERFORMANCE

E. Overdue Commitments Index
1.) State Environmental Agency Overdue Commitments
2.) Other State Agency Overdue Commitments
3.) DOE Overdue Commitments
4.) Stakeholder Groups Overdue Commitments
Green No overdue commitments
White One overdue commitment
Yellow Two or three overdue commitments
Red Four or more overdue commitments
Note: It is recognized that a single overdue commitment could be categorized as Yellow or Red depending upon the risk, or if it were one of only a few commitments to be met.

F. Open Regulatory Items Index
1.) State Environmental Agency Open Commitments
2.) Other State Agency Open Commitments
3.) DOE Open Commitments
4.) Stakeholder Groups Open Commitments
Green All commitments on schedule
White Likely to miss >5% of commitments
Yellow Likely to miss >10% of commitments
Red Likely to miss >20% of commitments

G. DNFSB Commitment Adherence Index
1.) Status of 94-1 Commitments
2.) Status of other non-94-1 DNFSB commitments
3.) Commitments older than 6 months
4.) Commitments older than 12 months
Green Greater than 95% of commitments on schedule
White Likely to miss >5% of commitments
Yellow Likely to miss >10% of commitments
Red Likely to miss >20% of commitments


14. COMMUNITY RELATIONS

A. External Relations
1.) Responsiveness to Citizen Advisory Boards/Citizen Advisory Commissions
Green 85-100 percent turnaround to requests/commitments within 30 days
White 70-84 percent turnaround to requests/commitments within 30 days
Yellow 50-69 percent turnaround to requests/commitments within 30 days
Red 49 percent or less turnaround to requests/commitments
NOTE: Excluding legal and customer-driven restraints.

2.) Public Meeting Effectiveness
Utilizing survey techniques, measure public meeting effectiveness. Each site will document through exit surveys each participant's percentage of change from before to after the meeting or session relating to their impressions.
Green 50-100 percent improvement
White 25-49 percent improvement
Yellow 10-24 percent improvement
Red 9 percent or less

3.) Tour/Visitor's Program Effectiveness
Utilizing survey techniques, measure tour and visitor program effectiveness. Each site will measure through exit surveys each participant's percentage of change from before to after the tour or program relating to their awareness: "Did this tour and/or program increase your awareness of the facility and its operations?"
Green 50-100 percent improvement
White 25-49 percent improvement
Yellow 10-24 percent improvement
Red 9 percent or less

15. SELF ASSESSMENTS

B. Scheduled vs. Complete
Green 98-100%
White 96-97%
Yellow 94-95%
Red <93%

C. Status
This metric would include the number of items opened each month, closed each month, items overdue, and the total number open in the system. Goal = no more than 1% delinquent. The percentage may vary depending on the size of the site.
Green 0% delinquent
White <1% delinquent
Yellow <2% delinquent
Red <3% delinquent

16. QA ASSESSMENT LEVEL 2 SUPPORTING CATEGORIES
When an assessment is conducted, there is typically a checklist prepared to guide the assessor. Depending upon the type of assessment being performed, the checklist will contain "attributes." The attributes will typically be one of two types: either requirements from our established quality programs and procedures (a compliance assessment) or attributes in the form of performance indicators (an implementation assessment).
A compliance assessment tells if written quality programs address the requirements of the regulatory body/agency having jurisdiction over the project (e.g., NRC, DOE, Army Corps of Engineers, etc.). It also tells if those requirements are passed down through implementing procedures and procurement documents, and if personnel are effectively trained to perform in accordance with those requirements and procedures.
An implementation assessment tells if the written quality program and procedures are being implemented effectively.
Because assessments are performed to a checklist of attributes, the number of noncompliances can be compared to the number of attributes and expressed as a percentage.

A. QA Program Compliance KPI
Percentage of the QA Program Compliance attributes assessed versus the QA Program Compliance attributes found compliant.
The QA Program Compliance KPI pertains to assessing whether the QA requirements applicable to the Project and the methods to meet these requirements are adequately delineated in the Project's command media. This KPI should also address personnel training and qualification assessment attributes.
Green = 100%-95%
White = 94%-90%
Yellow = 89%-80%
Red = 79% and below

B. QA Program Implementation KPI
Percentage of the QA Program Implementation attributes assessed versus the QA Program Implementation attributes found compliant.
The QA Program Implementation KPI pertains to assessing whether the work is being performed in accordance with the established Project command media associated with the QA Program Compliance KPI.
Green = 100%-95%
White = 94%-90%
Yellow = 89%-80%
Red = 79% and below

Notes on both the QA Program Compliance and Implementation KPIs:

• The Assessment Plan must delineate specific attributes to be assessed. Changes to the plan during the assessment must be documented in the plan. Thus, the Assessment Plan provides the number of attributes assessed for both KPIs.
• A deficiency/concern is categorized as a noncompliance or an observation. Observations should not be included in calculating the performance. Also, there should be a method to factor in additional value for attributes that are identified with proficiency. (Note: If more value is given to an attribute with proficiency, the KPI could be greater than 100%.)
• More than one noncompliance may be identified to an attribute. Therefore, there needs to be a method whereby the noncompliances are grouped to the attribute.

C. Corrective Action Implementation KPI
• Percentage of the corrective actions completed by the original forecasted completion date.
This KPI can be graded to the following:
Green Completed ahead of schedule.
White Completed on or within 30 days after the scheduled date.
Yellow Completed within 30 to 60 days after the scheduled date.
Red Completed greater than 60 days after the scheduled date.


Consideration needs to be given to those corrective actions that are dependent on external interfaces outside the control of the Project. The Corrective Action Document serves as the tool to gather data for this KPI.

D. External Assessments
• Findings/Audit
• Days Open
• Times Delinquent
• Days Delinquent
• Extension Requests

Individual Site External Oversight Indicator Scoring Sheet

Attribute             Red     Yellow    White    Green
Findings/Audit        >10     5-10      3-4      <3
  Points              10      7         5        0
Days Open             >45     15-45     10-14    <10
  Points              10      7         5        0
Times Delinquent      >5      3-5       1-2      <1
  Points              10      7         5        0
Days Delinquent       >10     5-10      1-4      <0
  Points              20      15        10       0
Extension Requests    >5      3-5       1-2      <1
  Points              10      7         5        0
Index (Totals)
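To show how the scoring sheet above rolls up into a single index, the sketch below assigns points band by band and totals them. The helper function and the sample assessment values are hypothetical, and the band boundaries follow one plain reading of the table.

```python
def band_points(value: float, bands: list, default: int) -> int:
    """Return the points for the first band whose upper bound the value does not exceed."""
    for upper, points in bands:
        if value <= upper:
            return points
    return default

def oversight_index(findings: int, days_open: int, times_delinquent: int,
                    days_delinquent: int, extension_requests: int) -> int:
    total = 0
    total += band_points(findings,           [(2, 0), (4, 5), (10, 7)], 10)
    total += band_points(days_open,          [(9, 0), (14, 5), (45, 7)], 10)
    total += band_points(times_delinquent,   [(0, 0), (2, 5), (5, 7)], 10)
    total += band_points(days_delinquent,    [(0, 0), (4, 10), (10, 15)], 20)
    total += band_points(extension_requests, [(0, 0), (2, 5), (5, 7)], 10)
    return total

# Four findings (5 pts) open for 12 days (5 pts), nothing delinquent -> index of 10
print(oversight_index(findings=4, days_open=12, times_delinquent=0,
                      days_delinquent=0, extension_requests=0))   # -> 10
```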


APPENDIX A: SRS PERFORMANCE METRIC PRESENTATION

Slide 1


Savannah River Site Performance Metrics

This presentation has been used onsite to teach various managers and site employees about the SRS Performance Metric system. Examples provided are fictitious, but realistic.

Questions can be directed to:
Gail Jernigan
Westinghouse Savannah River Company
Building 703-A, Room A129
Aiken, SC 29853
Phone: 803-725-7774
Fax:
[email protected]

Slide 2


Potential Solution - Color Rating System

History
Institute of Nuclear Power Operations (INPO) key performance indicators established post Three Mile Island
− Vital element of industry improvement initiative
− Leading and following indicators
− Includes analysis and action section
Color roll-up developed by utilities
− Quick status summary (for utility, Nuclear Regulatory Commission, and INPO)
− Easy assessment of trends
− Sharing of expertise/knowledge

Public utilities, with the use of this system, have shown that it is possible to be cost efficient and cost effective while maintaining excellent performance.


Slide 3


Strategy

To sharpen the focus on a critical few performance metrics with the ability to analyze problem areas and benchmark successes.
Using the metrics developed for the Washington Government Group (WGG) effort and the metrics developed for the Department of Energy-Savannah River (DOE-SR) Focus Areas, map Westinghouse Savannah River Company (WSRC) Key Performance Indicators (KPIs) and DOE-SR parameters to DOE-SR Focus Areas.
Use objectivity to assess SRS performance.

Drivers for moving to this metric system included:
− Need to quickly look at a performance metric system to identify problem areas and successes.
− Communication tool between DOE and contractor to establish a common understanding of problems, issues and successes.
− Replace subjectivity with objectivity in the spirit of telling WHAT DOE wants instead of HOW.

To determine the metrics to be used, the following were used:
− DOE-SR Focus Areas (Safety and Security; Technical Capability and Performance; Community, State and Regulator Relationships; Cost Effectiveness; and Corporate Perspective)
− Existing DOE-SR/WSRC performance metrics
− Corporate Washington Government performance metrics

These were combined, discussed, and mapped to the DOE-SR Focus Areas. Further changes have been made since the system was established.

Slide 4


Benefits

Quickly illustrates successes and problems.
Shifts contract performance to focus on results through KPIs.
Color rollup allows quick drill down to the source of problem areas.
KPIs as basis are objective.
Analysis/Action section of KPI allows assessment of problem status.
Flexibility: ability to accommodate various needs.
Could allow/promote sharing/prioritizing of resources/ideas across the site.
Could promote consistency in approaches across the site.
Will facilitate continuous improvement.

The analysis/action section is the most important section of the KPIs. This is where the KPI owner analyzes the performance by looking at the data and asking WHY. The reader should be able to quickly tell if the KPI owner really understands what is happening. The action defines what the KPI owner plans to do to either improve performance (if Yellow or Red) or sustain/improve it (if White or Green).

This system's flexibility allows the site to map the various performance metrics to suit various customers, including DOE-HQ, the Operations Office, regulators, the contractor's Board of Directors, the public, and other interested stakeholders.


Slide 5


“Drilling Down”

[Diagram: Site Level Metrics roll down to WSRC Division metrics and then to Facility metrics, with KPIs defined at each level.]

Another way to drill down is for each division within the company to embrace the same metrics for the division and then the facility. While not all metrics will apply to all divisions, many metrics are applicable across the site, company, division, and facility. Using this same system allows managers to analyze their division's or facility's performance. In addition, using the same metric system across the site makes it easier for all employees, staff, and managers to understand how one action can affect the site's performance.

Slide 6


Color Values and Definitions

G  Excellence/Significant Strength: where performance exhibits a significant strength, such as industry top quartile performance or achievement of longer term goals.
W  Satisfactory/Normal – Satisfactory Performance: at/above the industry average or the annual goal, with a stable or improving trend over several periods.
Y  Tracking to Satisfactory – Needs Improvement: where performance needs improvement and management attention. Performance may be achieving the goal, but showing a negative trend over several periods that, if continued, will challenge goal achievement.
R  Unsatisfactory/Significant Weakness: where performance is significantly below the goal and not showing an improving trend, or where the annual goal achievement is not expected.
B  No Data: this performance measure did not have data to report for this time period; no score is included in the summary.
NA  Not Applicable

Slide 7


Rating/Scoring Sheet

Scoring

Scores for each L-2 or L-3 KPI metric: Green = 3, White = 2, Yellow = 1, Red = 0.

Rollup scoring is based on the number of applicable KPIs in the grouping:

Rollup     # of applicable KPIs to be evaluated
Values     2      3      4      5       6       7       8
Green      > 6    > 8    > 10   > 13    > 15    > 18    > 20
White      > 4    > 5    > 7    > 9     > 11    > 12    > 14
Yellow     > 2    > 3    > 5    > 7     > 9     > 10    > 12
Red        < 2    < 3    < 5    < 7     < 9     < 10    < 12

The algorithm was developed conservatively. Basically, it is using an average of the number of metrics. For example, if there are four KPIs to be evaluated and these four KPIs are Green, White, Green, and White, the sum would be 10 (3+2+3+2=10). The average would be 2.5, which is rounded to 3, Green.

The simplicity of the system provides the ability to be either more or less conservative.
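The note above describes the roll-up as an average of the individual KPI scores; a small sketch of that reading follows. The function names are hypothetical, "round half up" follows the 2.5-to-Green example in the notes, and the thresholds in the table on the slide remain the authoritative form.

```python
import math

SCORE = {"Green": 3, "White": 2, "Yellow": 1, "Red": 0}
COLOR = {3: "Green", 2: "White", 1: "Yellow", 0: "Red"}

def rollup(kpi_colors: list) -> str:
    """Roll a group of L-2/L-3 KPI colors up to a single summary color by averaging."""
    applicable = [c for c in kpi_colors if c in SCORE]   # skip No Data / Not Applicable
    if not applicable:
        return "No Data"
    average = sum(SCORE[c] for c in applicable) / len(applicable)
    return COLOR[math.floor(average + 0.5)]              # round half up: 2.5 -> 3 -> Green

print(rollup(["Green", "White", "Green", "White"]))      # sum 10, average 2.5 -> Green
```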


Slide 8


SRS Summary Example

Data represented here is fictitious.

[Summary chart (color roll-up): the five DOE-SR Focus Areas – Safety and Security; Tech Capability and Performance; Community, State and Regulatory Relationships; Cost Effectiveness; and Corporate Perspective – with Level 1 indicator boxes including Industrial Safety, Nuclear Safety, Radiation Contamination and Control, Emergency Services and Fire Protection, Physical Security, Disciplined Operations, Technical Qualifications, Engineering, Project Management, Production Infrastructure, Waste Inventory Reduction, AOP Milestones, Feedback and Improvement, Environmental Release Index, Enforceable Agreement Milestones, Environmental Compliance Index, CAB Responsiveness, Public Participation Program, Public Perception, Employee Relations, Financial Performance, Financial Forecasts, PBI Performance, PACE, and Other Key Indicators. Large boxes represent the current month; each small box represents a quarter of the year. The color legend repeats the Color Values and Definitions from Slide 6.]

Slide 9


Safety and Security Focus Area Example

Data represented here is fictitious.

[Focus Area drill-down chart: "Savannah River Site Performance Indicators, Through December 31, 2001." The Safety and Security Focus Area rolls up its Level 1 metrics (Industrial Safety and Health, Emergency Services and Fire Protection, Radiation Contamination and Control, Nuclear Safety, and Physical Security) and their Level 2 metrics, including TRC Rate, LWC Rate, Cost Index, Fire Protection Impairment Status, Emergency Management Corrective Actions, Emergency Management EPHA Annual Review/Revision, Emergency Exercises and Drills Conducted vs. Scheduled, Reportable Dose Exceedances, Reportable Contamination, Security Incidents, Nuclear Safety Issue Management Index, Significant Nuclear Safety Incidents Index, and Authorization Basis Document Management Index. Large boxes represent the current month; each small box represents a quarter of the year.]

Slide 10


Safety KPIs

Data represented here is fictitious.

Data (monthly rates are cumulative; Nov and Dec not yet reported):

  Month      Jan    Feb    Mar    Apr    May    Jun    Jul    Aug    Sep    Oct
  TRC Rate   0.47   0.51   0.73   0.94   0.97   0.96   0.93   0.95   0.94   0.95
    Score    Green  Green  Green  White  White  White  White  White  White  White
  LWC Rate   0.00   0.17   0.22   0.17   0.32   0.30   0.25   0.23   0.31   0.22
    Score    Green  Green  Green  Green  Green  White  Green  Green  White  Green
  LWD Rate   0.00   1.19   2.85   3.23   3.00   4.07   3.24   3.24   3.68   4.36
    Score    Green  Green  Green  Green  Green  Green  Green  Green  Green  Green

Goal: TRC Rate = 0.93; LWD Rate = 12.87; LWC Rate = 0.30

Stoplight definition:
  Green:  > 5% below the goal
  White:  0-4% below the goal
  Yellow: 0-4% above the goal
  Red:    > 5% above the goal

Definition: The Occupational Safety and Health Act of 1970 requires covered employers to prepare and maintain records of occupational injuries and illnesses. WSRC measures safety performance in accordance with these guidelines. The incidence rates are calculated by multiplying the number of injuries, illnesses, or lost workdays by 200,000 and dividing by the total number of work hours.

Analysis / Action: TRC is White; LWC and LWD are Green. Before the incidents in October, WSRC was approaching the established goals for 2001. Using the site's Lessons Learned program, information from these four incidents will be communicated to all site employees. Since the previous injury rate indicated that the Behavior Based Safety process was having a positive impact on the site's overall safety performance, each site department and section will have additional BBS training during the monthly safety meeting. Management will also continue to monitor safety during its weekly staff meetings.

Comments: WSRC combined the Total Recordable Case (TRC) Rate, Lost Workday Case (LWC) Rate, and Lost Workdays (LWD) Rate for this metric. The stoplight definition and goals are more aggressive than Corporate's.

SME Manager: Kevin Smith (803-952-9924) SME: Linda Blackston (803-952-9905)
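For readers who want to reproduce the arithmetic, the sketch below applies the incidence-rate formula (cases multiplied by 200,000 and divided by work hours) and the stoplight bands quoted in the goal box. The function names, the example numbers, and the handling of the unstated 4-5% band are assumptions for illustration only, not site data or site software.

```python
def incidence_rate(cases, hours_worked):
    """OSHA-style incidence rate: cases per 200,000 work hours
    (the hours of 100 full-time workers over a year)."""
    return cases * 200_000 / hours_worked

def stoplight(rate, goal):
    """Map a cumulative rate against its annual goal.

    Bands follow the goal box above (Green > 5% below goal, White 0-4% below,
    Yellow 0-4% above, Red > 5% above); how the unstated 4-5% band and the
    exact boundaries are handled here is an assumption.
    """
    deviation = (rate - goal) / goal  # negative means below (better than) goal
    if deviation < -0.05:
        return "Green"
    if deviation <= 0.0:
        return "White"
    if deviation <= 0.05:
        return "Yellow"
    return "Red"

# Illustrative numbers only (not site data): 8 recordable cases in
# 1,900,000 work hours, measured against the TRC goal of 0.93.
rate = incidence_rate(8, 1_900_000)            # about 0.84
print(round(rate, 2), stoplight(rate, 0.93))   # 0.84 Green
```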

[Chart: SAFETY / Industrial Safety, "Industrial Safety Indicators Through October 31, 2001" (metric I-A-1,2,3 Safety). Monthly TRC Rate, LWC Rate, and LWD Rate are plotted against their goals, with TRC and LWC Rate on the left axis (0.00 to 3.00) and LWD Rate on the right axis (0.00 to 13.60).]


Slide 11


Metric Grouping

Initial listing of metrics developed by WGG after a year of study by various site experts:

• Safety (Industrial Safety, Environmental Management, and Emergency Services and Fire Protection)
• Operational Excellence (Maintenance, Management, and Engineering)
• Work Environment (Employee Concerns, Employee Performance, and Safeguards and Security)
• Organizational Effectiveness (Fiscal Accountability, Human Resources, Project Management, and Cost Effectiveness)
• External Relations (Regulatory Performance and Community Relations)
• Assessments (Self Assessments, Quality Assurance, and External Assessments)

Slide 12


Metric Grouping (continued)

The current set of metrics "married" the WGG set with the DOE-SR set of metrics, based on the DOE-SR manager's Focus Areas:

• Safety and Security (Industrial Safety, Emergency Services and Fire Protection, Radiological Safety, Nuclear Safety, and Physical Security)
• Technical Capability and Performance (Production, Infrastructure, Waste Avoidance, Disciplined Operations, Employee Relations, Engineering, and Project Management)
• Community, State and Regulatory Relationships (Environmental Release Index, Enforceable Agreement Milestones, Environmental Compliance Index, Citizens Advisory Board Responsiveness, Public Participation Program, and Public Perception)
• Cost Effectiveness (Financial Forecasts, Performance Based Incentives, Financial Performance, and Feedback and Improvement)
• Corporate Perspective (Nuclear Non-Proliferation, Plutonium Disposition, Assisting Other Sites to Closure, and EM Integration)

Slide 13


Communication Plan Outline

I. Indoctrination
   A. Letter from WSRC Executive Vice President to WSRC Vice Presidents, WSRC Deputy Managers, and Level 2 managers (telling what the metrics are and the schedule)
   B. Training Sessions
      1. Deputy Managers
      2. Level 2 Managers
      3. Program Managers
      4. Subject Matter Experts
      5. DOE-SR Personnel and staff

II. Site Communications
   A. SRS On Line
   B. SRS News
   C. DOE-SR Newsletter


Slide 14


Summary

• Using the experience of the Institute of Nuclear Power Operations, the World Association of Nuclear Operators, and the Washington Government Group, the proposed set of metrics provides an objective view of site activities that shifts focus to results.
• The proposal quickly allows management to identify areas of concern and "drill down" to a specific activity.
• The Analysis/Action section allows assessment of problem status.

Slide 15


Questions?

WSRC Point of Contact:
− Gail Jernigan - 803-725-7774
  [email protected]

DOE Point of Contact:
− Jeff Allison - 803-952-2-6337
  [email protected]
− Sherry Southern - 803-952-8272
  [email protected]


APPENDIX B: NUCLEAR SAFETY

WSRC is evaluating the various components to determine applicability to SRS Nuclear Safety Level 2 metrics. This appendix was created by Westinghouse Safety Management Solutions as general guidance. WSRC has opted not to include the Authorization Basis Document Management Index.

1. Nuclear Safety Issue Management Index (these include all NIs, PISAs, Discovery USQs, etc.)
   • percentage of unresolved nuclear safety issues to total
   • average age of unresolved nuclear safety issues
   • number of unresolved nuclear safety issues
     − greater than 30 days old
     − greater than 6 months old
     − greater than 1 year old

2. Significant Nuclear Safety Incidents Index
   • number of emergency notifications
   • number of reportable unusual occurrences related to nuclear safety
   • number of inadvertent safety system challenges
   • number of Technical Safety Requirement violations (or Operational Safety Requirement or Technical Specification, as applicable)
   • number of significant criticality infractions (e.g., both Double Contingency Controls violated)
   • number of Criticality Safety Limit violations
   • number of "for cause" shutdowns by DOE which are related to Nuclear Safety

NOTE: It is recognized that double, triple, or even 5X counting may occur for one event. This is acceptable since cumulative counting appropriately reflects the significance of the event.

3. Authorization Basis Document Management Index
   • Number of overdue requirements/commitments to DOE associated with 10CFR830.

The order of safety significance and associated weighting for the color codes is:
   • Significant Nuclear Safety Incidents Index (50%)
   • Nuclear Safety Issue Management Index (30%)
   • Authorization Basis Document Management Index (20%)

The frequency of measurement is quarterly.


1. Nuclear Safety Issue Management Index (including all NIs, PISAs, Discovery USQs, etc.)

a. Ratio of the number of unresolved nuclear safety issues on the last day of each quarter of a rolling annual fiscal year to the total cumulative number opened during the rolling annual fiscal year.

   RATIO                     A
   0.00 < R < 0.25           0
   0.25 < R < 0.50           1
   0.50 < R < 0.75           2
   R > 0.75                  3

b. Average age of all unresolved nuclear safety issues on the last day of each quarter of a rolling annual fiscal year.

   AVERAGE AGE                     B
   0 months < T < 1 month          0
   1 month < T < 3 months          1
   3 months < T < 12 months        2
   T > 12 months                   3

c. Sum of the ratios of unresolved nuclear safety issues greater than 30 days old (U1), greater than 6 months old (U2), and greater than 1 year old (U3) on the last day of each quarter of a rolling annual fiscal year to the total cumulative number (U) opened during the rolling annual fiscal year.

   SUM OF RATIOS                                   X
   (U1/U) + 2(U2/U) + 5(U3/U) < 0.5                0
   0.5 < (U1/U) + 2(U2/U) + 5(U3/U) < 1.5          1
   1.5 < (U1/U) + 2(U2/U) + 5(U3/U) < 2.5          2
   (U1/U) + 2(U2/U) + 5(U3/U) > 2.5                3


d. Result for Nuclear Safety Issue Management Index

   S = A + (B + X) / 2       COLOR
   S = 0                     Green
   1 < S < 2                 White
   2 < S < 3                 Yellow
   S > 3                     Red
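A minimal sketch of the Nuclear Safety Issue Management Index calculation follows, combining the A, B, and X scores from tables a-c into S = A + (B + X) / 2 and mapping S to a color. The treatment of values that fall on or between the published band edges, and the final X bin, are assumptions, since the source tables do not spell them out; function names are illustrative.

```python
def score_a(ratio):
    """A: ratio of unresolved issues to the total opened (rolling fiscal year)."""
    if ratio <= 0.25:
        return 0
    if ratio <= 0.50:
        return 1
    if ratio <= 0.75:
        return 2
    return 3

def score_b(avg_age_months):
    """B: average age, in months, of all unresolved issues."""
    if avg_age_months <= 1:
        return 0
    if avg_age_months <= 3:
        return 1
    if avg_age_months <= 12:
        return 2
    return 3

def score_x(u1, u2, u3, total_opened):
    """X: weighted backlog ratio (u1 = >30 days, u2 = >6 months, u3 = >1 year)."""
    weighted = (u1 + 2 * u2 + 5 * u3) / total_opened
    if weighted < 0.5:
        return 0
    if weighted < 1.5:
        return 1
    if weighted < 2.5:
        return 2
    return 3  # final bin assumed; the source table is cut off at the page break

def issue_management_index(ratio, avg_age_months, u1, u2, u3, total_opened):
    """Return (S, color) where S = A + (B + X) / 2."""
    s = score_a(ratio) + (score_b(avg_age_months) + score_x(u1, u2, u3, total_opened)) / 2
    if s == 0:
        color = "Green"
    elif s <= 2:
        color = "White"
    elif s <= 3:
        color = "Yellow"
    else:
        color = "Red"
    return s, color

# Example: 4 of 20 issues unresolved, averaging 2 months old, with 3 older
# than 30 days and 1 older than 6 months -> A = 0, B = 1, X = 0, S = 0.5, White.
print(issue_management_index(0.20, 2, 3, 1, 0, 20))
```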

2. Significant Nuclear Safety Incidents Index

e. Number of emergency notifications during each quarter of the fiscal year (X = 2).
f. Number of reportable Unusual Occurrences related to nuclear safety during each quarter of the fiscal year.
g. Number of inadvertent safety system challenges during each quarter of the fiscal year.
h. Number of Technical Safety Requirement, Operational Safety Requirement, and Technical Specification violations during each quarter of the fiscal year (X = 2).
i. Number of significant criticality infractions (e.g., both Double Contingency Controls are violated) during each quarter of the fiscal year (X = 3).
j. Number of Criticality Safety Limit violations during each quarter of the fiscal year (X = 3).
k. Number of "for cause" shutdowns ordered by DOE during the rolling annual fiscal year due to nuclear safety concerns, as measured at the end of each quarter of the rolling annual fiscal year (X = 6).

X = Count Multiplication Factor

NOTE: It is recognized that double, triple, or even 10X counting may occur for one event. This is acceptable since cumulative counting appropriately reflects the significance of the event.

   TOTAL COUNT       SCORE    COLOR
   C ≤ 2             0        Green
   3 ≤ C ≤ 5         1        White
   6 ≤ C ≤ 10        2        Yellow
   C ≥ 11            3        Red
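The sketch below tallies the Significant Nuclear Safety Incidents Index: quarterly counts are multiplied by the factors listed in items e-k, summed into C, and mapped to a color using the total-count table (read as inclusive integer ranges). Items f and g list no multiplication factor, so a factor of 1 is assumed; the dictionary keys and function name are illustrative, not from the manual.

```python
# Multiplication factors from items e-k; items f and g state no factor,
# so 1 is assumed. Key names are illustrative.
WEIGHTS = {
    "emergency_notifications": 2,
    "reportable_unusual_occurrences": 1,        # assumed X = 1
    "inadvertent_safety_system_challenges": 1,  # assumed X = 1
    "tsr_osr_ts_violations": 2,
    "significant_criticality_infractions": 3,
    "criticality_safety_limit_violations": 3,
    "for_cause_doe_shutdowns": 6,
}

def incidents_index(counts):
    """Return (C, color) for a dict of quarterly counts keyed like WEIGHTS."""
    c = sum(WEIGHTS[key] * counts.get(key, 0) for key in WEIGHTS)
    if c <= 2:
        color = "Green"
    elif c <= 5:
        color = "White"
    elif c <= 10:
        color = "Yellow"
    else:
        color = "Red"
    return c, color

# Example: one TSR violation and one Criticality Safety Limit violation
# in the quarter gives C = 2 + 3 = 5, which scores White.
print(incidents_index({"tsr_osr_ts_violations": 1,
                       "criticality_safety_limit_violations": 1}))
```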

3. Authorization Basis Document Management Index

l. Number of overdue requirements/commitments to DOE associated with 10CFR830 on the last day of each quarter of the fiscal year.


   TOTAL COUNT       SCORE    COLOR
   N ≤ 2             0        Green
   3 ≤ N ≤ 5         1        White
   6 ≤ N ≤ 8         2        Yellow
   N ≥ 9             3        Red

Each Level 2 metric will be evaluated annually, and the Level 1 roll-up will also occur annually.

The hierarchy of safety significance of issues and their associated weights for the color codes is:
   • Nuclear Safety Issue Management Index: 0.3
   • Significant Nuclear Safety Incidents Index: 0.5
   • Authorization Basis Document Management Index: 0.2

   RESULT              SCORE    COLOR
   0.0 ≤ R ≤ 1.4       0        Green
   1.5 ≤ R ≤ 4.1       1        White
   4.2 ≤ R ≤ 7.8       2        Yellow
   R ≥ 7.9             3        Red

R = (S * 0.3) + (C * 0.5) + (N * 0.2)
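Reading S as the Issue Management result from item 1.d, C as the weighted incident count from item 2, and N as the overdue-commitment count from item 3 (an interpretation consistent with the band edges above, e.g. 0.5 x 2 + 0.2 x 2 = 1.4), the roll-up can be sketched as follows; the function name and the inclusive reading of the band edges are assumptions.

```python
def nuclear_safety_rollup(s, c, n):
    """Level 1 Nuclear Safety roll-up color.

    s: result S = A + (B + X) / 2 from the Issue Management Index (item 1.d)
    c: weighted total incident count C from item 2
    n: count of overdue 10CFR830 requirements/commitments N from item 3
    Band edges follow the RESULT table above, read as inclusive ranges.
    """
    # Round to one decimal so boundary cases such as 1.4 are not lost to
    # floating-point noise.
    r = round(0.3 * s + 0.5 * c + 0.2 * n, 1)
    if r <= 1.4:
        return r, "Green"
    if r <= 4.1:
        return r, "White"
    if r <= 7.8:
        return r, "Yellow"
    return r, "Red"

# Example: all three Level 2 indexes at the top of their best bands
# (S = 0, C = 2, N = 2) gives R = 1.4, which is still Green.
print(nuclear_safety_rollup(0, 2, 2))  # (1.4, 'Green')
```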