
Sample Process Performance Report


<Company Logo>

The user of this document is responsible for using only the current version available in the organizational database. This document is for intended recipients only. No part of this document shall be reproduced, stored in a retrieval system or transmitted in any form or by any means (recording, photocopying, electronic or mechanical) without the prior permission of XYZ Company.

FOR:

XYZ Company Global Development Center

CONTENT:

Process Performance Baseline Report

April to June, 2009

R-OPP-PPBR-Q22009

Version 1.2

11-Aug-2009


Document History

Version | Date | Author | Description of Change | Reviewed By | Approved By
1.0a | 22-Jul-09 | Abc | Initial draft | - | -
1.0b | 24-Jul-09 | Abc | Report updated with review comments | Measurement Council members | -
1.0c | 27-Jul-09 | Abc | Updated Maintenance Projects' graphs and inferences with corrected data set | - | -
1.0d | 28-Jul-09 | Abc | Added Box Plots for all metrics & updated inferences | QA Head | -
1.1 | 09-Aug-09 | Abc | Updated report based on various review comments from stakeholders | PPQA Group | SEPG
1.2 | 11-Aug-09 | Abc | Updated report based on various review comments from Fagan Inspection | Global Head Quality | SEPG

Distribution List

Designations

Designated PMs, Principals, VPs, QA team and Metrics Council members of other locations.

Mode of distribution will be through the HQ. This document is controlled through a configuration management tool by the Corporate Quality Manager, and its revisions are identified by version number.


Reference Documents

Document ID

CMMI v1.2 (Staged Representation) PA: MA, QPM, OPP

Bangalore SPIN Benchmarks – February 2006

The current version of the Measurement Process

The current version of SPC Guidelines

The current applicable version of Metrics Plan

Metrics database from projects for Q2, 2009


TABLE OF CONTENTS

1 EXECUTIVE SUMMARY
2 INTRODUCTION
  2.1 Objectives
  2.2 Scope
3 ACRONYMS AND DEFINITIONS
4 ORGANIZATIONAL GOALS
  4.1 Maintenance Projects
  4.2 Enhancement & Development Projects
  4.3 Support Projects
  4.4 QA (Testing) Projects
5 MAINTENANCE PROJECTS
6 ENHANCEMENT AND DEVELOPMENT PROJECTS
7 SUPPORT PROJECTS
8 QA (TESTING) PROJECTS
9 RECOMMENDATIONS
10 APPENDIX A: LIST OF DEFAULTING PROJECTS
11 APPENDIX B: RULES FOR DEFINING CONTROL LIMITS & GRAPH PLOTTING


1 EXECUTIVE SUMMARY

This report provides an analysis of the metrics data consolidated for the projects being implemented in the GDCs at locations A & B. These projects belong to all business units. The data covers the period from 1st April to 30th June, 2009. The highlights of the report are described below:

- A total of 136 (80 + 43 + 7 + 6) projects are considered for analysis in this Process Performance Baseline Report (PPBR). The break-up of the 136 projects is:

  o All 80 out of 80 Maintenance projects.

  o 43 [40 Enhancement + 3 Development] out of 54 projects.

  o 7 out of 10 Support projects.

  o 6 out of 8 Manual Testing projects.

- Maintenance and Support metrics are collected from the project trackers for each calendar month.

- Development and Enhancement metrics are collected on the basis of development life cycle phases (Requirements, Design, Coding, Testing and Overall).

- Testing metrics are collected from the project trackers for each release.

- The data distribution is checked for each metric using the percentile method.

- The outliers in this PPBR are identified using Box Plots (a minimal sketch of this check follows the list).

- The Defect Rates reported in the two quantitative performance tables below (Maintenance, and Enhancement + Development) are much lower than the actual rates, as 60% of projects have either not logged defect data or reported zero defects.
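The distribution check and the outlier screen named above can be summarized in a short sketch. This is a minimal illustration, not the organization's actual tooling: it computes the percentile points defined in Section 3 (P10/P25/P50/P75/P90) and flags Box Plot outliers using the common 1.5 * IQR whisker rule. The metric values are hypothetical.

```python
# Minimal sketch of the percentile check and Box Plot outlier screen.
# Hypothetical data; standard library only.
from statistics import quantiles

def percentile_points(data):
    """Return the P10/P25/P50/P75/P90 points of a metric's data set."""
    deciles = quantiles(data, n=10)        # cut points P10 .. P90
    q1, median, q3 = quantiles(data, n=4)  # P25, P50, P75
    return {"P10": deciles[0], "P25": q1, "P50": median,
            "P75": q3, "P90": deciles[-1]}

def box_plot_outliers(data, whisker=1.5):
    """Points beyond Q1 - 1.5*IQR or Q3 + 1.5*IQR are treated as
    special-cause variation (see Appendix B) and excluded later on."""
    q1, _, q3 = quantiles(data, n=4)
    iqr = q3 - q1
    return [x for x in data
            if x < q1 - whisker * iqr or x > q3 + whisker * iqr]

rework_effort = [2.72, 2.83, 9.78, 3.47, 3.10, 2.95, 3.60]  # hypothetical
print(percentile_points(rework_effort))
print(box_plot_outliers(rework_effort))  # flags the 9.78 point
```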


For Maintenance Projects, the trends of the Performance Mean for various metrics are:

Maintenance Projects' Quantitative Performance Trends

Measurements / Mean | XYZ Company Mean (Apr-Jun) | XYZ Company Mean (Jul-Sept) | XYZ Company Mean (Oct-Dec) | XYZ Company Mean (Jan-Mar) | Benchmark
% Schedule Variance (Original) | 4.76 | 2.26 | 7.13 | 2.72 | 9 (SPIN, Blr)
% Schedule Variance (Revised) | 0.17 | 0.29 | 0.34 | 0.56 | 9 (SPIN, Blr)
% Schedule Slippage (Revised) | 0.28 | 0.58 | 0.69 | 1.99 | 9 (SPIN, Blr)
% Effort Variance | 0.26 | 0.13 | -1.19 | 0.68 | 26 (SPIN, Blr)
% Effective Project Utilization | 93.20 | 89.42 | 84.66 | 85.95 | NA
% QA Effectiveness | 63.10 | 65.96 | 85.65 | 61.12 | 96 (SPIN, Blr)
Defect Rate (Defects / Person Month) | 1.03 | 2.37 | 2.72 | 1.00 | NA
Delivered Defect Rate (Defects / Person Month) | 0.50 | 0.72 | 1.44 | 0.47 | NA
% Rework Effort | 2.72 | 2.83 | 9.78 | 3.47 | NA

NA: Not Available


For Enhancement & Development Projects, the trends of the Performance Mean for various metrics are:

Enhancement + Development Projects' Quantitative Performance Trends

Measurements / Mean | XYZ Company Mean (Apr-Jun) | XYZ Company Mean (Jul-Sept) | XYZ Company Mean (Oct-Dec) | XYZ Company Mean (Jan-Mar) | Benchmark
% Schedule Variance (Original) | 3.93 | 15.79 | 3.38 | 0.82 | 9 (SPIN, Blr)
% Schedule Variance (Revised) | 0.59 | 2.86 | 0.04 | 0.94 | 9 (SPIN, Blr)
% Schedule Slippage (Revised) | 0.50 | 1.36 | 0.06 | -1.12 | 9 (SPIN, Blr)
% Effort Variance (Original) | 4.12 | 2.40 | 0.16 | 0.17 | 26 (SPIN, Blr)
% Effort Variance (Revised) | 0.79 | -1.46 | -1.48 | -0.15 | NA
% QA Effectiveness | 92.90 | 79.79 | 83.70 | 90.53 | 96 (SPIN, Blr)
Defect Rate (Defects / 100 Person Hrs) | 3.88 | 24.33 | 4.02 | 2.18 | NA
Delivered Defect Rate (Defects / 100 Person Hrs) | 1.88 | 3.76 | 1.39 | 1.00 | NA

NA: Not Available


For Support Projects, the trends of the Performance Mean for various metrics are:

Support Projects' Quantitative Performance Trends

Measurements / Mean | XYZ Company Mean (Apr-Jun) | XYZ Company Mean (Jul-Sept) | XYZ Company Mean (Oct-Dec) | XYZ Company Mean (Jan-Mar) | Benchmark
% P1 SLA Compliance (Response Time) | NA | NA | 92.88 | 95.30 | NA
% P2 SLA Compliance (Resolution Time) | NA | NA | 95.15 | 95.38 | NA
% P3 SLA Compliance (Resolution Time) | NA | NA | 100.00 | 100.00 | NA
% EPU | NA | NA | 106.52 | 104.43 | NA

NA: Not Available


For QA (Testing) Projects, the trends of the Performance Mean for various metrics are:

QA (Testing) Projects' Quantitative Performance Trends

Measurements / Mean | XYZ Company Mean (Apr-Jun) | XYZ Company Mean (Jul-Sept) | XYZ Company Mean (Oct-Dec) | XYZ Company Mean (Jan-Mar) | Benchmark
% Effort Under-run | NA | NA | 10.13 | -0.39 | NA
% Time Crunch | NA | NA | -1.39 | -3.24 | NA
% Test Design Efficiency | NA | NA | 95.93 | 89.25 | NA
% Test Efficiency | NA | NA | 95.96 | 94.09 | NA
% Test Case Coverage | NA | NA | 95.13 | 100.00 | NA

NA: Not Available

Conclusions:

1. A comparison of our data trends with the SPIN or ISBSG benchmarks makes it easy to infer that most projects still either log "good looking data" in their trackers or treat data logging as a mere formality, assigning a junior team member to "fill" the tracker once a fortnight to avoid escalations from PPQAs. Unless this attitude changes, we will not be able to use measurement data for meaningful analysis leading to data-based decision making.

2. Not all QA and Support projects have provided data for analysis and goal setting; the analysis has therefore been done on the available data points.

3. Goal setting for most of the metrics covered in this PPBR has been done based on benchmarks or on decisions taken in the Measurement Council, because the data being logged in the Project Tracker does not yet reflect the true status.

4. Projects not reporting their data for statistical analysis are listed in Appendix A at the end of the report.

5. The Measurement Council has decided to continue with the performance goals already published in September 2008, as revising the goals every quarter may confuse the project teams.

If we compare this set of performance data with the international benchmarks shown in the last column of the tables above, it becomes evident that the data being reported by our projects does not represent the true status of activities on the floor. However, more projects have started reporting data, which is a welcome change.

Next Steps:

QA has recently suggested making AMCs or PMs (those who own responsibility for project revenue) accountable for the quality of data in the project trackers. If implemented, this suggestion will certainly improve the current situation. The Council has decided to examine the effectiveness of this suggestion while compiling the next PPBR.

2 INTRODUCTION


This Process Performance Baseline Report (PPBR) establishes Mean Performance values, data variation patterns and statistically derived control limits (UCL and LCL), which in turn act as organizational benchmarks for various metrics. It also helps in comparing current performance values with those of the recent past. These values are then used by projects to set or update product and process performance goals once a year. The goals also provide pointers for causal analysis, thus creating opportunities for product and process improvement.

The data is collected from the Project Trackers maintained by individual projects. The statistical analysis of this data is presented in the report along with inferences.

In almost all metrics, some data points fall outside the control limits of process performance, as indicated by the statistical analysis using Box Plots. Such points have not been considered fit for the analysis, but they are included in the percentile calculations.

Please refer to Appendix A for the list of defaulting projects.

Please refer to Appendix B for the rules for defining control limits and plotting graphs.

2.1 Objectives

The purpose of this report is:

- To provide a basis for estimation planning.

- To develop and sustain a measurement culture in the organization to support the information needs of the management.

- To determine the process capability of the organization's standard process based on the process performance data collected from projects.

- To provide a basis for projects to establish and improve their process performance goals, and to analyze the performance of project-level processes as defined in the respective PMPs and/or PDPs.

- To provide a basis for projects to establish project performance goals based on the organization's strategies and plans, and to adjust project-level processes to accomplish the defined goals.

- To enable the Measurement Council & SEPG to take decisions on setting organizational goals.

2.2 Scope

All projects being implemented are covered by this report unless a waiver has been obtained in the project PDP. This report analyses data for the period April to June, 2009.

3 ACRONYMS AND DEFINITIONS

CSR | Customer Service Request
CMMI-Dev or CMMI | Capability Maturity Model Integration - Development, v1.2
EV(O) | Effort Variance (Original)
EV(R) | Effort Variance (Revised)
GDC | Global Development Center
KAI | Key and Sunrise Accounts India
KAB | Key Accounts Bangalore
LCL | Lower Control Limit: the minimum value to which the process can go in a controlled manner. Any value within the lower and upper control limits means the process is controlled and stable.
Mean | The average of all applicable data points.
M+E | Maintenance plus Enhancement projects
PPBR | Process Performance Baseline Report
PA | Process Area
PAL | Process Asset Library
PDP | Project Defined Process
PMP | Project Management Plan
PONC | Price of Non-Conformance
P1 SLA | Priority 1 Service Level Agreement (an unplanned request requiring immediate execution; suspend planned work if necessary)
P2 SLA | Priority 2 Service Level Agreement (an unplanned request requiring execution soon, immediately after Priority 1 requests have been attended to)
P3 SLA | Priority 3 Service Level Agreement (a request to be scheduled)
P10 | 10th Percentile
P25 | 25th Percentile
P50 / Median | 50th Percentile, i.e. the middle value of the data set
P75 | 75th Percentile
P90 | 90th Percentile
SEPG | Software Engineering Process Group
SAI | Strategic Accounts India
SLI | Service Line India
SPIN | Software Process Improvement Network
SV(O) | Schedule Variance (Original)
SV(R) | Schedule Variance (Revised)
SD | Standard Deviation: a measure of how widely values are dispersed from the Mean.
UCL | Upper Control Limit: the maximum value to which the process can go in a controlled and stable manner.
UB and LB | The Upper and Lower Bounds: limits specified by the client, management or statute.

Note: For other terms and acronyms, please refer to the CMMI v1.2 glossary and acronyms.


4 ORGANIZATIONAL GOALS

Based on the data received from projects, the Measurement Council has carried out the following statistical analysis:

4.1 Maintenance Projects

Eighty Maintenance projects are considered for this analysis.

Metrics Goal (Apr - Jun, 09) - Maintenance Projects

Metrics | Benchmark | Performance Mean | Goal* | UCL* | LCL*
% Schedule Variance (Original) | 9 | 2.72 | 5, +15, -20 | +20 | -15
% Schedule Variance (Revised) | 9 | 0.56 | 1, +10, -10 | +11 | -9
% Schedule Slippage (Revised) | 9 | 1.99 | 1, +10, -10 | +11 | -9
% Effort Variance | 26 | 0.68 | 5, +15, -15 | +20 | -10
% Effective Project Utilization | NA | 85.95 | 90, +10, -5 | +100 | +85
% QA Effectiveness | 96 | 61.12 | 80, +15, -10 | +95 | +70
Defect Rate (Defects / Person Month) | NA | 1.00 | 50, +30, -30 | +80 | +20
Delivered Defect Rate (Defects / Person Month) | NA | 0.47 | 4, +2, -2 | +6 | +2
% Rework Effort | NA | 3.47 | 10, +5, -5 | +15 | +5

NA: Not Available
* Already published goals and control limits are reproduced here. In this table each Goal entry reads "target, +upper tolerance, -lower tolerance"; the published UCL is the target plus the upper tolerance and the LCL is the target minus the lower tolerance, as the sketch below illustrates.
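As a reading aid for the Goal column, the following small sketch reconstructs that arithmetic. It is an inference from the rows of the table above, not an official formula.

```python
# Reading aid (reconstructed from the table, not an official formula):
# a Goal entry "target, +up, -down" yields UCL = target + up
# and LCL = target - down.

def goal_band(target, up, down):
    """Return (UCL, LCL) for a goal written as 'target, +up, -down'."""
    return target + up, target - down

# % Schedule Variance (Original): "5, +15, -20" -> UCL +20, LCL -15
print(goal_band(5, 15, 20))   # (20, -15)
# % QA Effectiveness: "80, +15, -10" -> UCL +95, LCL +70
print(goal_band(80, 15, 10))  # (95, 70)
```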

4.2 Enhancement & Development Projects

Forty-three Enhancement & Development projects from Noida and Bangalore are considered for this analysis.

Metrics Goal (Apr - Jun, 09) - Enhancement + Development Projects

Metrics | Benchmark | Performance Mean | Goal* | UCL* | LCL*
% Schedule Variance (Original) | 9 | 0.82 | 10, +20, -15 | +30 | -5
% Schedule Variance (Revised) | 9 | 0.94 | 1, +8, -10 | +9 | -9
% Schedule Slippage (Revised) | 9 | -1.12 | 0, +10, -5 | +10 | -5
% Effort Variance (Original) | 26 | 0.17 | -1, +25, -25 | +24 | -26
% Effort Variance (Revised) | NA | -0.15 | 0, +5, -5 | +5 | -5
% QA Effectiveness | 96 | 90.53 | 70, +5, -10 | +75 | +60
Defect Rate (Defects / 100 Person Hrs) | NA | 2.18 | 40, +30, -20 | +70 | +20
Delivered Defect Rate (Defects / 100 Person Hrs) | NA | 1.00 | 4, +1, -3 | +5 | +1

NA: Not Available
* Already published goals and control limits are reproduced here.

4.3 Support Projects

Seven Support projects from Noida and Bangalore are considered for arriving at these Goals.


Metrics Goal (Apr - Jun, 09) - Support Projects

Metrics | Benchmark | Performance Mean | Goal* | UCL* | LCL*
% P1 SLA Compliance (Response Time) | NA | 95.30 | 93, +7, -3 | 100 | 90
% P2 SLA Compliance (Resolution Time) | NA | 95.38 | 96, +4, -7 | 100 | 93
% P3 SLA Compliance (Resolution Time) | NA | 100.00 | 100 | 100 | 100
% EPU | NA | 104.43 | 95, +5, -5 | 100 | 90

* Already published goals and control limits are reproduced here.

4.4 QA (Testing) Projects

Six QA (Testing) projects from Noida and Bangalore are considered for the analysis. The data pertains to Manual Testing only, as enough data points were not available for Automated Testing.

Metrics Goal (Apr - Jun, 09) - QA (Manual Testing) Projects

Metrics | Benchmark | Performance Mean | Goal* | UCL* | LCL*
% Effort Under-run | NA | -0.39 | 5, +10, -20 | +15 | -15
% Time Crunch | NA | -3.24 | 9, +6, -4 | 15 | 5
% Test Design Efficiency | NA | 89.25 | 97, +3, -12 | 100 | 85
% Test Efficiency | NA | 94.09 | 96, +4, -11 | 100 | 85
% Test Case Coverage | NA | 100.00 | 100, +0, -10 | 100 | 90

* Already published goals and control limits are reproduced here.

5 MAINTENANCE PROJECTS

Refer to the attached document for details.

6 ENHANCEMENT AND DEVELOPMENT PROJECTS

Refer to the attached document for details.


7 SUPPORT PROJECTS

Refer to the attached document for details.

<TBD>

8 QA (TESTING) PROJECTS

Refer to the attached document for details.

<TBD>

9 RECOMMENDATIONS

1. The timeliness and correctness of the data in the project trackers need to improve drastically. Making the people responsible for project revenue (AMCs or PMs) accountable for the quality of data in the project trackers appears to be the only solution to this problem.

2. The Measurement Task Force will evolve suitable Process Performance Models (e.g. complexity models, system dynamics models or defect prediction models) after 2 or 3 PPBR releases, when the data is more reliable. A purely illustrative sketch of such a model follows this list.

3. Include PPBRs and the Measurement Council's proceedings in the scope of internal audits.

4. The inferences in future PPBRs shall be based on further investigation of facts and soft issues, not only on the statistics. This remains the responsibility of the Measurement Council.

5. Once the quality of data improves, the Council will start analyzing data grouped into various buckets to enable apples-to-apples comparisons across the organization.

6. It was decided during the Fagan Inspection that Effective Project Utilization is a project-level or contractual requirement, and measuring it at the organizational level does not add value. It may also be noted that manpower utilization is already tracked every month through the time sheet system.
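To make recommendation 2 concrete, the following is a purely illustrative sketch of the simplest kind of defect prediction model the Task Force could evolve: an ordinary least squares fit of the Delivered Defect Rate against review effort. The data and the choice of predictor are hypothetical; no such model has been adopted yet.

```python
# Purely illustrative and hypothetical (no model has been adopted yet):
# least squares fit of Delivered Defect Rate vs. review effort.

def fit_line(xs, ys):
    """Ordinary least squares for y = a + b * x; returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

review_effort_pct = [2.0, 4.5, 6.0, 8.0]      # hypothetical quarterly data
delivered_defect_rate = [3.8, 2.9, 1.9, 1.1]  # defects / 100 person hrs
a, b = fit_line(review_effort_pct, delivered_defect_rate)
print(f"predicted rate at 5% review effort: {a + b * 5.0:.2f}")
```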


10 APPENDIX A: LIST OF DEFAULTING PROJECTS

Sr. No. | Project Name | Type | Project Considered | Reason
1 | P1 | Development | N | No data available
2 | P2 | Enhancement | N | No data available
3 | P3 | Enhancement | N | No data available
4 | e First origin Phase 5.1_CSR | Enhancement | N | Trackers not updated since August
5 | Infra | Enhancement | N | No data available
6 | HRINTL | Enhancement | N | No data available
7 | Market1 | Enhancement | N | No data available
8 | MBT Japan | Enhancement | N | Trackers not updated since June
9 | CoIT REM | Enhancement | N | Trackers not updated since July
10 | Asset Classes P4 | Enhancement | N | Trackers not updated since June
11 | P5 | Enhancement | N | Tracker has insufficient data
12 | Agile Performance | Support | N | No data available
13 | CSR | Support | N | No data available
14 | Messaging ATR | Support | N | No data available
15 | QA Tq | QA | N | No data; project just started
16 | QA Automation | QA | N | Maintains M+E Project Tracker

11 APPENDIX B: RULES FOR DEFINING CONTROL LIMITS & GRAPH PLOTTING

1. Control limits: (UCL, LCL) = Mean +/- n * SD, where n is taken as 1 if SD is less than 10% of the Mean, else as 3 (see the sketch after this list).

   a. UCL: IF SD < (Mean * 0.1), the UCL will be (Mean + SD); else it will be (Mean + 3 * SD).

   b. LCL: IF SD < (Mean * 0.1), the LCL will be (Mean - SD); else it will be (Mean - 3 * SD).

2. Levels of exclusion:

   a. Criterion 1: Null/blank data points are excluded from analysis. The only exception to this rule is the Defect Rate (DR) for Development, Maintenance and Enhancement projects.

   b. Criterion 2: Outliers identified using the Box Plot method are considered to pertain to special causes of variation and are therefore excluded from deriving UCL/LCL.

3. Graphs are plotted without considering outliers.
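The following is a minimal sketch of rules 1 and 2 above, not the official analysis tool. It assumes a positive Mean (true of the percentage metrics in this report), and the data points in the usage example are hypothetical, including one blank and one outlier.

```python
# Minimal sketch of Appendix B rules 1 and 2; standard library only.
# Assumes a positive Mean; hypothetical data.
from statistics import mean, stdev, quantiles

def control_limits(points):
    """Rule 1: (UCL, LCL) = Mean +/- n*SD, with n = 1 when SD is below
    10% of the Mean and n = 3 otherwise (rules 1a and 1b)."""
    m, sd = mean(points), stdev(points)
    n = 1 if sd < 0.1 * m else 3
    return m + n * sd, m - n * sd

def exclude_for_limits(points):
    """Rule 2: drop null/blank data points (criterion 1), then drop Box
    Plot outliers beyond 1.5*IQR of the quartiles (criterion 2)."""
    clean = [x for x in points if x is not None]
    q1, _, q3 = quantiles(clean, n=4)
    iqr = q3 - q1
    return [x for x in clean if q1 - 1.5 * iqr <= x <= q3 + 1.5 * iqr]

# Hypothetical % QA Effectiveness points, with one blank and one outlier:
qa_effectiveness = [63.10, 65.96, 85.65, 61.12, None, 250.0, 58.0, 70.3]
ucl, lcl = control_limits(exclude_for_limits(qa_effectiveness))
print(f"UCL = {ucl:.2f}, LCL = {lcl:.2f}")
```

With the hypothetical data above, the 250.0 point is excluded as a Box Plot outlier, and the remaining SD exceeds 10% of the Mean, so the 3-sigma branch of rule 1 applies.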
