Slide 2 (24 November 2008)
Agenda
Motivation
“Headlights”
Strategies for Leading Indicators
Common Leading Indicators
“Back-up Lights”
Summary
Slide 3
Motivation
The outcome of a project always involves uncertainty, especially if more than one dimension of performance is considered
Measurement results are often viewed as a "snapshot in time"; the implications of current conditions are not understood
A systematic view of measurement is needed to anticipate and understand project performance; this enables the definition of "leading indicators" or "headlights"
Slide 4
Measures and indicators
No “measure” is intrinsically a leading indicator
Leading Indicator = f (measure, time, interpretation)
Not leading indicators:
- Customer Satisfaction
- Earned Value
Slide 5
Types of Indicators
[Figure: types of indicators: Leading Indicators, Current Indicators, and Trailing Indicators]
Slide 6
“Headlights” (Leading Indicators)
Under specific conditions, an individual measure or collection of measures may be predictive of future performance
“Headlights” should be planned into the project – can be expensive to mount as an option
No generic answer as to exactly what to measure for a specific project
Many common measurement practices obscure the actual situation, providing “back-up lights” instead
Slide 7
Requirements for Leading Indicators
Timely data collection and analysis
Knowledge of what is important to success
Measures with leading indicator properties (strategies)
Interpretation and use of the measures as leading indicators
Slide 8
Projects are Systems
Many interacting internal and external factors
Influence of any individual factor varies over time
Measure factors likely to affect the performance factor of interest, not just the performance factor directly
The common tendency is to avoid recognizing problems rather than to search for potential problems
Slide 9
Interactions Among Factors
[Figure: interactions among Functional Size, Product Size, Effort, Schedule, Product Quality, Customer Satisfaction, and Process Performance, grouped into likely causes and likely effects]
Adapted from J. McGarry, D. Card, et al., Practical Software Measurement, Addison-Wesley, 2002
Slide 10
Strategies for Leading Indicators
One measure predicts future values of another measure
Values of a measure predict future values of the same measure
A measure tracks a basic constraint or limit to performance
A measure captures risk or uncertainty
Slide 11
Three Common Leading Indicators
Process Compliance – failure to follow the defined plan and process usually results in failure to meet budget, schedule, and quality objectives
Requirements Volatility – uncertainty about the project objectives usually results in delays, rework, and inadequate testing
Risk Exposure – project activities must reduce risk in order to reach a successful conclusion
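As one illustration, the requirements volatility indicator can be computed as the fraction of the approved baseline touched by changes in a reporting period. A minimal sketch; the counts and the 5% threshold below are hypothetical, not values from this presentation:

```python
# Requirements volatility as a leading indicator: the fraction of the
# approved baseline changed in one reporting period (hypothetical counts).
baseline = 200                          # requirements in the approved baseline
added, modified, deleted = 12, 18, 4    # changes this reporting period

volatility = (added + modified + deleted) / baseline   # 34 / 200 = 0.17

threshold = 0.05                 # assumed per-period tolerance for this project
needs_attention = volatility > threshold
print(f"volatility = {volatility:.0%}, flag = {needs_attention}")
```

A sustained value above the project's tolerance is the signal that delays, rework, and inadequate testing are likely ahead.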
Slide 12
Measurement Influences
[Figure: influence diagram relating Requirements, Effort, Schedule, Defects, Risk, and Compliance; Requirements, Risk, and Compliance are marked as leading indicators, and Defects as a trailing indicator]
Unusual results in one dimension may predict problems in others!
Slide 13
Quantification of Risk and Uncertainty
Risk of undesirable events
Lack of information
Variability in performance
Slide 14
Risk Exposure
[Figure: Risk Exposure (0 to 1.0) plotted over Time, comparing the Current Estimated Exposure against a Risk Referent (Planned Exposure)]
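An exposure figure like this can be produced from an ordinary risk register. A minimal sketch of the calculation; all risk names, probabilities, impacts, and the referent below are hypothetical:

```python
# Quantified risk exposure: sum of probability x impact over the open risks,
# compared with the planned referent for this point in the schedule.
# All names and numbers are hypothetical.
risks = [
    ("key staff leave",      0.3,  40),   # (risk, probability, impact in staff-days)
    ("interface spec slips", 0.1, 100),
    ("test lab unavailable", 0.5,  10),
]

exposure = sum(p * impact for _, p, impact in risks)   # 12 + 10 + 5 = 27 staff-days
referent = 30    # planned (acceptable) exposure at this milestone

print(f"exposure = {exposure:.0f} staff-days, within plan: {exposure <= referent}")
```

Recomputing this each period and plotting it against the referent gives the chart above: estimated exposure staying above the planned curve is the leading signal.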
Slide 15
Planning Uncertainty into a Project
[Figure: histogram of forecast milestone completion dates from 02/28/97 through 12/30/97 and later; x-axis Months, y-axis Frequency (# Milestones); milestones classified as On-Time, Late, or Don't Know relative to the Project End]
Slide 16
Variation in Performance
[Figure: a Performance Measure plotted over Time or Sequence, showing routine variation around a central level with occasional unusual points]
Common causes determine the overall level of performance
Special causes produce unusual differences
Unmanaged Variation = Unmanaged Risk
Slide 17
Process Variability
[Figure: control chart of inspections of state transition diagrams, showing Actuals against a Center line and an Upper Limit]
From D. Card, Controlling the Object-Oriented Design Process, CNRC Conference on Quality Assurance of Object-Oriented Software, February 2000
Slide 18
Longitudinal Predictions
Involves chains of activities (e.g., inspections) or continuing activities (e.g., requirements changes) that span the product life-cycle
Values of performance factor in one activity relate to subsequent activities
May be described analytically, empirically, or simulated
Slide 19
Example Defect Profile
[Figure: defects discovered per phase (0 to 800) for Requirements, Design, Code, Integration, Dry Run, Formal Test, and Post Release, comparing actual defects per phase against expected defects per phase]
Post-delivery defects are those reported within 6 months following release of the software to the field.
David N. Card, Managing Software Quality with Defects, COMPSAC Proceedings, August 2002
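A defect profile can be monitored mechanically: compare actual discoveries per phase with the expected profile and flag large deviations, since finding far fewer defects than expected in an early phase often predicts a surge later. A minimal sketch with hypothetical counts and a hypothetical 20% tolerance:

```python
# Compare actual defect discovery per phase against the expected profile.
# All counts are hypothetical; phase names follow the profile above.
expected = {"DESIGN": 300, "CODE": 700, "FORMAL TEST": 250}
actual   = {"DESIGN": 180, "CODE": 690, "FORMAL TEST": 260}

def profile_deviations(expected, actual, tolerance=0.2):
    """Phases where the actual count deviates from expected by more than tolerance,
    mapped to the ratio of actual to expected defects."""
    return {
        phase: actual[phase] / expected[phase]
        for phase in expected
        if abs(actual[phase] - expected[phase]) > tolerance * expected[phase]
    }

print(profile_deviations(expected, actual))   # DESIGN found only 60% of expected
```

Here the shortfall in design-phase discoveries, not the phases that match, is the leading signal.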
Slide 20
Potential Constraints
Staff Availability
Annual Budget
Specialized Facilities
[Graphic: "Speed Limit 50" road sign]
Slide 21
Common “Back-up Lights”
Cumulative measures
Percentages
Focus on a single factor
Ambiguous and inconsistent measurement definitions
Slide 22
Typical View of Cost (Cumulative View)
[Figure: cumulative Cost Performance Index plotted by month (months 1 through 19), ranging from 0.75 to 1.00]
Cost Performance Index = (sum to date of Budgeted Cost of Work Performed) / (sum to date of Actual Cost of Work Performed)
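The smoothing effect of the cumulative index can be seen with a few hypothetical monthly values: the per-month index reveals a cost overrun at once, while the cumulative index drifts down slowly.

```python
# Cumulative vs. per-month Cost Performance Index (hypothetical monthly data).
# CPI = BCWP / ACWP; the cumulative form averages away recent changes.
bcwp = [10, 12, 9, 11]    # budgeted cost of work performed, by month
acwp = [10, 13, 12, 15]   # actual cost of work performed, by month

cum_cpi, b, a = [], 0.0, 0.0
for bp, ap in zip(bcwp, acwp):
    b, a = b + bp, a + ap
    cum_cpi.append(b / a)           # sum-to-date ratio, as in the formula above

monthly_cpi = [bp / ap for bp, ap in zip(bcwp, acwp)]

print([round(x, 2) for x in cum_cpi])       # cumulative drifts down gradually
print([round(x, 2) for x in monthly_cpi])   # monthly shows the drop immediately
```

This is why cumulative measures act as "back-up lights": by the time the cumulative CPI looks bad, the underlying monthly performance has been bad for some time.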
Slide 23
Individual View
[Figure: individuals chart of monthly CPI with Shewhart control limits over 21 runs; Center = 0.97, UCL = 1.2466, LCL = 0.6934, with zones A and B above and below the center line]
Changes can be detected as they occur
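The limits on an individuals chart follow from the average moving range. A minimal sketch with hypothetical monthly CPI values (the 2.66 multiplier is the standard XmR constant, not a value from this presentation):

```python
# Shewhart individuals (XmR) chart limits for a monthly CPI series
# (hypothetical values). Points outside the limits signal special causes.
values = [1.00, 0.95, 1.05, 0.98, 1.02, 0.97, 1.03, 0.60]

center = sum(values) / len(values)
moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
mr_bar = sum(moving_ranges) / len(moving_ranges)

ucl = center + 2.66 * mr_bar    # 2.66 = 3 / d2, with d2 = 1.128 for n = 2
lcl = center - 2.66 * mr_bar

signals = [v for v in values if v > ucl or v < lcl]
print(f"center={center:.2f}, UCL={ucl:.3f}, LCL={lcl:.3f}, signals={signals}")
```

The final month falls below the lower limit and is flagged the month it occurs, which is exactly what the cumulative view cannot do.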
Slide 24
Process Performance Models
All effective PPMs are leading indicators
Not all leading indicators are valid PPMs
Slide 25
Summary
Consider the project as a system
Plan “Headlight” measures into the project
Avoid measurement practices that obscure the situation
Ensure that measures are well-defined
Remember “leading” is relative
Don’t forget about constraints and risks
Get managers to think in terms of leading indicators