Forensic Analysis of SRB Findings FY09 - FY10 PM Challenge February 2011


Page 1: Burleson.amer

Forensic Analysis of SRB Findings

FY09 - FY10

PM Challenge, February 2011

Page 2:

Purpose of Forensic Analysis

• The ultimate purpose of this analysis is to identify trends and/or systemic issues in terms of what NASA is doing well and not doing so well.

• With the results of this analysis, it may be possible to affect policies and procedures that better ensure success across the Agency.

2

“In the past, NASA has had difficulty meeting cost, schedule, and performance objectives for many of its projects. The need to effectively manage projects will gain even more importance as NASA seeks to manage its wide-ranging portfolio in an increasingly constrained fiscal environment.” – GAO, Assessments of Selected Large-Scale Projects, Feb 2010

Page 3:

Current Analysis Approach

• A subjective analysis of the data was executed based on the type (strength or weakness), frequency, and consistency of findings for each assessment criterion in NPR 7120.5D.

• This is not yet a rigorous scientific study and may never be. However, IPAO is implementing a new, more robust database that should enable a more in-depth analysis of SRB findings in the future.
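The tallying step described above can be sketched as a simple count of findings by criterion and type. The criterion names come from NPR 7120.5D; the sample findings below are purely illustrative, not actual SRB records.

```python
from collections import Counter

# Seven NPR 7120.5D assessment criteria used to allocate findings.
CRITERIA = ["Goals", "Technical", "Budget", "Schedule",
            "Resources", "Risk", "Management"]

# Hypothetical findings: (criterion, type) pairs, not real review data.
findings = [
    ("Technical", "strength"),
    ("Schedule", "issue"),
    ("Technical", "concern"),
    ("Management", "strength"),
]

# Frequency of each (criterion, type) combination across reviews.
tally = Counter(findings)
for criterion in CRITERIA:
    for kind in ("strength", "issue", "concern"):
        n = tally[(criterion, kind)]
        if n:
            print(f"{criterion:<11} {kind:<9} {n}")
```

A real analysis would read one such pair per documented finding from the review reports before tallying.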

3

Page 4:

Scope of the Data

• All findings evaluated for this study were the result of forty-seven reviews conducted by IPAO in FY09 and FY10.

• Final Reports, Management Briefings, and RM Notes were the sources for data.

4

Page 5:

NPR 7120.5D Assessment Criteria

• Goals: Alignment with and contributing to Agency needs, goals, and objectives, and the adequacy of requirements flow-down from those.

• Technical: Adequacy of technical approach, as defined by NPR 7123.1 entrance and success criteria.

• Budget: Adequacy of estimated costs (total and by fiscal year), including Independent Cost Analyses (ICAs) and Independent Cost Estimates (ICEs), against approved budget resources.

• Schedule: Adequacy of schedule.

• Resources: Adequacy/availability of resources other than budget.

• Risk: Adequacy of risk management approach and risk identification/mitigation.

• Management: Adequacy of management approach.

5

NPR 7120.5D (NID) combines budget and schedule; however, many reviews included in this study were conducted before release of the NID. For the purpose of this study, all findings were allocated to these seven criteria/categories.

Page 6:

Limitations of the Analysis

• Every review team has its own culture and team dynamic due to the varying backgrounds and personalities of the individual team members.
– Some teams/members are more inclined to document findings than others.
– Perceptions of the definitions are not precisely identical for each team (or even each team member within a team).

• Final reports/management briefings vary in structure.
– Evolving formats

• Interpretation of reports/briefings was needed to associate findings with the appropriate criterion; some (very few) were split and associated with multiple criteria.
– Subjective interpretation of the data allows for error and bias.
– Data were sent back to RMs for review and confirmation.

6

Page 7:

7

Results & Analysis

[Two bar charts by category (Goals, Technical, Budget, Schedule, Resources, Risk, Management): CY08 – CY09 Strengths (40 Reviews) and FY09 – FY10 Strengths (47 Reviews); totals 274 and 276. * Overlap of 15 months (28 Reviews)]

Page 8:

8

Results & Analysis

[Two bar charts by category, one outlier dropped from each: CY08-CY09 Strengths (39 Reviews) and FY09 - FY10 Strengths (46 Reviews); totals 249 and 251. * Overlap of 15 months (27 Reviews)]

Page 9:

Results & Analysis: Strengths

9

Reviews | Category | Type of Review | Authority | Final Briefing | CY | Goals | Technical | Budget | Schedule | Resources | Risk | Management

1 Category 1 MCR APMC October-08 2008 1 1 4

2 Program PIR APMC November-08 2008 1 3 2 11

3 Category 2 PDR/NAR DPMC December-08 2008 2 2

4 Category 2 PDR/NAR DPMC January-09 2008 10 1 14

5 Category 2 SIR DPMC February-09 2008 5

6 Flt Test CDR APMC December-09 2008

7 Category 2 ORR/FRR DPMC February-09 2009 1 6

8 Flt Test Special APMC April-09 2009 1 1 1 5

9 Program PIR APMC May-09 2009 1 1 1

10 Category 1 Rebaseline APMC June-09 2009 1 1 2

11 Category 1 PDR/NAR APMC July-09 2009 1 1 1 1 1 1 1

12 Category 1 CDR DPMC July-09 2009 1 1 1 1 6

13 Program PSDR/KDP I APMC August-09 2009 1 1 1

14 Category 2 CERR/PLAR DPMC September-09 2009

15 Category 2 Special DPMC September-09 2009 1 2

16 Category 2 SIR DPMC October-09 2009 1

17 Category 1 MCR APMC November-09 2009 4 1 1

18 Category 2 ORR/FRR DPMC November-09 2009

19 Program PIR APMC November-09 2009 2 1 1 5

20 Category 2 Rebaseline DPMC November-09 2009 1

21 Category 1 PSDR/KDP I APMC December-09 2009 1

22 Category 1 PDR/NAR APMC December-09 2009 2 2 2 2 1 1

23 Category 1 PDR/NAR APMC December-09 2009 1 1

24 Category 2 CDR DPMC December-09 2009 1 5 7

25 Category 2 SRR/MRR/PNAR DPMC December-09 2009 1 1 3

26 Program Incremental PPMC December-09 2009 1 2

27 Category 1 KDP-I APMC January-10 2009

28 Category 1 PDR APMC June-09/Dec-09 2009 1 1 1 2

29 Program PIR APMC January-10 2010 1 4

30 Category 2 PLAR DPMC January-10 2010

31 Category 1 CDR DPMC February-10 2010 1 2 1 1 1 1 1

32 Program Replan APMC March-10 2010 1

33 Category 1 SIR APMC March-10 2010 6 3 4 2 5

34 Category 1 SIR APMC March-10 2010 2 1

35 Program PIR APMC April-10 2010 2 6

36 Category 1 PDR APMC April-10 2010 1 1 3

37 Category 2 SIR DPMC June-10 2010 7 1 2

38 Category 2 PDR DPMC July-10 2010 2 1 1

39 Program PIR APMC July-10 2010 4

40 Program PAR APMC August-10 2010 1 3

41 Category 1 SDR GSFC August-10 2010 6 6

42 Category 1 CDR APMC August-10 2010 3 1 5

43 Category 2 CDR DPMC August-10 2010 1 1 3

44 Program PIR-MFR APMC August-10 2010 1 2

45 Category 2 SIR October-10 2010 1 1 1 1 2

46 Category 1 CDR Re-BL APMC December-10 2010

Page 10:

Results & Analysis: Strengths

• Programs were attributed strengths most frequently with respect to the Management and Technical assessment criteria.
– It may be that there are more opportunities for strengths with respect to the Management and Technical criteria.

• Relative to the Management and Technical criteria, Goals, Budget, Schedule, and Risk Management tend to receive very few strengths per review.

• Resources Other Than Budget received the fewest strengths.

10

Page 11:

11

Results & Analysis

[Two bar charts by category: CY08 – CY09 Issues (40 Reviews) and FY09-FY10 Issues (47 Reviews); totals 123 and 132. * Overlap of 15 months (28 of 46 Reviews)]

Page 12:

Results & Analysis: Issues

12

Reviews | Category | Type of Review | Authority | Final Briefing | CY | Goals | Technical | Budget | Schedule | Resources | Risk | Management

1 Category 1 MCR APMC October-08 2008 2

2 Program PIR APMC November-08 2008 1 1 2 2 1

3 Category 2 PDR/NAR DPMC December-08 2008 1 1

4 Category 2 PDR/NAR DPMC January-09 2008

5 Category 2 SIR DPMC February-09 2008

6 Flt Test CDR APMC December-09 2008

7 Category 2 ORR/FRR DPMC February-09 2009 1 1 1

8 Flt Test Special APMC April-09 2009 1

9 Program PIR APMC May-09 2009

10 Category 1 Rebaseline APMC June-09 2009 3 1 4

11 Category 1 PDR APMC July-09 2009 1

12 Category 1 CDR DPMC July-09 2009 3

13 Program PSDR/KDP I APMC August-09 2009 1 1 1

14 Category 2 CERR/PLAR DPMC September-09 2009

15 Category 2 Special DPMC September-09 2009 2 2 1

16 Category 2 SIR DPMC October-09 2009 3

17 Category 1 MCR APMC November-09 2009 2

18 Category 2 Rebaseline DPMC November-09 2009 2

19 Category 2 ORR/FRR DPMC November-09 2009

20 Program PIR APMC November-09 2009 1 1

21 Category 2 CDR DPMC December-09 2009

22 Category 1 PSDR/KDP I APMC December-09 2009 1 1 2

23 Category 1 PDR/NAR APMC December-09 2009 4 2 2

24 Category 1 PDR/NAR APMC December-09 2009 1 1 2

25 Category 2 SRR/MRR/PNAR DPMC December-09 2009 1 1 1

26 Program Incremental PPMC December-09 2009 1 1 1

27 Category 1 KDP-I APMC January-10 2009

28 Category 1 PDR APMC June-09/Dec-09 2009 1 3 1 1

29 Program PIR APMC January-10 2010 1 1 1

30 Category 2 PLAR DPMC January-10 2010

31 Category 1 CDR DPMC February-10 2010

32 Program Replan APMC March-10 2010

33 Category 1 SIR APMC March-10 2010 2 1 1

34 Category 1 SIR APMC March-10 2010 2 1

35 Program PIR APMC April-10 2010 4

36 Category 1 PDR APMC April-10 2010 1 1 1 1

37 Category 2 SIR DPMC June-10 2010 2 2 1

38 Category 2 PDR DPMC July-10 2010 2 1

39 Program PIR APMC July-10 2010

40 Program PAR APMC August-10 2010 3 1 3

41 Category 1 SDR GSFC August-10 2010 7 2 4 2

42 Category 1 CDR APMC August-10 2010

43 Category 2 CDR DPMC August-10 2010

44 Program PIR-MFR APMC August-10 2010 1 1 3

45 Category 2 SIR October-10 2010

46 Category 1 CDR Re-BL APMC December-10 2010 3 1

Page 13:

Results & Analysis: Issues

• Technical issues were the most frequent issues documented by SRB review teams over this two-year period, but they appear to be mostly isolated events.
– Integration may be a common theme.

• Schedule issues were the second most frequent issues identified and may be an easier opportunity for improvement.
– Schedule issues may be caused by poor planning or an overall lack of emphasis on good schedule practices.
– Insufficient time may have been allowed to achieve Level I milestones, perhaps complicated by insufficient funding.

• Budget issues, although the third most frequent, may be another opportunity for improvement if caused by management practices or Agency allowances.

13

Page 14:

Results & Analysis: Issues

• Management issues were the fourth most frequently identified; they appear to be mostly isolated events.
– Communication and integration may be common themes.
– They may be coupled with or related to issues in other areas.

• More strengths than issues were associated with Goals (12 strengths vs. 8 issues).
– … or 5 of 47 (10.6%) of programs/projects had at least one issue "aligning with and contributing to Agency needs, goals, and objectives and the adequacy of requirements flow-down from those."
– Opportunity for improvement

14

Page 15:

Results & Analysis: Issues

• It was previously noted that Resources had the fewest strengths, but Resources also had the third-fewest issues recognized by review teams (11 issues vs. 9 strengths).

• Only four issues (over two years and forty-seven reviews) were associated with “Adequacy of risk management approach and risk identification/mitigation.”

15

Page 16:

16

Results & Analysis

[Two bar charts by category: CY08 – CY09 Issues & Concerns (40 Reviews) and FY09-FY10 Issues & Concerns (47 Reviews); totals 341 and 362. * Overlap of 15 months (28 of 46 Reviews)]

Page 17:

17

Results & Analysis

[Two bar charts by category, one outlier dropped from each: CY08-CY09 Issues & Concerns (39 Reviews) and FY09-FY10 Issues & Concerns (46 Reviews); totals 282 and 303. * Overlap of 15 months (27 Reviews)]

Page 18:

Results & Analysis: Issues & Concerns

18

Reviews | Category | Type of Review | Authority | Final Briefing | CY | Goals | Technical | Budget | Schedule | Resources | Risk | Management

1 Category 1 MCR APMC October-08 2008 4 2 3

2 Program PIR APMC November-08 2008 2 1 3 2 2 3

3 Category 2 PDR/NAR DPMC December-08 2008 2 1

4 Category 2 PDR/NAR DPMC January-09 2008 2 27 7 7 16

5 Category 2 SIR DPMC February-09 2008 6 1 2 1

6 Flt Test CDR APMC December-09 2008 12 1

7 Category 2 ORR/FRR DPMC February-09 2009 1 4 1 1 2

8 Flt Test Special APMC April-09 2009 3 1 1

9 Program PIR APMC May-09 2009 2

10 Category 1 Rebaseline APMC June-09 2009 2 4 1 5 1

11 Category 1 PDR APMC July-09 2009 1 1 1

12 Category 1 CDR DPMC July-09 2009 5

13 Program PSDR/KDP I APMC August-09 2009 2 1 1 1 1 3

14 Category 2 CERR/PLAR DPMC September-09 2009

15 Category 2 Special DPMC September-09 2009 2 3 1

16 Category 2 SIR DPMC October-09 2009 4 5

17 Category 1 MCR APMC November-09 2009 4 1

18 Category 2 Rebaseline APMC November-09 2009 1 7 1

19 Category 2 ORR/FRR DPMC November-09 2009

20 Program PIR APMC November-09 2009 1 1 2

21 Category 2 CDR DPMC December-09 2009 6 3

22 Category 1 PSDR/KDP I APMC December-09 2009 1 1 2

23 Category 1 PDR/NAR APMC December-09 2009 1 5 4 2 2 2

24 Category 1 PDR/NAR APMC December-09 2009 1 1 2

25 Category 2 SRR/MRR/PNAR DPMC December-09 2009 1 1 1

26 Program Incremental PPMC December-09 2009 1 1 1

27 Category 1 KDP-I APMC January-10 2009

28 Category 1 PDR APMC June-09/Dec-09 2009 1 3 1 1

29 Program PIR APMC January-10 2010 3 1 2

30 Category 2 PLAR DPMC January-10 2010 2

31 Category 1 CDR DPMC February-10 2010 2 1 1

32 Category 1 SIR APMC March-10 2010 1 3

33 Category 1 SIR APMC March-10 2010 1 2 4 1

34 Program PIR APMC April-10 2010 5 1 1

35 Category 1 PDR APMC April-10 2010 1 1 6 1

36 Category 2 SIR DPMC June-10 2010 1 1 1 1

37 Category 2 PDR DPMC July-10 2010 2 2 1

38 Program PIR APMC July-10 2010 2 1

39 Program PAR APMC August-10 2010 1 1

40 Category 1 SDR GSFC August-10 2010 1 5 2 1 2 1 7

41 Category 1 CDR APMC August-10 2010 7 2 4 2

42 Category 2 CDR DPMC August-10 2010 1 1 1

43 Program PIR-MFR APMC August-10 2010

44 Category 2 SIR October-10 2010 1 1 1 7

45 Category 1 CDR Re-BL APMC December-10 2010 1 1

46 Program Replan APMC March-10 2010 3 1

Page 19:

Results & Analysis: Issues & Concerns

• When combining issues and concerns (i.e., weaknesses), Technical weaknesses appear to be the most frequently observed.
– There may be more opportunities for findings with respect to the Technical criterion.
– Mostly isolated events.
– Integration may be a common theme.

19

Page 20:

Results & Analysis: Issues & Concerns

• Management and Schedule weaknesses appear to be the second most frequent types of weakness observed.
– There may be more opportunities for findings with respect to the Management criterion.
– Schedule issues may be caused by poor planning or an overall lack of emphasis on good schedule practices.

20

Page 21:

Results & Analysis: Issues & Concerns

• Schedules received for assessment are typically not of good quality due to poor scheduling practices:
– Missing predecessors and successors
– Constrained dates
– Risks not mapped to the schedule
– Lack of a true critical path

• IPAO schedule analysts typically have to amend the schedules before they can assess them.

• The NASA Schedule Management Handbook provides guidance on good scheduling practices.
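One of the schedule-health checks listed above, finding tasks with no predecessor or successor links, can be sketched as below. The task IDs, field names, and data are hypothetical illustrations, not the IPAO toolset or an actual project schedule.

```python
# Hypothetical schedule: each task lists its predecessor and successor IDs.
tasks = {
    "A": {"preds": [],    "succs": ["B"]},   # start task: successors only
    "B": {"preds": ["A"], "succs": []},      # finish task: predecessors only
    "C": {"preds": [],    "succs": []},      # dangling: no logic links at all
}

def dangling_tasks(schedule):
    """Return IDs of tasks with neither predecessors nor successors,
    i.e. tasks disconnected from the network logic."""
    return [tid for tid, t in schedule.items()
            if not t["preds"] and not t["succs"]]

print(dangling_tasks(tasks))  # -> ['C']
```

A schedule with many such disconnected tasks cannot yield a true critical path, which is why analysts must amend it before assessment.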

21

Page 22:

Limitations of the Methodology

• The current methodology, in its present state of development, is inadequate to fulfill the ultimate purposes of this study:
– To identify systemic findings
– To assess trends

• A more in-depth analysis is needed.
– It should include the "why," or root cause.
– It will be facilitated by a more robust database of the characteristics of the findings.

22

Page 23:

Conclusions

• NASA appears to have offsetting strengths and weaknesses with respect to the Technical and Management criteria.
– Communication and integration may be areas for improvement.

• Schedule management may be the area that presents the best opportunity for improvement.
– IPAO has seen moderate improvement in the quality of schedules over the last year.

• SRBs appear to document more positive than negative findings with respect to Risk Management.
– 36% of reviews received at least one strength; 8.5% received at least one issue; 19% received either an issue or a concern.

23
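The Risk Management percentages above can be checked against the 47-review base. The counts of 17 and 9 reviews below are inferred from the stated percentages (they are the only integers that round to 36% and 19% of 47); only the 4 issues count is stated explicitly earlier in the deck.

```python
# Back-of-the-envelope check of the Risk Management percentages.
reviews = 47
counts = {
    "at least one strength": 17,  # inferred: 17/47 rounds to 36%
    "at least one issue": 4,      # stated on the earlier Issues slide
    "issue or concern": 9,        # inferred: 9/47 rounds to 19%
}
for label, n in counts.items():
    print(f"{label}: {n}/{reviews} = {100 * n / reviews:.1f}%")
```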

Page 24:

Recommendations

• Training for program/project personnel
– Emphasize the programmatics (Schedule and Budget combined dominate issues/concerns).

• Training for the SRB
– Emphasize the programmatics among the more technical members; their input helps facilitate the programmatic analysis.
– Risk Management process…
• Make sure we are evaluating more than the existence of a top-ten list.
– Identify the "why" or "root cause" with each finding.

24

Page 25:

Forward Work

• Further develop the methodology by formulating a classification for root cause and setting up a database that contains the entire context and associated information for each finding.
– Mine for systemic findings
– Ask the SRB to identify root cause

• Select a platform (Excel, Access, SQL, etc.) that will enable the extraction of more meaningful information.

• Coordinate with/solicit input from CAD & SID
– Explanation of Change Study
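The kind of findings database described above could be sketched with Python's built-in sqlite3 (SQL being one of the candidate platforms). The table layout, column names, and sample row below are assumptions for illustration, not an existing IPAO schema.

```python
import sqlite3

# In-memory database standing in for the proposed findings repository.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE findings (
        review_id   INTEGER,  -- which SRB review produced the finding
        criterion   TEXT,     -- one of the seven NPR 7120.5D criteria
        kind        TEXT,     -- strength / issue / concern
        root_cause  TEXT,     -- classification proposed in the forward work
        text        TEXT      -- full context of the finding
    )
""")
conn.execute(
    "INSERT INTO findings VALUES (1, 'Schedule', 'issue', "
    "'poor planning', 'No true critical path')"
)

# Mining for systemic findings: frequency per criterion and root cause.
rows = conn.execute("""
    SELECT criterion, root_cause, COUNT(*)
    FROM findings
    GROUP BY criterion, root_cause
""").fetchall()
print(rows)
```

Storing the full finding text alongside a root-cause classification is what would let the "mine for systemic findings" query above go beyond simple frequency counts.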

25

Page 26:

Back Up Slides

Page 27:

Results & Analysis: FY09-FY10 (47 Reviews)

27

[Bar chart by category (Goals, Technical, Budget, Schedule, Resources, Risk, Management) showing Strengths, Issues, Concerns, and combined Issues & Concerns per criterion]

Page 28:

Definitions of Findings (per SRB Handbook, effective Nov 12, 2009)

• Strength: A strength is a finding of the SRB that describes a feature of the P/p that in the judgment of the SRB is better than expected at a particular stage of the life-cycle.

• Issue: A finding by the SRB; SRB issues are documented and briefed to the P/p and the management councils; issues typically drive the SRB’s success criteria assessment and ultimate determination of the SRB rating for each review.

• Concern: A finding identified by the SRB; SRB concerns are typically documented and briefed to the P/p, but not specifically addressed with the management councils (unless asked).

28