

Page 1: How to Assess a Joint Confidence Level (JCL) Model

How to Assess a Joint Confidence Level (JCL) Model

Brian Berry

2017 NASA Cost and Schedule Symposium

August 29 – 31

Washington, D.C.

Page 2: How to Assess a Joint Confidence Level (JCL) Model

Outline

Background

JCL Process

NASA JCL Modeling Lessons Learned

JCL Model Assessment

Identify and track model and input changes over time

Validate calculations

Evaluate Active Risks

Evaluate deterministic critical path and probabilistic critical path and tasks

Quickly access/understand model results

Summary

2017 NASA Cost and Schedule Symposium 2

Page 3: How to Assess a Joint Confidence Level (JCL) Model

Background

Numerous JCL analyses have been performed since NASA policy established the JCL requirement in 2009

Many lessons learned with improvements to processes, tools, and analysis

70% JCL value is a high bar metric—difficult to meet

JCL model output value can become focus of analysis

Analysts need eyes wide open during entire process

Need to assess model, inputs, and outputs to help ensure the JCL value is robust

2017 NASA Cost and Schedule Symposium 3

Page 4: How to Assess a Joint Confidence Level (JCL) Model

JCL Process*

0. Identify goals of the JCL analysis

1. Develop a summary analysis schedule

2. Load cost onto the schedule activities

3. Incorporate the risk list

4. Conduct an uncertainty analysis

5. Calculate and view the results
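As a minimal illustration of step 5 (not the JACS implementation), the sketch below shows how a joint confidence level falls out of a simulation: each iteration produces a paired cost and finish-date outcome, and the JCL at a chosen cost/schedule point is the fraction of iterations that meet both targets. The sampling, values, and targets are illustrative assumptions only.

```python
import random
from datetime import date, timedelta

random.seed(1)

# Illustrative iteration results: each simulation iteration yields a
# (total cost, finish date) pair. Real values would come from the
# JACS/ACEIT simulation engine, not from this toy generator.
iterations = []
for _ in range(10_000):
    cost = random.triangular(250_000, 340_000, 275_000)   # $K, assumed distribution
    slip = int(random.triangular(0, 400, 60))              # workdays of schedule slip
    finish = date(2021, 7, 26) + timedelta(days=slip)
    iterations.append((cost, finish))

def jcl(iterations, cost_target, finish_target):
    """Fraction of iterations that satisfy BOTH the cost and schedule targets."""
    hits = sum(1 for c, f in iterations if c <= cost_target and f <= finish_target)
    return hits / len(iterations)

print(jcl(iterations, cost_target=300_000, finish_target=date(2022, 6, 30)))
```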

2017 NASA Cost and Schedule Symposium

* NASA Cost Estimating Handbook V4.0, pJ-11

4

Page 5: How to Assess a Joint Confidence Level (JCL) Model

NASA JCL Modeling Lessons Learned

Durations that may be lengthy (> 44 days per best practices) can be acceptable for JCL analysis schedules

Analysis schedules must be structurally sound, with solid logic, and be accurately statused. For use in SRAs and JCLs, the network must be able to move freely, without being limited by constraints

All main phases of the schedule are linked together so that, as durations fluctuate during simulation, the schedule as a whole expands and contracts appropriately, with no unnecessary constraints

Linkage (relationship) types are important: strive for >90% Finish-to-Start (FS) and avoid tempting shortcuts such as creating many Start-to-Start linkages with lags to mimic Finish-to-Start (a simple automated check of the linkage mix is sketched after this list)

JACS reporting capabilities can assist in model verification

Preds tool: see what tasks each task feeds; each should support a milestone

Cost contributors chart: the delta between estimate and mean should reflect the intended cost uncertainty

Criticality Index: ensure relative differences between tasks make sense

Risk Event Ranking: use probability/impact outputs to understand model impacts
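The linkage-type guidance above lends itself to a simple automated check. Below is a minimal sketch (not a JACS feature) that computes the Finish-to-Start share and counts Start-to-Start links with lags; the link tuples and field layout are illustrative assumptions.

```python
from collections import Counter

# Illustrative link records exported from the schedule:
# (predecessor ID, successor ID, relationship type, lag in days)
links = [
    (5, 7, "FS", 0),
    (7, 8, "FS", 0),
    (36, 37, "SS", 140),
    (37, 38, "SS", 80),
    (15, 25, "FF", 0),
]

def linkage_mix(links):
    counts = Counter(rel_type for _, _, rel_type, _ in links)
    total = sum(counts.values())
    fs_share = counts.get("FS", 0) / total
    ss_with_lag = sum(1 for _, _, t, lag in links if t == "SS" and lag > 0)
    return fs_share, ss_with_lag

fs_share, ss_with_lag = linkage_mix(links)
print(f"FS share: {fs_share:.0%} (guideline: >90%)")
print(f"SS links with lags (possible FS shortcuts): {ss_with_lag}")
```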

2017 NASA Cost and Schedule Symposium 5

Page 6: How to Assess a Joint Confidence Level (JCL) Model

NASA Emphasized JACS Development and Reports Features to Support Key Elements of the JCL Analysis Lifecycle

Assess Model Quality – Health Checks, JCL Inputs

Identify Model Changes – Model Comparison

Validate Calculations – Baseline Comparison

Insight into Model Results – Critical Path Report, Insight Application

Compare Scenario Results – Insight Report Compare

2017 NASA Cost and Schedule Symposium 6

Page 7: How to Assess a Joint Confidence Level (JCL) Model

Assessing Model Quality

Key to a JCL Model is the underlying Schedule Structure and Cost, Risk, and Uncertainty Inputs

Several features were implemented in JACS to provide rapid access to these areas

Health Checks

– Checks quality of underlying schedule and reviews syntax, logic, and specification of JCL model inputs to ensure they are proper

JCL Inputs

– Provides insight into application of JCL uncertainty distributions and risks to the underlying schedule and cost

2017 NASA Cost and Schedule Symposium 7

Page 8: How to Assess a Joint Confidence Level (JCL) Model

Health Checks

Health Check Report includes: GAO Best Practices, DCMA 14-Point Assessment, JACS Health, Issue Counts, and an Issue List

Provides many ways to help ensure that a sound schedule, uncertainty, and cost loading will underpin the JCL
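As a rough illustration of the kind of checks such a report rolls up (not the actual GAO, DCMA, or JACS rule set), the sketch below tallies a few common schedule-quality issues from assumed task records: missing logic, hard constraints, and remaining durations above the 44-workday guideline mentioned earlier.

```python
# Illustrative task records: (ID, remaining duration in workdays,
# constraint type, predecessor IDs, successor IDs)
tasks = [
    (4,  45,  "ASAP",           [3],  [5]),
    (12, 175, "ASAP",           [5],  [7, 27, 52]),
    (36, 530, "ASAP",           [34], [37, 49, 54]),
    (99, 30,  "Must Finish On", [],   []),
]

issues = {
    "missing predecessors": [t[0] for t in tasks if not t[3]],
    "missing successors":   [t[0] for t in tasks if not t[4]],
    "hard constraints":     [t[0] for t in tasks if t[2] != "ASAP"],
    "duration > 44 days":   [t[0] for t in tasks if t[1] > 44],
}

for name, ids in issues.items():
    print(f"{name}: {len(ids)} task(s) {ids}")
```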

2017 NASA Cost and Schedule Symposium 8

Page 9: How to Assess a Joint Confidence Level (JCL) Model

[Example report excerpt – JCL Inputs, Details tab: a task-level listing showing, for each row, Task ID, Unique Task ID, Name, Task Details, Uncertainty Correlation Group, Task Type, Status, Start, Finish, Duration (Days), % Complete, Remaining Duration, Constraint, Constraint Date, Actual Start/Finish, Predecessors, Successors, WBS Number, and Total/TI/TD Cost. The example rows cover the future EMD hardware/software design activities and the Production activities, each carrying duration uncertainty and the EMDDURATION/EMDCOSTS or PRODDURATION/PRODCOSTS correlation groups.]

[Example report excerpt – JCL Inputs, Summary tab: schedule statistics (54 total IMS lines, 15 detail activities, 25 milestones, 7 risk events, 4 risk factors, plus hammock and summary task counts and activity counts by completion status and remaining duration); cost statistics (14 in-progress/future cost-loaded tasks, TI-only/TD-only/TI-and-TD breakdowns, and a point estimate cost of $267,476.000 split into $116,067.500 (43%) of TI cost and $151,408.500 (57%) of TD cost); and risk statistics (13 risk events/factors in total: 9 risk events and 4 risk factors, broken out by likelihood band and by whether duration and/or cost uncertainty is specified).]

JCL Inputs

Comprehensive metrics/information on

Baseline Schedule

Risk events, risk factors

Cost Loading

Uncertainty

Correlation

Goes beyond basic health characteristics to assess reasonableness of inputs

2017 NASA Cost and Schedule Symposium 9

Page 10: How to Assess a Joint Confidence Level (JCL) Model

Identifying Model Changes

JCL models go through several changes during their lifecycle

During project development the underlying schedule structure may change

During drops to the SRB leading up to the final review, the schedule and modeling inputs may change

After the model is provided to the SRB, SRB analysts may change model inputs (e.g., uncertainty distributions, risk attributes, add/delete risks)

The JACS development team assessed the requirement and identified the need to compare differences in schedule structure and model inputs

Design was based on adding functionality complementary (not supplementary) to MS Project

– MS Project desktop does not have a model tracking feature

– MS Project Server can conduct journal tracking if the feature is enabled and the server database is structured appropriately

Design was also based on the flow of models: models are typically dropped/released over time, so a process structured on comparing model snapshots was deemed the best approach

A new feature was implemented in JACS to provide model comparison

Model Comparison – checks structure and inputs between JCL model files and between JCL scenarios within a specific model
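A minimal sketch of the snapshot-comparison idea (not the JACS implementation): represent each model drop as a mapping from unique task ID to a few key attributes, then report attribute-level deltas and unmatched tasks. The attribute names and example values are illustrative assumptions.

```python
# Two illustrative model snapshots keyed by unique task ID.
active = {
    282: {"name": "EMD PDR Milestone", "finish": "2015-04-13", "duration": 0},
    13:  {"name": "EMD HW Final Design", "finish": "2017-08-07", "duration": 320},
    286: {"name": "EMD HW Malfunction (RR#1)", "finish": "2017-08-07", "duration": 0},
}
comparison = {
    282: {"name": "EMD PDR Milestone", "finish": "2015-05-11", "duration": 0},
    13:  {"name": "EMD HW Final Design", "finish": "2017-09-04", "duration": 320},
}

def compare_models(active, comparison):
    """Yield (unique ID, attribute, active value, comparison value) deltas."""
    for uid, attrs in active.items():
        other = comparison.get(uid)
        if other is None:
            yield uid, "<no match in comparison model>", attrs, None
            continue
        for key, value in attrs.items():
            if other.get(key) != value:
                yield uid, key, value, other.get(key)

for delta in compare_models(active, comparison):
    print(delta)
```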

2017 NASA Cost and Schedule Symposium 10

Page 11: How to Assess a Joint Confidence Level (JCL) Model

JCL Model Comparison - Details

[Example report excerpt – Model Comparison, Details tab: for each task it lists the Unique Task ID, the Active and Comparison Task IDs, the Active and Comparison task names, task type, calendar, a match flag, and the Active vs. Comparison start, finish, and duration with their deltas. In the example, several EMD design tasks and milestones show a 20-day delta between the active and comparison schedules, and a risk event present only in the active schedule (EMD HW Malfunction, RR#1) is flagged as having no match in the comparison schedule.]

Model Comparison—Example

Feature compares

Tasks

Calendars

Dates

Duration

Logic

Costs

Distributions

Risks

2017 NASA Cost and Schedule Symposium 11

JCL Model Comparison - Summary

Report date: 8/7/2017

Active project: JACS Example File 02a Advanced - with Risk Factors

Comparison project: JACS Example File 02a Advanced - with Risk Factors chng for model comp

[Example report excerpt – Model Comparison, Summary tab: compares schedule status (status date, current start/finish, project duration, total cost, estimated workdays remaining, percent complete) and schedule statistics (total IMS lines, activities, milestones, schedule milestones, risk events, risk factors, hammock milestones) between the two models. In the example, both models start 10/1/2013 and carry the same total cost of $267,476.000, but the active schedule finishes 7/26/2021 (2,040 days) versus 8/23/2021 (2,060 days) for the comparison schedule, and the active schedule contains two additional risk-event rows.]

Page 12: How to Assess a Joint Confidence Level (JCL) Model

Validating Calculations

The Baseline Delta Report was added to verify that the JACS calculation returns values similar* to those in the Project/P6 model

Columns (each row below follows this order): Task Name | Duration (Project) | Duration (JACS) | Duration Δ (days) | Start (Project) | Start (JACS) | Start Δ (days) | Finish (Project) | Finish (JACS) | Finish Δ (days) | Is Critical (Project) | Is Critical (JACS) | Is Critical Match | Total Slack (Project) | Total Slack (JACS) | Total Slack Δ (days)

Missile System Project 2040 2039 (1) 10/1/2013 10/1/2013 0.00 7/26/2021 7/26/2021 0.00 Yes Yes Yes 0 -1 (1)

Contract Award Delay 45 45 0 10/1/2013 10/1/2013 0.00 12/2/2013 12/2/2013 0.00 Yes Yes Yes 0 0 0

Planned Award 0 0 0 10/1/2013 10/1/2013 0.00 10/1/2013 10/1/2013 0.00 Yes Yes Yes 0 0 0

Delayed Award 45 45 0 10/1/2013 10/1/2013 0.00 12/2/2013 12/2/2013 0.00 Yes Yes Yes 0 -1 (1)

Contract Award 0 0 0 12/2/2013 12/2/2013 0.00 12/2/2013 12/2/2013 0.00 Yes Yes Yes 0 -1 (1)

Milestones 465 465 0 8/4/2014 8/4/2014 0.00 5/16/2016 5/16/2016 0.00 Yes Yes Yes 0 0 0

EMD SDR Milestone 0 0 0 8/4/2014 8/4/2014 0.00 8/4/2014 8/4/2014 0.00 Yes Yes Yes 0 0 0

EMD PDR Milestone 0 0 0 4/13/2015 4/13/2015 0.00 4/13/2015 4/13/2015 0.00 Yes Yes Yes 0 0 0

EMD CDR Milestone 0 0 0 5/16/2016 5/16/2016 0.00 5/16/2016 5/16/2016 0.00 Yes Yes Yes 0 0 0

EMD Phase 1075 1075 0 12/2/2013 12/2/2013 0.00 1/15/2018 1/15/2018 0.00 Yes Yes Yes 0 0 0

EMD Hardware 960 960 0 12/3/2013 12/2/2013 0.00 8/7/2017 8/7/2017 0.00 Yes Yes Yes 0 0 0

EMD HW System Design 175 175 0 12/3/2013 12/2/2013 0.00 8/4/2014 8/4/2014 0.00 Yes Yes Yes 0 0 0

EMD HW Initial Design 180 180 0 8/5/2014 8/4/2014 (0.13) 4/13/2015 4/13/2015 0.00 Yes Yes Yes 0 0 0

EMD HW Detailed Design 285 285 0 4/14/2015 4/13/2015 (0.25) 5/16/2016 5/16/2016 0.00 Yes Yes Yes 0 0 0

EMD HW Final Design 320 320 0 5/17/2016 5/16/2016 (0.38) 8/7/2017 8/7/2017 0.00 No No Yes 115 115 0

EMD HW Malfunction (RR#1) 0 0 0 8/7/2017 8/7/2017 0.00 8/7/2017 8/7/2017 0.00 No No Yes 115 115 0

EMD Software 1075 1075 0 12/3/2013 12/2/2013 0.00 1/15/2018 1/15/2018 0.00 Yes Yes Yes 0 0 0

EMD SW System Design 175 175 0 12/3/2013 12/2/2013 0.00 8/4/2014 8/4/2014 0.00 Yes Yes Yes 0 0 0

EMD SW Initial Design 180 180 0 8/5/2014 8/4/2014 (0.13) 4/13/2015 4/13/2015 0.00 Yes Yes Yes 0 0 0

2017 NASA Cost and Schedule Symposium

* Note: JACS calculation engine recreates the schedule logic internally in order to quickly and efficiently run the CPM (critical path method) engine
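A minimal sketch of the same validation idea (not the report's actual logic): line up the deterministic values each tool produces for a task and flag deltas beyond a small tolerance. The field set, example rows, and one-day tolerance are assumptions.

```python
# (task name, Project duration, JACS duration, Project total slack, JACS total slack)
rows = [
    ("Missile System Project", 2040, 2039, 0, -1),
    ("Contract Award Delay",     45,   45, 0,  0),
    ("EMD HW Final Design",     320,  320, 115, 115),
]

TOLERANCE = 1  # days; tiny deltas can come from how the CPM engine rounds

for name, dur_p, dur_j, slack_p, slack_j in rows:
    flags = []
    if abs(dur_p - dur_j) > TOLERANCE:
        flags.append(f"duration delta {dur_p - dur_j}")
    if abs(slack_p - slack_j) > TOLERANCE:
        flags.append(f"slack delta {slack_p - slack_j}")
    print(name, "OK" if not flags else "; ".join(flags))
```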

12

Page 13: How to Assess a Joint Confidence Level (JCL) Model

Results Insight

The ability to quickly access and understand the model results is a significant strength of JACS and a key need for JCL analysts

Several New Features were implemented in JACS to provide rapid access for JCL analysts

Critical Path Report – provides insight into the deterministic critical path and the probabilistic critical path

Enables identification of tasks that are hidden drivers (on the probabilistic path but not the deterministic path) or non-drivers (on the deterministic path but not the probabilistic path); a sketch of this check follows this list

Active Risk Impact – provides insight into the true impact a risk will have on results when it occurs

Focuses only on the conditional scenarios in which the risk is active, giving better insight into how much of a driver a risk could become if its likelihood were to increase

Enhanced Insight Viewer – additional reports were added to the Insight Viewer, along with the ability to specifically identify a task or summary to evaluate
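A minimal sketch of the hidden-driver check described above (not the JACS report logic): combine each task's criticality index from the simulation with its deterministic critical-path flag. The example values are loosely based on the example report on the next page; the 30% threshold is an assumption.

```python
# (task name, on deterministic critical path?, criticality index from simulation)
tasks = [
    ("EMD HW Final Design",        False, 0.18),
    ("EMD SW Refinements",         True,  0.63),
    ("Prod Common Support Equip.", False, 0.00),
    ("EMD HW Malfunction (RR#1)",  False, 0.63),
]

THRESHOLD = 0.30  # assumed cutoff for "frequently critical"

for name, on_plan_cp, criticality in tasks:
    if not on_plan_cp and criticality >= THRESHOLD:
        label = "hidden driver (probabilistic only)"
    elif on_plan_cp and criticality < THRESHOLD:
        label = "possible non-driver (deterministic only)"
    else:
        label = "consistent"
    print(f"{name}: {label} (criticality {criticality:.0%})")
```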

2017 NASA Cost and Schedule Symposium 13

Page 14: How to Assess a Joint Confidence Level (JCL) Model

Results Insight – Critical Path Analysis

ACTIVITIES

[Example report excerpt – Critical Path Analysis Report for Activities; Schedule: JACS Example File 02a Advanced - with Risk Factors.mpp; report date 8/7/2017. For each task and risk event it lists Task ID, activity name, Unique ID, Risk ID, start, mean duration, whether it is on the plan (deterministic) critical path, its criticality index, its duration-uncertainty distribution and correlation group, and its predecessors and successors. In the example, EMD HW Final Design is off the plan critical path yet carries an 18% criticality, while Prod Procure Materials and Prod Air Vehicle each carry 81% criticality.]

RISKS

[Example report excerpt – Critical Path Analysis Report for Risks, same schedule. For each activated risk event it lists whether it is on the plan critical path, its criticality, likelihood, and conditional criticality, and its mean duration, mean TI cost, and mean TD cost when the risk is active, plus predecessors and successors. In the example, EMD SW Design Failure (RR#2) has a 30% likelihood, 22.6% criticality, and 75.3% conditional criticality.]

2017 NASA Cost and Schedule Symposium 14

Page 15: How to Assess a Joint Confidence Level (JCL) Model

Results Insight – Active Risk Impact

2017 NASA Cost and Schedule Symposium

Risk Event Ranking

Schedule: JACS Example File 02a Advanced - with Risk Factors.mpp

Report date: 8/7/2017

[Example report excerpt: ranks each activated risk event by its contribution to the schedule when it lands on the critical path. For each risk it lists the impacted task, ranking, overall and active relative contributions, likelihood, criticality, conditional criticality, overall/active/CoC mean durations, and overall/active CoC duration impacts. EMD HW Malfunction (RR#1) ranks first, with a 20.0% likelihood, 12.6% criticality, 63.0% conditional criticality, a CoC mean duration of 174.9 days, and overall/active CoC duration impacts of 22.0 and 110.2 days; Air Vehicle Test Failure (RR#5) never lands on the critical path in the example and ranks last.]

Ranking: Ranking of risk events based on contribution to schedule when risk is on the critical path (order of the Active Relative Contribution values)

Relative Contribution: A risk's proportion of the sum of all risks' Overall/Active CoC Duration Impacts

Probabilities

Likelihood: Probability risk is active (user-defined)

Criticality: Probability risk is on the critical path

Conditional Criticality: Probability risk is critical, given it is active

Impacts

Overall Mean Duration: Average duration of risk, considering all iterations

Active Mean Duration: Average duration of risk, considering only those iterations when it is active

Conditional Criticality Mean Duration: Average duration of risk, considering only those iterations when it is active and lands on the critical path

Overall Conditional Criticality Duration Impact: Expected impact to the critical path considering all iterations (CoC Mean * Criticality)

Active Conditional Criticality Duration Impact: Expected impact to the critical path considering only those iterations when it lands on the critical path (CoC Mean * Conditional Criticality)
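As a worked example using the top-ranked risk above: EMD HW Malfunction (RR#1) has a CoC mean duration of 174.9 days, a criticality of 12.6%, and a conditional criticality of 63.0%, so its Overall CoC Duration Impact is 174.9 × 0.126 ≈ 22.0 days and its Active CoC Duration Impact is 174.9 × 0.630 ≈ 110.2 days, matching the report's values.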

15

Page 16: How to Assess a Joint Confidence Level (JCL) Model

Compare Scenario Results

2017 NASA Cost and Schedule Symposium

A key aspect of JCL analysis is assessing scenarios, including during the review process, where the SRB may adjust inputs it feels should be modified (e.g., risks and distributions) and assess the resulting change

A new capability was added to Insight to allow comparison
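A minimal sketch of what such a comparison surfaces (not the Insight Report Compare output): line up key percentile results from two scenarios and report the deltas. The scenario names and numbers are illustrative assumptions, not results from the example model.

```python
# Illustrative percentile results: percentile -> (total cost in $K, finish date)
baseline = {50: (268_000, "2021-09-15"), 70: (281_000, "2021-12-20")}
srb_adjusted = {50: (275_000, "2021-11-02"), 70: (293_000, "2022-03-08")}

for pct in sorted(baseline):
    (c0, f0), (c1, f1) = baseline[pct], srb_adjusted[pct]
    print(f"P{pct}: cost {c0:,} -> {c1:,} ({c1 - c0:+,}); finish {f0} -> {f1}")
```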

16

Page 17: How to Assess a Joint Confidence Level (JCL) Model

JCL Model Assessment

2017 NASA Cost and Schedule Symposium

JACS Reports menu—access to tools/outputs to assess models

Note: Dashboard style reports available in JACS – Insight analysis tool

Assesses both the schedule and JACS specific entries and generates a report to help you find and correct deviations from best practices

Summarizes and outlines tasks impacted by risk factors

Summarizes every aspect of the contents of the schedule at a high level as well as a detailed breakdown for each row

Compares the values JACS calculates without uncertainty (often referred to as the plan values or the point estimate values) to the values Microsoft Project calculates

Provides a comprehensive output of all the costs on all the rows in the model broken out by FY

Compares 2 JACS files in great detail—Flags any differences in inputs, outputs, and attributes

Reveals critical information regarding the effects of risk events—metrics include conditional criticality

Provides an overview of each Task and Risk Event’s relationship to the critical path individually that includes key metrics

Classic risk statistics report with filter options for duration & cost by task

17

Page 18: How to Assess a Joint Confidence Level (JCL) Model

JACS is available to all NASA users

JACS is included within the ACEIT software suite

NASA has volume licensing for nine (9) NASA centers

NASA Civil Servants and in-situ NASA Contractors have full access to ACEIT and JACS

JACS software is available for download from www.oncedata.com

One of the most frequent downloads

JACS technical support is fully covered by the ACEIT license

Contact [email protected] to enter a help ticket

A formal JACS Training Course is available

It has previously been held at NASA HQ, NASA LaRC, and NASA JSC

2017 NASA Cost and Schedule Symposium 18

Page 19: How to Assess a Joint Confidence Level (JCL) Model

Summary

Developing a sound JCL model is a challenging task involving many teams, inputs, and moving parts

JACS’ analytical tools help assess Schedule Structure, Cost, Risk, and Uncertainty Inputs during JCL modeling process

Exploiting these tools/reports helps ensure the JCL model best simulates the program/project

2017 NASA Cost and Schedule Symposium 19

Page 20: How to Assess a Joint Confidence Level (JCL) Model

THANK YOU

2017 NASA Cost and Schedule Symposium 20