COSYSMO Working Group Meeting
Industry Calibration results
Ricardo “two months from the finish line” Valerdi
USC Center for Software Engineering & The Aerospace Corporation
Morning Agenda
7:30 Continental Breakfast (in front of Salvatori Hall)
8:30 Introductions [All]
9:00 Brief overview of COSYSMO [Ricardo]
9:15 Calibration results [Ricardo]
9:45 Break
10:15 Size driver counting rules exercise [All]
11:15 Mini Delphi for EIA 632 activity distributions [All]
12:00 Lunch (in front of Salvatori Hall)
Afternoon Agenda
1:00 Joint meeting with COSOSIMO workshop [JoAnn Lane]
2:00 COSYSMO Risk/Confidence Estimation Prototype [John Gaffney]
2:45 Break
3:15 Open issues
Local calibrations
Lies, damned lies, and statistical outliers
Future plans for COSYSMO 2.0 (including ties to SoS work)
4:30 Action items for next meeting: July 2005 in Keystone, CO
5:00 Adjourn
7-step Modeling Methodology
1. Analyze existing literature
2. Perform behavioral analysis
3. Identify relative significance
4. Perform expert-judgment Delphi assessment
5. Gather project data
6. Determine Bayesian a-posteriori update
7. Gather more data; refine model
WE ARE HERE
COSYSMO Operational Concept
[Block diagram: Size Drivers (# Requirements, # Interfaces, # Scenarios, # Algorithms, plus Volatility Factors) and Effort Multipliers (Application factors: 8; Team factors: 6) feed the COSYSMO model, which produces Effort; Calibration feeds back into the model; WBS guided by EIA/ANSI 632]
COSYSMO Cost Drivers
• Application Factors
  – Requirements understanding
  – Architecture understanding
  – Level of service requirements
  – Migration complexity
  – Technology maturity
  – Documentation match to life cycle needs
  – # and diversity of installations/platforms
  – # of recursive levels in the design
• Team Factors
  – Stakeholder team cohesion
  – Personnel/team capability
  – Personnel experience/continuity
  – Process maturity
  – Multisite coordination
  – Tool support
COSYSMO 1.0 Calibration Data Set
• Collected 35 data points
• From 6 companies; 13 business units
• No single company had > 30% influence
COSYSMO Data Sources
• Raytheon: Intelligence & Information Systems (Garland, TX)
• Northrop Grumman: Mission Systems (Redondo Beach, CA)
• Lockheed Martin: Transportation & Security Solutions (Rockville, MD); Integrated Systems & Solutions (Valley Forge, PA); Systems Integration (Owego, NY); Aeronautics (Marietta, GA); Maritime Systems & Sensors (Manassas, VA)
• General Dynamics: Maritime Digital Systems/AIS (Pittsfield, MA); Surveillance & Reconnaissance Systems/AIS (Bloomington, MN)
• BAE Systems: National Security Solutions/ISS (San Diego, CA); Information & Electronic Warfare Systems (Nashua, NH)
• SAIC: Army Transformation (Orlando, FL); Integrated Data Solutions & Analysis (McLean, VA)
Data Champions
• Gary Thomas, Raytheon
• Steven Wong, Northrop Grumman
• Garry Roedler, LMCO
• Paul Frenz, General Dynamics
• Sheri Molineaux, General Dynamics
• Fran Marzotto, General Dynamics
• John Rieff, Raytheon
• Jim Cain, BAE Systems
• Merrill Palmer, BAE Systems
• Bill Dobbs, BAE Systems
• Donovan Dockery, BAE Systems
• Mark Brennan, BAE Systems
• Ali Nikolai, SAIC
Meta Properties of Data Set
Almost half of the data received was from Military/Defense programs. 55% was from Information Processing systems and 32% was from C4ISR.
Two-thirds of the projects were software-intensive. The first 4 phases of the SE life cycle were adequately covered.
Industry Calibration Factor

PM_NS = A · [ Σ_k ( w_e,k·Φ_e,k + w_n,k·Φ_n,k + w_d,k·Φ_d,k ) ]^E · Π_j EM_j

where the w's are the easy/nominal/difficult weights for each size driver k, E is the scale exponent, and EM_j are the effort multipliers. Calculation is based on the aforementioned data (n = 35):

ln(SE_HRS) = 3.14 + 1.01 · ln(Size)
SE_HRS = 22.87 · Size^1.01

This calibration factor must be adjusted for each organization. There is evidence of diseconomies of scale (partially captured in the Size driver weights).
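The calibrated relationship above can be sketched in code. A = 22.87 and E = 1.01 come from the slide's regression; the effort-multiplier values in the usage example are invented placeholders, not calibrated COSYSMO ratings.

```python
def cosysmo_effort_hours(size, effort_multipliers=(), A=22.87, E=1.01):
    """SE_HRS = A * Size^E * product of effort multipliers EM_j.

    `size` is the weighted functional size; each EM_j > 1 inflates
    effort, each EM_j < 1 reduces it (1.0 is nominal).
    """
    em_product = 1.0
    for em in effort_multipliers:
        em_product *= em
    return A * size ** E * em_product

# Nominal project (all EMs = 1.0) at the minimum observed size of 82;
# the two multipliers below (0.9, 1.2) are purely illustrative.
print(round(cosysmo_effort_hours(82), 1))
print(round(cosysmo_effort_hours(82, (0.9, 1.2)), 1))
```

Because E slightly exceeds 1, hours grow a little faster than size — the diseconomy of scale noted above.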
Size Driver Influence on Functional Size
# of Interfaces and # of Algorithms proved to be less significant drivers. # of Scenarios and # of Requirements accounted for 83% of functional size.
N = 35
Parameter Transformation
Size vs. Effort (35 projects)
R-squared = 0.55
Range of SIZE: Min = 82, Max = 17,763
Range of SE_HRS: Min = 881, Max = 1,377,458
Intra-Size Driver Correlation

       REQ   INTF  ALG   OPSC
REQ    1.0
INTF   0.63  1.0
ALG    0.48  0.64  1.0
OPSC   0.59  0.32  0.05  1.0

• REQ & INTF are highly correlated (0.63)
• ALG & INTF are highly correlated (0.64)
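A correlation table like the one above can be reproduced with a plain Pearson computation. The per-project driver counts below are invented for illustration; the real table was built from the 35-project data set.

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical counts per project: [REQ, INTF, ALG, OPSC]
projects = [
    [120, 10, 4, 6],
    [450, 25, 9, 14],
    [80, 6, 2, 3],
    [900, 60, 30, 20],
    [300, 18, 12, 8],
]
drivers = list(zip(*projects))  # column-wise: REQ, INTF, ALG, OPSC
names = ["REQ", "INTF", "ALG", "OPSC"]
for i in range(len(names)):
    for j in range(i + 1, len(names)):
        print(f"{names[i]} vs {names[j]}: {pearson(drivers[i], drivers[j]):.2f}")
```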
A Day In the Life… Common problems
• Requirements reported at “sky level” rather than “sea level”
  – Test: if REQS < OPSC, then investigate
  – Often too high; requires some decomposition
• Interfaces reported at “underwater level” rather than “sea level”
  – Test: if INTF source = pin or wire level, then investigate
  – Often too low; requires investigation of physical or logical I/F
We will revisit these issues later
A Day In the Life… (part 2)
Common problems (cont.)
• Algorithms not reported
  – Only size driver omitted by 14 projects spanning 4 companies
  – Still a controversial driver; divergent support
• Operational Scenarios not reported
  – Only happened thrice (scope of effort reported was very small in all cases)
  – Fixable; involved going back to V&V documentation to extract at least one OPSC
We will revisit these issues later
The Case for Algorithms (N = 21)
• Reasons to keep ALG in model
  – Accounts for 16% of the total size in the 21 projects that reported ALG
  – It is described in the INCOSE SE Handbook as a crucial part of SE
• Reasons to drop ALG from model
  – Accounts for 9% of total SIZE contribution
  – Omitted by 14 projects across 4 companies
  – Highly correlated with INTF (0.64)
  – Has a relatively small (0.53) correlation with Size (compared to REQ 0.91, INTF 0.69, and OPSC 0.81)
Cost Drivers
• Original set consisted of > 25 cost drivers
• Reduced to 8 “application” and 6 “team” factors
• See correlation handout
• Regression coefficient improved from 0.55 to 0.64 with the introduction of cost drivers
• Some may be candidates for elimination or aggregation
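The 0.55 baseline fit is a log-log regression of effort on size; a minimal sketch of that kind of fit follows. The (size, hours) pairs are synthetic stand-ins, not the 35-project data set, so the numbers printed are illustrative only.

```python
from math import log

def fit_loglog(sizes, hours):
    """Least-squares fit of ln(hours) = a + b*ln(size); returns (a, b, R^2)."""
    xs = [log(s) for s in sizes]
    ys = [log(h) for h in hours]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    ss_res = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return a, b, 1 - ss_res / ss_tot

# Invented data roughly spanning the observed size range (82 to 17,763)
sizes = [82, 150, 400, 1200, 5000, 17763]
hours = [900, 2100, 7500, 30000, 120000, 500000]
a, b, r2 = fit_loglog(sizes, hours)
print(f"exponent E = {b:.2f}, R^2 = {r2:.2f}")
```

Adding cost drivers as extra regressors is what lifted R^2 from 0.55 to 0.64 on the real data.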
Cost Drivers: Application Factor Distribution (RQMT, ARCH, LSVC, MIGR)
[Histograms of project counts by rating: Requirements Understanding (RQMT), Architecture Understanding (ARCH), and Level of Service Requirements (LSVC) rated Very Low to Very High; Migration Complexity (MIGR) rated Nominal to Extra High]
Cost Drivers: Application Factor Distribution (TMAT, DOCU, INST, RECU)
[Histograms of project counts by rating: Technology Maturity (TMAT), Documentation (DOCU), and Recursive Levels in the Design (RECU) rated Very Low to Very High; Installations & Platforms (INST) rated Nominal to Extra High]
Cost Drivers: Team Factor Distribution (TEAM, PCAP, PEXP, PROC)
[Histograms of project counts by rating: Stakeholder Team Cohesion (TEAM), Personnel/Team Capability (PCAP), and Personnel Experience (PEXP) rated Very Low to Very High; Process Capability (PROC) rated Very Low to Extra High]
Cost Drivers: Team Factor Distribution (SITE, TOOL)
[Histograms of project counts by rating: Multisite Coordination (SITE) rated Very Low to Extra High; Tool Support (TOOL) rated Very Low to Very High]
Top 10 Intra-Driver Correlations
• Size drivers correlated to cost drivers
  – 0.39 Interfaces & # of Recursive Levels in the Design
  – -0.40 Interfaces & Multisite Coordination
  – 0.48 Operational Scenarios & # of Recursive Levels in Design
• Cost drivers correlated to cost drivers
  – 0.47 Requirements Und. & Architecture Und.
  – -0.42 Requirements Und. & Documentation
  – 0.39 Requirements Und. & Stakeholder Team Cohesion
  – 0.43 Requirements Und. & Multisite Coordination
  – 0.39 Level of Service Reqs. & Documentation
  – 0.50 Level of Service Reqs. & Personnel Capability
  – 0.49 Documentation & # of Recursive Levels in Design
Candidate Parameters for Elimination
• Size Drivers
  – # of Algorithms*^
• Cost Drivers
  – Requirements Understanding*^
  – Level of Service Requirements^
  – # of Recursive Levels in the Design*
  – Documentation^
  – # of Installations & Platforms^
  – Personnel Capability^
  – Tool Support^
*Due to high correlation
^Due to regression insignificance
Motivation for eliminating parameters is based on the high ratio of parameters (18) to data points (35) and the need for degrees of freedom. By comparison, COCOMO II has 23 parameters and over 200 data points.
The Case for # of Recursive Levels in the Design
• Reasons to keep RECU in model
  – Captures emergent properties of systems
  – Originally thought of as independent from other size and cost drivers
• Reasons to drop RECU from model
  – Highly correlated with:
    • Size (0.44)
    • Operational Scenarios (0.48)
    • Interfaces (0.39)
    • Documentation (0.49)
Size driver counting rules
Are there any suggested improvements?
• Requirements
  – Need to add guidance with respect to:
    • “system” vs. “system engineered” vs. “subsystem” requirements
    • “decomposed” vs. “derived” requirements
  – Current guidance includes:
    • Requirements document, System Specification, RVTM, Product Specification, Internal functional requirements document, Tool output such as DOORS, QFD
Counting Rules: Requirements
Number of System Requirements
This driver represents the number of requirements for the system-of-interest at a specific level of design. The quantity of requirements includes those related to the effort involved in system engineering the system interfaces, system-specific algorithms, and operational scenarios. Requirements may be functional, performance, feature, or service-oriented in nature depending on the methodology used for specification. They may also be defined by the customer or contractor. Each requirement may have effort associated with it, such as V&V, functional decomposition, and functional allocation. System requirements can typically be quantified by counting the number of applicable shalls/wills/shoulds/mays in the system or marketing specification. Note: some work is involved in decomposing requirements so that they may be counted at the appropriate system-of-interest level.
How can we prevent the requirements count from being reported too high?
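One way to make the "count the shalls/wills/shoulds/mays" rule mechanical and auditable is a simple keyword scan over the specification text. The sample spec below is invented; real counts would come from the system or marketing specification, and a raw keyword count still needs the decomposition judgment described above.

```python
import re

# Word-boundary match so e.g. "willing" is not counted as "will"
IMPERATIVES = re.compile(r"\b(shall|will|should|may)\b", re.IGNORECASE)

def count_requirements(spec_text):
    """Return a dict of imperative-keyword counts found in the spec text."""
    counts = {}
    for word in IMPERATIVES.findall(spec_text):
        key = word.lower()
        counts[key] = counts.get(key, 0) + 1
    return counts

spec = (
    "The system shall log all operator actions. "
    "The display will refresh at 30 Hz. "
    "The archive should be searchable; operators may export reports."
)
print(count_requirements(spec))  # → {'shall': 1, 'will': 1, 'should': 1, 'may': 1}
```

A count like this also flags "sky level" specs: a spec with very few shalls relative to its operational scenarios is a candidate for decomposition before counting.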
Counting Rules: Interfaces
Number of System Interfaces
This driver represents the number of shared physical and logical boundaries between system components or functions (internal interfaces) and those external to the system (external interfaces). These interfaces can typically be quantified by counting the number of external and internal system interfaces among ISO/IEC 15288-defined system elements.
• Examples would be very useful
• Current guidance includes:
  – Interface Control Document, System Architecture diagram, System block diagram from the system specification, Specification tree
How can we prevent the interface count from being reported too low?
Counting Rules: Algorithms
Number of System-Specific Algorithms
This driver represents the number of newly defined or significantly altered functions that require unique mathematical algorithms to be derived in order to achieve the system performance requirements. As an example, this could include a complex aircraft tracking algorithm like a Kalman filter, derived using existing experience as the basis for the all-aspect search function. Another example could be a brand-new discrimination algorithm derived to identify the friend-or-foe function in space-based applications. The number can be quantified by counting the number of unique algorithms needed to realize the requirements specified in the system specification or mode description document.
• Current guidance includes:
  – System Specification, Mode Description Document, Configuration Baseline, Historical database, Functional block diagram, Risk analysis
Are we missing anything?
Counting Rules: Operational Scenarios
Number of Operational Scenarios
This driver represents the number of operational scenarios that a system must satisfy. Such scenarios include both the nominal stimulus-response thread and all of the off-nominal threads resulting from bad or missing data, unavailable processes, network connections, or other exception-handling cases. The number of scenarios can typically be quantified by counting the number of system test thread packages or unique end-to-end tests used to validate the system functionality and performance, or by counting the number of use cases, including off-nominal extensions, developed as part of the operational architecture.
• Current guidance includes:
  – Ops Con / Con Ops, System Architecture Document, IV&V/Test Plans, Engagement/mission/campaign models
How can we encourage Operational Scenario reporting?
Effort Profiling mini-Delphi
• Step 4 of the 7-step methodology
• Two main goals:
  1. Develop a typical distribution profile for systems engineering across 4 of the 6 life cycle stages (i.e., how is SE distributed over time?)
  2. Develop a typical distribution profile for systems engineering across 5 effort categories (i.e., how is SE distributed by activity category?)
COCOMO II Effort Distribution
MBASE/RUP phases and activities
Source: Software Cost Estimation with COCOMO II, Boehm et al., 2000
Our Goal for COSYSMO
[Mapping of the ISO/IEC 15288 life cycle phases (Conceptualize; Develop; Transition to Operation; Operate, Maintain, or Enhance; Replace or Dismantle) against the EIA/ANSI 632 processes (Acquisition & Supply, Technical Management, System Design, Product Realization, Technical Evaluation) plus Operational Test & Evaluation]
Mini Delphi Part 1
• 5x6 matrix of EIA 632 processes vs. ISO 15288 life cycle phases
• 33 EIA 632 requirements (for reference)
• Goal: Develop a distribution profile for 4 of the 6 life cycle phases
EIA/ANSI 632 & ISO/IEC 15288 Allocation - Clause No. (Requirements)
Columns: EIA/ANSI 632 Pre-System, System Definition, Subsystem Design, Detailed Design, Integration/Test/Evaluation; ISO/IEC 15288 Operations, Maintenance or Support, Retirement (all %)

Product Supply - 4.1.1 (1)                 40  30  20  10   0   0   0   0
Product Acquisition - 4.1.2 (2)            40  30  20  10   0   0   0   0
Supplier Performance - 4.1.2 (3)           30  30  20  20   0   0   0   0
Technical Management - 4.2 (4-13)          15  20  20  20  25   0   0   0
Requirements Definition - 4.3.1 (14-16)    35  30  20  10   5   0   0   0
Solution Definition - 4.3.2 (17-19)        25  35  30   5   5   0   0   0
Implementation - 4.4 (20)                   5  10  25  40  20   0   0   0
Transition to Use - 4.4 (21)                5  10  25  30  30   0   0   0
Systems Analysis - 4.5.1 (22-24)           25  40  25   5   5   0   0   0
Requirements Validation - 4.5.2 (25-29)    10  35  30  15  10   0   0   0
System Verification - 4.5.3 (30-32)        10  25  20  20  25   0   0   0
End Products Validation - 4.5.4.1 (33)     20  25  20  15  20   0   0   0
Previous Results Are Informative
[Chart of effort distribution across the five EIA 632 fundamental processes: Acquisition & Supply, Technical Management, System Design, Product Realization, Technical Evaluation]
Breadth and Depth of Key SE Standards
[Chart: system life cycle phases (Conceptualize; Develop; Transition to Operation; Operate, Maintain, or Enhance; Replace or Dismantle) plotted against level of detail (process description, high-level practices, detailed practices) for ISO/IEC 15288, EIA/ANSI 632, and IEEE 1220]
Purpose of the Standards:
• ISO/IEC 15288 - Establish a common framework for describing the life cycle of systems
• EIA/ANSI 632 - Provide an integrated set of fundamental processes to aid a developer in the engineering or re-engineering of a system
• IEEE 1220 - Provide a standard for managing systems engineering
Source: Draft Report ISO Study Group, May 2, 2000

5 Fundamental Processes for Engineering a System
Source: EIA/ANSI 632 Processes for Engineering a System (1999)

33 Requirements for Engineering a System
Source: EIA/ANSI 632 Processes for Engineering a System (1999)
Mini Delphi Part 2
• 5 EIA 632 fundamental processes
• 33 EIA 632 requirements (for reference)
• Goal: Develop a typical distribution profile for systems engineering across 5 effort categories
Preliminary results
4-person Delphi done last week at GSAW

EIA 632 Fundamental Process    Average    Standard Deviation
Acquisition & Supply            5%         0
Technical Management           13.75%      2.5
System Design                  26.25%      9.4
Product Realization            22.5%       6.4
Technical Evaluation           32.5%      15
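The Delphi roll-up is just a mean and standard deviation over the four participants' votes. A small sketch follows; the individual responses are hypothetical (the slide reports only the aggregates), except that "Acquisition & Supply" must have been four identical 5% votes, since its deviation is 0.

```python
from statistics import mean, pstdev

# Hypothetical per-participant allocations (%); only the first row is
# implied directly by the slide (std dev 0 means all four agreed).
delphi_responses = {
    "Acquisition & Supply": [5, 5, 5, 5],
    "Technical Management": [12, 15, 12, 16],   # invented values
    "System Design":        [20, 35, 25, 25],   # invented values
}

for process, votes in delphi_responses.items():
    print(f"{process}: avg {mean(votes):.2f}%, sd {pstdev(votes):.2f}")
```

Whether the slide used the population or sample standard deviation is not stated; `pstdev` is one reasonable choice.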
COSYSMO Invasion
In chronological order:

Developer                         Implementation        Availability
Gary Thomas (Raytheon)            myCOSYSMO v1.22       Prototype at: www.valerdi.com/cosysmo
Ricardo Valerdi (USC)             Academic COSYSMO      August 2005
John Gaffney (Lockheed Martin)    Risk add-on           Prototype developed, not yet integrated
Dan Liggett (Costar)              Commercial COSYSMO    TBD
COSYSMO Risk Estimation Add-on
Justification:
• USAF (Teets) and Navy acquisition chief (Young) require “High Confidence Estimates”
• COSYSMO currently provides a single point solution
• Elaboration of the “Sizing confidence level” in myCOSYSMO
Final Items
• Open issues
• Local calibrations
• Lies, damned lies, and statistical outliers
• Future plans for COSYSMO 2.0 (including ties to SoS work)
• Action items for next meeting: July 2005 in Keystone, CO
• Combine Delphi R3 results and perform Bayesian approximation
• Dissertation defense: May 9
Ricardo Valerdi
Websites
http://sunset.usc.edu
http://valerdi.com/cosysmo