Value-Based Software Engineering
CS 577b
Winsor Brown (Courtesy Ray Madachy, Barry Boehm, Keun Lee & Apurva Jain)
March 26, 2010
Outline
• VBSE Refresher and Background
• Example: Value-Based Reviews
• Example: Value-Based Product and Process Modeling
A Short History of Software Processes – necessarily oversimplified
Decade  Orientation   Example(s)                                     Problems
1950's  Engineering   SAGE                                           Hardware engineering orientation
1960's  Programming   Code and fix                                   Rework, scalability
1970's  Requirements  Waterfall                                      Risk, GUI, COTS/reuse
1980's  Many          Evolutionary, Incremental, Spiral,             Overgeneralization/overspecialization
                      Helix, JAD, RAD
1990's  Emergence     Win-Win Spiral, Rational,                      Value and economics
                      Adaptive (model generators)
2000's  Value         Benefits realization, VBSE                     Integration of stovepipes
2010's  Enterprise    Enterprise architectures,                      Human/computer primary
                      systems of systems
Software Engineering Is Not Well-Practiced Today
– Standish Group CHAOS Report 1995

[Pie chart: 16% on-time, on-budget; 31% discontinued; 53% seriously overrun.]

Averages:
• 189% of original budget
• 221% of original schedule
• 61% of original functionality
Less Chaos Today
– Standish Group CHAOS Report 2007

[Pie chart, with the 1995 figures for comparison: on-time, on-budget 35% (was 16%); discontinued (Failures) 19% (was 31%); seriously overrun (Challenged) 46% (was 53%).]

Averages:
• 189% of original budget
• 221% of original schedule
• 61% of original functionality
Why Software Projects Fail
Software Testing Business Case
• Vendor proposition
– Our test data generator will cut your test costs in half
– We'll provide it to you for 30% of your test costs
– After you run all your tests for 50% of your original cost, you are 20% ahead
• Any concerns with the vendor proposition?
– The test data generator is value-neutral
– Every test case and defect is treated as equally important
– Usually, 20% of test cases cover 80% of the business case
20% of Features Provide 80% of Value: Focus Testing on These (Bullock, 2000)
[Chart: % of value for correct customer billing (20–100%) vs. customer type (5–15), contrasting the actual Pareto value distribution with an automated test generation tool in which all tests have equal value.]
Value-Based Testing Provides More Net Value
[Chart: net value (NV) vs. percent of tests run. The Test Data Generator curve peaks at (100, 20); the Value-Based Testing curve peaks at (30, 58).]

% Tests   Test Data Generator      Value-Based Testing
          Cost   Value   NV        Cost   Value   NV
0         30     0       -30       0      0       0
10        35     10      -25       10     50      40
20        40     20      -20       20     75      55
30        45     30      -15       30     88      58
40        50     40      -10       40     94      54
…         …      …       …         …      …       …
100       80     100     +20       100    100     0
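The table's arithmetic can be recomputed directly. Below is a minimal sketch (an addition, not from the slides) that takes the tabulated points, computes NV = value − cost, and reads off the best stopping point for each strategy:

```python
# Sketch: net value (NV = value - cost) for the two testing strategies,
# using only the tabulated points from the slide above.

percent_tests = [0, 10, 20, 30, 40, 100]
test_data_gen = {"cost": [30, 35, 40, 45, 50, 80],
                 "value": [0, 10, 20, 30, 40, 100]}
value_based   = {"cost": [0, 10, 20, 30, 40, 100],
                 "value": [0, 50, 75, 88, 94, 100]}

def net_values(strategy):
    return [v - c for c, v in zip(strategy["cost"], strategy["value"])]

for name, strategy in [("Test Data Generator", test_data_gen),
                       ("Value-Based Testing", value_based)]:
    nv = net_values(strategy)
    best_nv, best_pct = max(zip(nv, percent_tests))  # maximize NV
    print(f"{name}: best NV = {best_nv} at {best_pct}% of tests run")
# -> Test Data Generator: best NV = 20 at 100% of tests run
# -> Value-Based Testing: best NV = 58 at 30% of tests run
```

Stopping the value-based suite at 30% of the tests yields nearly three times the peak net value of running the generator's full suite.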
Motivation for Value-Based SE
• Current SE methods are basically value-neutral
– Every requirement, use case, object, test case, and defect is equally important
– Object-oriented development is a logic exercise
– "Earned value" systems don't track business value
– Separation of concerns: SE's job is to turn requirements into verified code
– Ethical concerns separated from daily practices
• Value-neutral SE methods are increasingly risky
– Software decisions increasingly drive system value
– Corporate adaptability to change is achieved via software decisions
– System value-domain problems are the chief sources of software project failures
Key Definitions
• Value (from Latin "valere" – to be worth)
1. A fair return or equivalent in goods, services, or money
2. The monetary worth of something
3. Relative worth, utility, or importance
• Software validation (also from Latin "valere")
– Validation: Are we building the right product?
– Verification: Are we building the product right?
7 Key Elements of VBSE
1. Benefits Realization Analysis
2. Stakeholders’ Value Proposition Elicitation and Reconciliation
3. Business Case Analysis
4. Continuous Risk and Opportunity Management
5. Concurrent System and Software Engineering
6. Value-Based Monitoring and Control
7. Change as Opportunity
Maslow Human Need Hierarchy
A. Maslow, Motivation and Personality, 1954.
(From the top of the pyramid down:)
• Self-Actualization
• Esteem and Autonomy
• Belongingness and Love
• Safety and Security
• Physiological (Shelter, Food and Drink)
Maslow Need Hierarchy
• Satisfied needs aren't motivators
• Unsatisfied lower-level needs dominate higher-level needs
• Management implications
– Create an environment and subculture which satisfies lower-level needs
• Stability
• Shared values, community
• Match to special needs
– Tailor project objectives and structure to participants' self-actualization priorities
People Self-Actualize in Different Ways
• Becoming a Better Manager
• Becoming a Better Technologist
• Helping Other Developers
• Helping Users
• Making People Happy
• Making People Unhappy
• Doing New Things
• Increasing Professional Stature
Theory W: WinWin Achievement Theorem
Making winners of your success-critical stakeholders requires:
i. Identifying all of the success-critical stakeholders (SCSs).
ii. Understanding how the SCSs want to win.
iii. Having the SCSs negotiate a win-win set of product and process plans.
iv. Controlling progress toward SCS win-win realization, including adaptation to change.
VBSE Theory 4+1 Structure
[Diagram: Theory W (SCS Win-Win) at the center – What values are important? How is success assured? – linked to four supporting theories:
• Utility Theory – How important are the values?
• Decision Theory – How do values determine decision choices?
• Dependency Theory – How do dependencies affect value realization?
• Control Theory – How to adapt to change and control value realization?]
VBSE Component Theories
• Theory W (Stakeholder win-win)
– Enterprise Success Theorem, Win-Win Achievement Theorem
• Dependency Theory (Product, process, people interdependencies)
– Systems architecture/performance theory; costing and scheduling theory; organization theory
• Utility Theory
– Utility functions, bounded rationality, Maslow need hierarchy, multi-attribute utility theory
• Decision Theory
– Statistical decision theory, game theory, negotiation theory, theory of justice
• Control Theory
– Observability, predictability, controllability, stability theory
Dependency Theory - Example
Utility Theory - Example
Decision Theory - Example
Decision Theory – Example (2): Apply Models to Predict Balance

[Chart: risk exposure RE = P(L) × S(L) vs. time and effort invested in plans.
• RE due to inadequate plans falls with planning investment: with little planning, high P(L) (inadequate plans) and high S(L) (major problems: oversights, delays, rework); with thorough plans, low P(L) and low S(L) (minor problems).
• RE due to market share erosion rises with planning investment: with little planning, low P(L) (few plan delays) and low S(L) (early value capture); with heavy planning, high P(L) (plan breakage, delay) and high S(L) (value capture delays).
• The sum of the risk exposures has a Sweet Spot at an intermediate level of planning.]
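A hedged illustration of the balance the chart describes: the two functional forms below are invented for shape only (one risk exposure falling with planning investment, the other rising), so all numbers are placeholders; the sweet spot is simply wherever the summed risk exposure is smallest.

```python
# Illustrative sketch of balancing RE = P(L) * S(L); the exponential and
# linear forms are assumptions chosen for shape, not data from the slide.

def re_inadequate_plans(t):    # falls as time/effort invested in plans grows
    return 10.0 * 0.5 ** t     # little planning -> high P(L) and high S(L)

def re_market_erosion(t):      # rises as planning delays value capture
    return 1.5 * t             # heavy planning -> plan breakage, delays

ts = [i * 0.25 for i in range(41)]        # planning investment, 0..10
total = [(re_inadequate_plans(t) + re_market_erosion(t), t) for t in ts]
re_min, sweet_spot = min(total)           # minimize the sum of exposures
print(f"sweet spot near t = {sweet_spot}, total RE = {re_min:.2f}")
```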
Control Theory - Example
Value Realization Feedback Control:
1. Develop/update the business case; time-phased cost and benefit flows; plans
2. Perform to plans
3. Value being realized? If No, determine corrective actions and return to step 1
4. Assumptions still valid? If No, determine corrective actions and return to step 1
5. If Yes to both, continue performing to plans
Example Project: Sierra Mountainbikes
– Based on what would have worked on a similar project
• Quality leader in specialty area
• Competitively priced
• Major problems with order processing
– Delivery delays and mistakes
– Poor synchronization of order entry, confirmation, fulfillment
– Disorganized responses to problem situations
– Excess costs; low distributor satisfaction
Order Processing Project Goals
Goals: Improve profits, market share, and customer satisfaction via improved order processing
Questions: Current state? Root causes of problems? Keys to improvement?
Metrics: Balanced Scorecard of benefits realized, proxies
– Customer satisfaction ratings; key elements (ITV: in-transit visibility)
– Overhead cost reduction
– Actual vs. expected benefit and cost flows, ROI
Expanded Order Processing System Benefits Chain
[Benefits-chain diagram. Initiatives: a new order-entry system plus new order-entry processes, outreach, and training (developers; sales personnel, distributors) yield a faster, better order-entry system and faster order-entry steps with fewer errors per order; a new order fulfillment system plus new order fulfillment processes, outreach, and training, improved supplier coordination (suppliers), and on-time assembly yield less time and fewer errors in order processing. Together these produce increased customer satisfaction and decreased operations costs (distributors, retailers, customers), leading to increased sales, profitability, and growth. Safety/fairness inputs and interoperability inputs feed the chain.
Assumptions:
– Increasing market size
– Continuing consumer satisfaction with product
– Relatively stable e-commerce infrastructure
– Continued high staff performance]
Project Strategy and Partnerships
• Partner with eServices, Inc. for the order processing and fulfillment system
– Profit sharing using a jointly-developed business case
• Partner with key distributors to provide user feedback
– Evaluate prototypes, beta-test early versions, provide satisfaction ratings
• Incremental development using MBASE/RUP anchor points
– Life Cycle Objectives; Life Cycle Architecture (LCO; LCA)
– Core Capability Drivethrough (CCD)
– Initial and Full Operational Capability (IOC; FOC)
• Architect for later supply chain extensions
Order Processing System Schedules and Budgets

Milestone                      Due Date    Budget ($K)  Cumulative Budget ($K)
Inception Readiness            1/1/2004    0            0
Life Cycle Objectives          1/31/2004   120          120
Life Cycle Architecture        3/31/2004   280          400
Core Capability Drivethrough   7/31/2004   650          1050
Initial Oper. Capability: SW   9/30/2004   350          1400
Initial Oper. Capability: HW   9/30/2004   2100         3500
Developed IOC                  12/31/2004  500          4000
Responsive IOC                 3/31/2005   500          4500
Full Oper. Cap'y CCD           7/31/2005   700          5200
FOC Beta                       9/30/2005   400          5600
FOC Deployed                   12/31/2005  400          6000
Annual Oper. & Maintenance                 3800
Annual O&M, Old System                     7600
Order Processing System: Expected Benefits and Business Case
(Columns grouped as Current System and New System market figures, then Financial and Customers metrics. Sales, Profits, Cost Savings, and Change in Profits in $M.)

                   Current System           New System               Financial                                            Customers
Date      Market   Share %  Sales  Profits  Share %  Sales  Profits  Cost     Change in  Cum. Change  Cum.   ROI    Late       Customer      In-Transit  Ease of
          Size($M)                                                   Savings  Profits    in Profits   Cost          Delivery%  Satisf.(0-5)  Vis.(0-5)   Use(0-5)
12/31/03  360      20       72     7        20       72     7        0        0          0            0      0      12.4       1.7           1.0         1.8
12/31/04  400      20       80     8        20       80     8        0        0          0            4      -1     11.4       3.0           2.5         3.0
12/31/05  440      20       88     9        22       97     10       2.2      3.2        3.2          6      -.47   7.0        4.0           3.5         4.0
12/31/06  480      20       96     10       25       120    13       3.2      6.2        9.4          6.5    .45    4.0        4.3           4.0         4.3
12/31/07  520      20       104    11       28       146    16       4.0      9.0        18.4         7      1.63   3.0        4.5           4.3         4.5
12/31/08  560      20       112    12       30       168    19       4.4      11.4       29.8         7.5    2.97   2.5        4.6           4.6         4.6
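As a consistency check (my addition, not part of the slide), the ROI column follows from the cumulative columns as ROI = (cumulative change in profits − cumulative cost) / cumulative cost:

```python
# Sketch: recompute the ROI column from the table's cumulative columns.
rows = [  # (date, cum. change in profits, cum. cost), both in $M
    ("12/31/04", 0.0, 4.0), ("12/31/05", 3.2, 6.0), ("12/31/06", 9.4, 6.5),
    ("12/31/07", 18.4, 7.0), ("12/31/08", 29.8, 7.5),
]
for date, cum_profit_change, cum_cost in rows:
    roi = (cum_profit_change - cum_cost) / cum_cost
    print(f"{date}: ROI = {roi:+.2f}")  # -1.00, -0.47, +0.45, +1.63, +2.97
```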
A Real Earned Value System
• Current "earned value" systems monitor cost and schedule, not business value
– Budgeted cost of work performed ("earned")
– Budgeted cost of work scheduled ("yearned")
– Actual costs vs. schedule ("burned")
• A real earned value system monitors benefits realized
– Financial benefits realized vs. cost (ROI)
– Benefits realized vs. schedule
• Including non-financial metrics
– Actual costs vs. schedule
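For contrast, a small sketch of the distinction, assuming the standard earned-value quantities (BCWP, BCWS, ACWP) plus a benefits-realized figure such as the tracking table on the next slide supplies; the function names are mine, not the slides':

```python
# Sketch: conventional earned-value indices vs. a business-value measure.

def cost_performance_index(bcwp, acwp):      # "earned" vs. "burned"
    return bcwp / acwp

def schedule_performance_index(bcwp, bcws):  # "earned" vs. "yearned"
    return bcwp / bcws

def realized_roi(benefits_realized, cost_to_date):
    # What a *real* earned value system tracks: benefits vs. cost.
    return (benefits_realized - cost_to_date) / cost_to_date

print(realized_roi(3.2, 6.0))  # -> -0.47, matching the 12/31/05 plan row
```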
Value-Based Expected/Actual Outcome Tracking Capability
(Columns: Schedule; Cost ($K); Op'l Cost Savings ($K); Market Share %; Annual Sales ($M); Annual Profits ($M); Cum. Profits; ROI; Late Delivery %; Customer Satisfaction; ITV; Ease of Use; Risks/Opportunities. Each milestone shows a planned row and an actual row; asterisks are as marked in the original.)

Life Cycle Architecture
  Plan:   3/31/04, 400;  share 20, sales 72, profits 7.0; late delivery 12.4, satisfaction 1.7, ITV 1.0, ease of use 1.8
  Actual: 3/31/04, 427;  share 20, sales 72, profits 7.0; late delivery 12.4, satisfaction 1.7, ITV 1.0, ease of use 1.8
  Risks/Opportunities: increased COTS ITV risk, fallback identified.

Core Capability Demo (CCD)
  Plan:   7/31/04, 1050
  Actual: 7/20/04, 1096;  satisfaction 2.4*, ITV 1.0*, ease of use 2.7*
  Risks/Opportunities: using COTS ITV fallback; new HW competitor; renegotiating HW.

Software Initial Op'l Capability (IOC)
  Plan:   9/30/04, 1400
  Actual: 9/30/04, 1532;  satisfaction 2.7*, ITV 1.4*, ease of use 2.8*

Hardware IOC
  Plan:   9/30/04, 3500
  Actual: 10/11/04, 3432
  Risks/Opportunities: $200K savings from renegotiated HW.

Deployed IOC
  Plan:   12/31/04, 4000;  share 20, sales 80, profits 8.0, cum. profits 0.0, ROI -1.0; late delivery 11.4, satisfaction 3.0, ITV 2.5, ease of use 3.0
  Actual: 12/20/04, 4041;  share 22, sales 88, profits 8.6, cum. profits 0.6, ROI -.85; late delivery 10.8, satisfaction 2.8, ITV 1.6, ease of use 3.2
  Risks/Opportunities: new COTS ITV source identified, being prototyped.

Responsive IOC
  Plan:   3/31/05, 4500, cost savings 300;  late delivery 9.0, satisfaction 3.5, ITV 3.0, ease of use 3.5
  Actual: 3/30/05, 4604, cost savings 324;  late delivery 7.4, satisfaction 3.3, ITV 1.6, ease of use 3.8

Full Op'l Capability CCD
  Plan:   7/31/05, 5200, cost savings 1000;  satisfaction 3.5*, ITV 2.5*, ease of use 3.8*
  Actual: 7/28/05, 5328, cost savings 946
  Risks/Opportunities: new COTS ITV source initially integrated.

Full Op'l Capability Beta
  Plan:   9/30/05, 5600, cost savings 1700;  satisfaction 3.8*, ITV 3.1*, ease of use 4.1*
  Actual: 9/30/05, 5689, cost savings 1851

Full Op'l Capability Deployed
  Plan:   12/31/05, 6000, cost savings 2200;  share 22, sales 106, profits 12.2, cum. profits 3.2, ROI -.47; late delivery 7.0, satisfaction 4.0, ITV 3.5, ease of use 4.0
  Actual: 12/20/05, 5977, cost savings 2483;  share 24, sales 115, profits 13.5, cum. profits 5.1, ROI -.15; late delivery 4.8, satisfaction 4.1, ITV 3.3, ease of use 4.2 (Release 2.1)

Next milestone: 6/30/06, 6250
Conclusions So Far
• Value considerations are software success-critical
• "Success" is a function of key stakeholder values
– Risky to exclude key stakeholders
• Values vary by stakeholder role
• Non-monetary values are important
– Fairness, customer satisfaction, trust
Initial VBSE Theory: 4+1 Process
– With a great deal of concurrency and backtracking

[Process diagram linking the numbered activities to Theory W (SCS Win-Win) and the Utility, Decision, Dependency, and Control theories (SCS: Success-Critical Stakeholder):
1. Protagonist goals
2. Identify SCSs; 2a. Results Chains
3. SCS Value Propositions (win conditions); 3a. Solution exploration; 3b, 7a. Solution analysis; 3b, 5a, 7b. Cost/schedule/performance tradeoffs
4. SCS expectations management
5. SCS Win-Win Negotiation; 5a. Investment analysis, risk analysis; 5a, 7b. Prototyping; option and solution development & analysis
6, 7c. Refine, execute, monitor, and control plans; 6a, 7c. State measurement, prediction, correction; milestone synchronization
7. Risk, opportunity, change management]
Outline
• VBSE Refresher and Background
• Example: Value-Based Reviews
• Example: Value-Based Product and Process Modeling
Problem finding via peer reviews [Lee 2005]
• Hypothesis: Current value-neutral software peer reviews misallocate effort
– Every requirement, use case, object, defect is equally important
– Too much effort is spent on trivial issues
– Current checklist, function, perspective, usage-based reviews are largely value-neutral
Motivation: Value-neutral method vs. Value-based method
[Chart: automated test generation (ATG) vs. Pareto testing. Cumulative business value (%) (20–100) vs. customer billing type (5–15): the Pareto 80-20 distribution of test value (actual business value) rises steeply; the ATG line, where all tests have equal value, rises linearly.]
Motivation: Value-neutral review vs. Value-based review
% of reviewed   Value-neutral review                    Value-based review
                Cost(K)  Value(K)  Net Value(K)  ROI    Cost(K)  Value(K)  Net Value(K)  ROI
0               1000     0         -1000         -1     1000     0         -1000         -1
20              1040     800       -240          -0.23  1040     3200      2160          2.08
40              1080     1600      520           0.48   1080     3840      2760          2.56
60              1120     2400      1280          1.14   1120     3968      2848          2.54
80              1160     3200      2040          1.76   1160     3994      2834          2.44
100             1200     4000      2800          2.33   1200     4000      2800          2.33

• Return on investment (ROI) = (benefits – costs) / costs
• Assumptions
– $1M of the development costs has been invested in the customer billing system by the beginning of reviewing
– Both review techniques will cost $200K
– The business case for the system will produce $4M in business value in return for the $2M investment cost
– The business case provides a similar 80:20 distribution for the value-based review
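A minimal sketch reproducing the table under its stated assumptions; the 80:20 curve for the value-based review is modeled here as capturing 80% of the remaining value with each additional 20% reviewed, which matches the tabulated values to within rounding:

```python
# Sketch of the review ROI table under the slide's stated assumptions.
SUNK, REVIEW_COST, TOTAL_VALUE = 1000, 200, 4000   # all in $K

def cost(f):            # f = fraction of artifacts reviewed
    return SUNK + REVIEW_COST * f

def value_neutral(f):   # value accrues uniformly with review coverage
    return TOTAL_VALUE * f

def value_based(f):     # assumed 80:20 curve: each additional 20% reviewed
    return TOTAL_VALUE * (1 - 0.2 ** (f / 0.2))   # captures 80% of the rest

for f in (0.0, 0.2, 0.4, 0.6, 0.8, 1.0):
    for name, value_fn in (("neutral", value_neutral),
                           ("value-based", value_based)):
        v, c = value_fn(f), cost(f)
        print(f"{f:>4.0%} {name:11s} NV = {v - c:7.0f}K  ROI = {(v - c) / c:5.2f}")
```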
Motivation: Value-neutral review vs. Value-based review
[Chart: ROI (-1.5 to 3) vs. % reviews run (20–100). Consistent with the table above, the value-based review curve rises quickly, peaking near 40% of reviews run, and ends at 2.33; the value-neutral review curve climbs steadily from -1 to 2.33 at 100%.]
Objective
[Flow: adding values (priority & criticality) into the reading process – the VBR process – leads to focusing on higher-valued issues first, which leads to finding more higher-valued issues (increased impact of issues), which increases the return value from reading (increased cost effectiveness).]
Objective
• Objectives: Develop and experiment with value-based peer review processes and checklists
– Initial value-based checklists available as a USC-CSE technical report
– Experimentally applied across 28 remote IV&V reviewers
Review/Reading Techniques
(also generally value-neutral)

Technique                          Definition                      Characteristics                Strengths                          Shortfalls
Checklist-Based Reading (CBR)      Reading with a checklist        Common review technique        Easy to apply; checklist           Weak mapping to artifacts
                                                                   used in the field              helps focus what to do             reviewed
Defect-Based Reading (DBR)         Reading guided by               Proposed for requirements      Clearer focus for reviewers        Often weak mapping to
                                   defect classes                  documents                                                         artifacts reviewed
Perspective-Based Reading (PBR)    Reading guided by different     Different reviewers'           Clearer focus, less overlap        Less redundancy; little backup
                                   reviewer perspectives           points of view                                                    for less-effective reviewers
Functionality-Based Reading (FBR)  Reading guided by               Function-oriented              Good for functional                Mismatches to object-oriented
                                   functionality types                                            specifications                     specifications
Usage-Based Reading (UBR)          Reading guided by use cases     Sometimes prioritized          Very good for usage problems       Weaker coverage of other
                                                                   based on use cases                                                problems
Value-Based Review Concepts
Priority
• The priority of the system capability in the artifact. In MBASE, priority is determined from negotiations, meetings with clients, and priorities indicated in the MBASE Guidelines.
• The values of priority are High, Medium, Low (3, 2, 1). Higher-priority capabilities are reviewed first. The value is used to calculate effectiveness metrics.

Criticality
• Generally, the values of criticality are given by SE experts, but IV&Vers can determine the values when better qualified.
• The values of criticality are High, Medium, Low (3, 2, 1). Higher-criticality issues are reviewed first at a given priority level. The value is used to calculate effectiveness metrics.
Value-Based Review Process (II)
[Process diagram: negotiation meetings with developers, customers, users, and other stakeholders establish the priorities of system capabilities (High/Medium/Low); a domain expert, working from a general value-based checklist, establishes the criticalities of issues (High/Medium/Low); an artifact-oriented checklist guides the reviewing of artifacts. Artifacts are reviewed in decreasing order of (priority, criticality), numbered 1–6*, with the lowest-value combinations marked optional.
* The numbers indicate the usual ordering of review; it may be more cost-effective to review highly-coupled mixed-priority artifacts together.]
Value-Based Checklist (I) – General Value-Based Checklist

Completeness
• High-criticality issues: critical missing elements: backup/recovery, external interfaces, success-critical stakeholders; critical exception handling, missing priorities. Critical missing processes and tools; planning and preparation for major downstream tasks (development, integration, test, transition). Critical missing project assumptions (client responsiveness, COTS adequacy, needed resources).
• Medium-criticality issues: medium-criticality missing elements, processes, and tools: maintenance and diagnostic support; user help. Medium-criticality exceptions and off-nominal conditions; smaller tasks (reviews, client demos), missing desired growth capabilities, workload characterization.
• Low-criticality issues: easily-deferrable, low-impact missing elements: straightforward error messages, help messages, GUI details doable via GUI builder, project task sequence details.

Consistency/Feasibility
• High-criticality issues: critical elements in OCD, SSRD, SSAD, LCP not traceable to each other. Critical inter-artifact inconsistencies: priorities, assumptions, input/output, preconditions/post-conditions. Missing evidence of critical consistency/feasibility assurance in FRD.
• Medium-criticality issues: medium-criticality shortfalls in traceability, inter-artifact inconsistencies, evidence of consistency/feasibility in FRD.
• Low-criticality issues: easily-deferrable, low-impact inconsistencies or inexplicit traceability: GUI details, report details, error messages, help messages, grammatical errors.

Ambiguity
• High-criticality issues: vaguely defined critical dependability capabilities: fault tolerance, graceful degradation, interoperability, safety, security, survivability. Critical misleading ambiguities: stakeholder intent, acceptance criteria, critical user decision support, terminology.
• Medium-criticality issues: vaguely defined medium-criticality capabilities, test criteria. Medium-criticality misleading ambiguities.
• Low-criticality issues: non-misleading, easily-deferrable, low-impact ambiguities: GUI details, report details, error messages, help messages, grammatical errors.

Conformance
• High-criticality issues: lack of conformance with critical operational standards, external interfaces.
• Medium-criticality issues: lack of conformance with medium-criticality operational standards, external interfaces. Misleading lack of conformance with document formatting standards, method and tool conventions.
• Low-criticality issues: non-misleading lack of conformance with document formatting standards, method and tool conventions, optional or low-impact operational standards.

Risk
• High-criticality issues: missing FRD evidence of critical capability feasibility: high-priority features, levels of service, budgets and schedules. Critical risks in the top-10 risk checklist: personnel, budgets and schedules, requirements, COTS, architecture, technology.
• Medium-criticality issues: missing FRD evidence of mitigation strategies for low-probability, high-impact or high-probability, low-impact risks: unlikely disasters, off-line service delays, missing but easily-available information.
• Low-criticality issues: missing FRD evidence of mitigation strategies for low-probability, low-impact risks.
Value-Based Checklist (II) – Example of General Value-Based Checklists
Consistency/Feasibility – High-Criticality Issues
• Critical elements in OCD, SSRD, SSAD, LCP not traceable to each other
• Critical inter-artifact inconsistencies: priorities, assumptions, input/output, preconditions/post-conditions
• Missing evidence of critical consistency/feasibility assurance in FRD

Consistency/Feasibility – Low-Criticality Issues
• Easily-deferrable, low-impact inconsistencies or inexplicit traceability: GUI details, report details, error messages, help messages, grammatical errors
Value-Based Checklist (III) – Artifact-Oriented Value-Based Checklist
<Example: OCD 4.3 System Capability>

Question                                                                          Criticality
Are the system capabilities consistent with the system services provided
as described in OCD 2.3?                                                          3
Are there critical missing capabilities needed to perform the system services?    3
Are capabilities prioritized as High, Medium, or Low?                             3
Are capability priorities consistent with current system shortcoming
priorities (OCD 3.3.5)?                                                           3
Are capabilities traced back to corresponding project goals and
constraints (OCD 4.2)?                                                            3
Are simple lower-priority capabilities (e.g., login) described in less detail?    2
Are there no levels of service goals (OCD 4.4) included as system capabilities?   2
Weight of Review Issues
Effectiveness metric: Impact = Σ over issues of (Artifact Priority) × (Issue Criticality)

                      Issue Criticality
Artifact Priority     H    M    L
       H              9    6    3
       M              6    4    2
       L              3    2    1   (generally considered optional to review)
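A small sketch of the metric (mine, not the slides'), with H/M/L mapped to 3/2/1 as in the weight matrix:

```python
# Sketch: Impact = sum over issues of (artifact priority) * (issue criticality).
LEVEL = {"H": 3, "M": 2, "L": 1}

def impact(issues):
    """issues: (artifact_priority, issue_criticality) letter pairs."""
    return sum(LEVEL[p] * LEVEL[c] for p, c in issues)

# Two high-priority/high-criticality issues and one medium/low issue:
print(impact([("H", "H"), ("H", "H"), ("M", "L")]))  # 9 + 9 + 2 = 20
```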
Experiment Overall (II) [577A IV&Ver’s 2004]
• IV&Vers are involved in real-client projects (e-services) in the course CSCI 577A
• 28 IV&Vers were randomly selected and divided into two groups: A (15) and B (13)
• Group A: value-based review; Group B: review with traditional checklist
• The groups were trained separately in the value-based review technique and in traditional checklist review
• Each reviewed three documents (directly related to development)
– OCD (Operational Concept Description)
– SSRD (System and Software Requirements Description)
– SSAD (System and Software Architecture Description)
• Hypotheses tested: no difference between Groups A and B in
– Average number of concerns and problems
– Average impact of concerns and problems
– Average number of concerns and problems per effort hour
– Average impact of concerns and problems per effort hour
Independent Verification and Validation (IV&V)
[Diagram: developers develop the OCD, SSRD, and SSAD and provide them to the IV&Vers. After checklist training, Group A reviews with VBR and Group B with CBR. The IV&Vers identify concerns, which are filtered for problems; the developers then fix the problems.]
Result (V) – T-Test p-values and other
By Number                      P-value  % Group A higher   By Impact                                P-value  % Group A higher
Average of Concerns            0.202    34                 Average Impact of Concerns               0.049    65
Average of Problems            0.056    51                 Average Impact of Problems               0.012    89
Average of Concerns per hour   0.026    55                 Average Cost Effectiveness of Concerns   0.004    105
Average of Problems per hour   0.023    61                 Average Cost Effectiveness of Problems   0.007    108

• Statistically, Group A performed the review more value-effectively in finding concerns and problems.
• Group B found significantly higher numbers of trivial concerns and problems (typo and grammar faults).
Result (VI-A) – Effort Comparison

[Bar chart of preparation, review, and total effort (hours) for Groups A and B: one group's bars are 19.4, 37.53, and 56.93; the other's are 14.96, 40.5, and 55.46.]
Number of concerns and problems found by IV&Vers

[Bar chart: concerns and problems found by each individual IV&Ver, A1–A15 (Group A, value-based) and B1–B13 (Group B, traditional).]
Result (I-A) – Average Number of Concerns and Problems
By Number             Group A   Group B   P-value  % Group A higher
Average of Concerns   178.15    133       0.202    34
Average of Problems   106.46    70.73     0.056    51
Result (II-A) – Average Impact of Concerns and Problems
By Impact                    Group A   Group B   P-value  % Group A higher
Average Impact of Concerns   992.23    601.91    0.026    65
Average Impact of Problems   604.92    319.55    0.023    89

Impact = Σ over issues of (Artifact Priority) × (Issue Criticality)
Conclusions of the Experiment
At least in this small-team, remote IV&V context:
• Value-based reviews had significantly higher payoff than value-neutral reviews
– With statistical significance for concerns and problems per hour, value impact, and value impact per hour
• VBR required minimal additional effort compared with CBR
• VBR checklists were helpful for understanding and reviewing the artifacts
Outline
• VBSE Refresher and Background
• Example: Value-Based Reviews
• Example: Value-Based Product and Process Modeling
Model Background [Madachy 2005]
• Purpose: support software business decision-making by experimenting with product strategies and development practices to assess real earned value
• Description: a system dynamics model relating the interactions between product specifications and investments, software processes including quality practices, market share, license retention, pricing, and revenue generation for a commercial software enterprise
Model Features
• A Value-Based Software Engineering (VBSE) model covering the following VBSE elements:
– Stakeholders' value proposition elicitation and reconciliation
– Business case analysis
– Value-based monitoring and control
• Integrated modeling of business value, software products, and processes to help make difficult tradeoffs between perspectives
– Value-based production functions used to relate different attributes
• Addresses the planning and control aspect of VBSE to manage the value delivered to stakeholders
– Experiment with different strategies and track financial measures over time
– Allows easy investigation of different strategy combinations
• Can be used dynamically before or during a project
– User inputs and model factors can vary over the project duration, as opposed to a static model
– Suitable for actual project usage or "flight simulation" training where simulations are interrupted to make midstream decisions
Model Sectors and Major Interfaces
• The software process and product sector computes the staffing and quality over time
• The market and sales sector accounts for market dynamics, including the effect of quality reputation
• The finance sector computes financial measures from investments and revenues

[Diagram: the three sectors are connected by the staffing rate, product quality, sales revenue, and product specifications.]
Software Process and Product
[System dynamics diagram with two clusters: product defect flows (defect generation rate, defect density, defects, defect removal rate, actual quality) and an effort and schedule calculation with a dynamic COCOMO variant (Function Points, effort multiplier, Reliability Setting, estimated total effort, cumulative effort, staffing rate, start staff, learning function, manpower buildup parameter).]
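To make the sector concrete, here is a tiny Euler-integrated sketch in the spirit of a system dynamics defect flow; every rate and constant below is an invented placeholder, not a value from the calibrated model:

```python
# Illustrative-only sketch: defects accumulate from a generation rate tied
# to staffing and drain through a removal rate. Constants are made up.

DT = 0.1                       # integration time step (months)
DEFECTS_PER_STAFF_MONTH = 0.2  # assumed defect generation per person-month
REMOVAL_FRACTION = 0.3         # assumed fraction of open defects removed/month

defects, t = 0.0, 0.0
while t < 24.0:                # simulate two years
    staffing_rate = 10.0 if t < 12.0 else 5.0     # crude staffing profile
    generation = DEFECTS_PER_STAFF_MONTH * staffing_rate
    removal = REMOVAL_FRACTION * defects
    defects += (generation - removal) * DT        # Euler step on the level
    t += DT
print(f"open defects at t = 24 months: {defects:.1f}")
```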
Finances, Market and Sales
[System dynamics diagram with three clusters: investment and revenue flows (cash flow, investment rate, cumulative investment, desired cash flow, revenue generation rate, desired revenue generation rate, cumulative revenue, desired cumulative revenue, desired ROI, ROI, financial table); software license sales (average license price, new license selling rate, active licenses, license expiration fraction and rate, potential CX percent of market, sales and market table); and market share dynamics including quality reputation (perceived quality, change in perceived quality, current indicator of quality, delay in adjusting perceptions, market size multiplier, potential market share, potential market share rate change, potential market share increase due to new product, market share delay). Sample 5-year runs plot active licenses, new license selling rate, license expiration rate, and potential market share, plus perceived quality vs. the current indicator of quality.]
Quality Assumptions
• The COCOMO cost driver Required Software Reliability is a proxy for all quality practices
• Resulting quality modulates the actual sales relative to the highest potential
• Perception of quality in the market matters
– Quality reputation is quickly lost and takes much longer to regain (bad news travels fast)
– Modeled as asymmetric information smoothing via a negative feedback loop

[Graph over 5 years: perceived quality lags the current indicator of quality, falling quickly when the indicator drops and recovering much more slowly.]
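The asymmetric smoothing can be sketched as a first-order lag whose delay constant depends on direction: short when quality drops (bad news travels fast), long when it recovers. The specific delay values below are illustrative assumptions, not model calibrations:

```python
# Sketch: asymmetric information smoothing of perceived quality.
FAST_DELAY = 2.0    # months to absorb bad news (assumed)
SLOW_DELAY = 12.0   # months to rebuild reputation (assumed)
DT = 0.25           # integration step (months)

def smooth(perceived, current):
    delay = FAST_DELAY if current < perceived else SLOW_DELAY
    return perceived + (current - perceived) / delay * DT

perceived = 80.0
for month in range(48):
    current = 40.0 if month < 12 else 80.0   # quality dips, then recovers
    for _ in range(int(1 / DT)):
        perceived = smooth(perceived, current)
print(f"perceived quality after 48 months: {perceived:.1f}")
```

With these numbers, perceived quality collapses within a few months of the dip but has still not fully recovered three years after the underlying quality is restored, reproducing the "quickly lost, slowly regained" asymmetry.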
Market Share Production Function and Feature Sets
[Production function: added market share percent (0–25%) vs. cost ($M, 0–8), with three regions: core features, high-payoff features, and features with diminishing returns. The reference case delivers 700 function points; Cases 1 and 2 deliver 550 function points. Cases from Example 1.]
Sales Production Function & Reliability
[Production function: percent of potential sales (30–100%) vs. relative effort to achieve reliability (0.9–1.3), marked with the required reliability settings Low, Nominal, High, and Very High. The reference case and Case 1 sit at relative effort 1.0; Case 2 sits lower, at 0.92. Cases from Example 1.]
Example 1: Dynamically Changing Scope and Reliability
• Shows how the model can assess the effects of combined strategies by varying the scope and required reliability independently or simultaneously
• Simulates midstream descoping, a frequent strategy to meet time constraints by shedding features
• Three cases are demonstrated:
– Unperturbed reference case
– Midstream descoping of the reference case after ½ year
– Simultaneous midstream descoping and lowered required reliability at ½ year
Control Panel and Simulation Results
[Simulation results: two 5-year runs plotting staffing rate, market share, and ROI. One panel shows the unperturbed reference case; the other shows the perturbations at ½ year for Case 1 (descope) and Case 2 (descope + lower reliability).]
Case Summaries
Case                                   Delivered Size     Delivered Reliability  Cost   Delivery Time  Final Market  ROI
                                       (Function Points)  Setting                ($M)   (Years)        Share
Reference Case: Unperturbed            700                1.0                    4.78   2.1            28%           1.3
Case 1: Descope at Time = ½ year       550                1.0                    3.70   1.7            28%           2.2
Case 2: Descope and Lower Reliability  550                0.92                   3.30   1.5            12%           1.0
        at Time = ½ year
Example 2: Determining the Reliability Sweet Spot
• Analysis process (see the sketch below)
– Vary reliability across runs
– Use a risk exposure framework to find the process optimum
– Assess the risk consequences of opposing trends: market delays and bad-quality losses
– Sum market losses and development costs
– Calculate the resulting net revenue
• Simulation parameters
– A new 80 KSLOC product release can potentially increase market share by 15–30% (varied in model runs)
– 75% schedule acceleration
– Initial total market size = $64M annual revenue
• The vendor has 15% of the market
• The overall market doubles in 5 years
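As referenced above, a hedged sketch of the sweet-spot search: for each reliability setting, total the opposing cost components and take the minimum. The dollar figures are placeholders shaped like the Cost Components chart on the next slide, not model outputs:

```python
# Sketch of the sweet-spot search over reliability settings.
# All dollar figures are illustrative placeholders ($M).

settings = ["Low", "Nominal", "High", "Very High"]
dev_cost     = {"Low": 3,  "Nominal": 4, "High": 6, "Very High": 9}   # rises
delay_loss   = {"Low": 1,  "Nominal": 3, "High": 7, "Very High": 14}  # rises
quality_loss = {"Low": 18, "Nominal": 8, "High": 3, "Very High": 1}   # falls

total = {s: dev_cost[s] + delay_loss[s] + quality_loss[s] for s in settings}
sweet_spot = min(total, key=total.get)
print(total)                       # {'Low': 22, 'Nominal': 15, 'High': 16, 'Very High': 24}
print("sweet spot:", sweet_spot)   # 'Nominal' -- for these made-up numbers
```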
Cost Components
[Chart: cost components ($0–35M) vs. software reliability setting (Low, Nominal, High, Very High) over a 3-year time horizon: development cost and market delay loss rise with reliability, bad-quality loss falls with it, and total cost is their sum.]
Sweet Spot Depends on Time Horizon
[Chart: profit ($0–180M) vs. software reliability setting (Low, Nominal, High, Very High) for 2-, 3-, and 5-year time horizons; the profit-maximizing reliability setting shifts higher as the time horizon lengthens.]
Conclusions
• To achieve real earned value, business value attainment must be a key consideration when designing software products and processes
• Software enterprise decision-making can improve with information from simulation models that integrate business and technical perspectives
• Optimal policies operate within a multi-attribute decision space including various stakeholder value functions, opposing market factors, and business constraints
• Risk exposure is a convenient framework for software decision analysis
• Commercial process sweet spots with respect to reliability are a balance between market delay losses and quality losses
• The model demonstrates a stakeholder value chain whereby the value of software to end users ultimately translates into value for the software development organization
Future Work
• Enhance the product defect model with a dynamic version of COQUALMO to enable more constructive insight into quality practices
• Add maintenance and operational support activities to the workflows
• Elaborate market and sales for other considerations, including
– pricing scheme impacts,
– varying market assumptions, and
– periodic upgrades of varying quality
• Account for feedback loops to generate product specifications (closed-loop control)
– External feedback from users to incorporate new features
– Internal feedback on product initiatives from the organizational planning and control entity to the software process
• More empirical data on attribute relationships in the model will help identify areas of improvement
• Assessment of the overall dynamics includes more collection and analysis of field data on business value and quality measures from actual software product rollouts
References - I

C. Baldwin and K. Clark, Design Rules: The Power of Modularity, MIT Press, 1999.
S. Biffl, A. Aurum, B. Boehm, H. Erdogmus, and P. Gruenbacher (eds.), Value-Based Software Engineering, Springer, 2005 (to appear).
D. Blackwell and M. Girshick, Theory of Games and Statistical Decisions, Wiley, 1954.
B. Boehm, C. Abts, A.W. Brown, S. Chulani, B. Clark, E. Horowitz, R. Madachy, D. Reifer, and B. Steece, Software Cost Estimation with COCOMO II, Prentice Hall, 2000.
B. Boehm and L. Huang, "Value-Based Software Engineering: A Case Study," Computer, March 2003, pp. 33-41.
B. Boehm and R. Ross, "Theory-W Software Project Management: Principles and Examples," IEEE Trans. SW Engineering, July 1989, pp. 902-916.
W. Brogan, Modern Control Theory, Prentice Hall, 1974 (3rd ed., 1991).
P. Checkland, Systems Thinking, Systems Practice, Wiley, 1981.
C. W. Churchman, R. Ackoff, and E. Arnoff, An Introduction to Operations Research, Wiley, 1957.
R. M. Cyert and J. G. March, A Behavioral Theory of the Firm, Prentice Hall, 1963.
K. Lee, "Development and Evaluation of Value-Based Review Methods," USC-CSE, 2005.
C. G. Hempel and P. Oppenheim, "Problems of the Concept of General Law," in A. Danto and S. Morgenbesser (eds.), Philosophy of Science, Meridian Books, 1960.
References - II

R. Kaplan and D. Norton, The Balanced Scorecard: Translating Strategy into Action, Harvard Business School Press, 1996.
R. L. Keeney and H. Raiffa, Decisions with Multiple Objectives: Preferences and Value Tradeoffs, Cambridge University Press, 1976.
R. Madachy, "Integrated Modeling of Business Value and Software Processes," ProSim Workshop, 2005.
R. Madachy, Software Process Dynamics, IEEE Computer Society, 2006 (to be published).
A. Maslow, Motivation and Personality, Harper, 1954.
J. Rawls, A Theory of Justice, Belknap/Harvard U. Press, 1971, 1999.
J. Thorp and DMR, The Information Paradox, McGraw-Hill, 1998.
R. J. Torraco, "Theory-building research methods," in R. A. Swanson and E. F. Holton III (eds.), Human Resource Development Handbook: Linking Research and Practice, pp. 114-137, Berrett-Koehler, 1997.
S. Toulmin, Cosmopolis: The Hidden Agenda of Modernity, U. of Chicago Press, 1992 reprint edition.
J. von Neumann and O. Morgenstern, Theory of Games and Economic Behavior, Princeton University Press, 1944.
A. W. Wymore, A Mathematical Theory of Systems Engineering: The Elements, Wiley, New York, 1967.