1
Maturity, Agility, Prediction: Three Problems and TSP
Dr. Bill Curtis, SVP & Chief Scientist
CAST Software
© CAST 2008
2
Agenda
1. What I learned at college (L1 is fatal)
2. What I learned at TeraQuest (what we missed at L2)
3. What I learned at Agile 2006 (what we missed at L3)
4. What I learned from Shewhart (what we missed at L4)
5. The answer (question: how do you measure high maturity?)
3
What I Learned at College
When cowboys are undisciplined, they die!
A cattle drive is a project with actual milestones in waterhole increments.
4
What I Learned at TeraQuest
[Figure: SEI's IDEAL® model, a loop of five phases. Initiating: stimulus for change, set context, build sponsorship, establish improvement infrastructure. Diagnosing: characterize current & desired states, develop recommendations. Establishing: set priorities, develop approach, plan actions. Acting: create solution, pilot test solution, refine solution, implement solution. Learning: analyze and validate, propose future actions.]
5
Standard Approach to Level 2
[Figure: the IDEAL model diagram repeated, with its Initiating, Diagnosing, Establishing, and Acting phases.]
6
Project-Focused IDEAL
[Figure: the IDEAL model diagram again, recast with a project focus.]
7
Project-Focused Improvement
[Figure: Projects 1 through n, each supported by a project launch workshop, project support & coaching, and project post-mortem analysis.]
The best way to train project managers in Level 2 skills was in project launch workshops with follow-up coaching.
8
TSP Fills One Level 2 Engagement Gap

Role                         | Engagement in CMM/CMMI
Executives & Middle Managers | not engaged
Project Manager, Team Leader | engaged
Support groups (CM, PPQA)    | engaged
Developer                    | not engaged
9
What I Learned at Agile 2006
10
The True Path to Agility (& Level 3)
11
Business Agility Is Limited by App Quality
Ossified, contorted, complex legacy code: harmfully complex, hard to change, easy to hack into, costly to maintain.
Well-engineered, high-quality code: secure and robust, easy to change, cheap to maintain, faster to build.
Speed in responding to the market depends on the ease of modifying business applications.
12
Internal Quality Is Often Overlooked
Quality: the degree to which a product meets its specified requirements.
Problem: customers struggle to state functional requirements. They do not understand non-functional requirements.
"…a failure to satisfy a non-functional requirement can be critical, even catastrophic…non-functional requirements are sometimes difficult to verify. We cannot write a test case to verify a system's reliability…The ability to associate code to non-functional properties can be a powerful weapon in a software engineer's arsenal."
13
Expanding Technical Diversity in Apps
[Figure: a three-tier enterprise architecture. User interface tier: ASP/JSP/VB/.NET, web. Application logic tier: Java, C++; frameworks such as Struts MVC and Spring; web services; middleware; a CICS connector to legacy applications (CICS monitor in Cobol, Tuxedo monitor in C, COBOL batch, shell scripts). Data management tier: EJB, Hibernate, databases, files.]
Preventing application-level defects requires analysis of all the interactions between components of heterogeneous technologies.
14
Godot's Gotta Be in There Somewhere
[Figure: a retail website calling a product catalogue, credit card application, order entry application, express service application, and shipping application, each annotated with unknowns: how many hosts? how many threads? resource pools? connection pools? error handling? timeouts?]
Ever have an application hang waiting for a response that will never come?
15
Even Worse, Dispersed Development
16
Application Quality vs. Code Quality
Application quality measures how well the individual components work together to make up the overall business system.
Code quality is the measure of individual components for compliance with standards and best practices in the context of a specific language.
Good code quality ≠ good application quality.
17
Some Level 3-ish Challenges for TSP/PSP
1. Managing architectural quality
2. Managing dispersed knowledge
3. Integrating development disciplines
4. Creating organizational capability
18
Consortium for IT Software Quality
CISQ: IT organizations, outsourcers, agencies, experts.
IT & AD executives: define industry issues, drive standards adoption, build appraiser program.
Technical experts: create quality standard, developer certification, integrate with standards.
19
Is This Level 5 Process Under Control?
[Figure: control chart of defects per KLOC during code review across 26 units; UCL = 427, X̄ = 142, LCL = 0. Control chart published in a major refereed journal.]
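The limits on a chart like this can be computed from the data themselves. Below is a minimal sketch, not from the talk, of an XmR (individuals and moving-range) control-chart calculation for defects-per-KLOC data; it uses the standard Shewhart individuals-chart factor of 2.66 times the mean moving range, and every number in it is invented for illustration.

```python
def xmr_limits(values):
    """Return (center, lcl, ucl) for an individuals (XmR) chart.

    Limits are the mean plus or minus 2.66 times the average moving
    range (the standard three-sigma estimate from successive
    differences). A lower limit below zero is clamped to zero, since
    defect density cannot be negative.
    """
    center = sum(values) / len(values)
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    ucl = center + 2.66 * mr_bar
    lcl = max(0.0, center - 2.66 * mr_bar)
    return center, lcl, ucl

# Hypothetical defects-per-KLOC readings from ten code reviews
densities = [120, 95, 160, 140, 210, 88, 175, 130, 150, 142]
center, lcl, ucl = xmr_limits(densities)
signals = [x for x in densities if x < lcl or x > ucl]
print(f"X-bar={center:.1f}  LCL={lcl:.1f}  UCL={ucl:.1f}  signals={signals}")
```

Any point falling outside the computed limits would be a signal of an assignable cause worth investigating.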
20
How Are Control Charts Used?
How do they plot process data in manufacturing?
21
Then Why Do We Do It This Way?
Plotting inspection results mixes data across assignable causes:
• Across different developers
• Across different reviewers
• Across components with different attributes
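One way to expose such a mixture is to split the data by a candidate assignable cause before charting it. The sketch below, not from the talk, groups review results by the developer whose code was reviewed; the names and defect densities are invented.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical (developer, defects-per-KLOC) pairs from a series of reviews
reviews = [
    ("ann", 40), ("ann", 35), ("ann", 45),
    ("bob", 120), ("bob", 110), ("bob", 130),
]

# Stratify by developer: charting each stream separately reveals two
# very different processes that one pooled chart would blur together.
by_developer = defaultdict(list)
for developer, density in reviews:
    by_developer[developer].append(density)

for developer, densities in sorted(by_developer.items()):
    print(f"{developer}: n={len(densities)} mean={mean(densities):.1f}")
```

A pooled chart of these six points would show wide, meaningless limits; per-developer charts show two tight, distinct processes.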
22
What I Learned from Shewhart
"a phenomenon that can be predicted, at least within limits associated with a given probability, is said to be controlled"
Constant system of chance causes: "a system in which no source of variation predominates"
23
What Predominates Software Variation
"After product size, people factors have the strongest influence in determining the amount of effort required to develop a software product." (Boehm et al., 2000, p. 46)
"Personnel attributes and human resource activities provide by far the largest source of opportunity for improving software development productivity." (Boehm, 1981, p. 666)
24
Is This Review Process Under Statistical Control?
[Figure: control chart of normalized defects per KLOC during code review across 26 reviews; UCL = 22, X̄ = 10, LCL = 0. Annotation: YES!??]
25
Is This Review Process Under Statistical Control?
[Figure: control chart of normalized defects per KLOC during code review across 26 reviews; UCL = 5.5, X̄ = 2.3, LCL = 0. Annotation: NO!??]
26
This Review Process Is Under Statistical Control
[Figure: control chart of review rate per KLOC across 26 reviews; UCL = 140, X̄ = 108, LCL = 80.]
This review process is under control and yielding accurate predictors of product quality. Even if the defect data do not appear to be under control, defects characterize the design process, not the review process.
27
This Review Process Is Not Under Statistical Control
[Figure: control chart of review rate per KLOC across 26 reviews, with points outside the limits; UCL = 140, X̄ = 108, LCL = 80.]
This review process is not under control and probably does not yield accurate predictors.
28
Shuttle Code Inspection Process
What would you measure about this process?
[Figure: inspection process flow: schedule review, review package, pre-meeting individual review, inspection meeting, correct problem, inspection action closure, closed review package, with reinspection required where needed. Measures annotated: inspection preparation time, meeting time & number of participants, major errors, time to close action items.]
Barnard & Carleton (1999)
29
Team Predictors of Quality Outcomes
[Figure: predictors collected per role. Analyst: preparation time, requirements inspection, # of defects. Designer: preparation time, design inspection, # of defects. Developer: code & unit test time, complexity analysis, complexity, # of defects. Tester: test case creation time, statistically based testing, mean time to failure. The project manager combines these with quantitative tools for quality prediction.]
Y_defects = α0 + α1X1 + α2X2 + α3X3 + α4X4 + ε
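A predictor of this form can be estimated by ordinary least squares once per-role measures have been collected. The sketch below, not from the talk, fits a four-predictor linear model to synthetic data; the coefficients, predictor values, and noise level are all invented, and numpy is assumed to be available.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 40

# Four hypothetical team-level predictors (e.g. preparation times,
# complexity, test-creation time), drawn at random for illustration
X = rng.uniform(0, 10, size=(n, 4))

# Invented "true" alphas: intercept followed by four slopes
true_alpha = np.array([5.0, 1.5, -0.8, 2.0, 0.3])
y = true_alpha[0] + X @ true_alpha[1:] + rng.normal(0, 0.5, n)

# Prepend an intercept column and solve for the alphas by least squares
A = np.column_stack([np.ones(n), X])
alpha_hat, *_ = np.linalg.lstsq(A, y, rcond=None)
print("estimated coefficients:", np.round(alpha_hat, 2))
```

With enough observations, the estimated alphas recover the generating coefficients closely; the same machinery applies whether the response is defect counts or another quality outcome.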
30
Another Challenge for TSP/PSP
[Figure: project input feeds Subprocess 1, Subprocess 2, and Subprocess 3; their results feed the project outcome, predicted by Ŷ_outcome = α0 + α1X1 + α2X2 + α3X3 + ε.]
PSP offers the best opportunity to manage the impact of differences among developers, better approximating a common cause system. How do we aggregate these data at the team (TSP) and project level (QPM) to predict quality outcomes?
31
The Measure of High Maturity
We measure low maturity in resistance, therefore Ohms.
We measure high maturity in power to accomplish work, therefore Watts.