Using an analogy to the building codes followed by architects and contractors in the construction of buildings, Rick Spiewak explores the fundamental principles for developing and delivering high quality, mission-critical systems. Just as buildings are constructed using different materials and techniques, we use a variety of languages, methodologies, and tools to develop software. Although there is no formal "building code" for software, software projects should consider, and judiciously apply, the recognized "best" practices of static analysis, automated unit testing, code re-use, and peer reviews. Rick takes you on a deep dive into each of these techniques where you'll learn about their advantages, disadvantages, costs, challenges, and more. Learn to recognize when you should apply these practices, gaining an appreciation and understanding of how you can achieve better quality without increasing costs or lengthening the development-to-delivery cycle time.
BW12 Concurrent Session 11/7/2012 3:45 PM
"Back to the Basics: Principles for Constructing Quality Software"
Presented by:
Rick Spiewak The MITRE Corporation
Brought to you by:
340 Corporate Way, Suite 300, Orange Park, FL 32073 888‐268‐8770 ∙ 904‐278‐0524 ∙ [email protected] ∙ www.sqe.com
Rick Spiewak The MITRE Corporation
A lead software systems engineer at The MITRE Corporation, Rick Spiewak works at the Electronic Systems Center at Hanscom Air Force Base as part of the Battle Management Systems Group, concentrating on mission planning. In the computer software industry for more than forty years, Rick has developed software and managed software development for data acquisition systems, transaction processing, data communications, and networking. With a focus on the software quality improvement process, Rick has spoken on this topic at a number of conferences and been published in CrossTalk and MSDN magazines.
2012 Better Software Conference East, 7 November 2012
Back to the Basics: Principles for Constructing Quality Software
7 November, 2012
Rick Spiewak, The MITRE Corporation, [email protected]
Approved for Public Release; Distribution Unlimited. Case Number: 12-3430
© 2012-The MITRE Corporation. All rights reserved.
What is Software Quality Management? An Application of the Basic Principles of Quality Management
"Quality is free. It's not a gift, but it is free. What costs money are the unquality things – all the actions that involve not doing jobs right the first time." 1
1 Crosby, Philip B. Quality Is Free: The Art of Making Quality Certain. McGraw-Hill Companies, 1979.
What is Software Quality Management? An Application of the Basic Principles of Quality Management
"You can't inspect quality into a product." 2
2 Harold F. Dodge, as quoted in Out of the Crisis, W. Edwards Deming. MIT, 1982.
What is Software Quality Management? An Application of the Basic Principles of Quality Management
"Trying to improve software quality by increasing the amount of testing is like trying to lose weight by weighing yourself more often." 3
3 McConnell, Steve. Code Complete 2. Microsoft Press, 2004.
Back to the Basics
■ Define quality:
– "Meeting the requirements."
– Not: "Exceeding the customer's expectations."
■ Quality improvement requires changes in processes
– Fixing problems earlier in the process is more effective and less costly than fixing them later.
– The causes of defects must be identified and fixed in the processes.
– Fixing defects without identifying and fixing the causes does not improve product quality.
Setting higher standards will help drive better development practices.
Two Ways to Get Started
■ Classical Quality Management: start fresh in identifying and fixing process defects which may be unique to your organization
■ Richard Hamming: "How do I obey Newton's rule? He said, 'If I have seen further than others, it is because I've stood on the shoulders of giants.' These days we stand on each other's feet."
If we want to profit from the work of pioneers in the field of software quality, we owe it to ourselves and them to stand on their shoulders.
Phases of Software Development
■ Requirements Definition
■ Architecture
■ Design
■ Construction
■ Testing
■ Documentation
■ Training
■ Deployment
■ Sustainment
What's Wrong With Software Construction?
■ Historically a "write-only" exercise: if it doesn't break, no one else reads it
■ Ad-hoc or absent standards
■ Testing as a separate exercise
■ Re-work (patch) to fix defects ("bugs")
■ Features take precedence over quality
■ Definition of quality is not rigorous
Standards and best practices are not uniformly followed because they are not normally stated as requirements.
What's Missing in Software Construction?
If we built buildings this way…
They might not stand up.
Or, we might not.
Buildings are not built this way. Building construction has standards!
Typical Building Code Requirements:
■ Building Heights and Areas
■ Types of Construction
■ Soils and Foundations
■ Fire-Resistance and Fire Protection Systems
■ Means of Egress
■ Accessibility
■ Exterior Walls
■ Roof Assemblies
■ Rooftop Structures
■ Structural Design
■ Materials (Concrete, Steel, Wood, etc.)
■ Electrical, Mechanical, Plumbing…
Missing: the "Building Code" for Software
■ There is a lack of external standards
■ Standards are created ad hoc by each organization
■ No penalty for inadequate standards
■ Best practices are often discarded under cost and schedule pressure
Shoulders of Giants
■ Where do building codes come from?
– Not from a blank sheet of paper!
■ Starting Point: The International Code Council®
– Vetted by local authorities
– May rely on standards developed by various independent standards organizations
■ Example – The Building Code of New York State:
"The Building Code of New York State combines language from the 2006 International Building Code®, and New York modifications developed by the State Fire Prevention and Building Code Council and its Building Code Technical Subcommittee."
A Concrete Example
■ Do we send engineers into a warehouse with a tub of concrete?
■ Or – standing on the shoulders of giants…
– The American Concrete Institute®
■ New York State Building Code:
"1903.1 General. Materials used to produce concrete, concrete itself and testing thereof shall comply with the applicable standards listed in ACI 318."
■Don’t invent our own standards■Identify industry best practices
How Do We Apply This to Software?
■Identify industry best practices■Enforce best practices
–Requirements–Rules
This is how to make sure our
14
software doesn’t fall down!
Improving Development Practices
■ Uniform Coding Standards
– References
– Tools
– Practices
■ Automated Unit Testing
– Design for test
– Tools for testing
– An Enterprise approach
■ Root Cause Analysis and Classification
– Analytic methods
– Taxonomy
■ Code Reuse
– Development techniques
– Reliable sources
Improving Development Practices: Uniform Coding Standards
■ References
– .NET
■ Framework Design Guidelines: Conventions, Idioms, and Patterns for Reusable .NET Libraries
■ Practical Guidelines and Best Practices for Microsoft Visual Basic and C# Developers
– Java
■ Effective Java Programming Language Guide
■ The Elements of Java Style
■ Tools and Techniques
– Static Code Analysis
■ .NET: FxCop, Visual Studio
■ Java: FindBugs, ParaSoft JTest
– Code Review (with audit)
Improving Development Practices: Tools – Static Analysis with FxCop for .NET
■ Microsoft-developed free tool
■ Equivalent version in Visual Studio
■ Analyzes managed (.NET) code
■ Language independent
■ Applied to compiled code
■ Applies Microsoft best practices
■ Rules also documented in "Framework Design Guidelines"
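FxCop and FindBugs apply large rule sets to compiled or parsed code; the underlying idea can be sketched with a toy rule. The example below, written in Python purely for illustration (it is not how FxCop works internally), uses the standard-library ast module to flag public functions that lack a docstring:

```python
import ast

def find_undocumented_functions(source):
    """Toy static-analysis rule: flag public functions with no docstring.
    Returns (line number, function name) pairs for each violation."""
    tree = ast.parse(source)
    violations = []
    for node in ast.walk(tree):
        # Skip "private" helpers (leading underscore), mirroring how real
        # rule sets often scope checks to a code element's visibility.
        if isinstance(node, ast.FunctionDef) and not node.name.startswith("_"):
            if ast.get_docstring(node) is None:
                violations.append((node.lineno, node.name))
    return violations

# Hypothetical source fragment to analyze.
sample = '''
def documented():
    "Has a docstring."
    return 1

def undocumented():
    return 2
'''
print(find_undocumented_functions(sample))  # [(6, 'undocumented')]
```

Like the real tools, the rule inspects the code's structure rather than running it, so it can be applied uniformly to an entire codebase and enforced as a build step.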
Improving Development Practices: Coding Standards – Code Review
■ Preparation – inspection of code by developer
– may uncover defects, inspire changes
■ Review by other programmers – sharing of ideas, improved techniques
– uncovers defects or poor techniques
■ Determining, fixing causes of defects
■ Customer audit
– provides assurance of execution
Improving Development Practices: Automated Unit Testing
■ Design Impact
– Design for Test
– Test-Driven Development
■ Tools and Techniques
– .NET: NUnit/NCover/NCover Explorer; Visual Studio
– Java: JUnit/Cobertura (etc.)
■ Enterprise Impact
– Uniform Developer Usage
– Use by Test Organizations
■ A CMMI Level 5 practice area – but this should be a requirement regardless of CMMI level.
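The tools differ by platform, but the shape of an automated unit test is the same everywhere: small, independent test cases with explicit assertions that anyone on the team (or the test organization) can run. A minimal sketch using Python's built-in unittest; the function under test, parse_version, is a hypothetical example, not something from the talk:

```python
import unittest

def parse_version(text):
    """Parse a dotted version string like '2.10.3' into a tuple of ints.
    Hypothetical helper, used here only to give the tests a subject."""
    parts = text.strip().split(".")
    if not parts or not all(p.isdigit() for p in parts):
        raise ValueError("not a version string: %r" % text)
    return tuple(int(p) for p in parts)

class ParseVersionTest(unittest.TestCase):
    # Each test pins down one behavior, so a regression names itself.
    def test_parses_dotted_numbers(self):
        self.assertEqual(parse_version("2.10.3"), (2, 10, 3))

    def test_orders_numerically_not_lexically(self):
        self.assertLess(parse_version("2.9.0"), parse_version("2.10.0"))

    def test_rejects_garbage(self):
        with self.assertRaises(ValueError):
            parse_version("2..3")

if __name__ == "__main__":
    # exit=False lets the run be embedded in a larger script or build step.
    unittest.main(argv=["parse_version_test"], exit=False)
```

Because the suite runs unattended, it supports the "Enterprise Impact" bullets above: every developer runs the same tests the same way, and the test organization can run them as a gate.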
Improving Development Practices: Root Cause Analysis
■ Find the cause
– "Five Whys"
– Kepner-Tregoe Problem Analysis
– IBM: Defect Causal Analysis
■ Fix the cause => change the process
■ Fix the problem: use the changed process
■ How to preserve knowledge?
– Classify root causes
– Look for patterns
– Metrics: statistics, Pareto diagrams
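The "classify, then look for patterns" step can be sketched directly: key each classified defect by the leading digit of its taxonomy code, then count per category to get the raw data for a Pareto diagram. The category names are the Beizer top-level categories from this talk; the defect log itself is invented for illustration:

```python
from collections import Counter

# Top-level Beizer taxonomy categories, keyed by leading digit.
BEIZER_TOP_LEVEL = {
    0: "Planning", 1: "Requirements and Features",
    2: "Functionality as Implemented", 3: "Structural Bugs",
    4: "Data", 5: "Implementation", 6: "Integration",
    7: "Real-Time and Operating System",
    8: "Test Definition or Execution Bugs", 9: "Other",
}

def top_level(code):
    """Map a four-digit Beizer code such as 1623 to its top-level category."""
    return BEIZER_TOP_LEVEL[code // 1000]

def pareto(codes):
    """Count classified defects per category, most frequent first."""
    return Counter(top_level(c) for c in codes).most_common()

# Hypothetical defect log: each entry is one classified root-cause code.
defect_codes = [1623, 1211, 4321, 1110, 2210, 4110, 1420]
for category, count in pareto(defect_codes):
    print(category, count)
```

In this invented log, requirements-related causes dominate, which is exactly the kind of pattern that tells you which process to fix first.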
Improving Development Practices: Root Cause Classification
■ Beizer Taxonomy
– Classification of root causes of software defects
– Developed by Boris Beizer
– Published in "Software Testing Techniques", 2nd Edition
– Modified by Otto Vinter
– Based on the Dewey Decimal System
– Extensible classification
– "A Beizer categorisation can be performed at a rate of 5 minutes per bug" *
■ Orthogonal Defect Classification
■ Defect Causal Analysis
* PRIDE report, section 5.1
Classifying Root Causes: Beizer* Taxonomy
Top-level categories:
• 0xxx Planning
• 1xxx Requirements and Features
• 2xxx Functionality as Implemented
• 3xxx Structural Bugs
• 4xxx Data
• 5xxx Implementation
• 6xxx Integration
• 7xxx Real-Time and Operating System
• 8xxx Test Definition or Execution Bugs
• 9xxx Other
* Boris Beizer, "Software Testing Techniques", Second Edition, 1990, ISBN 0-442-20672-0
Improving Development Practices: Software Reuse for .NET
■ Extensibility features in .NET
■ Microsoft Patterns and Practices
– Enterprise Library
■ Data Access Application Block
■ Logging Application Block
■ Tracing (Core)
■ .NET Library Features
– Windows Presentation Foundation
– Windows Communication Foundation
– Windows Workflow
– Entity Framework
– LINQ
Improving Requirements
■ Requirements: a key source of defects
■ 1999 – PRIDE study in Denmark
■ Key techniques studied:
– User scenario descriptions
– Navigational prototype usability testing
■ Key results achieved (comparison between product versions):
– 27% reduction in error reports
– 72% reduction in usability issues per new screen
– Almost 3 times difference in productivity
How Much Does It Cost?
Cost/Benefit Analysis: Automated Unit Testing (AUT)
■ Cost and benefits of automated unit testing
■ The situation:
– Organizations either use AUT or don't
– No one will stop to compare (if they do, they won't tell anyone what they found out!)
■ The basic cost problem:
– To test n lines of code, it takes n to n + 25% lines
– Why wouldn't it cost more to do this?
– If there isn't any more to it, why use this technique?
■ The solution:
– Use a more complete model
– There's more to cost than lines of code!
The SEER-SEM1 Modeling Tool
■ Based on analysis of thousands of projects
■ Takes into account a wide variety of factors:
– Sizing
– Technology
– Staffing
– Tool Use
– Testing
– QA
■ Delivers outputs:
– Effort
– Duration
– Cost
– Expected Defects
1 SEER® is a trademark of Galorath Incorporated
Cost/Benefit Analysis: Technique
■ Consider the cost of defects:
– Legacy defects to fix
– New defects to fix
– Defects not yet fixed (legacy and new)
■ Model costs using SEER-SEM scenarios
– Cost model reflecting added/modified code
– Comparison among scenarios with varying development techniques
– Schedule, effort for each scenario
– Probable undetected remaining defects after FQT (Formal Qualification Test) per scenario
Cost-Benefit Analysis: Example
■ The Project:
– Three major applications
– Two vendor-supplied applications
– Moderate criticality
■ The cases:
– Baseline: no AUT
■ Nominal team experience (environment, tools, practices)
– Introducing AUT
■ Increases automated tool use parameter
■ Decreases development environment experience
■ Increases volatility
– Introducing AUT and Added Experience
■ Increases automated tool use parameter
■ Experience and volatility changes are eliminated
Cost-Benefit Analysis: Results
■ Estimated schedule months
■ Estimated effort
– Effort months
– Effort hours
– Effort costs
■ Estimate of defect potential
– Size
– Complexity
■ Estimate of delivered defects
– Project size
– Programming language
– Requirements definition formality
– Specification level
– Test level
– …
Defect Prediction Detail*

                            Baseline   Introducing AUT   Difference   AUT + Experience   Difference
Potential Defects           738        756               2%           668                -9%
Defects Removed             654        675               3%           600                -8%
Delivered Defects           84         81                -4%          68                 -19%
Defect Removal Efficiency   88.60%     89.30%                         89.80%
Hours/Defect Removed        36.52      37.41             2%           35.3               -3%

* SEER-SEM analysis by Karen McRitchie, VP of Development, Galorath Incorporated
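The "Defect Removal Efficiency" row is simply defects removed divided by potential defects, with delivered defects being the remainder. A quick check in Python reproduces the baseline column:

```python
def removal_efficiency(potential, removed):
    """Defect removal efficiency: fraction of potential defects
    removed before delivery."""
    return removed / potential

# Figures from the baseline column of the table above.
potential, removed = 738, 654
delivered = potential - removed
dre = removal_efficiency(potential, removed)
print(f"Delivered defects: {delivered}")    # 84
print(f"Removal efficiency: {dre:.1%}")     # 88.6%
```

The same arithmetic applied to the other columns shows why a small gain in efficiency matters: moving from 88.6% to 89.8% cuts delivered defects from 84 to 68, a 19% reduction.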
Cost Model*

                    Baseline     Introducing AUT   Difference   AUT + Experience   Difference
Schedule Months     17.09        17.41             2%           16.43              -4%
Effort Months       157          166               6%           139                -11%
Hours               23,881       25,250            6%           21,181             -11%
Base Year Cost      $2,733,755   $2,890,449        6%           $2,424,699         -11%
Defect Prediction   84           81                -4%          68                 -19%

* SEER-SEM analysis by Karen McRitchie, VP of Development, Galorath Incorporated
Defect Removal – Capers Jones
■ What are the best, proven techniques?
■ Optimal Sequence of Software Defect Removal
From: "Software Engineering Best Practices: Lessons from Successful Projects in the Top Companies", McGraw-Hill, 2010. Chapter 9, pp. 618-619
■ Combining these recommended methods "will achieve cumulative defect removal efficiency levels in excess of 95 percent for every software project and can achieve 99 percent for some projects."
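The cumulative figure follows from compounding: a defect reaches the customer only if it slips past every removal stage in the sequence. A sketch of that arithmetic, using illustrative per-stage efficiencies that are assumptions made up for this example, not Jones's published figures:

```python
def cumulative_efficiency(stage_efficiencies):
    """Fraction of defects removed after running every stage in sequence:
    a defect is delivered only if it survives all stages."""
    surviving = 1.0
    for e in stage_efficiencies:
        surviving *= (1.0 - e)
    return 1.0 - surviving

# Assumed per-stage efficiencies for illustration only:
# inspections ~60%, static analysis ~55%, four test stages ~30% each.
stages = [0.60, 0.55, 0.30, 0.30, 0.30, 0.30]
print(f"Cumulative removal efficiency: {cumulative_efficiency(stages):.1%}")  # 95.7%
```

The point of the model is that no single stage needs to be heroic; even modest stages compound, which is why Jones recommends a sequence of methods rather than relying on testing alone.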
Pretest Defect Removal
1. Requirements inspection
2. Architecture inspection
3. Design inspection
4. Code inspection
5. Test case inspection
6. Automated static analysis
Testing Defect Removal
7. Subroutine test
8. Unit test
9. New function test
10. Security test
11. Performance test
12. Usability test
13. System test
14. Acceptance or beta test
What Should We Do?
■ If you develop software:
– Follow general best practices
– Add best practices for your technology
– Formulate your own guidelines, but –
■ Stand on the shoulders of giants!
– Enforce your guidelines through reviews
■ If you contract for software development:
– Examine the practices of competitors
– Insist on accountability for best practices
– Trust, but verify!
■ If you acquire software for the government:
– This is a whole separate topic! See references.
How to Incorporate Best Practices
■ Have/require a Software Development Plan
– Processes and practices
– Tools and techniques to be used
– Requirements management
– Types and frequency of reviews
– Types of tests
– Configuration and release management
– Well-defined deliverables
■ Have/require a Software Test Plan
– Development test
– QA testing
– Acceptance test
Summary
■ The use of known best practices can improve the quality of software
■ Better results can be achieved at the same time as lower costs
■ If you want these benefits, you have to require best practices!
Questions
Selected References
■ Spiewak, Rick and McRitchie, Karen. "Using Software Quality Methods to Reduce Cost and Prevent Defects." CrossTalk, Dec 2008. http://www.crosstalkonline.org/storage/issue-archives/2008/200812/200812-Spiewak.pdf
■ McConnell, Steve. Code Complete 2. Microsoft Press, 2004.
■ Crosby, Philip B. Quality Is Free: The Art of Making Quality Certain. McGraw-Hill Companies, 1979.
■ Beizer, Boris. Software Testing Techniques. 2nd ed. International Thomson Computer Press, 1990.
■ Jones, Capers. Software Engineering Best Practices: Lessons from Successful Projects in the Top Companies. McGraw-Hill Companies, 2010.
■ Vinter, O., S. Lauesen & J. Pries-Heje. A Methodology for Preventing Requirements Issues from Becoming Defects (1999). http://www.ottovinter.dk/Finalrp3.doc
■ Spiewak, R. et al. "Applying the Fundamentals of Quality to Software Acquisition." 2012 IEEE SysCon. http://ieeexplore.ieee.org/xpl/articleDetails.jsp?tp=&arnumber=6189447