IRM 202/ Lesson 10/ SW Quality and Test/ DAU/Denman/ October 2009 Learn. Perform. Succeed. 1
A Practical Definition of “Software Quality” (Predictable and Measurable)
• Low Defect Potentials (< 1 per function point)
• High Defect Removal (> 95%)
• Unambiguous, Stable Requirements (< 2.5% change)
• Explicit Requirements Achieved (> 97.5% achieved)
• High User Satisfaction Ratings (> 90% “Excellent”)
   – Installation
   – Ease of Learning
   – Ease of Use
   – Functionality
   – Compatibility
   – Error Handling
   – User Information (screens, manuals, etc.)
   – Customer Support
   – Defect Repairs
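Two of the thresholds above, defect potential and defect removal efficiency, are simple ratios; a minimal sketch with hypothetical project numbers (all figures illustrative):

```python
# Hypothetical project data to illustrate the quality thresholds above.
function_points = 1000            # delivered size in function points
defects_introduced = 800          # total defects injected over the lifecycle
defects_removed_prerelease = 770  # defects found and fixed before delivery

# Defect potential: defects injected per function point (target: < 1).
defect_potential = defects_introduced / function_points

# Defect removal efficiency: share of defects removed before release (target: > 95%).
removal_efficiency = defects_removed_prerelease / defects_introduced

print(f"Defect potential: {defect_potential:.2f} per function point")
print(f"Defect removal efficiency: {removal_efficiency:.1%}")
```

With these numbers the project meets the defect-potential threshold (0.80 per function point) but falls just short of the 95% removal target at 96.3%? No: 770/800 is 96.3%, which meets it; changing either input shows how sensitive the threshold check is.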
“Software developers and acquirers at firms that GAO visited use three fundamental management strategies to ensure the delivery of high-quality products on time and within budget: working in an evolutionary environment, following disciplined development processes, and collecting and analyzing meaningful metrics to measure progress.”
“Stronger Management Practices are Needed to Improve DoD’s Software-Intensive Weapon Acquisitions,” GAO Report, March 2004
Perspectives on Software Quality
QUALITY (diagram): SUITABILITY and MAINTAINABILITY, encompassing attributes such as Reliability, Efficiency, Usability, Understandability, Modifiability, Testability, Portability, Completeness, Accuracy, and Consistency
THE ESSENCE: Does it do what it is supposed to?
“Software problems and defects are among the few direct measurements of software processes and products. Problem and defect measurements also are the basis for quantifying several significant software quality attributes, factors, and criteria -- reliability, correctness, completeness, efficiency, and usability among them.”
Software Quality Measurement, SEI-92-TR-22
The Software Quality Metrics Methodology
1) Establish Requirements
   a) Identify list of possible quality requirements
   b) Determine list of quality requirements
   c) Assign a direct metric to each quality requirement
2) Identify Metrics
   a) Apply the software quality metrics framework
   b) Perform a cost-benefit analysis
   c) Gain commitment to the metrics
3) Implement Metrics
   a) Define the data collection procedures
   b) Prototype the measurement process
   c) Collect the data and compute the metric values
4) Analyze Results
   a) Interpret the results
   b) Identify software quality
   c) Make software quality predictions
   d) Ensure compliance with requirements
5) Validate Metrics
   a) Apply the validation methodology
   b) Apply the validity criteria
   c) Apply the validation procedures
   d) Document results

IEEE Standard for a Software Quality Metrics Methodology, IEEE Std 1061-1998
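Steps 1 and 4 above pair each quality requirement with a direct metric and then check measured values for compliance; a minimal sketch (requirement names, targets, and measured values are illustrative, not taken from IEEE 1061):

```python
# Sketch of IEEE 1061 steps 1 and 4: assign a direct metric with a target to
# each quality requirement, then check measured values for compliance.
requirements = {
    # metric name: (target, comparison operator)
    "reliability_mtbf_hours": (1000, ">="),
    "usability_task_success_rate": (0.90, ">="),
    "defects_per_function_point": (1.0, "<"),
}

measured = {
    "reliability_mtbf_hours": 1250,
    "usability_task_success_rate": 0.93,
    "defects_per_function_point": 1.4,
}

def compliant(value, target, op):
    """Return True when the measured value satisfies its target."""
    return value >= target if op == ">=" else value < target

results = {name: compliant(measured[name], target, op)
           for name, (target, op) in requirements.items()}
for name, ok in results.items():
    print(f"{name}: {'PASS' if ok else 'FAIL'}")
```

Here the defect-density requirement fails (1.4 measured against a target of < 1.0), which is exactly the kind of non-compliance step 4d is meant to surface.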
Goal-Question-Metric (GQM) Measurement Methodology

The Goal-Question-Metric (GQM) measurement methodology is a way to implement Goal-Driven Measurement
STEP 1: Identify goals
STEP 2: Identify the questions that must be answered if the goal is to be achieved
STEP 3: Identify an indicator to display the answers to the questions from STEP 2
STEP 4: Identify measures that can satisfy each question (PSM Method)
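The four steps can be captured in a simple traceability structure from goal down to measures; a minimal sketch in which the goal, questions, indicators, and measures are all illustrative:

```python
# Minimal GQM structure: one goal decomposed into questions, each question
# paired with an indicator and answered by one or more measures.
gqm = {
    "goal": "Improve delivered software quality",
    "questions": [
        {
            "question": "Are defects being removed before release?",
            "indicator": "Defect removal efficiency trend",
            "measures": ["defects found in test", "defects found in operation"],
        },
        {
            "question": "Are requirements stable?",
            "indicator": "Requirements change rate chart",
            "measures": ["baselined requirements", "approved change requests"],
        },
    ],
}

# Traceability check: every question must trace to at least one measure,
# otherwise the goal cannot be evaluated from the data collected.
for q in gqm["questions"]:
    assert q["measures"], f"Question lacks measures: {q['question']}"
print(f"{len(gqm['questions'])} questions, all traced to measures")
```

The point of the structure is that no measure exists on its own: each one is collected only because a question needs it, and each question exists only because a goal needs it.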
SOFTWARE ACQUISITION GOLD PRACTICE™
GOAL-DRIVEN MEASUREMENT (GDM): SOFTWARE ACQUISITION GOLD PRACTICE™
Key Practices of the GQM Approach
• Get the right people (developers at all levels) involved in the GQM process
• Set explicit measurement goals and state them explicitly
• Thoroughly plan the measurement program and document it (explicit and operational definitions)
• Don’t create false measurement goals
• Acquire implicit quality models from the team
• Consider context
• Derive appropriate metrics
• Stay focused on goals when analyzing data
Key Practices of the GQM Approach (Cont’d)
• Let the data be interpreted by the people involved
• Integrate the measurement activities with regular project activities
• Do not use measurement for other purposes
• Secure management commitment to support measurement results
• Establish an infrastructure to support the measurement program
• Ensure that measurement is viewed as a tool, not the end goal
• Get training in GQM before going forward
DACS Gold Practices Website https://www.goldpractices.com/practices/gqm/
Optional: Goal-Question-Metric Exercise (Link)
Example: Typical Costs of Software Fixes

Lifecycle Software       Initial   Errors       Errors   Relative Cost
Development Activity     $ Spent   Introduced   Found    of Errors
---------------------    -------   ----------   ------   -------------
Requirements Analysis    5%        55%          18%      1.0
Design Analysis          25%       30%          10%      1.0 - 1.5
Testing Activities       60%       10%          50%      1.5 - 5.0
Documentation            10%       ---          ---      ---
PDSS                     ---*      5%           22%      10 - 100

*Once a system is fielded, PDSS costs are typically 50-70% of total system lifecycle costs
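The dominance of post-deployment fix costs can be shown with simple arithmetic on the table's figures; a minimal sketch that weights each phase's share of errors found by an assumed midpoint of its relative-cost range (the midpoints are assumptions, not from the source):

```python
# Weight each phase's share of errors found by an assumed per-error fix cost
# (midpoints of the table's relative-cost ranges, chosen for illustration).
phases = {
    # phase: (share of errors found, assumed relative cost per error)
    "requirements": (0.18, 1.0),
    "design":       (0.10, 1.25),
    "testing":      (0.50, 3.25),
    "pdss":         (0.22, 55.0),   # midpoint of the 10-100x range
}

weighted = {p: share * cost for p, (share, cost) in phases.items()}
total = sum(weighted.values())
for p, w in weighted.items():
    print(f"{p:>12}: {w / total:.0%} of total fix cost")
```

Even though PDSS accounts for only 22% of errors found, its 10-100x cost multiplier means it absorbs well over 80% of total fix cost under these assumptions, which is the argument for finding errors early.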
Other Software Quality and Capability Initiatives
• Practical Software and Systems Measurement (PSM)
   – Best practices within the software/system acquisition and engineering communities
   – Goal is to provide Project Managers with the information needed to meet cost, schedule, and technical objectives on programs
• Control Objectives for Information and related Technology (COBIT)
   – Provides good practices across a domain and process framework
   – Practices designed to help optimize IT-enabled investments, ensure service delivery, and provide a measure against which to judge when things go wrong
• Information Technology Infrastructure Library (ITIL)
   – Provides international best practices for IT service management
   – Consists of a series of books giving guidance on the provision of quality IT services, and on the accommodation and environmental facilities needed to support IT
• SPICE (Software Process Improvement and Capability Determination, ISO/IEC 15504)
   – An international standard for software process assessment
   – Derived from the process lifecycle standard ISO/IEC 12207 and from maturity models such as Bootstrap, Trillium, and the CMM
Selecting Measures
Prospective Measures
• Requirements traced
• Requirements tested
• Requirements status
• Problem reports opened
• Problem reports closed
• Reviews completed
• Change requests opened
• Change requests resolved
• Units designed
• Units coded
• Units integrated
• Test cases attempted
• Test cases passed
• Action items opened
• Action items completed
• Components integrated
• Functionality integrated

Schedule and Progress
• Milestone completion
• Work unit progress
• Incremental capability
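Work unit progress, one of the schedule-and-progress measures above, is typically tracked as cumulative completions against a plan at a status date, as in plan-vs-actual progress charts; a minimal sketch with illustrative numbers:

```python
# Cumulative work-unit progress: compare actual completions to plan at the
# latest reporting period (all numbers illustrative).
plan = [50, 120, 220, 350, 480, 600, 660, 700]  # cumulative units planned
actual = [45, 110, 190, 300, 410]               # cumulative units completed to date

period = len(actual) - 1                 # index of the latest reporting period
variance = actual[-1] - plan[period]     # negative means behind plan
print(f"Period {period}: plan {plan[period]}, actual {actual[-1]}, "
      f"variance {variance} units ({variance / plan[period]:.0%})")
```

A widening gap between the plan and actual curves is the early-warning signal these charts exist to give; a point-in-time status alone hides the trend.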
[Chart: Detailed Design Progress. Y-axis: number of units completing design; X-axis: 5 Jul 98 to 29 Nov 98; series: Plan, Actual. Project: TNMS, data as of 4 March 99]

[Chart: Code and Unit Test Progress. Y-axis: number of units completing code and unit test; X-axis: 6 Oct 98 to 8 Jun 99; series: Plan, Actual. Project: TNMS, data as of 4 March 99]

[Chart: Code and Unit Test Progress by SI. Y-axis: number of units completing code and unit test; X-axis: SIs A, B, C, D; series: Total Plan, Plan to Date, Actual to Date. Project: TNMS, data as of 4 March 99]

[Chart: Requirements Growth. Y-axis: number of requirements; X-axis: Jan 98 to Jan 01; series: Plan, Actual. Project: TNMS, data as of 4 March 99]

[Diagram: Goal, Question, Metric]
References
The Data and Analysis Center for Software (DACS): https://www.thedacs.com/
Practical Software and Systems Measurement (PSM): http://www.psmsc.com
Software Engineering Information Repository (SEIR): https://seir.sei.cmu.edu/seir/
Software Program Manager’s Network (SPMN): http://www.spmn.com/lessons.html
Software Engineering Institute (SEI), Carnegie Mellon: http://www.sei.cmu.edu/
DoD Information Technology Standards Registry (DISR Online): https://disronline.disa.mil/a/public/index.jsp
Best Practices Clearinghouse: https://acc.dau.mil/sam