V&V Lifecycle Methodologies
By David F. Rico
Overview
• What is V&V?
• V&V Approaches
  – Testing (Post Process)
  – Lifecycle Frameworks (In Process)
  – Lifecycle Methodologies (In Process)
• Costs & Benefits
• Myths & Misconceptions
• Conclusion
• Bibliography
What is V&V?
• V&V is the process of determining whether:
– Requirements for a system or component are complete and correct
– Products of each development phase fulfill the requirements or conditions imposed by the previous phase
– Final systems or components comply with specified requirements
Testing (Post Process)
[Diagram: Software Development (In Process) followed by Testing (Post Process)]
Testing Techniques
• White Box (internal)
  – Basis Path
    Flow Graph Notation, Cyclomatic Complexity, Deriving Test Cases, Graph Matrices
  – Control Structure
    Condition, Data Flow, Loop
• Black Box (external)
  – Specification
    Graph-Based Methods, Equivalence Partitioning, Boundary Value Analysis, Comparison
  – Interface
    Interface Misuse, Interface Misunderstanding, Timing
  – Operational
    Serialization, Scenario, Requirements, Use
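Two of the black-box techniques above, equivalence partitioning and boundary value analysis, can be sketched in a few lines of Python. The function under test (`classify_age`) and its partitions are illustrative assumptions, not from the slide:

```python
# Black-box test design: equivalence partitioning divides the input
# domain into classes that should behave alike; boundary value
# analysis probes the edges of each class.

def classify_age(age: int) -> str:
    """Hypothetical function under test with three input partitions."""
    if age < 0:
        raise ValueError("age cannot be negative")
    if age < 18:
        return "minor"
    return "adult"

# Equivalence partitioning: one representative value per class.
assert classify_age(10) == "minor"     # representative of [0, 18)
assert classify_age(40) == "adult"     # representative of [18, ...)

# Boundary value analysis: values at and around each partition edge.
assert classify_age(0) == "minor"      # lower edge of valid domain
assert classify_age(17) == "minor"     # just below the 18 boundary
assert classify_age(18) == "adult"     # exactly at the boundary
try:
    classify_age(-1)                   # just outside the valid domain
except ValueError:
    pass
```

Partitioning keeps the test set small; boundary analysis targets where off-by-one defects cluster.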
Testing Documents
• Test Plan
• Test Design
• Test Case
• Test Procedure
• Test Log
• Test Incident
• Test Summary
• Test Item Transmittal
Testing Levels
Unit
Module
Component
Subsystem
Integration
System
Acceptance
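The lower levels above differ mainly in scope. A minimal sketch (hypothetical parser and aggregator functions) of a unit-level versus an integration-level check:

```python
# Hypothetical units under test: a parser and an aggregator.
def parse(text: str) -> list[int]:
    return [int(x) for x in text.split(",")]

def total(values: list[int]) -> int:
    return sum(values)

# Unit level: each component verified in isolation.
assert parse("1,2,3") == [1, 2, 3]
assert total([1, 2, 3]) == 6

# Integration level: the components verified together across their
# interface -- failures here indicate interface defects, not unit defects.
assert total(parse("4,5,6")) == 15
```

System and acceptance levels would exercise the assembled program against its specified requirements rather than individual interfaces.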
What does Testing Do?
• V&V is the process of determining whether:
– Requirements for a system or component are complete and correct
– Products of each development phase fulfill the requirements or conditions imposed by the previous phase
– Final systems or components comply with specified requirements
Lifecycle Frameworks (In Process)
Concept Phase
  – Tasks: Concept Evaluation
  – Inputs: Schedules, Concept
  – Outputs: Task Reports, Problem Reports, V&V Report

Requirements Phase
  – Tasks: Trace Requirements, Evaluate Requirements, Interface Analysis, Test Plan Generation
  – Inputs: Concept, Software Requirements, Interface Requirements, User Documentation
  – Outputs: Task Reports, System Test Plan, Acceptance Test Plan, Problem Reports, V&V Report

Design Phase
  – Tasks: Trace Design, Evaluate Design, Interface Analysis, Test Plan Generation, Test Design Generation
  – Inputs: Standards, Software Requirements, Software Design, Interface Requirements, Interface Design, User Documentation
  – Outputs: Task Reports, Component Test Plan, Integration Test Plan, Component Test Design, Integration Test Design, System Test Design, Acceptance Test Design, Problem Reports, V&V Report

Code Phase
  – Tasks: Trace Code, Evaluate Code, Interface Analysis, Document Evaluation, Test Case Generation, Test Procedure Generation, Component Test
  – Inputs: Standards, Software Design, Interface Design, Source Code, Executable Code, User Documentation
  – Outputs: Task Reports, Component Test Cases, Integration Test Cases, System Test Cases, Acceptance Test Cases, Component Test Procedures, Integration Test Procedures, System Test Procedures, Problem Reports, V&V Report

Test Phase
  – Tasks: Test Procedure Generation, Integration Test, System Test, Acceptance Test
  – Inputs: Source Code, Executable Code, User Documents
  – Outputs: Task Reports, Acceptance Test Procedures, Problem Reports, V&V Reports

Installation Phase
  – Tasks: Configuration Audit, V&V Reporting
  – Inputs: Installation Package
  – Outputs: Task Reports, Problem Reports, V&V Reports

Maintenance Phase
  – Tasks: V&V Plan Revision, Problem Evaluation, Change Evaluation, Lifecycle Reiteration
  – Inputs: Schedules, Concept, Software Requirements, Interface Requirements, Software Design, Interface Design, Source Code, Executable Code, User Documentation, V&V Plan, Change Requests, Problem Reports, V&V Reports
  – Outputs: Revised V&V Plan, Task Reports, Lifecycle Reports, Problem Reports, V&V Report
“IEEE Standard for Software Verification and Validation Plans,” IEEE Std 1012-1986.
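The tracing tasks that recur in the Requirements, Design, and Code phases amount to checking that every requirement is covered by downstream artifacts. A sketch of that check (all identifiers hypothetical, not from IEEE Std 1012):

```python
# Requirements tracing: every requirement should be traceable forward
# to at least one design element and at least one test case.

requirements = {"R1": "login", "R2": "logout", "R3": "audit trail"}
design = {"D1": ["R1"], "D2": ["R2"]}                 # design element -> requirements
tests = {"T1": ["R1"], "T2": ["R2"], "T3": ["R1"]}    # test case -> requirements

def untraced(reqs, *coverages):
    """Return requirements not covered by every downstream artifact set."""
    missing = set()
    for cov in coverages:
        covered = {r for refs in cov.values() for r in refs}
        missing |= set(reqs) - covered
    return sorted(missing)

print(untraced(requirements, design, tests))  # → ['R3']
```

Here R3 surfaces as a gap during the Design phase rather than during acceptance testing, which is the point of in-process V&V.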
More Lifecycle Framework Tasks
IEEE Std 1012 also defines optional V&V tasks, each mapped to its applicable lifecycle phases (Management, Concept, Requirements, Design, Implementation, Test, Installation, Maintenance):
• Algorithm Analysis
• Audit Performance (Functional, Physical, In-Process, Configuration Control)
• Audit Support (Functional, Physical, In-Process, Configuration Control)
• Configuration Management
• Control Flow Analysis
• Data Flow Analysis
• Database Analysis
• Feasibility Study Evaluation
• Installation and Checkout
• Performance Monitoring
• Qualification Testing
• Regression Analysis
• Reviews Support (Test Readiness, Operational Readiness)
• Simulation Analysis
• Sizing and Timing Analysis
• Test Certification
• Test Evaluation
• Test Witnessing
• User Document Evaluation
• V&V Tool Plan Generation
• Walkthroughs (Requirements, Design, Source Code, Test)
[The original slide shows a checkmark matrix pairing each optional task with its applicable phases.]
“IEEE Standard for Software Verification and Validation Plans,” IEEE Std 1012-1986.
What do Lifecycle Frameworks Do?
• V&V is the process of determining whether:
– Requirements for a system or component are complete and correct
– Products of each development phase fulfill the requirements or conditions imposed by the previous phase
– Final systems or components comply with specified requirements
Lifecycle Methodologies (In Process)
[Figure: defects injected during Analysis, Design, and Code, and removed during Unit Test, Component Test, and System Test]
Kan, S. H. (1995). Metrics and models in software quality engineering. Reading, MA: Addison-Wesley.
Basis for Lifecycle Methodology
[Table: the V&V activity (review, walkthrough, or inspection) applied to each lifecycle work product — Product Objectives, Architecture, Specification, High-Level Design, Inter-Component Interfaces, Low-Level Design, Code, Test Plan, Test Cases — across four lifecycles: System/36, System/38, an ideal process, and the AS/400. The AS/400 lifecycle applies inspections to nearly every work product, where its predecessors relied more heavily on reviews and walkthroughs.]
Sulack, R. A., Lindner, R. J., & Dietz, D. N. (1989). A new development rhythm for AS/400 software. IBM Systems Journal, 28(3), 386-406.
Software Inspection Process
Planning
  – Purpose: Organization
  – Inputs: Artifact, Specifications, Checklists
  – Tasks: Approve, Schedule Participation, Notice, Rooms, Certify
  – Outputs: Artifact, Specifications, Checklists, Certification
  – Roles: Moderator

Overview
  – Purpose: Orientation
  – Inputs: Artifact, Specifications, Checklists, Defect Types
  – Tasks: Introduce, Initiate
  – Outputs: Artifact
  – Roles: Moderator, Author, Inspectors

Preparation
  – Purpose: Familiarization
  – Inputs: Artifact, Specifications, Checklists, Defect Types
  – Tasks: Study Procedures
  – Outputs: Certification
  – Roles: Inspectors

Inspection
  – Purpose: Identify Defects
  – Inputs: Artifact, Specifications, Checklists, Defect Types, Defect Forms
  – Tasks: Introduce, Initiate, Read, Identify Defects, Record Defects
  – Outputs: Defect List, Certification
  – Roles: Moderator, Author, Reader, Inspectors, Recorder

Rework
  – Purpose: Correct Defects
  – Inputs: Artifact Copy, Defect List, Certification
  – Tasks: Correct Defects, Certify
  – Outputs: Corrected Artifact, Certification
  – Roles: Author

Followup
  – Purpose: Certify Artifact
  – Inputs: Artifact, Corrected Artifact, Defect List, Certification
  – Tasks: Verify Corrections, Tally Metrics, Certify, Submit Results
  – Outputs: Validated Artifact, Results, Certification
  – Roles: Moderator
Fagan, M. E. (1976). Design and code inspections to reduce errors in program development. IBM Systems Journal, 15(3), 182-211.
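The six-stage flow above is strictly ordered: each stage gates the next, and Followup closes the loop on Rework. A minimal sketch of that ordering (the stage names come from the slide; the enforcement logic is an assumption):

```python
# Fagan-style inspection modeled as an ordered pipeline: a stage may
# run only when all earlier stages have completed, and the Inspection
# stage produces the defect list consumed by Rework and Followup.

STAGES = ["Planning", "Overview", "Preparation",
          "Inspection", "Rework", "Followup"]

class InspectionProcess:
    def __init__(self):
        self.completed = []
        self.defects = []

    def run(self, stage, defects_found=0):
        expected = STAGES[len(self.completed)]
        if stage != expected:
            raise ValueError(f"expected {expected}, got {stage}")
        if stage == "Inspection":
            self.defects = [f"defect-{i}" for i in range(defects_found)]
        self.completed.append(stage)

proc = InspectionProcess()
for s in STAGES:
    proc.run(s, defects_found=3)
print(proc.completed[-1], len(proc.defects))  # → Followup 3
```

The explicit gating is what makes the process measurable: every stage has defined entry and exit criteria, so effort and defect counts can be captured per stage.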
Inspection Process Measurability
Inspection effort is a simple function of product size, inspection rate, and team size:

Hours = Product Size / (Inspection Rate × 2) × (Team Size × 4 + 1)

The substage coefficients on the slide (0.5 × Moderator each for the Planning and Followup substages, plus per-person multiples for the Overview, Preparation, Inspection, and Rework substages) sum to the (Team Size × 4 + 1) term.

Estimated inspection hours by team size and inspection rate:

10,000 lines:
  People    60 SLOC/hr   120 SLOC/hr   180 SLOC/hr
  4            1,417          708           472
  5            1,750          875           583
  6            2,083        1,042           694
  7            2,417        1,208           806

100,000 lines:
  4           14,167        7,083         4,722
  5           17,500        8,750         5,833
  6           20,833       10,417         6,944
  7           24,167       12,083         8,056

1,000,000 lines:
  4          141,667       70,833        47,222
  5          175,000       87,500        58,333
  6          208,333      104,167        69,444
  7          241,667      120,833        80,556
Russell, G. W. (1991). Experience with inspection in ultralarge-scale developments. IEEE Software, 8(1), 25-31.
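The cost model above is directly computable; a short sketch that reproduces the 10,000-line column of the table:

```python
def inspection_hours(product_size, inspection_rate, team_size):
    """Estimated total inspection effort:
    Hours = Size / (Rate * 2) * (Team * 4 + 1)."""
    return product_size / (inspection_rate * 2) * (team_size * 4 + 1)

# Reproduce the 10,000-line figures at 60 SLOC/hour:
for team in (4, 5, 6, 7):
    print(team, round(inspection_hours(10_000, 60, team)))
# → 4 1417
# → 5 1750
# → 6 2083
# → 7 2417
```

This measurability — effort predictable from size, rate, and team before the inspection starts — is what the slide's title refers to, and it contrasts with testing effort, which is hard to bound in advance.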
Lifecycle Methodology Accuracy
Defect rates by phase for eight projects:

  Project   Size     Language   Spec   Design   Code   Unit   Comp   System   First   Product   Estimate
                                                       Test   Test   Test     Year    Life      Accuracy
  A         680K     Jovial       4      --      13    0.6    0.6    0.3        2       4          5
  B          30K     PL/1         2       7      14    6.0    6.0    3.0       --       7          9
  C          70K     BAL          6      25       6    0.3    0.4    0.2      0.5       2          3
  D       1,700K     Jovial       4      10      15    0.9    0.8    0.4        3       3          4
  E         290K     Ada          4       8      13    0.7    0.6    0.3      0.1       8         --
  F          70K     --           1       2       4    2.1    2.2    1.1      0.9       5          6
  G         540K     Ada          2       5      12    1.1    1.2    0.6      1.8       4         12
  H         700K     Ada          6       7      14    0.4    0.4    0.2      0.4       1          3
Kan, S. H. (1995). Metrics and models in software quality engineering. Reading, MA: Addison-Wesley.
What do Lifecycle Methodologies Do?
• V&V is the process of determining whether:
– Products of each development phase fulfill the requirements or conditions imposed by the previous phase
– Final systems or components comply with specified requirements
– Requirements for a system or component are complete and correct
Costs & Benefits
  Cost/Benefit               Lifecycle Inspection   Testing              PSP
  Program Size               10,000 SLOC            10,000 SLOC          10,000 SLOC
  Start Defects              1,000                  1,000                1,000
  Inspection Hours           708                    --                   --
  End Defects (Inspection)   100                    --                   --
  Inspection Efficiency      1.27 Defects/Hour      --                   --
  Start Defects (Test)       100                    1,000                --
  Test Hours                 1,144                  11,439               --
  End Defects (Test)         10                     100                  --
  Test Efficiency            12.71 Hours/Defect     12.71 Hours/Defect   --
  Total V&V Hours            1,852                  11,439               400 *
  Total V&V Defects          990                    900                  1,000
  Delivered Defects          10                     100                  0

* PSP V&V hours include development time
sm Personal Software Process and PSP are service marks of Carnegie Mellon University.
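The comparison above follows from the two rates on the slide — inspection removes about 1.27 defects per hour, while testing costs about 12.71 hours per defect. A sketch of the arithmetic:

```python
INSPECT_RATE = 1.27   # defects removed per inspection hour (slide)
TEST_COST = 12.71     # test hours per defect removed (slide)

def vv_hours(start, after_inspection, delivered):
    """Total V&V hours to go from `start` defects to `delivered`
    delivered defects, inspecting down to `after_inspection` first,
    then testing out the remainder."""
    inspect = (start - after_inspection) / INSPECT_RATE
    test = (after_inspection - delivered) * TEST_COST
    return inspect + test

combined = vv_hours(1000, 100, 10)     # inspect first, then test
test_only = vv_hours(1000, 1000, 100)  # no inspection; test to 100
print(round(combined), round(test_only))
```

The inspection-first lifecycle delivers 10 defects in roughly 1,850 hours, versus 100 delivered defects in 11,439 hours for testing alone — fewer escapes at about one-sixth the V&V cost, which is the slide's argument.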
Costs of Methodologies
[Figure: defects injected during Analysis, Design, and Code and removed during Unit, Component, and System Test; the testing-only methodology cost is roughly 10X the V&V methodology lifecycle cost]
Hewlett Packard
Software Inspection Process Return on Investment (ROI)
[Chart: Hewlett Packard's annual inspection ROI in millions of dollars, 1989-1998, on an axis from $10 million to $70 million]
Grady, R. B. (1997). Successful software process improvement. Upper Saddle River, NJ: Prentice Hall.
Raytheon
How Often Defects Are Likely to Occur
[Chart: cost of quality as a percent of sales (5% to 30%) versus sigma level, with process yields of 50%, 65%, 93%, 98%, 99.9%, and 99.99964% spanning 2 Sigma through 6 Sigma, and savings opportunities of $1 billion to $1.5 billion]
Velocci, A. L. (1998, November). High hopes riding on Six Sigma at Raytheon. Aviation Week & Space Technology.
Myths & Misconceptions
• V&V, quality, and testing are often confused
• Quality and testing are often equated
• Testing and V&V are often equated
• Testing is believed to be sufficient
• V&V is often confused with IV&V
• IV&V is believed to be better than Lifecycle Methodologies
Conclusion
• Testing is inefficient and happens too late in the lifecycle
• Lifecycle Frameworks are overwhelming in scope, non-methodological, and not easily measured
• Lifecycle Methodologies are fast, efficient, measurable, and accurate
Bibliography
• “Modeling and Software Development Quality,” Stephen H. Kan, IBM Systems Journal, Vol 30, No 3, 1991
• “AS/400 Software Quality Management,” Stephen H. Kan, et al., IBM Systems Journal, Vol 33, No 1, 1994
• “A New Development Rhythm for AS/400 Software,” Richard A. Sulack, et al., IBM Systems Journal, Vol 28, No 3, 1989
• “Lessons Learned from Three Years of Inspection Data,” Edward F. Weller, IEEE Software, September 1993
• “IEEE Standard for Software Verification and Validation Plans,” IEEE Std 1012-1986
• “IEEE Guide for Software Verification and Validation Plans,” IEEE Std 1059-1993