Software Engineering
Lecture 14: Testing Techniques and Strategies
Today’s Topics (Chapters 17 & 18 in SEPA 5/e)
• Testing Principles & Testability
• Test Characteristics
• Black-Box vs. White-Box Testing
• Flow Graphs & Basis Path Testing
• Testing & Integration Strategies
Software Testing
Opportunities for human error:
• Specifications, design, coding
• Communication
“Testing is the ultimate review”
Can take 30–40% of total effort
For critical apps, can be 3 to 5 times all other efforts combined!
Testing Objectives
• Execute a program with the intent of finding errors
• Good tests have a high probability of discovering errors
• Successful tests uncover errors
• ‘No errors found’: not a good test!
• Verifying functionality is a secondary goal
Testing Principles
• Tests traceable to requirements
• Tests planned before testing
• Pareto principle: majority of errors traced to minority of components
• Component testing first, then integrated testing
• Exhaustive testing is not possible
• Independent tests: more effective
Software Testability
Characteristics that lead to testable software:
• Operability
• Observability
• Controllability
• Decomposability
• Simplicity
• Stability
• Understandability
Operability
• System has few bugs
• No bugs block execution of tests
• Product evolves in functional stages
The better it works, the more efficiently it can be tested
Observability
• Distinct output for each input
• States & variables may be queried
• Past states are logged
• Factors affecting output are visible
• Incorrect output easily identified
• Internal errors reported
• Source code accessible
What you see is what you test
Controllability
• All possible outputs can be generated by some input
• All code executable by some input
• States, variables directly controlled
• Input/output consistent, structured
• Tests are specified, automated, and reproduced
The better we can control the software, the more the testing can be automated
Decomposability
• Independent modules
• Modules can be tested separately
By controlling the scope of testing, we can more quickly isolate problems and perform smarter retesting
Simplicity
• Minimum feature set
• Minimal architecture
• Code simplicity
The less there is to test, the more quickly we can test it
Stability
Changes made to the system:
• are infrequent
• are controlled
• don’t invalidate existing tests
Software recovers from failure
The fewer the changes, the fewer the disruptions to testing
Understandability
• Design is well understood
• Dependencies are well understood
• Design changes are communicated
Documentation is:
• accessible
• well-organized
• specific, detailed, and accurate
The more information we have, the smarter we will test
Test Characteristics
Good test has a high probability of finding an error
A good test is not redundant
A good test should be “best of breed”
A good test is neither too simple nor too complex
Test Case Design
‘Black Box’ Testing
• Consider only inputs and outputs
‘White Box’ or ‘Glass Box’ Testing
• Also consider internal logic paths, program states, intermediate data structures, etc.
White-Box Testing
• Guarantee that all independent paths have been tested
• Exercise all conditions for ‘true’ and ‘false’
• Execute all loops for boundary conditions
• Exercise internal data structures
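These checks can be made concrete with a minimal white-box sketch in Python (the `clamp` and `total` functions are invented for illustration): the assertions drive each condition both true and false, hit the exact boundary values, and execute the loop zero, one, and many times.

```python
# White-box sketch (functions invented): drive each condition True and
# False, and execute the loop zero, one, and many times.
def clamp(value, lo, hi):
    if value < lo:
        return lo
    if value > hi:
        return hi
    return value

def total(xs):
    s = 0
    for x in xs:
        s += x
    return s

# Each condition taken both ways, plus the exact boundary values.
assert clamp(-1, 0, 10) == 0    # value < lo is True
assert clamp(11, 0, 10) == 10   # value > hi is True
assert clamp(0, 0, 10) == 0     # lower boundary, both conditions False
assert clamp(10, 0, 10) == 10   # upper boundary

# Loop boundary conditions: 0, 1, and many iterations.
assert total([]) == 0
assert total([7]) == 7
assert total(range(100)) == 4950
```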
Why White-Box Testing?
More errors in ‘special case’ code which is infrequently executed
Control flow can’t be predicted accurately in black-box testing
Typo errors can happen anywhere!
Basis Path Testing
• White-box method [McCabe ‘76]
• Analyze the procedural design
• Define a basis set of execution paths
• Test cases for the basis set execute every program statement at least once
Basis Path Testing [2]
Flow Graph: Representation of Structured Programming Constructs
[From SEPA 5/e]
Cyclomatic Complexity
V(G) = E − N + 2 = 4
Independent paths:
1: 1, 11
2: 1, 2, 3, 4, 5, 10, 1, 11
3: 1, 2, 3, 6, 8, 9, 10, 1, 11
4: 1, 2, 3, 6, 7, 9, 10, 1, 11
V(G): upper bound on number of tests to ensure all code has been executed
[From SEPA 5/e]
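The slide’s numbers can be checked mechanically. The sketch below encodes a flow graph with the edges implied by the four independent paths listed above (an assumption, since the original figure is not reproduced here) and computes V(G) = E − N + 2:

```python
# Flow graph assumed from the four paths listed above (the original
# figure is not reproduced here, so the edge list is an inference).
edges = [(1, 2), (1, 11), (2, 3), (3, 4), (3, 6), (4, 5), (5, 10),
         (6, 7), (6, 8), (7, 9), (8, 9), (9, 10), (10, 1)]
nodes = {n for edge in edges for n in edge}

E, N = len(edges), len(nodes)
v_g = E - N + 2       # cyclomatic complexity V(G)
print(v_g)            # 4 -> at most 4 tests cover a basis set of paths
```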
Black Box Testing
Focus on functional requirements:
• Incorrect / missing functions
• Interface errors
• Errors in external data access
• Performance errors
• Initialization and termination errors
Black Box Testing [2]
• How is functional validity tested?
• What classes of input will make good test cases?
• Is the system sensitive to certain inputs?
• How are data boundaries isolated?
Black Box Testing [3]
What data rates and volume can the system tolerate?
What effect will specific combinations of data have on system operation?
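A minimal sketch of how these questions turn into test cases, assuming a hypothetical spec that accepts ages 18 through 65 (the function and its bounds are invented): the inputs fall into three equivalence classes, and the values 17/18 and 65/66 isolate the data boundaries.

```python
# Black-box sketch against a hypothetical spec: accept ages 18..65
# inclusive, reject everything else (function and bounds invented).
def is_eligible(age):
    return 18 <= age <= 65

# Three equivalence classes (below, inside, above the valid range),
# with boundary values 17/18 and 65/66 isolating the data boundaries.
cases = {17: False, 18: True, 40: True, 65: True, 66: False, -1: False}
for age, expected in cases.items():
    assert is_eligible(age) == expected
```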
Comparison Testing
• Compare software versions
• “Regression testing”: finding the outputs that changed
• Improvements vs. degradations
• Net effect depends on frequency and impact of degradations
• When the error rate is low, a large corpus can be used
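A comparison-testing run can be sketched as feeding one input corpus to both versions and collecting the outputs that changed; `version_a` and `version_b` below are invented stand-ins for two builds of the same function.

```python
# Sketch: run the same input corpus through two versions of a function
# and report the outputs that changed (both versions are invented here).
def version_a(x):
    return x * x

def version_b(x):
    # "improved" build with a deliberate behavior change for negatives
    return x * x if x >= 0 else -(x * x)

corpus = range(-3, 4)
changed = [(x, version_a(x), version_b(x))
           for x in corpus
           if version_a(x) != version_b(x)]
print(changed)  # [(-3, 9, -9), (-2, 4, -4), (-1, 1, -1)]
```

Each entry is then triaged by hand into an improvement or a degradation, which is the judgment the slide says the net effect depends on.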
Generic Testing Strategies
• Testing starts at module level and moves “outward”
• Different testing techniques used at different times
• Testing by developer(s) and independent testers
• Testing and debugging are separate activities
Verification and Validation
Verification
• “Are we building the product right?”
Validation
• “Are we building the right product?”
Achieved by life-cycle SQA activities, assessed by testing
“You can’t create quality by testing”
Organization of Testing
[From SEPA 5/e]
How Much Test Time is Necessary?
Logarithmic Poisson execution-time model
With sufficient fit, the model predicts the testing time required to reach an acceptable failure rate
[From SEPA 5/e]
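For reference, the logarithmic Poisson (Musa-Okumoto) execution-time model has failure intensity λ(τ) = λ0 / (λ0·θ·τ + 1), which can be inverted to estimate the testing time needed to reach a target failure rate. The λ0 and θ values below are illustrative, not taken from the slide.

```python
import math

# Logarithmic Poisson (Musa-Okumoto) execution-time model; lam0 and
# theta are illustrative values, not taken from the slide.
lam0 = 10.0    # initial failure intensity (failures per CPU-hour)
theta = 0.05   # failure intensity decay parameter

def mean_failures(tau):
    # expected cumulative failures after tau hours of testing
    return math.log(lam0 * theta * tau + 1) / theta

def intensity(tau):
    # current failure intensity after tau hours of testing
    return lam0 / (lam0 * theta * tau + 1)

def time_to_reach(target):
    # testing time needed to drive the failure intensity down to `target`
    return (lam0 / target - 1) / (lam0 * theta)

tau = time_to_reach(0.5)   # hours of testing to reach 0.5 failures/hour
print(round(tau, 1))       # 38.0
```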
Unit Testing [From SEPA 5/e]
Top-Down Integration
PRO: Higher-level (logic) modules tested early
CON: Lower-level (reusable) modules tested late
[From SEPA 5/e]
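Testing higher-level modules early requires stubs for the unfinished lower-level ones. A minimal sketch, with all names invented:

```python
# Top-down sketch: the high-level control module is real, while the
# low-level module it depends on is replaced by a stub (names invented).
def read_sensor_stub():
    # stands in for the unfinished low-level sensor driver
    return 42

def monitor(read_sensor=read_sensor_stub):
    # high-level logic under test
    reading = read_sensor()
    return "ALARM" if reading > 40 else "OK"

print(monitor())            # ALARM  (driven entirely by the stub)
print(monitor(lambda: 5))   # OK
```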
Bottom-Up Integration
PRO: Lower-level (reusable) modules tested early
CON: Higher-level (logic) modules tested late
[From SEPA 5/e]
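Conversely, bottom-up integration exercises finished low-level modules through throwaway drivers that stand in for the not-yet-written callers. A minimal sketch, names invented:

```python
# Bottom-up sketch: the low-level module is finished, and a throwaway
# driver exercises it before any real caller exists (names invented).
def checksum(data):
    # finished low-level module under test
    return sum(data) % 256

def driver():
    # plays the role of the not-yet-written higher-level caller
    cases = [([1, 2, 3], 6), ([255, 1], 0), ([], 0)]
    return all(checksum(d) == expected for d, expected in cases)

print(driver())  # True
```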
Hybrid Approaches
Sandwich Integration: combination of top-down and bottom-up
Critical Modules:
• address several requirements
• high level of control
• complex or error prone
• definite performance requirements
Test Critical Modules ASAP!
Questions?
Software Engineering for Information Technology
Lecture 12: System Design
Today’s Topics
• Design Elements
• Principles for Quality Design
• Modularity & Partitioning
• Effective Modular Design
• Architectural Styles
• Mapping Models to Modules
Design Elements
• Data Design: data structures for data objects
• Architectural Design: modular structure of software
• Interface Design: internal / external communication
• Component-Level Design: procedural description of modules
Design Elements Linked to Analysis Models (increasing detail)
[From SEPA 5/e]
Evaluating A Design
A design must implement:
• explicit requirements (analysis model)
• customer’s implicit requirements
A design must be readable, understandable by coders & testers
A good design provides a complete view of data, function, and behavior
Design Principles [Davis ‘95]
• Consider > 1 design alternative
• Design traceable to analysis model
• Use design patterns
• Design structure should reflect structure of problem domain
• Consistent style, well-defined interfaces
Design Principles [2]
• Structured to accommodate change (easy to modify & update)
• Structured to degrade gently
• “Design is not coding, coding is not design”
• Assess quality during creation
• Review design for semantic errors
Design Process Goals
• A hierarchical organization making use of the control characteristics of the software
• A modular design which logically partitions software into functional elements
• Useful abstractions for both data and procedures
Design Goals [2]
• Modules should be functionally independent
• Modular interfaces should have minimal complexity
• Explicit linking of design elements to requirements analysis models
Modularity and Software Cost
[From SEPA 5/e]
Modular Design [Meyer ‘88]
• Decomposability: effective decomposition reduces complexity
• Composability: enable reuse of existing design elements
• Understandability: modules that can be understood in isolation are easier to build and change
Modular Design [2]
• Continuity: changes to requirements should trigger localized changes to specific modules
• Protection: error conditions should be considered on a per-module basis
Architectural Terminology
[From SEPA 5/e]
Partitioning
• Horizontal: branches for each major function
• Vertical: control & execution are top-down
• Increase in horizontal partitioning = increased number of interfaces
• Vertically partitioned structures are more resilient to change
[From SEPA 5/e]
Partitioning Examples
Procedural Layering
[From SEPA 5/e]
Effective Modular Design
Functional independence:
• maximize cohesion of modules
• minimize coupling between modules
• promote robustness in the design
Cohesion: one task per procedure is optimal
Coupling: minimize module interconnection
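The contrast can be sketched in a few lines of Python (all names invented): the first function mixes logging into the calculation and reads a hidden global, while the second does one task and receives everything through its interface.

```python
# Illustrative contrast (all names invented). The coupled version reads
# a shared global and mixes logging into the calculation (low cohesion,
# high coupling); the cohesive version does one task and receives all
# data through its interface.
state = {"rate": 0.07, "log": []}

def total_with_tax_coupled(price):
    state["log"].append(price)            # second task smuggled in
    return price * (1 + state["rate"])    # hidden dependency on a global

def total_with_tax(price, rate):
    # one task, explicit inputs: easy to test in isolation
    return price * (1 + rate)

print(round(total_with_tax(100, 0.07), 2))  # 107.0
```

The cohesive version can be tested with nothing but its arguments, which is exactly the testability payoff the Decomposability and Controllability slides describe.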
Types of Coupling
[From SEPA 5/e]
Design Heuristics
• Reduce coupling (implode)
• Improve cohesion (explode)
• Minimize fan-out & strive for fan-in
• Scope of effect = scope of control
• Reduce interface complexity
• Predictable “black box” modules
• Controlled entry (no GOTOs!)
Program Structures
[From SEPA 5/e]
Architectural Styles
• Data-Centered
• Data-Flow
• Call-and-Return
  • main program / subprogram
  • remote procedure call
• Layered
Data-Centered Architecture
[From SEPA 5/e]
Data Flow Architectures
[From SEPA 5/e]
Layered Architecture
[From SEPA 5/e]
Mapping Models to Modules
Goal: map DFDs to a modular architecture
• Transform Mapping: data flow is modeled as a series of functions with input / output
• Transaction Mapping: data flow is modeled as a chain of events (transactions)
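Transform mapping’s “series of functions with input / output” can be sketched as a three-stage pipeline, where the DFD’s incoming flow, transform center, and outgoing flow each become one function (names invented):

```python
# Transform-mapping sketch (names invented): incoming flow, transform
# center, and outgoing flow from the DFD each become one function.
def read_input(raw):
    # afferent (incoming) flow: raw text -> internal form
    return [int(x) for x in raw.split(",")]

def transform(values):
    # transform center: the actual computation
    return [v * v for v in values]

def write_output(values):
    # efferent (outgoing) flow: internal form -> output text
    return ",".join(str(v) for v in values)

print(write_output(transform(read_input("1,2,3"))))  # 1,4,9
```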
Level 0 DFD for SafeHome
[From SEPA 5/e]
Level 1 DFD for SafeHome
[From SEPA 5/e]
Level 2 DFD for SafeHome
Refines “monitor sensors” process
[From SEPA 5/e]
Level 3 DFD for SafeHome
Refines “monitor sensors” process, with flow boundaries
[From SEPA 5/e]
First-Level Factoring
• Flow boundaries used to determine program structure and modules
• Additional factoring to introduce more detail
[From SEPA 5/e]
Questions?