
09-Software Validation, Verification



Page 1: 09-Software Validation, Verification


Ira and Vi Glickstein, 12/28/2004 (rev. 04/12/23)

Lesson 09

Software Verification, Validation and Testing

Includes:
• Software Testing Techniques
• Intro to Testing

Includes materials adapted from Pressman: Software Engineering: A Practitioner’s Approach, Fifth Edition, McGraw-Hill, 2000

Page 2: 09-Software Validation, Verification

Software Testing

Testing is the process of exercising a program with the specific intent of finding errors prior to delivery to the end user.

Page 3: 09-Software Validation, Verification

We Design Test Cases to...

• have high likelihood of finding errors

• exercise the internal logic of software components

• exercise the input and output to uncover errors in program function, behavior, and performance

Goal is to find maximum number of errors with the minimum amount of effort and time!

Page 4: 09-Software Validation, Verification

Testing is a “Destructive” Activity

• designing and executing test cases to “break” or “demolish” the software.

• Must change your mindset during this activity

The objective is to find errors, so errors found are good, not bad. Tell that to a manager!

Page 5: 09-Software Validation, Verification

Testing Objectives

• Execute a program with intent of finding an error

• Good test case has high probability of finding an as-yet undiscovered error

• Successful test case finds an as-yet undiscovered error

Successful testing uncovers errors.

Page 6: 09-Software Validation, Verification

Testing demonstrates ...

• Software functions work as specified

• Behavioral and performance requirements appear to be met

• Data collected is an indicator of reliability and quality

TESTING CANNOT SHOW THE ABSENCE OF ERRORS AND DEFECTS. Testing only shows that errors and defects are present.

Page 7: 09-Software Validation, Verification

Basic Principles of Testing

• All testing is traceable to requirements

• Plan testing long before testing begins. Plan and design tests during design before any code has been generated.

• Pareto Principle - 80% errors in 20% of components

• Start small and progress to large. First test individual components (unit test), then on clusters of integrated components (integration test), then on whole system

• Exhaustive testing is not possible, but we can ensure that all important conditions have been exercised

• Not all testing should be done by the developer; an independent third party is needed

Page 8: 09-Software Validation, Verification

Testability

Testability refers to how easily the product can be tested.

• Operability —it operates cleanly
• Observability —the results of each test case are readily observed
• Controllability —the degree to which testing can be automated and optimized
• Decomposability —control the scope of testing
• Simplicity —reduce complex architecture and logic to simplify tests
• Stability —few changes are requested during testing
• Understandability —of the design and documents

Design software with “Testability” in mind.

Page 9: 09-Software Validation, Verification

What Testing Shows

• Errors
• Requirements conformance
• Performance
• An indication of quality

Page 10: 09-Software Validation, Verification

Who Tests the Software?

Developer: understands the system but will test “gently” and is driven by “delivery”.

Independent Tester: must learn about the system but will attempt to break it and is driven by quality.

Page 11: 09-Software Validation, Verification

Software Testing

• Black Box Testing Methods
• White Box Testing Methods
• Strategies for Testing

Page 12: 09-Software Validation, Verification

Black Box Testing

• Based on specified function, on the requirements

• Tests conducted at the software interface

• Demonstrates that the software functions are operational, input is properly accepted, output is correctly produced, and integrity of external info is maintained

• Uses the SRS as basis for construction of tests

• Usually performed by independent group

Page 13: 09-Software Validation, Verification

White-Box Testing

… Our goal is to ensure that all statements and conditions have been executed at least once.

Page 14: 09-Software Validation, Verification

White Box Testing -- I

• Based on internal workings of a product; requires close examination of software

• Logical paths are tested by providing test cases that exercise specific sets of conditions and/or loops

• Check status of program by comparing actual results to expected results at selected points in the software

Exhaustive path testing is impossible

Page 15: 09-Software Validation, Verification

Exhaustive Testing

[Flow graph of a small program containing a loop executed < 20 times.]

There are 10^14 possible paths! If we execute one test per millisecond, it would take 3,170 years to test this program!!
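The 3,170-year figure checks out; a quick sketch of the arithmetic:

```python
# Sanity check on the slide's claim: 10^14 paths, one test per millisecond.
paths = 10 ** 14
total_seconds = paths / 1000               # one test takes 1 ms
years = total_seconds / (60 * 60 * 24 * 365)
print(round(years))  # 3171, i.e. roughly 3,170 years
```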

Page 16: 09-Software Validation, Verification

White Box Testing -- II

• Logic errors and incorrect assumptions usually occur with special case processing

• Our assumptions about flow of control and data may lead to errors that are only uncovered during path testing

• We make typing errors; some are uncovered by the compiler (syntax, type checking) BUT others are only uncovered by testing. A typo may be on an obscure path

Black box testing can miss these types of errors

Page 17: 09-Software Validation, Verification

Selective Testing

[Flow graph as on the previous slide (loop < 20 times), with one selected path highlighted.]

Page 18: 09-Software Validation, Verification

Software Testing Techniques

Testing Analysis

Page 19: 09-Software Validation, Verification

Test Case Design

Uncover errors in a complete manner with a minimum of effort and time!

Page 20: 09-Software Validation, Verification

Basis Path Testing -- I

• A white box testing technique (McCabe)
• Use this technique to derive a logical measure of complexity
• Use as a guide for defining a “basis set” of execution paths
• Test cases derived to execute the basis set are guaranteed to execute every statement at least one time during testing

Page 21: 09-Software Validation, Verification

Cyclomatic Complexity

• A quantitative measure of the logical complexity of a program
• Used in conjunction with basis set testing, it defines the number of independent paths in the basis set
• It provides an upper bound for the number of tests that ensure all statements have been executed at least once

See http://www.mccabe.com/pdf/nist235r.pdf for a more detailed paper on McCabe’s Cyclomatic Complexity.

Page 22: 09-Software Validation, Verification

Basis Path Testing -- II

First, we compute the cyclomatic complexity:

• number of simple decisions + 1 (decisions 1, 2, 3 = 3 decisions + 1), or
• number of enclosed areas + 1 (areas A, B, C = 3 areas + 1)

In this case, V(G) = 4.

[Flow graph with decision nodes 1, 2, 3 and enclosed regions A, B, C.]
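The two counting rules above can be sketched directly; the decision and area counts are the ones from the slide's graph:

```python
def vg_from_decisions(num_decisions):
    # Cyclomatic complexity = number of simple decisions + 1
    return num_decisions + 1

def vg_from_areas(num_enclosed_areas):
    # Cyclomatic complexity = number of enclosed areas + 1
    return num_enclosed_areas + 1

# The slide's graph: 3 simple decisions (1, 2, 3) and 3 enclosed areas (A, B, C).
print(vg_from_decisions(3), vg_from_areas(3))  # 4 4
```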

Page 23: 09-Software Validation, Verification

Cyclomatic Complexity

A number of industry studies have indicated that the higher the V(G), the higher the probability of errors.

[Chart: number of modules plotted against V(G); modules in the high-V(G) range are more error prone.]

Page 24: 09-Software Validation, Verification

Basis Path Testing -- III

Next, we derive the independent paths. Since V(G) = 4, there are four paths:

Path 1: 1,2,3,6,7,8
Path 2: 1,2,3,5,7,8
Path 3: 1,2,4,7,8
Path 4: 1,2,4,7,2,...,7,8 (the ... implies insertion of path 1, 2, or 3 here)

Finally, we derive test cases to exercise these paths.

[Flow graph with nodes 1 through 8.]

Page 25: 09-Software Validation, Verification

Creating Flow Graphs

• A circle (node) represents one or more statements
• Arrows (edges) represent flow of control; each must terminate in a node
• A region is an area bounded by edges and nodes; the area outside the flow graph is included as a region

Page 26: 09-Software Validation, Verification

Calculating Cyclomatic Complexity from a Flow Graph

• Count the number of regions
• V(G) = E - N + 2, where E = number of edges and N = number of nodes
• V(G) = P + 1, where P = number of predicate nodes (nodes with 2 or more outgoing edges)
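Both formulas can be sketched on a hypothetical flow graph (an if-else followed by a loop test); the graph and helper names below are illustrative, not from the slides:

```python
def vg_edges_nodes(graph):
    # V(G) = E - N + 2 for a connected flow graph.
    num_edges = sum(len(targets) for targets in graph.values())
    return num_edges - len(graph) + 2

def vg_predicates(graph):
    # V(G) = P + 1, where a predicate node has 2 or more outgoing edges.
    predicates = sum(1 for targets in graph.values() if len(targets) >= 2)
    return predicates + 1

# Hypothetical graph: node 1 is an if-else, node 4 is a loop test.
graph = {1: [2, 3], 2: [4], 3: [4], 4: [1, 5], 5: []}
print(vg_edges_nodes(graph), vg_predicates(graph))  # 3 3
```

Both methods agree, as they must for any well-formed flow graph.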

Page 27: 09-Software Validation, Verification

Basis Path Testing Notes

• You don’t need a flow chart or graph, but the picture helps when you trace program paths
• Count each simple logical test as 1; compound tests count as 2 or more (depending on the number of tests)
• Basis Path Testing should be applied to critical modules
• Some development environments will automate the calculation of V(G)

Page 28: 09-Software Validation, Verification

Deriving Test Cases

• Using design or code as a foundation, draw a corresponding flow graph
• Determine the cyclomatic complexity
• Identify the basis set of linearly independent paths
• Prepare test cases that will force execution of each path in the basis set

Exercise: create a flow graph from the example.
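A minimal sketch of the procedure on a hypothetical function (the function and its values are illustrative, not from the slides): two simple decisions give V(G) = 3, so the basis set has three paths, each forced by one test case.

```python
def classify(x):
    # Two simple decisions, so V(G) = 2 + 1 = 3: three basis paths.
    if x < 0:
        return "negative"
    if x == 0:
        return "zero"
    return "positive"

# One test case per path in the basis set:
assert classify(-5) == "negative"  # path through the first branch
assert classify(0) == "zero"       # path through the second branch
assert classify(7) == "positive"   # path through neither branch
```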

Page 29: 09-Software Validation, Verification

Graph Matrices

• Software tools exist that use a graph matrix to derive the flow graph and determine the set of basis paths
• A square matrix whose size equals the number of nodes in the flow graph
• Each node is identified by number and each edge by letter
• Link weights can be added for other, more interesting properties (e.g. processing time, memory required)
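A sketch of the idea on a hypothetical 4-node graph, using the connection-count rule Pressman describes for deriving V(G) from the matrix (the edge list here is illustrative):

```python
# Graph matrix: a 1 in row i, column j records an edge from node i+1 to node j+1.
edges = [(1, 2), (2, 3), (2, 4), (3, 4)]
size = 4
matrix = [[0] * size for _ in range(size)]
for src, dst in edges:
    matrix[src - 1][dst - 1] = 1

# Connection-count rule: each row's entries minus one (never below zero),
# summed over all rows, plus one, yields V(G).
vg = sum(max(sum(row) - 1, 0) for row in matrix) + 1
print(vg)  # 2 (node 2 is the only predicate node)
```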

Page 30: 09-Software Validation, Verification

Control Structure Testing

Basis path testing is not enough. We must broaden testing coverage and improve the quality of testing:

• Condition Testing
• Data Flow Testing
• Loop Testing

Page 31: 09-Software Validation, Verification

Condition Testing -- I

• Exercise the logical conditions in a program module
• Boolean variables or relational expressions
• Compound conditions: one or more conditions
• Detects errors in conditions AND also in the rest of the program. If a test set is effective for conditions, it is likely effective for other errors as well.

Page 32: 09-Software Validation, Verification

Condition Testing -- II

• Branch Testing: test each True and False branch at least once
• Domain Testing: 3 or 4 tests for a relational expression. Test for greater than, equal to, and less than. Also include a test which makes the difference between the two values as small as possible.
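Domain testing sketched for a hypothetical guard using the relational expression a > b (function name and values are illustrative):

```python
def guard(a, b):
    # Relational expression under test.
    return a > b

assert guard(10, 5) is True         # greater than
assert guard(5, 5) is False         # equal to
assert guard(3, 5) is False         # less than
assert guard(5 + 1e-9, 5) is True   # difference as small as practical
```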

Page 33: 09-Software Validation, Verification

Data Flow Testing

• Selects test paths according to the locations of definitions and uses of variables in the program
• Can’t be used for a large system, but can be targeted at suspect areas of the software
• Useful for selecting test paths containing nested if and loop statements

Page 34: 09-Software Validation, Verification

Loop Testing

White box technique that focuses on the validity of loop constructs.

Four different types of loops:
• Simple loops
• Nested loops
• Concatenated loops
• Unstructured loops (should be redesigned to reflect structured constructs)

Page 35: 09-Software Validation, Verification

Loop Testing

[Diagram: simple loop, nested loops, concatenated loops, and unstructured loops.]

Page 36: 09-Software Validation, Verification

Loop Testing: Simple Loops

Minimum test conditions for simple loops:

1. Skip the loop entirely
2. Only one pass through the loop
3. Two passes through the loop
4. m passes through the loop, m < n
5. (n-1), n, and (n+1) passes through the loop

where n is the maximum number of allowable passes.
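The five conditions sketched against a hypothetical loop capped at n = 10 allowable passes (the function is illustrative):

```python
def total(items, n=10):
    # Hypothetical loop under test, capped at n allowable passes.
    result = 0
    for item in items[:n]:
        result += item
    return result

n = 10
# Skip, one pass, two passes, m < n passes, and (n-1)/n/(n+1) passes:
for passes in (0, 1, 2, 5, n - 1, n, n + 1):
    assert total([1] * passes, n) == min(passes, n)
```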

Page 37: 09-Software Validation, Verification

Loop Testing: Nested Loops

Nested loops:
1. Start at the innermost loop. Set all outer loops to their minimum values.
2. Test the min+1, typical, max-1, and max values for the innermost loop while holding the outer loops at their minimum values.
3. Move out one loop and set it up as in step 2, holding all other loops at typical values, until the outermost loop has been tested.

Concatenated loops: if the loops are independent of each other, treat them as simple loops. Otherwise treat them as nested loops.
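The schedule above can be sketched for two hypothetical loop counters, each ranging 0..10 with a typical value of 5 (all values illustrative):

```python
MIN, TYP, MAX = 0, 5, 10  # hypothetical loop bounds and typical value

# Step 2: vary the innermost loop; the outer loop held at its minimum.
step2 = [(MIN, inner) for inner in (MIN + 1, TYP, MAX - 1, MAX)]
# Step 3: move out one loop; the inner loop now holds a typical value.
step3 = [(outer, TYP) for outer in (MIN + 1, TYP, MAX - 1, MAX)]

print(step2)  # [(0, 1), (0, 5), (0, 9), (0, 10)]
print(step3)  # [(1, 5), (5, 5), (9, 5), (10, 5)]
```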

Page 38: 09-Software Validation, Verification

Black-Box Testing

Also called behavioral testing.

[Diagram: requirements and events drive input to the system; output is checked against the requirements.]

Page 39: 09-Software Validation, Verification

Black Box Testing

• Does not replace white box testing; it is a complementary approach
• Focuses on functional requirements of the software
• Tries to find the following types of errors:
  - incorrect or missing functions
  - interface errors
  - errors in data structures or database access
  - behavior or performance errors
  - initialization or termination errors

Page 40: 09-Software Validation, Verification

Black Box Testing

Done during later stages of testing. Tests are designed to answer the following questions:

• How is functional validity tested?
• How are system behavior and performance tested?
• What classes of input will make good test cases?
• Is the system sensitive to certain input values?
• How are the boundaries of a data class isolated?
• What data rates and data volume can the system take?
• What effect will specific data combinations have on the system?

Page 41: 09-Software Validation, Verification

Equivalence Partitioning

• Black box method that divides the input domain of a program into classes of data from which test cases can be derived
• Strive to design a test case that uncovers classes of errors and reduces the total number of test cases that must be developed and run (e.g. incorrect processing of all character data)

Page 42: 09-Software Validation, Verification

Equivalence Partitioning

[Diagram: user queries, mouse picks, output formats, prompts, data, and errors partitioned into input/output classes.]

Page 43: 09-Software Validation, Verification

Sample Equivalence Classes

Valid data:
• User supplied commands
• Responses to system prompts
• Filenames
• Computational data
  - physical parameters
  - bounding values
  - initiation values
• Output data formatting
• Responses to error messages
• Graphical data (e.g. mouse picks)

Invalid data:
• Data outside bounds of the program
• Physically impossible data
• Proper value supplied in the wrong place

Page 44: 09-Software Validation, Verification

Equivalence Class Definition Guidelines

• Input condition specifies a range: one valid and two invalid classes are defined
• Input condition requires a specific value: one valid and two invalid classes are defined
• Input condition specifies a member of a set: one valid and one invalid class are defined
• Input condition is Boolean: one valid and one invalid class are defined

E.g. a prefix is a 3-digit number not beginning with 0 or 1. Input condition (range): specified value > 200. Input condition (value): 4-digit length.

Page 45: 09-Software Validation, Verification

Boundary Value Analysis

• More errors occur at the boundary of the input domain
• BVA leads to the selection of test cases that exercise the boundaries

Guidelines:
• Input in range a..b: select a, b, and values just above and just below a and b
• Inputs with a number of values: select min and max, plus values just above and below min and max
• Use the same guidelines for output conditions
• Boundaries on data structures (e.g. an array with 100 entries): test at the boundary
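The range guideline above generates test values mechanically; a sketch (helper name and step size are illustrative):

```python
def bva_values(a, b, delta=1):
    # For an input valid in the range a..b, BVA selects a, b, and
    # the values just below and just above each boundary.
    return [a - delta, a, a + delta, b - delta, b, b + delta]

print(bva_values(1, 100))  # [0, 1, 2, 99, 100, 101]
```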

Page 46: 09-Software Validation, Verification

Software Testing Strategies

Page 47: 09-Software Validation, Verification

Testing Goals - Review

• The goal is to discover as many errors as possible with minimum effort and time
• Testing is a destructive activity: the people who constructed the software are now asked to test it
  - They have a vested interest in showing the software is error-free, meets requirements, and will meet budget and schedule
  - This works against thorough testing
• Therefore, should the developer do no testing? Should all testing be done independently, with testers getting involved only when developers have finished construction?


Testing Strategies ...

In the past, the only defense against programming errors was careful design and the intelligence of the programmer.

Now we have modern design techniques and formal technical reviews to reduce the number of initial errors in the code.

In Chapter 17 we discussed how to design effective test cases; now we discuss the strategy we use to execute them.

The strategy is developed by the project manager, software engineer, and testing specialists. It may also be mandated by the customer.


Why is Testing Important?

Testing often accounts for more effort than any other software engineering activity.

If done haphazardly, we waste time, waste effort, and errors sneak through.

Therefore we need a systematic approach for testing software.

The work product is a Test Specification (Test Plan).


What is a Test Plan?

- A road map describing the steps to be conducted
- Specifies when the steps are planned and then undertaken
- States how much effort, time, and resources will be required
- Must incorporate test planning, test case design, test execution, and data collection and evaluation

It should be flexible enough for customized testing, but rigid enough for planning and management tracking.


Strategic Issues

- Specify requirements in a quantifiable manner so the requirement can be tested
- State testing objectives explicitly
- Understand potential users and develop profiles
- Develop the testing plan quickly, in increments
- Build robust software with error checking
- Use effective Formal Technical Reviews (FTRs) to find errors early - saves time and money
- Conduct FTRs on the tests and the test strategy
- Develop continuous improvement - collect metrics


Testing Strategy

unit test → integration test → validation test → system test

- Unit test: component level
- Integration test: integrate components
- Validation test: requirements level
- System test: system elements tested as a whole


Verification and Validation

Verification - ensure the software correctly implements the specified function.
"Are we building the product right?"

Validation - ensure the software is traceable to requirements.
"Are we building the right product?"

An Independent Test Group (ITG) performs V&V and works closely with the developer to fix errors as they are found.

The ITG starts at the beginning of the project and continues through the finish.

The ITG reports to an organization apart from the software development group.


Comparison of Testing Types

Test Type   | Uses | Black Box | White Box | SW Dev | Independent Group
Unit        | Code | No        | Yes       | Yes    | No
Integration | Code | Yes       | Yes       | Yes    | No
Validation  | Req  | Yes       | No        | No     | Yes
System      | Req  | Yes       | No        | No     | Yes

Eliminate duplication of testing between different groups to save time and money.


Unit Testing

Test cases are applied to the module to be tested, exercising these aspects:
- interface
- local data structures
- boundary conditions
- independent paths (basis paths)
- error handling paths
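A minimal unit-test sketch covering several of the aspects above (the `clamp` function and its test names are hypothetical, invented here for illustration):

```python
import unittest

def clamp(value, low, high):
    """Hypothetical unit under test: restrict value to the range low..high."""
    if low > high:
        raise ValueError("low must not exceed high")
    return max(low, min(value, high))

class ClampTest(unittest.TestCase):
    def test_boundary_conditions(self):
        # at and just outside each boundary
        self.assertEqual(clamp(0, 1, 10), 1)
        self.assertEqual(clamp(1, 1, 10), 1)
        self.assertEqual(clamp(10, 1, 10), 10)
        self.assertEqual(clamp(11, 1, 10), 10)

    def test_independent_paths(self):
        # below, inside, and above the range exercise each basis path
        self.assertEqual(clamp(-5, 1, 10), 1)
        self.assertEqual(clamp(5, 1, 10), 5)
        self.assertEqual(clamp(50, 1, 10), 10)

    def test_error_handling_path(self):
        # invalid arguments at the interface must fail cleanly
        with self.assertRaises(ValueError):
            clamp(5, 10, 1)

# Run the suite programmatically and collect the result
suite = unittest.defaultTestLoader.loadTestsFromTestCase(ClampTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

Note how a one-function unit keeps the number of cases small, as the next slide observes.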


Unit Test Environment

[Figure: a driver feeds test cases to the module under test; stubs stand in for its subordinate modules; results are collected. The test cases exercise the module's interface, local data structures, boundary conditions, independent paths, and error handling paths.]

Testing is simplified if the unit has only one function (high cohesion) - fewer test cases are needed.


Drivers and Stubs

Driver - a main program that accepts test case data, passes that data to the module, and prints relevant results.

Stub - replaces modules that are subordinate to the unit under test; it uses the subordinate module's interface, may do minimal data manipulation, prints verification of entry, and returns control to the module undergoing testing.

Writing drivers and stubs is overhead.

Sometimes a unit can't be adequately tested with simple overhead software - then wait until integration (drivers and stubs may be used there as well).
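A sketch of the driver/stub arrangement (all names here are illustrative assumptions, not part of any real library):

```python
# Unit under test: computes a total using a subordinate rate-lookup module.
def total_with_tax(amount, item, lookup_rate):
    rate = lookup_rate(item)            # subordinate module's interface
    return round(amount * (1 + rate), 2)

# Stub: honors the subordinate interface, does minimal data manipulation,
# records verification of entry, and returns control to the unit under test.
entries = []
def stub_lookup_rate(item):
    entries.append(item)                # verification of entry
    return 0.05                         # canned datum

# Driver: accepts test-case data, passes it to the module, reports results.
def driver(test_cases):
    return [total_with_tax(amount, item, stub_lookup_rate)
            for amount, item in test_cases]

results = driver([(100.0, "book"), (20.0, "pen")])
# results == [105.0, 21.0]; entries shows the stub was entered for each case
```

The stub isolates the unit from its real subordinate, so interface errors surface here instead of during integration.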


Types of Computation Errors

- Misunderstood or incorrect arithmetic precedence
- Mixed-mode operations
- Incorrect initialization
- Precision inaccuracy
- Incorrect symbolic representation of an expression
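Two of these error types can be seen in a few lines (a hedged illustration; the averaging example is invented here):

```python
a, b = 3, 5

# Misunderstood precedence: this reads as a + (b / 2), not (a + b) / 2
wrong_average = a + b / 2        # 5.5 - looks plausible, is wrong
right_average = (a + b) / 2      # 4.0

# Mixed-mode pitfall: floor division silently discards the fraction
floor_result = 7 // 2            # 3
true_result = 7 / 2              # 3.5
```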


Types of Control Flow Errors

- Comparison of different data types
- Incorrect logical operators or precedence
- Expectation of equality when precision error makes it unlikely
- Incorrect comparison of variables
- Improper or nonexistent loop termination
- Failure to exit
- Improperly modified loop variables
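The "expectation of equality" error is easy to demonstrate with floating point (a minimal sketch):

```python
import math

x = 0.1 + 0.2                        # accumulates a tiny precision error

naive_equal = (x == 0.3)             # False: exact equality is unreliable
robust_equal = math.isclose(x, 0.3)  # True: compare within a tolerance instead
```

A branch guarded by the naive comparison would silently never be taken.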


Error Handling Evaluation

Error conditions must be anticipated, and error handling must reroute or cleanly terminate processing - "antibugging."

Typical antibugging errors:
- Error description is unintelligible
- Error noted doesn't match the error encountered
- Error condition causes system intervention
- Exception condition processing is incorrect
- Error description doesn't provide enough information

Make sure error handling is tested!
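A sketch of antibugging plus a test of the error path (the `parse_age` routine is hypothetical, invented for illustration):

```python
def parse_age(text):
    """Hypothetical input routine with anticipated error conditions."""
    try:
        age = int(text)
    except ValueError:
        # Antibugging: raise an intelligible description that matches the
        # error encountered, instead of letting a low-level exception
        # cause system intervention.
        raise ValueError(f"age must be a whole number, got {text!r}") from None
    if age < 0:
        raise ValueError(f"age must be non-negative, got {age}")
    return age

# Exercise the error-handling paths, not just the happy path
try:
    parse_age("abc")
    message = None               # would indicate the error path was missed
except ValueError as err:
    message = str(err)
```

The test of the error path checks both that the exception fires and that its description is intelligible.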


Integration Testing Strategies

Options:
- The "big bang" approach, OR
- An incremental construction strategy:
  - Top Down
  - Bottom Up
  - Sandwich


What is Integration Testing?

Take unit-tested components and build a program structure by joining the components, while testing to find errors associated with the interfaces between components.

Data can be lost across an interface, one module can have an inadvertent adverse effect on another, etc.

The program is constructed and tested in small increments: errors are easier to isolate, interfaces are more likely to be tested completely, and a systematic test approach is applied.

The software gains maturity as modules are integrated.


Top Down Integration

[Figure: module hierarchy with A at the top and subordinate modules B through G]

- The top module (A) is tested with stubs for B, F, and G.
- Stubs are replaced one at a time with real components, "depth first."
- As new modules are integrated, some subset of tests is re-run - regression testing.

What would be replaced next?


Bottom-Up Integration

[Figure: module hierarchy with A at the top; low-level worker modules form clusters]

- Worker modules are grouped into builds (clusters) that perform a specific subfunction, and integrated.
- Drivers are removed and builds are combined, moving upward one at a time, "depth first."


Top-Down vs Bottom-Up Integration

Top Down:
- Stubs replace low-level modules, which normally supply data
- Therefore may delay some testing (not good)
- Must simulate the actual module in the stub (high overhead)
- Verifies major control early

Bottom Up:
- First integrate the low-level modules that supply data
- The program doesn't exist until the last module is integrated
- Easier test case design
- Doesn't need stubs - needs drivers


Sandwich Testing

[Figure: module hierarchy with A at the top; low-level worker modules form a cluster]

- Top modules are tested with stubs.
- Worker modules are grouped into builds and integrated.


Critical Modules

Identify critical modules, target them for early testing, and focus regression testing on them. Critical modules:
- Address several software requirements
- Have a high level of control (high in the software structure)
- Are complex or error prone (high cyclomatic complexity, V(G))
- Have definite performance requirements


High Order Testing

Validation Test - the Test Plan outlines the classes of tests to be performed; Test Procedures contain the specific test cases.

After each test case runs, it either passes or produces a deviation, which is recorded as a Software Trouble Report (STR).

Resolution of STRs is monitored.

Alpha and Beta Testing - Alpha at the developer's site and Beta at the customer's site.

System Test - tests to verify that system elements have been properly integrated and perform the required functions.

This was performed as a System Level Acceptance Test (SLAT) at IBM.


Debugging: A Diagnostic Process


Debugging Process

Debugging is a consequence of testing.

Debugging effort is a combination of the time required to diagnose the symptom and determine the cause of the error, AND the time required to correct the error and conduct regression tests.

A regression test is a selective re-running of tests to assure that nothing has been broken when a fix or modification was implemented.
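The "selective" part of regression testing can be sketched as a mapping from tests to the modules they cover (the test names and module mapping here are purely illustrative assumptions):

```python
# Each test declares the modules it touches; a fix to one module
# re-runs only the tests that cover it.
TESTS = {
    "test_login":    {"auth", "session"},
    "test_checkout": {"cart", "payment"},
    "test_profile":  {"auth", "profile"},
}

def select_regression_tests(changed_module):
    """Return the subset of tests covering the changed module."""
    return sorted(name for name, modules in TESTS.items()
                  if changed_module in modules)

subset = select_regression_tests("auth")
# subset == ['test_login', 'test_profile']
```

Real tools derive this mapping from coverage data rather than hand-written declarations.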


Symptoms & Causes


Symptom and cause may be geographically remote:
- The symptom may disappear when another error is fixed
- The symptom may be caused by a non-error (e.g. roundoff)
- The symptom may be caused by human error, compiler error, or bad assumptions
- The symptom may be caused by timing problems
- Conditions may be hard to duplicate (real-time applications)
- The symptom may be intermittent - common with embedded systems
- The symptom may be due to causes distributed across a number of tasks on different processors


Consequences of Bugs

Damage ranges across a spectrum, depending on bug type: mild → annoying → disturbing → serious → extreme → catastrophic → infectious.

Bug categories: function-related bugs, system-related bugs, data bugs, coding bugs, design bugs, documentation bugs, standards violations, etc.


Debugging Techniques

- Brute force / testing
- Backtracking
- Cause elimination


Brute Force Debugging

"Let the computer find the error" - memory dumps, run-time traces, WRITE statements all over the program.

The most common and least efficient method for isolating the cause of an error.

Wasted effort and time. Think first!


Backtracking Debugging

Begin at the site where the symptom was uncovered; the source code is traced backward manually until the cause is found.

As the number of lines of code increases, the number of backward paths becomes unmanageably large.

A fairly common debugging approach - successful for small programs.


Cause Elimination - Debugging

Data related to the error occurrence is organized to isolate potential causes.

A "cause hypothesis" is devised, and the data are used to prove or disprove the hypothesis.

Or, a list of all possible causes is developed and tests are run to eliminate each.
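The "list all causes and eliminate each" variant can be sketched in a few lines (the symptom, candidate causes, and their discriminating tests are illustrative assumptions):

```python
# Symptom: a result that should be exactly 7 came out slightly off.
observed = 6.9999999

# Each candidate cause gets a discriminating test; running the tests
# eliminates every cause the data disprove.
candidate_causes = {
    "bad input data": lambda: False,                     # inputs re-verified correct
    "roundoff error": lambda: abs(observed - 7) < 1e-6,  # within precision noise
    "logic error":    lambda: False,                     # traced path matches design
}

surviving = [cause for cause, test in candidate_causes.items() if test()]
# surviving == ["roundoff error"]
```

Only the hypothesis the data cannot disprove survives, which is the point of the method.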


Debugging: Final Thoughts

- Think about the symptom you are seeing.
- Use tools such as a dynamic debugger to gain more insight about the bug.
- Get help from somebody else if you are stuck. Just talking to another person can help you see the cause of the bug.
- Every time you touch existing code, you run the risk of injecting errors. Therefore ALWAYS run regression tests on all fixes.
- Ask the following questions: Is the bug also in another part of the program? How can we prevent the bug in the first place?


Webliography

Check the Webliography for some interesting cases of software bugs, such as the Therac radiation bug, and other information about testing.