Page 1: Software Testing

Software Testing

Reference: Software Engineering, Ian Sommerville, 6th edition, Chapter 20

Page 2: Software Testing

Topics Covered
The testing process
Defect testing
• Black box testing
  » Equivalence partitions
• White box testing
  » Equivalence partitions
  » Path testing
Integration testing
• Top-down, bottom-up, sandwich, thread
• Interface testing
• Stress testing
Object-oriented testing

Page 3: Software Testing

Testing Goal
The goal of testing is to discover defects in programs.
A successful test is a test that causes a program to behave in an anomalous way.
Tests show the presence, not the absence, of defects.
Test planning should be continuous throughout the software development process.
Note that testing is the only validation technique for non-functional requirements.

Page 4: Software Testing

The V-model of Testing

[Diagram: the V-model. Each development stage (requirements specification, system specification, system design, detailed design) feeds a corresponding test plan (acceptance test plan, system integration test plan, sub-system integration test plan) and module and unit code and test; the matching test activities (sub-system integration test, system integration test, acceptance test) follow in reverse order, ending in service.]

Page 5: Software Testing

The Testing Process
Component (unit) testing
• Testing of individual program components (e.g., functions, methods, or classes)
• Usually the responsibility of the component developer (except sometimes for critical systems)
Integration and system testing
• Testing of groups of components integrated to create a sub-system or the entire system
• Usually the responsibility of an independent testing team
• Tests are based on the system specification
Acceptance testing
• Run in the presence of the customer or by the customer
• Used to validate all system requirements

Page 6: Software Testing

The Testing Process (con’t)
Test data – Inputs that have been devised to conduct the particular test
Expected output – Recorded before the test is conducted
Actual output – The output actually received
Pass/fail criteria – Criteria for deciding whether the test passed or failed when the actual results are compared to the expected results; determined before the test is conducted

Page 7: Software Testing

Black Box Testing
An approach to testing in which the program is treated as a “black box” (i.e., one cannot “see” inside it)
Test cases are based on the system specification, not on the internal workings (e.g., algorithms) of the program.
Use equivalence partitions when conducting black box testing.

Page 8: Software Testing

Equivalence Partitioning
Input data and output results often fall into different classes where all members of a class are related. Examples:
• positive (or negative) numbers
• strings with (or without) blanks
Each of these classes is an equivalence partition: the program behaves in an equivalent way for every member of the class.
Test cases should be chosen from each partition, especially at the boundaries.

Page 9: Software Testing

Equivalence Partitions Example
The system accepts 4 to 10 inputs, each a 5-digit integer greater than 10,000.
Partition the system inputs into groups (partitions) that should cause equivalent behavior. Include both valid and invalid inputs.
• If an input is a 5-digit integer between 10,000 and 99,999, the equivalence partitions are: < 10,000; 10,000 - 99,999; > 99,999
Choose test cases at the boundaries of these partitions:
• 9,999   10,000   99,999   100,000

Page 10: Software Testing

Equivalence Partitions Example

[Diagram: two partition charts.
Input values: partitions are less than 10000, between 10000 and 99999, more than 99999; chosen test values are 9999, 10000, 50000, 99999, 100000.
Number of input values: partitions are less than 4, between 4 and 10, more than 10; chosen test values are 3, 4, 7, 10, 11.]
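To make the boundary choices concrete, here is a minimal sketch (not taken from the slides). The validator acceptInputs and its behavior are assumptions; it simply encodes the stated rule of 4 to 10 values, each between 10,000 and 99,999, so that the boundary test cases from the partitions above can be executed.

// A minimal sketch: acceptInputs() is a hypothetical validator encoding the stated
// rule (4 to 10 values, each in 10000..99999), used to run the boundary-value tests
// drawn from the equivalence partitions above.
import java.util.List;

public class EquivalencePartitionExample {

    // Hypothetical component under test.
    static boolean acceptInputs(List<Integer> values) {
        if (values.size() < 4 || values.size() > 10) {
            return false;                          // wrong number of inputs
        }
        for (int v : values) {
            if (v < 10000 || v > 99999) {
                return false;                      // value outside the valid partition
            }
        }
        return true;
    }

    public static void main(String[] args) {
        // Value partitions: < 10000, 10000..99999, > 99999 (test at the boundaries).
        assert !acceptInputs(List.of(9999, 10000, 50000, 99999));    // 9999 just below the range
        assert  acceptInputs(List.of(10000, 50000, 99999, 10001));   // boundary values 10000 and 99999
        assert !acceptInputs(List.of(100000, 10000, 50000, 99999));  // 100000 just above the range

        // Count partitions: < 4, 4..10, > 10 (test at the boundaries).
        assert !acceptInputs(List.of(10000, 20000, 30000));                 // 3 values: too few
        assert  acceptInputs(List.of(10000, 20000, 30000, 40000));          // 4 values: lower boundary
        assert  acceptInputs(List.of(10000, 11000, 12000, 13000, 14000,
                                     15000, 16000, 17000, 18000, 19000));   // 10 values: upper boundary
        assert !acceptInputs(List.of(10000, 11000, 12000, 13000, 14000, 15000,
                                     16000, 17000, 18000, 19000, 20000));   // 11 values: too many

        System.out.println("Boundary-value checks passed (run with java -ea).");
    }
}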

Page 11: Software Testing

Derivation of Test Cases: Some Practice

procedure Search (Key: ELEM; T: ELEM_ARRAY; Found: in out BOOLEAN; L: in out ELEM_INDEX);

Pre-condition:
-- the array has at least one element

Post-condition:
-- the element is found and is referenced by L
-- or
-- the element is not in the array

Page 12: Software Testing

What Tests Should We Use? Equivalence Partitions

Page 13: Software Testing

Search Routine Test Cases

Array                 Element
Single value          In sequence
Single value          Not in sequence
More than 1 value     First element in sequence
More than 1 value     Last element in sequence
More than 1 value     Middle element in sequence
More than 1 value     Not in sequence

Input sequence (T)            Key (Key)    Output (Found, L)
17                            17           true, 1
17                            0            false, ??
17, 29, 21, 23                17           true, 1
41, 18, 9, 31, 30, 16, 45     45           true, 7
17, 18, 21, 23, 29, 41, 38    23           true, 4
21, 23, 29, 33, 38            25           false, ??
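The table above can be turned directly into executable checks. Below is a hedged sketch: the slides do not give the routine's implementation, so a simple linear search consistent with the pre- and post-conditions is assumed, returning a 1-based index as in the Output (Found, L) column.

// Sketch only: a linear search consistent with the Search pre/post-conditions,
// plus the test cases from the table above. The implementation is an assumption.
public class SearchRoutineTests {

    // Returns the 1-based index of key in t, or 0 if key is not present.
    static int search(int key, int[] t) {
        for (int i = 0; i < t.length; i++) {
            if (t[i] == key) {
                return i + 1;            // found: L = i + 1 (1-based, as in the table)
            }
        }
        return 0;                        // not found: Found = false, L undefined
    }

    public static void main(String[] args) {
        assert search(17, new int[]{17}) == 1;                          // single value, in sequence
        assert search(0,  new int[]{17}) == 0;                          // single value, not in sequence
        assert search(17, new int[]{17, 29, 21, 23}) == 1;              // first element in sequence
        assert search(45, new int[]{41, 18, 9, 31, 30, 16, 45}) == 7;   // last element in sequence
        assert search(23, new int[]{17, 18, 21, 23, 29, 41, 38}) == 4;  // middle element in sequence
        assert search(25, new int[]{21, 23, 29, 33, 38}) == 0;          // not in sequence
        System.out.println("Search routine test cases passed (run with java -ea).");
    }
}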

Page 14: Software Testing

White Box Testing
Sometimes called structural testing
Derivation of test cases according to program structure (one can “see” inside)
Objective is to exercise all program statements at least once
Usually applied to relatively small program units such as functions or class methods

Page 15: Software Testing

Binary Search Routine
Assume that we now know that the search routine is a binary search. Any new tests?

Page 16: Software Testing

Binary Search Test Cases

Page 17: Software Testing

Path Testing
The objective of path testing is to ensure that the set of test cases is such that each path through the program is executed at least once.
The starting point for path testing is a program flow graph.

Page 18: Software Testing

Binary search flow graph

[Flow graph: nodes 1 through 9. Node 2 is the loop test "while bottom <= top", node 3 the test "if (elemArray[mid] == key)", and node 4 the test "if (elemArray[mid] < key)"; the two branches rejoin and return to node 2, and the graph exits through nodes 8 and 9 when the key is found or bottom > top.]

Page 19: Software Testing

Independent Paths
• 1, 2, 8, 9
• 1, 2, 3, 8, 9
• 1, 2, 3, 4, 6, 7, 2
• 1, 2, 3, 4, 5, 7, 2
Test cases should be derived so that all of these paths are executed (see the sketch below).
A dynamic program analyser (e.g., a test coverage tool) may be used to check that paths have been executed.
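A sketch of a binary search written to match the flow-graph fragments (while bottom <= top, elemArray[mid] == key, elemArray[mid] < key) is shown below, together with test cases intended to drive the four independent paths. The node comments and the mapping of tests to paths are assumptions based on the graph, not given in the slides.

// Sketch: binary search structured to match the flow graph, with test cases
// chosen so that each of the four independent paths is exercised at least once.
public class BinarySearchPathTests {

    // Returns the 0-based index of key in the sorted array elemArray, or -1 if absent.
    static int binarySearch(int[] elemArray, int key) {
        int bottom = 0;                                  // node 1
        int top = elemArray.length - 1;
        while (bottom <= top) {                          // node 2
            int mid = (bottom + top) / 2;
            if (elemArray[mid] == key) {                 // node 3
                return mid;                              // found: exit via nodes 8, 9
            }
            if (elemArray[mid] < key) {                  // node 4
                bottom = mid + 1;                        // node 6: search upper half
            } else {
                top = mid - 1;                           // node 5: search lower half
            }                                            // node 7: back to node 2
        }
        return -1;                                       // bottom > top: exit via nodes 8, 9
    }

    public static void main(String[] args) {
        // Path 1, 2, 8, 9: empty array, the loop body is never entered.
        assert binarySearch(new int[]{}, 7) == -1;
        // Path 1, 2, 3, 8, 9: key found on the first comparison (middle element).
        assert binarySearch(new int[]{1, 7, 9}, 7) == 1;
        // Path 1, 2, 3, 4, 6, 7, 2, ...: middle element smaller than key on the first pass.
        assert binarySearch(new int[]{1, 7, 9}, 9) == 2;
        // Path 1, 2, 3, 4, 5, 7, 2, ...: middle element larger than key on the first pass.
        assert binarySearch(new int[]{1, 7, 9}, 1) == 0;
        System.out.println("Path tests passed (run with java -ea).");
    }
}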

Page 20: Software Testing

Cyclomatic Complexity
The minimum number of tests needed to test all statements equals the cyclomatic complexity.
CC = number_edges – number_nodes + 2
In the case of no goto’s, CC = number_decisions + 1
Although all paths are executed, all combinations of paths are not executed.
Some paths may be impossible to test.
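As a worked check against the binary search flow graph (counting the edges implied by the four independent paths, which is an assumption about the figure):

CC = number_edges – number_nodes + 2 = 11 – 9 + 2 = 4
CC = number_decisions + 1 = 3 + 1 = 4   (the while loop plus the two if statements)

Both calculations agree with the four independent paths listed on the previous slide.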

Page 21: Software Testing

Integration Testing
Tests the complete system or subsystems composed of integrated components
Integration testing is black box testing, with tests derived from the requirements and design specifications.
The main difficulty is localizing errors.
Incremental integration testing reduces this problem.

Page 22: Software Testing

Incremental Integration Testing

[Diagram: three test sequences. Test sequence 1 integrates components A and B and runs tests T1 to T3; test sequence 2 adds component C and runs T1 to T4; test sequence 3 adds component D and runs T1 to T5, re-running the earlier tests at each step.]

Page 23: Software Testing

Approaches to Integration Testing
Top-down testing
• Start with the high-level system and integrate from the top down, replacing individual components with stubs where appropriate (see the stub sketch after this list)
Bottom-up testing
• Integrate individual components in levels until the complete system is created
In practice, most integration testing involves a combination of both of these strategies:
• Sandwich testing (outside-in)
• Thread testing
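As a minimal illustration of the stub idea (the component names and interfaces below are invented for the sketch, not taken from the slides): a high-level ReportGenerator is tested before its real data source exists by substituting a stub that returns canned values.

// Sketch of top-down integration: the high-level component is exercised while a
// lower-level component it depends on is replaced by a stub with canned behavior.
import java.util.List;

public class TopDownStubExample {

    // Lower-level interface that the real subsystem will eventually implement.
    interface DataSource {
        List<Integer> readValues();
    }

    // Stub standing in for the not-yet-integrated component.
    static class DataSourceStub implements DataSource {
        @Override
        public List<Integer> readValues() {
            return List.of(10, 20, 30);   // canned data chosen for the test
        }
    }

    // High-level component under test.
    static class ReportGenerator {
        private final DataSource source;
        ReportGenerator(DataSource source) { this.source = source; }

        int total() {
            return source.readValues().stream().mapToInt(Integer::intValue).sum();
        }
    }

    public static void main(String[] args) {
        ReportGenerator generator = new ReportGenerator(new DataSourceStub());
        assert generator.total() == 60;   // behavior verified against the stub's canned data
        System.out.println("Top-down test with stub passed (run with java -ea).");
    }
}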

Page 24: Software Testing

Top-down Testing

[Diagram: the testing sequence starts with the Level 1 components, with the Level 2 components replaced by Level 2 stubs; as integration proceeds downward, the Level 2 components are tested against Level 3 stubs, and so on.]

Page 25: Software Testing

Bottom-up Testing

[Diagram: the testing sequence starts with the Level N components exercised by test drivers; the Level N–1 components that use them are then integrated and tested with their own test drivers, and so on upward.]

Page 26: Software Testing

Method “Pro’s”
Top-down:
Bottom-up:
Sandwich:
Thread:

Page 27: Software Testing

Interface Testing
Interface misuse
• A calling component calls another component and makes an error in the use of its interface:
  » parameters in the wrong order
  » parameter(s) of the wrong type
  » incorrect number of parameters
Interface misunderstanding
• A calling component embeds incorrect assumptions about the behavior of the called component:
  » binary search function called with an unordered array
  » wrong flag “number”
  » other preconditions that are violated

Page 28: Software Testing

Interface Types
Parameter interfaces
• Data passed from one procedure to another
  » functions
Shared memory interfaces
• A block of memory is shared between procedures
  » global data
Message passing interfaces
• Sub-systems request services from other sub-systems
  » OO systems
  » client-server systems

Page 29: Software Testing

Some Interface Testing Guidelines
Design tests so that parameters to a called procedure are at the extreme ends of their ranges (boundaries).
Always test pointer parameters with null pointers.
Design tests that cause the component to fail (e.g., by violating preconditions); see the sketch below.
In shared memory systems, vary the order in which components are activated.
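A brief sketch of these guidelines applied to a single routine. The routine average, its precondition, and its failure behavior are assumptions made for the example: tests pass a null pointer, violate the precondition with an empty array, and push parameters to the extreme ends of their range.

// Sketch: interface tests for a hypothetical routine with the precondition
// "the array has at least one element".
public class InterfaceTestSketch {

    // Hypothetical component under test.
    static double average(int[] values) {
        if (values == null || values.length == 0) {
            throw new IllegalArgumentException("values must contain at least one element");
        }
        long sum = 0;                      // long accumulator avoids int overflow at the extremes
        for (int v : values) sum += v;
        return (double) sum / values.length;
    }

    // Helper: true if the call throws a runtime exception.
    static boolean fails(Runnable call) {
        try { call.run(); return false; } catch (RuntimeException e) { return true; }
    }

    public static void main(String[] args) {
        assert fails(() -> average(null));                     // null pointer parameter
        assert fails(() -> average(new int[]{}));              // precondition violated (empty array)
        assert average(new int[]{Integer.MAX_VALUE,            // parameters at the extreme ends
                                 Integer.MAX_VALUE}) == (double) Integer.MAX_VALUE;
        System.out.println("Interface tests passed (run with java -ea).");
    }
}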

Page 30: Software Testing

Stress Testing
Exercises the system beyond its maximum design load:
• exceed string lengths
• store/manipulate more data than the specification allows
• load the system with more users than the specification allows
Stressing the system often causes defects to come to light.
Systems should not fail catastrophically; stress testing checks for unacceptable loss of service or data.
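A minimal sketch of the "more users than the specification allows" idea. The service, its assumed design load of 100 concurrent users, and the counter-based check are all inventions for illustration: many concurrent clients drive the component, and the test then checks that no request was silently lost.

// Sketch of a stress test: exceed the assumed design load and check for loss of
// service or data (every submitted request must be accounted for).
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public class StressTestSketch {

    // Hypothetical component under test: counts the requests it has handled.
    static class Service {
        private final AtomicInteger handled = new AtomicInteger();
        void handleRequest() { handled.incrementAndGet(); }
        int handledCount() { return handled.get(); }
    }

    public static void main(String[] args) throws InterruptedException {
        final int designLoad = 100;              // assumed maximum design load (users)
        final int stressUsers = designLoad * 5;  // deliberately exceed it
        Service service = new Service();

        ExecutorService pool = Executors.newFixedThreadPool(stressUsers);
        for (int i = 0; i < stressUsers; i++) {
            pool.submit(service::handleRequest);
        }
        pool.shutdown();
        pool.awaitTermination(30, TimeUnit.SECONDS);

        // No catastrophic failure here means: no request silently dropped.
        assert service.handledCount() == stressUsers;
        System.out.println("Handled " + service.handledCount() + " of " + stressUsers + " requests.");
    }
}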

Page 31: Software Testing

Object-oriented Testing
The components to be tested are classes that are instantiated as objects.
There is no obvious “top” or “bottom” to the system for top-down or bottom-up integration and testing.
Levels:
• testing class methods
• testing the class as a whole
• testing clusters of cooperating classes
• testing the complete OO system

Page 32: Software Testing

Class Testing
Complete test coverage of a class involves:
• testing all operations associated with an object
• setting and interrogating all object attributes
• exercising the object in all possible states, i.e., testing all events that cause a state (attribute) change in the object (see the sketch below)
Inheritance makes it more difficult to design class tests because the information to be tested is not localized.
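A hedged sketch of what "all operations, all attributes, all states" can look like for a small class. The WeatherStation class below, its two states, and its operations are invented for the example; only the class name echoes the sequence diagram later in the slides.

// Sketch of class testing: every operation is called, the attributes are set and
// interrogated, and every state transition (OFF <-> RUNNING) is exercised.
public class WeatherStationClassTest {

    enum State { OFF, RUNNING }

    // Hypothetical class under test.
    static class WeatherStation {
        private State state = State.OFF;   // attribute holding the object's state
        private int identifier;

        void setIdentifier(int id) { this.identifier = id; }
        int getIdentifier()        { return identifier; }

        void startup() {
            if (state != State.OFF) throw new IllegalStateException("already running");
            state = State.RUNNING;         // event causing a state change
        }
        void shutdown() {
            if (state != State.RUNNING) throw new IllegalStateException("not running");
            state = State.OFF;
        }
        State getState() { return state; }
    }

    public static void main(String[] args) {
        WeatherStation ws = new WeatherStation();

        // Set and interrogate the attribute.
        ws.setIdentifier(42);
        assert ws.getIdentifier() == 42;

        // Exercise all states and all state-changing events.
        assert ws.getState() == State.OFF;
        ws.startup();
        assert ws.getState() == State.RUNNING;
        ws.shutdown();
        assert ws.getState() == State.OFF;

        // Events that are invalid in the current state should be rejected.
        boolean rejected = false;
        try { ws.shutdown(); } catch (IllegalStateException e) { rejected = true; }
        assert rejected;
        System.out.println("Class tests passed (run with java -ea).");
    }
}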

Page 33: Software Testing

Object Integration
Levels of integration are less distinct in object-oriented systems.
Cluster testing is concerned with integrating and testing clusters of cooperating objects.
Identify clusters using knowledge of the operation of objects and of the system features that are implemented by these clusters.

Page 34: Software Testing

Approaches to Cluster Testing
Use case or scenario testing
• Testing is based on user (actor) interactions with the system
• Has the advantage that it tests system features as experienced by users
Thread testing
• Tests the system’s response to events as processing threads through the system
  » e.g., a button click

Page 35: Software Testing

Scenario-based Testing
Identify scenarios from use cases and supplement these with sequence diagrams that show the objects involved in the scenario.

Page 36: Software Testing

Sample Sequence Diagram: CommsController

[Sequence diagram involving CommsController, :WeatherStation, and :WeatherData; the messages exchanged include request(report), acknowledge(), report(), summarise(), send(report), and reply(report).]
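To connect the scenario to executable form, here is a heavily hedged sketch of a scenario-based test for the "request a report" interaction. The object names and message names come from the diagram, but all of the interfaces, signatures, and the order of calls are assumptions made for the example.

// Sketch of a scenario (cluster) test: drive the request(report) scenario through
// a cluster of cooperating objects and check the externally visible outcome.
public class ReportScenarioTest {

    // Hypothetical collaborators loosely modeled on the sequence diagram.
    static class WeatherData {
        String summarise() { return "summary"; }        // stands in for the summarise() message
    }

    static class WeatherStation {
        private final WeatherData data;
        WeatherStation(WeatherData data) { this.data = data; }
        String report() { return data.summarise(); }     // report() delegates to WeatherData
    }

    static class CommsController {
        private final WeatherStation station;
        CommsController(WeatherStation station) { this.station = station; }
        String request(String what) {
            // request(report) -> report() -> summarise() -> reply(report)
            return "report".equals(what) ? station.report() : null;
        }
    }

    public static void main(String[] args) {
        CommsController controller =
                new CommsController(new WeatherStation(new WeatherData()));
        // The scenario's observable result: a summary comes back for a report request.
        assert "summary".equals(controller.request("report"));
        System.out.println("Scenario test passed (run with java -ea).");
    }
}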