1 Software Testing and Quality Assurance Lecture 17 - Test Analysis & Design Models (Chapter 4, A Practical Guide to Testing Object-Oriented Software)

Page 1

Software Testing and Quality Assurance
Lecture 17 - Test Analysis & Design Models (Chapter 4, A Practical Guide to Testing Object-Oriented Software)

Page 2

Lecture Outline

Evaluation criteria and organization of the guided inspection activities.
Preparing for the guided inspection.

Page 3

Guided inspection: evaluation criteria

Correctness is a measure of the accuracy of the model. In analysis, it is the accuracy of the problem description; in design, it is how accurately the model represents the solution to the problem. The model is correct with respect to a set of test cases if every test case produces the expected result.

Page 4

Guided inspection: evaluation criteria

Completeness is a measure of the inclusiveness of the model: are any necessary, or at least useful, elements missing from it? In an iterative, incremental process, completeness is judged relative to how mature the current increment is expected to be. For example: do all objects needed for the sequence diagram come from classes in the class diagram? The model is complete if the result of executing the test cases can be adequately represented using only the content of the model.
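As a concrete illustration, the sequence-diagram/class-diagram check above can be sketched as a simple set difference. The helper name and the model element names below are hypothetical, not drawn from any real UML tool's API:

```python
def missing_classes(sequence_diagram_objects, class_diagram_classes):
    """Return the classes of sequence-diagram objects that are absent
    from the class diagram (a completeness defect in the model)."""
    return sorted(set(sequence_diagram_objects) - set(class_diagram_classes))

# One object in the sequence diagram refers to a class the class
# diagram does not define, so it is flagged.
objects = {"Match", "Player", "Display"}
classes = {"Match", "Player"}
print(missing_classes(objects, classes))  # ['Display']
```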

Page 5

Guided inspection: evaluation criteria (cont.)

Consistency is a measure of whether there are contradictions within the model, or between the current model and the model upon which it is based. Consistency checking can reveal contradictions or conflicts either internal to a single diagram or between two diagrams. The model is inconsistent if there are different representations within the model for similar test cases.

Page 6

Guided inspection: evaluation criteria (cont.)

Other qualities, such as performance goals, define a number of system attributes that the development team might wish to verify. The guided inspection test cases can be reused as scenarios for performance testing.

Page 7

Organization of the guided inspection activity

Basic roles:
Domain expert: the source of the expected results.
Tester: conducts the analysis necessary to select effective test cases.
Developer: the creator of the models under test.

Individual inspections: testers complete a checklist specific to the type of model being inspected. This process can be automated.

Page 8

Preparing for the guided inspection: specifying the inspection

The scope of an inspection is defined by specifying a set of use cases, a set of packages, or a set of abstract classes and interfaces.
The depth of an inspection is defined by specifying the layers in the aggregation hierarchies below which messages are not sent.

Page 9

Preparing for the guided inspection: realistic models

Layered approach: more individual diagrams, but each diagram is sufficiently modular to fit within the scope of a specific inspection.

Page 10

Preparing for the guided inspection: realistic models (layered approach)

Page 11

Preparing for the guided inspection: realistic models (layered approach, cont.)

Page 12

Preparing for the guided inspection: selecting test cases for the inspection

Test cases can be selected to ensure that specific types of coverage are achieved, or to find specific types of defects.

Page 13

Preparing for the guided inspection: selecting test cases for the inspection

Test case selection methods:
Orthogonal defect classification: most likely to identify defects, by covering the different categories of system actions that trigger defects.
Use profiles: give confidence in the reliability of the product by identifying which parts of the program are used the most.
Risk analysis.

Page 14

Preparing for the guided inspection: selecting test cases for the inspection (cont.)

Orthogonal defect classification (ODC): the activities that caused a defect to be detected are classified as triggers. The guided inspection technique uses several of these triggers as a guide for selecting test cases.

Page 15

Preparing for the guided inspection: selecting test cases for the inspection (cont.)

By structuring the guided inspection process so that as many of these triggers as possible are encountered, you ensure that the tests guiding the inspection are more likely to expose as many failures as possible.
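Covering as many triggers as possible with few test cases can be sketched as a greedy set-cover selection. The ODC trigger names and the test-case table below are invented for illustration:

```python
def select_for_trigger_coverage(test_cases):
    """test_cases maps a test-case id to the set of ODC triggers it
    exercises; greedily pick cases until every trigger is covered."""
    uncovered = set().union(*test_cases.values())
    selected = []
    while uncovered:
        # Pick the case covering the most still-uncovered triggers.
        best = max(test_cases, key=lambda tc: len(test_cases[tc] & uncovered))
        selected.append(best)
        uncovered -= test_cases[best]
    return selected

cases = {
    "TC1": {"design conformance", "concurrency"},
    "TC2": {"lateral compatibility"},
    "TC3": {"design conformance", "lateral compatibility"},
}
print(select_for_trigger_coverage(cases))  # ['TC1', 'TC2']
```

Greedy set cover is not guaranteed to be minimal, but it is a simple, practical way to keep the inspection session short while still touching every trigger category.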

Page 16

Preparing for the guided inspection: selecting test cases for the inspection (cont.)

Use profiles: a use profile for a system is an ordering of the individual test cases based on a combination of the frequency and criticality values of the individual use cases.
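A minimal sketch of such an ordering, assuming low/medium/high ratings mapped to numeric weights and combined by multiplication (the use-case names, ratings, and weighting scheme are illustrative assumptions, not from the book):

```python
# Assumed mapping of qualitative ratings to numeric weights.
RATING = {"low": 1, "medium": 2, "high": 3}

def use_profile(test_cases):
    """test_cases: list of (name, frequency, criticality) tuples.
    Return the names ordered from most to least important to test."""
    return [name for name, freq, crit in
            sorted(test_cases,
                   key=lambda tc: RATING[tc[1]] * RATING[tc[2]],
                   reverse=True)]

cases = [
    ("start game", "low", "high"),    # weight 3
    ("move piece", "high", "high"),   # weight 9
    ("show help", "medium", "low"),   # weight 2
]
print(use_profile(cases))  # ['move piece', 'start game', 'show help']
```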

Page 17

Preparing for the guided inspection: selecting test cases for the inspection (cont.)

Risk as a test case selector: risk can be used as the basis for test case selection. It is useful during development, but not appropriate after development, when we are trying to achieve some measure of reliability of the software; at that point the use profile technique is better, because it reflects the way the software will actually be used.

Page 18

Preparing for the guided inspection: creating test cases

Use case scenario: the path taken.
Alternative paths: several scenarios that differ from the use scenario but represent valid executions.
Exceptional paths: scenarios that result in error conditions.

Page 19

Preparing for the guided inspection: an example of a use case

Use case #1
Actor: Player
Use scenario: The user selects the play option from the menu. The system responds by starting the game.
Alternate paths: If a match is already in progress, the selection is ignored.
Exceptional cases: If the match cannot open the display, an error message is displayed and the game aborts.
Frequency: Low
Criticality: High
Risk: Medium
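A use case written this way can be captured as a simple record, so that its risk rating (during development) or its frequency and criticality (afterwards) can drive test-case selection. The dataclass and the second use case below are illustrative assumptions, not part of the book's example:

```python
from dataclasses import dataclass

@dataclass
class UseCase:
    number: int
    actor: str
    frequency: str
    criticality: str
    risk: str

# The use case from the slide, plus a second (invented) one.
play = UseCase(1, "Player", "low", "high", "medium")
quit_game = UseCase(2, "Player", "low", "low", "high")

LEVEL = {"low": 1, "medium": 2, "high": 3}

def risk_first(use_cases):
    """Order use cases by risk, highest first (risk-based selection
    during development)."""
    return sorted(use_cases, key=lambda uc: LEVEL[uc.risk], reverse=True)

print([uc.number for uc in risk_first([play, quit_game])])  # [2, 1]
```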

Page 20

Testing specific types of models

The level of detail in the model becomes greater as development proceeds, and the amount of information also increases. The exact interpretation of the evaluation criteria can be made more specific for a particular model. The membership of the inspection team changes for different models.

Page 21

Key points

Evaluation criteria:
Correctness
Completeness
Consistency
Other qualities

Test case selection methods:
Orthogonal defect classification
Use profiles
Risk