Budapest University of Technology and Economics Department of Measurement and Information Systems Verification & Validation: Overview, Requirement-based testing Systems Engineering BSc Course

Verification & Validation: Overview


Page 1: Verification & Validation: Overview

Budapest University of Technology and Economics
Department of Measurement and Information Systems

Verification & Validation: Overview, Requirement-based testing

Systems Engineering BSc Course

Page 2: Verification & Validation: Overview

Traceability, Verification and Validation

Platform-based systems design

Functional model

Platform model

Architecture model

Config. model

Component behav. model

Source code Config. file

Binary code

Compiler Linker

HW/SW allocation

Code generation

HW library

Requirements

Fault tolerance & safety


Page 3: Verification & Validation: Overview

Learning Objectives

V&V overview

• List typical V&V activities

• Classify verification techniques according to their place in the lifecycle

Requirement-based testing

• Recall basic testing concepts

• Describe the goal of specification-based test design techniques

• Use basic test design techniques


Page 4: Verification & Validation: Overview

Overview of V&V techniques

Page 5: Verification & Validation: Overview

Typical steps in development lifecycle


Requirement analysis

System specification

Architecture design

Module design

Module implementation

System integration

System delivery

Operation, maintenance

Schedule and sequencing depend on the lifecycle model!

System engineer

Architect

Developer, coder

Test engineer

Page 6: Verification & Validation: Overview

Requirement analysis

Task: Defining functions, actors, use cases
V&V criteria: Risks; Criticality
V&V techniques: Checklists; Failure mode and effects analysis

Page 7: Verification & Validation: Overview

System specification

Task: Defining functional and non-functional requirements
V&V criteria: Completeness; Unambiguity; Verifiability; Feasibility
V&V techniques: Reviews; Static analysis; Simulation

Page 8: Verification & Validation: Overview

Architecture design

Tasks: Decomposing modules; HW-SW co-design; Designing communication
V&V criteria: Function coverage; Conformance of interfaces; Non-functional properties
V&V techniques: Static analysis; Simulation; Performance, dependability, security analysis

Page 9: Verification & Validation: Overview

Module design (detailed design)

Task: Designing detailed behavior (data structures, algorithms)
V&V criteria: Correctness of critical internal algorithms and protocols
V&V techniques: Static analysis; Simulation; Formal verification; Rapid prototyping

Page 10: Verification & Validation: Overview

Module implementation

Task: Software implementation
V&V criteria: Code is safe, verifiable, maintainable
V&V techniques: Coding conventions; Code reviews; Static code analysis

Task: Verifying module implementation
V&V criteria: Conformance to module designs
V&V techniques: Unit testing; Regression testing

Page 11: Verification & Validation: Overview

System integration

Tasks: Integrating modules; Integrating SW with HW
V&V criteria: Conformance of integrated behavior; Verifying communication
V&V technique: Integration testing (incremental)

Page 12: Verification & Validation: Overview

System delivery and deployment

Task: Assembling the complete system
V&V criteria: Conformance to the system specification
V&V techniques: System testing; Measurements, monitoring

Task: Fulfilling user expectations
V&V criteria: Conformance to requirements and expectations
V&V techniques: Validation testing; Acceptance testing; Alpha/beta testing

(Image sources: video and radar test (Bosch); Consumer Reports)

Page 13: Verification & Validation: Overview

Operation and maintenance

Tasks during operation and maintenance:
- Failure logging and analysis (for failure prediction)
- V&V of modifications

Each modification goes through a mini-lifecycle of its own.

Page 14: Verification & Validation: Overview

V&V in the V-model

The V-model pairs each development step with a corresponding verification or validation step, and the related tests are designed already during development:

- Requirement analysis ↔ System validation (system validation design)
- System specification ↔ System verification (system test design)
- Architecture design ↔ System integration (integration test design)
- Module design ↔ Module verification (module test design)
- Module implementation forms the bottom of the V; operation and maintenance follow system validation.

Not just after coding: V&V in each step!

Page 15: Verification & Validation: Overview

Basic V&V Concepts

Recap from Software Engineering course


Page 16: Verification & Validation: Overview


V&V techniques

Static:
• What: any artefact (documentation, model, code)
• How: without execution
• E.g.: review, static analysis

Dynamic:
• What: executable artefacts (model, code…)
• How: with execution
• E.g.: simulation, testing

Page 17: Verification & Validation: Overview

Basic concepts

▪ SUT: system under test

▪ Test case

o A set of test inputs, execution conditions, and expected results developed for a particular objective

▪ Test suite

▪ Test oracle

o A principle or mechanism that helps you decide whether the program passed the test

▪ Verdict: result (pass / fail / error / inconclusive…)

(Figure: specification and requirements drive test case design; test execution produces verdicts.)
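These concepts can be illustrated with a minimal sketch (the SUT `clamp`, the test data, and all names below are invented for illustration): each test case pairs inputs with an expected result, and the oracle compares the actual output against it to produce a verdict.

```python
# Minimal sketch of test case / suite / oracle / verdict (names invented).

def clamp(value, low, high):
    """SUT: a tiny function that clamps value into [low, high]."""
    return max(low, min(value, high))

# Test suite: each test case = inputs + expected result for one objective.
test_suite = [
    {"name": "below range", "inputs": (-5, 0, 10), "expected": 0},
    {"name": "inside range", "inputs": (7, 0, 10), "expected": 7},
    {"name": "above range", "inputs": (99, 0, 10), "expected": 10},
]

def run(suite):
    verdicts = {}
    for tc in suite:
        try:
            actual = clamp(*tc["inputs"])
            # Oracle: compare the actual output with the expected result.
            verdicts[tc["name"]] = "pass" if actual == tc["expected"] else "fail"
        except Exception:
            verdicts[tc["name"]] = "error"  # SUT raised: verdict "error"
    return verdicts

print(run(test_suite))
```

Note how the oracle here is trivially an equality check against precomputed expected values; in practice obtaining a reliable oracle is itself a hard problem.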


Page 18: Verification & Validation: Overview


Problems and tasks

▪ Test selection

o What test inputs and test data to use?

▪ Oracle problem

o How to get/create reliable oracle?

▪ Exit criteria

o How long to test?

▪ Testability

o Observability + controllability

Page 19: Verification & Validation: Overview


Case study: AUTOSAR Acceptance Tests

Source: AUTOSAR ATS Overview, AUTOSAR ATS RTE


Page 20: Verification & Validation: Overview

AUTOSAR concepts (recap)


Page 21: Verification & Validation: Overview

AUTOSAR Acceptance Tests

▪ System-level tests based on specification

▪ Checks externally visible functionality

o at application level and at bus level

▪ Acceptance Test Specifications (ATS)

▪ Test suites for different specifications

o Communication (CAN, FlexRay…), Memory stack, Runtime Environment [RTE]…


Page 22: Verification & Validation: Overview

Example: AUTOSAR ATS RTE

▪ Tests RTE functionality

▪ 5 features, 68 test cases, 251 pages (!)

Feature: RTE Client Server Communication

▪ General Test Objectives: cover the Client Server feature of the RTE [RS_BRF_01312]


Page 23: Verification & Validation: Overview

Requirements and specification to test

[RS_BRF_01312] AUTOSAR RTE shall support calling of subroutines (client/server call, including remote procedure calls).

[SRS_Rte_00029] The RTE shall support multiple-client-single-server ("n:1") client-server (function invocation) communication.

[SWS_Rte_04516] The RTE’s implementation of the client-server communication shall ensure that a service result is dispatched to the correct client if more than one client uses a service.


(Each lower-level requirement refines the one above it.)

How can we test this functionality?
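As a sketch of one possible answer, the requirement [SWS_Rte_04516] can be modelled in miniature outside AUTOSAR. The `Server` class and client ids below are invented stand-ins, not the RTE API; the point is what the test checks: with two clients of one server, each result must reach the client that requested it.

```python
# Simplified model of n:1 client-server communication (not the AUTOSAR RTE
# API; all names invented). Two clients call one server; the test checks
# that each result is dispatched back to the requesting client, in the
# spirit of [SWS_Rte_04516].

class Server:
    def call(self, client_id, operand):
        # The result is tagged with the requesting client's id.
        return (client_id, operand * 2)

def test_result_dispatch():
    server = Server()
    results = {}
    # Two clients use the same service with different data.
    for client_id, operand in [("client_A", 10), ("client_B", 25)]:
        recipient, result = server.call(client_id, operand)
        results[recipient] = result
    # Oracle: each client received the result of its own request.
    assert results["client_A"] == 20
    assert results["client_B"] == 50
    return "pass"

print(test_result_dispatch())
```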

Page 24: Verification & Validation: Overview

What is needed to define a test

▪ Test architecture

o SUT, simulated components, test drivers and stubs…

▪ Test configuration and data

o Parameters, message data…

▪ Test cases

o Test goal, pre-conditions, sequence of steps (input + expected output), post-conditions…
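These ingredients can be sketched as plain data structures; the field names and example values below are invented for illustration, not the AUTOSAR ATS notation.

```python
# Sketch: representing the ingredients of a test definition as data
# (field names and values invented, not the AUTOSAR ATS format).

from dataclasses import dataclass, field

@dataclass
class TestStep:
    stimulus: str   # input applied to the SUT
    expected: str   # expected observable output

@dataclass
class TestCase:
    goal: str
    preconditions: list = field(default_factory=list)
    steps: list = field(default_factory=list)
    postconditions: list = field(default_factory=list)

tc = TestCase(
    goal="Server result reaches the calling client",
    preconditions=["RTE initialized", "server component running"],
    steps=[TestStep("client_A calls op(10)", "client_A receives result"),
           TestStep("client_B calls op(25)", "client_B receives result")],
    postconditions=["no pending requests"],
)
print(tc.goal, "-", len(tc.steps), "steps")
```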


Page 25: Verification & Validation: Overview

Test architecture

(Figure: the Software Components for testing form the System Under Test; a Point of Control and Observation is used for starting and controlling the tests.)

Page 26: Verification & Validation: Overview

Test configuration (excerpt)


Page 27: Verification & Validation: Overview

Test case


Page 28: Verification & Validation: Overview

Test case (cont’d)


Page 29: Verification & Validation: Overview

Test case (cont’d)


Page 30: Verification & Validation: Overview

Specification-based test design

Page 31: Verification & Validation: Overview

Test design techniques


Goal: Select test cases based on test objectives

Specification-based:
• SUT: black box
• Only the specification is known
• Testing specified functionality

Structure-based:
• SUT: white box
• Inner structure is known
• Testing based on internal behavior

Page 32: Verification & Validation: Overview

Specification-based techniques


1. Equivalence classes

2. Boundary values

3. Decision tables

4. Combinatorial testing

5. Based on use cases

Page 33: Verification & Validation: Overview

Equivalence class partitioning

▪ Input and output equivalence classes:

o Data that are expected to cover the same faults (cover the same part of the program)

o Goal: each equivalence class is represented by one test input (selected test data)

▪ Highly context-dependent

o Requires knowledge of the domain and the SUT!

o Depends on the skills and experience of the tester


Page 34: Verification & Validation: Overview

Selecting equivalence classes

▪ Selection uses heuristics

o Initial: valid and invalid partitions

o Next: refine partitions

▪ Typical heuristics:

o Interval (e.g. 1-1000)

• < min, min-max, > max

o Set (e.g. RED, GREEN, BLUE)

• Valid elements, invalid element

o Specific format (e.g. first character is @)

• Condition true, condition false

o Custom (e.g. February from the months)
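As a small sketch of the interval heuristic (the interval 1-1000 comes from the example above; the representative values are chosen arbitrarily), each class contributes one representative test input:

```python
# Sketch: equivalence classes for an interval input 1..1000
# (one representative value per class; the values are chosen arbitrarily).

MIN, MAX = 1, 1000

partitions = {
    "invalid: below minimum": -5,    # < min
    "valid: inside interval": 500,   # min..max
    "invalid: above maximum": 2000,  # > max
}

def is_valid(x):
    """SUT stand-in: accepts values in the interval."""
    return MIN <= x <= MAX

for name, representative in partitions.items():
    expected_valid = name.startswith("valid")
    assert is_valid(representative) == expected_valid, name
print("each class is represented by one test input")
```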

Page 35: Verification & Validation: Overview

Deriving test cases from equiv. classes

▪ Combining the equivalence classes of several inputs

▪ For valid (normal) equivalence classes:

o test data should cover as many equivalence classes as possible

▪ For invalid equivalence classes:

o first cover each invalid equivalence class separately,

o then combine them systematically
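A minimal sketch of this combination strategy, with invented parameters and values: valid classes are packed into one test, while each invalid class gets a test of its own with all other parameters held at a valid value.

```python
# Sketch of the combination strategy (parameter names and values invented):
# valid classes are combined into as few tests as possible, each invalid
# class is tried on its own with the other parameters kept valid.

params = {
    "amount": {"valid": [500], "invalid": [-1, 10**9]},
    "color":  {"valid": ["RED"], "invalid": ["PINK"]},
}

def derive_tests(params):
    tests = []
    baseline = {p: c["valid"][0] for p, c in params.items()}
    tests.append(("all-valid", dict(baseline)))
    # Each invalid value gets its own test; the others stay valid.
    for p, classes in params.items():
        for bad in classes["invalid"]:
            t = dict(baseline)
            t[p] = bad
            tests.append((f"invalid {p}={bad}", t))
    return tests

for name, data in derive_tests(params):
    print(name, data)
```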


Page 36: Verification & Validation: Overview

EXERCISE

Requirement: The loan application shall be denied if the requested amount is larger than 1M Ft and the customer is a student, unless the amount is less than 3M Ft and the customer has repaid a previous loan (of any kind).

▪ Input parameters? Equivalence classes?

▪ Any questions regarding the requirement?

Equivalence partitions



Page 38: Verification & Validation: Overview

2. Boundary value analysis

▪ Examining the boundaries of data partitions

o Focusing on the boundaries of equivalence classes

o Both input and output partitions

▪ Typical faults to be detected:

o faulty relational operators,

o loop conditions,

o sizes of data structures,

o …


Page 39: Verification & Validation: Overview

Typical test data for boundaries

▪ A boundary requires 3 tests: just below, on, and just above the boundary

▪ An interval requires 5-7 tests: the boundary tests for both of its boundaries (some of the values between the two boundaries may be merged)
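A sketch of boundary test data generation for an integer interval (assuming integer inputs, so "just below/above" means a step of 1):

```python
# Sketch: generating boundary test values for an integer interval
# (classic 3-values-per-boundary scheme; a step of 1 assumes integers).

def boundary_values(low, high):
    """Return the test values around both boundaries of [low, high]."""
    values = {low - 1, low, low + 1, high - 1, high, high + 1}
    return sorted(values)

print(boundary_values(1, 1000))
# For the interval 1..1000: [0, 1, 2, 999, 1000, 1001]
```

For adjacent or overlapping boundaries the set collapses automatically, which is why an interval needs 5-7 tests rather than always 6.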

Page 40: Verification & Validation: Overview

EXERCISE

Requirement: If the robot detects that a human is closer than 4 meters, then it has to slow down, and if the human is closer than 2 meters, then it has to stop.

▪ What values to use for testing?

▪ Any other questions regarding the requirement?

Boundary values



Page 42: Verification & Validation: Overview

Deriving tests from use cases

▪ Typical test cases:

o 1 test for the main path ("happy path", "mainstream")

• Oracle: checking post-conditions

o Separate tests for each alternate path

o Tests for violating pre-conditions

▪ Mainly used at higher levels (system, acceptance…)
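A sketch of this derivation, with an invented use case: one happy-path test checked against the post-conditions, one test per alternate path, and one test per violated pre-condition.

```python
# Sketch: deriving test cases from a use case description
# (the use case structure and all names are invented for illustration).

use_case = {
    "name": "Withdraw cash",
    "preconditions": ["card inserted", "PIN verified"],
    "main_path": ["enter amount", "dispense cash"],
    "alternate_paths": {"insufficient funds": ["enter amount", "show error"]},
    "postconditions": ["balance updated"],
}

def derive_use_case_tests(uc):
    tests = []
    # 1 test for the main path; oracle = checking the post-conditions.
    tests.append(("main: " + uc["name"], uc["main_path"], uc["postconditions"]))
    # A separate test for each alternate path.
    for name, path in uc["alternate_paths"].items():
        tests.append((f"alt: {name}", path, None))
    # A test for violating each pre-condition.
    for pre in uc["preconditions"]:
        tests.append((f"violated pre-condition: {pre}", [], None))
    return tests

for name, _, _ in derive_use_case_tests(use_case):
    print(name)
```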


Page 43: Verification & Validation: Overview

EXERCISE Deriving tests from a use case


Page 44: Verification & Validation: Overview

Summary
