
Introduction to Verification and Test of Embedded Systems
SE767: Vérification & Test

Ulrich Kühne · ulrich.kuhne@telecom-paristech.fr · 26/11/2018

Objectives of this Course

• Understanding the role of test & verification in the development process
• Applying Test-Driven Design to embedded software
• Getting in touch with formal methods
• Understanding and writing a formal specification for a hardware module

Course Structure

1. Introduction (this lecture)

2. Test-Driven Design of embedded software (lecture + exercise)

3. Embedded systems modeling (video lecture + exercise)

4. Introduction to formal methods (lecture + exercise)

5. Formal specification and verification of embedded hardware (lecture + exercise)

Plan

Motivation

Basic Validation Methodology
• Objectives
• Development Cycle

Test Coverage
• Software Coverage Metrics
• Testing Requirements in the Railway Domain
• Coverage with gcc and lcov

Testing Embedded Systems

Hardware Verification & Test

Why do we need to test and verify?


Ariane 5

• Ariane 5 rocket explodes shortly after liftoff (1996)
• Integer overflow causing exception
• Legacy code (Ariane 4) used in flight control

Mars Pathfinder

• Rover gets stuck on Mars (1997)
• Scheduling problem leads to repeated restarts
• Bug fixed on an identical copy on earth, live update over the air

Therac-25

• Therac-25 radiation therapy machine (1982)
• Software bug leads to race condition
• Patients exposed to lethal radiation dose

Zune 30

• Zune 30 music players bricked on December 31, 2008
• Infinite loop in the time service in leap (!) years
• Several New Year's Eve parties without music...

Plan

Motivation

Basic Validation Methodology
• Objectives
• Development Cycle

Test Coverage
• Software Coverage Metrics
• Testing Requirements in the Railway Domain
• Coverage with gcc and lcov

Testing Embedded Systems

Hardware Verification & Test

Goals of Testing & Verification

1. Find and eliminate bugs
2. Improve design quality
3. Reduce risk for the user
4. Reduce risk for the enterprise
5. Fulfill certification requirements
6. Improve performance
7. ...?

Test & Verification Techniques

• Plug & pray
• Non-regression tests
• Unit testing
• Test-driven design
• Model-based design
• Formal methods
• Mathematical proofs

(The techniques range from easy but sloppy at the top to hard but complete at the bottom.)

Validation in the Development Cycle

When should validation take place?

• As early as possible
• As often as necessary

The V-Model

Design branch: Requirements analysis → Software specification → Software architecture → Module design → Implementation

Validation branch: Unit tests → Module integration → System integration → Acceptance tests → Deployment and maintenance

(Each design phase on the left is checked by the corresponding test phase on the right.)

Agile Design

• Continuous integration
• Break long validation cycles
• One feature (functionality) at a time

Feature definition → Implementation → Validation → Feature definition → Implementation → Validation → ...

What to Test?

How to define our test cases?

• Depends on abstraction level: module tests, integration tests, acceptance tests, ...
• Depends on test strategy: functional vs. structural testing
• Depends on test objectives: bug hunting, coverage, performance, stress testing, ...
• Depends on context: certification and safety norms (e.g. EN 50128)

Functional vs. Structural Testing

Functional Testing
• Black box testing
• Driven by specification
• Goal: cover all specified functionality

Structural Testing
• White box testing
• Driven by code structure
• Goal: achieve structural coverage metrics

Success Criteria

When do we stop testing?

• Boss says to stop (time budget)
• Bug rate below threshold
• Coverage above threshold

(Plots: bug rate over time; coverage over time.)

Plan

Motivation

Basic Validation Methodology
• Objectives
• Development Cycle

Test Coverage
• Software Coverage Metrics
• Testing Requirements in the Railway Domain
• Coverage with gcc and lcov

Testing Embedded Systems

Hardware Verification & Test

Test Coverage

(Diagram: a testbench drives the DUT with input stimuli and checks its outputs.)

Input Coverage
• Full coverage
• Boundary values
• Equivalence classes

Code Coverage
• Statement coverage
• Branch coverage
• Path coverage

Input Coverage: Equivalence Partitioning

• Black box approach
• Cover all interesting cases of input stimuli
• Determine equivalence classes that should entail the same behavior
• Select one test case for each class

Are the equivalence classes consistent with the code?

Boundary Conditions

• Test values close to and on boundaries
• Reveals bugs due to off-by-one mistakes
• Test negative (i.e. failing) results
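To make equivalence partitioning and boundary testing concrete, here is a minimal C sketch (not from the slides; the function grade() and its specification are hypothetical): one representative test per equivalence class, plus tests on and next to each boundary.

#include <assert.h>

/* Hypothetical unit under test: returns 1 (pass) for scores 50..100,
 * 0 (fail) for 0..49, and -1 for invalid inputs. */
int grade(int score) {
    if (score < 0 || score > 100) return -1;
    return (score >= 50) ? 1 : 0;
}

int main(void) {
    /* One representative per equivalence class */
    assert(grade(-10) == -1);  /* invalid: below range  */
    assert(grade(25)  ==  0);  /* valid: failing scores */
    assert(grade(75)  ==  1);  /* valid: passing scores */
    assert(grade(150) == -1);  /* invalid: above range  */

    /* Boundary values on and next to the class borders */
    assert(grade(-1)  == -1);
    assert(grade(0)   ==  0);
    assert(grade(49)  ==  0);
    assert(grade(50)  ==  1);
    assert(grade(100) ==  1);
    assert(grade(101) == -1);
    return 0;
}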

Code Coverage

• Statement coverage
• Branch coverage
  • Decision coverage
  • Condition coverage
  • Condition/decision coverage
  • Modified condition/decision coverage (MC/DC)
  • Compound condition coverage
• Path coverage

100% path cov. ⇒ 100% decision cov. ⇒ 100% statement cov.

Control Flow Graph

01: int square(int x){
02:   if (x <= 0)
03:     x = -x;
04:   int s = 0, a = 1;
05:   for (int i=0; i<x; ++i){
06:     s += a;
07:     a += 2;
08:   }
09:   return s;
10: }

(CFG: nodes 2, 3, 4, 5, 6/7 and 9 correspond to the numbered lines, one node per basic block; edges follow the if branch and the for loop.)
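Relating this to concrete tests (illustrative values, assuming the square() function above): a single negative input already executes every statement, while branch coverage additionally needs an input that skips the negation and one that skips the loop.

#include <assert.h>

int square(int x) {
    if (x <= 0)
        x = -x;
    int s = 0, a = 1;
    for (int i = 0; i < x; ++i) {
        s += a;
        a += 2;
    }
    return s;
}

int main(void) {
    assert(square(-2) == 4);  /* if-branch taken, loop executed: all statements covered */
    assert(square(3)  == 9);  /* if-branch not taken                                    */
    assert(square(0)  == 0);  /* loop condition false on its first evaluation           */
    return 0;
}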

Branch Coverage

1: x = 0;
2: if (a && (b || c))
3: x = 1;

The whole expression a && (b || c) is the decision; a, b and c are its conditions.

Decision coverage: Every decision has taken all possible values at least once.

Condition coverage: Every condition has taken all possible values at least once.

Condition/decision coverage: Combination of decision and condition coverage.

Modified condition/decision coverage (MC/DC): Condition/decision coverage, plus every condition has been shown to independently affect the decision's outcome.

Let's Cover...

1: x = 0;
2: if (a && (b || c))
3: x = 1;

(In the test vectors, a bar denotes a condition that is false.)

Tests | Statement | Decision | Condition | C/D | MC/DC
{abc} | yes | no | no | no | no
{abc, ābc} | yes | yes | no | no | no
{ab̄c̄, ābc} | no | no | yes | no | no
{abc, āb̄c̄} | yes | yes | yes | yes | no
{abc̄, ābc̄, ab̄c̄, ab̄c} | yes | yes | yes | yes | yes
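As a sanity check of the MC/DC idea, the sketch below (illustrative, not from the slides) evaluates the decision a && (b || c) on one possible MC/DC test set; each pair of vectors differs in exactly one condition and flips the outcome.

#include <assert.h>
#include <stdbool.h>

/* The decision from the example above. */
static bool decision(bool a, bool b, bool c) {
    return a && (b || c);
}

int main(void) {
    /* One possible MC/DC test set: {TTF, FTF, TFF, TFT} */
    assert(decision(true,  true,  false) == true);
    assert(decision(false, true,  false) == false); /* differs from TTF only in a */
    assert(decision(true,  false, false) == false); /* differs from TTF only in b */
    assert(decision(true,  false, true)  == true);  /* differs from TFF only in c */
    return 0;
}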

Path Coverage

• Every execution path of the program has been explored
• Execute loops 0, 1, more than 1 time (loop coverage)
• Exponential number of paths in general
• Infeasible paths

(Same CFG as before, with the loop body executed ×1, ×2, ..., ×n times.)

Example: Railway Systems

• Safety-critical embedded systems
• Strict regulation by European and national agencies
• European Train Control System (ETCS)

European Norm CENELEC EN 50128

• Development & validation process for railway systems
• Safety integrity levels SIL0 up to SIL4
• Organizational structure
• Development cycle (V model)
• Validation activities and reports for each project phase

Extracts from EN 50128

Table A.18 – Performance Testing

TECHNIQUE/MEASURE | Ref | SIL 0 | SIL 1 | SIL 2 | SIL 3 | SIL 4
1. Avalanche/Stress Testing | D.3 | - | R | R | HR | HR
2. Response Timing and Memory Constraints | D.45 | - | HR | HR | HR | HR
3. Performance Requirements | D.40 | - | HR | HR | HR | HR

Table A.19 – Static Analysis

TECHNIQUE/MEASURE | Ref | SIL 0 | SIL 1 | SIL 2 | SIL 3 | SIL 4
1. Boundary Value Analysis | D.4 | - | R | R | HR | HR
2. Checklists | D.7 | - | R | R | R | R
3. Control Flow Analysis | D.8 | - | HR | HR | HR | HR
4. Data Flow Analysis | D.10 | - | HR | HR | HR | HR
5. Error Guessing | D.20 | - | R | R | R | R
6. Walkthroughs/Design Reviews | D.56 | HR | HR | HR | HR | HR

(R: recommended, HR: highly recommended, -: no recommendation)

Extracts from EN 50128

Table A.21 – Test Coverage for Code

Test coverage criterion | Ref | SIL 0 | SIL 1 | SIL 2 | SIL 3 | SIL 4
1. Statement | D.50 | R | HR | HR | HR | HR
2. Branch | D.50 | - | R | R | HR | HR
3. Compound Condition | D.50 | - | R | R | HR | HR
4. Data flow | D.50 | - | R | R | HR | HR
5. Path | D.50 | - | R | R | HR | HR

Requirements:

1) For every SIL, a quantified measure of coverage shall be developed for the test undertaken. This can support the judgment on the confidence gained in testing and the necessity for additional techniques.

2) For SIL 3 or 4, test coverage at component level should be measured according to the following: 2 and 3; or 2 and 4; or 5; or test coverage at integration level should be measured according to one or more of 2, 3, 4 or 5.

3) Other test coverage criteria can be used, given that this can be justified. These criteria depend on the software architecture (see Table A.3) and the programming language (see Table A.15 and Table A.16).

4) Any code which it is not practicable to test shall be demonstrated to be correct using a suitable technique, e.g. static analysis from Table A.19.

Practical Code Coverage with gcc and gcov

gcc has a magic option --coverage

• Instrumentation of binary code
• Count execution of each basic block
• Count branches taken/untaken
• Generate coverage report using gcov (or lcov)
• Integration into build system

Coverage Tool Flow with gcov

(Flow: gcc compiles main.c into a.out and main.gcno; running a.out produces main.gcda; gcov combines them into the annotated report main.c.gcov.)

Let's do it...
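A minimal sketch of the flow, reusing the square() example; the build and report commands are given as comments and reflect a typical gcc/gcov invocation (exact options and output may differ between versions):

/*
 * main.c -- coverage demo (typical invocation; details may vary):
 *
 *   gcc --coverage -o a.out main.c    # compilation also produces main.gcno
 *   ./a.out                           # execution produces main.gcda
 *   gcov main.c                       # writes the annotated report main.c.gcov
 */
#include <stdio.h>

int square(int x) {
    if (x <= 0)
        x = -x;
    int s = 0, a = 1;
    for (int i = 0; i < x; ++i) {
        s += a;
        a += 2;
    }
    return s;
}

int main(void) {
    /* Only positive inputs: the report will show that "x = -x;"
     * was never executed (execution count 0). */
    printf("%d %d\n", square(3), square(5));
    return 0;
}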

Test Coverage (Again)

(Diagram: a testbench drives the DUT with input stimuli and checks its outputs.)

What am I missing here...?

Input coverage · Code coverage · Output coverage?

Mutation Coverage

Did we check the right things at the output?
How to assess test bench quality?

Mutation coverage (aka error seeding):
• Randomly insert errors into the code (mutants)
• Check if the test bench captures (kills) them
• Compute ratio of killed mutants

Mutation Coverage: Rationale

• Mutations should mimic typical mistakes
  • Loop condition off by one
  • Replace operators such as < vs ≤
  • Modify constants
  • ...
• A test bench not detecting these mistakes should be improved
• Mutation coverage approximates the ratio of real bugs found:

(number of mutants killed) / (total number of mutants) ≈ (number of real bugs found) / (total number of real bugs)
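A hand-written illustration of a single mutant (a real mutation tool would generate and manage many of these automatically): the loop bound of square() is mutated from < to <=, and a test that checks the return value kills it.

#include <assert.h>

/* Original implementation. */
int square(int x) {
    if (x <= 0) x = -x;
    int s = 0, a = 1;
    for (int i = 0; i < x; ++i) { s += a; a += 2; }
    return s;
}

/* Mutant: relational operator < replaced by <= (off-by-one loop bound). */
int square_mutant(int x) {
    if (x <= 0) x = -x;
    int s = 0, a = 1;
    for (int i = 0; i <= x; ++i) { s += a; a += 2; }
    return s;
}

int main(void) {
    /* This check kills the mutant: it returns 16 instead of 9. */
    assert(square(3) == 9);
    assert(square_mutant(3) != square(3));
    return 0;
}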

Summary Coverage

• Coverage metrics measure test quality
• Widely used in the embedded industry
• Strong requirements in the railway, aerospace, and automotive domains
• Simple coverage with gcc and gcov

Comes at virtually no cost ⇒ use it!

Plan

Motivation

Basic Validation Methodology
• Objectives
• Development Cycle

Test Coverage
• Software Coverage Metrics
• Testing Requirements in the Railway Domain
• Coverage with gcc and lcov

Testing Embedded Systems

Hardware Verification & Test

Testing Embedded Software

What's so special about embedded software testing?

• Runs on dedicated (expensive, scarce, buggy, ...) hardware
• Hardware not available (yet)
• Limited memory
• Limited debug capabilities
• Real-time
• Complex interactions with the physical world
• Long build and upload times

Embedded Testing Techniques

• Testing on target hardware (→ SE743)
  • High confidence in test results
  • Long test cycles, hardware might not be available (yet)
• Emulating the target hardware (FPGA) (→ SE744)
  • Test results close to the real target
  • Difficult to set up, HDL sources needed
• Testing on a virtual platform (SystemC, qemu, ...) (→ SE747)
  • High performance, no dedicated hardware
  • High development effort
• Purely C-based shallow test harness (→ here!)
  • Short test cycles, easy to set up
  • Difficult for real-time systems

Alternative: Model-based Testing

(Diagram: a development and test model from which the system code and the test procedures are generated, or developed manually; a test engine runs HW/SW integration tests against the integrated HW/SW system. A refined variant separates the development model from the test model.)

Principles of Model-Based Testing

• Use of well-founded models (e.g. SysML state charts)
• Models serve for documentation and review
• Enables testing during development
• Automated generation of test cases
• Automated requirements traceability

(Diagram: requirements → test model → test procedures → test engine, guided by a test plan.)

Example from Railway Domain

(State chart of a speed supervision function: states CSM_INIT and CSM_ON with substates NORMAL, OVERSPEED, WARNING, SERVICE_BRAKE and EMER_BRAKE; entry/do actions set SpeedSupervisionStatus, permittedSpeed, ServiceBrakeCommand and EmergencyBrakeCommand; transitions are guarded by comparisons of SimulatedTrainSpeed against V_mrsp and the offsets dV_warning, dV_sbi and dV_ebi.)

[Source: OpenETCS project, Cécile Braunstein]

Other Testing Aspects

• Mechanical testing: vibrations, shock, standardized stress
• Environmental conditions: temperature, pressure, humidity, radiation
• Ageing

[Source: Institute of Space Systems, DLR]

Plan

Motivation

Basic Validation Methodology
• Objectives
• Development Cycle

Test Coverage
• Software Coverage Metrics
• Testing Requirements in the Railway Domain
• Coverage with gcc and lcov

Testing Embedded Systems

Hardware Verification & Test

Hardware Design Flow

Specification (natural language) → requirements engineering, modeling →
Electronic System Level (UML, SysML, Matlab, ...) → design space exploration, partitioning →
Transaction Level (C, C++, SystemC, ...) → implementation, refinement →
Register Transfer Level (VHDL, Verilog, ...) → synthesis →
Netlist (gate models) → place & route →
Layout (geometric, electrical models) → manufacturing →
Chip (silicon)

In this flow, verification addresses the design descriptions down to RTL, equivalence checking relates the RTL code and the netlist, and test targets the fabricated chip.

Hardware Verification vs Test

Verification
• Detect design bugs
• Extract properties from requirements
• Applied on RTL code
• High manual effort

Test
• Detect physical defects
• Test generation from netlist according to fault model
• Applied on fabricated chips
• High automation

Physical Defects

[Source: IEEE Spectrum “The Art of Failure”]


Stuck-at Fault Model

(Circuit with inputs a, b, c and output d = (a ∨ b) ∧ ¬c; the stuck-at-1 and stuck-at-0 faults are injected on the internal node a ∨ b.)

a b c | d fault-free | d with sa-1 | d with sa-0
0 0 0 | 0 | 1 | 0
0 0 1 | 0 | 0 | 0
0 1 0 | 1 | 1 | 0
0 1 1 | 0 | 0 | 0
1 0 0 | 1 | 1 | 0
1 0 1 | 0 | 0 | 0
1 1 0 | 1 | 1 | 0
1 1 1 | 0 | 0 | 0

〈000〉 is a test vector for the shown stuck-at-1 fault.
{〈010〉, 〈100〉, 〈110〉} are test vectors for the stuck-at-0 fault.
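A small C sketch of the example (the gate function d = (a ∨ b) ∧ ¬c and the fault site on the internal OR node are read off the truth table above): simulating the fault-free and faulty circuits confirms the stated test vectors.

#include <assert.h>

/* Fault-free circuit: d = (a | b) & !c, with faults injected on the
 * internal node n = a | b.  fault = 0: none, 1: n stuck-at-1, 2: n stuck-at-0. */
static int eval(int a, int b, int c, int fault) {
    int n = a | b;
    if (fault == 1) n = 1;   /* stuck-at-1 */
    if (fault == 2) n = 0;   /* stuck-at-0 */
    return n & !c;
}

int main(void) {
    /* <000> detects the stuck-at-1 fault: good 0, faulty 1. */
    assert(eval(0, 0, 0, 0) == 0 && eval(0, 0, 0, 1) == 1);

    /* <010>, <100>, <110> each detect the stuck-at-0 fault. */
    assert(eval(0, 1, 0, 0) == 1 && eval(0, 1, 0, 2) == 0);
    assert(eval(1, 0, 0, 0) == 1 && eval(1, 0, 0, 2) == 0);
    assert(eval(1, 1, 0, 0) == 1 && eval(1, 1, 0, 2) == 0);
    return 0;
}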

Automatic Test Pattern Generation

ATPG:
• Create a list of all possible (stuck-at) faults
• For each fault:
  • Find a test pattern
  • Drop all other faults detected by this pattern

• Untestable faults?
• Hard to test faults?
• Sequential tests?
• Test compression?
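A brute-force sketch of this loop for a three-input circuit (illustrative only: the fault list is restricted to stuck-at faults on the inputs, and real ATPG uses structural algorithms such as PODEM instead of exhaustive enumeration):

#include <stdio.h>

#define NUM_FAULTS 6  /* illustrative fault list: stuck-at-0/1 on each input a, b, c */

/* d = (a | b) & !c; fault f >= 0 forces input f/2 to the value f%2,
 * f == -1 means fault-free. */
static int eval(const int in[3], int f) {
    int v[3] = { in[0], in[1], in[2] };
    if (f >= 0) v[f / 2] = f % 2;
    return (v[0] | v[1]) & !v[2];
}

int main(void) {
    int detected[NUM_FAULTS] = { 0 };
    for (int f = 0; f < NUM_FAULTS; ++f) {
        if (detected[f]) continue;                      /* already dropped */
        for (int pat = 0; pat < 8; ++pat) {             /* exhaustive pattern search */
            int in[3] = { (pat >> 2) & 1, (pat >> 1) & 1, pat & 1 };
            if (eval(in, f) == eval(in, -1)) continue;  /* pattern does not detect f */
            printf("pattern <%d%d%d> detects fault %d\n", in[0], in[1], in[2], f);
            for (int g = 0; g < NUM_FAULTS; ++g)        /* drop all faults this pattern detects */
                if (eval(in, g) != eval(in, -1)) detected[g] = 1;
            break;
        }
        if (!detected[f]) printf("fault %d is untestable\n", f);
    }
    return 0;
}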

Summary

• Validation on all levels of abstraction
• > 50% of overall costs
• Crucial for project success and product quality
• Various techniques
  • Dynamic testing
  • Static verification
  • Model-based design
• Integration into development cycle

Outlook

• Test-Driven Design of embedded software
• Introduction to formal methods
• Formal specification and verification of embedded hardware

Preparation for Exercises

• Log in to GitLab: https://gitlab.telecom-paristech.fr
• Go to the GitLab group: https://gitlab.telecom-paristech.fr/MSSE/TestVerif/2018
• Request access to the group

References I

DO-178B: Software Considerations in Airborne Systems and Equipment Certification, 1982.

EN 50128 – Railway applications – Communication, signalling and processing systems – Software for railway control and protection systems. Technical report, European Committee for Electrotechnical Standardization, 2001.

James W. Grenning. Test Driven Development for Embedded C. Pragmatic Bookshelf, Raleigh, N.C., 1st edition, May 2011.

Kelly Hayhurst, Dan S. Veerhusen, John J. Chilenski, and Leanna K. Rierson. A Practical Tutorial on Modified Condition/Decision Coverage, 2001.
