
References

1 Dijkstra E W. A Discipline of Programming. Prentice Hall, 1976

2 Littlewood B. The need for evidence from disparate sources to evaluate software safety. Safety-Critical Systems Symposium, Bristol, 1993

3 Beizer B. Software Testing Techniques. Van Nostrand Reinhold, 1990

4 EPSRC Safety Critical Systems Programme Project IED4/1/9301, Network Programming for Software Reliability. Managed by University of Exeter, Professor D. Partridge

5 HSE Guidelines for the use of Programmable Electronic Systems in Safety-Related Applications. 1987

6 IEC 1508, parts 1 to 7, Functional Safety: Safety-Related Systems. Draft, 1995

7 IEC 880, Software for Computers in the Safety Systems of Nuclear Power Stations. 1986

8 RTCA/DO-178B, Software Considerations in Airborne Systems and Equipment Certification. December 1992

9 BRB/LU LTD/RIA Technical Specification No 23, Safety-Related Software for Railway Signalling. Consultative Document, 1991

10 CLC/SC9XA/WG1 (draft), CENELEC Railway Applications: Software for Railway Control and Protection Systems.

11 ISA SP84, Electrical (E)/Electronic (E)/Programmable Electronic Systems (PES) for use in Safety Applications. Draft, 1993

12 DIN V VDE 801, Grundsätze für Rechner in Systemen mit Sicherheitsaufgaben (Principles for computers in safety-related systems). January 1990

13 DEF STAN 00-55, parts 1 and 2, The Procurement of Safety Critical Software in Defence Equipment. Interim, April 1991

14 DEF STAN 00-56, parts 1 and 2, Safety Management Requirements of Defence Systems containing Programmable Electronics. Draft, February 1993


15 IGE/SR/15, Safety Recommendations for Programmable Equipment in Safety-Related Applications. Institute of Gas Engineers, Issue 6, December 1993

16 Magee S, Tripp L. Guide to Software Engineering Standards and Specifications. Artech House, 1997

17 BS 5760 part 8, Assessment of the Reliability of Systems Containing Software. Currently in draft form (Draft for Development 198)

18 CONTESSE Task 12 Report, Hazard and Operability Study as an Approach to Software Safety Assessment.

19 Interim Defence Standard 00-58, HAZOP Studies Guide for Programmable Electronic Systems. Draft in preparation.

20 Standard for Software Engineering of Safety Critical Software. Ontario Hydro, 1990

21 Leveson N G and Harvey P R. Analyzing Software Safety. IEEE Transactions on Software Engineering 1983; Vol. SE-9, No.5

22 Myers G J. The Art of Software Testing. John Wiley & Sons, 1979

23 Myers G J. Software Reliability. John Wiley & Sons, 1976

24 Boehm B W. Software Engineering Economics. Prentice Hall, 1981

25 Cullyer W J, Goodenough S J, Wichmann B A. The choices of computer languages for use in Safety Critical Systems. Software Engineering Journal 1991; 6:51-58

26 The Statemate Approach to Complex Systems. i-Logix, 1989

27 The Semantics of Statecharts. i-Logix, 1991

28 Halstead M H. Elements of Software Science. Elsevier North Holland, New York, 1977

29 McCabe T J. A Complexity Measure. IEEE Transactions on Software Engineering 1976; Vol. SE-2; No.4

30 Pyle I C. Developing Safety Systems - A Guide Using Ada. Prentice Hall International, 1991

31 CONTESSE Task 8 Report, An Integrated Platform for Environment Simulation.

32 Bache R and Müllerburg M. Measures of testability as a basis for quality assurance. Software Engineering Journal, March 1990; 86-92

33 CONTESSE Task 4 Report, Software Testing via Environmental Simulation.

34 Balci O. Credibility Assessment of Simulation Results: The State of the Art. In: Proc. Conference on Methodology and Validation, 1987, Orlando, Fla., pp 19-25


35 Carson J S. Convincing Users of Model's Validity Is Challenging Aspect of Modeler's Job. Ind. Eng. June 1986; 18:74-85.

36 Sargent R G. A Tutorial on Validation and Verification of Simulation Models. In: Proc. 1988 Winter Simulation Conference, San Diego, Calif., pp 33-39

37 Naylor T H and Finger J M. Verification of Computer Simulation Models. Management Sci. 1967; 5:644-653.

38 Goodenough J B and Gerhart S L. Toward a theory of test data selection. IEEE Transactions on Software Engineering, June 1975; Vol. SE-1

39 Weyuker E J. Assessing test data adequacy through program inference. ACM Transactions on Programming Languages and Systems, October 1983; Vol. 5, 4:641-655

40 Ould M A and Unwin C (eds). Testing in Software Development. Cambridge University Press, 1986.

41 Basili V and Selby R W. Comparing the effectiveness of software testing strategies. IEEE Transactions on Software Engineering, December 1987; Vol. SE-13, 12:1278-1296

42 Myers G J. A controlled experiment in program testing and code walkthrough/inspections. Communications of the ACM, September 1978; Vol. 21, 9:760-768.

43 CONTESSE Task 9 Report, Software Testing Adequacy and Coverage.

44 Duran J W and Ntafos S C. An evaluation of random testing. IEEE Transactions on Software Engineering, July 1984; Vol. SE-10, 4:438-444

45 Gourlay J. A mathematical framework for the investigation of testing. IEEE Transactions on Software Engineering, November 1983; Vol. SE-9, 6:686-709

46 Weyuker E J. Axiomatizing software test data adequacy. IEEE Transactions on Software Engineering, December 1986; Vol. SE-12, 12:1128-1138

47 Parrish A and Zweben S H. Analysis and refinement of software test data adequacy properties. IEEE Transactions on Software Engineering, June 1991; Vol. SE-17, 6:565-581

48 Parrish A and Zweben S H. Clarifying Some Fundamental Concepts in Software Testing. IEEE Transactions on Software Engineering, July 1993; Vol. 19, 7:742-746

49 Zhu H and Hall P A V. Test data adequacy measurement. Software Engineering Journal, Jan. 1993; Vol. 8, 1:21-30.


50 Zhu H, Hall P A V, May J. Understanding software test adequacy - an axiomatic and measurement approach. In: Proceedings of IMA Conference on Mathematics of Dependable Systems, Royal Holloway, Univ. of London, Egham, Surrey, 1st-3rd Sept. 1993

51 Clarke L A, Podgurski A, Richardson D J, Zeil S J. A formal evaluation of data flow path selection criteria. IEEE Transactions on Software Engineering, November 1989; Vol. 15, 11:1318-1332

52 Rapps S and Weyuker E J. Selecting software test data using data flow information. IEEE Transactions on Software Engineering, April 1985; Vol. SE-11, 4:367-375

53 Ntafos S C. A comparison of some structural testing strategies. IEEE Transactions on Software Engineering, June 1988; Vol. SE-14, 868-874

54 Frankl P G and Weyuker E J. A formal analysis of the fault-detecting ability of testing methods. IEEE Transactions on Software Engineering, March 1993; Vol. 19, 3:202-213

55 Miller W M, Morell L J, Noonan R E et al. Estimating the probability of failure when testing reveals no failures. IEEE Trans. on Software Engineering 1992; Vol. 18, 1

56 Littlewood B and Strigini L. Validation of ultra-high dependability for software-based systems. Comms. of the ACM 1993

57 Parnas D L et al. Evaluation of safety critical software. Comms of the ACM 1990; Vol 33:6

58 Howden W E. Functional Program Testing and Analysis. McGraw-Hill, 1987

59 Poore J H, Mills H D, Mutchler D. Planning and certifying software system reliability. IEEE Software, January 1993

60 Ehrenberger W. Probabilistic techniques for software verification in safety applications of computerised process control in nuclear power plants. IAEA-TECDOC-58, February 1991

61 Littlewood B. Forecasting Software Reliability. In: Bittanti S (ed). Software Reliability Modelling and Identification. Springer-Verlag, 1988

62 Mills H D, Dyer M, Linger R C. Cleanroom Software Engineering. IEEE Software, September 1987

63 Brocklehurst S, Chan P Y, Littlewood B, Snell J. Recalibrating software reliability models. IEEE Trans. on Software Engineering, 1990; Vol. 16, 4

64 Littlewood B. Software reliability models for modular program structure. IEEE Trans. on Reliability, October 1981; Vol. 30:313-320


65 Mellor P. Modular structured software reliability modelling in practice. 4th European Workshop on Dependable Computing, Prague, 8th-10th April 1992

66 Laprie J C, Kanoun K. X-ware reliability and availability modelling. IEEE Trans. on Software Engineering 1992; Vol. 18, 2

67 Knight J C, Leveson N G. An experimental evaluation of the assumptions of independence in multiversion programming. IEEE Trans. on Software Engineering 1986; Vol. 12, 1

68 Butler R W, Finelli G B. The infeasibility of experimental quantification of life critical software reliability. In: Procs. ACM SIGSOFT '91, Conference on Software for Critical Systems, New Orleans, Dec. 4th-6th 1991

69 May J H R, Lunn D. A model of code sharing for estimating software failure on demand probabilities. Tech. rep., Depts. of Computing and Statistics, May 1993

70 Casella G, Berger R L. Statistical Inference. Wadsworth and Brooks/Cole, Belmont, Calif., 1990

71 Hamlet R G. Are we testing true reliability? IEEE Software, July 1992

72 May J H R and Lunn D. New Statistics for Demand-Based Software Testing. Information Processing Letters 1995; Vol 53

73 CONTESSE Task 11 Report, Practical Specification of a Simulation of a Probabilistic Environment for a Reactor Protection System.

74 Hughes G, Pavey D, May J H R et al. Nuclear Electric's Contribution to the CONTESSE Testing Framework and its Early Application. Procs. of the 3rd Safety-Critical Systems Symposium, Brighton, February 1995

75 Hunns D M and Wainwright N. Software-based Protection for Sizewell B: the Regulator's Perspective. Nuclear Engineering International, September 1991

76 Musa J D. Operational Profiles in Software Reliability Engineering. IEEE Software, March 1993.

77 Brooks F P. The Mythical Man Month. Addison Wesley, 1975

78 IEE. Guidelines for Assuring Testability. 1988, ISBN 0-86341-129-0

79 BSI DD 198:1991. Guide to Assessment of Reliability of Systems Containing Software.

80 DOD-STD-2167A. Military Standard, Defense System Software Development.

Appendix A

Summary of Advice from the Standards

This appendix contains a review of a range of standards and guidelines for the development of safety-related systems, and the advice given on testing. A discussion on the current industry practices and the use of the standards and guidelines can be found in Chapter 1.

HSE Guidelines [5]

Status and Content Overview

Guidance document for the development and use of programmable electronic systems in safety-related applications.

Provides checklists to aid software safety assessment.

Summary of Testing Advice

Very limited guidance on testing given, although some test coverage criteria are suggested.

Mathematical models of software reliability are considered to have limited usefulness in the context of PES.

Static analysis is considered desirable.

IGE/SR/15 [15]

Status and Content Overview

Draft application-specific guidelines for the gas industry (ranging from offshore installations to domestic appliances) on the use of programmable equipment in safety-related applications.

Based on HSE Guidelines.

Summary of Testing Advice

Very limited guidance on testing - little more than that already provided in the HSE Guidelines.

Advises that the use of formal methods be considered, particularly for complex systems.

Dynamic testing regarded as essential at all levels of development, but no direct advice on methods provided.

Use of static analysis encouraged, with the specific types of analysis available and the languages they are applicable to discussed.


Drive Safely

Status and Content Overview

Proposal for European standard for the development of safe road transport informatic systems.

Summary of Testing Advice

Provides limited guidance on test objectives. Guidance varies according to the required safety integrity level.

Suggests static analysis and testing should be carried out on high integrity software. For very high integrity software, formal proofs are required and test coverage must be justified. Black and white box testing required at all levels.


IEC 1508 [6] - Part 3: Software Requirements

Status and Content Overview

Draft international standard on suitable techniques/measures to be used during development of safety-related software.

Provides recommendations varying by required safety integrity level

Developer can be selective about techniques/measures chosen so long as the choice is justified

Summary of Testing Advice

Recommends test methods and techniques to be adopted for different levels of safety integrity.

The test methods are:

• formal proof - recommended for levels 2 and 3, highly recommended for level 4

• static and dynamic analysis - recommended for level 1, highly recommended for level 2 and above

• probabilistic testing - recommended for levels 2 and 3, highly recommended for level 4

• functional and black box testing - highly recommended for all levels

• performance testing - recommended for levels 1 and 2, highly recommended for levels 3 and 4

• interface testing - recommended for levels 1 and 2, highly recommended for levels 3 and 4

• simulation/modelling - recommended for levels 1 and 2, highly recommended for levels 3 and 4

For some test methods a set of suitable test techniques, graded for appropriateness by integrity level, is also provided.


IEC 1508 [6] - Part 2: Requirements for Electrical/Electronic/Programmable Electronic Systems

Status and Content Overview

Draft international standard setting out a generic approach to the development of electrical/electronic/programmable electronic systems (E/E/PES) that are used to perform safety functions.

Provides recommendations on techniques/measures appropriate to the development of E/E/PES of differing levels of system integrity.

Developer can be selective about techniques/measures chosen so long as the choice is justified.

Summary of Testing Advice

Recommended test techniques/measures include:

• simulation - recommended for levels 3 and 4

• functional testing - highly recommended for all levels

• expanded functional testing - recommended for level 1, highly recommended for levels 2 to 4

• functional testing under environmental conditions - highly recommended for all levels

• statistical testing - recommended for levels 3 and 4

• static and dynamic analysis - recommended for levels 2 to 4

• surge immunity testing - highly recommended for level 4

• systematic testing - recommended for level 1, highly recommended for levels 2 to 4

• black box testing - recommended for all levels

• interference immunity testing - highly recommended for all levels

RIA 23 [9]

Status and Content Overview

Draft proposal for railway industry standard for the procurement and supply of programmable electronic systems for use in railway applications.

Intended to be used in conjunction with IEC 1508.


Summary of Testing Advice

Recommended test techniques/measures include:

• reliability growth modelling - recommended for levels 1 and 2, highly recommended for levels 3 and 4

• error guessing - highly recommended for levels 1 to 4

• error seeding - recommended for levels 3 and 4

• boundary value analysis - highly recommended for levels 1 to 3, mandatory for level 4 (a small illustration follows this list)

• test coverage (including multiple condition coverage) - recommended for levels 1 and 2, highly recommended for levels 3 and 4

• functional testing - mandatory for all levels

• static code analysis - recommended for levels 1 and 2, highly recommended for level 3, mandatory for level 4

• transaction analysis - recommended for levels 1 to 3, highly recommended for level 4

• simulation - recommended for all levels

• load testing - highly recommended for all levels

• performance monitoring - recommended for levels 1 to 3, highly recommended for level 4
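To make the boundary value analysis entry concrete, here is a minimal sketch (illustrative only, not taken from RIA 23): for an input specified to lie in a range, test cases are chosen at, and immediately either side of, each boundary.

```python
def boundary_values(lo: int, hi: int) -> list[int]:
    """Classic boundary-value test inputs for a variable specified
    to lie in the closed range [lo, hi]: each boundary, plus the
    values immediately inside and outside it."""
    return [lo - 1, lo, lo + 1, hi - 1, hi, hi + 1]

# For example, a setpoint specified as 0..100 yields:
print(boundary_values(0, 100))   # [-1, 0, 1, 99, 100, 101]
```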


CENELEC [10]

Status and Content Overview

Draft European standard specifying procedures and technical requirements for the development of programmable electronic systems for use in railway control and protection applications.

Based on IEC 1508 and RIA 23.

Summary of Testing Advice

Provides recommendations varying by required safety integrity level. Required techniques for level 1 are the same as for level 2, and techniques for level 3 are the same as for level 4. Recommended techniques include:

• functional and black box testing - highly recommended for all levels

• performance testing - highly recommended for levels 3 and 4

• interface testing - highly recommended for all levels

• formal proof - recommended for levels 1 and 2, highly recommended for levels 3 and 4

• probabilistic testing - recommended for levels 1 and 2, highly recommended for levels 3 and 4

• static and dynamic analysis - highly recommended for all levels


DEF STAN 00-55 [13]

Status and Content Overview

Interim MoD standard for the procurement of safety-related software in defence equipment.

Mandates specific software development methods to be followed.

Summary of Testing Advice

Mandates the use of formal methods, dynamic testing and static path analysis as verification and validation techniques. Formal methods are to include formal specification and design, and verification via formal proof or rigorous argument.

Mandates the use of specific types of static analysis (e.g. control flow analysis) and specific dynamic test coverage criteria (e.g. all statements).

An executable prototype is to be produced, derived from the formal specification, and used for comparison with the results of testing the integrated software via a test harness.

A simulator may be used if it is not practical to validate a system in its final system environment, but it is noted that 'a simulator is a complex system and its environment may be more complicated than the software under test'.

Resource modelling and performance analysis are to be carried out.

Validation is to include sufficient testing 'to obtain a statistically significant measure of the achieved reliability of the system'.
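For a sense of the test volumes such a statistical claim implies, the following illustrative calculation (not part of the standard) uses the classical zero-failure binomial model: N consecutive failure-free demands support the claim that the probability of failure on demand is below p with confidence 1 - (1 - p)^N.

```python
import math

def demands_required(pfd_target: float, confidence: float) -> int:
    """Failure-free demands needed to claim pfd < pfd_target at the
    given confidence, under the zero-failure binomial model:
    (1 - pfd_target) ** N <= 1 - confidence."""
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - pfd_target))

# Claiming pfd < 1e-3 at 95% confidence needs about 2995 failure-free demands;
# each factor of 10 improvement in the target multiplies the test volume by ~10.
print(demands_required(1e-3, 0.95))
```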

Copies of Defence Standards DEF STAN 00-55 and 00-56 can be obtained free of charge from:

Directorate of Standardization
Ministry of Defence
Room 1138, Kentigern House
65 Brown Street
Glasgow G2 8EX

Further information can be obtained from the Def Stan web site:

http://www.dstan.mod.uk


DEF STAN 00-56 [14]

Status and Content Overview

Draft MoD standard defining the safety management requirements for defence systems containing programmable electronics.

Mandates that design rules and techniques for safety systems development are to be defined and approved prior to implementation.

Supporting guidance document provides some illustrative examples of topics to be considered when determining appropriate rules/techniques for computer based applications. The guidance document refers to IEC 1508 for further guidance on techniques.

Summary of Testing Advice

Illustrative examples provided in the guidance document show:

• dynamic testing required for all levels of safety integrity

• static testing required for safety integrity levels 3 and 4, optional for levels 1 and 2.

DOD-STD-2167A [80]

Status and Content Overview

US military standard for defence system software development.

Leaves the contractor responsible for selecting software development methods that best support the achievement of contract requirements, and also for performing whatever safety analysis is necessary to ensure that the potential for hazardous conditions is minimised.

Summary of Testing Advice

No guidance provided on specific test measures/techniques.

Limited guidance on test coverage criteria.

Guidance concentrates on the documentation to be produced.


AG3

Status and Content Overview

Assessment guide for the software aspects of digital computer based nuclear protection systems.

Summary of Testing Advice

Guidance provided on the level of test coverage required.

Static and dynamic testing to be undertaken by both supplier and independent assessor.

Where practical, requirements animation to be carried out.

Credit may be claimed for the use of formal specifications.

Random and back-to-back testing suggested as powerful test techniques.

An extended period of on-site testing of the full system is required. During this testing a probabilistic estimate of the software reliability is to be computed (no guidance provided on how) for comparison against a judged reliability value based on the development methods used.

IEC 880 [7]

Status and Content Overview

International standard defining the requirements for the generation and operation of software used in the safety systems of nuclear power stations.

Minimum standard recommended by the UK Nuclear Installations Inspectorate.

Summary of Testing Advice

Both statistical and systematic testing are to be carried out. Detailed coverage criteria are specified for each.

Automatic test tools are to be used as much as possible to derive test cases.

Static and dynamic simulation of inputs are to be used to exercise the system in normal and abnormal operation.


Ontario Hydro Standard [20]

Status and Content Overview

Canadian standard specifying requirements for the engineering of safety-related software used in real-time control and monitoring systems in nuclear generating stations.

All requirements of the standard must be met.

Summary of Testing Advice

Statistical and systematic testing are to be carried out to ensure adequate and predefined test coverage is achieved.

A quantitative reliability value is to be demonstrated using statistically valid, trajectory based, random testing. Test case requirements for this testing are specified.
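As an illustration of what 'trajectory based' random testing involves (a hypothetical sketch, not drawn from the Ontario Hydro standard): each test case is a whole input trajectory, a time series sampled from a model of the plant's operational profile, rather than a single independent input vector.

```python
import random

def random_trajectory(steps: int, dt: float = 0.1) -> list[tuple[float, float]]:
    """Sample one (time, sensor value) trajectory from a crude
    operational-profile model; all parameters are hypothetical."""
    value, trajectory = 50.0, []
    for i in range(steps):
        value += random.gauss(0.0, 0.5)      # normal plant drift
        if random.random() < 0.001:          # rare transient, i.e. a demand
            value += random.uniform(20.0, 40.0)
        trajectory.append((i * dt, value))
    return trajectory

# The software under test is executed against many sampled trajectories and
# the reliability estimate is computed over the observed failures, if any.
tests = [random_trajectory(10_000) for _ in range(100)]
```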

Detailed test objectives and coverage criteria are defined for unit, subsystem, validation and reliability testing.

Dynamic testing and simulation are recognised verification techniques.


RTCA/DO-178B [8]

Status and Content Overview

Document providing the aviation community with guidance for determining that the software aspects of airborne systems and equipment comply with airworthiness requirements.

Guidance is not mandated but represents a consensus view.

Summary of Testing Advice

Suggests use of a target computer emulator or host computer simulator during testing. The emulator/simulator must be assessed as having achieved a predefined level of safety integrity. The differences between the simulator/emulator and the target system, and their effects on the verification process, are to be considered.

Emphasises the use of requirements based test methods, in particular hardware and software integration testing, software integration testing and low-level testing. The objectives of each of these test methods are defined, and the typical errors which should be revealed by each method are identified. Guidance is provided on test case selection, covering normal and abnormal range test cases.

Requirements based test coverage analysis and software structure coverage analysis are to be carried out. Guidance on coverage criteria is provided, with the level of coverage required varying by integrity level.
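As an illustration of the kind of structural coverage criterion that varies with level (at the highest level DO-178B calls for modified condition/decision coverage), the sketch below - hypothetical code, not from the document - checks that each condition in a sample decision can independently affect the outcome by finding pairs of test cases that differ only in that condition.

```python
from itertools import product

def decision(a: bool, b: bool, c: bool) -> bool:
    return a and (b or c)   # sample decision with three conditions

# MC/DC requires, for each condition, a pair of test cases that differ only
# in that condition yet produce different decision outcomes.
for i, name in enumerate(["a", "b", "c"]):
    pairs = []
    for case in product([False, True], repeat=3):
        flipped = list(case)
        flipped[i] = not flipped[i]
        if decision(*case) != decision(*flipped):
            pairs.append((case, tuple(flipped)))
    # each unordered pair is found twice, once from each direction
    print(f"condition {name}: {len(pairs) // 2} independence pair(s)")
```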

Formal methods, exhaustive input testing and multi-version software are considered potential 'alternative methods' for software verification. Some guidance is provided on their use, but not in the main guidance sections of the document, as they are felt to be currently of 'inadequate maturity' or 'limited applicability for airborne software'.

Guidance on software reliability models is explicitly excluded, as it is believed that currently available methods 'do not provide results in which confidence can be placed to the level required'.

Bibliography

The documents listed below were produced by the CONTESSE project and are available (for a nominal cost, to cover printing, postage and packaging) from Mike Falla at:

MFR Software Services Limited
35 Benbrook Way
Macclesfield
Cheshire SK11 9RT

Information on the CONTESSE project can also be found on the DTI website:

http://www.augusta.co.uk/safety

The partner in the consortium responsible for leading the work described is also given.



B1 Survey of Current Simulation and Emulation Practices

This report presents a brief survey of the current simulation and emulation techniques used in the development of high integrity systems. The responses to the survey questionnaires are analysed and presented to provide an overview of current industry practices. Two case studies (from the avionic and nuclear industries) provide a more detailed study of the techniques currently employed in those industries.

Lead Partner: University of Warwick

B2 Definition of Environment Lifecycle Models

This report discusses the lifecycle for the development of real-time high integrity systems. A dual lifecycle is proposed, which places increased emphasis on environment simulation as a method of dynamic testing. Methods of implementing the various components of the lifecycle are proposed that take into account the integrity level and complexity of the target system.

Lead Partner: University of Warwick

B3 Analysis of Software Testing Quality

This report covers:

• a review of the test quality analysis and targets, both as used by CONTESSE partners in general practice and as documented in current research

• a review of the use of tools in support of test quality analysis

• a categorisation of the approaches by application sector.

Lead Partner: Nuclear Electric

B4 Software Testing by Environmental Simulation

This report is concerned with the theoretical foundations of software testing through environment simulation. The report covers:

• formal descriptions of software and associated environment simulation, within which the problems of test adequacy, test case generation and software integrity estimation can be formulated

• the theoretical basis of test adequacy and software integrity measurement when environment simulation is used to test software

• the relationship between test adequacy and software integrity when environment simulation is used to test software.

Lead Partner: Nuclear Electric


B5 Testability of the Design Features of Programmable Electronic Systems

This report uses a combination of experience gained with the development of existing systems, and primitive control and data flow models of alternative design solutions, to attempt to formulate a scale of testability of design features. The features of a system's interfaces with its environment which make the construction of effective simulators difficult are also analysed in a quantitative manner.

Lead Partner: Lloyds Register

B6 Requirements for Test Support Tools

This report defines environment lifecycle models and the requirements for the effective use of simulators. The report also analyses the overall development process for simulators and identifies those areas which would benefit from tools which directly contribute to this aspect of the development. Requirements for tools are defined to a level sufficient to evaluate those currently available.

Lead Partner: NEI

B7 Computer Aided Software Testing: Tool Survey

This report provides the results of a survey of tools that may assist in testing software. Most of these tools are sold as part of an integrated development environment and some are also sold as stand-alone tools. The information contained within this report has been extracted from the tool vendors' literature.

Lead Partner: University of Warwick

B8 Computer Aided Software Testing: Tool Survey and Evaluation

This report presents and evaluates the results of a survey of software and PLC test tools. Four software test and simulation tools were selected: "McCabe", "STATEMATE", "AdaTEST", and "MatrixX and SystemBuild".

Lead Partner: University of Warwick

B9 Tool Support for the CONTESSE Framework

This report discusses aspects of tool integration and identifies a set of tools to support the CONTESSE Framework. This set is compared with the outcome of work carried out earlier in the project when specific test support tools were identified. The likelihood of obtaining tools to support the Framework is considered.

Lead Partner: NEI


B10 CONTESSE Test Handbook

This report contains a compilation of the results of all the tasks undertaken in the CONTESSE project. The report is in four sections:

• Background

• Test Strategy and Methods

• Tools and Simulators

• Case Studies

This report contains the detailed information on which this book is based.

Lead Partner: BAeSEMA

B11 Fire and Gas Protection System Case Study

This report describes the particular features of offshore fire and gas protection systems that may benefit from the guidelines in the CONTESSE Test Handbook, and describes the results of applying the guidelines, particularly those relating to:

• safety cases

• aspects of environment lifecycle models relating to construction of the environment simulator

• testability.

Lead Partner: G P-Elliot

B12 Application of the CONTESSE Framework to a Reactor Protection System

This report investigates the feasibility of including reactor protection software in a quantified overall nuclear power plant safety argument. To support the investigation a mathematical model was developed and a statistical software testing experiment conducted, using a simulated environment, to estimate the probability of failure on demand of protection software.

Lead Partner: Nuclear Electric

B13 Review Implementation against Framework and Test Framework on Model

This report reviews the applicability of the CONTESSE Test Handbook to the certification of safety-related programmable electronic systems within the civil aviation industry. It is structured to facilitate comparison of the CONTESSE guidelines and RTCA DO-178B, the aviation guidelines for software in airborne systems and equipment.

Lead Partner: Lucas Electronics


B14 PfSE Case Study Final Report

This report discusses whether a system's claimed integrity can be improved by the execution of further tests. It reports on a study carried out using a duplicate of an existing reactor protection system subjected to a series of pseudo-random tests. It concludes that the results from such tests can support an argument for improved system integrity.

Lead Partner: NEI

B15 The Application of Simulation to Timing Constraints

This report discusses simulation methods, techniques and tools for improving the specification, verification and testing of real-time software timing constraints at an early stage during the software development lifecycle. A comparison of commercially available tools against a functional definition of an 'ideal' tool is provided.

Lead Partner: Rolls-Royce

B16 The Application of the CONTESSE Framework to a Marine Roll and Stabilisation System

This report describes a proposal for a Rudder Roll Stabilisation System, and with reference to the CONTESSE Framework considers the validation and testability of the proposed system.

Lead Partner: Lloyds Register

Index

accessibility 66
accumulation metrics 135
accuracy
  of environment simulation 134, 136
  evaluation 136
  of real number system 133
  of reliability estimation 146
  of simulation 137-138
  of simulators 132, 141
Ada 79, 81, 94, 98, 105, 109, 118
AdaTEST 105-106, 116
adequacy criteria
  boundary analysis 149
  control flow based 154
  fault based 149
  functional analysis 149
adequacy test 143
ALARP 42
algorithm
  heuristic 91
  rate monotonic 86, 89
  scheduling, pre-emptive 86
analysis
  boundary 147
  boundary value 180, 182-183, 205
  causal net 56
  code coverage 104
  complexity 22, 104
  control flow 95, 183
  data flow 183
  design 41
  dynamic 203-204, 206
  failure modes and effects 38
  hazard 35, 43-44, 54, 102, 127, 184-185
  information flow 183
  mutation 150
  performance 207
  risk 33, 41, 44, 102
  safety 208
  sensitivity 43
  sneak circuit 183
  static 31, 39, 79, 100, 121, 149, 176, 186, 201, 203-204, 206-207
  static path 207
  structured 103
  timing 100
animation 182
  of requirements 19
  of specifications 61
animators 176
architecture, target 97
As Low As Reasonably Practicable (ALARP) 42
assessment
  independent 25
  process 2
  product 2
  quantitative 25
  safety 25
  test cost 171
availability 51
avalanche testing 182
back-to-back testing 209
behavioural constraints 83
behavioural model 125
bespoke architecture approach 14
black box testing 31, 40, 111-112, 147
boundary analysis
  adequacy criteria 149


boundary value analysis 180, 183, 205
branch coverage 146
brute-force searching 91
C 98, 105
C++ 81, 94
causal net analysis 56
Cause Consequence Analysis (CCA) 57
cause consequence diagrams 182
cause-effect
  descriptions 40
  modelling 35
  relationships 35
CCA 57
channel, diverse 7
checklists 183
cleanroom technique 157
code
  coverage 22
  coverage analysis 104
  inspection 21, 37
combinator 17
competence of personnel 41
compilers 82, 109
  Ada 98
complexity 22, 61, 76, 78, 103, 140, 166, 180
  analysis 104
  McCabe's metric 79
component environment simulators 127
computation errors 149
confidence interval 164
configuration management 111, 121
control flow 96, 98
  analysis 21, 95, 183, 207
  based adequacy criteria 154
  data 21
  test data adequacy criteria 148
control loop performance 60
control path 158
control system 4, 59, 160
controllability 66, 163
cost
  drivers 175
  of testing 3, 146
coverage
  branch 146
  of code 144
  criteria 154, 209
  data path 158
  decision 117
  exception 118
  path 145
  statement 118, 145
  structure 148
coverage based testing 21
criterion, structure coverage 146
cyclic scheduler 90
data flow 64, 110
  analysis 183
  criteria 148, 154
  structure 154
  test data adequacy criteria 148
data path coverage 158
decision coverage 117
decompositional modelling 157
demand based models 156
demand based SST 160
design
  analysis 41
  complexity 175
  data flow based 20
  data structure based 20
  object oriented 20
  reviews 183
  standards 42
  test case 40
designated architecture approach 14
deterministic static cyclic scheduling 86
diagnostic coverage 10, 15
diverse
  channels 7
  hardware 72
  systems 7
diversity 7
document reviews 37
domain error 149
drivers, cost 175
dual lifecycle model 49
dynamic
  analysis 204, 206
  memory allocation 94
  simulation 209
  test 120
  test coverage criteria 207
  testing 21, 25, 64, 100, 110, 201, 208-209
  testing tools 191
embedded software 9
emulation 184
emulator 126, 211
end-to-end deadline times 93
environment 188
  lifecycle 54
  simulation 105, 125, 132, 139, 155, 159
  simulator 31, 50, 52, 141, 170
environmental and operating conditions 38
environmental model 125
environmental modelling 104
equivalence
  classes 182
  in non-determinism 134
  in timing and time delay 134
  of two environments 130
equivalence relation approach 125, 132-133
error 6
  domain 149
  guessing 182-183, 205
  human 1, 57
  random 178
  seeding 150, 182, 205
  systematic 178
  timing 85
error based criteria 149
error computation 149
estimation of reliability 146
ETA 56
evaluation, accuracy 136
event
  space 130, 136
  trapper 64
Event Tree Analysis (ETA) 56
evidence
  process 36
  product based 36
exception coverage 118
Fagan inspections 183
failure 6
  random 24
Failure Modes and Effects Analysis (FMEA) 38
Failure Modes, Effects and Criticality Analysis (FMECA) 55
failures 1
fault 6


fault based adequacy criteria 149-150
fault propensity 158
Fault Tree Analysis (FTA) 38, 56
finite state machines 61
flow graph based 148
flow graph model 148
FMEA 38
FMECA 55
formal
  methods 31, 109, 163, 192, 201, 207, 211
  proof 31, 202-203, 206
  semantics 110
  specification 175, 207, 209
frequency response requirements 60
FTA 38, 56
functional
  analysis 39
  independence 37
  testability 66
  testing 21, 31, 64, 179, 181, 183-184, 188, 204-206
functional analysis
  adequacy criteria 149
global data areas 79
Halstead's software science 118
Hansen's software complexity 118
hard real-time systems 85, 89, 100, 120
hardware
  diverse 72
  interface 62
  monitors 63
hazard 1, 4, 34, 41
  identification 42-43
  log 47, 54
hazard analysis 35, 37, 42-43, 54, 103, 107, 127, 161, 165, 184-185, 190
Hazard and Operability Study (HAZOP) 55
HAZOP 38, 43, 55, 102
HEA 57
heuristic algorithms 91
human
  error 1, 57
  factors 146
  failure modes 57


Human Error Analysis (HEA) 57
independent assessment 25, 44
inference adequacy criteria 145
information flow analysis 183
information hiding 79
input
  partition testing 182
  trajectories 160
instruction prediction 98
integrated project support environment (IPSE) 121
integrity level 6, 33, 44, 46, 51, 107, 109, 172, 174, 193
interface
  considerations 62
  hardware 62
  software 62
  testing 183
interrupt driven 38
interrupts 86-87, 185
IPSE 121
iteration rates 92
justification of safety claims 45
ladder logic 60
language construct 21
language subset 94
LDRA Testbed 106
lifecycle
  dual 3, 49
  environment 54
  model 26
  project safety case 48
  safety 3, 45, 54, 126
limits of safe operation 41
load testing 205
logic
  control 38
  trees 56
MALPAS 191
Markov chains 132
mathematical proof 86
Matrix-X 105-106
maximum difference metric 135
McCabe's 79
McCabe Toolset 105-106
McCabe's measure 118
memory allocation, dynamic 94
memory constraints 182
methods, formal 31
metric
  accumulation 135
  maximum difference 135
  space 134
metric space approach 125, 132, 134
minimum time gap in input-output processes 93
minor cycle overload 93
model
  demand based 156
  flow graph 148
  lifecycle 26
  set theoretical 129
  stochastic process 131
  time based 156
  urn 156
modelling, cause-effect 35
monitoring and auditing procedures 43
mutation adequacy 145-146, 150
mutation analysis 150
Myers' extension 118
natural language 85
network programming 7
object oriented 20
  design 77
  programming 77
object-code reading 98
observability 66
operating system, multi-tasking 20
operational envelope 64
organisational
  guidelines 41
  procedures 41
partial beta factor model 178
partition testing 152, 168
Pascal 94, 105, 109
path coverage 145, 159
penalty and benefit 93
performance
  analysis 207
  constraints 83
  control loop 60
  degradation 80
  model 20
  monitoring 205
  requirements 182
  step response 60
  testing 31, 203, 206
periodic processes 87
Petri Nets 85
pfd 161, 179
post-conditions 22, 64
pre-conditions 22, 64
predictability 66
probabilistic testing 192, 203, 206
probability
  of failure 158
  of failure on demand (pfd) 161, 179
procedural parameter 79-80
process 45
  assessment 2
  periodic 87
  simulation 182
  sporadic 87
processor utilisation 89
product assessment 2
program based adequacy criterion 150
program text based 148
project safety case lifecycle 48
proof check 10
proof, formal 31
protection system 4, 25, 62, 72, 155-156, 159-160, 179
prototyping 182
QA 48
qualifications and experience requirements 43
quality
  assurance 25
  control 25
quality assurance (QA) 48
quality requirements 101
quantitative hardware analysis 11
random 6
  errors 178
  failure 24
  input testing 21, 177
  testing 112, 148, 152, 155
rate monotonic algorithm 86, 89
real time
  hard 133
  soft 133
recursion prohibition 94
redundancy 6
  active 6
  dual 6
  passive 6
  triple 6


reliability 51, 122, 129, 143, 155, 157, 177, 190, 207, 209-211
  growth modelling 156, 177, 181, 205
  probabilistic estimate of 25
  quantitative 31
repeatability 66
required integrity level 109, 154
requirements
  animation 209
  specification 45, 50
Requirements and Traceability Management (RTM) 113
resource
  modelling 31
  usage 39
  utilisation 65
response timings 182
review procedures 41
risk
  analysis 33, 41, 44, 163-164
  assessment 33-34, 102
  reduction 44
  tolerable 42
  tolerable level 44
risk reduction
  facilities 44
  measures 54
  requirements (RRR) 102
  techniques 41
RRR 102
RTM 105-106, 113
safety
  analysis 208
  argument 158
  assessment 24-25, 28, 43, 48, 177, 201
  assessment methodology 43
  assurance 1
  case 1-2, 17, 26, 33, 37, 41-42, 54, 102, 107, 139, 141, 171
  compliance assessment 44


  function 5, 20, 34, 54
  hazard 29
  integrity 1, 8, 30-31, 34, 107
  integrity level 54, 123
  integrity targets 44
  justification 3, 20, 33, 103, 121, 125
  lifecycle 3, 17, 33, 45, 49, 54, 102, 126
  lifecycle phases 45
  management plan 50
  management system 43
  objectives 25, 30
  principles 42
  records 44
  requirements 5, 29, 33, 161
  requirements specification 50
  review 41
  risk assessment 107
  specification 190
  targets 41
  validation 52
  validation report 53
  verification 44
safety function 10
sample path 131
sampler 64
schedule, pre-run-time 100
scheduler
  cyclic 90
  non-pre-emptive 86
  pre-emptive 86
  static cyclic 90
scheduling
  deterministic static cyclic 86
  pre-run-time 100
  pre-run-time (or static) 87
  run-time 87
self tests 62
sensitivity analysis 43
set theoretical model 3, 125, 129, 141
simulated annealing 90-92
simulation 22, 27, 68, 121
  accuracy 125, 132, 134, 137-138, 141
  dynamic 209
  environment 132
  static 209
  stochastic 89
simulator 3, 101, 125, 139
single-version testing 156
sneak circuit analysis 183
software
  complexity 154
  decomposition 157
  hazard analysis 37-39
  integrity 155, 157
  interface 62
  reliability 25, 168, 178
  safety 29
  safety assessments 54
  variants 81
software structure coverage analysis 211
specification of safety requirements 41
specification animators 127
sporadic processes 87
SST 155
  demand based 160
state transitions 61
Statemate 104, 106, 110
statement coverage 118, 145, 159
static
  analysis 21, 31, 37, 39, 78-79, 100, 117, 121, 149, 179, 182-183, 186, 188, 201-202, 204-206
  cyclic scheduler 90
  simulation 209
  testing 25, 120, 208-209
  testing tools 191
statistical
  inference 164
  software testing 3
  testing 204, 209
statistical inference 170
Statistical Software Testing (SST) 155
step response performance 60
stochastic
  evolution 92
  process model 3, 125, 131
  simulation 89
stop rule 146
stress testing 182
structural testing 179, 183, 188
structure coverage 148
structure diagrams 76
structure-based testing 182
structured analysis 103
structured design 76, 81
SUM 164
surge immunity testing 204
symbolic execution 183
syntactic structure 147
system
  certification 43
  diverse 7
  modularity 79
  performance 20
  reliability 172
  resources 39
  testability 171
  validation specification 50
system function calls 39
systematic errors 178
systems
  hard real-time 85
  protection 25
target system architecture 97
target system drivers 127
TCF 171
Teamwork 113
TeamworkRqt 106
TeamworkSIM 106
test
  acceleration 169
  adequacy 3, 143, 157
  adequacy measurement 155
  case selection criterion 144
  cases 22, 64, 128, 144
  cost factor 172
  coverage 26-27, 31
  coverage based 21
  coverage criteria 26
  criterion 143
  data adequacy criteria 144
  dynamic 21, 25, 27, 111, 120
  environment 52, 101
  expenditure 19, 172, 194
  functional 21
  harness 23
  methods 26, 40
  objectives 52
  plan 19
  planning 19, 108, 154
  quality 154
  random input 21, 177
  regime 22, 102-103, 172
  regime assessment 3, 171
  report 52
  schedule 112
  script 116
  specification 20, 52, 71
  static 25, 27, 120
  strategy 102
  validity 60, 79, 166
  white box 104


test adequacy criteria
  functional testing 146
  statement coverage 144
test adequacy measurement
  boundary analysis 147
test case
  design 40
  execution 182
  generation 40
  management 23, 104
  selection 145, 211
test cost assessment 171
test cost factor (TCF) 171
test coverage 202, 205, 210
  requirements based analysis 211
test coverage criteria 208
test data adequacy criteria
  control flow 148
  data flow 148
testability 3, 59, 64, 66, 75, 82, 151, 173, 179, 184-186, 188
testing
  back-to-back 209
  black box 40, 111, 203
  dynamic 25, 31, 64, 110, 208-210
  functional 64, 203-204, 206
  performance 31, 203, 206
  probabilistic 192
  random 112, 209-210
  static 31, 208-209
  statistical 204, 209-210
  systematic 209-210
  trajectory based 210
  white box 40, 111
testing approach 148
testing techniques
  boundary value analysis 182
  dynamic 177
  functional testing 182
  partition testing 168-169, 182
  random testing 155, 181
  static 177
  structural testing 182
time based models 156
time deterministic mode 20
time index set 131
time resolution 131
Timed CSP 85-86, 100
timed Petri nets 86
timing 3, 39, 64, 82-83, 128, 176


  analysis 100
  behaviour 89
  constraints 83
  errors 85
  of events 38
  requirements 84
  resource constraints 89
tolerable level of risk 44
tolerable risk 42
tool
  dynamic testing 191
  static testing 191
traceability 20, 66, 102, 112, 114, 121, 184
traceability matrix 113
transaction analysis 205
urn model 156
  single 164
validation 19, 25, 27, 31, 33, 37, 41, 102, 107-108, 125, 132, 136-137, 140-141, 176, 191, 207, 210
  empirical 166
  safety 52
verification 19, 25, 27, 31, 33, 37, 41, 84, 89, 102, 107, 125, 136-137, 140-141, 191, 207, 210-211
voting component 6
voting logic 62
walkthroughs 183
WCET 95
Weyuker's axiom system 153
white box testing 31, 40, 104, 111-112, 147, 168
worst case execution times 3, 85, 94-95, 100