
Software Assessment for Shipboard Computer Based System

July 2008

Guidance Note NI 425 DT R01 E

Marine Division 92077 Paris La Défense Cedex - France

Tel: +33 (0)1 42 91 52 91 – Fax: +33 (0)1 42 91 53 20 Marine website: http://www.veristar.com Email: [email protected]

© 2008 Bureau Veritas – All rights reserved

ARTICLE 1

1.1. - BUREAU VERITAS is a Society the purpose of whose Marine Division (the “Society”) is the classification (“Classification”) of any ship or vessel or structure of any type or part of it or system therein collectively hereinafter referred to as a “Unit” whether linked to shore, river bed or sea bed or not, whether operated or located at sea or in inland waters or partly on land, including submarines, hovercrafts, drilling rigs, offshore installations of any type and of any purpose, their related and ancillary equipment, subsea or not, such as well head and pipelines, mooring legs and mooring points or otherwise as decided by the Society.

The Society:
• prepares and publishes Rules for classification, Guidance Notes and other documents (“Rules”);
• issues Certificates, Attestations and Reports following its interventions (“Certificates”);
• publishes Registers.

1.2. - The Society also participates in the application of National and International Regulations or Standards, in particular by delegation from different Governments. Those activities are hereafter collectively referred to as “Certification”.

1.3. - The Society can also provide services related to Classification and Certification such as ship and company safety management certification; ship and port security certification; training activities; all activities and duties incidental thereto such as documentation on any supporting means, software, instrumentation, measurements, tests and trials on board.

1.4. - The interventions mentioned in 1.1., 1.2. and 1.3. are referred to as “Services”. The party and/or its representative requesting the services is hereinafter referred to as the “Client”. The Services are prepared and carried out on the assumption that the Clients are aware of the International Maritime and/or Offshore Industry (the “Industry”) practices.

1.5. - The Society is neither and may not be considered as an Underwriter, Broker in ship’s sale or chartering, Expert in Unit’s valuation, Consulting Engineer, Controller, Naval Architect, Manufacturer, Shipbuilder, Repair yard, Charterer or Shipowner, who are not relieved of any of their expressed or implied obligations by the interventions of the Society.

ARTICLE 2

2.1. - Classification is the appraisement given by the Society for its Client, at a certain date, following surveys by its Surveyors along the lines specified in Articles 3 and 4 hereafter on the level of compliance of a Unit to its Rules or part of them. This appraisement is represented by a class entered on the Certificates and periodically transcribed in the Society’s Register.

2.2. - Certification is carried out by the Society along the same lines as set out in Articles 3 and 4 hereafter and with reference to the applicable National and International Regulations or Standards.

2.3. - It is incumbent upon the Client to maintain the condition of the Unit after surveys, to present the Unit for surveys and to inform the Society without delay of circumstances which may affect the given appraisement or cause to modify its scope.

2.4. - The Client is to give to the Society all access and information necessary for the performance of the requested Services.

ARTICLE 3

3.1. - The Rules, procedures and instructions of the Society take into account at the date of their preparation the state of currently available and proven technical knowledge of the Industry. They are not a code of construction, neither a guide for maintenance or a safety handbook.

Committees consisting of personalities from the Industry contribute to the development of those documents.

3.2. - The Society only is qualified to apply its Rules and to interpret them. Any reference to them has no effect unless it involves the Society’s intervention.

3.3. - The Services of the Society are carried out by professional Surveyors according to the Code of Ethics of the Members of the International Association of Classification Societies (IACS).

3.4. - The operations of the Society in providing its Services are exclusively conducted by way of random inspections and do not in any circumstances involve monitoring or exhaustive verification.

ARTICLE 4

4.1. - The Society, acting by reference to its Rules:
• reviews the construction arrangements of the Units as shown on the documents presented by the Client;
• conducts surveys at the place of their construction;
• classes Units and enters their class in its Register;
• surveys periodically the Units in service to note that the requirements for the maintenance of class are met.

The Client is to inform the Society without delay of circumstances which may cause the date or the extent of the surveys to be changed.

ARTICLE 5

5.1. - The Society acts as a provider of services. This cannot be construed as an obligation bearing on the Society to obtain a result or as a warranty.

5.2. - The certificates issued by the Society pursuant to 5.1. here above are a statement on the level of compliance of the Unit to its Rules or to the documents of reference for the Services provided for.

In particular, the Society does not engage in any work relating to the design, building, production or repair checks, neither in the operation of the Units or in their trade, neither in any advisory services, and cannot be held liable on those accounts. Its certificates cannot be construed as an implied or express warranty of safety, fitness for the purpose, seaworthiness of the Unit or of its value for sale, insurance or chartering.

5.3. - The Society does not declare the acceptance or commissioning of a Unit, nor of its construction in conformity with its design, that being the exclusive responsibility of its owner or builder, respectively.

5.4. - The Services of the Society cannot create any obligation bearing on the Society or constitute any warranty of proper operation, beyond any representation set forth in the Rules, of any Unit, equipment or machinery, computer software of any sort or other comparable concepts that has been subject to any survey by the Society.

ARTICLE 6

6.1. - The Society accepts no responsibility for the use of information related to its Services which was not provided for the purpose by the Society or with its assistance.

6.2. - If the Services of the Society cause to the Client a damage which is proved to be the direct and reasonably foreseeable consequence of an error or omission of the Society, its liability towards the Client is limited to ten times the amount of fee paid for the Service having caused the damage, provided however that this limit shall be subject to a minimum of eight thousand (8,000) Euro, and to a maximum which is the greater of eight hundred thousand (800,000) Euro and one and a half times the above mentioned fee.

The Society bears no liability for indirect or consequential loss such as e.g. loss of revenue, loss of profit, loss of production, loss relative to other contracts and indemnities for termination of other agreements.

6.3. - All claims are to be presented to the Society in writing within three months of the date when the Services were supplied or (if later) the date when the events which are relied on were first known to the Client, and any claim which is not so presented shall be deemed waived and absolutely barred.

ARTICLE 7

7.1. - Requests for Services are to be in writing.

7.2. - Either the Client or the Society can terminate as of right the requested Services after giving the other party thirty days' written notice, for convenience, and without prejudice to the provisions in Article 8 hereunder.

7.3. - The class granted to the concerned Units and the previously issued certificates remain valid until the date of effect of the notice issued according to 7.2. hereabove, subject to compliance with 2.3. hereabove and Article 8 hereunder.

ARTICLE 8

8.1. - The Services of the Society, whether completed or not, involve the payment of fee upon receipt of the invoice and the reimbursement of the expenses incurred.

8.2. - Overdue amounts are increased as of right by interest in accordance with the applicable legislation.

8.3. - The class of a Unit may be suspended in the event of non-payment of fee after a first unfruitful notification to pay.

ARTICLE 9

9.1. - The documents and data provided to or prepared by the Society for its Services, and the information available to the Society, are treated as confidential. However:
• Clients have access to the data they have provided to the Society and, during the period of classification of the Unit for them, to the classification file consisting of survey reports and certificates which have been prepared at any time by the Society for the classification of the Unit;
• copy of the documents made available for the classification of the Unit and of available survey reports can be handed over to another Classification Society Member of the International Association of Classification Societies (IACS) in case of the Unit’s transfer of class;
• the data relative to the evolution of the Register, to the class suspension and to the survey status of the Units are passed on to IACS according to the association working rules;
• the certificates, documents and information relative to the Units classed with the Society may be reviewed during IACS audits and are disclosed upon order of the concerned governmental or inter-governmental authorities or of a Court having jurisdiction.

The documents and data are subject to a file management plan.

ARTICLE 10

10.1. - Any delay or shortcoming in the performance of its Services by the Society arising from an event not reasonably foreseeable by or beyond the control of the Society shall be deemed not to be a breach of contract.

ARTICLE 11

11.1. - In case of diverging opinions during surveys between the Client and the Society’s surveyor, the Society may designate another of its surveyors at the request of the Client.

11.2. - Disagreements of a technical nature between the Client and the Society can be submitted by the Society to the advice of its Marine Advisory Committee.

ARTICLE 12

12.1. - Disputes over the Services carried out by delegation of Governments are assessed within the framework of the applicable agreements with the States, international Conventions and national rules.

12.2. - Disputes arising out of the payment of the Society’s invoices by the Client are submitted to the Court of Nanterre, France.

12.3. - Other disputes over the present General Conditions or over the Services of the Society are exclusively submitted to arbitration, by three arbitrators, in London according to the Arbitration Act 1996 or any statutory modification or re-enactment thereof. The contract between the Society and the Client shall be governed by English law.

ARTICLE 13

13.1. - These General Conditions constitute the sole contractual obligations binding together the Society and the Client, to the exclusion of all other representation, statements, terms, conditions whether express or implied. They may be varied in writing by mutual agreement.

13.2. - The invalidity of one or more stipulations of the present General Conditions does not affect the validity of the remaining provisions.

13.3. - The definitions herein take precedence over any definitions serving the same purpose which may appear in other documents issued by the Society.

MARINE DIVISION

GENERAL CONDITIONS

BV Mod. Ad. ME 545 j - 16 February 2004


Table of Contents

1. General
   1.1. General
   1.2. Field of application
   1.3. References
   1.4. Definitions
   1.5. Abbreviations

2. Software requirements
   2.1. General
   2.2. Functional requirements
   2.3. Safety of operation
   2.4. Monitoring
   2.5. Fall-back arrangement
   2.6. Documentation

3. Software Development process
   3.1. General
   3.2. Software Quality Plan (Guidelines)

4. Software Assessment
   4.1. Software classification
   4.2. Software Category notations
   4.3. Software Assessment

Annex A Questionnaire on the tests and inspections to be performed

Annex B Software Quality Plan Guidelines
   B.1. Development and design
      B.1.1. Scope
      B.1.2. Development Projects
      B.1.3. Design description
      B.1.4. Re-use
   B.2. Methodology for development
      B.2.1. Development planning
      B.2.2. Specification requirements
      B.2.3. Preliminary design
      B.2.4. Detailed design
      B.2.5. Coding and module testing
      B.2.6. Integration testing
      B.2.7. Validation and installation
      B.2.8. Exploitation
   B.3. Responsibility
      B.3.1. Product Manager
      B.3.2. Development Manager
      B.3.3. Project Manager
   B.4. Execution
      B.4.1. Requirements
      B.4.2. Project Handbook
      B.4.3. Project Plan
      B.4.4. Manning Plan
      B.4.5. Cost Plan
      B.4.6. Documentation Plan
      B.4.7. Test Plans
   B.5. Project Monitoring
   B.6. Project Evaluation
   B.7. Design Review
      B.7.1. Purpose
      B.7.2. Scope
      B.7.3. Responsibility
      B.7.4. Constitution of the Design Review Board
   B.8. Prototype
      B.8.1. Documentation
      B.8.2. Factory Acceptance Test
      B.8.3. Hand-over Meeting
      B.8.4. Training
   B.9. Relevant Documents
   B.10. Test and verification of software
   B.11. Software Test Documentation
      B.11.1. Software Test and Verification Plan
      B.11.2. Software Test Specification
      B.11.3. Test Log
   B.12. Failure report
   B.13. Software maintenance

Index of Tables
   Table I: System/software categories
   Table II: Example of assignment to system/software categories
   Table III: Test reports and documentation to be submitted


1. General

1.1. General

1.1.1. This document is intended for the assessment of software fitted on shipboard computer based systems.

1.1.2. Within the scope of this Guidance Note, an assessment attestation may be issued.

1.1.3. Software assessment block diagram

[Block diagram, not reproduced here: operational conditions (safe navigation; avoiding grounding, collision and damage; reducing failure effects; increasing availability and reliability) and the shipboard control, protection, alarm/monitoring, information and support systems (steering, speed governor, course tracking, main engine, power management, fire, navigation, communication, manoeuvring, docking, loading, habitability, etc.) are mapped onto the Bureau Veritas assessment procedure described in Sections 1 to 4 of this Note (General, Software Requirements, Software Development Process, Software Assessment), leading to the Software Assessment Certificate, and onto the referenced standards: SOLAS, IACS, IEC 60945, IEC 60092-504, IEC 61508-3, IEC 61511, ISO 9001 and ISO 9000-3.]


1.2. Field of application

The present provisions especially apply to software involved in computer based systems which perform one or more of the following functions related to essential services as defined in NR467 Rules for the Classification of Steel Ships, Pt C, Ch 2, Sec 1, i.e.:
• Propulsion plant and auxiliaries (diesel engine, main boilers and turbine, electric motor, gearing, CPP...)
• Steering
• Navigation (centralized navigation system...)
• Electricity production (power management, electrical protection)
• Fire detection and fire fighting (alarm system...)
• Automation systems
• Cargo (load computer)
• External and internal communication (network...)
• Safety systems (ESD...)
• Instrumentation (gas detection, intelligent sensors...)

They may also apply to software related to non-essential services such as air conditioning management systems, etc.

1.3. References

This Guidance Note is based on principles derived from the following rules and standards:
• International Maritime Organisation: SOLAS.
• IEC 60945: “Maritime navigation and radiocommunication equipment and systems – General requirements – Methods of testing and required test results.”
• IEC 61508-3: “Functional Safety of Electrical/Electronic/Programmable Electronic Safety-Related Systems – Part 3: Software Requirements.”
• IEC 60092-504: “Electrical installations in ships – Part 504: Special features – Control and instrumentation.”
• IEC 61511: “Safety instrumented systems for the process industry sector.”
• DO-178B: “Software Considerations in Airborne Systems and Equipment Certification.”
• BV Rules, Part C, Chapter 3.
• IMO MSC Circular N° 891: Guidelines for the on-board use and application of computers.
• ISO/IEC 90003: Software engineering – Guidelines for the application of ISO 9001:2000 to computer software.
• ISO 9001: Quality management systems – Requirements.
• IACS Unified Requirement E22: Unified requirement for the on-board use and application of programmable electronic systems.

1.4. Definitions

Alarm: audible and visual signal announcing a condition requiring immediate attention or user action.

Application Software: software performing tasks specific to the actual configuration of the computer based system and supported by the basic software.

Basic software: the minimum software, which includes firmware and middleware, required to support the application software.

Code: implementation of particular data or of a particular computer program in a symbolic form, such as source code, object code, or machine code.

Computer: a programmable electronic device for storing and processing data, making calculations, or performing control.

Notes:
a) Generally, the term computer means a digital computer.
b) A computer may consist of a stand-alone unit or of several interconnected units, and includes any programmable electronic system (PES), including mainframe, minicomputer or microcomputer.


Computer-based system: A system of one or more computers, associated software, peripherals, and interfaces.

Detailed design: logical continuation of preliminary design, considered as a process of separating, usually with the express purpose of isolating one or more attribute(s) of the software, to prevent specific interactions and cross-coupling interference.

Fall back arrangement: automatic reaction of the system to a failure to provide the best possible functionality.

Functionality: ability to perform an intended function which uses systems of display, controls, and instrumentation.

Integration testing: tests allowing verification of the transitions between the different parts of a program.

Integrated system: A combination of computer-based systems, which are interconnected in order to allow centralised access to sensor information and/or command/control.

Note: Integrated systems may, for example, perform one or more of the following operations:
• Passage execution (e.g. steering, speed control, traffic surveillance, voyage planning)
• Communications (e.g. radiotelephone, radio telex, GMDSS)
• Machinery (e.g. power management, machinery monitoring, fuel oil / lubrication oil transfer)
• Cargo (e.g. cargo monitoring, inert gas generation, loading/discharging)
• Safety and security (e.g. fire detection, fire pump control, watertight doors)

Integrated navigation system: combination of systems or functions of navigational aids that are interconnected to increase safe and efficient navigation when used by suitably qualified personnel.

Integrity: property of information as being accurate, valid with regard to specified requirements, and verified by comparing data from more than one independent source.

Metrics: Refers to the specific measures used in studying a particular process or organisation.

Module testing: Each software module is individually tested to detect and correct errors.

Operating systems (OS): Set of computer programs that manages the hardware and software resources of a computer.

Preliminary design: first approach to the software architecture; the structure selected to implement the software requirements.

Software component: self-contained part, combination of parts, sub-assemblies, or units which performs a distinct function in a system.

Software integration: process of combining software components.

Software: computer programs, and possibly associated documentation and data pertaining to the operation of a computer system.

Software development planning: description of the software life cycle and software development environment. It is an estimated timetable of tasks in which constraints, critical points, and resources are identified.

Software requirements: description of what is to be produced by the software given input and constraints. Software requirements include both high level requirements and low-level requirements.

System data: data that is used by the system for the processing and display of essential information. System data of the same type is from the same source. System data, at least for primary navigation information, has been checked for integrity.


System integrator: organisation responsible for ensuring that the system complies with the requirements of this standard.

Testing: operation intended to verify the correct operation of an apparatus or the proper execution of a program as a whole.

Verification: Confirmation, through the provision of objective evidence, that specified requirements have been fulfilled.

Validation: Confirmation, through the provision of objective evidence, that the requirements for a specific intended use or application have been fulfilled.

Validity: conformity of information with formal and logical criteria, or the marking of data as being good or no good (valid or invalid) for the intended use.

Vessel: water craft of any description, including non-displacement craft and seaplanes, used or capable of being used as a means of transportation on water.

Warning: signal announcing a condition requiring non-immediate attention for precautionary reasons.

1.5. Abbreviations

DR: Design Review
DRB: Design Review Board
ECO: Engineering Change Order
EUT: Equipment Under Test
FAT: Factory Acceptance Test
IEEE: Institute of Electrical and Electronics Engineers
MMI: Man-Machine Interface
UDF: Unit Development Folder
PMS: Power Management System
SQA: Software Quality Assurance
…


2. Software requirements

2.1. General

2.1.1. Standards/guidelines and naming conventions are to be specified and followed throughout the documentation.

2.1.2. The basic software is to be developed in consistent and independent modules.

A self-checking function (e.g. auto test) is to be provided to identify the failure of a software module.

When hardware (e.g. input /output devices, communication links, memory, etc.) is arranged to limit the consequences of failures (e.g. modular concept), the corresponding software is also to be separated in different software modules ensuring the same degree of independence, in order to avoid a common single failure.
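Note: As an illustration only, the sketch below (in Python) shows one possible form of such a module self-checking function; the module names and the checksum-based auto test are assumptions made for the example and are not requirements of this Note.

import zlib

# Illustrative, hypothetical "modules": in a real system these would be the
# code/data images of independently loaded software modules.
MODULES = {
    "steering_control": b"...steering control image...",
    "alarm_monitoring": b"...alarm monitoring image...",
}

# Reference checksums computed at installation time (auto-test baseline).
REFERENCE_CRC = {name: zlib.crc32(image) for name, image in MODULES.items()}

def self_check():
    """Return the list of modules whose image no longer matches its baseline.

    Each module is checked independently, so a failed check on one module
    does not prevent the remaining modules from being verified (supporting
    the required module independence).
    """
    failed = []
    for name, image in MODULES.items():
        if zlib.crc32(image) != REFERENCE_CRC[name]:
            failed.append(name)
    return failed

if __name__ == "__main__":
    failures = self_check()
    if failures:
        print("SELF-CHECK FAILURE in modules:", ", ".join(failures))
    else:
        print("All software modules passed the self-check.")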

2.1.3. Application software is to be tested to verify the ability to perform intended functions with all systems interconnected (or by means of simulation tools, as relevant).

2.1.4. The code of practice employed in the design and testing of the software integral to the operation of the equipment under test is to be specified and is to conform to a quality control system audited by a competent authority.

2.1.5. The code of practice is to define the methodology used in the development of the software and the standards applied, following the criteria below:
a) Complex software is to be structured to support separate testing of single modules or of groups of associated modules.
b) The structure is to support maintenance and updates of software by minimizing the risk of undetected problems or failures and by ensuring non-regression.
c) The manufacturer is to supply documentation demonstrating that the software of the system is developed and tested according to the code of practice and to this document.

2.1.6. A description of the operating system configuration is to be documented (listing, limitations, parameters…).

The functions delivered within the operating system packages are to be limited to those needed by the system. The unused options are to be disabled.

2.1.7. The tools and additional software are to be identified and documented.

2.2. Functional requirements

2.2.1. Functional requirements are to be uniquely numbered.

2.2.2. Interface requirements to other functions or external entities are to be clearly identified.

2.2.3. All common functions are to be identified.

2.2.4. The requirements are to be stated so that they are discrete, unambiguous and testable.

2.2.5. Each decision, selection, and computational function that the software must perform is to be clearly defined.

2.2.6. A dictionary for all data elements is to be provided. The data dictionary is to be complete.
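Note: As an illustration only, a data dictionary entry could record, for each data element, at least its name, description, type, unit, range and source; the field names and the example element below are assumptions, not a format prescribed by this Note.

# Hypothetical data dictionary entry for one data element of a control system.
# The field names and the example values are illustrative only.
DATA_DICTIONARY = {
    "ME_SHAFT_SPEED": {
        "description": "Main engine shaft speed",
        "type": "float",
        "unit": "rpm",
        "range": (0.0, 120.0),
        "source": "shaft tachometer input #1",
        "used_by": ["speed_governor", "alarm_monitoring"],
    },
}

def is_complete(entry):
    """Check that a dictionary entry defines every mandatory field (2.2.6)."""
    mandatory = {"description", "type", "unit", "range", "source", "used_by"}
    return mandatory.issubset(entry)

if __name__ == "__main__":
    for name, entry in DATA_DICTIONARY.items():
        print(name, "complete" if is_complete(entry) else "incomplete")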


2.3. Safety of operation

2.3.1. Facilities are to be provided for protecting all operational software incorporated in the equipment.

2.3.2. Any software required in equipment to facilitate operation in accordance with its equipment standard, including that for its initial activation/reactivation, is to be permanently installed with the equipment, in such a way that it is not possible for the user to have access to this software.

2.3.3. It is not to be possible for the operator to augment, amend, or erase, during normal use, any program software in the equipment required for operation in accordance with the equipment standard. Data used during operation and stored in the system is to be protected in such a way, that necessary modifications and amendments by the user cannot endanger its integrity and correctness.

2.3.4. Default values are to be inserted whenever relevant to facilitate the required operation of the equipment.

2.3.5. The display and update of essential information available in the equipment, as well as safety related functions, are not to be inhibited due to operation of the equipment in any particular mode, for example dialogue mode.

2.3.6. The system is to give an indication when presented information is uncertain or derived from conflicting sources.

2.3.7. Software is to be protected against intrusion by efficient protections, such as antivirus, password levels, firewalls, as relevant.

2.4. Monitoring

2.4.1. Means are to be provided to monitor automatically the operational software and stored data of the equipment. The check is to be carried out during system start-up and at regular intervals, as indicated in the manufacturer's documentation. In the case of a non-automatically recoverable error or failure, the system is to release an independent alarm observable by the user at the workstation (e.g. watchdog).

2.4.2. An error log is to be installed and activated in order to supervise program execution and to record unwanted execution.

2.4.3. An alarm is to be triggered before a lack of computer resources could endanger software execution.
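Note: The sketch below illustrates, in Python, one possible combination of the three monitoring requirements above: a check of stored data (2.4.1), an error log (2.4.2) and an alarm raised before resource exhaustion (2.4.3). The thresholds, file names and checked quantities are illustrative assumptions only.

import logging
import os
import shutil

# 2.4.2 - error log supervising program execution (file name is illustrative).
logging.basicConfig(filename="system_error.log", level=logging.INFO)
log = logging.getLogger("monitoring")

DISK_ALARM_THRESHOLD = 0.90   # assumed limit: alarm at 90 % disk usage

def raise_alarm(message):
    """Stand-in for the independent alarm required by 2.4.1."""
    log.error(message)
    print("ALARM:", message)

def check_stored_data(expected_files):
    """2.4.1 - verify at start-up and at regular intervals that stored data exist."""
    missing = [f for f in expected_files if not os.path.exists(f)]
    if missing:
        raise_alarm("Stored data missing: " + ", ".join(missing))
    return not missing

def check_resources(path="/"):
    """2.4.3 - trigger an alarm before a lack of resources endangers execution."""
    usage = shutil.disk_usage(path)
    used_fraction = usage.used / usage.total
    if used_fraction > DISK_ALARM_THRESHOLD:
        raise_alarm(f"Disk usage {used_fraction:.0%} exceeds threshold")
    return used_fraction

if __name__ == "__main__":
    check_stored_data(["system_error.log"])
    check_resources()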

2.5. Fall-Back arrangement

2.5.1. After a failure, or when sensor data become invalid, the software is to support the availability of essential information and functions by using appropriate fall-back arrangements.

2.5.2. It is to be clearly indicated when the mode in use is not the normal mode in which the software fully performs its function.

2.5.3. After a fall-back arrangement, recovery of normal operation is only to be restored upon confirmation by the operator. Power interruption or intentional shutdown is not to be treated as a fall-back condition.
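Note: A minimal sketch of a fall-back arrangement consistent with 2.5.1 to 2.5.3 is given below: when sensor data become invalid the software falls back to the last valid value, clearly indicates the degraded mode, and only returns to normal operation after operator confirmation. The sensor, its plausibility limits and the display are illustrative assumptions.

class SpeedSensorChannel:
    """Illustrative speed input with a simple validity (range) check."""

    def __init__(self):
        self.last_valid = 0.0

    def read(self, raw_value):
        # Assumed plausibility limits for a ship speed sensor, in knots.
        if raw_value is not None and 0.0 <= raw_value <= 40.0:
            self.last_valid = raw_value
            return raw_value, True
        # 2.5.1 - keep the essential information available (best possible functionality)
        return self.last_valid, False


class SpeedDisplay:
    def __init__(self):
        self.sensor = SpeedSensorChannel()
        self.fallback_mode = False

    def update(self, raw_value):
        value, valid = self.sensor.read(raw_value)
        if not valid:
            self.fallback_mode = True
        # 2.5.2 - clearly indicate that the mode in use is not the normal mode
        mode = "FALLBACK MODE - confirm to resume normal operation" if self.fallback_mode else "NORMAL"
        print(f"Speed: {value:5.1f} kn   [{mode}]")

    def operator_confirms_recovery(self):
        # 2.5.3 - normal operation is only restored upon operator confirmation
        self.fallback_mode = False


if __name__ == "__main__":
    display = SpeedDisplay()
    display.update(12.3)               # normal operation
    display.update(None)               # sensor failure -> fall-back to last valid value
    display.update(12.5)               # still indicated as fall-back until confirmed
    display.operator_confirms_recovery()
    display.update(12.6)               # normal operation again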

2.6. Documentation

2.6.1. The document is to be concise and easy to follow.

2.6.2. The level of detail provided is to be appropriate to the purpose of the document.


2.6.3. The information is to contain evidence of document control (e.g. second reading, check by a person other than the author…).

2.6.4. The Software Requirement Specification is to describe a high level system overview.

2.6.5. The internal and external interface and data flows are to be described in high level system diagrams.

2.6.6. The system function and/or data flow is to be clearly and completely described.

2.6.7. The software environment (hardware, software, resources, and users) is to be specified.

2.6.8. All referenced documents, definitions, acronyms and abbreviations are to be listed.

2.6.9. The Software Requirement Specification is to contain a general description of the software system and operational concepts.

2.6.10. The user characteristics are to be defined.

2.6.11. General design, implementation constraints, dependencies are to be noted. General assumptions that affect implementation are to be stated.

2.6.12. Timing requirements are to be provided. Timing and memory limits are to be compatible with hardware constraints.

2.6.13. Any limit and/or restriction on software performance are to be defined (e.g. contract requirements, hardware capability...).

2.6.14. Each function is to be defined separately, with its purpose and scope.

2.6.15. The functional requirements are to be stated in terms of inputs, outputs and processing.

2.6.16. The functional requirements are to be the basis for the detailed design and the functional test cases.

2.6.17. The software Requirement Specification is to include a description of the performance requirements for each function.

2.6.18. The hardware limitations are to be identified for each function.

2.6.19. The following are to be identified:
• Environmental conditions (risk of inadvertent alteration of software)
• Computer software inventory (list of existing software intended to be used)
• Databases
• Test software (simulator…)
• Test methods (test demonstration, analysis or inspection, and acceptance criteria)
• Computer hardware resources.

2.6.20. A software user’s guide is to be available for assessment.


3. Software Development process

3.1. General

3.1.1. Support tools and programming languages
a) A suitable set of integrated tools, including languages, compilers, configuration management tools and, when applicable, automatic testing tools, is to be selected for the required software category. The availability of suitable tools to supply the relevant services over the whole software lifetime is to be considered.
b) Coding standards are to be:
• reviewed by the assessor;
• used for the development of all software.
c) The coding standards specify good programming practice, proscribe unsafe language features, and specify procedures for source code documentation. As a minimum, the following information is to be contained in the source code documentation (an illustrative example of such a header is given after this list):
• Legal entity (for example company, author(s), etc.);
• Description;
• Inputs and outputs;
• Configuration management history.
d) The software is to be produced to achieve modularity, testability and capability of safe modification.
e) For each major component or subsystem described in the software architecture design, further refinement of the design is to be based on a partitioning into software modules (i.e. specification of the software system design). The design of each software module and the tests to be applied to each software module are to be specified.
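Note: Purely as an illustration of item c) above, a source code file could carry the minimum documentation as a comment header; the layout, names and history entries below are assumptions, not a mandated format.

# ---------------------------------------------------------------------------
# Legal entity : Example Marine Automation Ltd. (illustrative name)
# Author(s)    : J. Doe
# Description  : Scaling of the shaft speed input for the alarm and
#                monitoring function (illustrative module).
# Inputs       : raw_counts (int) - tachometer pulse count per sampling period
# Outputs      : shaft speed in rpm (float)
# Configuration management history:
#   1.0  2008-01-15  first issue
#   1.1  2008-05-02  scaling factor corrected after FAT remark
# ---------------------------------------------------------------------------

PULSES_PER_REVOLUTION = 60          # illustrative sensor characteristic
SAMPLING_PERIOD_S = 0.5             # illustrative sampling period

def shaft_speed_rpm(raw_counts: int) -> float:
    """Convert a pulse count over one sampling period into rpm."""
    revolutions = raw_counts / PULSES_PER_REVOLUTION
    return revolutions * (60.0 / SAMPLING_PERIOD_S)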

3.1.2. Code implementation
a) The source code:
• is to be readable, understandable and testable;
• is to satisfy the specified requirements for software module design;
• is to satisfy the specified requirements of the coding standards;
• is to satisfy all relevant requirements specified during safety analysis.
b) Each module of software code is to be reviewed.

3.1.3. Module testing
a) Each software module is to be tested as required during software specification.
b) These tests are to demonstrate that each software module performs the intended function and does not lead to unintended behaviour (e.g. the software is robust to unexpected input data).
c) The results of the software module testing are to be documented.
d) The procedures for corrective action on failure of a test are to be specified.
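Note: A hedged example of a module test is sketched below, using Python's standard unittest framework against the illustrative shaft_speed_rpm() function from the previous sketch (re-declared here, with input validation added, so the example is self-contained); it exercises both the intended function and robustness to unexpected input.

import unittest

def shaft_speed_rpm(raw_counts):
    """Module under test (same illustrative function as in the previous sketch)."""
    if not isinstance(raw_counts, int) or raw_counts < 0:
        raise ValueError("raw_counts must be a non-negative integer")
    return (raw_counts / 60) * (60.0 / 0.5)

class ShaftSpeedModuleTest(unittest.TestCase):
    # a) module tested as required during software specification
    def test_intended_function(self):
        # 3000 pulses in 0.5 s at 60 pulses/rev -> 50 rev in 0.5 s -> 6000 rpm
        self.assertAlmostEqual(shaft_speed_rpm(3000), 6000.0)

    def test_zero_input(self):
        self.assertEqual(shaft_speed_rpm(0), 0.0)

    # b) no unintended behaviour on unexpected input data
    def test_rejects_invalid_input(self):
        with self.assertRaises(ValueError):
            shaft_speed_rpm(-1)
        with self.assertRaises(ValueError):
            shaft_speed_rpm("not a number")

if __name__ == "__main__":
    unittest.main()   # c) the test runner output forms part of the documented results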

3.1.4. Integration testing
a) Software integration tests are to be specified concurrently during the specification and development phases.
b) The specification of software integration tests is to include (an illustrative sketch is given after this list):
• The division of the software into manageable integration sets;
• Test cases and test data;
• Types of tests to be performed;
• Test environment, tools, configuration and programs;
• Test criteria on which the completion of the test will be judged; and
• Procedures for corrective action on failure of a test.
c) The tests are to demonstrate that all software modules and software components/subsystems interact correctly to perform their intended function and do not lead to any unintended behaviour.
d) The results of software integration testing are to be documented, stating that the objectives and test criteria have been met. The reasons for any failure are to be analysed.
e) During software integration, any modification or change is to be subjected to an impact analysis, which determines all software modules influenced and the necessary re-verification and re-design activities.
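Note: Purely as an illustration of the content listed in b) above, an integration test specification could be captured as structured data; the field names and example values are assumptions made for the sketch.

# Hypothetical specification of one software integration test (illustrative only).
INTEGRATION_TEST_SPEC = {
    "integration_set": "alarm_monitoring + shaft_speed_acquisition",
    "test_cases": [
        {
            "id": "IT-001",
            "input": {"raw_counts": 3000},
            "expected": {"displayed_rpm": 6000.0, "alarm": False},
        },
        {
            "id": "IT-002",
            "input": {"raw_counts": -1},          # invalid input propagated
            "expected": {"displayed_rpm": None, "alarm": True},
        },
    ],
    "test_types": ["functional", "failure injection"],
    "environment": {"tools": ["pytest"], "configuration": "bench simulator v2"},
    "completion_criteria": "all test cases pass; no unintended alarms raised",
    "on_failure": "record in failure report, analyse, correct, re-run the set",
}

if __name__ == "__main__":
    for case in INTEGRATION_TEST_SPEC["test_cases"]:
        print(case["id"], "->", case["expected"])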


3.1.5. Validation
a) The pass/fail criteria for achievement of software validation are to include:
• The required input signals with their sequences and their values;
• The anticipated output signals with their sequences and their values; and
• Other acceptance criteria, for example memory usage, timing and value tolerances.
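Note: A minimal sketch of how such pass/fail criteria might be applied is shown below: a recorded output sequence is compared against the anticipated one within a value tolerance, together with one further acceptance criterion. The sequences, tolerance and timing figure are illustrative assumptions.

# Illustrative validation criteria for one function (assumed figures).
ANTICIPATED_OUTPUT = [0.0, 10.0, 20.0, 20.0, 5.0]   # expected rudder angle sequence (deg)
VALUE_TOLERANCE = 0.5                               # deg
MAX_RESPONSE_TIME_S = 2.0                           # other acceptance criterion

def validate(recorded_output, response_time_s):
    """Return True if the recorded behaviour meets the pass/fail criteria."""
    if len(recorded_output) != len(ANTICIPATED_OUTPUT):
        return False
    values_ok = all(
        abs(actual - expected) <= VALUE_TOLERANCE
        for actual, expected in zip(recorded_output, ANTICIPATED_OUTPUT)
    )
    timing_ok = response_time_s <= MAX_RESPONSE_TIME_S
    return values_ok and timing_ok

if __name__ == "__main__":
    print(validate([0.1, 9.8, 20.2, 20.0, 5.3], response_time_s=1.4))   # True
    print(validate([0.1, 9.8, 25.0, 20.0, 5.3], response_time_s=1.4))   # False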

3.1.6. Software modification
a) Prior to carrying out any software modification, relevant procedures are to be made available.
b) A modification is to be initiated within the scope of an authorized procedure, for which specifications are to be detailed as follows:
• The reasons for the change;
• The proposed change;
• The possible hazards.
c) An analysis is to be carried out on the impact of the proposed software modification on the system functionalities in order to determine:
• whether or not a hazard and risk analysis is required;
• which software lifecycle phases will need to be repeated.
d) The impact analysis results are to be documented.
e) Regression tests are to be specified.

3.1.7. Verification
a) The verification of software is to be planned according to the software quality plan.
b) The software verification planning refers to the criteria, techniques and tools to be used in the verification activities, and addresses:
• The evaluation of the safety integrity requirements, where applicable;
• The selection of documentation strategies, activities and techniques;
• The selection and utilisation of verification tools;
• The evaluation of verification results;
• The corrective actions to be taken.
c) Documentary evidence is to show that software control has been satisfactorily completed at each step.
d) For each verification step, documentation is to include:
• Identification of items;
• Identification of verification data;
• Non-conformances.
e) The following verification activities are to be performed:
• Verification of software requirements;
• Verification of software architecture;
• Verification of software system design;
• Verification of software module design;
• Verification of code;
• Data verification;
• Software module testing;
• Software integration testing;
• Programmable hardware integration testing;
• Software safety requirements testing, where applicable.
f) Regression tests are to be performed after modifications, where applicable.

3.2. Software Quality Plan (Guidelines)

3.2.1. Generality

The Software Quality Assurance (SQA) Plan establishes the methods to be used to achieve the objectives of the SQA process. The SQA Plan may include descriptions of process improvement, metrics, and progressive management methods. This plan should include:
a) Environment: A description of the SQA environment, including scope, organizational responsibilities and interfaces, standards, procedures, tools and methods.
b) Authority: A statement of the SQA authority, responsibility, and independence, including the approval authority for software products.


c) Activities: The SQA activities that are to be performed for each software life cycle process and throughout the software life cycle, including:
• SQA methods, for example reviews, audits, reporting, inspections, and monitoring of the software life cycle processes;
• Activities related to the problem reporting, tracking and corrective action system;
• A description of the software conformity review activity.
d) Transition criteria: The transition criteria for entering the SQA process.
e) Timing: The timing of the SQA process activities in relation to the activities of the software life cycle processes.
f) SQA Records: A definition of the records produced by the SQA process.
g) Supplier's control: A description of the means of ensuring that sub-tier suppliers' processes and outputs will comply with the SQA Plan.

3.2.2. Quality assurance and control

Quality assurance and control are based on the production and the follow-up of the quality plan.

The quality plan is to be updated following the development progress. It is to be reviewed by all the participants in its production.

Note 1: ISO/IEC 90003 gives guidelines for the application of ISO 9001 to the development, supply, and maintenance of software.

Note 2: Refer also to SOLAS Ch. V Reg. 18 – 5.


4. Software Assessment

4.1. Software classification

The guidance is based on the classification defined for computer-based systems. The computer based systems are to be assigned to one of three system categories, as shown in Table I, according to the possible extent of damage caused by a single failure within the computer-based system.

Consideration is to be given to the extent of the damage directly caused by a failure, but not to any consequential damage.

Identical redundancy will not be taken into account for the assignment of a system category.

The assignment of a computer-based system to the appropriate system category is to be made according to the greatest likely extent of direct damage. For example, refer to Table II

Note: Where an independent effective backup or other means of averting danger is provided, system category III may be decreased by one category.

Table I: System/software categories

Category I
• Effects: systems, failure of which will not lead to dangerous situations for human safety, safety of the vessel and/or threat to the environment.
• System functionality: monitoring functions for informational / administrative tasks.

Category II
• Effects: systems, failure of which could eventually lead to dangerous situations for human safety, safety of the vessel and/or threat to the environment.
• System functionality: alarm and monitoring functions; control functions which are necessary to maintain the ship in its normal operational and habitable conditions.

Category III
• Effects: systems, failure of which could immediately lead to dangerous situations for human safety, safety of the vessel and/or threat to the environment.
• System functionality: control functions for maintaining the vessel's propulsion and steering; safety functions (1); navigation functions.

(1) Software systems related to safety functions are subject to specific analysis by the Society in accordance with IEC 61508 (e.g. emergency shut-down, engine safeties).


Table II: Example of assignment to system/software categories (1)

Category I
• Maintenance support systems
• Information and diagnostic systems

Category II
• Alarm and monitoring equipment
• Tank capacity measuring equipment
• Control systems for auxiliary machinery
• Main propulsion remote control systems
• Fire detection systems
• Fire extinguishing systems
• Bilge systems
• Governors

Category III
• Machinery protection systems/equipment
• Burner control systems
• Electronic fuel injection for diesel engines
• Loading instrument, stability computer
• Navigation equipment used for course control
• Control systems for propulsion and steering
• Course control systems, including positioning systems where manual intervention to avert danger in the event of failure or malfunction is no longer possible

(1) The examples listed are not exhaustive.

4.2. Software Category notations

To the three categories of software defined in 4.1 correspond respectively three notations as follows:
• ST1: software category I
• ST2: software category II
• ST3: software category III

The category notation is indicated on the attestation certifying that the considered software has been examined by the Society in accordance with the requirements of the present document. The attestation, issued within the scope of the Bureau Veritas Marine Division General Conditions, is valid for five years. After this period, it may be renewed, the possible modifications being submitted to the Society.

Any change in software design or in conditions of use which has not been submitted to the Society may entail withdrawal of the attestation.

4.3. Software Assessment

4.3.1. Condition of granting

Software approval is granted subject to the assessment of the development quality and of the test results referred to in this document.

The whole documentation and test evidence are to be provided to the Society.

Additional documents may be requested for evaluation of software.

4.3.2. A Software Assessment attestation may be issued at the request of the manufacturer when approval is granted.


4.3.3. Documentation

Table III: Test reports and documentation to be submitted

                                                                    Software Category
No.   Tests and evidence                                            I      II     III
1.    Evidence of quality system
1.1.  Quality plan for software                                     M      S      S
1.2.  Quality control in production                                        M      M
1.3.  Final test reports                                            M      S      S
1.4.  Traceability of software                                      M      M      S
2.    Evidence of software validity requirements:
      - safety of operation
      - monitoring
      - fall-back arrangement
2.1.  Software description                                          M      M      S
2.2.  Failure analysis (for safety related functions only)                 M      S
3.    Evidence of software testing
3.1.  Evidence of software testing according to quality plan               M      S
3.2.  Analysis regarding existence and fulfilment of
      programming procedures for safety related functions                  M      S
3.3.  Code inspection, walk-through                                               M
4.    Software tests (1)
4.1.  Module tests                                                          M      S
4.2.  Subsystem tests                                                       M      S
4.3.  System tests                                                          M      S
5.    Performance tests
5.1.  Integration test                                                      M      W
5.2.  Fault simulation                                                      W      W
5.3.  Factory Acceptance Test (FAT)                                 M       W      W
6.    Modifications
6.1.  Tests after modifications (2)                                         W/S    W/S
7.    User's guide

M = Evidence kept by manufacturer and submitted on request
S = Evidence checked by the Society
W = To be witnessed by the Society

(1) In addition to the above, specific tests for program approval would be required for systems which are parts of a safety system or which integrate safety functions (see note (1) to Table I).
(2) Subsequent significant modifications to the software are to be submitted for approval.
Note: A significant modification is a modification which influences the functionality and/or the safety of the system.


Annex A Questionnaire on the tests and inspections to be performed


§ REF Description Y, N, NA

ST I

ST II

ST III Documentation

provided

GENERAL

4.3.3 Are requested document lists available �� �� ��

SOFTWARE REQUIREMENT

2.1 General

2.1.1 Are standards and naming conventions specified?

� �� ��

2.1.2 Is the software developed in independent modules to limit the consequence of failure?

� �� ��

2.1.3 Is the software tested to verify the ability to perform intended functions?

� �� ��

2.1.4 Is the code of practice specified and conform to a quality control system?

� �� ��

2.1.5 a)

Is complex software structured to support separated testing of single modules or of groups of associated modules?

� � ��

2.1.5 b) Is the structure able to support maintenance and updates?

� �� ��

2.1.6 Is configuration of operating system documented?

�� �� ��

2.1.7 Are tools and additional software identified and documented?

� �� ��

� � �

2.2 Functional requirements � � �

Are functional requirements clearly identified and documented?

�� �� ��

� � �

2.3 Safety of operation � � �

2.3.1 Is software protected by facilities incorporated in the equipment?

�� �� ��

2.3.2

Is software part of equipment installed with the equipment in such a way that it is not possible for the user to have access?

� �� ��

2.3.3 Is software part of equipment protected against modification by the user?

�� �� ��

2.3.4 Does the software contain default values to facilitate the user when needed?

�� �� ��

2.3.5 Can the display and update of essential operation be run during any other particular mode?

� �� ��

2.3.6 Does the equipment indicate if information is uncertain?

� � ��

2.3.7 Is software protected against intrusion? �� �� �� � � �

NI 425

22 Bureau Veritas July 2008

§ REF Description Y, N, NA

ST I

ST II

ST III Documentation

provided

2.4 Monitoring � � �

2.4.1 Is mean provided to monitor the equipment automatically, checked at regular intervals?

� �� ��

2.4.1 Does the system release an alarm in case of non automatically recoverable error?

� �� ��

2.4.2 Is an error log installed and activated in order to supervise the program execution and it unwished execution?

� � ��

2.4.3 Is an alarm triggered before the memory is full?

� �� ��

� � �

2.5 Fallback arrangement � � �

2.5.1 Is it indicated if the mode in use is not the normal mode to fully perform the function?

� � ��

2.5.2

Does fallback arrangements support essential information and function in case of failure?

� �� ��

2.5.3 After fallback arrangement, is recovery of normal operation restored upon operator confirmation ?

� � ��

2.6 Documentation

Is documentation concise, including functional descriptions, all evidences of control, interfaces, limitations?

� � ��

SOFTWARE DEVELOPMENT PROCESS

3.1.1 Support tools and

programming language

a) Are integrated tools selected for the required software category?

� � ��

b) Are the coding standards reviewed by the assessor and used for the development of all the software?

� � ��

c) Are legal entity, description, inputs and outputs, configuration management history contained in the source code?

� � ��

d) Can the software be modular, testable and safely modifiable?

� � ��

e) Is each module software design and tested separately?

� � ��

� � � � � � � � � � � � � � � � � �


3.1.2 Code implementation

a) Is the source code readable, understandable and testable?

a) Does the source code satisfy the specified requirements for the software module design?

a) Does the source code satisfy the specified requirements of the coding standards?

a) Does the source code satisfy all relevant requirements specified during the safety analysis?

b) Is each module of software code reviewed?

3.1.3 Module testing

a) Is each software module tested?

b) Do the tests show that each module performs its intended function and does not perform unintended functions?

c) Are the results of module testing documented?

d) Are the procedures for corrective action on failure of tests specified?

3.1.4 Integration testing

a) Are the software integration tests specified during the specification and development phases?

b) Do integration tests specify the division of the software into manageable integration sets?

b) Do integration tests specify test cases and test data?

b) Do integration tests specify the types of tests to be performed?

b) Do integration tests specify test environments, tools, configuration and programs?

b) Are test criteria specified against which the completion of the test will be judged (expected results)?

b) Do integration tests specify procedures for corrective action on failure of tests?

c) Do the software tests demonstrate protection against unintended behaviour?

d) Are the results of integration software tests documented, indicating the reason for failure, if any?

e) Is an analysis carried out when a modification of the software is necessary during software integration?


3.1.5 Validation

a) Do the pass/fail criteria for validation include input/output signals with their sequences and values and other acceptance criteria?

3.1.6 Software modification

a) Are procedures for modification available?

b) Are modifications initiated within an authorised procedure?

c) d) Is an analysis of the impact of software modifications on system functionalities performed and documented?

e) Are regression tests specified?

3.1.7 Verification

a) Is the software verification available and planned in accordance with the quality plan?

b) Are the criteria, techniques and tools used for verification activities planned?

c) Is the resulting documentation available?

d) Is evidence of all steps of verification available?

e) Have all verification activities been performed (specification, design, coding, testing)?

f) Have regression tests been performed after modifications?

SOFTWARE QUALITY PLAN

3.2.1 Generality

Have standards / guidelines been identified to define the work product?

Have project-specific criteria been added?

Was this assessment conducted as scheduled?

Is there evidence that the work product was reviewed by all stakeholders?

Have acceptance criteria been established for the work product?

Does the work product have a clearly defined purpose and scope?

Does the purpose specify what software items are covered by the plan?

Does the purpose specify the intended use of the software?


Does the purpose state what portion of the life cycle is covered by the plan?

Does the plan contain applicable references?

Are versions and dates provided for each reference document?

Does the plan contain a section outlining the management structure for the project?

Does the plan detail the documentation governing the development, verification, validation, use and maintenance of the software produced?

Is there a set of documents listed and described?

Does the plan list which documents are to be assessed by the Software Quality Plan?

Are the standards, practices and quality requirements used identified (e.g. IEC, ISO, IEEE…)?

Does the plan describe how process and product conformance shall be monitored and assured (e.g. tracked, reported and trended)?

Is there a schedule of reviews?

Does the plan describe the Software Quality Plan's role at reviews?

Does the plan identify and describe the Software Quality Plan's role in the software verification?

Does the plan identify and describe the Software Quality Plan's role in the software validation?

Does the plan describe the practices and procedures for reporting, tracking and resolving problems?

Does the plan discuss which tools and techniques shall be used to support software assurance activities (e.g. checklists, plan and report templates, tracking databases)?

Does the plan discuss how insight / oversight will be accomplished to ensure supplier compliance with customer requirements (e.g. inspections, assessments / audits, monthly status reports)?


3.2.2 Quality assurance and control

Can evidence of a quality system be provided (e.g. ISO 9000…)? If not, what procedures for quality assurance are used in the company? Is the software quality plan available?

SOFTWARE ASSESSMENT

4.1 Software classification

Are system/software categories clearly identified?

4.3 Software Assessment

Are all requested documents available, as indicated in Table III?


Annex B Software Quality Plan Guidelines


B.1. Development and design

The purpose of this procedure is to ensure that a product being developed by a company meets all specifications and requirements, whilst at the same time being as simple and rational as possible to produce and test.

B.1.1. Scope

The procedure covers the management of the product development carried out by the development departments. This is an example; other models may be applied, depending on the types of products, the manufacturer's standard procedures and the fields of application.

B.1.2. Development Projects

Development projects are defined as projects that are approved by the company management team, are self-financed and for which the company specifies the requirements; such projects are to be part of the company's product development plans.

B.1.3. Design description

The Design Description is a definition of the software architecture and the low-level requirements that will satisfy the software high-level requirements. This data should include:

• A detailed description of how the software satisfies the specified software high-level requirements, including algorithms, data structures, and how software requirements are allocated to processors and tasks.

• The description of the software architecture defining the software structure to implement the requirements.

• The input/output description, for example a data dictionary, both internally and externally throughout the software architecture (a minimal sketch of such an entry is given after this list).

• The data flow and control flow of the design.

• Resource limitations, the strategy for managing each resource and its limitations, the margins, and the method for measuring those margins, for example timing and memory.

• Scheduling procedures and inter-processor/inter-task communication mechanisms, including time-rigid sequencing, pre-emptive scheduling, and interrupts.

• Design methods and details for their implementation, for example software and data loading, user-modifiable software, or multiple-version dissimilar software.

• Partitioning methods and means of preventing partition breaches.

• Description of the software components, whether they are new or previously developed, and, if previously developed, reference to the baseline from which they are taken.

• Derived requirements resulting from the software design process.

• If the system design contains deactivated code, a description of the means to ensure that the code cannot be enabled in the target computer.

• Rationale for those design decisions that are traceable to safety-related system requirements.
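As an illustration of the input/output description mentioned in the list above, the following minimal sketch (written in Python, with an invented signal name, source, range, units and update rate) shows the kind of information a data dictionary entry may record; it is an assumption-laden example, not a format required by this Guidance Note.

# Minimal sketch of a data dictionary entry; all names, ranges and rates are
# invented for illustration and are not requirements of this Guidance Note.
DATA_DICTIONARY = {
    "ENGINE_SPEED": {
        "source": "main engine tachometer (analogue input AI-1)",
        "destination": "alarm and monitoring module",
        "type": "float",
        "units": "rpm",
        "range": (0.0, 1200.0),
        "update_rate_hz": 10,
    },
}

if __name__ == "__main__":
    # Print each signal with its units and permissible range.
    for signal, entry in DATA_DICTIONARY.items():
        print(signal, entry["units"], entry["range"])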


B.1.4. Re-use

Prior to a development project starting, checks are to be made regarding existing products or parts thereof in order to ascertain whether or not any hardware or software can be re-used or modified.

B.2. Methodology for development

The management of a software project is based, on the one hand, on the software development planning and, on the other hand, on the quality assurance and control.

B.2.1. Development planning

The development planning describes, in a modular way, the software life cycle:
• Specification requirements
• Preliminary design
• Detailed design
• Coding and unit testing
• Integration testing
• Validation and installation
• Exploitation.

It is an estimated timetable of tasks in which constraints and critical points (memory capacity, response time, utilization ratio of central units...) are identified.

B.2.2. Specification requirements

Task: Software Requirement Specification

Person in charge: Software Manager

Purpose: Produce the technical specification
Assess the feasibility

Activities: Actions and constraints:
- analysis of the system requirements (functions, performance, operating modes, man-machine interface…)
- assessment of feasibility.
Examination: reading by all persons in charge.

Output documents: Specification requirements

Checking at end of phase: Review of the specification requirements


B.2.3. Preliminary design

Task: Preliminary design of software

Person in charge: Project manager

Purpose: Build an architecture satisfying the specification requirements
Prepare the integration and validation of the software

Input documents: Software Requirement Specification

Activities: Actions and constraints: this first phase of abstraction is aimed at displaying the required functions in a modular form; critical points are taken into account (synchronization, interrupt processing); drafting of the integration and validation project plans
Means: tools generating functional diagrams, temporal constraints
Examination: model

Output documents: Preliminary design file including the architecture and its justification
Integration and validation project plans

Checking at end of phase: Review of the preliminary design file

B.2.4. Detailed design

Task: Detailed design of software components

Person in charge: Project manager

Purpose: Provide an algorithmic description for each component
Produce the detailed design file defining the internal structures and the component interfaces
Prepare the unit testing of each component

Input documents: Preliminary design file

Activities: Actions and constraints: the programming modules defined at the previous phase are detailed, each of them forming a software component. This phase results in structured pseudo-code (a minimal sketch is given at the end of this sub-section)
Means: tools generating modules and defining interfaces
Examination: inspection and simulation

Output documents: Detailed design file

Checking at end of phase: Review of the detailed design file
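As an illustration of the structured pseudo-code produced at this phase, the following minimal sketch (expressed here as executable Python) shows the level of detail typically reached; the component, threshold and hysteresis value are hypothetical and not taken from this Guidance Note.

# Illustrative only: a detailed-design component expressed as structured,
# executable pseudo-code. The component, threshold and hysteresis are assumed.
HIGH_TEMP_LIMIT_C = 90.0   # assumed alarm threshold
HYSTERESIS_C = 2.0         # assumed reset hysteresis

def check_cooling_water_temperature(temperature_c, alarm_active):
    """Return the alarm state for one acquisition cycle.

    Inputs : temperature_c (float), alarm_active (bool)
    Output : bool, True when the high-temperature alarm is to be raised
    """
    if temperature_c >= HIGH_TEMP_LIMIT_C:
        return True                                   # raise or keep the alarm
    if alarm_active and temperature_c < HIGH_TEMP_LIMIT_C - HYSTERESIS_C:
        return False                                  # reset with hysteresis
    return alarm_active                               # keep the previous state

if __name__ == "__main__":
    print(check_cooling_water_temperature(95.0, False))   # expected: True
    print(check_cooling_water_temperature(85.0, True))    # expected: False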


B.2.5. Coding and module testing

Task: Coding and module testing of components

Person in charge: Project manager or software programmer

Purpose: Translate the result of the detailed design into a program by means of a programming language.
Carry out the unit testing of the components to check that the code writing and its sequence comply with the description of the detailed design file.

Input documents: Detailed design file

Activities: Actions and constraints: automatic translation (or not) of the pseudo-code (or of the description of data and algorithms in the detailed design file) into the planned programming language; the software components are tested separately before being integrated into the software unit (a minimal sketch of such a module test is given at the end of this sub-section)
Means: - translator, code generator
- testing tools, symbolic execution (logic tests, functional tests)
Examination: - cross-reading of code, analyzers, plotters
- examination of the test coverage and checking of the results

Output documents: Coding and testing file

Checking at end of phase: Review at end of phase
Document checking by the project manager, in coordination with the software quality assurance manager (including the listing in the original language)
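A minimal sketch of such a module (unit) test, using Python's standard unittest framework, is given below; the module under test, its scaling factor and its error handling are invented for illustration and do not originate from this Guidance Note.

# Minimal sketch of module (unit) testing with the standard unittest framework.
import unittest

def scale_sensor_reading(raw_counts, counts_per_unit=10.0):
    """Convert a raw ADC count into an engineering value (example module)."""
    if raw_counts < 0:
        raise ValueError("raw_counts must be non-negative")
    return raw_counts / counts_per_unit

class TestScaleSensorReading(unittest.TestCase):
    def test_intended_function(self):
        # The module performs its intended function ...
        self.assertAlmostEqual(scale_sensor_reading(250), 25.0)

    def test_unintended_behaviour_rejected(self):
        # ... and does not silently accept invalid input.
        with self.assertRaises(ValueError):
            scale_sensor_reading(-1)

if __name__ == "__main__":
    unittest.main()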

B.2.6. Integration testing

Task: Integration of software components into the software unit

Person in charge: Software project manager

Purpose: Progressively assemble the software components and check their conformity with the preliminary and detailed designs

Input documents: Preliminary design file

Activities: Actions and constraints: components are introduced one by one into the software element, function by function; every function is tested separately and then fully tested (a minimal sketch of such a test is given at the end of this sub-section)
Means: integration and test tools
Examination: examination of the test coverage and checking of the results. Testing consists of modular testing, called 'white box', and functional testing, called 'black box'

Output documents: Integration file

Checking at end of phase: Checking of the integration file by the system manager in coordination with the software quality assurance manager
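The following minimal sketch illustrates a functional ('black box') test of two previously tested components once integrated; both components, their scaling and the alarm limit are invented for illustration.

# Minimal sketch of an integration test: two hypothetical components are
# combined and exercised only through their external interface ("black box").
def read_level_sensor(raw_counts):
    return raw_counts * 0.1          # component 1: convert counts to metres

def bilge_alarm(level_m, limit_m=0.5):
    return level_m >= limit_m        # component 2: alarm decision

def bilge_monitoring(raw_counts):
    return bilge_alarm(read_level_sensor(raw_counts))   # integrated function

if __name__ == "__main__":
    # Black-box checks on the integrated function, using only inputs and outputs.
    assert bilge_monitoring(6) is True     # 0.6 m -> alarm expected
    assert bilge_monitoring(2) is False    # 0.2 m -> no alarm expected
    print("integration test cases passed")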


B.2.7. Validation and installation

Task: Validation and installation

Person in charge: Software project manager and system manager

Purpose: This phase is aimed at ensuring, essentially by means of tests, that the software is in conformity with the specification requirements

Input documents: Specification requirements

Activities: Actions and constraints: environment evolution: the software is subjected to alteration, particularly through new elements that may be faulty
Means: - testing bench, simulated operation
- installation and user's manual
Examination: - examination of the testing coverage, checking of the results
- examination of the specified functions and constraints

Output documents: Validation and qualification file

Checking at end of phase: Review of software qualification

B.2.8. Exploitation

Task: Software exploitation

Person in charge: Software project manager and user

Purpose: Exploitation, use, maintainability and software adjustment

Input documents: Use and maintainability documents

Activities: Bringing the software into operation in a defined environment
Detection and correction of residual faults
Activities related to the modifications of the specifications

Output documents: Maintainability document

B.3. Responsibility

B.3.1. Product Manager

The Product Manager is responsible for ensuring that all potentially new products are described and evaluated in a Product and Market Analysis.

B.3.2. Development Manager

The Development Manager approves all plans for development projects and is responsible for the completion of the projects.

The Development Manager establishes and maintains progress reports and presents them to the Product Manager.


The Development Manager assigns Project Managers for development projects and ensures that they have adequate resources to carry out their tasks. The Development Manager is responsible for ensuring that projects follow the company’s quality requirements regarding documentation of development projects, and that everyone involved knows and follows the given guidelines.

The Development Manager is responsible for ensuring that configuration management of hardware and software, along with related documentation, is established, implemented, and maintained.

The Development Manager identifies and maintains an overview of those persons responsible for each hardware and software module, including software tools.

B.3.3. Project Manager

The Project Manager is responsible for the planning, completion and control of the project and reports all technical and economic matters to the Development Manager.

The Project Manager is responsible for ensuring that required documentation is prepared and written according to guidelines defined in the company’s procedures. The documents are presented to the Design Review Board (DRB) for review according to established procedures. This is normally to be completed before the next phase is started.

B.4. Execution

B.4.1. Requirements

It is a requirement of development projects in the company that the following points are taken into consideration:
• Technological enhancement (state of the art)
• Able to meet market needs
• Be functional and have an appealing design
• Cost-effective to produce and developed in accordance with the budget
• Be robust and dependable (high quality)
• Able to be produced
• Maintainable
• Testable and serviceable.

B.4.2. Project Handbook

Each project has a Project Handbook, which as a minimum has the following content:
• Project Plan
• Manning Plan
• Cost Plan
• Test Plan
• Document Configuration Plan.

The Test Plan may be integrated in the Project Plan.

B.4.3. Project Plan

The Project Manager prepares a Project Plan (Progress Plan) based on the Product and Market Analysis document and the related specifications. The Development Manager approves the Project Plan.

The Project Plan defines the specific activities and milestones that are used to manage the development project. Milestones will usually be points in time for Design Reviews (DRs) or tests to be performed.


The Project Manager informs potential co-workers in the project and distributes tasks. Whenever possible, there is only one established Progress Plan, but for major projects it may be necessary to have two:
• Detailed Progress Plan
• Overall Progress Plan.

The Detailed Progress Plan is used to evaluate the daily progress of the project. The Overall Progress Plan is a summary of the Detailed Progress Plan and is used primarily as a control at regular intervals and when reporting to the management.

B.4.4. Manning Plan

The Development Manager prepares a Manning Plan, which gives a general view of the software functions that the project organization consists of. In addition, the tasks and responsibilities that belong to these functions are stated, along with the personnel allocated to those tasks.

As this plan details the allocation of personnel, it is distributed to the leaders who are allocating the resources.

B.4.5. Cost Plan

In co-operation with the Finance Department, the Project Manager prepares a Cost Plan for each project. The Cost Plan is sub-divided in such a way that it enables the Project Manager to keep track of the project costs throughout the duration of the project.

B.4.6. Documentation Plan

The Project Manager prepares a Documentation Plan for each project. The Documentation Plan is an overview of all documentation to be prepared and configuration-managed in the project and refers to:
• Documents
• Drawings
• Software

The Documentation Plan is the basis for change control in the project and gives at any time an overview of all documentation in the project and its status, together with the ability to trace back to previous activities and decisions.

B.4.7. Test Plans

The Project Manager is responsible for preparing and maintaining all Test Plans. The Development Manager prepares the Software Test Plan. These may be integrated in the Project Plan. Where contractual requirements do not state otherwise, this is carried out in accordance with the Company referential.

B.5. Project Monitoring

The Project Manager is responsible for the completion of the project according to the plans and calculations set up and approved. The Project Manager presents, whenever necessary (when changes or non-conformities occur, on passing milestones or similar) and at least once every month, a progress report containing, among other things:
• Updated progress report
• Project evaluation.

The reports are presented to the Development Manager for further action.


B.6. Project Evaluation

At defined periods, as specified in the quality plan, a progress report is prepared for each project by the Project Manager and forwarded to the Development Manager, with a copy to the economics department.

B.7. Design Review

B.7.1. Purpose

The purpose of a design review is to develop suitable products based on economic criteria through co-operation. The term 'suitable' implies reliability, ease of production and ease of test and service.

B.7.2. Scope

A design review consists of planned activities that take place during different phases of a project. A design review is included as a milestone in the Project Plan.

B.7.3. Responsibility

The Project Manager is responsible for the implementation of a Design Review Board as an activity (milestone) in the project plan. The Project Manager prepares the agenda, notifies participants and conducts the meeting, as well as preparing and filing the minutes of the meetings.

To ensure that the participants are prepared for the meeting, they are provided with all documentation (drawings, documents, etc.) to be reviewed.

Checklists are prepared for the different phases in the project where a design review is to be performed. The Design Review Board report gives a reference to the checklist used. Responsibility for actions and details of their completion are included in the report.

B.7.4. Constitution of the Design Review Board

The Chairman of the Design Review Board is normally the Development Manager. The composition of the rest of the board will depend on which phase of the development the project is currently at. Thus personnel not directly involved in the design or development, but who have experience with the topic being investigated and evaluated, may take part.

Distribution of information is an important function, in conjunction with the design review. This can be addressed by forwarding minutes of meeting and other documentation to all relevant personnel and groups.

The Design Review Board has a consultative function and may give comments and corrections to the specifications and design solutions.

B.8. Prototype

The production of a prototype of the product is an important indicator of how closely it has been possible to meet the specifications. The Project Manager is responsible for obtaining the necessary documentation for the production of the prototype, as well as test procedures and programs for both the technical performance and the testing of the environmental conditions, if this is a requirement.

The test results from the prototype are the basis for any modifications that may be necessary prior to the production of the final documents.


B.8.1. Documentation

The documentation accompanying all products is produced in order to:
• Verify against the Product Specification/Requirement Specification
• Test
• Maintain
• Enhance.

Prior to the hand-over meeting, the Project Manager prepares the following documentation:
• Production papers
• Test procedures, including an FAT procedure
• User documentation
• Hardware and Software documentation.

B.8.2. Factory Acceptance Test

Through a Factory Acceptance Test (FAT) approved by the customer (Product Group), the Development Department verifies for the customer that the product is in conformity with the Product Specification and the Requirement Specification.

All test results are to be logged.

B.8.3. Hand-over Meeting

When all remaining actions from the FAT have been closed, the Project Manager hands over, in a hand-over meeting, all the product and production documents, including test equipment, test procedures and test reports, to the Product Group.

B.8.4. Training

The time required for the training of test and production personnel is taken into account when preparing the Project Plan. Training is to be a separate activity in the Project Plan.

B.9. Relevant Documents
• Planning and Management of Delivery Projects
• Preparation of Drawings and Production Documents
• Preparation and Filing of Documents
• Document of Development Projects
• Configuration management
• Test and Verification of Software
• Software module list:
  o Identification of all hardware drivers with their version numbers.
  o Details of folder structures and a description of the purpose of the different programs.

B.10. Test and verification of software

The purpose of this procedure is to establish rules for the testing and verification of software in order to be able to document all requirements stated in the contract and/or the System Requirement Specification.

The procedure is the guideline for the testing and verification of software in all of the company's development projects (both basic and product-related), and delineates a set of basic test documents which are associated with the dynamic aspects of software testing (i.e. the execution of procedures and code). The procedure defines the purpose of each basic document. Reference is given to the prototype outline of each document.

The procedure may be applied to any software that runs on computer-based devices or systems. Size, complexity or criticality of the software does not restrict applicability.


The procedure addresses the documentation of both initial development testing and the testing of subsequent software releases i.e. both new and modified versions of earlier developed software. For a particular software release, it may be applied to all phases from unit testing through user acceptance.

The procedure does not call for specific testing methodologies, approaches, techniques, facilities, or tools, and does not specify the documentation of their use. Additional test documentation may be required (for example, code inspection checklists, and reports).

Within each standard document, the content of each section (i.e. the text that covers the designated topics) may be tailored to the particular application and the particular testing phase. In addition to tailoring the content, additional documents may be added to the basic set, additional sections may be added to any document and additional content to any section. It may be useful to organize some of the sections into sub-sections. Some or all of the contents of a section may be contained in another document that is then referenced.

Each project will need to specify the specific documents required for a particular test phase, and is to specify additional content requirements and conventions in order to reflect their own particular methodologies, approaches, techniques, facilities, or tools for testing.

Test and verification of software may be performed at four levels:
• Unit tests
• Subsystem (module) integration tests
• System integration tests
• System tests.

B.11. Software Test Documentation

B.11.1. Software Test and Verification Plan

The purpose of the Software Test and Verification Plan is to:
• Describe the scope, approach, resources and schedule of the testing activities.
• Identify the items to be tested, the features to be tested, the testing tasks to be performed, the personnel responsible for each task, and the risks associated with the plan.

A Software Test and Verification Plan is prepared (see Test and Verification of Software). This document gives a detailed plan for all tests, from unit tests and integration tests through to system tests and final tests. The plan identifies all the test documents that are associated with the testing of the software, i.e. the execution of procedures, code, and associated hardware. It will then be possible to verify that the system meets all specified requirements.

B.11.2. Software Test Specification

The purpose of the Software Test Specification is to:
• Specify refinements of the test approach and identify the features to be tested by this design and its associated tests.
• Define the test cases identified by a Software Design Description (a minimal sketch of such a test case is given after this list).
• Specify the steps for executing a set of test cases or, more generally, the steps used to analyze a software item in order to evaluate a set of features.
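As an illustration of how a test case might be recorded in a Software Test Specification, the following minimal sketch (in Python, with an invented identifier, feature, steps and criteria) shows the kind of information such a record may contain; it is an example under stated assumptions, not a prescribed format.

# Minimal sketch of a test case record; identifiers, steps and expected
# results are invented examples, not requirements of this Guidance Note.
TEST_CASE = {
    "id": "TC-042",
    "feature": "high bilge level alarm",
    "preconditions": "system in normal operating mode, alarm list empty",
    "steps": [
        "inject simulated bilge level of 0.6 m on analogue input AI-3",
        "wait one acquisition cycle",
    ],
    "expected_result": "audible and visual alarm raised within 2 s",
    "pass_fail_criteria": "alarm present in alarm list with correct tag",
}

if __name__ == "__main__":
    print(TEST_CASE["id"], "-", TEST_CASE["feature"])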

B.11.3. Test Log

The purpose of the Test Log is to:
• Provide a chronological record of relevant details about the execution of the tests (a minimal sketch of such a log entry is given at the end of this sub-section).
• Identify the test items being transmitted for testing.

This includes:
• The person responsible
• The physical location
• Status.


Any variation from the current item requirements and designs is noted in this report:
• Document any event that occurs during the testing process which requires investigation.
• Find trends and prevent repetition of errors and problems.
• Summarize the results of the designated testing activities and provide evaluations based on these results.
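The following minimal sketch (in Python, with invented names, dates, identifiers and statuses) illustrates one way a chronological test log entry could be structured; it is an example under stated assumptions, not a required layout.

# Minimal sketch of a chronological test log; the entries are invented examples.
from dataclasses import dataclass

@dataclass
class TestLogEntry:
    timestamp: str        # when the test was executed
    test_id: str          # reference to the test case in the specification
    responsible: str      # person responsible for the execution
    location: str         # physical location (e.g. test bench, on board)
    status: str           # e.g. "passed", "failed", "investigation required"
    remark: str = ""      # any event occurring during testing

LOG = [
    TestLogEntry("2008-07-01 09:30", "TC-042", "J. Smith", "factory test bench",
                 "passed"),
    TestLogEntry("2008-07-01 10:15", "TC-043", "J. Smith", "factory test bench",
                 "failed", "alarm raised after 4 s instead of 2 s"),
]

if __name__ == "__main__":
    for entry in LOG:
        print(entry.timestamp, entry.test_id, entry.status)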

B.12. Failure report

The aim of this report is to put forward the major causes of system failures, to analyze the proposed causes and to justify them with actual examples taken from the recent past. The report then proposes a classification scheme into which the various proposed and justified factors may be placed. This scheme provides a generalized view of the areas where system failures may be caused. The generalized view given by this classification scheme then makes it possible to identify some general techniques and/or practices which will greatly reduce the number of system failures and thus decrease the impact these failures have. The system failures used as examples and proposed here are not just those that cause the complete collapse of a system, but also those that, through erroneous operation of that system, have a significant impact in other ways, e.g. loss of money, life, functionality, etc.

B.13. Software maintenance

Modification of program contents and data, as well as a change of version, is to be documented and submitted for approval.