Software Testing Concept and Methodologies

Testing concepts prp_ver_1[1].0


Page 1: Testing concepts prp_ver_1[1].0

Software Testing Concept and

Methodologies

Page 2: Testing concepts prp_ver_1[1].0

Fundamentals of Software Testing

Page 3: Testing concepts prp_ver_1[1].0

Software Testing Concepts

Is there any difference when you work on the same assignment in IT versus non-IT?

NON-IT vs. IT

Page 4: Testing concepts prp_ver_1[1].0


What is software/an application?

A group of programs designed for end users, running on top of the operating system and system utilities.

A self-contained program that performs a well-defined set of tasks under user control.

Programs, procedures, rules, and any associated documentation pertaining to the operation of a system.

Page 5: Testing concepts prp_ver_1[1].0


Evolution of a software product

Marketing survey: done by marketing people for various products, to benchmark the products. Creates the MRS (Marketing Requirement Survey).

Requirement analysis: a feasibility study (social, economic, etc.) investigates the need for possible software automation in the given system. Done by a domain expert. Creates the URS (User Requirement Specification).

Design: the software's overall structure is defined: software architecture, interdependence of modules, interfaces, database, etc. Done by a system analyst. Creates the SRS, high-level design, low-level design, etc.

Page 6: Testing concepts prp_ver_1[1].0


Evolution of a software product (cont.)

Code generation: the design must be translated into a machine-readable form, taking the SRS as input. Done by a team of developers, with reviews after every 500 lines of code:

• Code inspection
• Code walkthrough

Testing: each new or patched build is tested by test engineers for stability of the application.

Maintenance: the software is maintained as changes arise (e.g., unexpected values entering the system).

Page 7: Testing concepts prp_ver_1[1].0


Software development life cycle

Requirement Analysis → High-level design → Detailed Specifications → Coding → Unit Testing → Integration Testing → Operational Testing → Ongoing Support, with Review/Test at each stage.

Page 8: Testing concepts prp_ver_1[1].0


System

Input → Process → Output

An inter-related set of components, with identifiable boundaries, working together for some purpose.
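The input → process → output view of a system can be sketched minimally; the averaging "system" below and all its names are invented for illustration:

```python
# Minimal input -> process -> output sketch of a "system":
# components with an identifiable boundary, working together for a purpose.

def process(numbers):
    """Process component: transform validated input into output."""
    return sum(numbers) / len(numbers)

def system(raw_input):
    """System boundary: parse input, process it, return output."""
    numbers = [float(x) for x in raw_input.split(",")]  # input component
    return process(numbers)                             # process component

print(system("2, 4, 6"))  # output component -> 4.0
```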

Page 9: Testing concepts prp_ver_1[1].0


Analysis

The process of identifying requirements, current problems, constraints, opportunities for improvement, timelines, and resource costs.

Page 10: Testing concepts prp_ver_1[1].0


Design

The task of finding a way to meet the functional requirements, within the specified constraints, using the available technology.

Page 11: Testing concepts prp_ver_1[1].0


Phases or stages of a project from inception through completion and delivery of the final product…

… and maintenance too!

Software Development life cycle

Page 12: Testing concepts prp_ver_1[1].0


Three Identifiable Phases:

1. Definition

2. Development

3. Maintenance

Software Development life cycle

Page 13: Testing concepts prp_ver_1[1].0


Definition Phase

Focuses on WHAT:

What information is to be processed?

What functions and performance are desired?

What interfaces are to be established?

What design constraints exist?

What validation criteria are required to define a successful system?

Page 14: Testing concepts prp_ver_1[1].0


Development Phase

Focuses on HOW:

How should the database be designed?

How should the software architecture be designed?

How will the design be translated into code?

How will testing be performed?

Three specific steps in the Development Phase:

a. Design

b. Coding

c. Testing (often neglected due to lack of time, time-to-market pressure, the additional cost involved, poor understanding of the testing requirements, etc.)

Page 15: Testing concepts prp_ver_1[1].0


Maintenance Phase

“Maintainability is defined as the ease with which software can be understood, corrected, adapted and enhanced.”

The maintenance phase focuses on CHANGE, associated with:

Error correction

Adaptation required as the software environment evolves

Enhancements brought about by changing customer requirements

Reengineering carried out for performance improvements

Page 16: Testing concepts prp_ver_1[1].0


SDLC Phases

Identify problems/objectives

Determine information requirements

Analyze system needs

Design the recommended system

Develop and document the software

Test the system

Implement and maintain the system

Page 17: Testing concepts prp_ver_1[1].0


SDLC Phases: Requirement Identification & Analysis Phase

• Request for Proposal
• Proposal
• Negotiation
• Contract
• User Requirement Specification
• Software Requirement Specification

Page 18: Testing concepts prp_ver_1[1].0


Software Requirement Specification

IEEE 830: the Software Requirement Specification is a means of translating the ideas in the minds of the clients (the input) into a set of formal documents (the output of the requirement phase).

The Role

Bridge the communication gap between the client, the user, and the developer.

Page 19: Testing concepts prp_ver_1[1].0


SDLC Phases: Design (High-Level Design)

The HLD document describes the system at a macro level:

List of modules and a brief description of each

Brief functionality of each module

Interface relationships among modules

Dependencies between modules

Database tables identified, with key elements

Overall architecture diagrams along with technology details

Page 20: Testing concepts prp_ver_1[1].0


SDLC Phases: Design (Low-Level Design)

The HLD and LLD phases together are called the design phase. The LLD document contains:

Detailed functional logic of each module, in pseudo code

Database tables, with all elements, including their type and size

All interface details

All dependency issues

Error message listing

Complete input and output format of each module

Page 21: Testing concepts prp_ver_1[1].0


SDLC Phases

Code generation: the design must be translated into a machine-readable form, taking the SRS as input. Done by a team of developers, with reviews after every 500 lines of code:

• Code inspection

• Code walkthrough

Testing: each new or patched build is tested by test engineers for stability of the application.

Maintenance: the software is maintained as changes arise (e.g., unexpected values entering the system).

Page 22: Testing concepts prp_ver_1[1].0


What is testing?

We Test !! We Test !! Why?

Testing Defined

Is Product Successful

Product Success criteria

Testability

Test factors

Get started with testing!

Page 23: Testing concepts prp_ver_1[1].0


What is Testing?

A process used to help identify the correctness, completeness, and quality of developed computer software.

Finding the differences between actual and expected behavior.

The process of exercising software to verify that it satisfies the specified requirements of the end user, and to detect errors.

The process of revealing that an artifact fails to satisfy a set of requirements

Page 24: Testing concepts prp_ver_1[1].0


What is Testing (cont.)?

Establishing confidence that a system does what it is supposed to do

Confirming that a system performs its intended functions correctly

Does not guarantee a bug-free product

Is no substitute for good programming

Cannot prevent bugs, only detect them (debugging is a separate activity)

Offers advice on product quality and risks

Page 25: Testing concepts prp_ver_1[1].0


We Test !! We Test !! Why?

To detect programming errors: programmers, like anyone else, can make mistakes.

To catch bugs/defects/errors.

To check the program against its specifications.

The cost of debugging is higher after release.

The client/end user should not be the one to find bugs.

Some bugs are easier to find in testing.

It is a challenge to release a bug-free product.

To verify documentation.

Page 26: Testing concepts prp_ver_1[1].0


We Test !! We Test !! Why? (cont.)

To gain adequate trust and confidence in the product.

To meet organizational goals, like meeting requirements, satisfied customers, improved market share, zero defects, etc.

Since software can perform 100,000 correct operations per second, it has the same ability to perform 100,000 wrong operations per second if not tested properly.

To ensure that the system is ready for use.

To understand the limits of performance.

To learn what a system is not able to do.

To evaluate the capabilities of the system.

Page 27: Testing concepts prp_ver_1[1].0


Testing defined !!

Def-1

Process of establishing confidence that a program or system does what it is supposed to.

Def-2

Process of exercising or evaluating a system or system component by manual or automated means to verify that it satisfies specified requirement (IEEE 83a)

Def-3

Testing is a process of executing a program with the intent of finding errors (Myers)

Def-4

Testing is any activity aimed at evaluating an attribute or capability of a program or system and determining that it meets its required results.

Page 28: Testing concepts prp_ver_1[1].0


Is the product successful?

When Client/Customer perceives it as value-added to his business.

Timeliness of delivery of the product within budget and scope.

The business perceives that the system satisfactorily addresses the true business goals.

End user feels that look, feel, and navigation are easy.

Team is prepared to support and maintain the delivered product.

Page 29: Testing concepts prp_ver_1[1].0


Product Success Criteria

Functionality

Usability

Likeability

Configurability

Maintainability

Interoperability

Page 30: Testing concepts prp_ver_1[1].0


Testability

Operability

Controllability

Observability

Understandability

Suitability

Stability

Accessibility

Navigability

Editorial Continuity

Scalability

Context Sensitivity

Structural Continuity

Page 31: Testing concepts prp_ver_1[1].0


Test Factors

Functionality (exterior quality): Correctness, Reliability, Usability, Integrity

Engineering (interior quality): Efficiency, Testability, Documentation, Structure

Adaptability (future quality): Flexibility, Reusability, Maintainability

Page 32: Testing concepts prp_ver_1[1].0

Software Testing Life Cycle

Page 33: Testing concepts prp_ver_1[1].0


Conventional Testing Process

Spec → Design → Build → Test & Fix

* Here, testing happened only towards the end of the life cycle.

Page 34: Testing concepts prp_ver_1[1].0


Distribution of Defects in the life cycle (Source: IBM/TRW/Mitre)

Requirements: 56%
Design: 27%
Code: 7%
Other: 10%

Page 35: Testing concepts prp_ver_1[1].0


Software development life cycle

Requirement Analysis → High-level design → Detailed Specifications → Coding → Unit Testing → Integration Testing → Operational Testing → Ongoing Support, with Review/Test at each stage.

Page 36: Testing concepts prp_ver_1[1].0


STLC: V-Model

Each development step on the left arm is paired with a review and a test level on the right arm:

Requirement → Req. Review; paired with: Develop Acceptance Tests → Review Acceptance Tests → Execute System Tests

Design → Design Review; paired with: Develop Integration Tests → Review Integration Tests → Execute Integration Tests

Code → Code Review; paired with: Develop Unit Tests → Review Unit Tests → Execute Unit Tests

Page 37: Testing concepts prp_ver_1[1].0


STLC: Activities

Scope/Requirement

Baseline inventory

Acceptance criteria

Schedule

Prioritization

Test references

Sign-off requirements

Plan

Approach

Process and tools

Methodology

Delivery models

Risk plan

Project overview

Quality objectives

Configuration plan

Design

Test design specifications

Test scenarios

Test cases

Test data

Tool development

Page 38: Testing concepts prp_ver_1[1].0


STLC: Activities (cont.)

Execution

Implement Stubs

Test Data Feeders

Batch Processes

Execute Testing

Collate Test Data

Identify Bugs

Defect Analysis

Check Unexpected Behavior

Identify defective application areas

Identify erroneous test data

Identify defect trends/patterns

Page 39: Testing concepts prp_ver_1[1].0


Test Approach

Test process: whether the project is under development, incorporating accepted changes, or under maintenance with implemented changes, it uses the testing process. Based on the nature of the project, adequate testing is decided at the project level.

The test approach:
• sets the scope of system testing
• defines the overall strategy to be adopted
• lists the activities to be completed
• identifies the general resources required
• specifies the methods and processes used to test the release
• details the activities, dependencies and effort required to conduct the system test

Page 40: Testing concepts prp_ver_1[1].0


Test Approach (cont.)

The test approach is based on the objectives set for testing and details the way testing is to be carried out:

Types of testing to be done, viz. unit, integration and system testing

The general resources required

The method of testing, viz. black-box, white-box, etc.

Details of any automated testing to be done

The activities, dependencies and effort required to conduct the system test

Page 41: Testing concepts prp_ver_1[1].0


Software Testing Life Cycle: Phases

1. Requirement Analysis
2. Prepare Test Plan
3. Test Case Design
4. Test Case Execution
5. Bug Reporting, Analysis and Regression Testing
6. Inspection and Release
7. Client Acceptance and Support during Acceptance
8. Test Summary Analysis

Page 42: Testing concepts prp_ver_1[1].0


Requirement Analysis

Objective

The objective of requirement analysis is to ensure software quality by eradicating errors as early as possible in the development process, since errors noticed late in the software life cycle are more costly than those found early, and thereby validating each of the outputs.

The objective is achieved through three basic issues:
• Correctness
• Completeness
• Consistency

Page 43: Testing concepts prp_ver_1[1].0


Type of Requirement

Functional

Data

Look and Feel

Usability

Performance

Operational

Maintainability

Security

Scalability

Etc…….

Page 44: Testing concepts prp_ver_1[1].0


Evaluating Requirements

What constitutes a good requirement?

Clear: unambiguous terminology.

Concise: no unnecessary narrative or non-relevant facts.

Consistent: requirements that are similar are stated in similar terms, and requirements do not conflict with each other.

Complete: all functionality needed to satisfy the goals of the system is specified, to a level of detail sufficient for design to take place.

Page 45: Testing concepts prp_ver_1[1].0


Requirement Analysis

Difficulties in conducting requirement analysis

Analyst not prepared

Customer has no time/interest

Incorrect customer personnel involved

Insufficient time allotted in project schedule

Page 46: Testing concepts prp_ver_1[1].0


Prepare Test Plan- Activities

Scope Analysis of project

Document product purpose/definition

Prepare product requirement document

Develop risk assessment criteria

Identify acceptance criteria

Document Testing Strategies.

Define problem-reporting procedures

Prepare Master Test Plan

Page 47: Testing concepts prp_ver_1[1].0


Design-Activities

Setup test environment

Design Test Cases: Requirements-based and Code-based Test Cases

Analyze if automation of any test cases is needed

Page 48: Testing concepts prp_ver_1[1].0


Execution- Activities

Initial Testing, Detect and log Bugs

Retesting after bug fixes

Final Testing

Implementation

Setup database to track components of the automated testing system, i.e. reusable modules

Page 49: Testing concepts prp_ver_1[1].0


Bug Reporting, Analysis, and Regression Testing

Activities

Detect Bugs by executing test cases

Bug Reporting

Analyze the Error/Defect/Bug

Debugging the system

Regression testing

Page 50: Testing concepts prp_ver_1[1].0


Inspection and Release-Activities

Maintaining configuration of related work products

Final Review of Testing

Metrics to measure improvement

Replication of Product

Product Delivery Records

Evaluate Test Effectiveness

Page 51: Testing concepts prp_ver_1[1].0


Client Acceptance

Software Installation

Provide Support during Acceptance Testing

Analyze and Address the Error/Defect/Bug

Track Changes and Maintenance

Final Testing and Implementation

Submission, client Sign-off

Update respective Process

Page 52: Testing concepts prp_ver_1[1].0


Support during Acceptance-Activities

Pre-Acceptance Test Support

Installing the software on the client’s environment

Providing training for using the software or maintaining the software

Providing hot-fixes as and when required, to allow the testing activity to continue

Post Acceptance Test Support

Bug Fixing

Page 53: Testing concepts prp_ver_1[1].0


Test Summary Analysis: Requirements

Quantitative measurement and Analysis of Test Summary

Evaluate Test Effectiveness

Test Reporting

Report Faults – (off-site testing)

Report Faults – (on-site/ field testing)

Page 54: Testing concepts prp_ver_1[1].0


Testing Life Cycle - Team Structure

An effective testing team includes a mixture of members who have:

Testing expertise

Tools expertise

Database expertise

Domain/Technology expertise

Page 55: Testing concepts prp_ver_1[1].0


Testing Life Cycle - Team Structure (Contd…)

The testing team must be properly structured, with defined roles and responsibilities that allow the testers to perform their functions with minimal overlap.

There should not be any uncertainty regarding which team member should perform which duties.

The test manager will facilitate any resources required by the testing team.

Page 56: Testing concepts prp_ver_1[1].0


Testing Life Cycle - Roles & Responsibilities

A clear communication protocol should be defined within the testing team to ensure proper understanding of roles and responsibilities.

The roles chart should contain both on-site and off-shore team members.

Page 57: Testing concepts prp_ver_1[1].0


Testing Life Cycle - Roles & Responsibilities

Test Manager

Single point contact between Wipro onsite and offshore team

Prepare the project plan

Test Management

Test Planning

Interact with Wipro onsite lead, Client QA manager

Team management

Work allocation to the team

Test coverage analysis

Page 58: Testing concepts prp_ver_1[1].0


Testing Life Cycle - Roles & Responsibilities

Test Manager (cont.): coordination with onsite for issue resolution.

Monitoring the deliverables

Verify readiness of the product for release through release review

Obtain customer acceptance on the deliverables

Performing risk analysis when required

Reviews and status reporting

Authorize intermediate deliverables and patch releases to customer.

Page 59: Testing concepts prp_ver_1[1].0


Testing Life Cycle - Roles & Responsibilities

Test Lead: resolves technical issues for the product group

Provides direction to the team members

Performs activities for the respective product group

Review and approval of test plan / test cases

Review Test Script / Code

Approve completion of Integration testing

Conduct System / Regression tests

Ensure tests are conducted as per plan

Reports status to the Offshore Test Manager

Page 60: Testing concepts prp_ver_1[1].0


Testing Life Cycle - Roles & Responsibilities

Test Engineer: development of test cases and scripts

Test Execution

Result capturing and analysing

Defect Reporting and Status reporting

Page 61: Testing concepts prp_ver_1[1].0

Software Testing Phases

Page 62: Testing concepts prp_ver_1[1].0


Software Testing Phases

Unit Testing

Functional Testing

Integration Testing

System Testing

Acceptance Testing

Interface Testing

Regression Testing

Special Testing

Page 63: Testing concepts prp_ver_1[1].0


Unit Testing

Unit testing is a verification effort on the smallest unit of the software design: the software component or module.

Page 64: Testing concepts prp_ver_1[1].0


Why Unit Testing?

Test early for each component and prevent the defect from being carried forward to next stage.

To ensure that the design specifications have been correctly implemented.

Page 65: Testing concepts prp_ver_1[1].0


Approach

• Uses the component-level design description as a guide.

• Important control paths are tested to uncover errors within the boundary of the module.

• Unit testing is white-box oriented, and can be conducted in parallel for multiple components.

• The relative complexity of the tests, and the errors they uncover, is limited by the constrained scope established for unit testing.

Page 66: Testing concepts prp_ver_1[1].0


Unit Testing

Test cases target, for each module:

Interfaces (input/output)

Local data structures

Boundary conditions

Independent paths

Error handling paths
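A minimal sketch of how these concerns map onto concrete unit test cases; the module under test (a clamp function) and all names are hypothetical:

```python
# Hypothetical module under test: clamp a value into [low, high].
def clamp(value, low, high):
    if low > high:
        raise ValueError("low must not exceed high")  # error-handling path
    return max(low, min(value, high))

# Interface: correct result for a normal input
assert clamp(5, 0, 10) == 5
# Boundary conditions: at and just beyond the limits
assert clamp(0, 0, 10) == 0
assert clamp(11, 0, 10) == 10
# Independent path: below-range branch
assert clamp(-3, 0, 10) == 0
# Error-handling path: invalid use of the interface
try:
    clamp(1, 10, 0)
except ValueError:
    pass
else:
    raise AssertionError("expected ValueError")
```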

Page 67: Testing concepts prp_ver_1[1].0


Unit testing uncovers errors like:

Comparison of different data types

Incorrect logical operators or precedence

Expectation of equality when precision errors make equality unlikely

Incorrect comparison of variables

Improper or nonexistent loop termination

Failure to exit when divergent iteration is encountered

Improperly modified loop variables, etc.
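As one illustration of the "improper loop termination" item, a unit test exposing an off-by-one termination bug; both functions are invented for this sketch:

```python
# Off-by-one loop bug of the kind unit testing is meant to catch.
def count_up_buggy(n):
    """Intended: return [1..n]; bug: the loop terminates one step early."""
    out = []
    i = 1
    while i < n:        # improper termination: should be i <= n
        out.append(i)
        i += 1
    return out

def count_up_fixed(n):
    """Corrected version."""
    return list(range(1, n + 1))

# The unit test exposes the termination error:
assert count_up_fixed(3) == [1, 2, 3]
assert count_up_buggy(3) != [1, 2, 3]   # missing the final element
```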

Page 68: Testing concepts prp_ver_1[1].0


Some of the computational errors uncovered by unit testing:

Misunderstood or incorrect arithmetic precedence

Mixed-mode operations

Incorrect initialization

Precision inaccuracy

Incorrect symbolic representation of an expression
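"Precision inaccuracy" is the classic floating-point pitfall: a test that expects exact equality fails, while a tolerance-based comparison behaves as intended:

```python
import math

a = 0.1 + 0.2
# Exact equality is the error pattern unit testing should flag:
assert a != 0.3                       # binary rounding breaks equality
# A correct comparison uses a tolerance instead:
assert math.isclose(a, 0.3, rel_tol=1e-9)
```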

Page 69: Testing concepts prp_ver_1[1].0


Potential errors to look for when error handling is evaluated:

Error description is unintelligible

Error notes do not correspond to the error encountered

Error condition causes system intervention prior to error handling

Exception-condition processing is incorrect

Error description does not provide enough information to assist in locating the cause of the error

Page 70: Testing concepts prp_ver_1[1].0


Unit Testing Procedure

Unit testing is normally considered as an adjunct to the coding step.

Unit test case design begins once the component-level design has been developed, reviewed and verified.

A review of design information provides guidance for establishing test cases that are likely to uncover errors.

Each test case should be coupled with a set of expected results.

Page 71: Testing concepts prp_ver_1[1].0


Unit Test Steps

The Unit test criteria, the Unit test plan, and the test case specifications are defined.

A code walkthrough for all new or changed programs or modules is conducted.

Unit Test data is created, program or module testing is performed, and a Unit test report is written.

Sign-off to integration testing must be obtained; sign-off can be provided by the lead programmer, project coordinator, or project administrator.

Page 72: Testing concepts prp_ver_1[1].0


Functional Testing

Functional Testing is a kind of black box testing because a program’s internal structure is not considered.

Give the inputs, check the outputs without concentrating on how the operations are performed by the system.

When black-box testing is conducted, the SRS plays a major role and the functionality is given the utmost importance.

Page 73: Testing concepts prp_ver_1[1].0


Functional Testing

Focus on system functions developed from the requirements

Behavior testing

Should know expected results

Test both valid and invalid input

Unit test cases can be reused

New end user oriented test cases have to be developed as well.
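A sketch of black-box functional testing: only inputs and expected outputs are considered, with no knowledge of internals. The validator under test is a made-up example:

```python
import re

def is_valid_username(name):
    """Hypothetical function under test: 3-12 word characters."""
    return bool(re.fullmatch(r"\w{3,12}", name))

# Valid input -> expected positive result
assert is_valid_username("alice_01")
# Invalid inputs -> expected negative results
assert not is_valid_username("ab")        # too short
assert not is_valid_username("a" * 13)    # too long
assert not is_valid_username("bad name")  # disallowed character
```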

Page 74: Testing concepts prp_ver_1[1].0


User Interface

This stage also includes validation testing, which is intensive testing of the new front-end fields and screens: Windows GUI standards; valid, invalid and limit data input; screen and field look and appearance; and overall consistency with the rest of the application.

Page 75: Testing concepts prp_ver_1[1].0


Vertical-first testing: when the complete set of functionality is taken for one module and tested, it is called vertical-first testing.

Horizontal-first testing: if a similar function is taken across all the modules and tested, it is called horizontal-first testing.

Page 76: Testing concepts prp_ver_1[1].0


Integration Testing

Testing performed with the components put together.

Page 77: Testing concepts prp_ver_1[1].0


Why integration Testing ?

Data can be lost across an interface.

One module can have an inadvertent, adverse effect on another.

Sub-functions, when combined, may not produce the desired major function.

Individually acceptable imprecision may be magnified to unacceptable levels.

Global data structures can create problems, and so on…

Page 78: Testing concepts prp_ver_1[1].0


Types of approaches: Top-Down

Top-down is an incremental approach to testing the program structure. Modules are integrated by moving downward through the control hierarchy, beginning with the main control module; this can be done in a depth-first or breadth-first manner.
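In top-down integration, lower-level modules that are not yet integrated are replaced by stubs. A minimal sketch, with all names and values invented:

```python
# Main control module is tested first; the lower-level module is stubbed.
def price_lookup_stub(item_id):
    """Stub standing in for the real, not-yet-integrated pricing module."""
    return {"A": 10.0, "B": 5.0}.get(item_id, 0.0)

def order_total(items, price_lookup):
    """Main control module: integrates downward via price_lookup."""
    return sum(price_lookup(i) for i in items)

# Integration test of the top module against the stub:
assert order_total(["A", "B", "B"], price_lookup_stub) == 20.0
# Later, the stub is swapped for the real module and the test is re-run.
```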

Page 79: Testing concepts prp_ver_1[1].0


Types of approaches: Bottom-Up

Bottom-up integration, as the name implies, begins construction and testing with atomic modules, i.e., the components at the lowest levels in the program structure.

Page 80: Testing concepts prp_ver_1[1].0


Integration testing: Example

E.g.: Login.java and ConnectionPool.java

The Login class calls the ConnectionPool object; integration testing identifies errors not observed during code debugging or reviews.

Page 81: Testing concepts prp_ver_1[1].0


System Testing

Purpose

Test the entire system as a whole

Assumptions

Completed

• Unit Testing

• Functional Testing

• Integration Testing

Page 82: Testing concepts prp_ver_1[1].0


Expectations

Verification of the system

Software Requirements

Business Workflow perspective

Final verification of requirements and design

External Interfaces

Performance tests

Affected documentation

Non-testable requirements

Page 83: Testing concepts prp_ver_1[1].0


Interface Testing

Purpose: test the interfaces with the system.

Assumptions: unit, functional and integration testing are complete, and all critical errors have been resolved.

Expectations: interfaces with external systems. Hold planning and coordination meetings with the external organizations in preparation for testing:
• Who will be the primary contacts?
• When is testing scheduled?
(If there is no test environment available, testing may have to occur on weekends or during non-production hours.)

Page 84: Testing concepts prp_ver_1[1].0


Interface Testing: Expectations (cont.)

What types of test cases will be run, how many, and what are they testing?

• Provide copies of test cases and procedures to the participants.

• If the external organization has specific cases they would like to test, have them provide copies.

Who will supply the data, and what will it contain? What format will it be in (paper, electronic, just notes for someone else to construct the data, etc.)?

Who is responsible for reviewing the results and verifying they are as expected?

How often will the group meet to discuss problems and testing status?

Page 85: Testing concepts prp_ver_1[1].0


Interface Testing: Expectations (cont.)

Both normal cases and exceptions should be tested on both sides of the interface (if both sides exchange data). The interface should be tested for handling the normal amount and flow of data as well as peak processing volumes and traffic.

If appropriate, the batch processing or file transmission “window” should be tested to ensure that both systems complete their processing within the allocated amount of time.

If fixes or changes need to be made to either side of the interface, the decisions, deadlines and re-test procedures should be documented and distributed to all the appropriate organizations.

Page 86: Testing concepts prp_ver_1[1].0


Performance Testing

Purpose: to verify that the system meets the performance requirements.

Assumptions/Pre-conditions:
• System testing was successful.
• Performed prior to acceptance testing, to ensure there are no unexpected performance problems.
• Tests should use business cases, including normal, error and unlikely cases.

Page 87: Testing concepts prp_ver_1[1].0


Performance Testing (cont.)

Performance tests:
• Load test
• Stress test
• Volume test
• Test data
• Response time
• End-to-end tests and workflows should be performed
• A tracking tool for comparison
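A response-time check can be sketched as below; the 0.5 s budget and the measured operation are illustrative assumptions, not values from the slides:

```python
import time

def operation():
    """Stand-in for the transaction whose response time is measured."""
    return sum(range(100_000))

start = time.perf_counter()
result = operation()
elapsed = time.perf_counter() - start

# Performance assertion: response time within the (assumed) budget.
assert elapsed < 0.5, f"response time {elapsed:.3f}s exceeds budget"
assert result == 4999950000
```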

Page 88: Testing concepts prp_ver_1[1].0


Regression Testing

Page 89: Testing concepts prp_ver_1[1].0


Regression Testing

Approach

Definition and Purpose

Types of regression testing

Regression test problems

Regression testing tools

Page 90: Testing concepts prp_ver_1[1].0


Regression Testing

Definition: “Selective retesting to detect faults introduced during modification of a system or system component, to verify that modifications have not caused unintended adverse effects, or to verify that a modified system or system component still meets its specified requirements.”

“...a testing process which is applied after a program is modified.”

Page 91: Testing concepts prp_ver_1[1].0


Regression Testing

Purpose of regression testing:

Locate errors

Increase confidence in correctness

Preserve quality

Ensure continued operations

Check correctness of new logic

Ensure continuous working of unmodified portions
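Regression testing in miniature: a saved suite is re-run after a modification to confirm that unmodified behaviour still holds, alongside a new test for the new logic. The function and suite are invented for this sketch:

```python
# The "system": one function, modified to add a new feature (the upper flag).
def format_name(first, last, *, upper=False):
    """Old behaviour (no upper flag) must be preserved after the change."""
    name = f"{last}, {first}"
    return name.upper() if upper else name

# Saved regression suite from before the modification:
regression_suite = [
    (("Ada", "Lovelace"), "Lovelace, Ada"),
    (("Alan", "Turing"), "Turing, Alan"),
]
for args, expected in regression_suite:
    assert format_name(*args) == expected  # unmodified portions still work

# New test checking the correctness of the new logic:
assert format_name("Ada", "Lovelace", upper=True) == "LOVELACE, ADA"
```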


Regression testing types

Regression tests follow the four categories of software maintenance:

• Corrective - fixing bugs, design errors, coding errors.
• Adaptive - no change to functionality, but the system now works under new conditions, i.e., modifications in the environment.
• Perfective - adds something new; makes the system “better”, e.g., adding new modules.
• Preventive - prevents malfunctions or improves maintainability of the software, e.g., code restructuring, optimization, document updating.


Regression Testing

Example 1 - Y2K

During Y2K code changes, regression testing was the essence of the transition phase. Typically, code was changed in multiple places; the changes did not turn the original logic upside down, but were subtle. Regression testing was critical because even one small piece of untested code could lead to huge ramifications in the large amounts of data typically handled by these mainframe programs.


Regression Testing

Example 2 - General

Regression testing might even be required when one of the business associates changes his systems (for instance, new hardware). Since our system is hooked on to this changed system, our test engineers are also required to do regression testing on our system, which has NOT been changed.

This example brings to light another fact about regression testing: sometimes even an unchanged system needs to be tested!


Regression testing methods

Regression testing can be done either manually or with automated testing tools.

Manual testing: suitable for small systems, where investing in automated tools might not be cost-effective.

Automated testing: one class of these tools is called a capture-playback tool. This is very helpful in situations where the system undergoes many version changes.
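Beyond capture-playback, the simplest automated regression check compares current outputs against results captured from a known-good release. A sketch with a hypothetical `price_with_tax` function and hand-captured golden values (all names here are invented for illustration):

```python
def price_with_tax(amount, rate=0.08):
    """Hypothetical business function under regression test."""
    return round(amount * (1 + rate), 2)

# Golden results captured from a previous, known-good release.
GOLDEN = {100: 108.0, 19.99: 21.59, 0: 0.0}

def run_regression():
    """Return the list of (input, got, expected) mismatches; empty means pass."""
    return [(a, price_with_tax(a), exp)
            for a, exp in GOLDEN.items()
            if price_with_tax(a) != exp]

print(run_regression())  # [] means no regression was introduced
```

After each modification, the suite is rerun unchanged; any non-empty result signals an unintended adverse effect.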


Acceptance Testing

Purpose: to verify the system from the user's perspective.

Assumptions/Pre-Conditions:
• Completed system and regression testing
• Configuration Manager
• Test data
• Final versions of all documents ready
• Overview to the testing procedures
• Exit decision
• Specific procedures
• Acceptance Criteria MUST be documented
– Acceptance Testing
– Project stakeholders


Acceptance Testing

Expectations
• Verification from the user’s perspective

• Performance testing should be conducted again

• Extra time

• User manual to the testers

• Non-testable requirements

• Review with the Sponsor and User

• Plans for the Implementation


Field Testing

Purpose: to verify that the system works in the actual user environment.

Assumptions/Pre-Conditions: system and/or acceptance testing successful.

Expectations: verification that the system works in the actual user environment.

Pilot test with the final product.

The pilot system should continue to work when a problem occurs.


Software Testing Strategies


Testing Information Flow

[Diagram: the Software Configuration and Test Configuration feed into Testing; Testing produces Test Results, which are compared with Expected Results during Evaluation; Evaluation yields Errors and an Error Rate Data stream feeding a Reliability Model that produces Predicted Reliability; Errors drive Debug, which feeds Corrections back into the configuration.]

NOTES: Software Configuration includes a Software Requirements Specification, a Design Specification and source code. A Test Configuration includes a Test Plan and Procedures, test cases and testing tools. It is difficult to predict the time to debug the code, hence it is difficult to schedule.


Software Testing Strategies and Techniques

A concise statement of how to meet the objectives of software testing:

To clarify expectations with the user, sponsor and bidders.

To describe the details of how the testing team will evaluate the work products, systems, and testing activities and results.

To describe the approach to all testing types and phases and the activities for which they are responsible.


Test Strategy for Maintenance

Includes a greater focus on regression testing, and on keeping the users informed of specific fixes or changes that were requested.

Test process should be described in terms of the periodic release cycles that are part of the change control process.

Also describe a set of minimum tests to be performed when emergency fixes are needed (for instance, due to failed hardware or recovering from a database crash).


Test Strategy: Inputs & Deliverables

Inputs:
• Priority & Criticality
• Types of Applications
• Project Success Criteria

Deliverables:
• Time Required for Testing
• No. & Levels of Resources
• Rounds of Testing
• Exit Criteria
• Test Suspension Criteria
• Resumption Criteria


Typical Test Issues

• Test Participation
• Test Environments
• Approach to Testing External Interfaces
• Approach to Testing COTS Products
• Scope of Acceptance Testing
• Verification of Un-testable Requirements
• Criteria for Acceptance of the System
• Pilot or Field Testing
• Performance and Capacity Requirements/Testing


Common Test Related Risks and Considerations

• Poor Requirements
• Stakeholder Participation
• Test Staffing
• Testing of COTS
• External Interfaces
• Performance and Stress Testing
• Schedule Compression
• Requirement Testability
• Acceptance


Test Exit Criteria

• Executed at least once?
• Requirements been tested or verified?
• Test Documentation
• Documents updated and submitted
• Configuration Manager
• Test Incidents


Software Test Plan


How to achieve good testing

Start planning early in the project.

Prepare a Test Plan.

Identify the objectives.

Document objectives in Test Plan.


Test Plan

Objective

A test plan prescribes the scope, approach, resources, and schedule of testing activities. It identifies the items to be tested, the features to be tested, the testing tasks to be performed, the personnel responsible for each task, and the risks associated with the plan.


The test planning process is a critical step in the testing process. Without a documented test plan, the test itself cannot be verified, coverage cannot be analyzed, and the test is not repeatable.

Importance

Why Plan Test?

Repeatable

To Control

Adequate Coverage


To support testing, a plan is needed that specifies:

What to do?

How to do?

When to do?

Test plan


• Test plans need to identify:
– The materials needed for testing
– What tests will be run
– What order the tests will be run

• Test plans also need to:
– Name each test
– Predict how long the test should take
– Scripts and test cases will be needed for most tests

Test Plan


1. Test plan identifier

2. Introduction

3. Test items / integration components

4. Features to be tested

5. Features not to be tested

6. Test Approach

7. Item pass/fail criteria

8. Suspension criteria and resumption requirements


* As defined by the IEEE 829 Test Documentation Standard

Structure


9. Test deliverables (PPlan)

10. Environmental needs (H/w & S/w)

11. Responsibilities (PPlan)

12. Staffing and Training needs (PPlan)

13. Schedule (PPlan)

14. Risks and Contingencies (PPlan)

15. Approvals

Ref: Test Plan Template

Structure


Testing Process

[Diagram: the Software Development/Implementation Process (Requirements Definition, Specification Definition, Code Creation, Testing) runs in parallel with the Software Testing & Validation Process, leading to the Release Date:]

1.0 Test Planning
2.0 Test Case Development
3.0 Test Environment Preparation
4.0 Test Execution
5.0 Test Results Analysis
6.0 Management Reporting


Code Based Test Case Design

Requirements Based Test Case Design

Testing Techniques


Testing Techniques

Specification Based (Black Box/Functional Testing)
• Equivalence Partitioning
• Cause Effect Graphing
• Boundary Value Analysis
• Category Partition
• Formal Specification Based
• Control Flow Based Criteria
• Data Flow Based Criteria

Fault Based
• Error Guessing
• Mutation
• Fault Seeding
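To make one of the specification-based techniques concrete, the sketch below applies boundary value analysis to a hypothetical "valid age is 18..65 inclusive" requirement; the function and the limits are invented for this example:

```python
def bva_values(lo, hi):
    """Boundary value analysis: values at and just around each boundary of [lo, hi]."""
    return [lo - 1, lo, lo + 1, hi - 1, hi, hi + 1]

def is_valid_age(age):
    # Hypothetical requirement: valid ages are 18..65 inclusive.
    return 18 <= age <= 65

# Test each boundary value and record the observed verdict.
results = {v: is_valid_age(v) for v in bva_values(18, 65)}
print(results)  # 17 and 66 are rejected; 18, 19, 64, 65 are accepted
```

Equivalence partitioning would then add one representative value per partition (for example 40 for the valid range) rather than testing every interior value.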


Testing Techniques (Contd…)

Usage Based
• Statistical Testing
• (Musa's) SRET

Specific Techniques
• Object Oriented Testing
• Component Based Testing

Code Based (White Box/Structural Testing)
• Statement Coverage
• Edge Coverage
• Condition Coverage
• Path Coverage
• Cyclomatic Complexity


Test Data Adequacy Criteria

[Diagram:]

Code Based Testing — “Have I tested / exercised / forced / found …?”

Requirement Based Testing — “Have I thought / applied all inputs / completely explored / run all the scenarios …?”


Test Preparation Checklist

• Test Id
• Version
• User accounts
• Input DB
• Training
• Release to System
• Reset System
• Test Environment
• Stakeholders
• Schedule…


Code Based Test Case Design

Requirements Based Test Case Design

Test Design Specifications


Purpose of Test Design Specification

• Refine the requirements of the test approach
• Identify the features to be tested
• Arrive at a high-level design


Contents of Test Design Specification

Identification and Purpose

Features to be tested

Approach Refinements

Test Identification

Pass/Fail Criteria


Approach

Study Business Requirements

Arrive at Environmental Requirements

Identify test related Risks

Decide Automation Requirements

Prepare Test Documents

Plan for Test Completion

Analyze and track changes

Review Test design effectiveness


Test Cases


Test Case Sheet

Details to capture:

1. Testcase ID (should be unique, e.g.: c_01.1, c_01.1a, c_01.2, …)
2. Feature/functionality to be tested (each requirement/feature could come from a use case/component)
3. Test description / test input details (test input, test data, action to be performed to test the feature; complex test cases should be split into more than one)
4. Expected behavior (messages, screens, data — with correct details)
5. Actual result and status
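The sheet's columns map naturally onto a record type. A minimal sketch — the field names follow the list above, but the class itself is hypothetical, not a standard artifact:

```python
from dataclasses import dataclass

@dataclass
class TestCase:
    case_id: str       # unique id, e.g. "c_01.1"
    feature: str       # requirement/feature under test
    description: str   # test input, data, and action to perform
    expected: str      # expected behavior
    actual: str = ""   # filled in during execution
    status: str = "Not Run"

# Filling one row of the sheet during execution.
tc = TestCase("c_01.1", "Login", "Submit valid credentials", "User is logged in")
tc.actual, tc.status = "User is logged in", "Pass"
print(tc.status)  # Pass
```

A list of such records can then be filtered by status to produce the test summary report.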


Identify all potential Test Cases needed to fully test the business and technical requirements

Document Test Procedures

Document Test Data requirements

Prioritize test cases

Identify Test Automation Candidates

Automate designated test cases

Test case development process


Types of test cases:

Type                   Source
1. Requirement Based   Specifications
2. Design Based        Logical system
3. Code Based          Code
4. Extracted           Existing files or test cases
5. Extreme             Limits and boundary conditions


Requirement based test cases

Steps for selecting test cases:

1. Identify the basic cases that indicate program functionality.
2. Create a minimal set of tests to cover all inputs and outputs.
3. Break down complex cases into single cases.
4. Remove unnecessary or duplicate cases.
5. Review systematically and thoroughly.
6. Design based test cases supplement requirements based test cases.


• Every statement exercised at least once.

• Every decision exercised over all outcomes.

Goals for complete code based coverage:

Code based test cases


Extreme cases

Need:
• Look for exceptional conditions, extremes, boundaries, and abnormalities.
• Requires the experience and creativity of the Test Engineer.


• Extracted cases involve extracting samples of real data for the testing process.

• Randomized cases involve using tools to generate potential data for the testing process.

Extracted and randomized cases


Characteristics of a good test case:

• Specific
• Non-redundant
• Reasonable probability of catching an error
• Medium complexity
• Repeatable
• Always lists expected results


• Developed to verify that specific requirements or design are satisfied

• Each component must be tested with at least two test cases: Positive and Negative

• Real data should be used to reality test the modules after successful test data is used

Test case guidelines
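The "at least one positive and one negative test case per component" guideline can be sketched with a hypothetical `withdraw` component (the function and its rule are invented for illustration):

```python
def withdraw(balance, amount):
    # Hypothetical component: withdraw from an account, rejecting overdrafts.
    if amount <= 0 or amount > balance:
        raise ValueError("invalid amount")
    return balance - amount

# Positive test case: valid input produces the specified result.
assert withdraw(100, 30) == 70

# Negative test case: invalid input is rejected as specified.
try:
    withdraw(100, 200)
    rejected = False
except ValueError:
    rejected = True
print(rejected)  # True
```

Once both cases pass with prepared test data, the module can be reality-tested with samples of real data as the guideline suggests.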


The Testing process

[Diagram: Design Test Cases → Test Cases → Prepare test data → Test Data → Run program with test data → Test Results → Compare results → Test Reports]


Statement Coverage

Edge Coverage

Condition Coverage

Path Coverage

Cyclomatic Complexity

Code Based Test Case Design


Purpose

Understand the Objective

Effective conversion of specifications

Checking Programming Style with coding standards

Check Logic Errors

Incorrect Assumptions

Typographical Errors


Code Based Testing - White Box Testing

Selecting test cases from the structure of the software itself is a valuable source of tests, checking:
• Coding Standards
• Logic
• Programming Style
• Complexity of Code

Structural testing helps ensure:
• Reduced Rework
• Quicker Stability
• Smooth Acceptance


Code Based Testing or White Box Testing

Testing the control structures of a procedural design.

Test cases can be derived to ensure:
• All independent paths are exercised at least once.
• All logical decisions are exercised for both true and false paths.
• All loops are executed at their boundaries and within operational bounds.
• All internal data structures are exercised to ensure validity.



Code Based Testing or White Box Testing (Contd..)

Why do white box testing when black box testing is used to test conformance to requirements?

• Logic errors and incorrect assumptions are most likely to be made when coding "special cases". We need to ensure these execution paths are tested.


Code Based Testing or White Box Testing (Contd..)

• We may find that assumptions about execution paths were incorrect, leading to design errors. White box testing can find these errors.

• Typographical errors are random, and are just as likely to be on an obscure logical path as on a mainstream path.

– "Bugs lurk in corners and congregate at boundaries"


Types of Code Based Testing & Adequacy Criteria Involve Control Flow Testing

Statement Coverage
• Is every statement executed at least once?

Edge Coverage
• Is every edge in the control flow graph executed?

Condition Coverage
• Is every edge, plus every Boolean (sub)expression in the control flow graph, executed?

Path Coverage
• Is every path in the control flow graph executed?

Cyclomatic Complexity
• Is the logical structure of the program appropriate?


Test Cases

[Diagram: derive test cases from independent paths, logical decisions, boundaries and data structures.]


Types of Code Based Testing (1) - Statement Coverage

The control flow elements to be exercised are statements.

The statement coverage criterion requires that every elementary statement of the program be executed at least once.

Statement coverage C = P / T, where P is the number of executed statements and T is the total number of statements.
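The C = P/T ratio can be demonstrated by hand-instrumenting a toy function; the `mark` calls and statement ids below are purely illustrative (real tools instrument automatically):

```python
executed = set()

def mark(stmt_id):
    """Record that a statement was reached."""
    executed.add(stmt_id)

def abs_value(x):
    mark("s1")          # s1: entry
    if x < 0:
        mark("s2")      # s2: the negate branch
        x = -x
    mark("s3")          # s3: return
    return x

abs_value(5)                   # only the non-negative path runs
coverage = len(executed) / 3   # C = P / T
print(coverage)                # 2/3 — statement s2 was never executed
```

Adding a second test with a negative input (e.g. `abs_value(-5)`) would raise the ratio to 1.0, i.e. full statement coverage.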


Types of Code Based Testing (2) - Edge Coverage (Branch Coverage)

The focus is on identifying test cases that execute each branch at least once.

Edge coverage C = P / T, where P is the number of executed branches and T is the total number of branches.


Types of Code Based Testing (3) - Condition Coverage

A combination of edge coverage and more detailed conditions.

Examples: true & false, elementary conditions, comparisons, Boolean expressions.

Basic condition coverage C = P / T, where P is the number of executed conditions and T is the total number of conditions.


Types of Code Based Testing(3) - Conditions Coverage (Contd.)

Condition testing aims to exercise all logical conditions in a program module. Conditions are defined as:

Relational expression: (E1 op E2), where E1 and E2 are arithmetic expressions.

Simple condition: a Boolean variable or relational expression, possibly preceded by a NOT operator.


Types of Code Based Testing(3) - Conditions Coverage (Contd.)

Compound condition: composed of two or more simple conditions, Boolean operators and parentheses.

Boolean expression: a condition without relational expressions.


Types of Code Based Testing(3) - Conditions Coverage (Contd.)

Errors in expressions can be due to:

Boolean operator error

Boolean variable error

Boolean parenthesis error

Relational operator error

Arithmetic expression error

Condition testing methods focus on testing each condition in the program.


Types of Code Based Testing(3) - Conditions Coverage (Contd.)

Strategies proposed include:

Branch testing - execute every branch at least once.

Domain Testing - uses three or four tests for every relational operator.

Branch and relational operator testing - uses condition constraints.


Types of Code Based Testing(3) - Conditions Coverage (Contd.)

Example 1: C1 = B1 & B2, where B1 and B2 are Boolean conditions.

A condition constraint has the form (D1, D2), where D1 and D2 can be true (t) or false (f).

The branch and relational operator test requires the constraint set {(t,t), (f,t), (t,f)} to be covered by the execution of C1.

Coverage of the constraint set guarantees detection of relational operator errors.
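Whether a test suite covers the constraint set for C1 = B1 & B2 can be checked mechanically. A small sketch (the suite below is a hypothetical set of Boolean input pairs):

```python
def c1(b1, b2):
    """The compound condition under test: C1 = B1 & B2."""
    return b1 and b2

# Constraint set required by branch and relational operator testing.
required = {(True, True), (False, True), (True, False)}

# A hypothetical test suite, listed as the Boolean input pairs it exercises.
suite = [(True, True), (False, True), (True, False)]
covered = set(suite)
for b1, b2 in suite:
    c1(b1, b2)

print(required <= covered)  # True: every constraint is exercised
```

Dropping any pair from the suite would leave the corresponding constraint uncovered, and an operator error (e.g. `or` written instead of `and`) could slip through.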


Types of Code Based Testing (4) - Path Coverage: Data Flow Testing

Path coverage requires that every path be executed at least once.

Data flow testing selects test paths according to the location of definitions and uses of variables.

Loop testing tests loops (iterations):
• Loops are fundamental to many algorithms.
• Loops can be classified as simple, concatenated, nested and unstructured.


Types of Code Based Testing (4) - Path Coverage: Loop Testing Examples

[Diagram: simple, nested, concatenated and unstructured loops.]


Types of Code Based Testing (4) - Path Coverage: Simple Loops

For a simple loop of size n:
• Skip the loop entirely.
• Only one pass through the loop.
• Two passes through the loop.
• m passes through the loop, where m < n.
• (n-1), n and (n+1) passes through the loop.
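The pass counts above can be generated for any loop bound. A small helper — illustrative only, with the intermediate value m arbitrarily chosen as n // 2:

```python
def loop_test_counts(n):
    """Iteration counts to exercise a simple loop with at most n passes."""
    m = n // 2  # a typical intermediate value, m < n (assumption for the sketch)
    counts = [0, 1, 2, m, n - 1, n, n + 1]
    # Keep order but drop duplicates, which occur for small n.
    seen, out = set(), []
    for c in counts:
        if c not in seen:
            seen.add(c)
            out.append(c)
    return out

print(loop_test_counts(10))  # [0, 1, 2, 5, 9, 10, 11]
```

Each count then becomes one test case: drive the loop so it iterates exactly that many times (the n + 1 case probes the loop's guard for off-by-one errors).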


Types of Code Based Testing (4) - Path Coverage: Nested Loops

• Start with the inner loop. Set all other loops to minimum values.
• Conduct simple loop testing on the inner loop.
• Work outwards.
• Continue until all loops are tested.


Types of Code Based Testing (4) - Path Coverage: Concatenated Loops

• If the loops are independent, use simple loop testing.
• If dependent, treat as nested loops.


Types of Code Based Testing (4) - Path Coverage: Unstructured Loops

• Don't test - redesign.


Types of Code Based Testing (5) - Cyclomatic Complexity

Measures the amount of decision logic in a single software module.

Cyclomatic complexity gives a quantitative measure of the logical complexity. This value gives the number of independent paths in the basis set, and an upper bound for the number of tests needed to ensure that each statement is executed at least once.


Cyclomatic Complexity

An independent path is any path through a program that introduces at least one new set of processing statements or a new condition (i.e., a new edge).


Relationship with Programming Complexity

Cyclomatic complexity calculations help the developer/tester decide whether the module under test is overly complex or well written. The recommended limit for cyclomatic complexity is 10.

• > 10: the structure of the module is overly complex.
• > 5 and < 10: the structure of the module is complex, indicating that the logic is difficult to test.
• < 5: the structure of the module is simple and the logic is easy to test.


Flow Graph Notation

[Diagram: flow graph notation for sequence, if, while, until and case constructs.]


Flow Graph Notation

On a flow graph:

Arrows called edges represent flow of control.

Circles called nodes represent one or more actions.

Areas bounded by edges and nodes are called regions.

A predicate node is a node containing a condition.



Flow Graph Notation

Any procedural design can be translated into a flow graph.

Note that compound Boolean expressions in tests generate at least two predicate nodes and additional arcs.



Flow Graph Notation


Deriving Cyclomatic Complexity

Cyclomatic Complexity equals number of independent paths through standard control flow graph model.

Steps to arrive at Cyclomatic Complexity

Draw a corresponding flow graph.

Determine Cyclomatic Complexity.

Determine independent paths.

Prepare tests cases.


Cyclomatic Complexity: Example PROCEDURE SORT

[Flow graph with nodes 1, 2, 3, 4, 5, 6, 7a, 7b, 8, corresponding to the numbered statements below.]

1.  Do while records remain
        read record
2.  If record field 1 = 0
3.  Then process record;
        store in buffer;
        increment counter;
4.  Elseif record field 2 = 0
5.  Then reset record;
6.  Else process record;
        store in file;
7a. Endif
    Endif
7b. Enddo
8.  End


Reporting Cyclomatic Complexity

The McCabe cyclomatic complexity V(G) of a control flow graph measures the maximum number of linearly independent paths through it. The complexity typically increases with the number of branch points.

Definition (for a single connected component):

Cyclomatic Complexity V(G) = e - n + 2


Reporting Cyclomatic Complexity

To compute the cyclomatic complexity V(G): v refers to the cyclomatic number in graph theory, and G indicates that the complexity is a function of the graph.

If e is the number of arcs (edges), n is the number of nodes, and p is the number of connected components, then the number of linearly independent paths is:

V(G) = e - n + 2 * p
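Applying V(G) = e - n + 2p to the PROCEDURE SORT flow graph discussed earlier — the edge list below is read off the slide's graph and independent paths, so treat it as an assumption of this sketch:

```python
def cyclomatic_complexity(edges, nodes, components=1):
    """V(G) = e - n + 2p for a control flow graph."""
    return len(edges) - len(nodes) + 2 * components

# Flow graph of the PROCEDURE SORT example (edges assumed from the slide).
nodes = ["1", "2", "3", "4", "5", "6", "7a", "7b", "8"]
edges = [("1", "2"), ("1", "8"), ("2", "3"), ("2", "4"), ("3", "7b"),
         ("4", "5"), ("4", "6"), ("5", "7a"), ("6", "7a"),
         ("7a", "7b"), ("7b", "1")]

print(cyclomatic_complexity(edges, nodes))  # 11 - 9 + 2 = 4
```

The result, 4, matches the four independent paths listed for this example, and is the upper bound on the number of basis-path tests needed.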


Software Testing Technique

Example

– Independent paths:

» 1, 8

» 1, 2, 3, 7b, 1, 8

» 1, 2, 4, 5, 7a, 7b, 1, 8

» 1, 2, 4, 6, 7a, 7b, 1, 8

– Cyclomatic complexity provides an upper bound for the number of tests required to guarantee coverage of all program statements.


Summary: Cyclomatic Complexity

• The number of tests to test all control statements, plus one virtual path, equals the cyclomatic complexity.

• Cyclomatic complexity equals the number of binary conditions in a program plus one.

• Useful if used with care; does not imply adequacy.

• Does not take into account data-driven programs.


Deriving Test Cases

Using the design or code, draw the corresponding flow graph.

Determine the Cyclomatic complexity of the flow graph.

Determine a basis set of independent paths.

Prepare test cases that will force execution of each path in the basis set.

Note: some paths may only be able to be executed as part of another test.


Graph Matrices

Derivation of the flow graph and determination of a set of basis paths can be automated. Software tools that do this can use a graph matrix.

A graph matrix:

• Is square, with the number of rows and columns equal to the number of nodes.

• Rows and columns correspond to the nodes.

• Entries correspond to the edges.



Graph Matrices

A number can be associated with each edge entry. Using a value of 1 for each edge, the cyclomatic complexity can be calculated as follows:

• For each row, sum the column values and subtract 1.

• Sum these totals and add 1.
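The row-sum rule can be applied directly to a connection matrix. Here it is run on a matrix for the SORT flow graph (edges assumed from the example), yielding the same V(G) = 4 — note the sketch only subtracts 1 for rows that actually contain edges:

```python
def complexity_from_matrix(matrix):
    """For each non-empty row, sum the entries and subtract 1; total + 1 = V(G)."""
    total = 0
    for row in matrix:
        s = sum(row)
        if s > 0:
            total += s - 1
    return total + 1

# Connection matrix for the SORT flow graph (1 = edge from row node to column node).
#         1  2  3  4  5  6  7a 7b 8
m = [
    [0, 1, 0, 0, 0, 0, 0, 0, 1],   # 1
    [0, 0, 1, 1, 0, 0, 0, 0, 0],   # 2
    [0, 0, 0, 0, 0, 0, 0, 1, 0],   # 3
    [0, 0, 0, 0, 1, 1, 0, 0, 0],   # 4
    [0, 0, 0, 0, 0, 0, 1, 0, 0],   # 5
    [0, 0, 0, 0, 0, 0, 1, 0, 0],   # 6
    [0, 0, 0, 0, 0, 0, 0, 1, 0],   # 7a
    [1, 0, 0, 0, 0, 0, 0, 0, 0],   # 7b
    [0, 0, 0, 0, 0, 0, 0, 0, 0],   # 8
]
print(complexity_from_matrix(m))  # 4
```

Only the three rows with two outgoing edges (nodes 1, 2 and 4 — the decision nodes) contribute, giving 3 + 1 = 4.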



Some other interesting link weights:

• Probability that a link (edge) will be executed.

• Processing time for traversal of a link.

• Memory required during traversal of a link.

• Resources required during traversal of a link.



Graph Matrices

Connection matrix for the SORT flow graph (a 1 marks an edge from the row node to the column node):

      1   2   3   4   5   6   7a  7b  8
1         1                           1
2             1   1
3                                 1
4                     1   1
5                             1
6                             1
7a                                1
7b    1
8


Introduction to Static Testing


Static Testing

Static testing is the process of evaluating a system or component based on its form, structure, content or documentation (without computer program execution).

Reviews form an important activity in static testing.


Reviews

Reviews are "filters" applied to uncover errors in products at the end of each phase.

A review process can be defined as a critical evaluation of an object.

Reviews involve a group meeting to assess a work product, in certain phases such as the Requirements phase, the Prototyping phase and the final delivery phase.


Benefits of Reviews

• Identification of anomalies at an earlier stage of the life cycle

• Identifying needed improvements

• Certifying correctness

• Encouraging uniformity

• Enforcing subjective rules


Types of Reviews

Inspections

Walkthroughs

Technical Reviews

Audits


Work-products that undergo reviews:

• Software Requirement Specification

• Software design description

• Source code

• Software test documentation

• Software user documentation

• System build

• Release notes

Let us discuss Inspections, Walkthroughs and Technical Reviews with respect to code.


Code Inspection

Code inspection is a visual examination of a software product to detect and identify software anomalies, including errors and deviations from standards and specifications.

• Inspections are conducted by peers and led by impartial facilitators.

• Inspectors are trained in inspection techniques.

• Determination of remedial or investigative action for an anomaly is a mandatory element of software inspection.

• Attempting to discover the solution for a fault is not part of the inspection meeting.


Objectives of Code Inspection

• The cost of detecting and fixing defects is lower during early stages.

• Gives management an insight into the development process through metrics.

• Inspectors learn from the inspection process.

• Allows easy transfer of ownership, should staff leave or change responsibility.

• Builds team strength at an emotional level.


Composition of Code Inspection Team

Author

Reader

Moderator

Inspector

Recorder


Rules for Code Inspection

• The inspection team can have only 3 to 6 participants.

• The author shall not act as inspection leader, reader or recorder.

• Management shall not participate in the inspection.

• The reader is responsible for leading the inspection team through the program, interpreting sections of the work line by line.

• The code is related back to higher-level work products such as Design and Requirements.

Page 186: Testing concepts prp_ver_1[1].0

Inspection Process

Overview

Preparation

Inspection

Rework

Follow up

Page 187: Testing concepts prp_ver_1[1].0

Classification of anomaly

Missing

Superfluous (additional)

Ambiguous

Inconsistent

Improvement desirable

Non-conformance to standards

Risk-prone (safer alternative methods are available)

Factually incorrect

Non-implementable (due to system or time constraints)

Page 188: Testing concepts prp_ver_1[1].0

Severity of anomaly

Major

Minor

Page 189: Testing concepts prp_ver_1[1].0

Benefits of Code Inspection

Synergy – 3-6 active people work together, focused on a common goal.

Work product is detached from the individual.

Anomalies are identified at an earlier stage of the life cycle.

Uniformity is maintained.

Page 190: Testing concepts prp_ver_1[1].0

Guidelines for Code Inspection

Adequate preparation time must be provided to participants.

The inspection time must be limited to 2-hour sessions, with a maximum of 2 sessions a day.

The inspection meeting must be focused only on identifying anomalies, not on the resolution of the anomalies.

The author must be dissociated from his work.

The management must not participate in the inspections.

Selecting the right participants for the inspection.

Page 191: Testing concepts prp_ver_1[1].0

Output of Code Inspection

Inspection team members

Software program examined

Code inspection objectives and whether they were met.

Recommendations regarding each anomaly.

List of actions, due dates and responsible people.

Recommendations, if any, to the QA group to improve the process

Page 192: Testing concepts prp_ver_1[1].0

Code Walkthrough

Walkthrough is a static analysis technique in which a designer or programmer leads members of the development team and other interested parties through a software program.

Participants ask questions about the program and comment on possible errors, violations of standards, guidelines, etc.

Page 193: Testing concepts prp_ver_1[1].0

Objectives of Code Walkthrough

To evaluate a software program and check its conformance to standards, guidelines and specifications

Educating / Training participants

Find anomalies

Improve software program

Consider alternative implementation if required (not done in inspections)

Page 194: Testing concepts prp_ver_1[1].0

Difference between Inspections and Walkthroughs

Inspections: A group of relevant persons from different departments participates.
Walkthroughs: Usually team members of the same project participate; the author himself acts as the walkthrough leader.

Inspections: A checklist is used to find faults.
Walkthroughs: No checklist is used.

Inspections: The process includes overview, preparation, inspection, rework and follow-up.
Walkthroughs: The process includes overview, little or no preparation, examination (the actual walkthrough meeting), rework and follow-up.

Page 195: Testing concepts prp_ver_1[1].0

Difference between Inspections and Walkthroughs (Contd.)

Inspections: A formalized procedure is followed in each step.
Walkthroughs: No formalized procedure in the steps.

Inspections: Take longer, as the list of items in the checklist is tracked to completion.
Walkthroughs: Take a shorter time, as no formal checklist is used to evaluate the program.

Page 196: Testing concepts prp_ver_1[1].0

Code Walkthrough Team

Author

Walkthrough Leader

Recorder

Team member

Page 197: Testing concepts prp_ver_1[1].0

Code Walkthrough Process

Overview

Preparation

Examination

Rework / Follow-up

Page 198: Testing concepts prp_ver_1[1].0

Outputs of Code Walkthrough

Walkthrough team members

Software program examined

Walkthrough objectives and whether they were met.

Recommendations regarding each anomaly.

List of actions, due dates and responsible people.

Page 199: Testing concepts prp_ver_1[1].0

Technical Review of Code

A technical review is a formal team evaluation of a product.

It identifies any discrepancies from specifications and standards or provides recommendations after the examination of alternatives or both.

A technical review is less formal than an inspection.

The technical review participants include the author and participants knowledgeable of the technical content of the product being reviewed.

Page 200: Testing concepts prp_ver_1[1].0

Technical review process

Step 1: Planning the Technical Review Meeting

Step 2: Reviewing the Product

Step 3: Conducting the Technical Review

Step 4: Resolving Defects

Step 5: Reworking the Product

Page 201: Testing concepts prp_ver_1[1].0

Outputs of Technical review

Same as Inspections.

Page 202: Testing concepts prp_ver_1[1].0

Low Level Testing

High Level Testing

Requirements-Based Test Design -

Black Box Technique

Page 203: Testing concepts prp_ver_1[1].0

Purpose

To establish the functional validity of the system, and to check for:

Sensitivity

Tolerance

Operability

Interface errors

Errors in database structures

Performance errors

Initialization and termination errors

Page 204: Testing concepts prp_ver_1[1].0

Approach

Positive Testing

Negative Testing

Use case Testing

Page 205: Testing concepts prp_ver_1[1].0

Categories of Requirements

Functional

Absolutely necessary for the functioning of the system

Describe the input/output behaviour of the system

The "shalls" of the software

Must be testable

Non-functional

Restrictions or constraints on system services

Define the attributes of the system as it performs its job

Subjective in nature and not conclusively testable

In real systems, these are more important than functional requirements!

Page 206: Testing concepts prp_ver_1[1].0

Validating Functional Requirements

Black Box Testing

Low Level Testing

High Level Testing

Page 207: Testing concepts prp_ver_1[1].0

Validating Non-functional Requirements

Software Quality Factors

Test cases are generated to validate the metrics.

If the metrics are validated, the criteria are met; if the criteria are met, the factor is met.

(Hierarchy: factor → criteria → metric)

Prioritization of factors is important.

Page 208: Testing concepts prp_ver_1[1].0

Low Level Techniques

Equivalence partitioning

Boundary value analysis

Input domain & Output domain

Special Value

Error based

Cause-effect Graph

Comparison Testing

High Level Techniques

Specification-based testing

Express requirements in simple formal notations like

State machine

Decision table

Use cases

Flowchart

Boolean logic

Regular expressions

The notation allows generation of scenarios.

Different test cases for every scenario.

Good side effects!

Makes requirements verifiable, finds flaws in requirements.

Requirements Based Test Design - Black Box Techniques

Page 209: Testing concepts prp_ver_1[1].0

High Level Techniques

Requirements-Based Test Design - Black Box

Technique

Page 210: Testing concepts prp_ver_1[1].0

Techniques

State Machine

Decision Table

Flowchart

Use Cases

Page 211: Testing concepts prp_ver_1[1].0

State Machine

Description

State-based business logic

Generate test cases covering all paths

The diagram may be complicated

For every event, generate test cases using BVA, EP…

State Diagram
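The idea can be sketched in code. The transition table below is a hypothetical login-session example (states and events are illustrative, not taken from the slides); covering all paths at its simplest means one test case per transition, after which BVA/EP can be applied to each event's input data.

```python
# Hypothetical state machine for a login session.
# Keys are (current state, event); values are the resulting state.
TRANSITIONS = {
    ("LoggedOut", "login_ok"): "LoggedIn",
    ("LoggedOut", "login_fail"): "LoggedOut",
    ("LoggedIn", "logout"): "LoggedOut",
    ("LoggedIn", "timeout"): "LoggedOut",
}

def transition_test_cases(transitions):
    """One test case per transition: (start state, event, expected end state)."""
    return [(s, e, t) for (s, e), t in sorted(transitions.items())]

for start, event, expected in transition_test_cases(TRANSITIONS):
    print(f"Given {start}, when {event}, expect {expected}")
```

Covering every transition once is the weakest coverage level; longer event sequences (transition pairs, round trips) can be generated from the same table when the logic warrants it.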

Page 212: Testing concepts prp_ver_1[1].0

Decision Table

Explores combinations of input conditions

Consists of 2 parts: a Condition section and an Action section.

Condition Section - Lists conditions and their combinations

Action Section - Lists responses to be produced

Exposes errors in specification

Columns in decision table are converted to test cases

Similar to Condition Coverage used in White Box Testing

                     Value 1   Value 2   Value 3
CONDITION
Login                   √         √         X
Password                X         √         X
ACTION
Successful Login        X         √         X
Unsuccessful Login      √         X         √
Warning Message       √(W)       NA       √(W)
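The column-to-test-case conversion can be sketched as follows. The dictionaries encode the login table above (√ as True, X as False, NA as None); the field names are illustrative, not an established schema.

```python
# Decision table for the login example; one list entry per column (Value 1..3).
conditions = {
    "login_valid":    [True,  True,  False],
    "password_valid": [False, True,  False],
}
actions = {
    "successful_login":   [False, True,  False],
    "unsuccessful_login": [True,  False, True],
    "warning_message":    [True,  None,  True],   # None = not applicable
}

def column_to_test_case(i):
    """Column i of the decision table becomes one test case:
    its condition values are the inputs, its action values the expected results."""
    inputs = {name: vals[i] for name, vals in conditions.items()}
    expected = {name: vals[i] for name, vals in actions.items()}
    return {"inputs": inputs, "expected": expected}

test_cases = [column_to_test_case(i) for i in range(3)]
```

Each dictionary in `test_cases` is directly executable against the system under test: feed the inputs, compare the observed actions to the expected ones.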

Page 213: Testing concepts prp_ver_1[1].0

Flowchart

Description

Flow based business logic

Generate test cases covering all paths

Simple to use

For every condition generate test cases using BVA, EP…

Page 214: Testing concepts prp_ver_1[1].0

Use Cases :

A simple and effective method for finding errors in Object-Oriented applications during the Analysis phase.

A good start for User Acceptance Testing and the test plan.

Accurately reflects business requirements.

Page 215: Testing concepts prp_ver_1[1].0

Low Level Techniques

Requirements-Based Test Design - Black Box

Technique

Page 216: Testing concepts prp_ver_1[1].0

Techniques

Equivalence partitioning

Boundary value analysis

Input domain & Output domain

Special Value

Error based

Cause-effect Graph

Comparison Testing

Page 217: Testing concepts prp_ver_1[1].0

Low Level Techniques (1) - Equivalence Partitioning

Divides the input domain into classes of data for which test cases can be generated.

Attempts to uncover classes of errors.

Divides the input domain of a program into classes of data.

Derives test cases based on these partitions.

An equivalence class is a set of valid or invalid states of input.

Test case design is based on equivalence classes for an input domain.

(Diagram: valid and invalid inputs feed into the system, which produces output)

Page 218: Testing concepts prp_ver_1[1].0

Low Level Techniques (1) - Equivalence Partitioning (Contd..)

Useful in reducing the number of Test Cases required.

It is very useful when the input/output domain is amenable to partitioning.

Invalid Valid Range Invalid

Less than 6 Between 6 and 15 More than 15

4 9 17

Input Range (6,15) Test Values (4,9,17)
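A minimal sketch of picking one representative per partition for a numeric range, matching the (6,15) example above. The offsets used for the out-of-range representatives are arbitrary choices; any value inside each class is an equally valid representative.

```python
def equivalence_test_values(low, high):
    """One representative per equivalence class of a numeric range:
    below the range (invalid), within it (valid), above it (invalid)."""
    below = low - 2            # arbitrary value under the range
    within = (low + high) // 2  # arbitrary value inside the range
    above = high + 2           # arbitrary value over the range
    return [below, within, above]

print(equivalence_test_values(6, 15))  # → [4, 10, 17]
```

The slide's picks (4, 9, 17) come from the same three classes; equivalence partitioning only requires one element per class, not specific values.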

Page 219: Testing concepts prp_ver_1[1].0

Low Level Techniques (1) - Equivalence Partitioning (Contd..)

Here test cases are written to uncover classes of errors for every input condition.

Equivalence classes are:

Range: upper bound + 1, lower bound - 1, within bounds

Value: maximum length + 1, minimum length - 1, valid value and valid length, invalid value

Set: in-set, out-of-set

Boolean: true, false

Page 220: Testing concepts prp_ver_1[1].0

Low Level Techniques (1) - Equivalence Partitioning (Contd..)

Equivalence Partitioning divides the data according to a partition of a set.

Partition refers to collection of mutually disjoint subsets whose union is the entire set.

Choose one data element from each partitioned set.

The KEY is the choice of equivalence relation!

EC-based testing:

Gives a sense of complete testing.

Helps avoid redundancy.

Page 221: Testing concepts prp_ver_1[1].0

Low Level Techniques (2) - Boundary Value Analysis

A Black Box Testing Method

Complements Equivalence Partitioning

BVA leads to a selection of test cases that exercise bounding values

Design test cases that test:

Min values of an input

Max values of an input

Values just above and below the input range

Input Range (6,15) Test Values (5,6,7,15,16)

(Number line: 5 | 6 ... 15 | 16, with regions "Less than 6", "Between 6 and 15", "More than 15")
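The selection above can be sketched as a small helper; it reproduces the slide's test values for the range (6,15).

```python
def boundary_values(low, high):
    """Boundary test values for an input range [low, high], per the slide:
    just below the minimum, the minimum, just above it,
    the maximum, and just above the maximum."""
    return [low - 1, low, low + 1, high, high + 1]

print(boundary_values(6, 15))  # → [5, 6, 7, 15, 16]
```

Some BVA variants also include `high - 1` (just below the maximum); the slide's five-value list omits it.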

Page 222: Testing concepts prp_ver_1[1].0

Low Level Techniques (2) - Boundary Value Analysis

Helps to write test cases that exercise bounding values.

Complements Equivalence Partitioning.

Guidelines are similar to Equivalence Partitioning.

Two types of BVA:

Range

• Above and below Range

Value

• Above and below min and max number

Page 223: Testing concepts prp_ver_1[1].0

Low Level Techniques (2) - Boundary Value Analysis

Boundary Value Analysis

Large number of errors tend to occur at boundaries of the input domain.

BVA leads to selection of test cases that exercise boundary values.

BVA complements Equivalence Partitioning.

Rather than select any element in an equivalence class, select those at the “edge” of the class.

Page 224: Testing concepts prp_ver_1[1].0

Low Level Techniques (2) - Boundary Value Analysis

Examples:

For a range of values bounded by ‘a’ and ‘b’, test (a-1), a, (a+1), (b-1), b, (b+1).

If input conditions specify a number of values ‘n’, test with (n-1), n and (n+1) input values.

Apply 1 and 2 to output conditions (e.g., generate table of minimum and maximum size).

If internal program data structures have boundaries (e.g., buffer size, table limits), use input data to exercise structures on boundaries.

Page 225: Testing concepts prp_ver_1[1].0

Low Level Techniques (2) - Boundary Value Analysis

For Two Variables

a < = x1 < = b

c < = x2 < = d

For each variable

Minimum -1

Minimum

Minimum +1

Nominal/mid

Maximum -1

Maximum

Maximum +1

Take Cartesian product of these sets
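The seven-point-per-variable scheme and the Cartesian product can be sketched as below; the concrete bounds for x1 and x2 are illustrative.

```python
from itertools import product

def variable_points(lo, hi):
    """The seven BVA points for one variable:
    min-1, min, min+1, nominal (midpoint), max-1, max, max+1."""
    return [lo - 1, lo, lo + 1, (lo + hi) // 2, hi - 1, hi, hi + 1]

# Two variables: a <= x1 <= b and c <= x2 <= d, with illustrative bounds.
x1_points = variable_points(1, 10)
x2_points = variable_points(20, 30)

# Cartesian product: every combination of the two point sets, 7 x 7 = 49 cases.
test_cases = list(product(x1_points, x2_points))
```

Taking the full product is the thorough (worst-case) option; cheaper schemes hold all but one variable at its nominal value and vary only one variable's boundaries at a time.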

Page 226: Testing concepts prp_ver_1[1].0

Low Level Techniques (3) - Input/Output Domain Testing

Description

From input side generate inputs to map to outputs.

Ensure that you have generated all possible inputs by looking from the output side.

(Diagram: inputs map to outputs)

Page 227: Testing concepts prp_ver_1[1].0

Low Level Techniques (4) - Special Value Testing

Select test data on the basis of features of a function to be computed.

Tester uses her / his domain knowledge, experience with similar programs.

Ad hoc / seat-of-the-pants testing.

No guidelines, use best engineering judgment.

Special test cases / Error guessing.

It is useful - don't discount its effectiveness!

Page 228: Testing concepts prp_ver_1[1].0

Low Level Techniques (5) - Error based Testing

Generate test cases based on

Programmer histories

Program complexity

Knowledge of error-prone syntactic constructs

Guess errors based on data type

Page 229: Testing concepts prp_ver_1[1].0

Low Level Techniques (6) - Cause Effect Graphing Techniques

Cause Effect Graphing Techniques

Translation of natural language descriptions of procedures to software based algorithms is error prone.

Uncovers errors by representing algorithm as a cause-effect graph representing logical combinations and corresponding actions.


Page 230: Testing concepts prp_ver_1[1].0

Low Level Techniques (6) - Cause Effect Graphing Techniques

Cause Effect Graphing Techniques

How do you test code which attempts to implement this?

Cause-effect graphing attempts to provide a concise representation of logical combinations and corresponding actions.

Causes (input conditions) and effects (actions) are listed for a module and an identifier is assigned to each.

Steps:

A cause-effect graph is developed.

The graph is converted to a decision table.

Decision table rules are converted to test cases.


Page 231: Testing concepts prp_ver_1[1].0

Low Level Techniques (7) - Comparison Testing

Helps to check performance of the software under different hardware and software configurations.

Two variants of Comparison testing are:

Develop the software independently (e.g., by separate teams) and test with the same data.

Run the software versions in parallel and compare the results.

Page 232: Testing concepts prp_ver_1[1].0

Low Level Techniques (7)- Comparison Testing

Comparison Testing

In some applications, reliability is critical.

Redundant hardware and software may be used.

For redundant s/w, use separate teams to test the software.

Test with the same test data to ensure all versions provide identical output.

Run the software in parallel with a real-time comparison of results.

Method does not catch errors in the specification.
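A minimal sketch of the parallel-run comparison: two implementations (here trivial, hypothetical stand-ins for independently developed versions) are fed the same test data, and any inputs on which they disagree are reported.

```python
# Hypothetical stand-ins for two independently developed implementations.
def version_a(x):
    return x * x

def version_b(x):
    return x ** 2

def compare_versions(impl1, impl2, test_data):
    """Return the inputs on which the two implementations disagree."""
    return [x for x in test_data if impl1(x) != impl2(x)]

mismatches = compare_versions(version_a, version_b, range(-5, 6))
print(mismatches)  # → [] when the implementations agree
```

As the slide notes, agreement only shows consistency: if both versions implement the same misunderstood specification, the comparison stays empty and the specification error goes undetected.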

Page 233: Testing concepts prp_ver_1[1].0

GUI Testing

Page 234: Testing concepts prp_ver_1[1].0

Windows Compliance Standards

Window resize options

Maximize, minimize and close options should be available.

Using TAB

Should move the focus (cursor) from left to right and top to bottom in the window.

Using SHIFT+TAB

Should move the focus (cursor) from right to left and bottom to top.

Text

Should be left-justified.

Page 235: Testing concepts prp_ver_1[1].0

Windows Compliance Standards (Contd..)

Edit Box

You should be able to enter data.

Try to overflow the text; input should stop after the specified number of characters.

Try entering invalid characters - should not allow.

Radio Buttons

Left and right arrows should move ‘ON’ selection. So should UP and DOWN.

Select with the mouse by clicking.

Check Boxes

Clicking with the mouse on the box or on the text should SET/UNSET the box.

Space should do the same.

Page 236: Testing concepts prp_ver_1[1].0

Windows Compliance Standards (Contd..)

Command Buttons

Should have shortcut keys (except OK and Cancel buttons).

Click each button with the mouse - should activate.

TAB to each button & press Space/Enter - should activate.

Drop Down List

Pressing the arrow should give list of options.

Pressing a letter should bring you to the first item in the list with that start letter.

Pressing Alt+Down (or F4) should open/drop down the list box.

Page 237: Testing concepts prp_ver_1[1].0

Windows Compliance Standards (Contd..)

Combo Boxes

Should allow text to be entered.

Clicking the arrow should allow user to choose from the list

List Boxes

Should allow a single selection to be chosen by clicking with the mouse or using the Up and Down arrows.

Pressing a letter should bring you to the first item in the list with that start letter.

Page 238: Testing concepts prp_ver_1[1].0

Screen Validation Standards

Aesthetic Conditions

The general screen background should be the correct colour (per company standards, etc.).

The field prompts and backgrounds should be of correct colour.

The text in all the fields should be of the same font.

All the field prompts, group boxes and edit boxes should be aligned perfectly.

Microhelp should be available and spelt correctly.

All dialog boxes and windows should have a consistent look and feel.

Page 239: Testing concepts prp_ver_1[1].0

Screen Validation Standards (Contd..)

Validation Conditions

Failure of validation on any field should cause a user error message.

If any field has multiple validation rules, all of them should be applied.

If the user enters an invalid value and clicks on the OK button, the invalid entry should be identified and highlighted.

In numeric fields, it should be possible to enter negative numbers (where applicable).

Should allow the minimum, maximum and mid range values in numeric fields.

All mandatory fields should require user input.

Page 240: Testing concepts prp_ver_1[1].0

Screen Validation Standards (Contd..)

Navigation Conditions

The screen should be accessible correctly from the menu and toolbar.

All screens accessible through buttons on this screen should be accessed correctly.

The user should not be prevented from accessing other functions when this screen is active.

It should not be possible to open multiple instances of the same screen at the same time.

Page 241: Testing concepts prp_ver_1[1].0

Screen Validation Standards (Contd..)

Usability Conditions

All the dropdowns should be sorted alphabetically (unless specified).

All pushbuttons should have appropriate shortcut keys and should work properly.

All read-only and disabled fields should be skipped in the TAB sequence.

It should not be possible to edit microhelp text.

The cursor should be positioned in the first input field or control when opened.

When an error message occurs, the focus should return to the field in error after cancelling it.

Alt+Tab should not have any impact on the screen upon return.

Page 242: Testing concepts prp_ver_1[1].0

Screen Validation Standards (Contd..)

Data Integrity Conditions

The data should be saved when the window is closed by double clicking on the close box.

Characters should not be truncated.

Maximum and minimum field values for numeric fields should be verified.

Negative values should be stored and accessed from the database correctly.

Page 243: Testing concepts prp_ver_1[1].0

Screen Validation Standards (Contd..)

Modes (Editable, Read-only) conditions

The screen and field colours should be adjusted correctly for read-only mode.

Is the read only field necessary for this screen?

All fields and controls should be disabled in read-only mode.

No validation is performed in read-only mode.

Page 244: Testing concepts prp_ver_1[1].0

Screen Validation Standards (Contd..)

General Conditions

“Help” menu should exist.

All buttons on all tool bars should have corresponding key commands.

Abbreviations should not be used in drop down lists.

Duplicate hot keys/shortcut keys should not exist.

Escape key and cancel button should cancel (close) the application.

OK and Cancel buttons should be grouped separately.

Command button names should not be abbreviations.

Page 245: Testing concepts prp_ver_1[1].0

Screen Validation Standards (Contd..)

General Conditions (Contd..)

Field labels/names should not be technical labels, they should be meaningful to system users.

All command buttons should be of similar size, shape, font and font size.

Option boxes, option buttons and command buttons should be logically grouped.

Mouse actions should be consistent throughout the screen.

Red colour should not be used to highlight active objects (many individuals are red-green colour blind).

Screen/Window should not have cluttered appearance.

Alt+F4 should close the window/application.

Page 246: Testing concepts prp_ver_1[1].0

Bug Life Cycle

Page 247: Testing concepts prp_ver_1[1].0

What is a Bug?

Bug

A fault in a program which causes the program to perform in an unintended or unanticipated manner, or to deviate from the requirement specification or the design specification, is referred to as a bug.

Page 248: Testing concepts prp_ver_1[1].0

What is a Bug Life Cycle?

(State diagram with the following states:)

Submitted

In-Work

Solved

Validated

Terminated

Deferred
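The life cycle can be sketched as a transition table over the states listed above. The allowed transitions below are an assumed interpretation of the diagram (e.g., a failed validation sends the bug back to In-Work); the actual rules depend on the defect-tracking process in use.

```python
# Assumed transition table for the bug life cycle; adjust to the real process.
ALLOWED = {
    "Submitted":  {"In-Work", "Terminated", "Deferred"},
    "In-Work":    {"Solved"},
    "Solved":     {"Validated", "In-Work"},  # validation failed -> back to In-Work
    "Validated":  set(),                     # terminal: fix confirmed
    "Terminated": set(),                     # terminal: rejected / not a bug
    "Deferred":   {"In-Work"},               # picked up in a later release
}

def can_move(current, target):
    """True if the tracking process allows moving a bug from current to target."""
    return target in ALLOWED.get(current, set())

assert can_move("Submitted", "In-Work")
assert not can_move("Validated", "In-Work")  # a validated bug stays closed
```

Encoding the life cycle this way lets a defect-tracking tool reject illegal status changes instead of relying on convention.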

Page 249: Testing concepts prp_ver_1[1].0

Classification of Bugs

Two attributes are used whenever a Bug/Defect is detected

Severity (Severity is Technical)

Critical

Serious

Minor

Priority (Priority is Business)

High

Medium

Low

Page 250: Testing concepts prp_ver_1[1].0

Reporting/Logging a Bug/Defect

A Bug/Defect is reported with the following details:

Summary

Description

How to reproduce

Version

Module

Phase

Browser

Environment

Modified Date

Page 251: Testing concepts prp_ver_1[1].0

Reporting/Logging a Bug/Defect (Contd..)

A Bug/Defect is reported with the following details

Job assigned to

Severity

Priority

Tester’s name

Status

Database

Type of defect

Reproducible

Attachments