CS 5150 1
CS 5150 Software Engineering
Lecture 21
Reliability 2
Administration
Assignment 3
Comments have been sent to all project teams
Reviews
Carried out throughout the software development process.
[Diagram: reviews provide validation & verification across the requirements specification, design, and program stages]
Reviews of Design or Code
Concept: Colleagues review each other's work:
• Can be applied to any stage of software development, but particularly valuable to review program design
• Can be formal or informal
Design reviews are a fundamental part of good software development
Review Process
Preparation
The developer provides colleagues with documentation (e.g., a specification or design) or a code listing
Participants study the documentation in advance
Meeting
The developer leads the reviewers through the documentation, describing what each section does and encouraging questions
Must allow plenty of time and be prepared to continue on another day.
Benefits of Design Reviews
Benefits:
• Extra eyes spot mistakes, suggest improvements
• Colleagues share expertise; helps with training
• An occasion to tidy loose ends
• Incompatibilities between components can be identified
• Helps scheduling and management control
Fundamental requirements:
• Senior team members must show leadership
• Good reviews require good preparation
• Everybody must be helpful, not threatening
Review Team (Full Version)
A review is a structured meeting, with the following people
Moderator -- ensures that the meeting moves ahead steadily
Scribe -- records discussion in a constructive manner
Developer -- person(s) whose work is being reviewed
Interested parties -- people who are doing other parts of the software process
Outside experts -- knowledgeable people who are not working on this project
Client -- representatives of the client who are knowledgeable about this part of the process
Static and Dynamic Verification
Static verification: Techniques of verification that do not include execution of the software.
• May be manual or use computer tools.
Dynamic verification:
• Testing the software with trial data.
• Debugging to remove errors.
Static Analysis Tools
Program analyzers scan the source of a program for possible faults and anomalies (e.g., Lint for C programs, Eclipse).
• Control flow: loops with multiple exit or entry points
• Data use: Undeclared or uninitialized variables, unused variables, multiple assignments, array bounds
• Interface faults: Parameter mismatches, non-use of function results, uncalled procedures
• Storage management: Unassigned pointers, pointer arithmetic
Good programming practice eliminates all warnings from source code, except in special circumstances.
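As an illustration (a Python sketch with hypothetical names, not any specific lint tool), here is a toy version of the "data use" check: flag variables that are assigned but never read at module level.

```python
import ast

def unused_assignments(source: str) -> set:
    """Return names assigned but never read (module level only).

    A toy illustration of the 'data use' checks a real analyzer
    performs; scoping, augmented assignment, etc. are ignored.
    """
    tree = ast.parse(source)
    assigned, read = set(), set()
    for node in ast.walk(tree):
        if isinstance(node, ast.Name):
            if isinstance(node.ctx, ast.Store):
                assigned.add(node.id)   # name is written here
            elif isinstance(node.ctx, ast.Load):
                read.add(node.id)       # name is read here
    return assigned - read

code = "x = 1\ny = 2\nprint(x)"
print(unused_assignments(code))  # {'y'}
```

Real analyzers track scopes, control flow, and the many other fault classes listed above; this only inspects one dimension.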
Static Analysis Tools (continued)
Examples of static analysis tools
• Cross-reference table: Shows every use of a variable, procedure, object, etc.
• Information flow analysis: Identifies input variables on which an output depends.
• Path analysis: Identifies all possible paths through the program.
Static Analysis Tools in Programming Toolkits
Static Verification: Program Inspections
Formal program reviews whose objective is to detect faults
• Code may be read or reviewed line by line.
• 150 to 250 lines of code in 2 hour meeting.
• Use checklist of common errors.
• Requires team commitment, e.g., trained leaders
So effective that it is claimed that it can replace unit testing
CS 5150 14
Inspection Checklist: Common Errors
Data faults: Initialization, constants, array bounds, character strings
Control faults: Conditions, loop termination, compound statements, case statements
Input/output faults: All inputs used; all outputs assigned a value
Interface faults: Parameter numbers, types, and order; structures and shared memory
Storage management faults: Modification of links, allocation and de-allocation of memory
Exceptions: Possible errors, error handlers
CS 5150 15
The Testing Process
Unit, System, and Acceptance Testing are major parts of a software project
• They require time on the schedule
• They may require substantial investment in test data, equipment, and test software.
• Good testing requires good people!
• Documentation, including management and client reports, is an important part of testing.
CS 5150 16
Test Design
Testing can never prove that a system is correct.
It can only show that either (a) a system is correct in a special case, or (b) that it has a fault.
• The objective of testing is to find faults.
• Testing is never comprehensive.
• Testing is expensive.
CS 5150 17
Testing Strategies
• Bottom-up testing. Each unit is tested with its own test environment.
• Top-down testing. Large components are tested with dummy stubs.
user interfaces, work-flow, client and management demonstrations
• Stress testing. Tests the system at and beyond its limits.
real-time systems, transaction processing
CS 5150 18
Methods of Testing
Closed box testing
Testing is carried out by people who do not know the internals of what they are testing.
Example. IBM educational demonstration that was not foolproof
Open box testing
Testing is carried out by people who know the internals of what they are testing.
Example. Tick marks on the graphing package
CS 5150 19
Stages of Testing
Testing is most effective if divided into stages
Unit testing: unit test
System testing: integration test, function test, performance test, installation test
User interface testing
Acceptance testing
CS 5150 20
Testing: Unit Testing
• Tests on small sections of a system, e.g., a single class
• Emphasis is on accuracy of actual code against specification
• Test data is chosen by developer(s) based on their understanding of specification and knowledge of the unit
• Can be at various levels of granularity
• Open box or closed box: by the developer(s) of the unit or by special testers
If unit testing is not thorough, system testing becomes almost impossible. If you are working on a project that is behind schedule, do not rush the unit testing.
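A minimal unit test in Python's unittest style (`word_count` is a hypothetical unit; the test data mixes typical input, the empty boundary, and messy whitespace, as described above):

```python
import unittest

def word_count(text: str) -> int:
    """Unit under test: count whitespace-separated words."""
    return len(text.split())

class WordCountTest(unittest.TestCase):
    # Test data chosen by the developer from the specification.
    def test_typical(self):
        self.assertEqual(word_count("two words"), 2)

    def test_empty_string(self):
        self.assertEqual(word_count(""), 0)

    def test_extra_whitespace(self):
        self.assertEqual(word_count("  spaced   out  "), 2)

suite = unittest.defaultTestLoader.loadTestsFromTestCase(WordCountTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.wasSuccessful())  # True
```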
CS 5150 21
Testing: System and Sub-System Testing
• Tests on components or complete system, combining units that have already been thoroughly tested
• Emphasis on integration and interfaces
• Trial data that is typical of the actual data, and/or stresses the boundaries of the system, e.g., failures, restart
• Carried out systematically, adding components until the entire system is assembled
• Open or closed box: by development team or by special testers
System testing is finished fastest if each component is completely debugged before assembling the next
CS 5150 22
Acceptance Testing (from Lecture 19)
Acceptance testing
The complete system, including documentation, training materials, installation scripts, etc. is tested against the requirements by the client, assisted by the developers.
• Each requirement is tested separately.
• Scenarios are used to compare the expected outcomes with what the system produces.
• Emphasis is placed on how the system handles problems, errors, restarts, and other difficulties.
Is the system that we have built the system that you wanted? Have we met your requirements?
CS 5150 23
Acceptance Testing (from Lecture 19)
• Closed box: by the client without knowledge of the internals
• The entire system is tested as a whole
• The emphasis is on whether the system meets the requirements
• Uses real data in realistic situations, with actual users, administrators, and operators
The acceptance test must be successfully completed before the new system can go live or replace a legacy system.
Completion of the acceptance test may be a contractual requirement before the system is paid for.
CS 5150 24
Test Cases
Test cases are specific tests that are chosen because they are likely to find faults.
Test cases are chosen to balance expense against chance of finding serious faults.
• Cases chosen by the development team are effective in testing known vulnerable areas.
• Cases chosen by experienced outsiders and clients will be effective in finding gaps left by the developers.
• Cases chosen by inexperienced users will find other faults.
CS 5150 25
Test Case Selection: Coverage of Inputs
Objective is to test all classes of input
• Classes of data -- major categories of transaction and data inputs.
Cornell student system: (undergraduate, graduate, transfer, ...) by (college, school, program, ...) by (standing) by (...)
• Ranges of data -- typical values, extremes
• Invalid data
• Reversals, reloads, restarts after failure
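A sketch of covering classes of input (Python; the dimensions and their values are invented for illustration): list the equivalence classes per dimension, then take the cross-product to enumerate test cases.

```python
import itertools

# Hypothetical input classes for an enrollment transaction:
# each dimension lists the equivalence classes to cover.
statuses = ["undergraduate", "graduate", "transfer"]
credits  = [0, 1, 12, 20, 21]          # boundaries around a 1-20 valid range
payloads = ["valid", "malformed", ""]  # include invalid data

# One test case per combination of classes.
test_cases = list(itertools.product(statuses, credits, payloads))
print(len(test_cases))  # 3 * 5 * 3 = 45 cases
```

In practice the full cross-product grows quickly, so cases are pruned to the combinations most likely to find faults.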
CS 5150 26
Test Case Selection: Program
Objective is to test all functions of each computer program
• Paths through the computer programs
Program flow graph: check that every path is executed at least once
• Dynamic program analyzers
Count number of times each path is executed
Highlight or color source code
Cannot be used with time-critical software
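A toy dynamic program analyzer can be sketched with Python's sys.settrace, counting how often each line of a function executes (the function under analysis is hypothetical):

```python
import sys
from collections import Counter

def count_lines(func, *args):
    """Run func under a trace hook, counting executions of each
    line -- a toy version of a dynamic program analyzer."""
    counts = Counter()
    def tracer(frame, event, arg):
        if event == "line" and frame.f_code is func.__code__:
            counts[frame.f_lineno] += 1  # record this line execution
        return tracer
    sys.settrace(tracer)
    try:
        result = func(*args)
    finally:
        sys.settrace(None)  # always remove the hook
    return result, counts

def total(ns):
    s = 0
    for n in ns:
        s += n
    return s

value, counts = count_lines(total, [1, 2, 3])
print(value)  # 6
# counts shows the loop body ran 3 times -- the kind of data a
# real analyzer uses to highlight hot or unexecuted code.
```

The tracing overhead is also why such tools distort the behavior of time-critical software.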
CS 5150 27
Test Strategies: Program
(a) Statement analysis
(b) Branch testing
If every statement and every branch is tested, is the program correct?
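No. A small (invented) Python example shows why: two test cases can cover every statement and every branch, yet miss a path that fails.

```python
def classify(a: bool, b: bool) -> int:
    if a:
        x = 0
    else:
        x = 5
    if b:
        return 10 // x   # fails when a and b are both true
    return x

# These two cases execute every statement and take every branch:
print(classify(True, False))   # 0
print(classify(False, True))   # 2
# ...yet the untested path (True, True) raises ZeroDivisionError,
# so full statement and branch coverage does not prove correctness.
```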
CS 5150 28
Statistical Testing
• Determine the operational profile of the software
• Select or generate a profile of test data
• Apply test data to system, record failure patterns
• Compute statistical values of metrics under test conditions
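The steps above can be sketched in Python (the operational profile and the failure behavior are invented for illustration):

```python
import random

# Hypothetical operational profile: transaction types and their
# observed relative frequencies in production.
profile = {"query": 0.70, "update": 0.25, "report": 0.05}

def run_transaction(kind: str) -> bool:
    """Stand-in for the system under test; True means success.
    Here every 'report' fails, to illustrate the bookkeeping."""
    return kind != "report"

random.seed(42)  # reproducible test run
kinds, weights = zip(*profile.items())
trials = 10_000
failures = sum(
    not run_transaction(random.choices(kinds, weights)[0])
    for _ in range(trials)
)
print(f"estimated failure rate: {failures / trials:.3f}")
```

With this profile the estimated rate converges toward the 5% weight of the failing transaction type; real statistical testing feeds such rates into reliability metrics.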
CS 5150 29
Statistical Testing
Advantages:
• Can test with very large numbers of transactions
• Can test with extreme cases (high loads, restarts, disruptions)
• Can repeat after system modifications
Disadvantages:
• Uncertainty in operational profile (unlikely inputs)
• Expensive
• Can never prove high reliability
CS 5150 30
Regression Testing
Regression Testing is one of the key techniques of Software Engineering
When software is modified regression testing provides confidence that modifications behave as intended and do not adversely affect the behavior of unmodified code.
• Basic technique is to repeat entire testing process after every change, however small.
CS 5150 31
Regression Testing: Program Testing
1. Collect a suite of test cases, each with its expected behavior.
2. Create scripts to run all test cases and compare with expected behavior. (Scripts may be automatic or have human interaction.)
3. When a change is made to the system, however small (e.g., a bug is fixed), add a new test case that illustrates the change (e.g., a test case that revealed the bug).
4. Before releasing the changed code, rerun the entire test suite.
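The four steps can be sketched as a tiny Python harness (`slugify` and its cases are hypothetical):

```python
# A minimal regression harness: each case pairs an input with its
# expected output; rerun the whole suite after every change.

def slugify(title: str) -> str:
    """Example unit under regression test."""
    return "-".join(title.lower().split())

cases = [
    ("Hello World", "hello-world"),
    ("  Trim  Me ", "trim-me"),
    # Added when a bug involving tabs was fixed (step 3):
    ("tab\there", "tab-here"),
]

def run_suite():
    failures = [(inp, exp, slugify(inp))
                for inp, exp in cases if slugify(inp) != exp]
    for inp, exp, got in failures:
        print(f"FAIL: slugify({inp!r}) = {got!r}, expected {exp!r}")
    print(f"{len(cases) - len(failures)}/{len(cases)} passed")

run_suite()  # prints "3/3 passed"
```

Real projects drive the same loop through a test runner and continuous-integration scripts, but the principle is identical: the suite only grows, and it all reruns before release.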
CS 5150 32
Documentation of Testing
Testing should be documented for thoroughness, visibility, and maintenance
(a) Test plan
(b) Test specification and evaluation
(c) Test suite and description
(d) Test analysis report
CS 5150 33
User Interface Testing (from Lecture 19)
User interfaces need several categories of testing.
• During the design phase, user interface testing is carried out with trial users to ensure that the design is usable. Design testing is also used to develop graphical elements and to validate the requirements.
• During the implementation phase, the user interface goes through the standard steps of unit and system testing to check the reliability of the implementation.
• Acceptance testing is carried out with users, on the completed system.
CS 5150 34
Fixing Bugs
Isolate the bug: intermittent --> repeatable; complex example --> simple example
Understand the bug: root cause, dependencies, structural interactions
Fix the bug: design changes, documentation changes, code changes
CS 5150 35
Moving the Bugs Around
Fixing bugs is an error-prone process!
• When you fix a bug, fix its environment
• Bug fixes need static and dynamic testing
• Repeat all tests that have the slightest relevance (regression testing)
Bugs have a habit of returning!
• When a bug is fixed, add the failure case to the test suite for the future.