QA Organization Meeting May 15, 2015
LEAN TEST MANAGEMENT: REDUCE WASTE IN PLANNING, AUTOMATION, & EXECUTION
OCTOBER 1, 2015, ANAHEIM, CALIFORNIA, DISNEYLAND HOTEL
TARIQ KING
Lean Test Management: Reduce Waste in Planning, Automation and Execution (Tariq King)
Increasing Productivity without Reducing Quality

Quality = Productivity

Consider: improve quality, and you get less rework and fewer mistakes, which lowers costs, and productivity rises.

However: force productivity up (move fast, break things), and quality suffers, driving costs higher.

CHALLENGE: measuring productivity and measuring quality.
Moving Faster without Compromising Quality
DESIGN FOR TESTABILITY
LIGHTWEIGHT PLANNING
TEST IMPACT ANALYSIS
RISK-BASED TESTING
WASTE
- Untestability
- Test Redundancy
- Over-Documentation
- Brittle Automation
- Over-Testing
DESIGN FOR TESTABILITY
COVERED CLOCKS
You are not allowed to come near them, nor poke them, nor probe them. How does this affect testing?
DESIGN FOR TESTABILITY Testability refers to the degree to which a system facilitates the establishment of test criteria and the performance of tests to determine whether those criteria have been met.
Design for Testability (DFT) is an approach in which testability is engineered into the product at the design stage to support validation and verification.
DFT should be performed at all levels of the design, and may involve modifying existing designs.
TESTABILITY ATTRIBUTES
- Controllability: Must be able to set up preconditions and apply input.
- Observability: Ability to recognize and interpret the results.
- Availability: To test it, we have to be able to get at it.
- Simplicity: The less complicated it is, the easier it is to test.
- Stability: The fewer the disruptions, the faster the testing.
- Operability: The better it works, the more efficient the testing.
- Information: The more information we have about the system, the smarter we can test.
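Controllability and observability can often be engineered in with one small design change: inject a clock dependency instead of reading the system time directly, echoing the covered-clocks example. A minimal Python sketch; the class and function names are illustrative, not from the deck.

```python
import datetime

class Clock:
    """Production clock: reads the real system time."""
    def now(self):
        return datetime.datetime.now()

class FakeClock:
    """Test double: lets the test set the time (controllability)."""
    def __init__(self, fixed):
        self.fixed = fixed
    def now(self):
        return self.fixed

def greeting(clock):
    """Business logic depends on the Clock abstraction, not on datetime directly."""
    return "Good morning" if clock.now().hour < 12 else "Good afternoon"

# A test can now pin the time and assert the exact output (observability).
morning = FakeClock(datetime.datetime(2015, 10, 1, 9, 0))
evening = FakeClock(datetime.datetime(2015, 10, 1, 18, 0))
```

The production code never needs to know whether it received the real clock or the fake one.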
EXECUTIVE BUY-IN Testability is usually not a technical issue. It is often a people issue.
It has a difficult time competing with functionality, performance, and other aspects of software development.
Management must allow time for testability to be implemented so it can provide long-term benefits.
BENEFITS FOR TESTING EFFORT A = Design done with testability in mind
B = Design made without testability in mind but good fault coverage due to large test effort
C = Design that is very difficult to test
DEVELOPER BUY-IN Developers usually adopt design for testability techniques readily if they can see the immediate return.
Return in the form of fewer defects, less time dealing with those defects, better code and higher productivity.
If they see these techniques as a way of helping them write better code, they’ll do it.
Design for testability is not going to radically change the way you code but it may radically change the way you think about it…
How do my coding practices affect testing?
Polymorphism, Encapsulation, KISS, TDD, YAGNI, SOC, SOLID, DRY
SOLID PRINCIPLES
- Single Responsibility: Each class has only one responsibility.
- Open/Closed: Modules should be open for extension but closed for modification.
- Liskov Substitution: Subclasses should be substitutable for their parent class.
- Interface Segregation: Split large interfaces into smaller, more client-specific ones.
- Dependency Inversion: High-level modules should not depend on low-level modules; both should depend on abstractions.
SOLID PRINCIPLES: TEST PERSPECTIVE
- Single Responsibility: Makes the class definition easier to test.
- Open/Closed: Reduces test maintenance since tests for existing objects still work.
- Liskov Substitution: Polymorphism cleans up conditionals and promotes mocking.
- Interface Segregation: Results in small, well-defined interfaces that are easier to test.
- Dependency Inversion: Facilitates the use of stubs and mocks through loose coupling.
FOSTERING DESIGN FOR TESTABILITY
- Choice of Technologies: Libraries, frameworks, repositories, and services should support testability.
- Design Conventions: Proper abstraction and design principles; good test automation practices.
- Isolation Frameworks: Tools and approaches for creating stubs, mocks, and spies promote unit, component, and integration testing.
- Logging and Dumps: In large systems, system-level tests often fail while unit tests pass. Mechanisms for logging errors and events, and for creating memory dumps, support testing and debugging.
- Flexible Configuration: Support for specifying the desired test environment, data sources, and mock objects through a configuration file makes testing more convenient.
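Isolation frameworks plus loose coupling are what make the Dependency Inversion payoff concrete. A minimal sketch using Python's standard-library `unittest.mock` as the isolation framework; `ReportService` and its repository are hypothetical names for illustration.

```python
from unittest.mock import Mock

class ReportService:
    """High-level module depends on an injected repository abstraction
    (Dependency Inversion), so it can be tested in isolation."""
    def __init__(self, repository):
        self.repository = repository

    def total_defects(self, component):
        # Data access is delegated to the abstraction; no database needed in tests.
        return sum(bug["count"] for bug in self.repository.find_bugs(component))

# In a unit test, the real repository is replaced by a mock.
repo = Mock()
repo.find_bugs.return_value = [{"count": 2}, {"count": 3}]
service = ReportService(repo)
```

The mock both stubs the return value and records the call, so the test can verify behavior as well as results.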
LIGHTWEIGHT PLANNING
AGILE TESTING
Adaptive, Rapid, Responsive, Flexible, Evolutionary, Continuous
Do we need to plan?
At release planning, test plan documentation is developed to describe the testing strategy:
- Types of Testing: Functional, Usability, Security, Performance
- Automation vs. Manual Testing: Continuous Integration, Exploratory Testing
- Levels of Testing: Unit, Integration, System (Pyramid Goals)
- Test Infrastructure: Environments, Hardware and Software
- Estimation of Testing Effort
During each sprint of a release, focus on defining test cases to validate stories and features.
LIGHTWEIGHT TEST PLANNING
Conduct test planning breadth-first:
- Write one-line descriptions of each test case indicating its purpose.
- Review these with relevant stakeholders before filling in the details.
- Favor self-documenting test automation over comprehensive, detailed manual test documentation.
- Leverage recorders to capture test documentation during exploratory testing sessions.
- Store all test information on a central test management server that is accessible to stakeholders.
TEST MANAGEMENT INFRASTRUCTURE
[Diagram] Test management tools are used for test planning and exploratory testing against a central test management server, which stores test cases, exploratory testing session attachments, and self-documenting test automation. Automated tests are checked in to the code repository, which syncs with the server nightly.
CAPTURING EXPLORATORY SESSIONS
Rapid Reporter
Session Tester
Microsoft Test Manager
SELF-DOCUMENTING AUTOMATION
Builder Pattern, Fluent APIs, Domain-Specific Languages

[TestMethod]
public void GoogleSearchStory()
{
    new Story("Google Search").Tag("Sprint 1")
        .InOrderTo("find public information")
        .AsA("user")
        .IWant("to search the Web for documents")
        .WithScenario("simple text search")
        .Given(IOpenGoogleSearch)
        .When(IEnterSearchCriteria, "Pi")
        .And(ISubmitTheRequest)
        .Then(TheResultsPageContains, "3.1415")
        .ExecuteWithReport(GetCurrentMethod());
}
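The fluent, StoryQ-style C# example reads as its own documentation because every call records a human-readable step. The same idea can be sketched in a few lines of Python; this toy `Story` builder is illustrative only, not any of the listed tools.

```python
class Story:
    """Minimal fluent story builder: each call records a readable
    step and returns self, enabling method chaining."""
    def __init__(self, title):
        self.title = title
        self.steps = []

    def _step(self, keyword, text):
        self.steps.append(f"{keyword} {text}")
        return self  # returning self is what makes the API fluent

    def in_order_to(self, text): return self._step("In order to", text)
    def as_a(self, text):        return self._step("As a", text)
    def i_want(self, text):      return self._step("I want", text)

    def narrative(self):
        """The recorded steps double as the test's documentation."""
        return "\n".join([f"Story: {self.title}"] + self.steps)

story = (Story("Google Search")
         .in_order_to("find public information")
         .as_a("user")
         .i_want("to search the Web for documents"))
```

Printing `story.narrative()` yields the user-story text, so the automation and its documentation cannot drift apart.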
DSLs FOR FUNCTIONAL TESTING
- getgauge.io
- specflow.org
- cucumber.io
- docs.behat.org
- storyq.codeplex.com
- jbehave.org
STABILIZING TEST AUTOMATION
[Screenshots: stabilizing test automation; test run statistics]
TEST IMPACT ANALYSIS
REGRESSION TESTING
The process of validating modified software to detect whether new defects have been introduced into previously tested code.
- Provides confidence that modifications are correct.
- Occurs at different levels: Unit, Integration, System, System Integration.
- Regression testing is an expensive process.
REGRESSION TESTING STRATEGIES
Several test selection strategies have been proposed to reduce this expense.
Testing Objectives:
- Retest Changed Components
- Retest Affected Components
- Retest Integration (Re-Integration)
Testing Challenges:
- Identifying Changed Parts
- Identifying Impacted Parts
- Selecting/Reducing Test Suites
- Achieving Adequate Coverage
MODULE FIREWALL CONCEPT
A module firewall is a changed software module together with the transitive closure of all possibly affected modules and related integration links, determined from the program's control flow graph.
[Diagram: modules a, b, c linked in a dependency chain]
FIREWALL REGRESSION TESTING
With the firewall concept we can reduce regression testing to a smaller scope:
- Retest all modules and integration links within the firewall.
- This implies retesting all the changed modules themselves and re-integrating all affected modules.
- The firewall concept applies to several different test models, e.g., Class, Feature, Data, State.
FIREWALL TEST EXAMPLE
[Diagram] Main calls M1, M2, and M3; M1 calls M5; M5 calls M8 (modules M4-M8 form the lower layers). Changed module: M5.
- Module firewall: M5, M1, Main
- Unit-level retesting: M5
- Re-integration: (M1, M5), (Main, M1), (M5, M8)
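The firewall in this example can be computed mechanically: walk the call graph upward from the changed module to collect the transitive closure of its callers, recording each caller-callee link crossed, then add the changed module's own outgoing links. A minimal Python sketch; the dictionary encoding of the slide's call graph is an assumption for illustration.

```python
def module_firewall(calls, changed):
    """Return (firewall modules, integration links to re-test) for one
    changed module, given a caller -> callees adjacency mapping."""
    # Invert the call graph so we can look up who calls whom.
    callers = {}
    for caller, callees in calls.items():
        for callee in callees:
            callers.setdefault(callee, set()).add(caller)

    firewall, links = {changed}, set()
    frontier = [changed]
    while frontier:
        module = frontier.pop()
        for caller in callers.get(module, ()):
            links.add((caller, module))  # caller-callee link to re-integrate
            if caller not in firewall:
                firewall.add(caller)
                frontier.append(caller)
    # Links from the changed module down to its own callees must also re-run.
    links.update((changed, callee) for callee in calls.get(changed, ()))
    return firewall, links

# The slide's example graph: Main -> M1, M2, M3; M1 -> M4, M5; M5 -> M8.
calls = {"Main": ["M1", "M2", "M3"], "M1": ["M4", "M5"], "M5": ["M8"]}
fw, links = module_firewall(calls, "M5")
```

Running this on the example reproduces the slide's answer: the firewall is {M5, M1, Main}, and the links to re-integrate are (M1, M5), (Main, M1), and (M5, M8).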
Testing Smarter, Not Harder
IntelliTest
BUILD INTEGRATION
On each check-in:
1. Analyze Code Changes: periodically analyze changesets to determine which classes have changed since the last build.
2. Generate List of Impacted Classes: generate a list of all classes impacted by the changes through dependency analysis.
3. Identify Tests for Impacted Classes: identify all of the tests that verify the behavior of the affected classes.
4. Execute Tests and Generate Results: execute the test cases associated with the code changes and report the results.
VISUAL STUDIO EXTENSION
Key Features:
- Visualize Source Code
- Multi-Product Support
- Firewall Impact Analysis
- Custom Dependencies
- Integration Test Ordering
FEATURE-LEVEL TEST IMPACTS
RISK-BASED TESTING
RISK-BASED TESTING
A technique for prioritizing testing activities so that we get the most out of our testing efforts: test the parts of the software that pose the highest threat to project success most heavily.
It allows us to optimize testing efforts by:
- Testing error-prone features and modules.
- Testing critical features and modules.
- Searching for the most harmful defects.
- Performing limited or no testing on low-impact areas.
RISK-BASED TESTING (cont.)
How to identify risk areas? How to calculate risk? How to test based on risk calculations?
RISK HEURISTICS
Risks associated with the project, staff, management, or the software itself can be used to guide testing. Categories of heuristics for identifying testing-related risks include:
- Business Facing: Requirements, Popularity (Frequency), Criticality, Market, Bad Publicity, Liability
- General: Complexity, Changes, Bad Quality, Scheduling, Resources, Budget
- Technology Facing: Untestability, Integration, New Technology, Programming Language, Weak Testing Tools, Unfixability
RISK HEURISTICS: CHEATSHEETS
RISK CALCULATION
1. Choose factors for functional, technical, or other quality risks.
2. Assign weights to the chosen factors: 1 (Low), 3 (Medium), 10 (High).
3. Assign points to the factors in every area: 1 - 2 - 3 - 4 - 5.
4. Calculate the weighted sums:

Risk = Cost * Probability
Cost = (Weight for Impact Factor 1 * Value for Factor) + (Weight for Impact Factor 2 * Value for Factor) + ... + (Weight for Impact Factor n * Value for Factor)
Probability = (Weight for Probability Factor 1 * Value for Factor) + (Weight for Probability Factor 2 * Value for Factor) + ... + (Weight for Probability Factor n * Value for Factor)
RISK CALCULATION: EXAMPLE

Risk Area | Criticality (W:10) | Frequency (W:3) | Complexity (W:3) | Schedule (W:10) | Risk
Feature A | 4 | 4 | 4 | 4 | (4*10 + 4*3) * (4*3 + 4*10) = 2704
Feature B | 3 | 1 | 2 | 5 | (3*10 + 1*3) * (2*3 + 5*10) = 1848
Feature C | 3 | 3 | 2 | 1 | (3*10 + 3*3) * (2*3 + 1*10) = 624

Scales: Criticality: 1 (Unimportant) to 5 (Business Critical). Complexity: 1 (Simple) to 5 (Highly Complex). Popularity/Frequency: 1 (Rarely Used) to 5 (Always Used). Schedule: 1 (No Time Pressure) to 5 (Very Aggressive Schedule).
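The weighted-sum calculation is easy to script. A minimal Python sketch reproducing the example table; per the worked numbers, criticality and frequency feed the cost (impact) side while complexity and schedule feed the probability side.

```python
# Weights per the slides: 1 = low, 3 = medium, 10 = high.
COST_WEIGHTS = {"criticality": 10, "frequency": 3}   # impact factors
PROB_WEIGHTS = {"complexity": 3, "schedule": 10}     # probability factors

def risk(scores):
    """Risk = Cost * Probability, each a weighted sum of 1-5 factor scores."""
    cost = sum(w * scores[f] for f, w in COST_WEIGHTS.items())
    prob = sum(w * scores[f] for f, w in PROB_WEIGHTS.items())
    return cost * prob

features = {
    "Feature A": {"criticality": 4, "frequency": 4, "complexity": 4, "schedule": 4},
    "Feature B": {"criticality": 3, "frequency": 1, "complexity": 2, "schedule": 5},
    "Feature C": {"criticality": 3, "frequency": 3, "complexity": 2, "schedule": 1},
}
# Sort areas by risk, highest first, to drive testing priority.
ranked = sorted(features, key=lambda f: risk(features[f]), reverse=True)
```

This reproduces the table: Feature A scores 2704, Feature B 1848, and Feature C 624, so testing effort flows to Feature A first.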
RISK CALCULATION: SCHEMA
https://www.dropbox.com/s/xbtmy2wra9zcrev/RiskBasedTestingCalculationSchema.xlsx?dl=0
PRIORITIZING TESTING EFFORTS
[Risk assessment matrix: Probability (Likelihood) vs. Cost (Impact), each scored 1-5.]
- High Risk: Thorough Testing
- Medium Risk: Moderate Testing
- Low Risk: Light Testing
PRIORITIZING TESTING EFFORTS (cont.)
- Component A (Low Risk): At least 1 positive and 2 negative tests; statement coverage.
- Component B (Medium Risk): At least 3 positive and 5 negative tests; branch coverage.
- Subsystem (High Risk): Basis path for all components.
- Feature (High Risk): State transition testing; robust boundary.
- Story (Low Risk): GUI interaction testing; basic boundary.
PRIORITIZING TESTING EFFORTS (cont.)
An even higher level of prioritization can be achieved by determining the relative importance of other quality areas:
- Functionality: 50%
- Performance: 20%
- Security: 15%
- Usability: 10%
- Accessibility: 5%
- Portability: 0%
CONCLUSION Common sources of waste in testing activities include untestability, test redundancy, over-documentation, brittle automation and over-testing.
Focusing on added value and considering the relative importance of testing tasks can help us to avoid and eliminate such waste.
Design for testability, lightweight test planning, test impact analysis, and risk-based testing are some of the ways you can optimize testing.
By optimizing your testing strategy, you can reduce your time spent testing while maintaining test coverage and product quality.
THANK YOU!
Acknowledgments: Robert Vanderwall, Dionny Santiago, Gabriel Nunez, Denise Krentz
Risk-Based Testing Heuristics
Business Facing

Complexity
□ Feature or requirement may contain many complicated input, processing, and output steps.

Changes
□ New things: newer features may be the source of failures.
□ Changed things: modifications may introduce errors into previously tested features.
□ Dependencies: failed features/components may trigger other failures (↑Dependencies, ↑Risk).

Bad Quality
□ Lack of System Testing: defects can hide in untested features (↓Test Coverage, ↑Risk).
□ Domain Knowledge: mistakes can be made by analysts, developers, and testers due to lack of understanding of the problem area.
□ Bugginess: features with many known bugs may also have many unknown bugs.
□ Construction History: previous development and/or testing strategy was narrow and inadequate.

Requirements
□ Ambiguous Requirements: unclear or imprecise story descriptions and acceptance criteria.
□ Conflicting Requirements: contradictions in story descriptions and/or acceptance criteria.
□ Unknown Requirements: incomplete stories and/or missing acceptance criteria.
□ Evolving Requirements: product vision changes during the course of development.

Customer-Driven
□ Popularity/Frequency: feature is or will be heavily used by customers.
□ Criticality: feature is very important to the customer.
□ Market: feature is a key differentiator that separates our product from that of our competitors.
□ Bad Publicity: a bug could appear in the media (CNN, PC Week).
□ Liability: a bug could cause us or our customers to get sued (e.g., compliance).

Schedule, Resources, and Budget
□ Rushed work: moving fast to catch up after falling behind schedule can lead to reduced quality.
□ Scope creep: growth in requirements without increasing the schedule, resources, and/or budget.
□ Tired staff members: long overtime over several weeks or months causes inefficiencies and errors.
□ Late changes: introducing changes late in the development cycle leads to work being done poorly.
□ Other staff issues: alcoholism, a family member's death, staff member rivalry, turnover, etc.
Risk-Based Testing Heuristics
Technology Facing

Complexity
□ Subsystem or class may have high measurements for cyclomatic complexity, lines of code, or Halstead metrics.

Changes
□ New things: newer code modules may be the source of failures.
□ Changed things: modifications may introduce errors into previously tested code.
□ Dependencies: failed features/components may trigger other failures (↑Dependencies, ↑Risk).

Bad Quality
□ Lack of Unit Testing: defects can hide in untested code (↓Code Coverage, ↑Risk).
□ Domain Knowledge: mistakes can be made by analysts, developers, and testers due to lack of understanding of the problem area.
□ Bugginess: code with many known bugs may also have many unknown bugs.
□ Construction History: previous development and/or testing strategy was narrow and inadequate.

Design
□ Untestability: systems designed without testability in mind run the risk of slow, inefficient testing.
□ Integration: interconnections with other components (especially third-party) can cause issues.

Implementation
□ New Technology: implementing new concepts and constructs can lead to programming mistakes.
□ Programming Language: some errors are language or paradigm specific (e.g., wild pointers in C, lambda expressions in C# vs. Java).
□ Weak Testing Tools: if tools don't exist to help identify certain types of errors, such errors are likely to survive testing.
□ Unfixability: a decision may be made not to go back and fix this area after it is developed (one-shot).

Schedule, Resources, and Budget
□ Rushed work: moving fast to catch up after falling behind schedule can lead to reduced quality.
□ Scope creep: growth in requirements without increasing the schedule, resources, and/or budget.
□ Tired staff members: long overtime over several weeks or months causes inefficiencies and errors.
□ Late changes: introducing changes late in the development cycle leads to work being done poorly.
□ Other staff issues: alcoholism, a family member's death, staff member rivalry, turnover, etc.