www.origsoft.com | © Original Software
Jim Trentadue
Software Quality Consulting Director
Original Software
April 21, 2016
Make your Test Automation
SMARTER
Agenda
1 Key advantages of Test Automation
2 Additional advantages of Test Automation that are often not incorporated
3 Limitations of Test Automation
4 Topics for SMARTER Test Automation
5 Case Study
6 Session recap
Key advantages of Test Automation
Other key points to consider besides time & cost savings
Robust Test Automation delivers:
• Better distribution of testing professionals
• Unlimited permutations of data usage
• 24-hour testing coverage during critical project timelines
• Better test objectives during requirements / user-story definition
• Better test metrics of test effectiveness and data coverage
• Working with UAT earlier; incorporating customer tests
• Increased data scenarios for functional, negative, equivalence-class, boundary testing, etc.
Additional advantages of Test Automation
Going beyond automated testing through your SUT
• Test data setup – not validation
• Automate your SQL editor for table query results
• Component-level tests (message queues, scripts, web services, stored procedures)
Limitations of Test Automation
Understanding its limitations helps test automation better serve its purpose
Topics for SMARTER Test Automation
• STRATEGY for understanding
application behavior
• METHODOLOGY for implementing the
logical model
• ADAPTABLE across different platforms
and browsers
• ROADMAP for outlining the sequence
and priority of tests
• TOOLBOX to work with technologies
available
• EXPERIENCED personnel available to support changes
• REPEATABLE design for future success
Source: Celtic Testing Experts, Inc.
STRATEGY
Strategy for understanding application behavior
Consider the conditional test execution below, with the logical Test Suite laid out as follows:

Test Suite
  Test Case 1
    - Module 1 (Scenario 1)
    - Module 2 (Scenario 2)
      (if PASS, execute Test Case 3; if FAIL, execute Test Case 2)
  Test Case 2
    - Module 3 (Scenario 3)
      (if PASS, go to Test Case 4)
    - Module 4 (Scenario 4)
      (if FAIL, stop the testing)
  Test Case 3
    - Module 5 (Scenario 5)
      (if PASS, iterate through Test Case 1 for data-driven tests)
    - Module 6 (Scenario 6)
      (if FAIL, execute Test Case 2)
METHODOLOGY
Methodology for implementing the logical model
END-TO-END TESTS CREATED, DIVIDED BY MODULE, GROUPED IN TEST CASES

24 full automation paths (recording or coding), from the per-module scenario counts:

  LOGIN: 2 (3)*  MENU: 3  ORDER: 2  INVENTORY: 2  PROCESSING: 1  CONFIRMATION: 1  REPORTS: 1

  *1 negative scenario that doesn't proceed

MODULE BREAKDOWN
  LOGIN TC: Module 1, Module 2
  MENU TC: Module 3, Module 4, Module 5
  ORDER TC: Module 6, Module 7
  INVENTORY TC: Module 8, Module 9
  PROCESSING TC: Module 10
  CONFIRMATION TC: Module 11
  REPORTS TC: Module 12

DATA-DRIVEN: only 1 module required
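The 24 end-to-end paths fall straight out of the per-module scenario counts (2 x 3 x 2 x 2 x 1 x 1 x 1). A minimal sketch, using the counts from this slide; the enumeration itself is illustrative:

```python
from itertools import product

# scenario count per module, in end-to-end order (from the slide)
scenarios = {
    "LOGIN": 2,        # a 3rd, negative login scenario doesn't proceed
    "MENU": 3,
    "ORDER": 2,
    "INVENTORY": 2,
    "PROCESSING": 1,
    "CONFIRMATION": 1,
    "REPORTS": 1,
}

# one path = one scenario choice per module
paths = list(product(*(range(1, n + 1) for n in scenarios.values())))
print(len(paths))  # 24 full automation paths
```

Enumerating the paths this way also makes them natural inputs to a data-driven runner: each tuple is one row of test data.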
ADAPTABLE
Adaptable across different platforms and browsers
DESKTOP: UI Automation (UIA) / Microsoft Active Accessibility (MSAA)
WEB: Document Object Model (DOM)
USING COMMON STRUCTURES FOR OBJECT RECOGNITION
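One common way to realize "common structures for object recognition" is a logical object map: tests reference a business-level name, and a lookup resolves it to the platform-specific locator (a DOM selector on web, a UIA/MSAA automation ID on desktop). A hedged sketch; every name and locator below is hypothetical:

```python
# Logical object map: one business name, one locator per platform.
OBJECT_MAP = {
    "submit_order": {
        "web":     ("css", "form#order button[type=submit]"),  # DOM locator
        "desktop": ("automation_id", "SubmitOrderButton"),     # UIA locator
    },
}

def locate(logical_name: str, platform: str):
    """Resolve a logical object name to (strategy, value) for one platform."""
    strategy, value = OBJECT_MAP[logical_name][platform]
    return strategy, value
```

Because tests only use logical names, a UI change or a new platform means updating the map, not every script.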
ROADMAP
Roadmap for outlining the sequence and priority of tests
THREE DIFFERENT VIEWS FOR PRIORITIZING, GRADING, AND SCALING

Test Automation Prioritization
  • Manual test case readiness
  • Automation difficulty
  • Automation effort
  • # of releases automated test case(s) will be used
  TOTAL SCORE (1-10)

Functional (Manual) Testing Prioritization
  • # of test cases for functional area
  • Traceability from requirement through to test
  • # of releases manual test case(s) will be used
  • Release frequency of functional change
  TOTAL SCORE (1-10)

Business Prioritization
  • Volume of defects in production
  • Volume of defects in test
  • # of defects opened in the last six months
  • # of defects opened within the last 12 months
  TOTAL SCORE (1-10)
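The slide does not say how the factor ratings combine into the TOTAL SCORE, so the sketch below assumes a plain average of four 1-10 ratings for the Test Automation Prioritization view; the factor names and the averaging are assumptions, not the deck's method:

```python
# Factors from the "Test Automation Prioritization" view on this slide.
AUTOMATION_FACTORS = (
    "manual_test_case_readiness",
    "automation_difficulty",
    "automation_effort",
    "releases_automated_tests_used",
)

def automation_priority_score(ratings: dict) -> float:
    """Average four 1-10 factor ratings into a single 1-10 total score."""
    values = [ratings[f] for f in AUTOMATION_FACTORS]
    if any(not 1 <= v <= 10 for v in values):
        raise ValueError("each rating must be between 1 and 10")
    return round(sum(values) / len(values), 1)
```

In practice, teams often weight the factors differently (e.g., reuse across releases counting more than effort); the same shape works with a weighted sum.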
TOOLBOX
Toolbox to work with the technologies available
  • Application(s) Under Test (AUT)
  • Test Management
EXPERIENCED
Personnel requirements for those who support changes

TEST AUTOMATION PROFESSIONALS

System Testers:
  • SME knowledge of the Application Under Test
  • Writes test plans and cases
  • Executes system tests against the requirements
  • Analyzes test results; reports accordingly

Developers:
  • Creates code modules
  • Conducts code / design reviews with the QA team
  • Documents coding standards for functions and objects
  • Executes unit testing

Understands: object repositories, coding functions, debugging
Knowledge of: testing process, test documentation, system integration
REPEATABLE
Repeatable design for future success
Points to always consider:
  • Error handling at every turn
  • Use dynamic data; build with static conditions
  • Elimination of a manual test/task
Case Study: Improving Automation
Background:
The decision to automate was made for the following reasons:
  o Save time and speed up testing
  o Without an independent test team, the project team had to do all of the testing; repetitive testing of the application was thought to be dull
  o The tools existed & looked fun!

Software Under Test:
A large-scale sales & marketing automation program: a client/server application that ran on different operating systems and relational databases.

Automated solution used: SQA Robot

First Generation
  • Initially tried using capture/replay
  • Functions were written in SQA Basic to enable complex test activities in a single command
  • Issues faced: basic use of the tool was limited; required coding
  • Lessons learned: the solution requires coding expertise; need a champion; the solution lags behind technologies

Second Generation
  • 12 environments to support; management kept with automation
  • Automation was only for simple navigation & basic recording; data-driving tests
  • Modular approach: each component has a main GUI screen & main procedure
  • Issues faced: object recognition & event actions were unreliable
  • Lessons learned: difficult to build an automated region; tests had to be released to the test environment

Third Generation
  • Creation of a larger dedicated test team
  • Designed a new automation infrastructure and mapped it into a test automation estimation matrix (expanded in the next slide)
  • Issues faced: the technology used outgrew the automated solution; had object recognition issues
  • Lessons learned: test automation managed as a project initiative; understanding of maintenance costs

Case study by Steve Allott, in Software Test Automation (Fewster & Graham)
Case Study: Levels of Prioritization
Characteristics of three levels of prioritization & estimation

Level 1 tests exercise the simplest aspect of the functionality in a module:
  • Usually straightforward to test manually
  • Easy to automate
  • The automated test is likely to work
  • Unlikely to find a new bug

Level 2 tests explore all module aspects except interfaces to other components:
  • Possible but time-consuming to test manually
  • Looks easy to automate, but doesn't always turn out so
  • The automated test is likely to have bugs
  • Sometimes finds a bug

Level 3 tests exercise the deepest level of functionality in a module, including interfaces to other components:
  • Difficult if not impossible to test manually
  • Hard to automate
  • Unlikely to run successfully and repeatedly
  • Very likely to find a bug
Case Study: Key Points
Key points and recommendations in this case study
Champion must sell the ideas and benefits of automated testing and manage
other people’s expectations carefully
Ensure your testing process is reasonably mature before you start to automate
Before you buy an automated test tool, first consider your requirements
Evaluate a test tool in your environment against target applications
Build a tiered model of automation levels showing where each test sits today and the level you want it to reach
Presentation Recap
Reviewing the main points of the presentation
Understand all of the advantages of test automation: the well-known key points,
as well as additional advantages
Know the limitations of test automation for your initiative
Build your own definitions using the SMARTER acronym
State an objective for each keyword along with a real or anticipated scenario
Reference this case study, or another that illustrates the evolution and improvement of a test automation journey
Thank you for attending this
presentation!