This is the slide deck that KMS's Director of Delivery presented at Can Tho University on Saturday, September 28th, 2013
SOFTWARE TESTING METHODOLOGY & TREND
September 2013 KMS Technology - http://kms-technology.com
Vu Pham – Delivery Director [email protected]
AGENDA
• Software Testing Process & Trends 20’
• Automation Testing & Tools 20’
• Future of Software Testing 20’
• Q&A 30’
2
© 2013 KMS Technology
SOFTWARE TESTING PROCESS
AGENDA
• Testing Process Evolution
• Components of Testing Process Framework
4
DEVELOPMENT PROCESS EVOLUTION
60’s: Waterfall → 70’s: V-Model → 80’s: RUP → 00’s: Agile
5
DEVELOPMENT PROCESS EVOLUTION (CONT.)
6
Model / Advantages / Disadvantages

Waterfall
• Advantages: Simple model and easy to manage; Applicable for small software
• Disadvantages: “Big Design Up Front”; Defects detected at late phases; High amount of risk and uncertainty

V-Model
• Advantages: Early testing involvement; Clear relationship between test phases and development phases
• Disadvantages: Still possesses the limitations of a sequential model; Requires a high amount of documentation; Duplication of testing effort

RUP
• Advantages: Risk and uncertainty are managed; Testing activities and process are managed
• Disadvantages: Heavy documentation; Late customer involvement – only at UAT

Agile
• Advantages: Adaptable to changes; Early client involvement avoids unrealistic requirements; Avoids spending time on useless activities
• Disadvantages: Requires highly capable people; Needs a representative from the client; Problems scaling up the architecture
SO HOW HAS TESTING CHANGED?
60’–80’: Nice To Have
• Black-box testing
• System testing
• Functional testing
• Part-time tester

~90’: Should Have
• Grey-box testing
• System/Integration testing
• Functional testing
• Full-time tester

00’: Must Have
• White-box testing
• System-to-system testing
• Non-functional testing
• Fit-for-Use
• Professional tester

7
AGENDA
• Testing Process Evolution
• Components of Testing Process Framework
8
TESTING CENTER OF EXCELLENCE
Test Solutions
• Automation Testing
• Performance Testing
• Mobile Testing
• Specialty Testing

Best Practices
• Process Assessment
• Testing Estimation
• Continuous Process Improvement
• Exploratory/Risk-based Testing

Processes
• Quality Policy
• Guidelines & Templates
• Fundamental Testing Process
• Quality Metrics & Standards

Plan Test → Design Test → Execute Test → Close Test
9
TCoE = Processes + Practices + Solutions
WHY TEST SOLUTIONS?
10
About the Client: Clearleap was the first data-streaming solution provider to offer a complete platform that makes “TV Everywhere” possible
Business Challenges
• Simulate a high volume of concurrent users (100,000+)
• Complete within a tight schedule
• Limited budget for tools
KMS’s Solutions
• Tool Evaluation: Execute a proof of concept to evaluate both commercial and open source tools
• Planning: Determine a test strategy, approaches
• Test Design and Development: Design and develop scalable load testing architecture
• Execution and Reporting: Perform load testing and analyze/report test results
Achievements
• Developed a scalable solution based on JMeter
• Significantly reduced testing cost and increased ROI
• Found critical performance issues
WHY TEST SOLUTIONS? (CONT.)
• It takes months to build a solution from scratch
• Cost of commercial tools vs. open source tools
• Effective solutions differentiate us from other vendors
Typical Testing Solutions:
– Automation testing (web, desktop, mobile)
– Performance/Load Testing
– Security Testing
– Database/ETL Testing …
11
WHY BEST PRACTICES?
12
About the Client: A global company supporting clinical trials in 67 countries. The Client offers services including behavioral science, information technology, and clinical research.

Business Challenges
• 100% on-time delivery with zero critical bugs
• Complicated paper process following FDA regulations
• Various testing platforms for both mobile devices and desktop

KMS’s Solution
• Process Establishment: Identify gaps in the current process; leverage state-of-the-art practices
• Process Improvement: Define and measure performance/quality metrics
• Lifecycle Testing: Perform all lifecycle testing activities
• Test Automation: Develop an automation framework to shorten the test cycle

Achievements
• New process helped reduce testing effort by 60%
• No critical defects identified during 1 year of engagement
• Moved the paper-based process to a test management system, opening a new trend in the clinical-trial industry
WHY BEST PRACTICES? (CONT.)
13
• A best practice improves the outcome of activities
• A best practice has proven effectiveness
• The more practices we use, the higher our maturity
Typical Testing Best Practice:
– Review and Lesson-Learnt
– Root Cause Analysis
– Risk-based/Exploratory Testing
– Estimation Method, ROI Model
– Quality Metric Dashboard
Definition: CPI is an ongoing effort to improve the quality of products, services, or processes.

In software testing, CPI seeks improvement in:
• Quality
• Productivity
• Cost of Quality
• Time to Market …
CONTINUOUS PROCESS IMPROVEMENT
14
Assess → Plan → Implement → Evaluate (repeating cycle)
• Three metric categories in practice:
– Product Quality Metrics – how good the overall quality of the product is
– Process Effectiveness Metrics – how well the delivery processes are performed
– Testing and Test Automation Metrics – detailed status of testing activities and test outcomes
Metrics are standards of measurement by which efficiency, performance, progress, or quality of a plan, process, project or product can be assessed with the aim to support continuous improvement
Wikipedia
QUALITY METRICS
15
Product Quality Metrics
• Defects by Status
• Open Defects by Severity
• Open Defects by Severity & Functional Area
• Open Defects by Severity & Release
• Open Defect Aging …

Process Effectiveness Metrics
• Defect Identification in Pre-Prod / Prod
• Weekly Defect Rates per Environment
• Defect Escape Ratio
• Defects by Phase Found / Functional Area
• Defects by Origin / Functional Area …

Testing Metrics
• Test Coverage Planning
• Execution Status / Execution Rate by Functional Area/Cycle
• Defect Rejection Ratio
• Test Productivity …

Test Automation Metrics
• Percent Automatable
• Automation Progress
• Percent of Automated Testing Coverage …
QUALITY METRICS (CONT.)
16
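As an illustration of how two of the metrics listed above might be computed, the sketch below uses common industry definitions of the Defect Escape Ratio and Defect Rejection Ratio; the formulas and figures are illustrative assumptions, not necessarily the exact definitions KMS uses.

```python
# Illustrative computation of two common quality metrics.
# Formulas are standard industry definitions; the numbers are hypothetical.

def defect_escape_ratio(defects_found_in_prod, defects_found_in_qa):
    """Fraction of all defects that escaped QA and were found in production."""
    total = defects_found_in_prod + defects_found_in_qa
    return defects_found_in_prod / total if total else 0.0

def defect_rejection_ratio(rejected_defects, reported_defects):
    """Fraction of reported defects rejected as invalid or duplicate."""
    return rejected_defects / reported_defects if reported_defects else 0.0

print(defect_escape_ratio(5, 95))    # 0.05
print(defect_rejection_ratio(8, 80)) # 0.1
```

A low escape ratio indicates an effective test process; a high rejection ratio may signal unclear requirements or poor defect reporting.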
Definition: Risk-based testing is a testing method that uses identified risks to
– determine the “right level” of quality
– prioritize the tests and testing effort
– focus on the most important testing areas first
with the aim of making the current quality status clear and getting the best return by the time testing completes
RISK-BASED TESTING
17
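A common way to operationalize the definition above is to score each test area by likelihood and impact and run the highest-risk areas first. The sketch below is a minimal illustration; the areas, the 1–5 scales, and the likelihood × impact formula are assumptions, not a KMS-specific model.

```python
# Minimal risk-based prioritization sketch: risk = likelihood x impact,
# then test the highest-risk areas first. All scores are hypothetical.

areas = [
    {"name": "checkout", "likelihood": 4, "impact": 5},
    {"name": "search",   "likelihood": 3, "impact": 3},
    {"name": "profile",  "likelihood": 2, "impact": 2},
]

for area in areas:
    area["risk"] = area["likelihood"] * area["impact"]

prioritized = sorted(areas, key=lambda a: a["risk"], reverse=True)
print([a["name"] for a in prioritized])  # ['checkout', 'search', 'profile']
```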
EXPLORATORY TESTING
18
“A style of testing in which you explore the software while simultaneously designing and executing tests, using feedback from the last test to inform the next.” – Elisabeth Hendrickson

This type of testing helps:
• Discover unknown and undetected bugs
• Testers learn new methods and test strategies, and think outside the box
AUTOMATION TESTING & TOOLS
AGENDA
• Software Test Automation
• Software Performance Testing
• Tools Support Testing
20
THINKING OF AUTOMATION
Test Automation is… the use of software and tools to perform the testing:
• Code-Driven – Testing at source code level with a variety of input arguments.
• GUI-Driven – Testing at GUI level via keystrokes, mouse clicks, etc.

Business values of Automation:
• Greater Coverage – More time for QA to do manual exploratory/risk-based testing.
• Improved Testing Productivity – Test suites can be run earlier and nightly.
• Reduced Testing Cycle – Helps shorten time-to-market.
• Doing what manual testing cannot – e.g., load testing.
• Using Testing Effectively – Automation reduces tediousness and improves team morale.
• Increased Reusability – Tests can be run across different platforms and environments.
21
THINKING OF RETURN ON INVESTMENT
Costs: Tool, Implementation, Maintenance, Training, etc.
Benefits: Save Time, Early Response, Reliable, Repeatable, etc.

ROI: The most important measurement for test automation
• ROI (effort): planning, development, maintenance, training, etc.
• ROI (cost): tool license, environment, management, automation resources, etc.
• ROI (quality): found defects, test coverage, etc.
22
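One simple way to express the cost/benefit balance above is to compare the cumulative cost of manual regression runs against the automation investment plus per-run maintenance. The model and all dollar figures below are hypothetical assumptions for illustration, not a KMS ROI formula.

```python
# Hedged sketch of a simple automation-ROI model. All figures hypothetical.

def automation_roi(runs, manual_cost_per_run, build_cost, auto_cost_per_run):
    """ROI = (manual cost avoided - automation cost) / automation cost."""
    manual_total = runs * manual_cost_per_run
    auto_total = build_cost + runs * auto_cost_per_run
    return (manual_total - auto_total) / auto_total

# e.g. 30 regression cycles at $2,000 manual each, $20,000 to automate,
# $200 maintenance per automated run:
roi = automation_roi(30, 2000, 20000, 200)
print(round(roi, 2))  # 1.31
```

Note the break-even behavior: with few runs the ROI is negative, which is why the up-front investment is listed later as an automation challenge.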
END-TO-END TEST AUTOMATION PROCESS
1. Assessment & Evaluation
2. Pilot & Planning
3. Design & Implementation
4. Execution & Report
5. Maintenance
23
Plan Test
Design Test
Execute Test
Close Test
ASSESSMENT & EVALUATION
• Assessment
– Understand organization vision, priorities, process & methodology
– Understand Application & Technology
– Identify the Test requirements
• Evaluation
– Vendor discussion (optional)
– Tool evaluation
– Recommendations
– Finalize Testing tools
24
PILOT & PLANNING
• Pilot
– Do Proof of Concept
– Define Test process
– Finalize Test Approach & Methodology
– Define Entry & Exit criteria
• Planning
– Identify test requirements and test cases for Automation
– Set up test environment
– Define Automation framework
– Finalize Resources and Test schedule
25
DESIGN & IMPLEMENTATION
• Design
– Define standards, guidelines, Pre & Post test procedures
– Design input and output data
– Monitoring tools and report metrics
– Design Automation framework
• Implementation
– Build driver scripts, actions, keywords, data-driven tests
– Build scripts
– Validate and run against the application under test
26
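The driver/keyword/data-driven pattern mentioned in the Implementation step can be sketched in a few lines: a driver script reads a test table and dispatches each keyword to a registered action. Everything below (the keywords, actions, and test data) is a hypothetical illustration of the pattern, not KMS's framework.

```python
# Minimal keyword-driven framework sketch: a registry of actions plus a
# driver that executes (keyword, *args) rows. All names are hypothetical.

ACTIONS = {}

def keyword(name):
    """Decorator that registers a function as a test keyword."""
    def register(fn):
        ACTIONS[name] = fn
        return fn
    return register

@keyword("login")
def login(user, password):
    return f"logged in as {user}"

@keyword("verify_title")
def verify_title(expected):
    return f"title is {expected}"

def run(test_table):
    """Driver script: execute each row of the test table in order."""
    return [ACTIONS[kw](*args) for kw, *args in test_table]

results = run([("login", "alice", "s3cret"), ("verify_title", "Dashboard")])
print(results)  # ['logged in as alice', 'title is Dashboard']
```

Separating the test table from the actions is what lets non-programmers add tests as data, which is the main selling point of this framework style.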
EXECUTION & MAINTENANCE
• Execution & Report
– Set up environment
– Run and schedule tests
– Provide detailed and summary reports
– Provide automation handbook & training
• Maintenance
– Implement new change requests
– Define new enhancements
– Keep up-to-date with new functions of the application under test
27
AUTOMATION CHALLENGES
• High up-front investment cost
• Demand for skilled resources
• Selection of the best testing tools and approach
• Ineffective collaboration process
• Persuading stakeholders to say “Yes”
28
AGENDA
• Software Test Automation
• Software Performance Testing
• Tools Support Testing
29
PERFORMANCE TESTING
Determines…
• User expectations
• System constraints
• Costs

Focuses on…
• Speed
• Scalability
• Stability

To answer…
• How many…?
• How much…?
• What happens if…?
30
CROWD – SPEED – AVAILABILITY
• How many users before crashing?
• Do we have enough hardware?
• Where are the bottlenecks in the system?
• Is the system fast enough to make customers happy?
• Will it slow down or will it crash?
• Did I purchase enough bandwidth from my ISP?
• How reliable is our system?
• Will our system cope with the unexpected?
• What will happen if our business grows?

The failure of an application can be costly:
• Locate potential problems before our customers do
• Assure performance and functionality under real-world conditions
• Reduce infrastructure cost
31
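The questions above all come down to driving concurrent load and measuring what happens. A real test would use JMeter or LoadRunner, as listed later; the toy sketch below only illustrates the shape of such a test, with a stand-in function playing the role of the system under test.

```python
# Toy load-test sketch: fire concurrent "requests" at a stubbed endpoint
# and collect latencies. The endpoint is a hypothetical stand-in; a real
# test would target an actual system with a tool like JMeter.
import time
from concurrent.futures import ThreadPoolExecutor

def endpoint():
    time.sleep(0.01)  # simulate ~10 ms of server work
    return 200        # HTTP-style success status

def load_test(users, requests_per_user):
    latencies = []
    def session():
        for _ in range(requests_per_user):
            start = time.perf_counter()
            status = endpoint()
            latencies.append(time.perf_counter() - start)
            assert status == 200
    with ThreadPoolExecutor(max_workers=users) as pool:
        for _ in range(users):
            pool.submit(session)   # pool shutdown waits for all sessions
    return len(latencies), max(latencies)

count, worst_latency = load_test(users=10, requests_per_user=5)
print(count)  # 50
```

Scaling `users` up until `worst_latency` degrades (or the endpoint errors) is exactly the "how many users before crashing?" question in miniature.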
PERFORMANCE TESTING OVERVIEW
THE FUTURE CHALLENGES OF AUTOMATION
32
AGENDA
• Software Test Automation
• Software Performance Testing
• Tools Support Testing
33
TESTING TOOLS LANDSCAPE
34
ALM – Application Life-cycle Management
• Purpose: communicates across multiple project teams
• Typical Tools: Rally, VersionOne, HP ALM
TMS – Test Management System
• Purpose: manages requirement test matrix
• Typical Tools: HP QC, Test Link, QAComplete, qTest
DTS – Defect Tracking System
• Purpose: manages defects
• Typical Tools: BugZilla, Jira, Mantis
ATT: Automation Testing Tools
• Purpose: Regression and specific tests
• Typical Tools: QTP, TestComplete, Selenium, Watir, JMeter, LoadRunner
NEW TREND IN TESTING TOOLS
35
Save Time & Less Work
• Auto-sync requirements, test cases & defects
• Import/export, integrate with other systems
• Capture tools integrate into the defect tracking tool

Faster & Easier to Use
• View, mark results, update test cases and defects without leaving the target test application
• Create defects quickly

Customization & Integration
• Easy to customize new features
• Integrates with many specialized tools

More Control, Visibility
• Control and keep track of changes and assignments
• Track status across lifecycles
• View real-time status, statistical data, and associated trends

Cloud Deployment
• Flexible and low cost
FUTURE OF SOFTWARE TESTING
WHERE WE ARE?
• Ho Chi Minh City and Hanoi have continuously been in the top 10 emerging IT outsourcing cities (’07 – today) http://www.tholons.com/Top50_article.pdf
37 Confidential
• What is typical ratio of Testers in VN IT company?
WHERE WE ARE? (CONT.)
38
Ho Chi Minh City is a destination for global testing outsourcing
WHAT ARE OUR OPPORTUNITIES?
Facts:
• The testing outsourcing market triples in value every 4 years
• Many VN outsourcing companies are testing-focused: LogiGear, TMA, Global CyberSoft, KMS, FSOFT …
39
FUTURE OF SOFTWARE TESTING
1. Faster – Higher – Stronger
Faster release
– Need value from every hour spent on testing
Higher quality
– Greater test coverage of specified and implied requirements
Stronger capability
– Not only functionality but also performance, security, usability …
– Ability to develop test solutions
2. Complicated technology/application platform – Cloud Computing, Mobile, Enterprise System …
40
FUTURE OF SOFTWARE TESTING (CONT.)
3. Global testing team – global competition – Communication, Crowd-source Testing ...
4. Automation testing is a must – more effective solutions are needed
5. Less on processes, more on practices – Agile, Exploratory, Rapid testing
41
SUMMARY
1. Testing is crucial for today’s business
2. It is becoming a profession of choice
3. Vietnam is a destination for testing outsourcing
4. Automation testing is a must in the future
5. It requires an intellectual, analytical, and creative mindset
6. It takes years to become good
7. You can’t become good by learning only from daily work
8. It offers fast-paced career advancement
42
ABOUT KMS TESTING SERVICE
KMS QA SERVICES FRAMEWORK
Testing Tools
Proprietary Tools
Commercial Tools
Open source Tools
Automation Frameworks
Test Processes
Process Assessment
Best Practice Implementation
Continuous Process Improvement
Quality and Project Management Metrics
Test Management
Scrum QA Services
Regression QA Services
Automation QA Services
Performance & Load Testing Services
Code Analysis Services
Flexible Staffing Option
Streamlined Processes & Frameworks
Tools & Automation Strategic Solution/ Best Practices
Test Planning & Estimation
Test Design & Implementation
Test Execution
QA Metrics Driven Monitoring
QA Metrics Driven Process Improvements
44
Sprint Planning & Communication
• Plan tasks
• Estimate tasks
• Coordinate tasks
• Participate in Scrum
• Leverage qEstimate
Test Scenario & Test Case Creation
• Create Scenarios
• Create Test Cases
• Cross-Team review
• Report Progress
• Test Scenario/Case mapping with Mind Mapping Tool
Test Execution & Defect Identification
• Execute Test Cycles
• Log Defects
• Leverage qTrace for documenting defects
Defect Management
• Verify defect fixes
• Follow up on failures
• Monitor Aging Defects
• Root cause analysis on defects
SCRUM QA SERVICES
45
Scrum QA Services
Regression QA Services
Automation QA Services
Performance & Load Testing Services
Code Analysis Services
• Accurate, repeatable and transparent testing effort estimates
• Visual Mind-map creates a visual traceable link between requirements and test cases
• Clear and detailed defect descriptions to shorten the break-fix cycle
• ‘Preventative’ Defect Injection & better business alignment = Higher Quality
qTest - Test Management
Regression Test Planning
• Analyze Prod Defects
• Establish Critical Business Areas
• Ongoing Sprints Analysis
• Perform Root Cause Analysis
• Organize based on Business Priority
Test Cases Automation
• Build Automation Library
• Optimize & Maintain
• Leverage qAutomate for Test Case automation
Execution Cycles & Monitoring
• Execute Test cases
• Log Defects
• Follow up on failures
• Monitor Aging Defects
• Monitor Quality Trends
REGRESSION QA SERVICES
46
• Critical Business Area Focus = higher ROI
• Lower cost/effort to build & maintain the Test Case Library using qAutomate
• Iterative analysis of application quality & business priority drives adjustments to regression-testing focus
Assessment
• Understand Business Need
• Evaluate Tools
• Establish ROI
• Estimate Effort
Planning
• Define Scope
• Define Schedule
• Configure Tools
Implementation
• Setup Environment
• Establish Framework with qAutomate
• Convert Tests
• Leverage Telerik or other tools
Execution
• Execute Tests
• Report & Analyze
• Monitor Quality Trends
Maintenance
• Optimize Test Cases
• Expand Test Case Library
• Optimize & Extend Framework
47
AUTOMATION QA SERVICES
• Improved coverage with iterative defect-injection analysis
• Optimization based on business priority & risk
• Max coverage / min cost of continuous code integration
• Automation expertise across multiple tools
• ‘Consultative’ review/planning establishes the best approach to deploying automation for maximum ROI impact
Planning
• Identify Goals
• Establish KPIs
Development
• Identify Key Scenarios
• Identify Traffic Patterns
• Identify Transaction Loads
Deployment
• Simulate Load
• Assess Test Results
• Identify Bottlenecks
• Identify Aging, Throttle & Stress Limits
Maintenance
• Monitor Performance
• Detect & Escalate Issues
Upgrades & Updates
• Analyze Scalability
48
KMS PERFORMANCE & LOAD TEST SERVICES
Ongoing optimization for continued scalability
• Optimize peak system performance with preemptive monitoring
• Access to Technical Architects
• Early planning minimizes performance impacts
• Leverage automation library for increased ROI
• Identify common code-compliance violations using automated analysis tools
• Identify ‘defects’ outside of QA scope, such as violations of architectural goals, code comments, reusability, maintainability, globalization, secure coding, and coding-style preferences
• Correct application of external/open-source licensing
49
Developer Self-Review
• Conducted prior to unit testing phase
• Verify code meets the requirements & design specification
• Adheres to checklist of best practices
Peer Reviews
• Conducted prior to code release to QA
• Peer developers conduct review of each other’s code via code walk through
• QA can help as well by catching typical issues discovered during functional testing
Independent Audit
• Conducted during or after the QA testing phase
• A skilled individual (Software Architect) reviews the code of an entire module or key new functionality
• Verify code structure and compliance from an architecture perspective
CODE ANALYSIS SERVICE
KMS SOFTWARE TESTING SERVICES
Testing Tools
Proprietary Tools
Commercial Tools
Open source Tools
Automation & Performance Testing Frameworks
Test Processes
Process Assessment
Best Practice Implementation
Continuous Process Improvement
Quality and Project Management Metrics
KMS Testing Services
Testing Consulting Services
Life-cycle Testing Services
Automation Testing Services
Performance & Load Testing Services
Mobile and Specialty Testing Services
Flexible Staffing Option
Streamlined Processes & Frameworks
Tools & Automation Strategic Solution & Best Practices
Test Planning & Estimation
Test Design & Implementation
Test Execution
QA Metrics Driven Monitoring
QA Metrics Driven Process Improvements
50
SOFTWARE TESTING ESTIMATION
IMPORTANCE OF SOFTWARE ESTIMATION
• Software estimation
– the process of determining the size, cost, and time of software projects, often before work is performed
• Estimation is important for the success or failure of software projects. It provides input for:
– Making investment decisions
– Budget and staff allocation
– Stakeholder/Client negotiation …
52
WHY IS TESTING ESTIMATION IMPORTANT?
• Testing may consume up to 50% of project effort
– ~70% of effort in mission-critical systems
• Current problem
– No estimation for testing
– Estimation is done for the whole project rather than testing
53
POPULAR SOFTWARE ESTIMATION METHODS
• Sizing Methods
– Source Lines of Code (SLOC)
– Function Points Analysis …
• Effort Estimation Methods
– Expert Judgment/Experience
– Productivity Index …
• “Guestimate” Estimation Method
– Using a test distribution percentage (Ex: Testing is 30% of total effort)
54
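The "guestimate" method above is simple enough to state directly in code. The 30% figure comes from the slide's own example; any other distribution percentage could be substituted.

```python
# "Guestimate" estimation: testing effort as a fixed percentage of total
# project effort. The 30% default mirrors the slide's example.

def guestimate_testing_effort(total_project_hours, testing_pct=0.30):
    return total_project_hours * testing_pct

print(guestimate_testing_effort(1000))
```

Its appeal is that it needs no historical data, which is also its weakness: it ignores the actual complexity of the test cases, which is what the TCP-based methods that follow try to capture.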
QESTIMATE – TESTING ESTIMATION
• qEstimate - TCPA estimates the size of testing using test cases as input
• Test case complexity is based on 4 elements:
• Checkpoints
• Precondition
• Test Data
• Type of Test
55
qEstimate: http://www.qasymphony.com/media/2012/01/Test-Case-Point-Analysis.pdf
QESTIMATE – TESTING ESTIMATION (CONT.)
Test Cases → Count Checkpoints → Determine Precondition Complexity → Determine Test Data Complexity → Unadjusted TCP → Adjust with Test Type → TCP
56
ESTIMATE TESTING EFFORT (CONT.)
Typically, testing effort is distributed into phases as below:
57
PRODUCTIVITY INDEX
• Effort is computed using Productivity Index of similar completed projects
• Productivity Index is measured as TCP per person-hour
PI = Average (TCP/Actual Effort)
Effort (hrs) = TCP/Productivity Index
Simple method
58
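The two formulas above translate directly into code. The historical (TCP, effort) pairs below are hypothetical sample data.

```python
# Productivity Index method, exactly as on the slide:
#   PI = Average(TCP / Actual Effort), in TCP per person-hour
#   Effort (hrs) = TCP / PI
# Historical data points are hypothetical.

historical = [(120, 60), (200, 100), (90, 45)]  # (TCP, actual hours)

pi = sum(tcp / hours for tcp, hours in historical) / len(historical)

new_project_tcp = 150
effort_hours = new_project_tcp / pi

print(pi)            # 2.0
print(effort_hours)  # 75.0
```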
REGRESSION ANALYSIS
• Estimate effort of new projects using size and effort of completed projects
y = Ax + B
(A and B are calculated from historical data)
[Scatter chart: Effort (PM) vs. Adjusted TCP for completed projects, with fitted line y = Ax + B]
59
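Fitting y = Ax + B from historical projects is ordinary least squares on (Adjusted TCP, effort) pairs. The sketch below shows the fit from first principles; the historical data points are hypothetical.

```python
# Least-squares fit of y = Ax + B (effort vs. adjusted TCP), as the slide
# describes. Historical project data below is hypothetical.

def fit_line(points):
    """Return (A, B) minimizing sum of squared residuals of y = Ax + B."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

# (adjusted TCP, effort in person-months) for completed projects:
history = [(100, 10), (300, 28), (500, 50), (800, 78)]
a, b = fit_line(history)

estimate = a * 600 + b   # predicted effort for a new 600-TCP project
print(round(estimate, 1))  # 58.7
```

Unlike the Productivity Index, the intercept B lets the model capture fixed overhead that does not scale with project size.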
THANK YOU