Leading the Transition to Effective Testing in Your Agile Team
Fran O’Hara – Inspire Quality Services
Flipping the Iron Triangle

[Diagram: two iron triangles. Plan driven: Scope/Requirements is FIXED, while Resources and Schedule are ESTIMATED. Value driven: Resources and Schedule are FIXED, while Scope/Requirements is ESTIMATED. Quality sits at the centre of both triangles.]
Technical Debt

Symptoms of technical debt:
• Bugs found in production
• Incomprehensible, un-maintainable code
• Insufficient or un-maintainable automated tests
• Lack of CI
• Poor internal quality
• Etc.
Quality & Test

• Quality is not equal to test. Quality is achieved by putting development and testing into a blender and mixing them until one is indistinguishable from the other.
• Testing must be an unavoidable aspect of development, and the marriage of development and testing is where quality is achieved.

from ‘How Google Tests Software’, James Whittaker et al.
Release planning level – a testing perspective

Add value in release (re-)planning by:
• Supporting the Product Owner in writing User Stories/Epics and making sure they are testable
• Participating in the high-level risk analysis of those User Stories/Epics
• Ensuring estimation includes the testing perspective
• Planning the testing for the release level, i.e. creating a test strategy/approach for it (resources, tools, test levels, static testing, test environments, test automation targets), based on the scope and risks identified for that release
• Playing a key role in defining the acceptance criteria (definition of done) of the release, and later of the iteration/Sprint

Adapted from ISTQB Agile Tester Extension Syllabus
Lessons Learnt / Challenges

• Test Automation
• Line Management
• Definition of Done
• Test Competency
• Requirements (e.g. Story size, Non-Functional)
• Test Strategy & Risk
• Techniques (e.g. exploratory), Planning for Quality, Documentation, …

EuroSTAR webinar poll: 84% had experienced 3 or more of these (58% had experienced 4 or more)
Basic Testing within a Sprint

• Automated Acceptance/Story-based Tests – represent executable requirements
• Automated Unit Tests – represent executable design specifications
• Manual Exploratory Tests – provide supplementary feedback

Test Strategy & Risk
© 2014 Inspire Quality Services
Agile Testing Quadrants – Risk!
Test Strategy & Risk
Definition of ‘Done’

• An agreement between the PO and the Team, evolving over time to increase quality & ‘doneness’
• Used to guide the team in estimating and doing the work
• Used by the PO to increase predictability and to accept Done PBIs
• ‘Done’ may apply to a PBI and to an Increment
• A single DoD may apply across an organisation or a product; multiple teams on a product share the DoD
Definition of Done
DoD example

Story level
• Unit tests passed
• Unit tests achieve 80% decision coverage
• Integration tests passed
• Acceptance tests passed, with traceability to story acceptance criteria
• Code and unit tests reviewed
• Static analysis shows no important warnings
• Coding-standard compliant
• Published to Dev server

Sprint level
• Reviewed and accepted by the PO
• End-to-end functional and feature tests passed
• All regression tests passing
• Exploratory testing completed
• Performance profiling complete
• Bugs committed in the sprint resolved
• Deployment/release docs updated and reviewed
• User manual updated

Release level
• Released to Stage server
• Deployment tests passed
• Deployment/release docs delivered
• Large-scale integration performance/stress testing passed
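One Story-level item above, the 80% decision-coverage gate, lends itself to automation. A minimal sketch, assuming a hypothetical report structure; in practice the numbers would come from a coverage tool (e.g. coverage.py's branch-coverage report):

```python
# Sketch of automating one "Definition of Done" gate: the 80% decision
# (branch) coverage item from the Story-level list. The report dict and
# its field names are hypothetical stand-ins for a real tool's output.

def coverage_gate(report: dict, threshold: float = 80.0) -> bool:
    """Return True when branch coverage meets the DoD threshold."""
    covered = report["covered_branches"]
    total = report["total_branches"]
    if total == 0:          # nothing to measure counts as passing
        return True
    percent = 100.0 * covered / total
    return percent >= threshold

# Example: a story whose code has 42 of 50 branches exercised (84%)
report = {"covered_branches": 42, "total_branches": 50}
print(coverage_gate(report))  # → True: 84% clears the 80% gate
```

Wiring a check like this into the CI build makes the DoD item self-enforcing rather than a manual tick-box.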
Definition of Done
The Automation Pyramid

• Unit/Component layer – Developer Tests, e.g. JUnit – automate at design level
• API/Service layer – Acceptance Tests, e.g. Fitnesse, Cucumber – automate at story level
• GUI layer – e.g. Selenium – automate at feature/workflow level
• Manual Tests at the top, e.g. exploratory

Based on Mike Cohn
Test Automation
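The deck names JUnit for the pyramid's base layer. As an illustrative equivalent, here is a sketch using Python's built-in unittest; the room_total function and its pricing rule are invented for the example, not taken from the talk:

```python
# Unit/component-layer tests as "executable design specifications":
# small, fast tests written against a single function.
import unittest

def room_total(nightly_rate: float, nights: int, weekend_nights: int = 0) -> float:
    """Hypothetical pricing rule: weekday nights at the base rate,
    weekend nights at a 25% surcharge."""
    weekday_nights = nights - weekend_nights
    return round(weekday_nights * nightly_rate
                 + weekend_nights * nightly_rate * 1.25, 2)

class RoomTotalTest(unittest.TestCase):
    def test_weekday_only_stay(self):
        self.assertEqual(room_total(100.0, nights=3), 300.0)

    def test_weekend_surcharge(self):
        # 2 weekday nights (200.0) + 1 weekend night (125.0)
        self.assertEqual(room_total(100.0, nights=3, weekend_nights=1), 325.0)

if __name__ == "__main__":
    unittest.main(argv=["room_total_test"], exit=False)
```

Tests like these sit at the wide base of the pyramid: many of them, running in milliseconds, backing the fewer service-level and GUI-level tests above.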
Agile Development Team
(Analysts, Programmers, Testers, Architect, DBA, UI/UX, etc.)

[Diagram: specialised sub-teams – Architect; Team Lead with Developer 1 and Developer 2; QA Lead with Tester 1 and Tester 2; BA Lead with BA 1 and BA 2 – contrasted with one cross-functional team.]

Create each increment of ‘Done’ Product
No Specialised Sub-Teams
Test Competency
Is testing fully integrated?

[Diagram: three patterns (A, B, C) showing how ‘Code & Bug Fix’ and ‘Test’ activities are arranged across Sprint 1 and Sprint 2 – from testing tacked on at the end of each sprint, through testing lagging a sprint behind the coding, to testing fully integrated throughout each sprint.]
Requirements (e.g. Story size, Non-Functional)
User Story Example – Hotel Reservation

Reservation Cancellation
As a user I want to cancel a reservation so that I avoid being charged the full rate.

Confirmation:
• Verify a premium member can cancel the same day without a fee
• Verify a non-premium member is charged 10% for a same-day cancellation but is otherwise not charged
• Verify an email confirmation is sent to the user with the appropriate information
• Verify that the hotel is notified within 10 minutes of a cancellation

Conversation:
• What if I am a premium member – do I have charges?
• When is a non-premium member charged, and how much?
• How do these vary depending on when the cancellation occurs?
• Do we need to send the user confirmation by email?
• When does the hotel need to be notified?
• What if the user has paid a deposit?

Consider also specification by example/BDD
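The Confirmation items above can be turned directly into executable examples, which is the idea behind specification by example. A minimal sketch: cancel_reservation and its return shape are hypothetical stand-ins for the real reservation service, and each assertion mirrors a Given/When/Then example:

```python
# Executable examples derived from the story's acceptance criteria.
# The function below is an invented stand-in, not production code.

def cancel_reservation(rate: float, premium: bool, same_day: bool) -> dict:
    """Apply the cancellation rules stated in the Confirmation list."""
    if premium or not same_day:
        fee = 0.0
    else:
        fee = round(0.10 * rate, 2)   # non-premium, same day: 10% charge
    return {"fee": fee, "email_sent": True, "hotel_notified": True}

# Given a premium member, When they cancel the same day, Then no fee
assert cancel_reservation(150.0, premium=True, same_day=True)["fee"] == 0.0
# Given a non-premium member, When they cancel the same day, Then a 10% fee
assert cancel_reservation(150.0, premium=False, same_day=True)["fee"] == 15.0
# Given any member, When they cancel in advance, Then no fee
assert cancel_reservation(150.0, premium=False, same_day=False)["fee"] == 0.0
```

In a BDD tool such as Cucumber these examples would live in a feature file and drive the step definitions; the point is the same either way: the acceptance criteria become the tests.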
Context: Acceptance Test Driven Development

[Diagram: the Scrum events laid out across a two-week sprint (WEEK 1, WEEK 2), annotated with which roles take part in each event – the whole team (All), the Development Team (TM), and the Product Owner, ScrumMaster and Customer where relevant.]

PO: Product Owner – SM: ScrumMaster – TM: Development Team – CU: Customer

Each event is timeboxed. The times provided are maximums from the Scrum Guide at scrum.org, based on a one-month sprint. Each event is an opportunity to inspect and adapt.
Examples of how to evolve quality/test practices…

• See Google’s ‘Test Certified’ levels
• Paddy Power’s review of team practices – using a scale of 0–5 for items such as:
  – Code Reviews
  – Pair Programming
  – Code Analysis
  – Unit Tests
  – Continuous Integration
  – Automated Acceptance Tests
  – Data Generation
  – Performance Analysis
  – TDD, etc.
  (from Graham Abel, Softtest Test Automation Conference 2013)
• Communities of Practice, Tribes, Mentors, etc.
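A practice review of this kind is easy to make concrete. A sketch, assuming the 0–5 scale from the slide; the practice names come from the Paddy Power list, but the scores shown are invented for illustration:

```python
# Roll up a team's self-assessed practice scores (0-5 scale) into a
# single average that can be tracked sprint over sprint. Scores are
# invented example data, not real assessment results.
practices = {
    "Code Reviews": 4, "Pair Programming": 2, "Code Analysis": 3,
    "Unit Tests": 5, "Continuous Integration": 4,
    "Automated Acceptance Tests": 2, "Data Generation": 1,
    "Performance Analysis": 0, "TDD": 1,
}
average = sum(practices.values()) / len(practices)
print(f"Team practice average: {average:.1f} / 5")  # → Team practice average: 2.4 / 5
```

Teams often find the per-practice scores more useful than the average: the low outliers (here Performance Analysis and Data Generation) point at where to focus next.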
Conclusions – leading the way

• Educate yourself on what agile really means and how quality and test fit in your context
• Avoid the common pitfalls; use retrospectives to address current issues
• Be proactive in release-level planning regarding test strategy/approach and ‘big picture’ thinking
  – Get help from outside the team if needed
• Organise a test CoP!
• Collaborate, Communicate and Question constantly – Prevention!