Discover The Quality of Your
Testing Process
Thomas M. Cagley Jr.
Testing and Quality Assurance Today
• Testing activities include both dynamic testing, which occurs late in the
lifecycle, and static activities, which occur earlier in the development
process.
• Testing is both integrated into software teams and part of independent
QA teams.
• Many testers and test organizations are adapting and adopting Agile.
• Average spending on QA as a percentage of the total IT budget has
risen from 18% in 2012, to 23% in 2013, to 26% in 2014. The share of
testing budgets is projected to grow further in the coming years,
reaching 29% by 2017. (Capgemini, 2014, 6th World Quality Report)
The Larger Development World
According to Capgemini, 2014, 6th World Quality Report:
• As organizations focus on speed to market, efficiency and business
agility remain a challenge. Many organizations still face obstacles such
as the lack of a proven Agile testing approach (61%), difficulties
applying test automation to Agile (55%), and the lack of availability of
the right Agile testing tools (42%).
• Cost reduction remains high among IT priorities as companies look to
reduce test infrastructure costs.
• Testing is becoming ever more critical as consumers expect a
seamless end-user experience across multi-channel applications and
devices in an “always-on” world.
Just A Bit More Context
Other Pressures:
• Year-to-year IT budgets remain flat (+/-10%, which is in line with
historical trends). Tech Pro Research (TPR)
• Spiceworks' survey finds that IT budgets remain flat, with planned
spending rising by just 0.7% overall (7.9% in North America, -0.3% in
EMEA).
But Change is Hard
• Change pushes our buttons
– Fear, Uncertainty, Confusion, Doubt
• May trigger the grief cycle
– Denial, Guilt, Anger, Depression, Resignation
• Need to confront how work is really done
– Documentation is not always what people really do
• New process may seem overwhelming
– Big bang wave of change
– Too many rules
– Added overhead (costly in terms of time and money)
Change Leads To …
• Reduced time-to-market
• Increased reliability
• Increased speed of performance
• Support for an increased number of users
• Reduced cost of testing
• Adoption of new methods, techniques, and tools
• Adaptation to changes in how work is sourced
All the while, technical environments and functionality are increasing in
complexity.
The Journey Begins With
A Plan To Start
• Learn what you don’t know
• Confirm what you know
• Benchmark & assess compared to industry standards and other
similar companies
• Explore & evaluate opportunities
• Independent model
• Integrated best practices derived from over 14 quality and test models
• Underpinned by Risk Management and Monitoring & Control
• Focused on moving organizations from defect detection to defect prevention
• Fit for purpose – appropriate process in the appropriate places
• Focused on testing ONLY
Define Who is Impacted
The “You Are Here Moment:”
Benchmarking
• Develop and provide a standard reference model
• Identify testing strengths and best practices
• Improve integrated test effectiveness and efficiency
• Identify issues and risks in an organization
• Identify test process improvements
• Provide governance, measurement, and accreditation
• Provide a detailed model for test process improvement
• Use maturity levels for process evaluation and improvement
The Assessment Path
• Assessment Process
– Survey
– Process and Artifact Review
– Interviews
• Analysis and Reporting
– Gap Analysis
» Strengths and Weaknesses
» Heat Maps
» Prioritized Improvements
• Action Planning
[Diagram: assessment phases – Survey, Process and Artifact Review, Interviews, Synthesis, Planning (65% / 35%)]
Execute The Survey Vehicle
• 32 questions across the TMMi Maturity Level 2 and 3
process areas.
• The survey is designed to elicit information from
practitioners to target the assessment.
Sample of Questions
Review Project and Organizational
Documents
• Policies – examples: Testing Lifecycle Policy, Unit Testing Policy
• Process Documents – examples: Scenario Test Data Matrix, Testing V Model (2 flavors), Integrated Testing Overview
• Project Artifacts – examples: Migration Test and Back-Out Plan, Test Schedule, Test Data
Review Process
1. Documents are compared to the model requirements.
2. Internal consistency of documents is traced and evaluated (i.e. policies relate to processes, and processes relate to project artifacts).
Group and Individual Interviews
The Assessment Requires
Time and Effort
• Assessment Team
• Survey Respondents (30 min)
• Interviews (1-1.5 hours)
• Answering questions about
documents – ad hoc (30
minutes per project and
process)
• Review Results (1 hour)
• Planning (1-2 days)
[Chart: Q. 1 survey responses – max, min, and average, plotted on a Strongly Agree to Strongly Disagree scale]
Interpret Survey Responses
• Each survey statement
has 6 possible answers
– Strongly Agree
– Agree
– Neutral
– Disagree
– Strongly Disagree
– Not Applicable
• Chart shows:
– Highest answer
– Average of all answers
– Lowest answer
Analysis Example:
This statement shows a
wide range of
perceptions (potentially
different processes or
communications issues),
but the majority of
respondents agree with
the statement in the
survey.
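As a minimal sketch of how the plotted values can be derived (the 1–5 numeric mapping and function names are assumptions for illustration, not part of the assessment tooling):

# Minimal sketch: summarize Likert-style answers for one question into the
# max / min / average values plotted in the survey charts. The numeric
# mapping is an assumption; "Not Applicable" answers are excluded.
LIKERT = {"Strongly Agree": 5, "Agree": 4, "Neutral": 3,
          "Disagree": 2, "Strongly Disagree": 1}

def summarize(answers):
    """Return (max, min, average) for one question, ignoring Not Applicable."""
    scores = [LIKERT[a] for a in answers if a in LIKERT]
    return max(scores), min(scores), sum(scores) / len(scores)

# A wide spread where most respondents nonetheless agree:
print(summarize(["Strongly Agree", "Agree", "Agree", "Disagree", "Not Applicable"]))
# -> (5, 2, 3.75)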
[Chart: survey responses for Q. 1 through Q. 32 – max, min, and average per question]
Overall Survey Data
• Test monitoring (Q11-13) was perceived to be a strength. Interviews suggest this is a
reflection of status reporting rather than a quantitative evaluation of the plan, goals, or
measurement.
• Test environments (Q14 -16) were perceived to be planned, production-like and
managed for the projects in the sample. Interviews suggested this is a reflection of
inherited enhancement and maintenance environments.
• Organizational test and process assets (Q17-21) were perceived to be weak or not in
place. Process asset reviews and interviews confirmed this perception.
[Chart annotations: Organizational Test Processes and Assets, Non-functional Testing, Test Monitoring, Environment, plotted on a Strongly Agree to Strongly Disagree scale]
Reading Heat Charts
• Heat charts are based on the assessment of process documents
(how the organization indicates work should be done), project
deliverables (how work is being done), and interviews (how people
perceive work is done) compared to the TMMi model.
• Each specific practice is evaluated based on a scale:
– Partially Satisfied
– Largely Satisfied
– Fully Satisfied
Sample data!
[Heat chart (sample data): PA 2.1 Test Policy and Strategy and PA 2.2 Test Planning, rated for each specific goal and specific practice]
Partially Satisfied – Most of the projects in the sample have not implemented or have only partially implemented the practice.
Largely Satisfied – Most of the projects in the sample have fully implemented the practice, or all have largely implemented it.
Fully Satisfied – All projects in the sample have fully implemented the practice.
Not Applicable – This practice is not applicable for this specific goal.
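A minimal sketch of how such a rating could be rolled up from the sampled projects (this encodes the legend above as an illustrative assumption, not the official TMMi appraisal rules):

# Minimal sketch: derive a heat-chart rating for one specific practice from
# per-project implementation levels, following the legend above.
def rate_practice(levels):
    """levels: one of 'none', 'partial', 'large', 'full' per sampled project."""
    if all(lvl == "full" for lvl in levels):
        return "Fully Satisfied"
    mostly_full = sum(lvl == "full" for lvl in levels) > len(levels) / 2
    if mostly_full or all(lvl in ("large", "full") for lvl in levels):
        return "Largely Satisfied"
    return "Partially Satisfied"

print(rate_practice(["full", "full", "large"]))    # -> Largely Satisfied
print(rate_practice(["partial", "none", "full"]))  # -> Partially Satisfied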
TMMi Level Three
Indicative Heat Chart
Observations and Comments:
• Strengths are supplied by a new(ish) Testing COE currently
focused on all high-profile projects.
• Test architect acts as test consultant to Agile teams.
• No consistent approach to testing non-functional attributes.
[Heat chart: PA 3.1 Test Organization, PA 3.2 Test Training Program, PA 3.3 Test Lifecycle and Integration, PA 3.4 Non-functional Testing, and PA 3.5 Peer Reviews, rated for each specific goal and specific practice (same rating legend as above)]
Results Example: Maturity Level 2 -
Test Design and Execution
• People
– Mentor project and product
personnel (requires internal
test process consultant).
• Process
– Complete the documentation
of the test processes used in
CRM projects.
– Develop scaling process and
extend current practices to
other types of projects (note
this is a ML 3 suggestion).
• Technology
– Extend scaled version of
practices to other projects.
PA 2.4 Test Design and Execution (FS = Fully Satisfied, PS = Partially Satisfied, NS = Not Satisfied)
SG 1 Perform Test Analysis and Design Using Test Design Techniques
SP 1.1 Identify and prioritize test conditions. PS
SP 1.2 Identify and prioritize test cases NS
SP 1.3 Identify necessary specific test data PS
SP 1.4 Maintain horizontal traceability with requirements FS
SG 2 Perform Test Implementation
SP 2.1 Develop and prioritize test procedures PS
SP 2.2 Create specific test data PS
SP 2.3 Specify intake test procedure FS
SP 2.4 Develop test execution schedule PS
SG 3 Perform Test Execution
SP 3.1 Perform intake test NS
SP 3.2 Execute test cases PS
SP 3.3 Report test incidents FS
SP 3.4 Write test log PS
SG 4 Manage Test Incidents to Closure
SP 4.1 Decide disposition of incidents in configuration control board FS
SP 4.2 Perform appropriate action to fix the test incidents PS
SP 4.3 Track the status of test incidents PS
Results Example: Maturity Level Two
and Three Comparisons
[Bar chart – Manufacturer Profile: percent Fully / Partially / Not Achieved (0–100%) for Test Organization, Test Training Program, Test Lifecycle and Integration, Non-functional Testing, Peer Reviews, Test Policy and Strategy, Test Planning, Test Monitoring and Control, Test Design and Execution, and Test Environment]
[Bar chart – Financial Profile: percent Fully / Partially / Not Achieved for the same process areas]
• Harold's (a financial services consultancy) is early in maturing its testing processes.
• The manufacturing data suggests an organization later in its testing improvement journey.
• The financial profile shows a range of strengths and opportunities.
[Bar chart – Harold's Consulting: percent Fully / Partially / Not Achieved for the same process areas]
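A minimal sketch (an assumed roll-up for illustration, not the report's actual calculation) of how practice-level ratings could be turned into the Fully / Partially / Not Achieved percentages shown in these profiles:

# Minimal sketch: roll specific-practice ratings up into the per-process-area
# percentages plotted above. The roll-up rule is an illustrative assumption.
from collections import Counter

def profile(practice_ratings):
    """practice_ratings: 'FS', 'PS', or 'NS' for each practice in one process area."""
    counts = Counter(practice_ratings)
    total = len(practice_ratings)
    return {label: counts[code] / total
            for code, label in (("FS", "Fully Achieved"),
                                ("PS", "Partially Achieved"),
                                ("NS", "Not Achieved"))}

# Using the PA 2.4 ratings from the earlier example (4 FS, 9 PS, 2 NS of 15):
ratings = ["PS", "NS", "PS", "FS", "PS", "PS", "FS", "PS",
           "NS", "PS", "FS", "PS", "FS", "PS", "PS"]
print(profile(ratings))  # -> roughly 27% fully, 60% partially, 13% not achieved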
Results Example: Global Observations
Weaknesses
• People
– Testing COE is underfunded.
– Testing personnel are not certified.
• Process
– No quality and productivity
measurement program.
– No organized estimation (estimation is
experiential).
– Test-out versus built-in quality
philosophy.
– Risk analysis does not include testing processes.
– Test processes do not scale.
• Technology
– Tool versus process focus.
Strengths
• People
– Newly formed Testing COE with
test mentoring structure.
• Process
– Solid DevOps process.
– Defect tracking through closure.
– Change management process is
rigorous.
• Technology
– Tools provide significant structure
and process discipline.
– Process and tool experimentation
(Six Sigma Program).
Results Example: The Modeled
Prioritized Improvements
1. Implement peer reviews for all primary project and product deliverables including testing
deliverables.
– Industry data suggests every hour spent in peer reviews has a 20-80% ROI (Jones).
2. Implement a measurement program that collects data to validate testing and quality
activities. Testing and quality activities should improve productivity (more functionality
delivered), shorten time-to-market, and reduce delivered defects. Measurement needs to
address functional size (e.g. function points).
3. Develop and implement a parametric estimation process for testing (effort, duration, and
staffing). This will support both the planning and monitoring process areas (a simple sketch follows this list).
4. Integrate a formal risk management process into test planning and estimation. This will
improve testing effectiveness and cost efficiency.
5. Solve the Support/Development change collision (the impact of changes to the code base
without corresponding test case changes).
6. Augment and incubate the new Testing/Quality Center of Excellence
– The COE should provide support for project and product efforts rather than independent
testing/quality support.
– Approximate staff: 3 FTE.
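A minimal sketch of the parametric estimate in item 3 (the rate and staffing figures are hypothetical placeholders that would be calibrated from the measurement program in item 2):

# Minimal sketch: a simple parametric test estimate driven by functional size.
# The productivity rate and hours-per-week figures are hypothetical and would
# be calibrated from the organization's own measurement data.
def estimate_test_effort(function_points, hours_per_fp=1.2):
    """Estimated test effort in person-hours for a given functional size."""
    return function_points * hours_per_fp

def estimate_duration_weeks(effort_hours, testers, hours_per_week=35):
    """Rough test duration in weeks for a given staffing level."""
    return effort_hours / (testers * hours_per_week)

effort = estimate_test_effort(400)                  # 400 function points -> 480 hours
weeks = estimate_duration_weeks(effort, testers=3)  # ~4.6 weeks with 3 testers
print(round(effort), round(weeks, 1))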
Putting Assessment Results Into Action
Supporting the Delivery of Results: TMMi Consulting Support
Workshops
• Education
• Planning
• Orientation
Training
• TMMi Overview
• TMMi Professional Training (includes certification test)
iTM – Standard Test Methodology
• All process maps, templates, guidelines, checklists, and tools to support the project from initiation through requirements, build, and test to project closure
Typical Benefits
• Direct
– Rise in productivity.
– Early defect detection resulting in less live fixing.
• Indirect
– Improved staff experience, morale, and motivation – leading to better
quality work.
– Increased customer loyalty.
– Allows greater movement of employees within projects.
– Improved working environments.
– Testing is recognized as a profession and integrated into the development
process.
– Benefits apply whether traditional models like the V-model, waterfall, or Agile are used.
• TMMi helps lead to the introduction and use of a structured, controlled
set of test processes.
• ROI is calculated from direct costs and benefits, which are easier to
determine than indirect ones.
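As a minimal worked example of that calculation (the figures are hypothetical, for illustration only):

# Minimal sketch: ROI computed from direct costs and direct benefits only.
# The figures below are hypothetical placeholders.
direct_costs = 120_000     # e.g. training, tooling, COE staffing
direct_benefits = 180_000  # e.g. less live fixing from earlier defect detection
roi = (direct_benefits - direct_costs) / direct_costs
print(f"ROI = {roi:.0%}")  # -> ROI = 50%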
Leveraging The TMMi: Options
TMMi Professional Training
TMMi Gap Analysis (Invest in Improved Performance)
• Interviews across the lifecycle / Review of documentation
• Detailed Process Improvement Plan / ROI
Testing Center of Excellence (COE)
• Build – Operate – Transfer
• Policy, Governance, Operations
• TMMi Best practices-centric
TMMi Certification (Sustain success)
• Independent validation of capability
The Result
• Systems and applications are playing an ever-increasing part in everyday
life (social and business).
• There is a greater emphasis on quality aspects.
• Customers and users are becoming more demanding.
• Software failures are much more visible now and can have a major
impact on day-to-day activities – economically, socially, and
internationally.
• Testing is becoming even more of a key activity for helping to improve
quality and the performance of the development process.
• The fast pace of change in the industry – what has happened in the
last 5 years?
Typical Effort for Assessments
• Effort and costs can vary depending on:
– The assessing organization’s costs – i.e. their charge rate.
– The size and scope of the assessment – i.e. is it an organization, sector of
the business, program, or project?
– How many projects are typically running, since TMMi assessments have to
assess a percentage of live projects (covered in how to do assessments).
– How many interviews need to be conducted (covered in how to do
assessments).
– Locations where interviews have to be conducted.
– Whether it is a formal or informal assessment.
– Very often only direct costs and benefits are considered but indirect
benefits can be larger than direct ones.
TMMi Assessment Continuum
TMMi Gap Analysis (Invest in Improved Performance)
• Interviews across the lifecycle / Review of documentation
• Detailed Process Improvement Plan / ROI
TMMi Certification (Sustain success)
• Independent validation of capability
Questions?
Tom Cagley, CFPS, CSM, CTFL
Vice President
DCG Software Value
(440) 668-5717
Software Process and Measurement Podcast
http://www.spamcast.net (or iTunes)
Software Process and Measurement Blog
http://tcagley.wordpress.com