Where Do Testers Spend Their Time?
The Future of Testing
Brought to You by TechWell, IBM, uTest, and Keynote
Table of Contents
Overview
Organizational Information
Tools & Automation
Handling Defects
Where Do Testers Spend Their Time?
Moving into Mobile
Conclusion
Overview

TechWell, IBM, uTest, and Keynote partnered on a survey to explore where today's testers are spending their time, what obstacles they most often encounter, and where they think their attention should be focused.
There are resources available on the Internet that discuss where testers' attention, test related or not, is required, but many of them are based on anecdotes or derived from discussions the author has had with a small number of testers.
Purpose: Our goal was to create a data-driven report analyzing where testers are really spending their time and discovering where they think their time is best utilized.
Other topics covered in our survey and resulting report include:
• How release cycles have changed in the past few years
• How frequently and what types of tests are being automated
• How testers handle defects and delays
• What non-testing activities are vying for testers' time
• What investments are being made in mobile testing
Timeframe: The survey was conducted in April 2014 by TechWell.
Participants: TechWell surveyed 250 software testing professionals from six continents, with 63% of respondents reporting North America as the headquarters of their organization. Forty percent of respondents are test managers or test leads, and 89% have six or more years of testing experience.
Organizational Information
We asked what types of applications you test, and the vast majority reported web-based (82%), followed by client-server (54%), mobile (50%), and service-based (41%). (Figure 1)
Twenty-three percent of respondents work for organizations that deliver testing services and, hence, are not a direct part of a development team. Forty-seven percent work on teams with a tester-to-programmer ratio ranging from 1:1 to 1:5.
With iterative development methodologies seemingly the norm these days, we looked for some data to support that perception. And we found it. While 43% of respondents cite Scrum as their main development methodology (24% are sticking with waterfall), we discovered that 56% of organizations are using some variation of iterative development. (Figure 2)
This increase in agility may be affecting release frequency. During the past three years, 15% of respondents claimed semiannual releases; only 8% say that is their current release frequency. Conversely, 4.5% of the testers who answered our survey say they used to release semimonthly; that number has grown to 10% for current cycles. We also noted that 38% of respondents are currently doing monthly or quarterly releases, while 15% report they release continuously. (Figure 3)
Figure 1: Types of Applications Tested (percent of respondents)

Figure 2: What Methodology Do You Use to Develop Software? (more than one answer per respondent possible)

Figure 3: How Have Release Cycles Changed? (increased, decreased, or no change in release frequency)
Tools & Automation
Our survey found that 81% of respondents use commercial tools from HP, Microsoft, and IBM for QA management or test execution, and 61% keep it low-tech with text documents and spreadsheets (e.g., Word, Excel, Google Docs). Forty-seven percent of respondents use open source tools like Bugzilla, Selenium, Cucumber, and RSpec, and 9% build their own. (Figure 4)
When it comes to managing test data, 64% create test data manually, and only 7% use a commercial test data solution. (Figure 5)
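For the majority who build test data by hand, even a small script can remove much of the drudgery while keeping real customer data out of test environments. Below is a minimal sketch using Python and the open source Faker library; Faker is not named in the survey, and the record fields shown are hypothetical.

```python
# A sketch of scripted test data generation with Faker, one open source
# alternative to manually creating test data from scratch.
from faker import Faker

Faker.seed(42)  # fix the seed so generated data is reproducible across runs
fake = Faker()

def make_customer_records(count):
    """Generate synthetic customer rows containing no real production data."""
    return [
        {
            "name": fake.name(),
            "email": fake.email(),
            "address": fake.address(),
            "signup_date": fake.date_this_decade().isoformat(),
        }
        for _ in range(count)
    ]

if __name__ == "__main__":
    for record in make_customer_records(3):
        print(record)
```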
Survey results indicate that increasing the number of automated tests is something a lot of testers want. Let's look at which tests you currently automate (at least occasionally) and which tend to be automated infrequently or not at all.
Sixty-nine percent of respondents automate functional verification tests at least some of the time. Sixty-nine percent also automate load and performance testing with some degree of regularity, with 22% automating load and performance testing more than three quarters of the time. Security and user acceptance testing have the lowest frequency of automation: 43% and 35% of respondents, respectively, report some level of automation, while 57% and 65% say they never automate security or user acceptance testing. (Figure 6)
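For teams looking to move functional verification into the automated column, here is a minimal sketch of such a test written with Selenium WebDriver, one of the open source tools respondents named. The URL, element IDs, and expected strings are hypothetical placeholders, not something taken from the survey.

```python
# A sketch of an automated functional verification test using Selenium.
from selenium import webdriver
from selenium.webdriver.common.by import By

def test_login_flow():
    driver = webdriver.Chrome()  # assumes a chromedriver is on PATH
    try:
        driver.get("https://example.com/login")  # placeholder URL
        # Verify the page rendered with the expected title.
        assert "Login" in driver.title
        # Exercise the form via hypothetical element IDs.
        driver.find_element(By.ID, "username").send_keys("qa_user")
        driver.find_element(By.ID, "password").send_keys("not-a-real-password")
        driver.find_element(By.ID, "submit").click()
        # A successful login should land the user on the dashboard.
        assert "dashboard" in driver.current_url
    finally:
        driver.quit()
```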
Figure 4: QA Management/Test Execution Tools (more than one answer per respondent possible)

Figure 5: How Do You Create and Manage Test Data? (more than one answer per respondent possible)

Figure 6: Percent of Automated Test Scripts, by test type (none, 1%-50%, or 51%-100% automated)
Handling Defects
When it comes to feeling good about what you do, 57% report being very confident to completely confident in the quality of the products they release. (Figure 7) But as we know, defects happen. The question is: Where do they happen the most? Well, the good news is 59% say they seldom find defects in production. Forty-one percent always find defects during the testing stage, 41% frequently find defects during user acceptance testing, and 33% frequently find defects during development. (Figure 8)
Figure 7: How Confident Are You in Your Product's Quality? (completely, very, somewhat, or not confident)

Figure 8: Where and How Often Do You Find Defects? (always, very often, frequently, seldom, or never; in production, during user acceptance, during the test stages, during development, and early in the project)
Where Do Testers Spend Their Time?
Based on our survey results, testers spend their time in three ways: navigating development delays and missing functionality; dealing with unplanned, non-test activities; and, of course, testing.
When development delays threaten to bring testing to a halt, three mitigation techniques seem to be the most often utilized, at least to some degree. Ninety-one percent say they have been known to slip the release date on occasion, 86% say they've brought in additional resources to keep the release date, and 71% have gone ahead and released only what has been tested. (Figure 9)
When missing functionality is holding up end-to-end testing, 55% write mocks to emulate the missing functionality, 54% wait to test until the missing code is available, and 6% use a commercial service virtualization tool. (Figure 10)
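For context, here is a minimal sketch of the hand-written mocking approach the majority reported, using Python's standard unittest.mock module; the payment gateway dependency and the checkout() function are hypothetical examples, not drawn from the survey.

```python
# A sketch of emulating a missing dependency so end-to-end testing
# can proceed before the real component is delivered.
from unittest.mock import Mock

def checkout(cart_total, gateway):
    """Code under test: charges the card via a gateway that doesn't exist yet."""
    response = gateway.charge(amount=cart_total)
    return response["status"] == "approved"

def test_checkout_with_mocked_gateway():
    # Stand in a Mock that returns the response shape the spec promises.
    gateway = Mock()
    gateway.charge.return_value = {"status": "approved", "txn_id": "T-1"}

    assert checkout(49.99, gateway) is True
    # Verify the missing component would have been called correctly.
    gateway.charge.assert_called_once_with(amount=49.99)
```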
Figure 9: How Do You Mitigate Delays?

Technique                                   Never   Rarely/Sometimes   Half the time   Frequently   Always
Slip the release date                         9%          44%               26%            18%         3%
Bring in more testing resources              14%          52%               17%            15%         2%
Release only what has been tested            29%          38%               12%            15%         6%
Release without testing to desired level     32%          39%               14%            13%         1%

Figure 10: How Do You Deal with Missing Functionality or Code? (more than one answer per respondent possible)
Manually write simulation mocks: 55%
Defer testing until dependent code is available: 54%
Descope testing of that functionality: 33%
Use open source mocking solutions: 12%
Other: 8%
Use a commercial service virtualization solution: 6%
Everyone knows the frustration of not getting work done because of unplanned activities and distractions like email, impromptu meetings, management requests, fires that need to be put out, and bugs that need to be fixed right now. But how much time are testers really spending on this non-test work, and which activities do they find most distracting?
Almost half of respondents (49%) spend about eight hours a week on non-testing work, while a scary 36% spend 50% to 100% of their workweek not testing. Fifty-eight percent cite ad hoc requests as their biggest disrupter, 66% say audit and compliance tasks don't eat up much of their time, and 26% find unplanned meetings only moderately disruptive. (Figure 11)
Once you work around the delays and finish your myriad ad hoc assignments, it's finally time to test. So, what are testers testing, and how often? And the more interesting question: What testing tasks do testers want to spend little to no time on, and on which would they like to spend a lot of time?

Seventy-four percent spend a moderate amount of time on regression testing, 65% spend a lot of time on functional verification testing, and 15% spend no time on security testing (tsk, tsk). (Figure 12)
Figure 11: How Much of Your 40-Hour Workweek Is Spent on Unplanned, Non-Testing Activities?, with the most disruptive non-testing activities (ad hoc requests, general meetings, defect triage, audit/compliance)

Figure 12: How Much Time Do You Spend on These Testing Activities? (none, minimal, moderate, or large, by test type)
When you look at the larger project landscape, testers report the activities that take up more time than they'd like are: waiting for test assets such as requirements (59%), non-test activities such as email (53%), and rescoping test coverage due to changes (47%). (Figure 13)
When asked what one change they would like to see in their organization's culture, we received a broad range of responses. However, we did see five changes repeated frequently. (Figure 14) These are obvious pain points:
1. Increase automation
2. Involve testers early in the lifecycle
3. Increase corporate-wide awareness of testers’ value
4. Improve requirements
5. Hire more testers
Figure 13: On Which Activities Do You Spend More Time Than You'd Like? (more than one answer per respondent possible)

Figure 14: Top Five Desired Organizational Culture Changes
Increase automated testing: 32%
Involve testers earlier: 20%
Promote value of testers: 17%
Improve requirements: 17%
Hire more testers: 14%
Figure 15: Where Do Testers Want to Spend Their Time? (activities on which testers want to spend more vs. less time)
We asked testers to compare their current efforts on specific test activities to the amount of time they would like to spend on those activities. (Figure 15) On the little-to-no-time-spent side, there wasn't much difference between the current and desired efforts. On the sizable-to-large-amount-of-time end we found:
• 46% want to spend significant time creating automated tests, but 64% currently spend little to no time on it
• 41% want to spend significant time performing exploratory testing, but 55% currently spend little to no time on it
• 30% want to spend a significant amount of time executing automated tests, but 68% currently spend little to no time on it
The top five test activities that respondents say they want to spend a sizable amount to a lot of time on are:
1. Creating automated tests
2. Performing exploratory tests
3. Executing automated tests
4. Designing tests
5. Planning tests
The top five test activities that respondents say they want
to spend little to no time on are:
1. Setting up, configuring, and refreshing test environments
2. Investigating & submitting defects
3. Rerunning tests
4. Creating/refreshing test data
5. Installing and configuring test tools
Moving into Mobile
We included a few questions to gauge how testers and their organizations are faring with the new challenges presented by mobile devices and their range of platforms. When asked about the main challenges to mobile testing, 48% cite having enough time to test as the biggest challenge, but 49% say the least challenging aspect is having access to mobile devices. (Figure 16)
When it comes to investing in mobile development, organizations are putting the most resources into build automation and continuous integration. (Figure 17)
We also identified three main methods companies are using for functional testing of mobile apps and websites (Figure 18):
• Devices in hand
• Emulators
• Remote device cloud solutions
Figure 16: Ranking of Mobile Testing Challenges (most, moderately, or least challenging: availability of mobile testing experts, having enough time to test, implementing the right test process, access to mobile devices, availability of proper testing tools)

Figure 17: Where Organizations Are Investing Their Mobile Development Dollars (more than one answer per respondent possible)

Figure 18: How Do You Test across Devices and Platforms? (devices in hand, emulators, remote device cloud solution, outsourcing, crowdsourcing, other, not sure)
Conclusion
Based on the survey results, testers are spending a significant amount of time on non-test-related tasks, as well as on testing activities they don't feel are the best use of their time.

Many testers cited organizational improvements that would likely free up time for them to focus on testing and improving product quality: increasing automation, involving testers earlier in the lifecycle, improving requirements, and hiring more testers to do the work.
Do you find yourself bogged down by time-consuming busy work, bottlenecks and delays, or non-vital testing activities? How do you mitigate these challenges to ensure you release a high-quality product on time? Email us and let us know: [email protected].
Thank You

Thanks to all of the testers who took the time to complete the survey. Your input is invaluable in helping us understand where testing is heading and the challenges you are facing along the way.

Special thanks to our survey review panel: Michael Bolton, Dorothy Graham, Janet Gregory, Linda Hayes, and Karen Johnson. Your expertise and dedication to the testing craft are admirable and always appreciated.