Oooo those graphs look pretty!
How to manage performance testing when you are not a technical specialist.
Tony Simms - TMF
October 27th 09
Agenda
Background to the NSPCC Helplines programme
Procurement
Execution
Results
Open discussion
The Background
Childline was receiving around 2,500 telephone calls a day, of which only 60% were actually being answered.
We wanted to:
increase the number of calls we received
increase the number of calls we answered
increase the ways children could contact us
improve our IT infrastructure
introduce new web counselling services
Multimedia Counsellors
Multiple locations
Two brand new data centres (PDC and SDC)
Web and voice based counselling services
No volume data for web traffic
Good volume data for voice
New data centres in the process of being built
The business users insisting on evidence that we could support the service before we switched it on
Procurement
Approach to Supplier Selection
No in house tools or skill set
Attended an exhibition and approached smaller performance vendors and specialists
The name Facilita kept being mentioned
Got Facilita in to ‘pitch’
Facilita were not geared up to do the performance testing of our voice system, and recommended Empirix
Our telephony supplier recommended Empirix
Got Empirix in to ‘pitch’
Appointed Facilita and Empirix to do the work.
Possible Discussion Points
What are the questions that need to be asked of a performance test consultancy
What are the key indicators that the supplier understands performance testing and how it applies to your system
The pros and cons of not going to full competitive tender
‘All suppliers are liars’: how can we trust the salesmen
Execution
Preparation
Information on environment
Information on applications
Information (or not) on demographics
Scripting time
Management
Totally hands off
Immediate notification of serious issues
Direct access to developers and architects
Test – correct – test – report
Possible Discussion Points
What added value could have been gained by being more involved in the execution
How much technical knowledge is required to validate the performance test approach
Is performance testing as simple as test and report, or should there be significant ‘tinker time’
Should your performance testers also be your system tuners
Results
End of Test Reporting
Both companies produced reports.
One more verbal, one more graphical
Reports highlighted issues
The issues needed multiple parties to analyse
The results were not always clear cut
Both needed more time than we could give
Possible Discussion Points
What good is a report from a technical expert if you don’t have the skill set needed to understand it
How much should you expect the testers to contribute to the discussion about the solution
What do you do when the technical experts disagree
What do you do when there is no industry standard to compare your results with
Any Questions?