SOFTWARE TESTING
July/August 2009 | Volume 3

This issue of Software Testing delves into mobile application and regression testing.

CHAPTER 1: THE CHALLENGES OF TESTING WIRELESS MOBILE APPLICATIONS

CHAPTER 2: REGRESSION TESTING HOT SPOTS




EDITOR’S LETTER

Mobile applications and regression testing

WANT JOB SECURITY? This issue of Software Testing explores the rewards of being thoroughly modern, thoroughly trained and always diligent mobile application and regression testers.

In the mobile application development market, the role of regression testing is proving to be mission critical in the Web 2.0, Rich Internet Application age. Both mobile apps and regression go better with agile and Software Development Lifecycle (SDLC) approaches—as well as some creative thinking—so just being a code wizard or pretty good tester won’t get you the gold.

Those who get and stay in the mobile application training circuit could have job security for many years, according to Enterprise Management Associates (EMA) analyst Julie Craig, who wrote this issue’s lead article. Just don’t expect to quit school at any time, because mobile technologies will continue to be complex and varied, with new devices appearing faster than bubbles in seltzer water.

Beyond discussing the great career potential in mobile application testing, Craig lays out the mobile app peculiarities that require great development creativity and more savvy testing.

In this ezine’s article on testing and retesting defects, David W. Johnson connects the dots between mobile application and regression testing by citing the Blackberry blackouts of 2007 and 2008, which were “an unfortunate example of production issues that should have been detected by a regression test suite in combination with other formal testing efforts.”

Helping testers avoid common mistakes is Johnson’s key goal in this article. Too often, a user reports a bug; regression testing is done; a “fixed” report goes to the user; and the user experiences the same problem. Johnson suggests practices that can help end this embarrassing scenario.

Regression testers, like mobile application testers, must invest in continuing education due to the constant emergence of new breeds of software, such as mobile applications. This issue is a good first step on that training trail.

JAN STAFFORD, Executive Editor
[email protected]


CHAPTER 1

The challenges of testing wireless mobile applications

Mobile application testing requires not just skill, but creativity and resourcefulness. It also requires products and services specifically designed for the challenges of mobile technology. BY JULIE CRAIG

MOBILE DEVICES ARE hot, hot, hot, and applications are the reason why. The lure of always-available Web browsing, email, document viewing and instant messaging has proven to be irresistible to business and personal users alike. Not to be outdone, entertainment-related applications are flourishing as well, with mobile games, Twitter and Facebook, and photo uploads becoming fundamental elements of the new mobile lifestyle. This is reflected in the economy. Annual global sales of mobile phones in units are now roughly equal to sales of laptops, with smartphones accounting for more than 10% of the total.

Vendors are capitalizing on this growth with a host of mobile-enabled applications. Numara® Software, Inc. recently introduced Numara FootPrints Mobile, which allows support personnel to access Numara’s Service Desk software using the Web browser on their Microsoft® Windows Mobile, RIM® Blackberry® or Apple® iPhone® devices. TD Ameritrade offers mobile stock trading software that enables a customer to execute stock trades from his or her phone. Workday.com, a Software as a Service (SaaS)-based Enterprise Resource Planning (ERP) provider, recently announced in-development software that will provide access to Workday’s Human Resources (HR) and financial applications from an iPhone. Clearly there is strong demand for a growing host of mobile applications, and as a result, mobile development and testing are flourishing.

This explosion can be a bonanza for “mobile literate” development and testing professionals. The outlook for growth in this space is so favorable that job security is virtually guaranteed for the foreseeable future.


The bad news, however, is that these jobs are harder than they used to be, thanks to a wide array of handheld hardware, mobile-specific software, and the sheer force of constant change. Mobile applications aren’t your grandmother’s COBOL, and testing them requires not just skill, but creativity and resourcefulness. It also requires products and services specifically designed for the challenges of mobile technology.

MOBILE APPLICATIONS: SIMILAR TO CONNECTED APPLICATIONS

Even without mobile technology, software development is a notoriously failure-prone activity. I don’t need to go into the details, since everybody reading this is likely aware of the high percentage of software development projects that fail to deliver in terms of functionality, budget or schedule. Mobile applications add to application complexity, and in doing so, increase development and Quality Assurance (QA) risks.

Mobile development requires the same Software Development Life Cycle (SDLC) approach that adds discipline and governance to traditional software development. Agile methodologies are rapidly replacing older Waterfall processes to become the de facto standard governing the SDLC. Agile principles include incremental and iterative cycles, testing early and often, and incorporating end-user input throughout the lifecycle, and they appear to be paying off. BMC Software adopted Agile starting in 2004 and was able to bring products to market two to three times faster than the industry norm (see “Agile Management—What Managers can Learn from Agile Methodologies” at www.enterprisemanagement.com for more information on this story).

From the perspective of mobile applications, Agile methodologies are often also the best choice, primarily because of the ongoing focus on customer involvement and iterative testing. The steps required—requirements gathering, design, development, testing, and deployment—are similar for both connected and wireless applications. However, the lifecycle must be tweaked for mobile applications, and it is important to answer some key questions from the outset.

They include:

• Which mobile devices and operating system versions will this application support?

• How do we test applications to make sure they run on those platforms?

• What modifications must be made to accommodate the differences among platforms?

• How will industry innovations be supported going forward, since new mobile devices, technologies, and applications are constantly being introduced?

• How will development and testing processes accommodate the inherent differences among wireless network protocols and mobile service providers?

• How do we know how much testing is enough?


Each organization will have to answer these questions internally, based on its overall goals and objectives.

KEY DIFFERENCES

Challenges with mobile testing tend to center around several key areas. One is the wide variety of mobile platforms and the complexity of testing an application’s compatibility with all supported platforms. Another is the challenge of developing applications that compensate for the facts that the mobile device is not always on and that the wireless connection may not be as reliable as a wired one.

Another key consideration is the uniqueness of each user. Customers subscribe to different carrier networks and have different use cases. The “last mile” to their device is likely to run at different speeds and over slightly different protocols depending on user location, roaming, vendor, network, and device. The bottom line is that, if you and I are running the same mobile application, our experiences might be very different. Factoring such variations into expected testing outcomes can become quite complex.

The primary problem is one of testing every possible scenario, given that few development organizations own one of every device, version, and carrier connection. And even the most talented QA team will probably not be able to simulate every combination of these variables, especially when the idiosyncrasies of users are factored in. Fortunately, a lively marketplace has arisen around mitigating these challenges, and some of the products and services detailed below may be helpful to organizations struggling with mobile testing.
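To make the scale of the problem concrete, the following sketch (Python, and mine rather than the author’s; the handsets, carriers, and network conditions are illustrative placeholders) simply enumerates the combinations a team would theoretically need to cover and then samples a priority subset:

```python
from itertools import product

# Illustrative values only; a real matrix would come from the project's
# supported-platform list and carrier agreements.
device_os_pairs = [
    ("BlackBerry Bold", "BlackBerry OS 4.6"),
    ("iPhone 3GS", "iPhone OS 3.0"),
    ("HTC Touch Pro", "Windows Mobile 6.1"),
]
carriers = ["Carrier A", "Carrier B", "Carrier C"]
network_conditions = ["strong signal", "weak signal", "roaming"]

# Full cross-product: every combination a user could conceivably hit.
full_matrix = list(product(device_os_pairs, carriers, network_conditions))
print(f"Theoretical combinations: {len(full_matrix)}")  # 3 * 3 * 3 = 27

# In practice, a team samples the matrix by risk and market share
# rather than attempting to test every cell.
priority_subset = [
    (device, os, carrier, condition)
    for (device, os), carrier, condition in full_matrix
    if carrier == "Carrier A" and condition != "roaming"
]
for row in priority_subset:
    print(row)
```

Even this toy matrix grows quickly; add a few more handsets, OS releases, and locales and the cell count outstrips what any in-house lab can exercise, which is exactly the gap the vendors discussed below aim to fill.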

HETEROGENEITY CHALLENGES AND VENDORS’ RESPONSES

Say what you will about Microsoft; for those of us who remember the “old days” when operating systems (OSes) were virtually vendor-specific, the ubiquity of Windows has certainly made it easier to develop and test. Of course, Apple is, and always has been, an exception to the standardization rule, and many Independent Software Vendors (ISVs) develop applications for both Windows and Mac OS. This standardization has been a boon in terms of limiting platform proliferation. Assuming support for both vendors and a given number of operating system levels for each, the worst-case testing scenario is still fairly limited.


In contrast, mobile computing is just about where personal computing was 20 years ago. Let’s see: you have Symbian, Blackberry OS, Windows Mobile, Palm OS, Apple OS X, Linux, etc., etc. Add to this the fact that there are multiple browsers and little standardization across devices, throw in two or three release levels each, and the result is a fairly complex testing scenario.

This can be a problem for testing teams since, unless they have one of each combination of device, OS, version, etc., it is virtually impossible to ensure that a given application works “everywhere.” Multiple vendors are addressing this challenge with a wide array of solutions.

Some solutions are aimed at in-house testing:

• Hewlett-Packard (HP): HP is integrating mobile application testing into its existing QuickTest Professional (QTP) solution by partnering with vendors that specialize in mobile testing. The first partnership is with Jamo Solutions, which offers an end-user experience management solution for Windows Mobile devices. HP is working on other partnerships as well, including one with DeviceAnywhere. DeviceAnywhere markets both cloud-based and on-site versions of its mobile Quality of Service (QoS) solution, with support for more than 2,000 handsets and 30 mobile operators.

• Microsoft: Microsoft is a rich source of testing resources for Windows Mobile applications, with a broad range of tools supporting development and testing of wireless applications built using Microsoft technology. Microsoft’s mobile development site is a good starting point. Resources include a set of mobile emulator images that can be used with or without Microsoft Visual Studio and provide a testing platform for the Windows Mobile operating system.

• Dexterra™: Dexterra takes a different approach entirely; its flagship product is the Dexterra Concert mobile application development platform, which is available in two versions. “Enterprise Edition” is an open, standards-based development platform designed to be deployed within the enterprise. “Carrier Edition” is targeted at service providers who plan to host the solution as a service. Both platforms support a wide range of mobile devices as well as software requirements management. Dexterra promises to streamline and simplify development of mobile applications, and may be well worth evaluating for companies that are heavily invested in mobile development.


There are also a variety of options for Quality Assurance (QA) managers looking for hands-on testers for specific devices or to supplement in-house teams. This can be a lifesaver when testing schedules back up, or when solutions must be certified against actual devices instead of emulations. Two vendors offering such services are:

• uTest®: uTest is positioned squarely in the software testing realm and recently announced support for mobile applications. The company’s business model is a distinctive one: it consists of a host of software testers (more than 14,000 worldwide) who test mobile applications across a variety of operating systems and platforms. Testers get paid “by the bug.” They execute testing scenarios and report unexpected issues to the customer, who then decides whether the issue is, in fact, a defect. If so, the customer pays the tester who found the bug. If not, the customer does not pay.

This model can be valuable because human testers actually simulate the kinds of issues that are likely to arise once the software is released to the public. It is also cost-effective, in that uTest’s customers don’t pay for testing “features” the tester interprets as “bugs.”

• Keynote™ Systems, Inc.: Keynote is one of the best-known Web testing companies in the world, due in part to its frequent press releases detailing the performance of well-known online vendors. Click here for an example of a recent Keynote release.

Keynote has now expanded into mobile testing as well, leveraging testers from locations worldwide to deliver an impressive breadth of solutions. They include:

• Mobile Device Perspective (MDP): Provides feedback on how effectively mobile devices interact with their service content

• Mobile Application Perspective (MAP): Provides information about how well mobile sites function during transmission and delivery

• SITE Test System: Provides mobile operators with metrics that quantify the health of the mobile network

• Global Roamer: Provides an in-depth understanding of how applications function from various points across the world, based on testing from multiple locales

• Web Effective for iPhone: Platform to run large-scale, task-based iPhone Web usability studies

PROBLEMS AND PAYOFFS

There are obviously many more vendors and solutions in the mobile market than there is space to discuss them. The point to think about is the idea of using automation wherever possible to add efficiency to mobile development and testing.

In recent years, the cost of IT has risen to the point where, for some companies, it is limiting the ability to grow and compete.


Software development, for many companies, is a cost center within IT, and efficiencies in the development and testing spaces are as important to overall cost reduction as efficiencies in the data center.

That being said, rising IT costs aren’t due to IT specialists, developers, or testers who waste time. They are due to technology complexity. The past 10 years have seen an explosion of heterogeneous technology, with each new technology requiring specialized personnel, tools, and infrastructure. Mobile is no exception.

Mobile applications represent a “new reality” in terms of applications, in that many of the applications currently under development cannot be 100% tested by in-house personnel and resources. They will be used by literally millions of users, and it is impossible for a testing team with finite resources to duplicate every use case, platform, network, and device.

Because of this, investments in automated testing products and augmentation of in-house test teams with third-party vendors will likely pay off in terms of both testing efficiency and quality. Research has repeatedly shown that software bugs detected early in the lifecycle are much cheaper to remedy than those found in production. Delivering quality software is one way that QA teams can directly impact costs, and this has become even more true with the proliferation of mobile applications.


ABOUT THE AUTHOR:

JULIE CRAIG, a senior analyst at EMA, has over 20 years of experience in software engineering, IT infrastructure engineering and enterprise management. Julie’s areas of focus currently include best practices, configuration management, application management, service-oriented architecture (SOA) and Software as a Service (SaaS). Her goal is to interpret industry trends with a common-sense approach and a simplified outlook.


CHAPTER 2

Regression testing hot spots: Coverage, common mistakes

This article discusses regression test opportunities, coverage policies and development, as well as filling you in on five common regression testing mistakes. BY DAVID W. JOHNSON

THE REGRESSION TEST is one of the most misunderstood software testing concepts. Ineffective regression test policies lead to increased post-deployment risk and resulting business losses.

Several recent software failures should have been detected and mitigated by effective regression test suites. For example, the Blackberry blackouts of 2007 and 2008 are an unfortunate example of production issues that should have been detected by a regression test suite in combination with other formal testing efforts. The convergence of adaptive development practices (agile, extreme, etc.) and complex integrated application spaces (Internet, cloud computing, etc.) creates a growing risk profile that needs to be mitigated by a disciplined implementation of effective regression testing policies.

Testing early and often significantly reduces remediation costs while mitigating potential production risks. In this article, I will walk you through regression test opportunities, regression test coverage policies and regression test development, as well as filling you in on five common regression testing mistakes.

Regression testing is often referred to as a Test Phase or Test Type, but it is really the execution of one or more tests within a given Test Phase.


For the purposes of this article, a regression test is the execution of tests to verify that changes in the application landscape or architectural framework have not negatively impacted existing, unchanged functionality.

Generally, according to WhatIs, a regression test involves the execution of tests in a current Test Phase that validates that changes in the current application landscape—such as software, hardware, data, and/or metadata—have not negatively impacted existing, unchanged functionality.

REGRESSION TEST OPPORTUNITIES

In the past, regression testing has been closely associated with the Function Test Phase and System Test Phase. For the most part, the IT community has come to realize that testing early and often significantly reduces remediation costs while mitigating potential production issues. The following table illustrates regression test opportunities as the solution moves from being conceptual to a production release or package. The solution set can include software, hardware, data, metadata, or a mixture of these and any other aspects of the application landscape.

In the table below, Solution Set Is refers to the current maturity of the solution set, Exists As lists the current deliverables supporting the solution set, and Regression Opportunity notes the available regression test opportunities.


SOLUTION SET IS…        EXISTS AS…              REGRESSION OPPORTUNITY…

Conceptual (no code)    A Model                 Design Review Phase

Under Construction      Discrete Units          Unit Test Phase • Instrumentation

Constructed             Functional Components   Function Test Phase • Functional Test Sets

Functional              Integrated System       System Test Phase • System Test Sets

Confirmed               Package or Release      User Acceptance Test Phase • UA Test Sets

Released                Production Solution     Production Monitoring


• Unit Test Phase: Instrumentation

The first opportunity to perform substantive regression testing occurs in the Unit Test Phase. The primary actor responsible for regression testing within the Unit Test Phase is the developer. The developer accomplishes this by implementing the Adaptive development principles—namely agile, extreme and rapid application development (RAD)—of continuous integration and code instrumentation. Remediation of defects at this level of solution maturity will yield significant downstream benefits and significantly reduce remediation cost.

I have recently had the pleasure of working with several disciplined agile development teams that have fully implemented code instrumentation and continuous integration. The result of their efforts was a significant decrease in defects later in the development lifecycle.
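As a rough illustration of what continuous integration plus code instrumentation can look like at the unit level—this sketch is mine, not the author’s, and the billing function is hypothetical—a developer might guard existing behavior with assertions in the code and a small automated test that the build server runs on every commit:

```python
import unittest

def calculate_roaming_charge(minutes: int, rate_per_minute: float) -> float:
    """Hypothetical production function guarded by lightweight instrumentation."""
    # Instrumentation: fail fast on inputs that violate assumptions the
    # rest of the billing code depends on.
    assert minutes >= 0, "minutes must be non-negative"
    assert rate_per_minute >= 0, "rate must be non-negative"
    return round(minutes * rate_per_minute, 2)

class RoamingChargeRegressionTest(unittest.TestCase):
    """Unit-level regression tests; a CI job would run these on every commit."""

    def test_existing_behavior_is_unchanged(self):
        # Locks in previously verified behavior so later changes that
        # break it are caught in the Unit Test Phase, not in production.
        self.assertEqual(calculate_roaming_charge(10, 0.25), 2.50)

    def test_zero_minutes_costs_nothing(self):
        self.assertEqual(calculate_roaming_charge(0, 0.25), 0.0)

if __name__ == "__main__":
    unittest.main()
```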

• Function Test Phase: Functional Test Sets

The second opportunity to perform substantive regression testing occurs in the Function Test Phase; this is the opportunity that is most often leveraged by organizations. The primary actor responsible for regression testing within the Function Test Phase is the tester. The tester should accomplish this by implementing a suite of automated scripts that test the functional areas of the application deemed to be in scope for the current test engagement; these automated scripts should be constructed to be reusable test artifacts. Remediation of defects at this level of solution maturity will reduce remediation costs.

Manual regression testing can also be performed, but the overall return on investment is greatly reduced. Manual tests take longer to execute and are prone to failure over multiple executions. There is often not enough time in the test schedule to allow for significant manual regression testing.
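As an illustration only (the login behavior and scenario table are invented; a real suite would drive the application through a test automation tool), a reusable functional regression artifact can be as simple as a data-driven script whose scenarios are re-run, extended, or trimmed for each test engagement:

```python
# Data-driven sketch of reusable functional regression artifacts.
# validate_login() stands in for whatever interface the automated
# scripts would actually drive in the application under test.

def validate_login(username: str, password: str) -> str:
    """Hypothetical stand-in for the application behavior under test."""
    if not username or not password:
        return "error: missing credentials"
    return "welcome" if password == "correct-password" else "error: invalid login"

# Reusable regression scenarios for the in-scope functional area.
LOGIN_REGRESSION_SCENARIOS = [
    ("valid credentials", "jsmith", "correct-password", "welcome"),
    ("wrong password",    "jsmith", "bad-password",     "error: invalid login"),
    ("blank password",    "jsmith", "",                 "error: missing credentials"),
]

def run_functional_regression() -> bool:
    all_passed = True
    for name, user, pwd, expected in LOGIN_REGRESSION_SCENARIOS:
        actual = validate_login(user, pwd)
        passed = actual == expected
        all_passed = all_passed and passed
        print(f"{'PASS' if passed else 'FAIL'}: {name} -> {actual!r}")
    return all_passed

if __name__ == "__main__":
    raise SystemExit(0 if run_functional_regression() else 1)
```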

• System Test Phase: System Test Sets

The third opportunity to perform substantive regression testing occurs in the System Test Phase. The System Test Phase should include several types of testing; from a regression testing perspective, the two most important are business functionality and performance.

The primary actor responsible for regression testing business functionality within the System Test Phase is the tester.


The tester should once again accomplish this by implementing a suite of automated scripts that test the business functionality deemed to be in scope for the current test engagement; these automated scripts should be constructed to be reusable test artifacts. Ensure these tests do not replicate other function testing efforts.


AVOIDING COMMON REGRESSION TEST MISTAKES

THE MOST COMMON regression test mistakes come from the misplaced assumption that regression tests:

• ARE STATIC

» Regression test maintenance must be ongoing. Regression tests need to be maintained as IP (intellectual property) that meets the current organizational test coverage needs.

» I have worked several engagements in the last few years that have treated existing regression test suites as static artifacts. In one case, the client ended up with over five thousand test cases that could never be executed within the time frame provided for testing. Once we did a maintenance sweep of these manual regression test scripts, we had fewer than one thousand automated test cases.

• ARE A NICE-TO-HAVE

» Regression testing is only a nice-to-have if the current organizational coverage needs—such as regulatory, IT governance and risk—do not require any supportive regression test metric. This would be a rather unusual circumstance, but it does occur; e.g., an in-house reporting tool.

» Anyone who works in the testing space has seen the consequences of this mindset. In several cases I have seen clients do little or no regression testing and pay the cost in post-production support; a cost that far exceeds the cost of actually doing the testing.

• NEED TO RETEST EVERYTHING

» This is usually not a practical goal unless the nature of the application space (e.g., medical devices) and the level of risk justify a complete end-to-end regression test—the testing team simply runs out of time. Unfortunately this usually means that the “easier” tests are executed first while the “harder” regression tests are deferred. Ensure you create a test schedule that fits into the available regression test time-box and addresses the higher-priority coverage items first, as sketched after this list.
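The time-box point above can be made concrete with a small scheduling sketch. This is illustrative only—the test names, priorities, and durations are invented—but it shows the idea of filling the available regression time-box from the highest-priority coverage items down:

```python
# Illustrative only: rank regression tests by coverage priority and fill
# the available time-box from the top, so high-priority items are never
# squeezed out by "easier" tests. Names and durations are invented.

regression_suite = [
    # (test name, priority: 1 = highest, estimated minutes)
    ("end-to-end order placement",   1, 45),
    ("billing calculation accuracy", 1, 30),
    ("login and session handling",   2, 20),
    ("report formatting",            3, 25),
    ("UI cosmetic checks",           4, 15),
]

def schedule_for_time_box(suite, time_box_minutes):
    """Return the tests that fit the time-box, highest priority first."""
    scheduled, remaining = [], time_box_minutes
    for name, priority, minutes in sorted(suite, key=lambda t: (t[1], t[2])):
        if minutes <= remaining:
            scheduled.append(name)
            remaining -= minutes
    return scheduled

print(schedule_for_time_box(regression_suite, time_box_minutes=100))
# ['billing calculation accuracy', 'end-to-end order placement',
#  'login and session handling']
```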


The primary actor responsible for regression testing performance within the System Test Phase is the performance assurance test engineer. The performance assurance test engineer should accomplish this task by implementing a regression suite of performance tests and associated performance monitors; these suites should be constructed to be reusable performance test artifacts.

Manual regression testing of business functionality can also be performed, but the overall return on investment is greatly reduced. Manual tests take longer to execute and are prone to failure over multiple executions. There is often not enough time in the test schedule to allow for significant manual regression testing.
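At its core, a performance regression suite applies the same pass/fail idea to timing: capture a baseline, re-measure after each change, and flag anything that degrades beyond an agreed tolerance. A minimal sketch, with invented transaction names and numbers (a real suite would pull measurements from a load-test tool or the associated monitors):

```python
# Minimal sketch of a performance regression check: compare current
# response times against a stored baseline and flag degradations that
# exceed an agreed tolerance.

BASELINE_MS = {"login": 800, "search": 1200, "checkout": 2500}
TOLERANCE = 0.20  # allow up to 20% degradation before failing

def check_performance_regression(current_ms: dict) -> list:
    """Return the transactions that regressed beyond the tolerance."""
    regressions = []
    for transaction, baseline in BASELINE_MS.items():
        measured = current_ms.get(transaction)
        if measured is not None and measured > baseline * (1 + TOLERANCE):
            regressions.append((transaction, baseline, measured))
    return regressions

latest_run = {"login": 850, "search": 1900, "checkout": 2600}
for transaction, baseline, measured in check_performance_regression(latest_run):
    print(f"REGRESSION: {transaction} went from {baseline} ms to {measured} ms")
```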

• User Acceptance Test Phase

The final opportunity to perform regression testing occurs during the User Acceptance Test Phase. These manual tests are performed by the end user. The primary purpose of regression testing during this test phase should be assuring the end user that no unexpected changes have occurred in previously existing functionality.

REGRESSION TEST COVERAGE

Regression test coverage policies guide the organization in determining what regression test artifacts are required to be built and, more importantly, maintained. Regression test policies should be based on three key factors:

1. Regulatory Space
• Rules, regulations, laws, and standards that apply to the application space (e.g., FDA, FAA, SOX).

2. IT Governance
• Corporate governance as it applies to the regulatory space.
• Corporate governance as it applies to corporate rules, regulations, and standards.

3. Risk Profile
• The acceptable level of risk for any given aspect of the application space.

• Regulatory Space

The regulatory space can have a significant impact on the scope of regression testing within any given test phase. Managers and test leads responsible for testing need to ensure they understand their legal and fiduciary responsibilities within the regulatory space and the impacts of this responsibility on the overall scope of the testing effort.

• IT Governance

IT governance implies a system in which all stakeholders have the necessary input into the IT decision-making process, preventing IT from independently making, and later being held solely responsible for, poor decisions.


Therefore a board needs to:

• understand the overall architecture of its company’s IT applications portfolio;

• ensure that management knows what information resources are out there;

• know what condition information resources are in;

• understand what role information resources play in generating revenue.

There are several definitions of IT governance:

• Weill and Ross focus on “specifying the decision rights and accountability framework to encourage desirable behavior in the use of IT.”

• The IT Governance Institute expands the definition to include foundational mechanisms: “…the leadership and organizational structures and processes that ensure that the organization’s IT sustains and extends the organization’s strategies and objectives.”

• AS8015, the Australian Standard for Corporate Governance of Information and Communication Technology (ICT), defines corporate governance of ICT as “the system by which the current and future use of ICT is directed and controlled. It involves evaluating and directing the plans for the use of ICT to support the organization and monitoring this use to achieve plans. It includes the strategy and policies for using ICT within an organization.”

From a regression testing perspective, application of IT governance to the application landscape means the appropriate level of information must be consistently supplied to support decision making at the board level. Regression testing provides several opportunities to collect appropriate information; as IT governance practices mature, the amount of information collected needs to be assessed for completeness.

• Risk Profile

Risk is the net negative impact of a vulnerability, taking into consideration both the probability and the impact/cost if the risk becomes an issue/event. Risk management is the process of identifying risk, assessing risk, and taking steps to reduce risk to an acceptable level. Risk management often encompasses three main processes: risk assessment, risk mitigation, and risk evaluation.


Regression testing deals with supplying data to support risk evaluation. The organization should have already considered what level of risk is acceptable for a particular application. It is the responsibility of the organization to understand the basis for the current risk assessment and apply the appropriate level of regression testing rigor to meet the specified level of acceptable risk.

Acceptable levels of risk can be categorized as:

Tier 1: Critical Application Space
• Minimal or no interruptions in service are acceptable

Tier 2: Important Application Space
• Short, uncommon interruptions in service are acceptable

Tier 3: Normal Application Space
• Short, common interruptions in service are acceptable

Tier 4: Non-Critical Application Space
• Interruptions in service are acceptable
• Application can be pulled

• Regression Test Policies: Coverage

1. Regression test coverage, in combination with other formal testing efforts, shall provide sufficient test coverage to satisfy the Regulatory Space.

2. Regression test coverage, in combination with other formal testing efforts, shall provide sufficient test coverage to satisfy the current IT Governance requirements.

3. Regression test coverage, in combination with other formal testing efforts, shall provide sufficient test coverage to satisfy the current acceptable Risk Profile.

REGRESSION TEST DEVELOPMENT

Regression tests should address all meaningful:

• Code Branches: Regression testing at the level of Unit Test

• Functional Business Events: Regression testing at the level of Function Test

• End-to-End Business Events: Regression testing at the level of System Test

• User Experience: Regression testing at the level of User Acceptance Test

In each case, appropriate tests need to be designed and constructed. These tests need to be easy to maintain, with definitive pass/fail conditions, and, when possible, automated.

Regarding regression test automation, User Acceptance tests should be defined and executed by representatives of the user community; these tests are rarely automated. All other levels of regression testing should be either automated using test automation tools or instrumented using Adaptive development techniques.


On the maintenance front, regression tests are often treated as a static set of artifacts to be applied during each release cycle. Regression tests should instead be consistently reviewed and adapted to meet the current coverage needs (regulation, governance, and risk). This review should include adding to, removing, and merging regression tests in the regression test suite. If test case maintenance is not incorporated into regression testing, the tendency is to end up with an unmanageable number of tests that may not address the coverage need.

In software development, there are some key regression test policies that you can’t overlook without bad consequences. From my experience, these are the most important:

• Regression tests shall be automated using test automation tools or instrumented into the solution using Adaptive development techniques.

• Regression tests shall undergo scheduled maintenance designed to right-size the regression test suite to meet the current coverage needs, such as regulation, governance and risk.

• Regression test dependencies—plans, schedules, test data, test environments, tools, etc.—must be considered part of the regression test suite.

Regression testing needs to be incorporated into the overall testing strategy. The goal should be to mitigate risks to production caused by unexpected impacts on unchanged functionality, creating regression test “saves” early in the system development life cycle. A “save” is the detection of a defect before it reaches production, providing the opportunity to remediate the defect before it is released. The regression testing effort also needs to be right-sized to meet the corporate coverage requirements and fit into the available regression testing time-box.


ABOUT THE AUTHOR:

DAVID (DJ) W. JOHNSON is a senior test architect with over 22 years of experience in Information Technology across several industries. He has played key roles in business needs analysis, software design, software development, testing, training, implementation, organizational assessments, and support of business solutions. Over the past 12 years he has developed specific expertise in implementing “testware,” including test strategies, test planning, test automation (functional and performance), and test management solutions.