
Empirical Evaluation of Cloud-based Testing Techniques: A Systematic Review

Priyanka
Thapar University, India
[email protected]

Inderveer Chana
Thapar University, India
[email protected]

Ajay Rana
Amity University, Noida, India
[email protected]

ABSTRACT
Software testing is a challenging activity for many software engineering projects, especially for large-scale systems. The number of test cases can range from a few hundred to several thousand, requiring significant computing resources and lengthy execution times. Cloud computing offers the potential to address both of these issues: it offers resources such as virtualized hardware, effectively unlimited storage, and software services that can aid in reducing the execution time of large test suites in a cost-effective manner. In this paper we report on a systematic review of cloud-based testing techniques proposed by other researchers and published in major software engineering journals and conferences. Research papers were gathered from various scholarly databases using the provided search engines within a given period of time. A total of 82 research papers are analyzed in this systematic review, and we classified them into four categories according to the issues they address. We identified that the majority of the research papers focused on Cloud-based Testing and Issues (38 papers), and 23 papers focused on Cloud-based Testing Frameworks. By looking at the areas existing researchers have focused on, gaps and untouched areas of cloud-based testing can be discovered.

Categories and Subject Descriptors
D.2.5 [Testing and Debugging]

General Terms
Software Testing, Privacy-aware Testing, Metamorphic Testing, Symbolic Execution, Security Testing

Keywords
Cloud-based Testing, Performance Testing, Testing cloud services, Cloud Testing

1. INTRODUCTION
Cloud computing is a model that provides convenient, on-demand access to a shared pool of configurable computing resources, such as networks, servers, storage, applications and services, that can be rapidly provisioned and released with minimal management effort or service provider interaction. Leading companies such as IBM, Microsoft, Google, and Amazon have a vested interest in cloud computing [1]. As such, several applications, platforms, and infrastructures are being developed to facilitate the delivery of computing resources as services over the Internet [2] [3] [4]. It is anticipated that by 2015 more than 75% of Information Technology infrastructure will be purchased as a service from service providers [5]. In 2010, Gartner estimated that "the cloud service market will reach $150.1 billion in 2013". A recent study by Market Research Media forecasts that U.S. government spending on cloud computing is entering an explosive growth phase at about 40% CAGR over the next six years, with expenditure passing $7 billion by 2015. Merrill Lynch estimates that within the next five years, the annual global market for cloud computing will surge to $95 billion. Although much work is being done to model and build cloud applications and services, there is significantly less research devoted to testing them [6] [7]. Recent research from Fujitsu [8], shown in Figure 1, suggests that testing and application development rank second (57%) as the most likely workload to be put into the cloud, after websites (61%). Several factors account for the migration of testing to the cloud [9] [6] [10] [11] [12] [13] [14] [15] [16] [17]:
(i) Testing is a periodic activity and requires new environments to be set up for each project. Test labs in companies typically sit idle for long periods, consuming capital, power and space.
(ii) Testing is considered an important but non-business-critical activity. Moving testing to the cloud is seen as a safe bet because it doesn't involve sensitive corporate data and has minimal impact on the organization's business-as-usual activities.
(iii) Applications are increasingly becoming dynamic, complex, distributed and component-based, creating a multiplicity of new challenges for testing teams. For instance, mobile and Web applications must be tested for multiple operating systems and updates, multiple browser platforms and versions, different types of hardware and a large number of concurrent users to understand their performance in real time. The conventional approach of manually creating in-house testing environments that fully mirror these complexities and multiplicities consumes huge capital and resources.

Figure 1: Top Application of Cloud [8]

The benefits of cloud-based testing can be enumerated as below [18][19][20][21]:
(i) Testing in the cloud leverages the cloud computing infrastructure, reducing the unit cost of computing while increasing testing effectiveness.
(ii) Cloud-based testing service providers offer a standardized infrastructure and pre-configured software images that are capable of reducing errors considerably.
(iii) The non-cost factors include utilities such as on-demand flexibility, freedom from holding assets, enhanced collaboration, greater levels of efficiency and, most important, reduced time-to-market for key business applications.

ACM SIGSOFT Software Engineering Notes, May 2012, Volume 37, Number 3. DOI: 10.1145/2180921.2180938

This study is interested in finding the state of research in cloud-based testing. As there are not many relevant high-quality primary studies on cloud testing, a systematic mapping study is recommended [22]. A mapping study "involves a search of the literature to determine what sorts of studies addressing the systematic review question have been carried out, where they are published, in what databases they have been indexed, what sorts of outcomes they have assessed, and in which populations" [23]. This paper aims to provide an overview of the state of research in the area of software testing over the cloud, with the intention of identifying future research areas by looking at the gaps and untouched areas. The paper is organized as follows. Section 2 describes the research method used to conduct the mapping study. Section 3 reports the results of the study. Section 4 describes threats to validity. Section 5 concludes the work and suggests future research directions.

2. RESEARCH METHOD

2.1 Research Questions
This review aims at summarizing the current state of the art in cloud-based testing research by proposing answers to the following questions:

(i) What is cloud testing, what are its facets, and why migrate software testing to the cloud?

(ii) What are the characteristics of applications suitable for online software testing?

(iii) What are the different approaches to cloud-based testing?

(iv) What are the challenges and benefits of testing as a service on the cloud?

(v) Who are the existing providers of cloud testing, and what are the specific characteristics of their tools?

2.2 Sources of Information
In order to gain a broad perspective, as recommended in Kitchenham's guidelines [22], we searched widely in electronic sources. The databases covered are:

• ACM Digital Library (<portal.acm.org>)

  • IEEE Xplore (<ieeexplore.ieee.org>)

• Springer LNCS (<www.springer.com/lncs>)

• Google Scholar

• Inspec (<www.theiet.org/publishing/inspec/>)

These databases cover the most relevant journals and conference and workshop proceedings within software engineering, as confirmed by Dyba et al. [24].

2.3 Search Criteria
Based on the research questions in section 2.1, a set of keywords and their synonyms were defined as search strings. As research on cloud-based testing is still new, no specific year range was included in the search. Table 1 lists the keywords used as well as their synonyms.

Table 1: Search Keywords and Synonyms

Keywords                | Synonyms
Test Cloud              | Cloud Testing
Online Software Testing | TaaS
Testing over Cloud      | Testing in Cloud

2.4 Study Selection
A set of inclusion and exclusion criteria were defined for the selection process. These criteria were then used during searching in the mapping study process. Figure 2 illustrates the mapping study process. The systematic review process started with defining the research questions as stated in section 2.1. The next step was to define the search keywords, listed in Table 1. Using the keywords, searching was carried out on the selected repositories using the provided search engines. Once the list of research papers was obtained, papers were excluded according to their title, abstract and conclusion. At the same time, papers returned by multiple repositories were removed to eliminate redundancy. Finally, reference analysis was conducted to ensure that referenced papers were not missed, as some publications might have been missed during the keyword-based search or during exclusion based on title and abstract. By looking at the references at the end of each paper, relevant papers missed earlier can be included in the mapping study.
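The selection procedure above can be sketched as a small filtering pipeline. This is an illustrative sketch only; the record fields, example titles and the substring-matching rule are assumptions for illustration, not artifacts of the actual study.

```python
# Illustrative sketch of the study-selection pipeline: merge repository
# results, eliminate redundancy, then exclude off-topic papers.
# Record fields and example data are hypothetical.

KEYWORDS = {"test cloud", "cloud testing", "online software testing",
            "taas", "testing over cloud", "testing in cloud"}

def matches_keywords(paper):
    """Keep papers whose title or abstract mentions any search keyword."""
    text = (paper["title"] + " " + paper["abstract"]).lower()
    return any(kw in text for kw in KEYWORDS)

def select_studies(results_per_repository):
    # 1. Merge the result lists from all repositories.
    merged = [p for results in results_per_repository for p in results]
    # 2. Eliminate redundancy: the same paper returned by several repositories.
    unique = {p["title"].lower(): p for p in merged}.values()
    # 3. Exclude papers based on title/abstract screening.
    return [p for p in unique if matches_keywords(p)]

acm = [{"title": "Testing in Cloud Environments", "abstract": "..."}]
ieee = [{"title": "Testing in Cloud Environments", "abstract": "..."},
        {"title": "Compiler Optimizations", "abstract": "loop unrolling"}]
selected = select_studies([acm, ieee])
print(len(selected))  # → 1 (the duplicate and the off-topic paper are removed)
```

In practice reference analysis would then add back relevant papers cited by the selected set, a step omitted here for brevity.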

3. RESULTS AND DISCUSSION
A total of 82 research papers related to Testing as a Service were returned by the mapping study search process. The research papers were then categorized into four groups: Cloud-based Testing, Automated Test Case Generation, Testing Frameworks and Cloud Application Development. Figure 3 illustrates the breakdown of research papers according to the four categories. Based on Figure 3, the majority of the research papers

Figure 3: Breakdown of Research Paper Category

focused on Cloud-based Testing (38 papers), followed by Testing Frameworks (23 papers); Automated Test Case Generation contributed 9 papers and Cloud Application Development 12 papers. Figure 4 shows the number of research publications made per year for each paper category. Overall, research publications on Testing as a Service (TaaS) grew from two publications in 2004 to 31 publications by December 2011. Based on Figure 4, Cloud-based Testing and Testing Frameworks show the highest increase in publications compared to the other categories. The publications were also classified by forum type, similar to the systematic review conducted by Endo and Simao (2010). The forum


Figure 2: Study Selection Procedure (define research question → define search keywords → search repositories → exclude based on title, abstract and conclusion → eliminate redundancy → reference analysis)

Figure 4: Number of Research Publications per year in each category

types are workshops, conferences and symposiums, journals and whitepapers. The largest share of the publications are whitepapers (37 papers), followed by conferences and symposiums (27 papers), journals (15 papers) and workshops (3 papers). We observed a large number of whitepapers as compared to research publications in conferences and journals; hence it can be concluded that academic research in this area is scarce. This is illustrated in Figure 5.

Figure 5: Number of Publications in different Forums

The following section discusses the findings of the mapping study for each focus area as categorized in Figure 3.
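As a quick arithmetic check on these counts, the 78% share of whitepapers plus conference and symposium papers cited in the conclusion follows directly from the forum-type breakdown:

```python
# Forum-type counts reported in the study (out of 82 papers).
forums = {"whitepapers": 37, "conferences and symposiums": 27,
          "journals": 15, "workshops": 3}
total = sum(forums.values())
# Combined share of whitepapers and conference/symposium papers.
non_journal_share = (forums["whitepapers"]
                     + forums["conferences and symposiums"]) / total
print(total, round(non_journal_share * 100))  # → 82 78
```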

3.1 Cloud Based Testing
Based on the papers gathered, testing in the cloud, or cloud testing, can be viewed from three different angles: (i) testing at different levels, e.g. performance testing, security testing, of online SaaS or non-SaaS software; (ii) testing infrastructure and platforms across the different deployment models of the cloud, i.e. public, community, private or hybrid clouds; (iii) testing of the cloud itself. Cloud environments should be tested and measured for their performance, availability, security and scalability in order to support efficient delivery of services [25]. A majority of the research papers focused on cloud-based testing: of the 82 papers, 38 discussed cloud-based testing. In order to analyze the test generation approaches, the following questions were defined:

(i) What are the issues related to cloud based testing?

(ii) What are the different approaches to implement cloud based testing?

We have classified cloud-based testing into two categories, as shown in Table 2:

• Cloud Based Testing and issues.

• Offerings by Potential Providers.

Based on Table 2, 21 papers discussed benefits, needs, issues and factors for adopting cloud-based testing. Only 2 papers illustrated steps, techniques and methodologies to migrate testing to the cloud, and 15 papers covered the approaches adopted by different vendors of cloud-based testing. The challenges of cloud-based testing are as follows [26] [27] [28] [11] [29] [30]: (i) lack of standards; (ii) security in the public cloud; (iii) SLAs; (iv) usage; (v) planning; (vi) performance.

3.2 Testing Models
Several authors have proposed models for cloud-based testing. Table 3 lists the testing frameworks, their year and their approach. As shown in Table 3, we have divided them into seven categories:

• Model based Testing.

• Performance Testing.

• Symbolic Execution.

• Fault Injection Testing.

• Random Testing.

• Privacy Aware Testing.

• Others.

As shown in Table 3, there are five papers each on model-based testing and performance testing, and two authors each have explored symbolic execution and privacy-aware testing of cloud applications. There is one paper each addressing fault injection testing and random testing. Two papers focused on security or privacy testing of the cloud environment. Four authors have implemented software testing using virtualized environments. Takayuki Banzai et al. [64] proposed a


Table 2: Cloud based Testing

Category: Cloud Based Testing and issues
Somenath Nag [31]               | 2011 | On-demand testing
Leah Muthoni Riungu et al. [32] | 2010 | Online software testing
Jerry Gao et al. [27]           | 2011 | Cloud-based application testing
IBM [6]                         | 2011 | IBM Testing Services for Cloud - Application Virtualization
Sunil Joglekar [10]             | 2009 | Agile testing
Sidharth Ghag [33]              | 2011 | Testing on Azure
Leah Muthoni Riungu et al. [28] | 2010 | Research issues of cloud computing imposed on software testing
Ali Babar et al. [34]           | 2011 | Engineering aspects of migrating SOA systems
K. Priyadarsini et al. [35]     | 2011 | Architecture of testing service
AppLabs [11]                    | 2009 | Benefits of cloud computing in the area of testing
Leo van der Aalst [29]          | 2010 | Provider's service models and process models
Impetus [12]                    | 2011 | Cloud utilization during testing and its challenges
Perfecto Mobile [13]            | 2011 | Cloud-based mobile testing
Ganesh Moorthy [36]             | 2010 | SOA testing automation framework
Rajesh Mansharamani [37]        | 2008 | Virtual production environment
Ewald Roodenrijs [38]           | 2010 | Cloud testing benefits and challenges
Jinesh Varia [39]               | 2010 | Migration steps to cloud
CSC [14]                        | 2011 | Testing methodology adopted by CSC
Tauhida Parveen et al. [26]     | 2010 | Migration of software testing to the cloud
Cognizant [15]                  | 2011 | Factors affecting testing in the cloud
Impetus [16]                    | 2011 | Cloud computing in the product development life cycle
TUI [17]                        | 2011 | Cloud testing

Category: Offerings by Potential Providers
Siemens [40]      | 2011 | SiTEMPPO
CSS [41]          | 2009 | ClouTestGo
Wipro [6]         | 2010 | Assurance services on the cloud
HP [42]           | 2010 | Accelerated Test Service, Performance Test Service, Application Security Testing and SAP Test Service
Veracode [43]     | 2010 | Cloud-based application security service
ITKO [44]         | 2011 | LISA DevTest CloudManager
CloudTesting [45] | 2011 | Browser-based testing
PushToTest [46]   | 2011 | Selenium-based functional testing, load testing
IBM [47]          | 2010 | On-demand on-premises testing
SauceLabs [48]    | 2011 | Selenium-based on-demand testing
SkyTap [49]       | 2011 | Development, unit, integration, system, user acceptance and performance testing
uTest [50]        | 2011 | On-demand load testing, security testing, functionality testing
VMLogix [51]      | 2011 | Desktop, dashboards testing, test creation, defect tracking, test automation
SOASTA [52]       | 2011 | CloudTest, performance testing of web and mobile applications
Sogeti [29]       | 2011 | CloudTest, STaaS
Rackspace [53]    | 2011 | Test and dev on cloud

software testing environment, called D-Cloud, using cloud computing technology and virtual machines with a fault injection facility. Sebastian Gaisbauer et al. [18] implemented VATS, which uses HP LoadRunner as a load generator and provides the foundation for an automatic performance evaluator for cloud environments. Tariq M. King et al. [68] proposed a collaborative framework approach that combines the development of an automated test harness for a cloud service with the delivery of test support as a service (TSaaS). Marco A. S. Netto et al. [60] use a virtualized environment for load generation aimed at performance testing. Both Liviu Ciortea et al. [21] and Stefan Bucur et al. [63] contributed to the development of Cloud9, the first symbolic execution engine that scales to large clusters of machines. Stefan Bucur et al. [63] extended the research of Liviu Ciortea by including a new symbolic environment model that supports all major aspects of the POSIX interface, such as processes, threads, synchronization, networking, IPC, and file I/O. Alexey Lastovetsky [72] worked towards the parallel testing of computer software systems. Sasa Misailovic et al. [71] presented novel algorithms built on the Korat algorithm for constraint-based generation of structurally complex test inputs. W. K. Chan [30] proposed a metamorphic approach to online services testing. Manuel Oriol [19] developed YETI, a language-agnostic random testing tool. Gary Wassermann [70] proposed an automated input test generation algorithm, a standard concolic framework for automated test execution in cloud computing environments. Beatriz Marín [67] investigated the testing challenges of future web applications and proposed a testing methodology integrating search-based testing, model-based testing, oracle learning, concurrency testing, combinatorial testing, regression testing and coverage analysis. There are two whitepapers by CSS discussing performance engineering of web applications and the performance testing tool provided by the company.
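To illustrate the metamorphic idea behind approaches such as Chan's [30] (a minimal sketch, not the paper's actual technique): when no oracle exists for a remote service, a metamorphic relation checks that related inputs produce consistently related outputs. The `search` service below is a hypothetical stand-in, simulated locally.

```python
# Hedged sketch of metamorphic testing for an oracle-free online service.
# `search` stands in for a remote cloud service; here it runs locally.

def search(records, query):
    """Hypothetical service under test: case-insensitive substring search."""
    return [r for r in records if query.lower() in r.lower()]

def check_subset_relation(records, query, narrower_query):
    """Metamorphic relation: results for a narrower (refined) query must be
    a subset of the results for the broader query, even though the
    'correct' result set for either query alone is unknown."""
    broad = set(search(records, query))
    narrow = set(search(records, narrower_query))
    return narrow <= broad

records = ["cloud testing", "cloud computing", "unit testing"]
ok = check_subset_relation(records, "cloud", "cloud test")
print(ok)  # → True
```

A violation of such a relation signals a defect without ever needing the expected output for any single query, which is what makes metamorphic testing attractive for third-party cloud services.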

3.3 Cloud Application Development


Table 3: Testing Models

Approach: Model Based Testing
Hien Le et al. [54]            | 2011 | Model testing for component-based development
W. K. Chan et al. [55]         | 2009 | Model-based testing
Antonia Bertolino et al. [56]  | 2008 | Model-based approach to generate stubs for web services
Antonia Bertolino [57]         | 2010 | State of the art of model-based testing
Antti Jaaskelainen et al. [58] | 2008 | Model-based test generation algorithms

Approach: Performance Testing
Sebastian Gaisbauer [18] | 2008 | Automatic performance testing; VATS
CSS [59]                 | 2009 | On-demand performance testing tool
Marco A. S. Netto [60]   | 2011 | Virtualized environment for load generation aimed at performance testing
Zohar Ganon [61]         | 2009 | Large-scale performance tests of network management systems
CSS [62]                 | 2008 | Performance engineering

Approach: Symbolic Execution
Stefan Bucur [63]  | 2011 | Symbolic execution; Cloud9
Liviu Ciortea [21] | 2009 | Symbolic execution; Cloud9

Approach: Fault Injection Testing
Takayuki Banzai [64] | 2010 | Software testing environment based on fault injection; D-Cloud

Approach: Random Testing
Manuel Oriol [19] | 2010 | YETI; language-agnostic random testing tool

Approach: Privacy Aware Testing
Lin Gu [65]       | 2009 | Privacy-protected testing
Philipp Zech [66] | 2011 | Security testing

Approach: Others
Beatriz Marín [67]      | 2011 | Integration of search-based testing, model-based testing, oracle testing, concurrency testing, combinatorial testing, regression testing and coverage analysis
W. K. Chan [30]         | 2010 | Metamorphic approach to online service testing
Tariq M. King [68]      | 2010 | Virtual test environment; AST and TSaaS
T. Vengattaraman [69]   | 2010 | Cloud computing model for software testing
Gary Wassermann [70]    | 2008 | Concolic framework for automated test execution
Sasa Misailovic [71]    | 2007 | Parallel testing of code based on the Korat algorithm
Alexey Lastovetsky [72] | 2004 | Design and implementation of parallel testing

Testing vendors and customers interested in testing in the cloud want to be aware of the types of applications that can readily be tested in the cloud; this helps them better prepare for the migration of testing to the cloud. There is a need to identify the exact business areas in which Software Testing as a Service is best approached and how to actually implement it. To answer question (ii) defined in the research questions section, we have decomposed it further as follows:

(i) Will all applications run in the Cloud?

(ii) Can all of the existing applications be ported to the Cloud?

(iii) Should all new applications be developed in the Cloud?

Table 4 shows the papers reviewed for the above purpose.

3.4 Automatic Test Case Generation
Automated software testing means the automation of software testing activities [85]. These activities include the development and execution of test scripts, the verification of testing requirements, and the use of automated test tools. Automating some phases of the test process and directing the available resources towards additional testing can result in gains [86]. Berner et al. [87] estimated that most of the test cases in a project are run at least five times, and one-fourth over 20 times. Smoke tests, component tests, and integration tests in particular are repeated constantly, so there is a clear need for automation. According to Bertolino et al. [88], test automation is a significant area of interest in current testing research, aiming to improve the degree of automation either by developing advanced techniques for generating test inputs or by finding support procedures to automate the testing process. The main benefits of test automation are quality improvement, the possibility to execute more tests in less time, and fluent reuse of testware. The major disadvantage is the cost associated with developing test automation, especially in dynamic, customized environments. According to Ossi Taipale et al. [85], the optimal case for automated software testing is a standardized product with a stable, consistent platform and cases that yield unambiguous results which can be verified with minimal human intervention. In addition, as test automation requires effort to maintain and develop, the system should be easily reusable in different software projects to obtain a feasible return on the initial investment.
Table 5 lists the authors involved, the approaches and the specifications used.
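One of the simplest input-generation techniques surveyed here, random testing as in YETI [19] or the random-selection half of Fisher's approach [93], can be sketched as follows. This is a minimal illustration under assumed names; the function under test and the failure criterion are hypothetical.

```python
import random

# Hypothetical function under test, with a latent defect.
def safe_ratio(a, b):
    return a // b  # fails when b == 0

def random_test(func, trials=1000, seed=42):
    """Call `func` with random integer arguments and collect any
    unexpected exceptions as candidate failures."""
    rng = random.Random(seed)  # fixed seed for reproducible runs
    failures = []
    for _ in range(trials):
        a, b = rng.randint(-10, 10), rng.randint(-10, 10)
        try:
            func(a, b)
        except Exception as exc:
            failures.append(((a, b), type(exc).__name__))
    return failures

failures = random_test(safe_ratio)
# With 1000 random trials over [-10, 10], b == 0 is virtually always
# drawn, exposing the ZeroDivisionError.
print(len(failures) > 0)
```

Scaling this loop out over cloud instances, so that many seeded generators run the same function in parallel, is precisely the kind of workload the surveyed cloud-based testing platforms target.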

4. THREATS TO VALIDITY
The validity of this mapping study is threatened by the following issues:

• The research papers were obtained by keyword searching and reference analysis. Exclusions were made by reading the title, abstract and conclusions. However, there is a possibility that there exist papers that were missed due to the above searching and exclusion method.


Table 4: Cloud based Application Development

Radha Guha [73]          | 2010 | Agile process model
Darryl Chantry [74]      | 2009 | Mapping of enterprise applications to cloud computing platforms
Emilio P. Mancini [75]   | 2009 | PerfCloud
Carmelo Ragusa [76]      | 2011 | Automatic deployment of applications to the cloud
YANG Jie [77]            | 2009 | Profile-based approach
Bharat Chhabra [78]      | 2010 | Software engineering issues
Shigeru Hosono [79]      | 2010 | Framework for the development and operation environment for cloud applications
Gautam Shroff [80]       | 2008 | Model-driven development in the cloud
Pamela Bhattacharya [81] | 2010 | Dynamic updates for cloud and web applications
Jan S. Rellermeyer [82]  | 2009 | Model of development, deployment and management of software applications on the cloud
Pavan Yara et al. [83]   | 2009 | Distributed software development
Ranjan Sen [84]          | 2009 | Methodology for development of parallel programs

Table 5: Automatic Test Case Generation

CSS [89]                   | 2008 | Test automation framework
Kamal Zuhairi Zamli et al. | 2007 | Automated and combinatorial software testing tool
Chen Mingsong [90]         | 2006 | Automatic test case generation approach using UML diagrams
James Miller et al. [91]   | 2009 | Program dependence analysis techniques and genetic algorithms (GAs) to generate test data
William G. J. Halfond [92] | 2007 | Technique for automatically discovering web application interfaces based on a novel static analysis algorithm
Marc Fisher [93]           | 2004 | Two techniques for generating test cases, one involving random selection and one a goal-oriented approach
IMPETUS [94]               | 2011 | Test automation strategy
A. Bertolino [88]          | 2007 | Benefits of test automation
O. Taipale [85]            | 2011 | Trade-off between automated and manual software testing

• Judgmental errors in classifying the papers into each category.

5. CONCLUSION AND FUTURE WORK
This review paper has described the systematic mapping process and discussed the results of the mapping study and the threats to its validity. The systematic mapping process was described in terms of the research questions defined, the search keywords used, and the exclusion and inclusion criteria. The results of the study were classified into several categories and analyzed. The paper has shown the areas of research within cloud-based testing by answering the questions that were defined initially. Most of the publications are whitepapers, conference and symposium papers (78%), which is an indication that the research area is still immature. In this paper we presented results from a systematic review of empirical evaluations of cloud-based testing techniques. Related to our research questions, we have identified that: 1) there are 23 empirically evaluated techniques on cloud-based testing published; 2) these techniques can be classified according to the input needed for the technique and the type of testing performed. We have identified a lack of standards in test tools and in their connectivity and interoperability to support the TaaS service. Although five researchers each have explored the fields of performance testing and model-based testing, only two papers have been found that address security testing of cloud-based software. More work needs to be done to improve the current state of research in cloud-based testing.

6. REFERENCES
[1] A. Weiss, "Computing in the clouds," netWorker, vol. 11, pp. 16–25, Dec. 2007.

[2] Amazon Inc., "CloudWatch: Monitoring for AWS cloud resources." Available at http://aws.amazon.com/cloudwatch/ [Last Accessed: Dec 2011].

[3] Google Inc., "Google Apps: Reliable, secure, online applications." Available at http://apps.google.com [Last Accessed: Dec 2011].

[4] Microsoft Corporation, "Microsoft Windows Azure platform: Operating system as an online service." Available at http://www.microsoft.com/windowsazure/ [Last Accessed: Dec 2011].

[5] Gartner Research, "Gartner Research, June 2007." ID number: G00148987.

[6] IBM, "IBM testing services for cloud application virtualization overview." IBM Whitepaper. Available at http://www-935.ibm.com/services/us/en/business-services/ [Last Accessed: Dec 2012].

ACM SIGSOFT Software Engineering Notes Page 6 May 2012 Volume 37 Number 3
DOI: 10.1145/180921.2180938 http://doi.acm.org/10.1145/180921.2180938

[7] B. P. Rimal, E. Choi, and I. Lumb, "A taxonomy and survey of cloud computing systems," in Proceedings of the 2009 Fifth International Joint Conference on INC, IMS and IDC, NCM '09, (Washington, DC, USA), pp. 44–51, IEEE Computer Society, 2009.

[8] Fujitsu, "Confidence in cloud grows, paving way for new levels of business efficiency." Fujitsu Press Release, November 2010. Available online at http://www.fujitsu.com/uk/news/ [Last Accessed: September 2011].

[9] Cognizant, "Taking testing to the cloud." Cognizant Whitepaper. Available at http://www.cognizant.com/Taking-Testing-to-the-Cloud.pdf [Last Accessed: September 2011].

[10] S. Joglekar, "A foray into cloud-based software testing." Patni Whitepaper. Available at www.igatepatni.com/ [Last Accessed: Dec 2012].

[11] AppLabs, "Testing the cloud." AppLabs Whitepaper. Available at http://www.applabs.com/html/ [Last Accessed: December 2011].

[12] L. van der Aalst, "Leveraging cloud capabilities for product testing." Impetus Whitepaper. Available at www.impetus.com/Home/Downloads [Last Accessed: September 2011].

[13] P. Mobile, "Top 10 reasons why enterprises should adopt a cloud-based approach for mobile application testing." Perfecto Mobile Whitepaper. Available at www.perfectomobile.com/cloudbasedapproach.pdf [Last Accessed: December 2011].

[14] L. G. Briefing, "Testing applications in cloud." CSC Whitepaper. Available at http://www.assets1.csc.com [Last Accessed: December 2011].

[15] V. K. Mylavarapu, "Taking testing to the cloud." Cognizant Whitepaper. Available at www.cognizant.com/taking-testing-to-the-cloud [Last Accessed: December 2011].

[16] V. K. Mylavarapu, "Leveraging cloud capabilities for product testing." Impetus Whitepaper. Available at www.impetus.com [Last Accessed: December 2011].

[17] TUI, "Cloud testing." TUI Infotec Whitepaper. Available at http://www.tui-infotec.com/global/ [Last Accessed: December 2011].

[18] S. Gaisbauer, J. Kirschnick, N. Edwards, and J. Rolia, "VATS: Virtualized-Aware Automated Test Service," in Quantitative Evaluation of Systems, 2008. QEST '08. Fifth International Conference on, pp. 93–102, September 2008.

[19] M. Oriol and F. Ullah, "YETI on the cloud," in Software Testing, Verification, and Validation Workshops (ICSTW), 2010 Third International Conference on, pp. 434–437, April 2010.

[20] G. Candea, S. Bucur, and C. Zamfir, "Automated software testing as a service," in Proceedings of the 1st ACM Symposium on Cloud Computing, SoCC '10, (New York, NY, USA), pp. 155–160, ACM, 2010.

[21] L. Ciortea, C. Zamfir, S. Bucur, V. Chipounov, and G. Candea, "Cloud9: A software testing service," SIGOPS Operating Systems Review, vol. 43, pp. 5–10, January 2010.

[22] B. Kitchenham and S. Charters, "Guidelines for performing systematic literature reviews in software engineering," Engineering, vol. 2, no. EBSE 2007-001, p. 65, 2007.

[23] M. Petticrew and H. Roberts, Systematic Reviews in the Social Sciences: A Practical Guide, vol. 54. Wiley-Blackwell, 2006.

[24] T. Dyba, "Applying systematic reviews to diverse study types: An experience report," Search, no. 7465, pp. 225–234, 2007.

[25] Spirent, "The ins and outs of cloud computing and its impact on the network." Spirent Whitepaper. Available at http://www.spirent.com/ /media/White [Last Accessed: Dec 2011].

[26] T. Parveen and S. Tilley, "When to migrate software testing to the cloud?," in Proceedings of the 2010 Third International Conference on Software Testing, Verification, and Validation Workshops, ICSTW '10, (Washington, DC, USA), pp. 424–427, IEEE Computer Society, 2010.

[27] J. Gao, X. Bai, and W.-T. Tsai, "Cloud testing: Issues, challenges, needs and practices," Software Engineering: An International Journal, vol. 1, pp. 9–23, September 2011.

[28] L. M. Riungu, O. Taipale, and K. Smolander, "Research issues for software testing in the cloud," in Proceedings of the 2010 IEEE Second International Conference on Cloud Computing Technology and Science, CLOUDCOM '10, (Washington, DC, USA), pp. 557–564, IEEE Computer Society, 2010.

[29] L. van der Aalst, "Software testing as a service (STaaS)." Sogeti Whitepaper. Available at www.sogeti.com/staas [Last Accessed: March 2010].

[30] W. Chan, S. Cheung, and K. Leung, "A metamorphic testing approach for online testing of service-oriented software applications," Int. J. Web Service Res., vol. 4, no. 2, pp. 61–81, 2007.

[31] S. Nag, "Business case for cloud-based testing." BlueStar Whitepaper. Available at http://www.bsil.com/Resource-Center/ [Last Accessed: Dec 2012].

[32] L. M. Riungu, O. Taipale, and K. Smolander, "Software testing as an online service: Observations from practice," in Proceedings of the 2010 Third International Conference on Software Testing, Verification, and Validation Workshops, ICSTW '10, (Washington, DC, USA), pp. 418–423, IEEE Computer Society, 2010.

[33] S. Ghag, "Software validations of application deployed on Windows Azure." Infosys Whitepaper. Available at www.infosys.com/cloud/ [Last Accessed: Dec 2011].

[34] M. A. Babar and M. A. Chauhan, "A tale of migration to cloud computing for sharing experiences and observations," in Proceedings of the 2nd International Workshop on Software Engineering for Cloud Computing, SECLOUD '11, (New York, NY, USA), pp. 50–56, ACM, 2011.

[35] K. Priyadarsini, "Cloud testing as a service," IJAEST, vol. 6, pp. 173–177, September 2011.

[36] G. Moorthy, "Test automation framework for SOA applications." Infosys Whitepaper. Available at isqtinternational.com/Ganesh [Last Accessed: December 2011].

[37] R. Mansharamani, "Virtual production environments for software development and testing." TCS Whitepaper. Available at http://www.tcs.com/SiteCollectionDocuments/ [Last Accessed: December 2011].

[38] E. Roodenrijs, "Testing on the cloud." Sogeti Whitepaper. Available at http://www.isqtinternational.com/ [Last Accessed: December 2011].

[39] J. Varia, "Migrating your existing applications to the AWS cloud." Amazon Web Services Whitepaper. Available at http://www.media.amazonwebservices.com/ [Last Accessed: December 2011].

[40] Siemens, "SiTEMPPO." Siemens Whitepaper. Available at https://www.cee.siemens.com/SiTEMPPO [Last Accessed: December 2011].

[41] J. Janeczko, "Journey into the cloud." TUI Whitepaper. Available at http://www.tui-infotec.com/global/ [Last Accessed: December 2011].

[42] CSS, "CloudTestGo: Cloud-based performance testing." CSS Whitepaper. Available at csscorp.com/services/cloud-services/ [Last Accessed: December 2011].

[43] Wipro, "Wipro's cloud testing service." Wipro Whitepaper. Available at taas.wipro.com/ [Last Accessed: December 2011].

[44] P. Ashwood, "Why your IT organization should move from traditional application testing to testing-as-a-service." HP Whitepaper. Available at http://h20195.www2.hp.com/ [Last Accessed: December 2011].

[45] Veracode, "Agile security." Veracode Whitepaper. Available at http://www.espiongroup.com/ [Last Accessed: December 2011].

[46] J. Michelsen, "Service virtualization and the devtest cloud." iTKO Whitepaper. Available at http://www.itko.com/resources/ [Last Accessed: December 2011].

[47] IBM, "Development and testing using cloud computing." IBM Whitepaper. Available at http://www-935.ibm.com/services/ [Last Accessed: December 2011].

[48] SauceLabs, "OnDemand cloud testing tool." Available at http://saucelabs.com/ [Last Accessed: September 2011].

[49] Skytap, "Skytap cloud testing tool." Available at http://skytap.com/ [Last Accessed: September 2011].

[50] uTest, "uTest cloud testing tool." Available at http://utest.com/ [Last Accessed: September 2011].

[51] VMLogix, "VMLogix LabManager cloud testing tool." Available at http://vmlogix.com/ [Last Accessed: September 2011].

[52] SOASTA, "Software testing for startups." SOASTA Whitepaper. Available at http://www.cloudconnectevent.com/downloads/ [Last Accessed: December 2011].

[53] Rackspace, "Test and dev cloud." Rackspace Whitepaper, December 2011. Available at http://www.rackspace.com/testdev/ [Last Accessed: December 2011].

[54] H. Le, "Testing as a service for component-based developments," in Proceedings of the Third International Conference on Advances in System Testing and Validation Lifecycle, VALID 2011, pp. 46–51, IARIA, 2011.

[55] W. Chan, L. Mei, and Z. Zhang, "Modeling and testing of cloud applications," in Services Computing Conference, 2009. APSCC 2009. IEEE Asia-Pacific, pp. 111–118, Dec. 2009.

[56] A. Bertolino, G. Angelis, L. Frantzen, and A. Polini, "Model-based generation of testbeds for web services," in Proceedings of the 20th IFIP TC 6/WG 6.1 International Conference on Testing of Software and Communicating Systems: 8th International Workshop, TestCom '08 / FATES '08, (Berlin, Heidelberg), pp. 266–282, Springer-Verlag, 2008.

[57] A. Bertolino, W. Grieskamp, R. M. Hierons, Y. L. Traon, B. Legeard, H. Muccini, A. Paradkar, D. S. Rosenblum, and J. Tretmans, Model-Based Testing for the Cloud, pp. 1–11, 2010.

[58] A. Jaaskelainen, M. Katara, A. Kervinen, H. Heiskanen, M. Maunumaa, and T. Paakkonen, "Model-based testing service on the web," in Proceedings of the 20th IFIP TC 6/WG 6.1 International Conference on Testing of Software and Communicating Systems: 8th International Workshop, TestCom '08 / FATES '08, (Berlin, Heidelberg), pp. 38–53, Springer-Verlag, 2008.

[59] CSS, "Cloud-based performance testing." CSS Whitepaper. Available at csscorp.com/services/cloud-services/ [Last Accessed: December 2011].

[60] M. A. S. Netto, S. Menon, H. V. Vieira, L. T. Costa, F. M. de Oliveira, R. Saad, and A. Zorzo, "Evaluating load generation in virtualized environments for software performance testing," in Proceedings of the 2011 IEEE International Symposium on Parallel and Distributed Processing Workshops and PhD Forum, IPDPSW '11, (Washington, DC, USA), pp. 993–1000, IEEE Computer Society, 2011.

[61] Z. Ganon and I. E. Zilbershtein, "Large-scale performance tests of network management systems."

[62] CSS, "Performance engineering of web applications." CSS Whitepaper. Available at http://www.csscorp.com/downloads/whitepapers/ [Last Accessed: December 2011].

[63] S. Bucur, V. Ureche, C. Zamfir, and G. Candea, "Parallel symbolic execution for automated real-world software testing," in Proceedings of the Sixth Conference on Computer Systems, EuroSys '11, (New York, NY, USA), pp. 183–198, ACM, 2011.

[64] T. Banzai, H. Koizumi, R. Kanbayashi, T. Imada, T. Hanawa, and M. Sato, "D-Cloud: Design of a software testing environment for reliable distributed systems using cloud computing technology," in Proceedings of the 2010 10th IEEE/ACM International Conference on Cluster, Cloud and Grid Computing, CCGRID '10, (Washington, DC, USA), pp. 631–636, IEEE Computer Society, 2010.

[65] L. Gu and S.-C. Cheung, "Constructing and testing privacy-aware services in a cloud computing environment: Challenges and opportunities," in Proceedings of the First Asia-Pacific Symposium on Internetware, Internetware '09, pp. 2:1–2:10, ACM, 2009.

[66] P. Zech, "Risk-based security testing in cloud computing environments," in Proceedings of the 2011 Fourth IEEE International Conference on Software Testing, Verification and Validation, ICST '11, (Washington, DC, USA), pp. 411–414, IEEE Computer Society, 2011.

[67] B. Marin, T. Vos, G. Giachetti, A. Baars, and P. Tonella, "Towards testing future web applications," in Research Challenges in Information Science (RCIS), 2011 Fifth International Conference on, pp. 1–12, May 2011.

[68] T. M. King and A. S. Ganti, "Migrating autonomic self-testing to the cloud," in Proceedings of the 2010 Third International Conference on Software Testing, Verification, and Validation Workshops, ICSTW '10, (Washington, DC, USA), pp. 438–443, IEEE Computer Society, 2010.

[69] T. Vengattaraman, P. Dhavachelvan, and R. Baskaran, "A model of cloud based application environment for software testing," CoRR, vol. abs/1004.1773, 2010.

[70] G. Wassermann, D. Yu, A. Chander, D. Dhurjati, H. Inamura, and Z. Su, "Dynamic test input generation for web applications," in Proceedings of the 2008 International Symposium on Software Testing and Analysis, ISSTA '08, (New York, NY, USA), pp. 249–260, ACM, 2008.


[71] S. Misailovic, A. Milicevic, N. Petrovic, S. Khurshid, and D. Marinov, "Parallel test generation and execution with Korat," in Proceedings of the 6th Joint Meeting of the European Software Engineering Conference and the ACM SIGSOFT Symposium on the Foundations of Software Engineering, ESEC-FSE '07, pp. 135–144, 2007.

[72] A. Lastovetsky, "Parallel testing of distributed software," Inf. Softw. Technol., vol. 47, pp. 657–662, July 2005.

[73] R. Guha and D. Al-Dabass, "Impact of Web 2.0 and cloud computing platform on software engineering," in Proceedings of the 2010 International Symposium on Electronic System Design, ISED '10, (Washington, DC, USA), pp. 213–218, IEEE Computer Society, 2010.

[74] D. Chantry, "Mapping applications to the cloud," The Architecture Journal, vol. 19, pp. 2–9, September 2011.

[75] E. P. Mancini, M. Rak, and U. Villano, "PerfCloud: Grid services for performance-oriented development of cloud computing applications," in Proceedings of the 2009 18th IEEE International Workshops on Enabling Technologies: Infrastructures for Collaborative Enterprises, WETICE '09, (Washington, DC, USA), pp. 201–206, IEEE Computer Society, 2009.

[76] J. Yang, J. Qiu, and Y. Li, "A profile-based approach to just-in-time scalability for cloud applications," in Proceedings of the 2009 IEEE International Conference on Cloud Computing, CLOUD '09, (Washington, DC, USA), pp. 9–16, IEEE Computer Society, 2009.

[77] Y. Khalidi, "Building a cloud computing platform for new possibilities," Computer, vol. 44, pp. 29–34, March 2011.

[78] B. Chhabra, D. Verma, and B. Taneja, "Software engineering issues from the cloud application perspective," International Journal of Information Technology and Knowledge Management, vol. 2, no. 2.

[79] S. Hosono, H. Huang, T. Hara, Y. Shimomura, and T. Arai, "A lifetime supporting framework for cloud applications," in Proceedings of the 2010 IEEE 3rd International Conference on Cloud Computing, CLOUD '10, (Washington, DC, USA), pp. 362–369, IEEE Computer Society, 2010.

[80] G. Shroff, "Dev 2.0: Model driven development in the cloud," in Proceedings of the 16th ACM SIGSOFT International Symposium on Foundations of Software Engineering, SIGSOFT '08/FSE-16, (New York, NY, USA), pp. 283–283, ACM, 2008.

[81] P. Bhattacharya and I. Neamtiu, "Dynamic updates for web and cloud applications," in Proceedings of the 2010 Workshop on Analysis and Programming Languages for Web Applications and Cloud Applications, APLWACA '10, (New York, NY, USA), pp. 21–25, ACM, 2010.

[82] J. S. Rellermeyer, M. Duller, and G. Alonso, "Engineering the cloud from software modules," in Proceedings of the 2009 ICSE Workshop on Software Engineering Challenges of Cloud Computing, CLOUD '09, (Washington, DC, USA), pp. 32–37, IEEE Computer Society, 2009.

[83] P. Yara, R. Ramachandran, G. Balasubramanian, K. Muthuswamy, and D. Chandrasekar, "Global software development with cloud platforms," Software Engineering Approaches for Offshore and Outsourced Development, pp. 81–95, 2009.

[84] R. Sen, "Developing parallel programs," The Architecture Journal, vol. 19, pp. 17–23, September 2011.

[85] O. Taipale, J. Kasurinen, K. Karhu, and K. Smolander, "Trade-off between automated and manual software testing," International Journal of Systems Assurance Engineering and Management, Springer, pp. 1–12, 2011.

[86] R. Ramler and K. Wolfmaier, "Economic perspectives in test automation: Balancing automated and manual testing with opportunity cost," in Proceedings of the 2006 International Workshop on Automation of Software Test, AST '06, (New York, NY, USA), pp. 85–91, 2006.

[87] S. Berner, R. Weber, and R. K. Keller, "Observations and lessons learned from automated testing," in Proceedings of the 27th International Conference on Software Engineering, ICSE '05, pp. 571–579, 2005.

[88] A. Bertolino, "Software testing research: Achievements, challenges, dreams," in 2007 Future of Software Engineering, FOSE '07, (Washington, DC, USA), pp. 85–103, IEEE Computer Society, 2007.

[89] CSS, "A framework for test automation." CSS Whitepaper. Available at http://www.utest.com/landing-interior/framework [Last Accessed: December 2011].

[90] K. Z. Zamli, N. Ashidi, M. Isa, M. Fadel, and J. Klaib, "A tool for automated test data generation (and execution) based on combinatorial approach," International Journal of Software Engineering and Its Applications, vol. 1, no. 1, pp. 19–36, 2007.

[91] J. Miller, M. Reformat, and H. Zhang, "Automatic test data generation using genetic algorithm and program dependence graphs," Information and Software Technology, vol. 48, no. 7, pp. 586–605, 2006.

[92] C. Mingsong, Q. Xiaokang, and L. Xuandong, "Automatic test case generation for UML activity diagrams," in Proceedings of the 2006 International Workshop on Automation of Software Test, AST '06, (New York, NY, USA), pp. 2–8, ACM, 2006.

[93] W. G. J. Halfond and A. Orso, "Improving test case generation for web applications using automated interface discovery," in Proceedings of the 6th Joint Meeting of the European Software Engineering Conference and the ACM SIGSOFT Symposium on the Foundations of Software Engineering, ESEC-FSE '07, (New York, NY, USA), pp. 145–154, ACM, 2007.

[94] Impetus, "Designing a successful test automation strategy: Connecting the dots." Impetus Whitepaper. Available at www.impetus.com [Last Accessed: December 2011].
