
Page 1

Automated Testing of System Software (Virtual Machine Monitors)
Tao Xie
Department of Computer Science, North Carolina State University
http://www.csc.ncsu.edu/faculty/xie/

Page 2

Automated System Software Testing

Purpose: automated testing of system code bases (e.g., virtual machine monitors) for robustness, security, functionality, and coverage.

Often highly environment-dependent software

Challenges:
- Code bases are complex
- Heavy interaction with system APIs; system behavior depends on environment state, e.g., open("/dev/tty", O_WRONLY)
- Testing requires sophisticated system setup

Page 3

Testing Environment-Dependent Software

Test inputs: method arguments, receiver object state & input environment state

Test outputs: method return values, receiver object state & output environment state

Sufficient & safe testing of software:
- generate high-covering tests
- cause no threat to the environment
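To make the notion of environment state as part of test inputs and outputs concrete, here is a minimal sketch (the Logger class and its AppendLog method are hypothetical examples, not from this work): the test must arrange file-system state before the call and check file-system state after it.

using System.IO;
using NUnit.Framework;

// Hypothetical code under test whose behavior depends on the file system.
public static class Logger
{
    public static void AppendLog(string path, string message)
    {
        if (!File.Exists(path))                    // reads input environment state
            throw new FileNotFoundException("log file missing", path);
        File.AppendAllText(path, message + "\n");  // writes output environment state
    }
}

[TestFixture]
public class LoggerTests
{
    [Test]
    public void AppendLog_AppendsToExistingFile()
    {
        // Test input includes environment state: the log file must already exist.
        string path = Path.Combine(Path.GetTempPath(), "app.log");
        File.WriteAllText(path, "start\n");

        Logger.AppendLog(path, "event");           // method arguments

        // Test output includes environment state: the file's new contents.
        Assert.IsTrue(File.ReadAllText(path).Contains("event"));
    }
}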

Page 4

Application of Automated Testing Tools

- Dynamic symbolic execution tools (e.g., Microsoft Pex for C# and CREST for C) generate input method arguments and receiver object states
- Empirical study: applied these tools to Xen, CodePlex Client, and the NUnit framework, which interact heavily with the file-system environment
- Observed result: the test-generation tools failed to generate high-covering test inputs
- Identified problem: the tools cannot construct the required input environment states

Page 5

Problems
- P1: Generating input environment state is beyond the scope of test-generation tools.
- P2: Arbitrary program inputs generated by test-generation tools can pollute or threaten the environment state.

// Example code under test: code coverage requires many cases of environment state
public void Add(string localPath, bool recursive, SourceItemCallback callback)
{
    Guard.ArgumentNotNullOrEmpty(localPath, "localPath");
    if (fileSystem.DirectoryExists(localPath))
        AddFolder(localPath, recursive, callback, true);
    else if (fileSystem.FileExists(localPath))
        AddFile(localPath, callback, true);
    ......
}

// Safe testing: the code under test deletes files and directories
bool OnBeforeAddItem(SourceItem item)
{
    ......
    if (item.ItemType == ItemType.File)
        File.Delete(item.LocalName);
    else
        Directory.Delete(item.LocalName);
    ......
    return (answer == "y" || answer == "a");
}
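To make P1 and P2 concrete for this example, the sketch below shows a conventional unit test for Add written against the real file system: the test must first create the directory on disk (P1), and because other code paths in the client (such as OnBeforeAddItem above) call Directory.Delete on item paths, arbitrary generated inputs can damage real environment state (P2). The SourceControlClient class name is an assumption for illustration, not taken from the CodePlex Client code.

using System.IO;
using NUnit.Framework;

[TestFixture]
public class AddTests
{
    [Test]
    public void Add_OnExistingDirectory_TakesTheAddFolderBranch()
    {
        // P1: input environment state must be set up on the real disk,
        // which is beyond the scope of test-generation tools.
        string dir = Path.Combine(Path.GetTempPath(), "codeplex-add-test");
        Directory.CreateDirectory(dir);
        File.WriteAllText(Path.Combine(dir, "a.txt"), "content");

        try
        {
            var client = new SourceControlClient();  // assumed class name
            client.Add(dir, true, null);              // recursive add, no callback
            // Assertions on the resulting pending-change list omitted here.
        }
        finally
        {
            // P2: a generated input pointing at a real directory could be
            // deleted by the code under test; here the cleanup is manual.
            Directory.Delete(dir, true);
        }
    }
}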

Page 6

Outline
- Mock objects as a solution to the identified problems
- Challenges
- Proposed approach
- Preliminary results
- Future work

Page 7

Mock Objects
- Used to simulate the required environment, avoiding interaction with the real environment
- Benefits: enable unit testing, increase code coverage, ensure safe testing
- Challenge: non-trivial to implement a mock object

// Code under test in the mock-object-based testing approach
public void Add(string localPath, bool recursive, SourceItemCallback callback)
{
    Guard.ArgumentNotNullOrEmpty(localPath, "localPath");
    if (mockFileSystem.DirectoryExists(localPath))
        AddFolder(localPath, recursive, callback, true);
    else if (mockFileSystem.FileExists(localPath))
        AddFile(localPath, callback, true);
    ......
}
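For the substitution above to be possible, the code under test must depend on a file-system abstraction into which either the real implementation or a mock can be injected. The interface and class names below (IFileSystem, RealFileSystem, AddCommand) are illustrative assumptions rather than the actual CodePlex Client types; a minimal sketch:

// Illustrative file-system abstraction; the actual CodePlex Client types may differ.
public interface IFileSystem
{
    bool DirectoryExists(string path);
    bool FileExists(string path);
    void CreateDirectory(string path);
}

// Production implementation: delegates to the real environment.
public class RealFileSystem : IFileSystem
{
    public bool DirectoryExists(string path) { return System.IO.Directory.Exists(path); }
    public bool FileExists(string path) { return System.IO.File.Exists(path); }
    public void CreateDirectory(string path) { System.IO.Directory.CreateDirectory(path); }
}

// The code under test receives the abstraction, so a test can pass a
// MockFileSystem instead of touching the real file system.
public class AddCommand
{
    private readonly IFileSystem fileSystem;

    public AddCommand(IFileSystem fileSystem)
    {
        this.fileSystem = fileSystem;
    }

    // Add(localPath, recursive, callback) then branches on
    // fileSystem.DirectoryExists / fileSystem.FileExists as shown above.
}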

Page 8

Mock Objects (cont.)
- An incomplete or incorrect implementation causes false alarms
- Implementing a sophisticated mock object is non-trivial: a tedious task!
- Our solution: a systematic approach to building a mock object that passes given tests that previously failed because of an insufficient mock object

// Incorrect implementation
public bool DirectoryExists(string path)
{
    return false;
}

public void CreateDirectory(string path)
{
    listOfCreatedDir.Add(path);
}

// Correct implementation
public bool DirectoryExists(string path)
{
    return listOfCreatedDir.Contains(path);
}

public void CreateDirectory(string path)
{
    if (!listOfCreatedDir.Contains(path))
        listOfCreatedDir.Add(path);
}
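A unit test such as the following minimal sketch (assuming a MockFileSystem class exposing the two methods above) separates the two implementations: it fails against the incorrect mock, whose DirectoryExists always returns false, and passes once DirectoryExists consults the list of created directories.

using NUnit.Framework;

[TestFixture]
public class MockFileSystemTests
{
    [Test]
    public void DirectoryExists_AfterCreateDirectory_ReturnsTrue()
    {
        var fs = new MockFileSystem();        // assumed mock class

        fs.CreateDirectory(@"c:\work\project");

        // Fails with the incorrect mock; passes with the correct one.
        Assert.IsTrue(fs.DirectoryExists(@"c:\work\project"));
    }
}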

Page 9

Approach
- Moca (MOCk Assistant) follows Test-Driven Development (TDD) to systematically build a high-quality mock object that is sufficient to achieve effective testing of the code under test without causing any false alarms
- Moca makes use of PUTs (Parameterized Unit Tests)
- PUTs are conventional unit tests (TUTs) generalized with parameters; they encode functional specifications
- The Microsoft Research tool Pex accepts these PUTs and generates high-covering concrete tests
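As a rough illustration of what a PUT looks like, the sketch below encodes the "created directories exist" specification as a parameterized test. The attributes and helper classes follow the public Pex framework as commonly documented, but treat the exact API details, and the MockFileSystem class, as assumptions.

using Microsoft.Pex.Framework;

[PexClass]
public partial class MockFileSystemPUTs
{
    // A parameterized unit test: Pex explores values of 'path' symbolically
    // and emits concrete, high-covering conventional tests from this one PUT.
    [PexMethod]
    public void CreateThenExists(string path)
    {
        PexAssume.IsNotNull(path);
        PexAssume.IsTrue(path.Length > 0);

        var fs = new MockFileSystem();        // assumed mock class

        fs.CreateDirectory(path);

        // The functional specification, encoded as an assertion.
        PexAssert.IsTrue(fs.DirectoryExists(path));
    }
}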

Page 10

Approach (cont.)
Input to Moca:
- A set of conventional unit tests (failing due to an insufficient mock object)
- The environment that needs to be mocked
- A set of PUTs

Moca assists developers in building a mock object that can be used to replace the real-environment interactions

Page 11

Approach (cont.)

Legend for the approach overview figure:
TUT – conventional unit tests
CUM – class to mock
ENV – environment
MUM – method to mock
CUT – code under test

Page 12

Preliminary Results
Application to a real-world application, CodePlex Client

Results: Moca can assist developers in building a mock object that is
- effective in achieving high coverage, without false alarms
- sufficient, when compared to a naive implementation
- less complex, and thus incurring less effort, than a manual sophisticated implementation

Page 13

Summary
- Identified problems with automated testing of environment-dependent software
- Conducted an empirical study to show the benefits of using mock objects and to identify challenges in building mock objects
- Proposed an approach based on the TDD methodology to build mock objects
- Demonstrated the feasibility and benefits of the proposed approach
- Also developed new techniques for test generation

Page 14

Key Outcomes
Relevance to military/DoD:
- An undergraduate student, Justin Gorham, is working as a summer intern with the Fort Hood Army Electronic Proving Ground (EPG) team, applying Pex and our extensions to Army code bases.
- A PhD student, Kunal Taneja, is working as a summer intern at the FDA, applying Pex and our extensions to a DoD code base for regulatory purposes (mocking databases).

Publications:
[AST 09] Madhuri R. Marri, Tao Xie, Nikolai Tillmann, Jonathan de Halleux, and Wolfram Schulte. An Empirical Study of Testing File-System-Dependent Software with Mock Objects. In Proceedings of the 4th International Workshop on Automation of Software Test (AST 2009), Business and Industry Case Studies, pp. 149-153, May 2009.
[Mutation 09] Tao Xie, Nikolai Tillmann, Jonathan de Halleux, and Wolfram Schulte. Mutation Analysis of Parameterized Unit Tests. In Proceedings of the 4th International Workshop on Mutation Analysis (Mutation 2009), pp. 177-181, April 2009.
[SUITE 09] Madhuri R. Marri, Suresh Thummalapenta, and Tao Xie. Improving Software Quality via Code Searching and Mining. In Proceedings of the First International Workshop on Search-Driven Development: Users, Infrastructure, Tools and Evaluation (SUITE 2009), pp. 33-36, May 2009.
[DSN 09] Tao Xie, Nikolai Tillmann, Jonathan de Halleux, and Wolfram Schulte. Fitness-Guided Path Exploration in Dynamic Symbolic Execution. To appear in Proceedings of the 39th Annual IEEE/IFIP International Conference on Dependable Systems and Networks (DSN 2009), June-July 2009.
[ESEC/FSE 09] Suresh Thummalapenta, Tao Xie, Nikolai Tillmann, Jonathan de Halleux, and Wolfram Schulte. MSeqGen: Object-Oriented Unit-Test Generation via Mining Source Code. To appear in Proceedings of the 7th Joint Meeting of the European Software Engineering Conference and the ACM SIGSOFT Symposium on the Foundations of Software Engineering (ESEC/FSE 2009), August 2009.

Page 15

Other Related Funding
Related new funding over the SOSI project period:
- NSF CAREER Award: 5 years (Aug 09-July 14), $425,000, "Cooperative Developer Testing with Test Intentions"

Other ongoing support:
- ARO Award: 3 years (Sept 08-Aug 11), $300,000, "Mining Program Source Code for Improving Software Quality"
- NSF SoD Award: 3 years (Jan 08-Dec 10), $245,000, "Collaborative Research: SoD-TEAM: Designing Tests for Evolving Software Systems"
- NSF CyberTrust Award: 3 years (Aug 07-July 10), $227,275, "CT-ISG: Collaborative Research: A New Approach to Testing and Verification of Security Policies"

Page 16

Future Directions
Test generation:
- Guided exploration of paths [DSN 09]
- Method-sequence generation [ESEC/FSE 09]
- Security (attack)/access-control test generation (w/ NIST)
- Performance testing
- Embedded/network/db/SOA-app test generation

Dealing with environments:
- Fully automate Moca
- Domain-specific mock-object tools/libraries (e.g., file system, database, network, hardware environments)

Test oracles:
- Detection of insufficiency of assertions
- Inference of normal behavior as approximate oracles

Page 17

Questions?

More info on the research of the NCSU Automated Software Engineering Group:

http://www.csc.ncsu.edu/faculty/xie/research.htm

http://www.csc.ncsu.edu/faculty/xie/publications.htm

Recent industry impact:
- Our Fitnex strategy [DSN 09] has been integrated into Microsoft Research's Pex as its default strategy (second half of 2008; download count of 5,600)