Survey of Tools to Support Safe Adaptation with Validation
Alain Esteva-Ramirez
School of Computing and
Information Sciences
Florida International University
Bárbara Morales-Quiñones
Department of Computer
Engineering
University of Puerto Rico-Mayaguez
06/26/2007
REU Summer Program
Outline
Introduction: Testing Autonomic Systems
Motivation
Background: Safe Adaptation by Zhang et al.
Tool Classification & Criteria
Survey of Tools
Selection of Tools
Questions?
Introduction: Testing Autonomic Systems (1)

Autonomic Computing
- Automates low-level tasks/actions
- Behavior is specified as high-level policies
- Provides self-management features

Testing Autonomic Systems
- Requires testing prior to initial deployment
- Requires runtime testing, since structure and behavior can change at runtime
- The pioneers of autonomic computing stated that validation is one of the grand challenges of autonomic computing
Introduction: Testing Autonomic Systems (2)

Two approaches developed by King et al. [1]:

Replication with Validation
- Only feasible when managed resources can be replicated
- Requires the system to create and/or maintain copies of the managed resource for validation purposes
- Changes are implemented and validated on the copies

Safe Adaptation with Validation
- Validates changes resulting from self-management as part of a safe adaptation process
- Can be used when duplicating managed resources is too expensive, impractical, or impossible
- Validation occurs directly on the managed resource, during execution

[1] Towards Self-Testing in Autonomic Computing Systems
Motivation

- This survey represents preliminary work for "Testing Autonomic Computing Systems During Safe Adaptation" [3].
- The motivation stems from the need to test autonomic computing systems at runtime (i.e., to avoid the high cost of system failures).
- Since the strategy is based on safe adaptation, an investigation of tools can be useful for building dependable adaptive (and autonomic) systems.
- Many new tools/plugins have emerged from integrated development platforms and the open-source community.
Background: Safe Adaptation by Zhang et al. (1)

Safe adaptation
- Developed by Zhang et al. (WADS 2004) [2]
- Directed towards a disciplined approach to building adaptive systems

An adaptation is safe if and only if:
- It does not violate the dependencies between components
- It does not interrupt any critical communications that could result in erroneous conditions

[2] Enabling Safe Dynamic Component-Based Software Adaptation
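The two safety conditions above can be pictured as a guard that an adaptation driver evaluates before swapping out a component. A minimal sketch, assuming a simple string-keyed component model; all class and method names here are illustrative, not taken from Zhang et al.'s implementation:

```java
import java.util.*;

// Illustrative guard: an adaptation of a component is allowed only if
// (1) no other component depends on it, and (2) it is not in the middle
// of a critical communication -- the two conditions for "safe" adaptation.
class SafeAdaptationGuard {
    // dependents.get(c) = the set of components that depend on c
    private final Map<String, Set<String>> dependents = new HashMap<>();
    private final Set<String> inCriticalCommunication = new HashSet<>();

    void addDependency(String from, String to) {
        dependents.computeIfAbsent(to, k -> new HashSet<>()).add(from);
    }

    void beginCriticalCommunication(String c) { inCriticalCommunication.add(c); }
    void endCriticalCommunication(String c)   { inCriticalCommunication.remove(c); }

    // Safe iff replacing 'component' breaks no dependency and
    // interrupts no critical communication currently in progress.
    boolean isSafeToAdapt(String component) {
        boolean noActiveDependents =
            dependents.getOrDefault(component, Set.of()).isEmpty();
        boolean notCommunicating = !inCriticalCommunication.contains(component);
        return noActiveDependents && notCommunicating;
    }
}
```

In a real safe-adaptation process the system would wait for (or drive itself into) a state where both conditions hold, rather than simply refusing the adaptation.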
Background: Safe Adaptation by Zhang et al. (2)
Source: J. Zhang, B. H. C. Cheng, Z. Yang, and P. K. McKinley. Enabling safe dynamic component-based software adaptation. In WADS, pages 194–211, 2004.
Tool Classification
Dependency Analysis Tools
- Partially automate the safe adaptation process [2]
- Extract dependency relationships among and between components

Metrics Tools
- Measure efficiency based on performance metrics (memory, response time)
- Aid validation of self-optimization features
- Complexity metrics automatically generate the costs of adaptation steps

Unit Testing Tools
- Support or enhance previous unit testing work (REU 2006)

[2] Enabling Safe Dynamic Component-Based Software Adaptation
Tool Selection Criteria (1)
Dependency Analysis
- Exportable dependencies: we need a way to access the generated information, to later analyze it
- Graphical visualization: aids us in visualizing these dependencies
- Dependency cycle detection: it is important to know these cycles to accurately manage the impact of changes
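Dependency cycle detection of the kind these tools report reduces to finding a cycle in a directed graph. A minimal sketch using depth-first search; this is our own illustrative code, not taken from any of the surveyed tools:

```java
import java.util.*;

// Detects whether a directed dependency graph contains a cycle,
// using depth-first search with path marking.
class CycleDetector {
    private final Map<String, List<String>> edges = new HashMap<>();

    void addDependency(String from, String to) {
        edges.computeIfAbsent(from, k -> new ArrayList<>()).add(to);
    }

    boolean hasCycle() {
        Set<String> done = new HashSet<>();    // fully explored nodes
        Set<String> onPath = new HashSet<>();  // nodes on the current DFS path
        for (String node : edges.keySet()) {
            if (dfs(node, done, onPath)) return true;
        }
        return false;
    }

    private boolean dfs(String node, Set<String> done, Set<String> onPath) {
        if (onPath.contains(node)) return true;   // back edge found: cycle
        if (done.contains(node)) return false;    // already explored, no cycle here
        onPath.add(node);
        for (String next : edges.getOrDefault(node, List.of())) {
            if (dfs(next, done, onPath)) return true;
        }
        onPath.remove(node);
        done.add(node);
        return false;
    }
}
```

Reporting the actual cycle (not just its existence) is what makes the results actionable for impact analysis; the tools surveyed below do that for Java package and class graphs.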
Tool Selection Criteria (2)

Performance Analysis
- Load: mechanisms to check an object's memory usage
- Speed: mechanisms to evaluate the response time of method calls
- Complexity: code-complexity metrics such as cyclomatic complexity and coupling between objects

Tool Selection Criteria (3)

Unit Testing Support
- Java Unit Testing: support or enhance unit testing
- Code Coverage: support dynamic analysis of code for unit test coverage (branch and line coverage)
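The branch vs. line distinction matters because a test suite can execute every line without exercising every branch. A small illustration on a method of our own invention:

```java
// A single test calling clamp(5, 0, 10) executes every line of this
// method, giving 100% line coverage -- yet it covers only the "false"
// outcome of each condition. Full branch coverage also needs inputs
// below 'lo' and above 'hi'.
class Range {
    static int clamp(int value, int lo, int hi) {
        if (value < lo) return lo;
        if (value > hi) return hi;
        return value;
    }
}
```

The same method also illustrates the complexity criterion: with two decision points, its cyclomatic complexity is 3 (decision points plus one).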
Survey of Tools: Dependency Analysis Tools

JDepend
- Eclipse plugin that exports generated design metrics

Dependency Finder
- Independent tool
- Filters dependencies by packages, classes, or features

Code Analysis Plugin (CAP)
- Presents information through diagrams that help visualize the dependencies
Survey of Tools: Performance Metrics Tools

Test and Performance Tools Platform (TPTP)
- Provides a framework in which developers build test and performance tools that integrate with Eclipse
- Helps find problems faster and with less difficulty
- Finds performance bottlenecks and other metrics easily
- Addresses the entire test and performance life cycle, including test editing and execution, monitoring, and profiling, among other capabilities
Survey of Tools: Unit Test Support Tools

JUnitPerf
- JUnit test decorators used to measure the efficiency of JUnit tests
- Timed tests: impose an upper bound on a test's elapsed time
- Load tests: create an artificial load

Cobertura
- Provides XML coverage reports from the system level down to an individual line of code
- Provides complexity metrics
- Provides a maxmemory attribute that helps restrict the amount of memory used
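JUnitPerf's timed-test idea (fail a test that exceeds an elapsed-time bound) can be sketched in plain Java. TimedCheck below is our own illustrative decorator, not the JUnitPerf API, which instead wraps JUnit Test instances:

```java
// Illustrative decorator in the spirit of JUnitPerf's timed tests:
// run a piece of test code and report failure if it exceeds a bound.
class TimedCheck {
    private final Runnable test;
    private final long maxElapsedMillis;

    TimedCheck(Runnable test, long maxElapsedMillis) {
        this.test = test;
        this.maxElapsedMillis = maxElapsedMillis;
    }

    // Returns true if the wrapped test finished within the time bound.
    boolean run() {
        long start = System.currentTimeMillis();
        test.run();
        long elapsed = System.currentTimeMillis() - start;
        return elapsed <= maxElapsedMillis;
    }
}
```

The decorator pattern is what lets JUnitPerf layer performance assertions over existing functional tests without modifying them, which fits the criterion of enhancing prior unit testing work.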
Tool Selection
Dependency Analysis
- JDepend: exports XML dependency reports
- CAP: visualizes dependency graphs

Performance Analysis
- TPTP: addresses the entire test and performance life cycle

Unit Testing
- JUnitPerf: measures the efficiency of JUnit tests
- Cobertura: exports XML coverage reports
References
[1] King et al. Towards Self-Testing in Autonomic Computing Systems.
[2] Zhang, Cheng, Yang, and McKinley. Enabling Safe Dynamic Component-Based Software Adaptation. In WADS, pages 194-211, 2004.
[3] Survey of Tools to Support Testing Autonomic Computing Systems During Safe Adaptation.
Questions?
Questions, comments, and queries.