Quality Assessment II
Peter Dolog, dolog [at] cs [dot] aau [dot] dk
2.2.05 Intelligent Web and Information Systems, November 2, 2010




Page 1

Quality Assessment II

Peter Dolog
dolog [at] cs [dot] aau [dot] dk
2.2.05 Intelligent Web and Information Systems
November 2, 2010

Page 2

Outline

• Design models and testing
• Usage analysis
• Statistical testing

Peter Dolog, Web Engineering, Quality Assessment

Page 3

Design Models and Quality Assessment

Page 4

Metrics for navigation

• Navigation maps: size, depth, breadth, compactness
• Navigation context: coupling of nodes, intrinsic complexity

• Can be checked in navigation design models
• For example, if the model is represented in XML, the analysis can be done with XSLT
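As a minimal sketch of the idea above (in Python rather than the XSLT the slides suggest; the `<node>` schema below is hypothetical, not an actual WebML DTD), size and depth metrics can be computed directly from an XML navigation model:

```python
# Sketch: computing navigation-map metrics (size, depth) from a navigation
# model serialized as XML. The <node> schema is made up for illustration.
import xml.etree.ElementTree as ET

NAV_MODEL = """
<navigation>
  <node id="home">
    <node id="products">
      <node id="detail"/>
    </node>
    <node id="contact"/>
  </node>
</navigation>
"""

def size(node):
    """Total number of nodes in the navigation map."""
    return 1 + sum(size(child) for child in node.findall("node"))

def depth(node):
    """Length of the longest path from this node downwards."""
    children = node.findall("node")
    if not children:
        return 1
    return 1 + max(depth(c) for c in children)

root = ET.fromstring(NAV_MODEL).find("node")
print("size:", size(root))    # number of navigation nodes
print("depth:", depth(root))  # longest navigation path
```

Breadth and compactness would follow the same pattern, walking the same tree with different aggregations.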


Page 5

Correctness of navigation

Syntactic: all the constructs are correct and consistent with respect to a given syntax

Semantic:
• non-determinism
• race conditions
• deadlocks


Page 6

Usability

• Consistency: consistent use of patterns in content composition, navigation, and function invocation

• Ease of navigation: availability of links – from associations between data, plus links to the home page and other related upper-level pages, to avoid dead ends

• Low page density: the number of content units on a page


Page 7

Modify Pattern


© Springer, Web Applications Engineering 2009

Page 8

Termination Variant


© Springer, Web Applications Engineering 2009

Page 9

Other design based techniques

See the last lecture:
• White box: data-flow based
• Black box


Page 10

From last lecture: performance testing


Page 11

Testing Loop


© Mauro Andreolini, Michele Colajanni, and Riccardo Lancellotti: Web System Reliability and Performance. In Web Engineering, 2006, Springer, pages: 181-218.

Page 12

Results


© Mauro Andreolini, Michele Colajanni, and Riccardo Lancellotti: Web System Reliability and Performance. In Web Engineering, 2006, Springer, pages: 181-218.

Page 13

Additional response time testing and stressing


© Mauro Andreolini, Michele Colajanni, and Riccardo Lancellotti: Web System Reliability and Performance. In Web Engineering, 2006, Springer, pages: 181-218.

Page 14

White box testing (2)


© Mauro Andreolini, Michele Colajanni, and Riccardo Lancellotti: Web System Reliability and Performance. In Web Engineering, 2006, Springer, pages: 181-218.

Page 15

White Box Testing


© Mauro Andreolini, Michele Colajanni, and Riccardo Lancellotti: Web System Reliability and Performance. In Web Engineering, 2006, Springer, pages: 181-218.

Page 16

Web Usage Analysis


Page 17

Typical Web Log

• Client IP
• Timestamp
• Method of transaction (GET, POST, …)
• URL requested
• Protocol version (e.g. HTTP/1.1)
• HTTP return code
• Size of response returned to the client
• Cookie at the client browser
• URL of referrer
• Client used to access (e.g. Firefox, Mozilla, IE, …)
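For illustration, the fields above can be pulled out of one log line with a regular expression. This sketch assumes Apache's "combined" log format, which covers all fields except the cookie (logging cookies needs a custom LogFormat); the sample line is made up:

```python
# Sketch: parsing one entry in Apache's "combined" log format into the
# fields listed above. The sample line is invented.
import re

LINE = ('192.0.2.10 - - [02/Nov/2010:10:15:32 +0100] '
        '"GET /products/42 HTTP/1.1" 200 5120 '
        '"http://example.org/index.html" "Mozilla/5.0 (Firefox)"')

PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<timestamp>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<url>\S+) (?P<protocol>[^"]+)" '
    r'(?P<status>\d{3}) (?P<size>\d+|-) '
    r'"(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"')

entry = PATTERN.match(LINE).groupdict()
print(entry["method"], entry["url"], entry["status"])
```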


Page 18

Usage Analysis

We can mine for related pages from the requests. A typical query would be:
• give me all URLs accessed in one session by users
• rank them according to the number of occurrences of the pattern in the log

This is good for web sites with less complex, static pages. If the mined patterns correlate with the designed links, then we are fine.

However, for complex applications, a more fine-grained log structure is necessary to assess the usage
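The query above can be sketched as a small ranking of URL pairs by how often they co-occur in a session; the session data is invented:

```python
# Sketch: rank URL pairs by the number of sessions in which both appear.
# The sessions are made up for illustration.
from collections import Counter
from itertools import combinations

sessions = [
    ["/home", "/products", "/cart"],
    ["/home", "/products", "/contact"],
    ["/home", "/cart"],
]

pair_counts = Counter()
for urls in sessions:
    # count each unordered pair of distinct URLs once per session
    for pair in combinations(sorted(set(urls)), 2):
        pair_counts[pair] += 1

for pair, n in pair_counts.most_common(3):
    print(pair, n)
```

The top pairs can then be compared against the designed links to see whether the patterns correlate with them.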


Page 19

Goals

You typically want to find out
• whether your designed links are followed or not
• whether your pages are reachable
• which of the pages are not used, and why
• which data items are used
• whether there are any anomalies, e.g. invoking functions after a strange navigation sequence
• and so on


Page 20

Achieving the goals

You are able to do this if you know
• the composition of pages
• which data entities and records are in them
• how they are processed
• which operations and functions are invoked
• whether data fulfil their role (central role, access role, interconnection)
• how the links are related to the items, and so on


Page 21

Design Model

• Data schema (in the MVC Model)
• Hypertext schema (in the MVC Model and partly the Controller)
• Presentation model (in the MVC View and partly the Controller)
• …

You can extend the web log with this information


Page 22

WebML QA approach


© P. Fraternali, P.L. Lanzi, M. Matera, A. Maurino. "Model-Driven Web Usage Analysis for the Evaluation of Web Application Quality". Submitted for publication to JWE, April 2004.

Page 23

Design Schema Analysis

• Performed at design time
• Checks for some quality attributes (see the last lectures)
• There are various metrics to check for those attributes (see P. Fraternali et al.: WQA: An XSL Framework for Analyzing the Quality of Web Applications. IWWOST 2002 workshop proceedings.)


Page 24

Web Usage Analysis

• Data access analysis
• Hypertext access analysis
• Navigation analysis


Page 25

Web Usage Mining

Discovers patterns of navigation connections which were not designed

Results in rules:
• XML association rule X => Y
• XML sequential pattern, where the rule head and body are bound to a position in the log statement (indicating a temporal relation)
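As a minimal sketch of the support/confidence computation behind an association rule X => Y, using plain Python sets instead of the XML conceptual log the approach assumes; the sessions and thresholds below are made up:

```python
# Sketch: derive association rules X => Y from per-session page sets,
# keeping rules above minimum support and confidence. Data is invented.
from itertools import combinations

sessions = [
    {"A", "B", "C"},
    {"A", "B"},
    {"A", "C"},
    {"B", "C"},
]

def support(itemset):
    """Fraction of sessions containing every page in the itemset."""
    return sum(itemset <= s for s in sessions) / len(sessions)

rules = []
for x, y in combinations("ABC", 2):
    for head, body in ((x, y), (y, x)):
        sup = support({head, body})
        conf = sup / support({head})
        if sup >= 0.5 and conf >= 0.6:
            rules.append((head, body, sup, conf))

for head, body, sup, conf in rules:
    print(f"{head} => {body}  support={sup:.2f} confidence={conf:.2f}")
```

A sequential pattern miner would additionally require the head to occur before the body within the session, which is what binds the rule to positions in the log.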


Page 26

Information Extraction


© P. Fraternali, P.L. Lanzi, M. Matera, A. Maurino. "Model-Driven Web Usage Analysis for the Evaluation of Web Application Quality". Submitted for publication to JWE, April 2004.

Page 27

You can extend your web application logs as well

http://logging.apache.org/log4j/1.2/index.html

Insert a log entry about an event from the application runtime, such as a user interacting with a data unit or a part of a page, when populating it and sending it as a result of a request
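The same idea sketched with Python's stdlib `logging` instead of log4j; the hook function and the fields it records are hypothetical:

```python
# Sketch: emit an application-level event whenever a conceptual data unit
# is populated, so the web log can later be joined with design-level
# information. Function name and fields are made up for illustration.
import logging

logging.basicConfig(format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger("conceptual")
log.setLevel(logging.INFO)

def populate_data_unit(unit_id, entity, session_id):
    # Hypothetical hook called while rendering a page: record which data
    # unit was filled, for which entity, in which session.
    log.info("data-unit=%s entity=%s session=%s", unit_id, entity, session_id)
    return {"unit": unit_id, "entity": entity}  # placeholder payload

result = populate_data_unit("ProductIndex", "Product", "s-42")
```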


Page 28

Conceptual Log


© P. Fraternali, P.L. Lanzi, M. Matera, A. Maurino. "Model-Driven Web Usage Analysis for the Evaluation of Web Application Quality". Submitted for publication to JWE, April 2004.

Page 29

WebML QA approach


© P. Fraternali, P.L. Lanzi, M. Matera, A. Maurino. "Model-Driven Web Usage Analysis for the Evaluation of Web Application Quality". Submitted for publication to JWE, April 2004.

Page 30

Consistency Analysis


© P. Fraternali, P.L. Lanzi, M. Matera, A. Maurino. "Model-Driven Web Usage Analysis for the Evaluation of Web Application Quality". Submitted for publication to JWE, April 2004.

Page 31

Most Accessed Entities


© P. Fraternali, P.L. Lanzi, M. Matera, A. Maurino. "Model-Driven Web Usage Analysis for the Evaluation of Web Application Quality". Submitted for publication to JWE, April 2004.

Page 32

Missing Links


© P. Fraternali, P.L. Lanzi, M. Matera, A. Maurino. "Model-Driven Web Usage Analysis for the Evaluation of Web Application Quality". Submitted for publication to JWE, April 2004.

Page 33

Statistical Testing


Page 34

Statistical Testing

The Web:
• huge user population
• complex architecture
• renders traditional coverage-based testing less appropriate

Statistical testing:
• generation of test cases from usage data
• targeting high-risk areas
• focuses on a probability distribution from the application domain


Page 35

Outcome

• Test cases are generated randomly
• The outcome of the test is used to predict reliability given the usage profile
• The amount of resources needed is reduced significantly

Usual predictions focus on:
• reliability
• time to failure
• mean time between failures
• all are of a probabilistic nature


Page 36

Process of Statistical Testing

Construct the statistical testing models or usage models based on actual usage scenarios and frequencies.

Use these models for test case construction, selection and execution.

Analyze the test results for reliability assessment and prediction, and for decision-making.


Page 37

Usage Model

• Characterizes the application operation
• A graph: nodes are the different states of the usage model, arcs are transitions between them
• A probability distribution may be assigned to the arcs

• Operational profiles are used to represent usage models
• Helps to guide the allocation of test sequences
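A sketch of estimating the arc probabilities of such a usage model from observed navigation sequences (the sessions below are invented): each arc's probability is the fraction of times its source state was left via that arc.

```python
# Sketch: build a usage-model graph with transition probabilities from
# observed navigation sequences. Session data is made up.
from collections import Counter, defaultdict

sessions = [
    ["home", "products", "detail"],
    ["home", "products", "cart"],
    ["home", "contact"],
    ["home", "products", "detail"],
]

transitions = defaultdict(Counter)
for path in sessions:
    for src, dst in zip(path, path[1:]):
        transitions[src][dst] += 1

# normalize each state's outgoing counts into a probability distribution
probs = {
    src: {dst: n / sum(outs.values()) for dst, n in outs.items()}
    for src, outs in transitions.items()
}
print(probs["home"])
```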


Page 38

Unified Markov Model

• Based on operational profiles
• States and transitions
• Probabilities on transitions – the best estimate of actual use

For web applications: hits and goes
• visited pages or collections of related pages – states
• usage patterns/navigation patterns and their likelihood – transitions

• Can be organized hierarchically


Page 39

Example


Hao, J. and Mendes, E. 2006. Usage-based statistical testing of web applications. In Proceedings of the 6th international Conference on Web Engineering (Palo Alto, California, USA, July 11 - 14, 2006). ICWE '06, vol. 263. ACM, New York, NY, 17-24. DOI= http://doi.acm.org/10.1145/1145581.1145585

Page 40

Testing

Perform exhaustive, complete-coverage testing for the top-level model.

Perform selective testing for some lower-level models, which covers frequently visited sub-parts.

Perform more selective testing for the remaining low-level UMMs.


Page 41

Test case generation

A threshold is set up. The transitions above the threshold contribute to the critical navigation path.
• Start with a higher threshold
• Later decrease it
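A sketch of the threshold idea, assuming a usage model with the hypothetical transition probabilities below: only transitions at or above the threshold are followed, so lowering the threshold admits more (less likely) paths.

```python
# Sketch: generate candidate test paths by following only transitions
# whose probability is at or above a threshold. Model values are made up.
PROBS = {
    "home": {"products": 0.75, "contact": 0.25},
    "products": {"detail": 0.67, "cart": 0.33},
}

def critical_paths(state, threshold, prefix=None):
    """All paths from `state` using only transitions with p >= threshold."""
    prefix = (prefix or []) + [state]
    succ = {s: p for s, p in PROBS.get(state, {}).items() if p >= threshold}
    if not succ:
        return [prefix]
    paths = []
    for nxt in succ:
        paths += critical_paths(nxt, threshold, prefix)
    return paths

print(critical_paths("home", 0.5))   # only the high-probability route
print(critical_paths("home", 0.2))   # threshold lowered: more paths
```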


Page 42

Building a UMM

Web logs:
• access log
• error log


Page 43

Calculating Reliability

One possibility (the Nelson model) is to look at
• the number of errors discovered during testing
• the amount of time needed to discover them

Reliability = 1 – f/n

where f is the number of failures and n is the total number of hits

Mean time between failures = n/f
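The two estimates can be written down directly; the example numbers (5 failures over 1000 logged hits) are made up:

```python
# Nelson-model estimates: with f observed failures out of n hits,
# reliability is 1 - f/n and the mean time between failures is n/f.
def nelson_reliability(f, n):
    return 1 - f / n

def mean_time_between_failures(f, n):
    return n / f

print(nelson_reliability(5, 1000))          # 0.995
print(mean_time_between_failures(5, 1000))  # 200.0
```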
