


Engineering the Evaluation Approach to fit different Web Project/Organization Needs

Dr. Luis Olsina

• Motivation (Data, Info …)

• Generic Evaluation Approach
• Quality Modeling Framework
• Integrated M&E Strategies

• M&E Strategies (GOCAME / SIQinU)
• Conceptual Base/Framework
• M&E Process
• Methods and Tools

• Multi-level M&E Strategy/Approach
• GOCAME+

RATIONALE

GIDIS_Web, UNLPam, Argentina. E-mail: [email protected]

Montevideo, Uruguay

Agenda Part I (1 h. 20’ + 10’ for Qs): Background

• Quality and Quality Models

• Introducing the Quality Concept

• ISO 25010 Models and Views

• External Quality and Quality in Use Views/Characteristics

• The 2Q2U Quality Modeling Framework

• Models and Relationships between Views

• Conceptual Base for Measuring and Evaluating Quality

• Example: Information Suitability for a Shopping Cart

• Concepts for Non-functional Requirements (NFR): Attributes, …

• Concepts for Measurement: Metrics, …

• Concepts for Evaluation: Indicators, …

• Conclusions


Agenda Part II (1 h. 20’ + 10’ for Qs): Evaluation Approach

• Generic Evaluation Approach

• Quality Modeling Framework

• Integrated M&E Strategies

• Conceptual Base/Framework

• M&E Process

• Methods and Tools

• Two M&E Strategies: GOCAME and SIQinU

• Multi-level M&E Strategy/Approach

• Introducing GOCAME+

• Revisiting Learning Objectives

Agenda Part I: Background

• Quality and Quality Models

• Introducing the Quality Concept

• ISO 25010 Models and Views

• External Quality and Quality in Use Views/Characteristics

• The 2Q2U Quality Modeling Framework

• Relationships between Views

• Conceptual Base for Measuring and Evaluating Quality

• Example: Information Suitability for a Shopping Cart

• Concepts for Non-functional Requirements (NFR): Attributes, …

• Concepts for Measurement: Metrics, …

• Concepts for Evaluation: Indicators, …

• Conclusions


Many ISO standards deal with quality, evaluation, e.g.:

– Quality (Sw Product/System/System in Use): internal and external quality models, and quality-in-use model for software (SQuaRE: ISO 25010:2011)

– Quality (Process): process assessment and capability determination for software organizations (ISO 15504/CMMI)

– Measurement and Evaluation: The measurement process (ISO 15939:2002) / The evaluation process (ISO 14598:1998), and new SQuaRE documents

We have often observed a lack of terminological consensus across those documents: the same term with different meanings, different terms with similar meanings, absent terms, etc.

We will use some terms regarding quality, quality measurement, and quality evaluation coming from our ontology

Quality and ISO Standards

Quality vs. Project Management Variables

– Scope

● Functionalities / Services / Content to deliver

– Time (Schedule)

● Effort (person-hours)

● Calendar (working and non-working days)

– Time-to-Market

– Quality

● Product/System

– System in Use

● Process

– Capability

● Resource

– Human Skills,

– Strategies, Methods, Tools, ...

– Cost

● Budget


Quality usually has different Views (as per Garvin, 87):

– Transcendent View or Perspective

– Product View

– Producer View

– User View

– Value-based View

● quality/cost trade-off

What is Quality?


What is Quality?

● The quality concept is not simple and atomic, but multi-dimensional and contextual.

● Quality cannot be measured and evaluated directly,

– at least not in a trivial way

● Common practice assesses quality by quantifying lower-abstraction concepts, such as attributes of entities

● Given the inner complexity that the quality concept involves (e.g. its multi-dimensionality), a model is generally necessary in order to specify the quality requirements.

What is Quality?

● Quality depends on a specific project/organizational information need, i.e., on a specific purpose, user viewpoint, and context

A possible definition of Quality:

● Quality is an abstract relationship between attributes of an entity category (a product, system, process, etc., and their concrete entities) and a specific information need, which can be stated at different organizational levels.

Quality is not an absolute concept but rather a relative, multi-dimensional and contextual one


To Represent Quality (as per Olsina et al, 2008):

Defining quality is a hard job ...

Defining and specifying quality depends on the:

– Entity Category (and Entity) to be applied

● Project (Development, Maintenance, ...)

– Process

– Product, System or System-in-Use

– Resource

– Service

– User Viewpoint / Purpose

● Manager, Developer, Final User, …

– they have different information needs, priorities ...

● Understand, Improve, Predict, Control

– Context / Domain / Criticality

● Application/Project/Organization Context

● Line of Products/ Application Domain

– Other factors

Quality of an entity is hard to define and assess but it is easy to recognize


Agenda Part I: Background

• Quality and Quality Models

• Introducing the Quality Concept

• ISO 25010 Models and Views

• External Quality and Quality in Use Views/Characteristics

• The 2Q2U Quality Modeling Framework

• Relationships between Views

• Conceptual Base for Measuring and Evaluating Quality

• Example: Information Suitability for a Shopping Cart

• Concepts for Non-functional Requirements (NFR): Attributes, …

• Concepts for Measurement: Metrics, …

• Concepts for Evaluation: Indicators, …

• Conclusions

What is a Quality Model?

Quality Model

– Defined set of characteristics and the relationships between them which provide the basis for specifying quality requirements and evaluating quality

– ISO 25010 structure for a quality model


The ISO 25010 Quality Model

System/Software Product Quality comprises eight characteristics, each with sub-characteristics:

– Functional Suitability: functional completeness, functional correctness, functional appropriateness

– Performance Efficiency: time behaviour, resource utilization, capacity

– Compatibility: co-existence, interoperability

– Usability: appropriateness recognizability, learnability, operability, user error protection, user interface aesthetics, accessibility

– Reliability: maturity, availability, fault tolerance, recoverability

– Security: confidentiality, integrity, non-repudiation, accountability, authenticity

– Maintainability: modularity, reusability, analysability, modifiability, testability

– Portability: adaptability, installability, replaceability

Introduction: Perspectives or Views of Quality

Three Views of Sw. Quality (ISO 9126-1:2001 / 25010:2011)

– Internal Quality

● It can be measured and evaluated by static attributes of a product, i.e. documents such as a requirements specification, an architecture or a design, pieces of source code, and so forth.

– External Quality

● It can be measured and evaluated by dynamic properties of a system, i.e. the running code, when the module or full application is executed in a computer or network that simulates the actual environment as closely as possible

– Quality in Use

● QinU is the degree to which a software/Web application in use, used by specific users, meets their needs to achieve specific task goals with effectiveness, efficiency, freedom from risk and satisfaction in specific contexts of use

● It evaluates the degree of excellence


● Internal Quality is specified by a quality model

– the eight characteristics shown before

● It can be measured and evaluated by static attributes of documents such as a requirements specification, an architecture or a design, pieces of source code, and so forth.

● In early phases of a software or Web lifecycle, we can evaluate and control the internal quality of these early products.

● But assuring internal quality is not usually sufficient to assure external quality (EQ).

Perspectives of System Quality: ISO 25010

● External Quality is specified by a quality model

– the eight characteristics shown before

● It can be measured and evaluated by dynamic attributes/properties of the running code, i.e. when the module or full application is executed in a computer or network that simulates the actual environment as closely as possible.

● In late phases of a software lifecycle (e.g. in different kinds of testing, or even in the operational state of a software or Web application), we can measure, evaluate and control the EQ of these late products.

● But assuring external quality is not usually sufficient to assure quality in use.

Perspectives of System Quality: ISO 25010


● Quality in Use is the final user’s view of quality

● Quality in use is the “degree to which a product or system can be used by specific users to meet their needs to achieve specific goals with effectiveness, efficiency, freedom from risk and satisfaction in specific contexts of use”.

Perspectives of System Quality: ISO 25010


● Attributes of internal and external quality of a software product are rather the cause; attributes of quality in use are rather the effect.

● QinU evaluates the degree of excellence, and can be used to validate the extent to which the software or Web application meets specific user needs.

● Considering appropriate attributes of the software/Web specifications for internal quality is a prerequisite to achieving the required external behavior of the system, and considering appropriate attributes of the system w.r.t. that external behavior is a prerequisite to achieving QinU


Quality in Use Model

Instance of QinU MODEL with associated Attributes

Agenda Part I: Background

• Quality and Quality Models

• Introducing the Quality Concept

• ISO 25010 Models and Views

• External Quality and Quality in Use Views/Characteristics

• The 2Q2U Quality Modeling Framework

• Relationships between Views

• Conceptual Base for Measuring and Evaluating Quality

• Example: Information Suitability for a Shopping Cart

• Concepts for Non-functional Requirements (NFR): Attributes, …

• Concepts for Measurement: Metrics, …

• Concepts for Evaluation: Indicators, …

• Conclusions


Proposed 2Q2U framework and models

● Our framework for modeling NFRs for internal/external Quality, Quality in Use, Actual Usability and User Experience is called:

2Q2U (for short)

● 2Q2U extends the ISO 25010 standard, adding 4 (sub-)characteristics

– Information Quality (I/E Quality)

– Learnability in Use (QinU)

– Sense of Community (QinU)

– Communicability (QinU)

● Modifies the ISO 25010 Functional Suitability characteristic:

– Functional Quality (I/E Quality)

● Adds two new concepts: Actual Usability and UX, to which characteristics and sub-characteristics can be related in a flexible way:

– Actual Usability (QinU)

– Actual User Experience (QinU)

2Q2U Models/Characteristics (as per Olsina et al, 2012):


2Q2U New QinU Characteristics/Concepts

2Q2U v2.0 QinU (sub-)characteristic, its definition, and the related ISO QinU concept:

– Actual User Experience: degree to which specified users can achieve actual usability, freedom from risk, and satisfaction in a specified context of use. (Absent calculable concept in ISO 25010)

– Actual Usability (synonym: usability in use): degree to which specified users can achieve specified goals with effectiveness, efficiency, learnability in use, and without communicability breakdowns in a specified context of use. (Absent calculable concept, but a similar concept, i.e. usability in use, was in the ISO 25010 draft)

– Learnability in Use: degree to which specified users can learn efficiently and effectively while achieving specified goals in a specified context of use. (Absent calculable concept)

– Communicability: degree to which specified users can achieve specified goals without communicative breakdowns in the interaction in a specified context of use. (Absent calculable concept)

– Sense of Community: degree to which a user is satisfied when meeting, collaborating and communicating with other users with similar interests and needs. (Absent calculable concept)

2Q2U Modeling Framework: Relationships


Agenda Part I: Background

• Quality and Quality Models

• Introducing the Quality Concept

• ISO 25010 Models and Views

• External Quality and Quality in Use Views/Characteristics

• The 2Q2U Quality Modeling Framework

• Relationships between Views

• Conceptual Base for Measuring and Evaluating Quality

• Example: Information Suitability for a Shopping Cart

• Concepts for Non-functional Requirements (NFR): Attributes, …

• Concepts for Measurement: Metrics, …

• Concepts for Evaluation: Indicators, …

• Conclusions

Content Suitability. Some Sub-characteristics/Attributes

External Quality Requirements (for Shopping Cart Entity)

1 Usability

1.1 Understandability

1.1.1 Icon/label ease to be recognized

1.1.2 Information grouping cohesiveness

1.2 Learnability

1.2.1 ………………………………………………………..

1.3 Operability

1.3.1 Control permanence

1.3.2 Expected behaviour of the controls

2 Content Quality (Infoquality)

2.1 Content suitability

2.1.1 Basic information coverage

2.1.1.1 Line item information completeness

2.1.1.2 Product description appropriateness

2.1.2 Coverage of other contextual Information

2.1.2.1 …………………………………………………………

Appropriateness Recognizability
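The requirements tree above can be sketched as a nested data structure. This is a minimal illustration (not the authors' tooling): characteristics contain sub-characteristics, whose leaves are the measurable attributes; all names come from the tree above, and the helper function is hypothetical.

```python
# Sketch of the External Quality requirements tree as nested dictionaries;
# leaf lists hold the measurable attributes. Elided branches are omitted.
eq_requirements = {
    "1 Usability": {
        "1.1 Understandability": [
            "1.1.1 Icon/label ease to be recognized",
            "1.1.2 Information grouping cohesiveness",
        ],
        "1.3 Operability": [
            "1.3.1 Control permanence",
            "1.3.2 Expected behaviour of the controls",
        ],
    },
    "2 Content Quality": {
        "2.1 Content suitability": {
            "2.1.1 Basic information coverage": [
                "2.1.1.1 Line item information completeness",
                "2.1.1.2 Product description appropriateness",
            ],
        },
    },
}

def leaf_attributes(node):
    """Recursively collect the leaf attributes (measurable properties)."""
    if isinstance(node, list):
        return list(node)
    leaves = []
    for child in node.values():
        leaves.extend(leaf_attributes(child))
    return leaves

print(len(leaf_attributes(eq_requirements)))  # 6 leaf attributes in this sketch
```

Such a structure makes it easy to walk the model, attach metrics to leaves, and aggregate indicator values bottom-up later on.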


Example: Content Suitability definitions

Content Suitability: degree to which a product or system delivers information with the right coverage, added value, and consistency, considering the specified user tasks and goals.

Coverage: degree to which the information is appropriate, complete and concise for the task at hand for an intended user.

Appropriateness: degree to which the information coverage fits an intended user goal.

Completeness: degree to which the information coverage is the sufficient amount of information for an intended user goal.

Conciseness: degree to which the information coverage is compactly represented without being overwhelming.

Consistency: degree to which the content is consistent with the application's pieces of information with respect to the intended user goal.

Example: Content Suitability. Cúspide.com catalog


Example: Content Suitability. Shopping Cart (Before)

2.1.1.1 Line item information completeness

2.1.1.2 Product description appropriateness

1.3.2 Expected behaviour of the delete control

1.1.1 Shopping cart icon / label ease to be recognized

Example: Content Suitability. Shopping Cart (After)


Agenda Part I: Background

• Quality and Quality Models

• Introducing the Quality Concept

• ISO 25010 Models and Views

• External Quality and Quality in Use Views/Characteristics

• The 2Q2U Quality Modeling Framework

• Relationships between Views

• Conceptual Base for Measuring and Evaluating Quality

• Example: Information Suitability for a Shopping Cart

• Concepts for Non-functional Requirements (NFR): Attributes, …

• Concepts for Measurement: Metrics, …

• Concepts for Evaluation: Indicators, …

• Conclusions

Our Measurement/Evaluation Approach

Our M&E strategy (GOCAME, Goal-Oriented Context-Aware Measurement and Evaluation) is based on three capabilities:

● A measurement and evaluation framework that relies on a sound conceptual (ontological) base.

– C-INCAMI

● A process specification, i.e. the main activities that should be planned and performed for measurement, evaluation and analysis

● Specific model-based methods, techniques and tools in order to carry out the specified activities

– WebQEM, C-INCAMI_Tool …


C-INCAMI Conceptual Base/Components

Concepts for NFR

● Information Need

● Entity Category/Entity

● Attribute

● Quality, Quality in Use

– CALCULABLE CONCEPT

● External Quality Model, Quality in Use Model

– CONCEPT MODEL

The GOCAME Strategy relies on a sound conceptual (ontological) base.

Just some terms for the NFR component:


Basic Model for NFR

Concepts for NFR

● INFORMATION NEED

– Insight necessary to manage objectives, goals, risks, and problems.

● External Quality,

● Quality in Use, etc.

– For our example: “Understand (and further improve) the External Quality (w.r.t. its Usability and Info Quality) of the Cuspide.com shopping cart”

● Purpose = Understand / Improve

● User Viewpoint = final users

● Calculable Concept = External Quality

● Entity Category = e-bookstore WebApp (System)

● Entity = Cuspide.com shopping cart


● ENTITY CATEGORY

– Object category that is to be characterized by measuring its attributes.
– High-level categories: Product, System, Process, Resource, Project, System in Use, ...

● ENTITY (syno. Object)

– A concrete object that belongs to an entity category.
– Example: given the entity category (an e-bookstore Web application, whose super-category is System), a concrete object that belongs to this category is the “Cuspide.com” WebApp.

Concepts for NFR

ATTRIBUTE (syno. Property, Feature)

– A measurable physical or abstract property of an entity category.
– Note that the selected attributes are those properties relevant to the defined information need.

– For our example, an attribute name is

● “Line item information completeness”,

● defined as “degree to which the line item information coverage is the sufficient amount of data for an intended user goal”

– An attribute can be quantified (measured) by one or more direct or indirect metrics.

Concepts for NFR


CALCULABLE CONCEPT (syno. Measurable Concept)

– Abstract relationship between attributes of entity categories and information needs.
– For our example, the calculable concept is “External Quality” and two sub-concepts are “Usability” and “Content Quality”:

– External Quality

● Content Suitability

– Coverage

● Completeness ...

– For instance, the “Completeness” sub-concept is defined as “degree to which the information coverage is the sufficient amount of information for an intended user goal”.

– The calculable concept can be represented by a concept model.

Concepts for NFR

CONCEPT MODEL

– The set of sub-concepts and the relationships between them, which provide the basis for specifying the concept requirement and its further evaluation or estimation.

– The concept model type can be either:

● a standard-based model (ISO, etc.),
● an organization's own-defined model, or
● a mixture of both.

– The concept model used in the example is of the “mixture” type, based on the ISO quality-in-use model and its extension; note that the model also shows attributes combined with the sub-concepts.

Concepts for NFR


External Quality Requirements (for Shopping Cart Entity)

1 Usability

1.1 Understandability

1.1.1 Icon/label ease to be recognized

1.1.2 Information grouping cohesiveness

1.2 Learnability

1.2.1 ………………………………………………………..

1.3 Operability

1.3.1 Control permanence

1.3.2 Expected behaviour of Controls

2 Content Quality

2.1 Content Suitability

2.1.1 Basic Information Coverage

2.1.1.1 Line item information completeness

2.1.1.2 Product description appropriateness

2.1.2 Coverage of other Contextual Information

2.1.2.1 …………………………………………………………

Instantiated EQ Quality Model

(Figure callouts: Calculable Concept, Sub-Concept, Attribute; e.g. Appropriateness Recognisability)

Agenda Part I: Background

• Quality and Quality Models

• Introducing the Quality Concept

• ISO 25010 Models and Views

• External Quality and Quality in Use Views/Characteristics

• The 2Q2U Quality Modeling Framework

• Relationships between Views

• Conceptual Base for Measuring and Evaluating Quality

• Example: Information Suitability for a Shopping Cart

• Concepts for Non-functional Requirements (NFR): Attributes, …

• Concepts for Measurement: Metrics, …

• Concepts for Evaluation: Indicators, …

• Conclusions


Concepts for Metrics/Measurement

● Attribute

● Measurement

● Measure

● Metric

– Direct

– Indirect (Formula)

● Scale

– Scale Type

– Categorical, Numerical (Unit)

● Method (Procedure)

– Of Measurement/Calculation (Sw Instrument)

The GOCAME strategy relies on a sound conceptual (ontological) base.

Just some terms for the Measurement Component:

Model for Metric/Measurement


Concepts for Metric/Measurement

MEASUREMENT

● Activity that uses a metric definition in order to produce a measure's value.

MEASURE

● The number or category assigned to an attribute of an entity by making a measurement.
– A measurement activity must be performed for each metric that intervenes in the project.
– It allows recording the date/time stamp, information on the collector in charge of the measurement activity, and, for the measure, the yielded value itself.
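The Measurement/Measure distinction above can be sketched as two small records. This is an illustrative data-structure sketch, not the C-INCAMI tool's actual classes; all field names are assumptions.

```python
# Sketch of the MEASUREMENT and MEASURE concepts: a measurement activity
# applies a metric and records the timestamp, the collector, and the
# yielded measure (a number or category assigned to the attribute).
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Measure:
    value: object  # number or category assigned to the attribute

@dataclass
class Measurement:
    metric_name: str
    collector: str        # who/what performed the measurement activity
    timestamp: datetime   # date/time stamp of the activity
    measure: Measure      # the yielded value itself

m = Measurement(
    metric_name="Degree of completeness to the line item information",
    collector="evaluator-1",
    timestamp=datetime(2012, 5, 1, 10, 30),
    measure=Measure(value=2),  # ordinal category 2 = partially complete
)
print(m.measure.value)  # 2
```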

METRIC

– The defined measurement or calculation method (procedure) and the measurement scale.

● Ex.: Total Number of Unique Titles

– DIRECT METRIC (syno. single, base metric)
– A metric of an attribute that does not depend upon a metric of any other attribute.

● Ex.: Degree of completeness to the line item information

– INDIRECT METRIC (syno. hybrid, derived metric)
– A metric of an attribute that is derived from metrics of one or more other attributes.

● Ex.: Degree of Unique Titles (DUT = #UT / #TT)
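The indirect metric above (DUT = #UT / #TT) derives its value from two direct metrics: the number of unique titles and the total number of titles. A minimal sketch, with a hypothetical function name:

```python
# Indirect metric sketch: Degree of Unique Titles, derived from two
# direct metrics (#UT = unique titles, #TT = total titles).
def degree_of_unique_titles(unique_titles: int, total_titles: int) -> float:
    """DUT = #UT / #TT, the ratio of unique titles to total titles."""
    if total_titles <= 0:
        raise ValueError("total_titles must be positive")
    return unique_titles / total_titles

print(degree_of_unique_titles(8, 10))  # 0.8
```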

Concepts for Metric/Measurement


Concepts for Metric/Measurement

MEASUREMENT METHOD (syno. procedure, counting rule, protocol)

– The particular logical sequence of operations and possible heuristics specified to allow the realisation of a metric description by a measurement.
– The type of a measurement procedure can be either:

● subjective, i.e. the quantification involves human judgement, or
● objective, i.e. the quantification is based on numerical rules.

– Usually an objective measurement method can be automated or semi-automated by a software tool.

SCALE

– A set of values with defined properties.

SCALE TYPE

– The type of scale depends on the nature of the relationship between values of the scale.
– The types of scales are commonly classified into nominal, ordinal, interval, ratio, and absolute.
– The scale type of measured values affects:

● the sort of arithmetical and statistical operations that can be applied to values (e.g. we cannot add numbers on an ordinal scale)
● the admissible transformations (e.g. M' = aM for a ratio scale)

Concepts for Metric/Measurement


Categorical Scale

– A scale where the measured or calculated values are categories and cannot be expressed in units, in a strict sense.

Numerical Scale

– A scale where the measured or calculated values are numbers that can be expressed in units, in a strict sense.

UNIT (for numerical scales)

– A particular quantity defined and adopted by convention, with which other quantities of the same kind are compared in order to express their magnitude relative to that quantity.

● Examples of units: LOC, bytes, words, links, tasks, ...

Concepts for Metric/Measurement

Instantiated EQ Quality Model

AttributeAttribute

External Quality Requirements (for Shopping Cart Entity)

1 Usability

1.1 Understandability

1.1.1 Icon/label ease to be recognized

1.1.2 Information grouping cohesiveness

1.2 Learnability

1.2.1 ………………………………………………………..

1.3 Operability

1.3.1 Control permanence

1.3.2 Expected behaviour of Controls

2 Content Quality

2.1 Content Suitability

2.1.1 Basic Information Coverage

2.1.1.1 Line item information completeness

2.1.1.2 Product description appropriateness

2.1.2 Coverage of other Contextual Information

2.1.2.1 …………………………………………………………

Appropriateness Recognisability


Example for Scale/Scale type

Direct metric: Degree of completeness to the line item information

The scale specifies three categories on an ordinal scale type:

1. Incomplete, i.e. less information than category 2;

2. Partially complete, i.e. it only has the title, price, quantity, and sometimes availability fields;

3. Totally complete, i.e. it has title, author, price, quantity, added-on date, and availability.

The measurement method specification is objective, and the data collection can be done observationally or may be automated by a tool.
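The objective measurement method above can be sketched as a small classifier: inspect which fields a line item shows and assign category 1, 2, or 3. The field names follow the scale specification above; the function name and boundary handling are illustrative assumptions.

```python
# Sketch of the objective measurement method for the ordinal metric
# "Degree of completeness to the line item information".
PARTIAL_FIELDS = {"title", "price", "quantity"}
TOTAL_FIELDS = {"title", "author", "price", "quantity",
                "added on date", "availability"}

def line_item_completeness(fields_present: set) -> int:
    """Return the ordinal category (1, 2, or 3) for a line item."""
    if TOTAL_FIELDS <= fields_present:
        return 3  # totally complete
    if PARTIAL_FIELDS <= fields_present:
        return 2  # partially complete
    return 1      # incomplete

print(line_item_completeness({"title", "price", "quantity", "availability"}))  # 2
```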

Example: Content Suitability. Shopping Cart (Before)

2.1.1.1 Line item information completeness


Measures

External Quality Requirements (Measure; EI value; P/GI value):

Global Quality Indicator (P/GI 61.97%)
1 Usability (P/GI 60.88%)
1.1 Understandability (P/GI 83%)
1.1.1 Icon/label ease to be recognized (EI 100%)
1.1.2 Information grouping cohesiveness (EI 66%)
1.2 Learnability (P/GI 51.97%)
1.2.1 …
1.3 Operability (P/GI 49.50%)
1.3.1 Control permanence (EI 100%)
1.3.2 Expected behaviour of controls (EI 50%)
2 Content Quality (P/GI 63.05%)
2.1 Content Suitability (P/GI 63.05%)
2.1.1 Basic Information Coverage (P/GI 50%)
2.1.1.1 Line item information completeness (Measure 2; EI 50%)
2.1.1.2 Product description appropriateness (EI 50%)
2.1.2 Coverage of other Contextual Information (P/GI 76.89%)
2.1.2.1 …
2.1.2.2 Return policy information completeness (EI 33%)

Appropriateness Recognisability

To Remark

Metrics are welcome when they are clearly needed and easy to collect and understand (Pfleeger)

● A metric specifies, in the numerical/symbolic world, a specific mapping of an entity's attribute from the empirical world

● A metric (in a measurement process) cannot by itself interpret a calculable concept

● Hence the need for INDICATORS (in an evaluation process) in order to get contextual information

Indicators are ultimately the foundation for interpreting information needs and for decision-making.


Agenda Part I: Background

• Quality and Quality Models

• Introducing the Quality Concept

• ISO 25010 Models and Views

• External Quality and Quality in Use Views/Characteristics

• The 2Q2U Quality Modeling Framework

• Relationships between Views

• Conceptual Base for Measuring and Evaluating Quality

• Example: Information Suitability for a Shopping Cart

• Concepts for Non-functional Requirements (NFR): Attributes, …

• Concepts for Measurement: Metrics, …

• Concepts for Evaluation: Indicators, …

• Conclusions

Concepts for Indicator/Evaluation

● Information Need

● Concept Model

– Calculable Concept / Attribute

● INDICATOR

– Elementary (interprets/calculates Metric’s measure)

– Global (calculates Concept Model)

● ELEMENTARY and GLOBAL MODEL

● DECISION CRITERIA (Acceptability Levels)

● EVALUATION, INDICATOR VALUE

The GOCAME strategy relies on a sound conceptual (ontological) base.

Just some terms for the Evaluation Component:


Model for Indicator/Evaluation

INDICATOR (syno Criterion)

– The defined calculation method and scale, in addition to the model and decision criteria, in order to provide an estimate or evaluation of a calculable concept with respect to defined information needs.

– Elementary Indicator (syno. Elementary Criterion)

● Name: Satisfaction Level of the line item information completeness

– Global Indicator (syno. Global Criterion)

● Name: Satisfaction Level of External Quality

Concepts for Indicator/Evaluation


ELEMENTARY MODEL

– Algorithm or function with associated decision criteria that model an elementary indicator.

● Metric: Degree of completeness to the line item information

X = {1, 2, 3}

● Elementary Indicator Model: Satisfaction Level of the line item information completeness

EI = {(1, 0), (2, 50), (3, 100)}
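The elementary model above maps each ordinal measure in X = {1, 2, 3} to a satisfaction percentage. A direct sketch, with a hypothetical function name:

```python
# Elementary indicator model sketch: interpret the ordinal measure of
# "line item information completeness" as a satisfaction percentage,
# following the mapping EI = {(1, 0), (2, 50), (3, 100)}.
ELEMENTARY_MODEL = {1: 0, 2: 50, 3: 100}

def elementary_indicator(measure: int) -> int:
    """Map a metric's measure to a satisfaction-level percentage."""
    return ELEMENTARY_MODEL[measure]

print(elementary_indicator(2))  # 50
```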

Concepts for Indicator/Evaluation

DECISION CRITERIA

– Thresholds, targets, or patterns used to determine the need for action or further investigation, or to describe the level of confidence in a given result.

● Example

– Acceptability Levels

● Unsatisfactory (range 0-40)

● Marginal (range 40-70)

● Satisfactory (range 70-100)
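The acceptability levels above can be sketched as a mapping from an indicator value (0-100) to a level. Which level owns the boundary values 40 and 70 is an assumption, since the slide only gives ranges:

```python
# Decision-criteria sketch: map an indicator value to one of the three
# acceptability levels (Unsatisfactory 0-40, Marginal 40-70,
# Satisfactory 70-100). Boundaries assigned to the upper level here.
def acceptability(indicator_value: float) -> str:
    if indicator_value < 40:
        return "Unsatisfactory"
    if indicator_value < 70:
        return "Marginal"
    return "Satisfactory"

print(acceptability(61.97))  # Marginal
```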

Concepts for Indicator/Evaluation


Measures and Indicator Values

External Quality Requirements (Measure; EI value; P/GI value):

Global Quality Indicator (P/GI 61.97%)
1 Usability (P/GI 60.88%)
1.1 Understandability (P/GI 83%)
1.1.1 Icon/label ease to be recognized (EI 100%)
1.1.2 Information grouping cohesiveness (EI 66%)
1.2 Learnability (P/GI 51.97%)
1.2.1 …
1.3 Operability (P/GI 49.50%)
1.3.1 Control permanence (EI 100%)
1.3.2 Expected behaviour of controls (EI 50%)
2 Content Quality (P/GI 63.05%)
2.1 Content Suitability (P/GI 63.05%)
2.1.1 Basic Information Coverage (P/GI 50%)
2.1.1.1 Line item information completeness (Measure 2; EI 50%)
2.1.1.2 Product description appropriateness (EI 50%)
2.1.2 Coverage of other Contextual Information (P/GI 76.89%)
2.1.2.1 …
2.1.2.2 Return policy information completeness (EI 33%)

Appropriateness Recognisability

GLOBAL MODEL (syn. Aggregation Model, Scoring Model or Function)

– Algorithm or function with associated decision criteria that models a global indicator.

● Example of a global model for the Satisfaction Level of External Quality: the Linear Additive Scoring Model

Partial/Global Indicator = ∑ (Weight x Elementary Indicator)

P/GI = W1 EI1 + .... + Wn EIn, where W1 + .... + Wn = 1
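A minimal sketch of the linear additive scoring model above; the function name and the example weights are illustrative assumptions, since in a real M&E project the weights come from the evaluation design:

```python
# Sketch of the linear additive scoring model: P/GI = sum(Wi * EIi),
# with the weights adding up to 1. Names are illustrative assumptions.

def partial_global_indicator(weights, ei_values):
    """Aggregate elementary indicators into a partial/global indicator."""
    assert abs(sum(weights) - 1.0) < 1e-9, "weights must add up to 1"
    return sum(w * ei for w, ei in zip(weights, ei_values))

# e.g. two elementary indicators of 50% and 100%, weighted 0.6 / 0.4:
# partial_global_indicator([0.6, 0.4], [50, 100]) -> 70.0
```

Applying such a model bottom-up over a requirements tree is how the partial and global values in the table above (e.g. 61.97% for the Global Quality Indicator) would be obtained.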

Concepts for Indicator/Evaluation


Example: Content Suitability. Shopping Cart (Before)

[Screenshot annotated with attributes 2.1.1.1 Line item information completeness, 2.1.1.2 Product description appropriateness, and 1.3.2 Expected behaviour of delete control]


Example: Content Suitability. Shopping Cart (After)

To Remark

Metrics are welcome when they are clearly needed and easy to collect and understand

Usefulness of Metrics

● Data coming from a measurement (objective, subjective)

● Mapping from an empirical world (entity attribute/property) to a numerical, formal world

● Heuristic operationalisation

● A metric (and its measures) CANNOT by itself interpret a calculable concept (hence the need for INDICATORS)


To Remark

Indicators are ultimately the foundation for interpretation of information needs and decision-making.

Usefulness of Indicators

● Mapping from a numerical world to another numerical world

● To serve as a basis for quantifying Calculable Concepts for an Information Need

● Indicators give contextual information/knowledge useful for decision-making (Analyses and Recommendations)



Conclusions: Part I

● 2Q2U, useful for NFR modeling and instantiation, extends the ISO 25010 standard.

– Adds four (sub-)characteristics:

● Information Quality (EQ), Learnability in Use (QinU), Sense of Community (QinU), and Communicability (QinU)

– Substantially modifies one characteristic:

● Functional Quality (EQ)

– Adds two new concepts:

● Actual Usability (QinU)

● Actual User Experience (QinU)

● We illustrated the primary terms of the conceptual framework (C-INCAMI) as part of the GOCAME strategy

– Attribute/Entity (NFR), Metric (Measurement), and Indicator (Evaluation)

● Next (Part II), a generic Evaluation Approach is illustrated,

– which reuses all the concepts introduced in Part I …

References

Olsina L., Papa F., Molina H. (2008) How to Measure and Evaluate Web Applications in a Consistent Way. In: Rossi, Pastor, Schwabe, Olsina (Eds.), Web Engineering: Modelling and Implementing Web Applications, Springer.

Olsina L., Rossi G., Garrido A., Distante D., Canfora G. (2008) Web Applications Refactoring and Evaluation: A Quality-Oriented Improvement Approach. Journal of Web Engineering, Rinton Press, US, 7 (4), pp. 258-280.

Lew P., Olsina L., Zhang L. (2010) Quality, Quality in Use, Actual Usability and User Experience as Key Drivers for Web Application Evaluation. In: LNCS 6189, Springer, ICWE 2010, Vienna, Austria, pp. 218-232.


Agenda Part II: Evaluation Approach

• Generic Evaluation Approach

• Quality Modeling Framework

• Integrated M&E Strategies

• Conceptual Base/Framework

• M&E Process

• Methods and Tools

• Two M&E Strategies: GOCAME and SIQinU

• Multi-level M&E Strategy/Approach

• Introducing GOCAME+

• Revisiting Learning Objectives

Generic Evaluation Approach

● This tutorial discusses a Generic M&E Approach (GEA), whose architecture is based on two main pillars, namely:

● A Quality Modeling Framework, which can be multi-level, and

● Integrated M&E Strategies (such as GOCAME and SIQinU, among others), which in turn are grounded on three principles:

– A M&E conceptual framework,

– A well-established M&E process, and

– Measurement, Evaluation, Analysis and Recommendation methods and tools.



Quality Modeling Framework

● A quality model, which is targeted at a quality focus and entity category, can be defined as the set of (sub-)characteristics and their hierarchical relationships that provide the basis for specifying a non-functional requirements structure and its further evaluation.

– Quality models can be intended for different entity categories such as resource, process, product, system, and system in use, among others, such as project or service.

– In any given M&E project more than one entity category (e.g. a system and a system in use) can intervene, each with a different quality focus/model.

– The quality focus (for a given information need –purpose and user viewpoint–) is in general terms the root characteristic (calculable concept) of an instantiated quality model.

– Also, in an instantiated quality model, attributes are combined or related according to its (sub-)characteristics.


Quality Modeling Framework

● Additionally, a QMF is useful for establishing relationships among quality focuses at a given organizational level, related to different entity categories –and ultimately to concrete entities.

● Such relationships among focuses at a given organizational level (e.g. the operative level for a given M&E project) are commonly named “influences” and “depends on” relationships.

● E.g. the relationships in ISO 9126-1/25010 between the EQ and QinU focuses:

Quality Modeling Framework (as per Olsina et al, 2011):

Entity Category (superCategory)   Quality Focus/Model          Examples of Entity Category
Resource                          Resource Quality             Evaluation Strategy; Development Tool; Development Team
Process                           Process Quality              Development Process; Evaluation Process
Product                           Product (Internal) Quality   WebApp source code; Class diagram
System                            System (External) Quality    Mashup WebApp; Defect Tracking WebApp
System in Use                     System-in-Use Quality        Mashup WebApp-in-use; Defect Tracking WebApp-in-use

[2Q2U Quality Framework: Resource Quality, Process Quality, Product (Internal) Quality, System (External) Quality and System-in-Use Quality, linked by “influences” and “depends on” relationships]


2Q2U Modeling Framework (as per Olsina et al, 2012):

2Q2U Models/Characteristics (as per Olsina et al, 2012):



Integrated M&E Strategies

● An Integrated M&E Strategy should be designed for a given purpose and information need at the M&E project or program level of an organization, and might in turn be grounded on three principles or capabilities:

1. A M&E Conceptual Base/Framework,

2. A well-established M&E Process, and

3. M&E Methods and Tools.

● We support the premise that a M&E strategy is integrated if the three above-mentioned capabilities are to a great extent achieved simultaneously.


Integrated M&E Strategies

1. A M&E Conceptual Base/Framework,

2. A well-established M&E Process, and

3. M&E Methods and Tools.

● The M&E conceptual base/framework capability should be built on a robust terminological base such as an ontology (or also a taxonomy or glossary), which explicitly and formally specifies the main concepts, properties, relationships, and constraints for the M&E domain, as well as their grouping into components.

● This principle ensures terminological uniformity among the other capabilities and thus the consistency of results.

Integrated M&E Strategies

1. A M&E Conceptual Base/Framework,

2. A well-established M&E Process, and

3. M&E Methods and Tools.

● The C-INCAMI conceptual framework has six components:

– M&E project definition,

– Nonfunctional requirements specification,

– Context specification,

– Measurement design and implementation,

– Evaluation design and implementation, and

– Analysis and recommendation specification.

● Also C-INCAMI is enriched with a Process conceptual base and component


C-INCAMI Conceptual Base/Components

Process Conceptual Component


Integrated M&E Strategies

1. A M&E Conceptual Base/Framework,

2. A well-established M&E Process specification, and

3. M&E Methods and Tools.

● The second principle is the M&E process specification, which describes what to do by specifying the activities to be planned and executed, their inputs and outputs, roles, and interdependencies, among other aspects.

– In process modeling we can specify process views

● It reuses the conceptual base (e.g. C-INCAMI, Process)

● A well-established M&E process not only facilitates the understanding and communication among stakeholders but also ensures repeatability and reproducibility in the implementation of the activities.

GOCAME M&E Process


Integrated M&E Strategies

1. A M&E Conceptual Base/Framework,

2. A well-established M&E Process specification, and

3. M&E Methods and Tools Specifications.

● While activities state ‘what’ to do, methods describe ‘how’ to perform these activities, which in turn can be automated by tools.

– A methodology is a set of related methods.

● For instance, the WebQEM (Web Quality Evaluation Method) and its associated C-INCAMI_tool were instantiated from the conceptual framework and process.

● WebQEM can be used to evaluate and analyze different views of quality, such as EQ and QinU for system and system-in-use; also, it can be used for any quality focus of any entity category.

C-INCAMI_Tool



GOCAME and SIQinU Strategies


The GOCAME M&E Integrated Strategy

GOCAME stands for Goal-Oriented Context-Aware Measurement and Evaluation, a strategy based on the three quoted capabilities.

● It is a multi-purpose strategy that follows a goal-oriented and context-sensitive approach in defining and performing M&E projects.

● It is multi-purpose because it can be used to evaluate (i.e. to “understand”, “predict”, etc.) the quality of not only the system and system-in-use entity categories but also others, such as resource and process, by using their instantiated quality models accordingly.

● Moreover, the evaluation focus can vary, ranging from the “external quality” of a system and the “quality in use” of a system-in-use to the “cost” of a product/resource (or even the “capability quality” of a resource).

● However, GOCAME does not incorporate improvement cycles as SIQinU does.

● Rather, it can be used to understand the current or a future situation (as an evaluation snapshot) of concrete entities.


C-INCAMI Conceptual Base/Components

M&E Process in GOCAME


Define NFR Activity

It implies three sub-activities:

• Establish Information Need

• Specify Context

• Select a Concept Model

Establish Information Need Activity


Example: Purpose: improve; User viewpoint: final user; Entity category: WebApp/System; Entity: Cuspide.com shopping cart; Quality focus: External Quality

Specify Context Activity

The context represents the relevant state of the situation of the entity to be assessed with regard to the information need. Example context properties:

1) The “lifecycle type” used = “Agile Methodology”

2) The “technique type” used to make the changes = “WMR”


Select a Concept Model Activity

EQ Model (2Q2U), generic characteristics:

1. Functional Quality
1.1 Suitability
1.2 Accuracy
. . .
2. Usability
2.1 Understandability
2.2 Learnability
2.3 Operability
3. Reliability
3.1 Fault Tolerance
. . .
4. . . .

Instantiated requirements tree (excerpt):

1. Usability
1.1 Understandability
. . .
2. Content Quality
2.1 Content Suitability
2.1.1 Basic Information Coverage
2.1.1.1 Line item information completeness
2.1.1.2 Product description appropriateness
2.1.2 Coverage of other related information
. . .
2.1.2.2 Return policy information completeness

Design the Measurement Activity

It implies two sub-activities:

• Establish Entity

• Assign one Metric to each Attribute

Example: Attribute: Line item information completeness; Metric: Degree of completeness to the line item information


Selected Metrics for a Security Attribute

Attribute: Amount of attempts to access protected pages

Direct Metric:

Name: Total number of attempts to access protected pages (#TPP)

Objective: The total number of protected pages (i.e. the given population) to be attempted for access by a given technique

Author: Covella G. and Dieser A.

Version: 1.0

Measurement Procedure:

Specification: As a precondition, log into the website with a valid user ID and password. Browse the site looking for the URL population of protected pages, which are those that must be accessed only after a successful login. Add one per each protected page URL selected.

Type: Objective

Numerical Scale: Representation: Discrete; Value Type: Integer; Scale Type: Absolute

Unit: Name: Protected pages; Acronym: Pp

Attribute: Authentication Schema Bypass

Indirect Metric:

Name: Ratio of Protected Pages Accessed via Forced Browsing (%PPA)

Objective: To determine the ratio between the number of successful attempts at accessing protected pages by forced browsing and the total number of attempts performed.

Author: Covella G. and Dieser A.

Version: 1.0

Reference: OWASP Testing Guide 2008 V3.0

Calculation Procedure: Formula Specification: %PPA = (#PF / #TPP) * 100

Numerical Scale: Representation: Continuous; Value Type: Real; Scale Type: Proportion

Unit: Name: Percentage; Acronym: %

Related Metrics:

1) Number of successful attempts to access protected pages by forced browsing (#PF);

2) Total number of attempts to access protected pages by forced browsing (#TPP)

Attribute: Amount of successful attempts to access protected pages

Direct Metric:

Name: Number of successful attempts to access protected pages by forced browsing (#PF)

Objective: The number of successful attempts at bypassing the authentication schema for the protected page population using the forced browsing technique

Author: Covella G. and Dieser A.

Version: 1.0

Measurement Procedure:

Name: Direct page request

Specification: Using an unauthenticated browser session, attempt to directly access a previously selected protected page URL through the address bar in a browser. Add one per each successful access which bypasses the authentication.

Type: Objective

Numerical Scale: Representation: Discrete; Value Type: Integer; Scale Type: Absolute

Unit: Name: Successful attempts in Pp; Acronym: Pp
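The calculation procedure of the indirect metric above (%PPA = (#PF / #TPP) * 100) can be sketched as follows; the function name and the example counts are assumptions for illustration, not values from the case study:

```python
# Sketch of the %PPA indirect metric's calculation procedure.
# Function name and example counts are illustrative assumptions.

def ratio_protected_pages_accessed(pf: int, tpp: int) -> float:
    """%PPA: ratio of protected pages accessed via forced browsing, in %."""
    if tpp <= 0:
        raise ValueError("total number of attempts (#TPP) must be > 0")
    return (pf / tpp) * 100

# e.g. 3 successful bypasses out of 20 protected pages -> 15.0 (%)
```

This mirrors how an indirect metric is derived from its related direct metrics (#PF and #TPP), rather than measured directly.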

Implement the Measurement Activity



The SIQinU M&E Integrated Strategy

SIQinU stands for Strategy for understanding and Improving Quality in Use, which is based on the three quoted capabilities.

● SIQinU is a specific-purpose and context-sensitive strategy to incrementally and continuously improve the QinU of a WebApp-in-use, by mapping actual usage problems to measurable EQ attributes –which are inherent to a WebApp– and then performing improvement actions that enable evaluators to assess the gain at both the EQ and QinU levels.

● SIQinU is a specific-purpose strategy because it can be used to evaluate (i.e. only for the purposes of “understand” and “improve”) the quality of only the system-in-use and system entity categories, by using their instantiated QinU and EQ models respectively.


● SIQinU is in alignment with the GOCAME strategy regarding its M&E conceptual framework, activities and methods.

● However, there are particular phases and activities in SIQinU that are not included in GOCAME; it also has specific methods and techniques.



2Q2U Instantiated Characteristics for the JIRA Case Study

[Diagram: the Quality-in-Use Model (Actual User Experience; Actual Usability: Effectiveness, Efficiency, Learnability in Use), instantiated for the Defect Tracking WebApp-in-use entity category, depends on the External Quality Model (Information Quality: Information Suitability; Usability: Learnability, Operability), instantiated for the Defect Tracking WebApp entity category, which in turn influences it]

M&E Process in SIQinU

Step by Step

SIQinU – Phase I: Specify Requirements and Evaluation Criteria for QinU

Sub-tasks for Entering a New Defect Task in JIRA:

Sub-task 1. Summary, steps, and results: in this sub-task, the user enters the summary of the defect, steps to produce the defect, and results, expected and actual. Screens: Initial Defect Entry Screen (SC1-1); Summary, Steps and Results Screen (SC1-2); Summary, Steps, Results Summary Screen (SC1-3)

Sub-task 2. Add Detail Info: in this sub-task, users enter detailed information about the defect. Screens: Add Detail Info Screen (SC2-1); Detail Summary Screen (SC2-2)

Sub-task 3. Add Environment Info: in this sub-task, users enter the environment information such as the operating system and browser. Screens: Add Environment Info Screen (SC3-1); Environment Summary Screen (SC3-2)

Sub-task 4. Add Version Info: in this sub-task the user enters the version of the software being tested and other associated information. Screens: Add Version Info Screen (SC4-1); Version Summary Screen (SC4-2)

Sub-task 5. Add Attachment: in this sub-task the user attaches a file, if necessary, to describe the defect in greater detail; this usually is a screenshot showing the defect while the application is in use. Screens: Add Attachment Screen (SC5-1); Attachment Summary Screen (SC5-2)



Definitions needed for Task and Attribute M&E

Complete & Correct: the sub-task has been done completely, meaning that each part of the sub-task was complete and had data entered, and the test leader found no errors in the defect entered. (If he did, he would enter and change information for that defect to correct it.)

Complete & Incorrect: the sub-task is complete, but the test leader found that it was incorrect. Some possible reasons this situation may arise:

• The user was hasty and entered incorrect information because he wanted to finish quickly.

• The user did not understand completely (perhaps not enough help, or instructions not clear in the application), so he/she finished but did something wrong.

Incomplete & Correct: the sub-task has not been completed, and no errors were found in the parts that were completed. This could be because:

• The user was proceeding to enter the defect but found they were missing some information and could not complete it, or they inadvertently skipped that part because it was not mandatory and they didn’t think it was important.

• The user didn’t understand part of the application due to unclear explanations, so they left that part incomplete.

• The flow in the application was not clear, so they quit.

Incomplete & Incorrect: the worst case, where a sub-task is both incomplete and incorrect due to a combination of reasons. It could be that the user does not understand well, and therefore enters wrong information, or may eventually quit due to frustration.
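One hedged way to operationalize these definitions (an assumption for illustration, not the case study's actual metric design) is to record a completeness/correctness pair per sub-task and derive a simple task-level ratio from it:

```python
# Hypothetical operationalization of the completeness/correctness
# definitions above; field and function names are illustrative.
from dataclasses import dataclass

@dataclass
class SubTaskResult:
    complete: bool  # all data entered for the sub-task
    correct: bool   # test leader found no errors in what was entered

def task_effectiveness(results):
    """Fraction of sub-tasks that are both complete and correct."""
    ok = sum(1 for r in results if r.complete and r.correct)
    return ok / len(results)

# e.g. 4 of 5 sub-tasks complete and correct -> 0.8
results = [SubTaskResult(True, True)] * 4 + [SubTaskResult(True, False)]
```

Such per-sub-task records are the kind of raw data from which QinU attributes like effectiveness could later be quantified.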

SIQinU – Phase II: Perform QinU Evaluation


Phase II – JIRA v1 QinU Evaluation Summary


Phase II – JIRA v1 Evaluation by Attribute


SIQinU – Phase III: Derive/Specify Requirements and Evaluation Criteria for EQ


Phase III – JIRA Screen (SC2-1)

[Screenshot annotation: was 38%]

Phase III – Derived EQ Characteristics and Attributes


Phase IV: Perform the EQ Evaluation & Analysis

● Similar to Phase II but for EQ


Phase V: Recommend and Perform Improvement Actions for EQ

● Based on Phase IV, for those EQ characteristics –and particularly attributes– that require improvement, we make improvement recommendations for modifying the WebApp, i.e. from Version 1 to Version 1.1.


Phases IV/V – JIRA – EQ Evaluation (before/after)

[Screenshot annotation: was 38%]

Phase VI – JIRA QinU Evaluation: before/after Analysis


2Q2U Instantiated for the JIRA Case Study (same diagram as shown earlier)

SIQinU: specific relationships developed from the generic “depends on” and “influences” ones (shown in the next slide).


Phase VI – Relationships Developed



QMF for the Operative Level of an Organization

(Same Quality Modeling Framework table and 2Q2U diagram as shown earlier)


Multi-level M&E Approach / GOCAME+ Strategy

[Diagram: three organizational levels, each with its own quality focus/model. A Strategic Information Need (SIN 1) at the Management Level is aligned with Tactical Information Needs (TIN 1.1, TIN 1.2, TIN 1.3) at the Tactic Level, which in turn are aligned with Operative Information Needs (OIN 1.1.1, OIN 1.1.2, OIN 1.2.1, OIN 1.3.1) at the Operative Level, where entity categories and concrete entities are evaluated]
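A minimal data-structure sketch of this multi-level alignment of information needs; this is an illustrative assumption, not GOCAME+ tooling, and all names are hypothetical:

```python
# Illustrative sketch of aligning information needs across organizational
# levels (strategic -> tactical -> operative); names are assumptions.
from dataclasses import dataclass, field

@dataclass
class InformationNeed:
    name: str             # e.g. "SIN 1", "TIN 1.1", "OIN 1.1.1"
    level: str            # "strategic" | "tactical" | "operative"
    aligned_with: list = field(default_factory=list)

sin1 = InformationNeed("SIN 1", "strategic")
tin11 = InformationNeed("TIN 1.1", "tactical", aligned_with=[sin1])
oin111 = InformationNeed("OIN 1.1.1", "operative", aligned_with=[tin11])

def traces_to(need, ancestor):
    """True if `need` is (transitively) aligned with `ancestor`."""
    return ancestor in need.aligned_with or any(
        traces_to(parent, ancestor) for parent in need.aligned_with)
```

The `traces_to` check captures the key property of the figure: every operative information need should trace back, through the tactical level, to a strategic one.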



Revisiting the Learning Objectives

The tutorial's main learning objectives were:

● Review background concepts such as Information Need, Quality and Entity Category; Quality Framework/Models; and Strategies w.r.t. the three principles, i.e. M&E process, conceptual framework, and methods/tools.

● Get insight into how the QMF can be instantiated in a purposeful way, not only for understanding but also for improvement, using for this end the customized strategy. We call this a GEA/Architecture.

– A further objective was to see that many different strategies can be instantiated from the same QMF, regarding different organizational information needs and levels.

– Even the multi-level M&E GOCAME+ strategy

● Get insight into how a concrete strategy (SIQinU) for understanding and improving a WebApp (e.g. its EQ and QinU) can be used, while excerpts of a real case study (JIRA) were illustrated.

– GOCAME was also illustrated

References

Olsina L., Lew P., Dieser A., Rivera B. (2012) Updating Quality Models for Evaluating New Generation Web Applications. Journal of Web Engineering, Special issue: Quality in new generation Web applications, Rinton Press, US, 11 (3), pp. 209-246.

Olsina L., Lew P., Dieser A., Rivera B. (2011) Using Web Quality Models and a Strategy for Purpose-Oriented Evaluations. Journal of Web Engineering, Rinton Press, US, 10 (4), pp. 316-352.

Becker P., Lew P., Olsina L. (2012) Specifying Process Views for a Measurement, Evaluation, and Improvement Strategy. Advances in Software Engineering Journal, Academic Editor: Osamu Mizuno, Hindawi Publishing Corporation, Vol. 2012, 27 pp., DOI:10.1155/2012/949746.


Dr. Luis Olsina

E-mail: [email protected]

GIDIS_Web (Grupo de Investigación y Desarrollo en Ingeniería de Software y Web)

Departamento de Informática – Facultad de Ingeniería – Universidad Nacional de La Pampa

General Pico – La Pampa

Argentina

© 2013 GIDIS_Web

Questions?