“The most crucial property of any interactive system is its support for human activity. This is what makes it worth having. It may enable us to do things faster, with fewer errors, with less prior learning, with greater resultant quality, or perhaps just with greater fun and satisfaction.” [p.6]

Newman, W.M. and Lamming, M.G., 1995, Interactive System Design, Harlow: Addison-Wesley

“This interdisciplinary degree is concerned with designing interactive technology for people. You will gain skills in software design and systems engineering as well as an understanding of human-computer interaction and computer-mediated communication.”

Course details from University Prospectus

Layers of Interactive Systems

[Diagram: paired layers of interactive systems. Computing layers: application programming; system programming / assembly language; operating systems / communications; machine language; microprogramming; physical / digital logic. Human layers: socio-technical interaction; human-computer interaction; cognition & perception; sensorimotor. Linking concepts: feedback, physical inputs, information & knowledge, activity & culture, convention, selection in the world.]

Baber et al., 2000, Interact’00

Assignment

You will work in small teams to develop a concept design for an interactive museum guide.

You will need to consider:
Sensor technologies
Display technologies
User interface
Usability

Your team will present your concept designs in a poster session (10%) in week 8. The layout and contents of the posters will be discussed in a later session.

You will produce individual reports on the design (90%) and submit them in week 10. The marking scheme will be given to you in week 2.

Usability & Designing for Error (1H1)

Chris Baber

Useful Resources

Ergonomics Information Analysis Centre, Third Floor (N309): several hundred books, journals and conference proceedings on all aspects of interactive systems; Ergonomics Abstracts database.

Objectives

Consider why people make mistakes in their interactions with technology

Examine simple approaches to predicting how people use technology

Evaluate the usability of products against ISO9241

Course Structure

Human Centred Design

Human Error

Predictive Evaluation Methods

Introduction to Usability Evaluation

Outline syllabus

1. Making sense of technology;

2. Introduce concept of Human Error;

3. Introduce concept of Usability;

4. Review ISO9241 (part 13)

Reading List

Norman, D.A., 1990, The Design of Everyday Things, New York: Basic Books [The psychology of everyday things]

Noyes, J. and Baber, C., 1999, User-Centred Design of Systems, Berlin: Springer-Verlag

Stanton, N.A., 1998, Human Factors in Consumer Products, London: Taylor and Francis

Nielsen, J., 1993, Usability Engineering, Boston: Academic Press

http://www.baddesigns.com/index.shtml

The “Waterfall” Model of the Design Process

Requirements

Design

Implementation

Verification

Maintenance

Human-Centred Design Process

ISO 13407:
1. Plan the human-centred design process
2. Specify context of use
3. Specify user and organisational requirements
4. Produce design solutions
5. Evaluate design against requirements
(iterate until the evaluation shows the design is complete)

Making Sense of Technology

‘Instant Experts’

You get a new mobile telephone for Christmas. Do you:

1. Read the manual from cover to cover and then switch on the phone?

2. Switch on the phone and try to use it?

3. Switch on the phone and read the manual when you have problems?

Using tools

Physical appearance

Knowledge of use

Sequence of activity

Affordance

See handle → Reach out hand → Grasp handle → Turn handle → Pull door

The Cooker Problem #1

Which control acts on which ring?

The Cooker Problem #2

Which control acts on which ring?

Direction of Motion Stereotypes

Clockwise = increase
Clockwise = right
Clockwise = away from control
Clockwise = increase on scale

Clockwise to Increase?

[Diagram: rotary controls (a, b, c, d) set beside linear 1-7 scales; which way does each control move the indicator?]

Conclusions…

We have learned ‘routines’ for how to use many sorts of technology

We apply these routines ‘automatically’

When the routines succeed, they are reinforced

When the routines fail, we think about what we’re doing

Problem Products

Rule I: to set time, turn control.
Rule II: to set time < 15 s, turn control past 15 s and then turn back to desired time.

http://www.baddesigns.com/timer.html

Conflicting labels?

V for volume?

V for

http://www.baddesigns.com/remote.html

Spatial Compatibility?

http://www.baddesigns.com/boombox.html

Which side is the handle to open the ‘fridge?

http://www.baddesigns.com/fridge.html

Make the freezer warmer without changing fresh food setting

Norman, D.A., 1990, The Design of Everyday Things, New York: Basic Books

Freezer control: A B C D E
Fresh food control: 7 6 5 4 3 2 1

Normal settings: C and 5
Colder fresh food: C and 6-7
Coldest fresh food: B and 8-9
Warmest fresh food: D and 7-8
Off (fresh food & freezer): C and 4-1

Instructions: 1. Set both controls. 2. Allow 24 hours to stabilize.

Possible Explanatory Models:

1. Thermostat and control for each compartment, hence two sets of controls (but why do the controls interact?)

2. Only one thermostat and distribution of cold air varied between compartments (but in which compartment is the thermostat?)

Using technology

Goals x Appearance (‘System image’)

Expectations x Design

Tasks x Functions

Context x Use

Norman’s 7 Stages of Action

Norman, D.A., 1990, The Design of Everyday Things, New York: Basic Books

Goal

Execution side: Intention → Plan of action → Action

Evaluation side: Perception → Interpretation → Evaluation

Things in the World

Task Models

Hierarchical Task Analysis

Activity assumed to consist of TASKS performed in pursuit of GOALS

Goals can be broken into SUBGOALS, which can be broken into tasks

Hierarchy (Tree) description

Hierarchical Task Description

0.0 Present OHP slides
  1.0 Switch on OHP
  2.0 Check projection
  3.0 Place foil on OHP
  4.0 Focus projection

Task Analysis comes from adding plans. PLANS = conditions for combining tasks.

Fixed Sequence
P0: 1 > 2 > exit

Contingent Fixed Sequence
P1: 1 > when state X achieved > 2 > exit
P1.1: 1.1 > 1.2 > wait for X time > 1.3 > exit

Decision
P2: 1 > 2 > if condition X then 3, else if condition Y then 4 > 5 > exit

Reporting

HTA can be constructed using Post-it notes on a large space (this makes it easy to edit and also encourages participation)

HTA can be difficult to present in a succinct printed form (it might be useful to take a photograph of the Post-it notes)

Typically a tabular format is used, with columns: Task number | Task | Plan | Comments
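To make the hierarchy-plus-plans idea concrete, here is a minimal sketch (Python) of an HTA held as a data structure and flattened into the tabular format above. The Node class and the way plans are attached are assumptions made for this illustration, not part of any standard HTA tool; the example data are the OHP task from the earlier diagram, with a hypothetical fixed-sequence plan.

```python
# Minimal sketch of a hierarchical task analysis as a data structure.
# The Node class and the plan string are illustrative assumptions.

from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Node:
    number: str                      # e.g. "0.0", "1.0"
    name: str                        # goal, subgoal or task description
    plan: Optional[str] = None       # plan governing this node's subtasks
    subtasks: List["Node"] = field(default_factory=list)

# Example: presenting OHP slides (hypothetical fixed-sequence plan).
hta = Node("0.0", "Present OHP slides",
           plan="P0: 1 > 2 > 3 > 4 > exit",
           subtasks=[
               Node("1.0", "Switch on OHP"),
               Node("2.0", "Check projection"),
               Node("3.0", "Place foil on OHP"),
               Node("4.0", "Focus projection"),
           ])

def rows(node, out=None):
    """Flatten the hierarchy into (task number, task, plan) rows."""
    if out is None:
        out = []
    out.append((node.number, node.name, node.plan or ""))
    for sub in node.subtasks:
        rows(sub, out)
    return out

print(f"{'Task no.':<10}{'Task':<25}Plan")
for number, name, plan in rows(hta):
    print(f"{number:<10}{name:<25}{plan}")
```

A comments column could be added in the same way; the point is only that the tree plus its plans carries all the information the tabular report needs.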

Varieties of Task Model

Context
People
Goals
Actions / Tasks

Diagrammatic Model

[Diagram: the design team, its methodology, experience, the design rationale, stakeholders and the design brief, linked by relations such as ‘follows’, ‘draws upon’, ‘represents’, ‘informs’, ‘defines’ and ‘consults’.]

Errors in ATM use

Incorrect PIN: Slip / Memory
Forget PIN: Memory
Insufficient funds: Decision / Memory
Exceed withdrawal limit: Decision / Memory
Card insertion: Slip
Following prompts: Slip / Misinterpretation

Human Error

Human Error

Failure of person or failure of design?

Lack of intelligence or lack of information?

Lack of ability or lack of guidance?

Types of human error

Human activity
  Unintended actions
    Slip: failure of attention (intrusion, omission, reversal, misordered / mistimed step)
    Lapse: failure of memory (omit step, lose place, forget intention)
  Intended actions
    Mistake: wrong procedure
    Violation: rule breaking, sabotage

Skill-based Failure

Double capture: habit intrusion preventing intended deviation from routine

Omission after interruption: lack of attentional check

Reduced intentionality: delay between intention and action

Perceptual confusion: confuse objects

Interference: two current plans converge

Repetition: begin action, interruption, repeat action

Reversal: begin action, interruption and undo action

Rule-based Failure

First exception: routine action with attentional failure

Countersigns: ignore evidence that rule should not be applied

Increased workload: narrowing of attention

Rule strength: stimuli evoke rule with most strength

Generalisation: frequency bias

Redundancy: chunking of stimuli

Rigidity: functional fixedness

Deficiency: ignore parts of problem space

Wrong rule: misinterpret situation and apply wrong rule

GEMS

[Flowchart: Generic Error Modelling System.
Skill-based level: pursue the GOAL through routine actions with periodic checks (‘action OK?’); if no problem arises, continue to the goal.
Rule-based level: when a problem is detected, consider the available information; if a familiar pattern is recognised, apply the known rule and check whether the problem is solved.
Knowledge-based level: if no rule fits, find an analogy and apply a known rule; if none is found, develop a mental model of the problem space, define abstract relations, infer a diagnosis and apply rules, returning to check whether the problem is solved.]
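As a rough way of reading the flowchart, the sketch below encodes its branching as a small function that reports which level of processing applies. The boolean inputs and the function itself are invented for this illustration and are not part of Reason's model.

```python
# Sketch of the GEMS branching logic as a simple decision function.
# The boolean inputs are illustrative assumptions, not measured quantities.

def gems_level(problem_detected: bool,
               familiar_pattern: bool,
               analogy_found: bool) -> str:
    """Report the likely level of processing for a given situation."""
    if not problem_detected:
        return "skill-based: routine actions with attentional checks"
    if familiar_pattern:
        return "rule-based: apply known rule, then check problem solved"
    if analogy_found:
        return "knowledge-based: apply a known rule found by analogy"
    return ("knowledge-based: develop mental model of the problem space, "
            "define abstract relations, infer diagnosis, apply rules")

print(gems_level(problem_detected=True,
                 familiar_pattern=False,
                 analogy_found=False))
```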

Human Error Prediction

Failure Mode Effect Consequence Analysis: human activity can fail in predictable ways.

E.g., Task: type PIN. Failure modes: recall wrong PIN, type PIN incorrectly, don’t type anything, use wrong keypad…

SHERPA

Action
1. Omitted
2. Too early
3. Too late
4. Too much
5. Too little
6. Too short
7. In wrong direction
8. Right action on wrong object
9. Wrong action on right object

Information
1. Not obtained
2. Wrong information obtained

Communication
1. No communication
2. Wrong information

Check
1. Omitted
2. Wrong object
3. Wrong check
4. Mistimed

Tabular Format

HTA Task | Failure Mode | Effect | Consequence
1.1 Insert card | Action: 7/8/9 | Card not inserted | No access; repeat task
1.2 Check prompt | Check: 1 | Fail to read screen | Do not check ATM status
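A minimal sketch of how the SHERPA taxonomy can drive the tabular format: each HTA task is classified by behaviour type, and the taxonomy supplies the candidate failure modes to consider. The ATM task list and its classifications are assumptions for the example; the failure-mode lists are those given above.

```python
# Sketch of SHERPA-style error prediction: classify each task by behaviour
# type, then consider every failure mode the taxonomy lists for that type.
# The ATM task list below is an illustrative assumption.

FAILURE_MODES = {
    "Action": ["Omitted", "Too early", "Too late", "Too much", "Too little",
               "Too short", "In wrong direction",
               "Right action on wrong object", "Wrong action on right object"],
    "Information": ["Not obtained", "Wrong information obtained"],
    "Communication": ["No communication", "Wrong information"],
    "Check": ["Omitted", "Wrong object", "Wrong check", "Mistimed"],
}

# (HTA task number, task description, behaviour type)
tasks = [
    ("1.1", "Insert card", "Action"),
    ("1.2", "Check prompt", "Check"),
    ("1.3", "Type PIN", "Action"),
]

for number, description, behaviour in tasks:
    print(f"{number} {description} ({behaviour})")
    for i, mode in enumerate(FAILURE_MODES[behaviour], start=1):
        # In a full analysis, each credible mode would also get effect,
        # consequence and remedy columns, as in the table above.
        print(f"    {behaviour} mode {i}: {mode}")
```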

TAFEI

Task Analysis for Error Identification

State-transition model of human interaction with products

Consider possible transitions for each state

Ask whether transitions are ‘legal’ (i.e., needed to fulfil a specific goal)

Flowchart for TAFEI

Define components / materials

Define user goals / tasks

Define device states / transitions for specific user goal

Relate user tasks to state transitions

Analysing TAFEI

Draw a transition matrix covering movement from each state to all other states. For cell (i, j), ask: is it possible to move from state i to state j?

IF YES and related to goal, then LEGAL

IF YES and not related to goal, then ILLEGAL

IF NO, then IMPOSSIBLE

TAFEI states (mini-disc player example)

State 0: Mini-disc off; headphones out. Waiting for: on; headphones in; open cover.

State 1: Mini-disc off; headphones in. Waiting for: on; headphones out; open cover.

State 2: Mini-disc on. Waiting for: off; headphones out; open cover.

LUL Ticket Machine

Transition Matrix

From \ To   0  1  2  3  4  5
0           I  I  -  -  -  -
1           L  I  I  -  -  -
2           I  I  L  I  -  -
3           I  I  I  L  L  -
4           I  I  I  I  I  L
5           I  I  I  I  L  L

(L = legal, I = illegal, - = impossible)
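The legal / illegal / impossible rules above can be applied mechanically once the possible transitions and the goal path are listed. The sketch below does this for a cut-down, three-state version of the mini-disc example; the state list, the possible transitions and the goal path are assumptions made for the illustration, not taken from the original analysis.

```python
# Sketch of TAFEI transition-matrix classification.
# States, possible transitions and the goal path are illustrative assumptions.

states = ["0: off, phones out", "1: off, phones in", "2: on"]

# Transitions the device allows (from_state, to_state), by index.
possible = {(0, 1), (1, 0), (1, 2), (2, 1)}

# Transitions needed to achieve the user's goal (listen to a disc).
goal_path = {(0, 1), (1, 2)}

def classify(i: int, j: int) -> str:
    """Return the TAFEI cell label for moving from state i to state j."""
    if (i, j) not in possible:
        return "-"                                  # impossible
    return "L" if (i, j) in goal_path else "I"      # legal vs illegal

# Print the matrix: rows = from-state, columns = to-state.
print("from \\ to  " + "  ".join(str(j) for j in range(len(states))))
for i, name in enumerate(states):
    cells = "  ".join(classify(i, j) for j in range(len(states)))
    print(f"{i:<10} {cells}    ({name})")
```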

Redesigning User Interface Using TAFEI: two examples

Microscope Control: Original Design

Menu driven

Menus accessed by first letter of command

Menus arranged in hierarchy

TAFEI of original design

Problems with original design

Lack of consistency: D = DOS commands; Delete; Data file; Date

Hidden hierarchy: only ‘experts’ could use it

Inappropriate defaults: setting up a scan required ‘correction’ of default settings three or four times

Initial design activity

Observation of non-technology work: cytogeneticists inspecting chromosomes

Developed model of task: hierarchical task analysis

Developed design principles, e.g., cytogeneticists as ‘picture people’, task flow, task mapping

Task Model

Work flows between specific activities

[Diagram: work flows between activities such as patient details, administration, set up, microscope, cell sample, analysis and reporting.]

First “prototype”

TAFEI of first prototype

Second prototype

Final Product

Conclusions

Graphical user interface based on task activity

Recognition not recall

Task flow based on work

Terminology from work, not the computer

Accessible to non-technical users

Gas Analyzer

Results of Redesign

[Figure 9: Total wrong key presses and error screens encountered using the original and re-designed LMS interface. X-axis: LMS version (original vs re-design); y-axis: key presses, 0-50.]

[Figure 12: Time taken to complete the task for the original and re-designed versions of the LMS. X-axis: LMS version; y-axis: time in minutes, 0-2.5.]

Usability

Interactive System Design

[Diagram: iterative design cycle in which studying users produces activity data, analysis produces an activity model, specification leads to building a prototype, and evaluating the prototype produces a usability / error report that drives modifications and enhancements.]

Newman, W.M. and Lamming, M.G., 1995, Interactive System Design, Harlow: Addison-Wesley

User-Centred Design of Systems [Noyes and Baber, 1999]

Who will use the system?
What will the system be used for?
What are the main components of the system?
How will the system be designed?
How well do the users think the system works?
How well does the system really work?
How well does the system under development work?
Can the system be improved?
How can the system be introduced into the workplace?
How will the system be used in the workplace?

Forms of evaluation

Understand use of technology at work

Comparison of products

Comparison with Standards

Comparison with design targets

Evaluation in Design

Summative – at end of design

Formative – during design, e.g.:
Early – set objectives, define requirements, existing solutions
Mid – compare with objectives, settle disputes, prototypes
Late – compare with Standards and competitors

Evaluation in Design

[Diagram: elements include performance factors, organisational factors, context factors, desired performance, relevant methods, evaluation protocol and actual performance.]

Defining usability

“Usability measures: the effectiveness, efficiency and satisfaction with which specified users can achieve goals in a particular environment.” [ISO9241]

Usability measures
Specified users
Goals
Particular environment

Evaluation Process

[Diagram (context of use): the user, task, equipment and environment, together with the product and the goals / intended objectives, determine the outcome of interaction, which is assessed by the usability measures of effectiveness, efficiency and attitude.]
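One way the three measures might be operationalised for a test session is sketched below. The particular formulas (goal completion rate for effectiveness, completion rate per unit time for efficiency, a mean rating for attitude / satisfaction) and the sample data are assumptions made for illustration, not figures prescribed by ISO 9241.

```python
# Sketch of computing usability measures for a small evaluation session.
# The sample data and the exact formulas are illustrative assumptions.

# One record per participant: goals achieved, goals attempted,
# time on task in minutes, satisfaction rating on a 1-7 scale.
sessions = [
    {"achieved": 4, "attempted": 5, "minutes": 12.5, "rating": 5},
    {"achieved": 5, "attempted": 5, "minutes": 9.0,  "rating": 6},
    {"achieved": 3, "attempted": 5, "minutes": 15.0, "rating": 4},
]

def effectiveness(s):
    """Proportion of specified goals achieved."""
    return s["achieved"] / s["attempted"]

def efficiency(s):
    """Effectiveness per unit of time expended."""
    return effectiveness(s) / s["minutes"]

satisfaction = sum(s["rating"] for s in sessions) / len(sessions)

for n, s in enumerate(sessions, start=1):
    print(f"P{n}: effectiveness {effectiveness(s):.0%}, "
          f"efficiency {efficiency(s):.3f} per minute")
print(f"Mean satisfaction rating: {satisfaction:.1f} / 7")
```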

Specified Users

Skills & Knowledge

Product experience

System knowledge

Task experience

Organisational experience

Training

Operating skills

Qualifications

Linguistic ability

Physical attributes

Vision

Hearing

Manual dexterity

Goals & Tasks

Definition of typical tasks in typical work domain

Selection of typical goals for typical users

Particular Environment

Typical domain:
Organisation
Product
System
Task
Operating skills

Evaluation Methods

Analytic methods

Specialist reports

User reports

Observational methods

Specialist reports

Evaluation by Experts

Expertise in:
Product
Domain
Task

Metrics for Specialist report

HEURISTIC EVALUATION

Simple, natural language
User’s language
Minimal memory load
Consistency
Useful feedback
Clear exits
Clear shortcuts
Useful error messages
Minimise user error
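A specialist report from a heuristic evaluation is often just a prioritised list of findings against these heuristics. The sketch below shows one possible way to record such findings; the severity scale and the example findings (drawn loosely from the microscope redesign case above) are assumptions for the illustration.

```python
# Sketch of recording heuristic-evaluation findings.
# Severity scale and example findings are illustrative assumptions.

HEURISTICS = [
    "Simple, natural language", "User's language", "Minimal memory load",
    "Consistency", "Useful feedback", "Clear exits", "Clear shortcuts",
    "Useful error messages", "Minimise user error",
]

# (heuristic violated, where it was seen, severity: 1 cosmetic ... 4 blocker)
findings = [
    ("Consistency", "'D' menu key means DOS commands, Delete, Data file or Date", 3),
    ("Useful feedback", "No confirmation shown after a scan is saved", 2),
]

# Report the most severe problems first.
for heuristic, location, severity in sorted(findings, key=lambda f: -f[2]):
    assert heuristic in HEURISTICS          # guard against typos in the log
    print(f"[severity {severity}] {heuristic}: {location}")
```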

Detecting Problems

Designing for Error

Change the design of the product:
Modes
Interlocks

Change the operation of the product:
Task flow
Consistency
Clarity / Guessability
Simplicity

Designing for Usability

Learnable
Efficient
Acceptable
Flexible
Useful
Enjoyable
Elegant