Usability Group@Brighton 1 Usability evaluation Without users - analytical techniques With users - survey and observational techniques


Usability Group@Brighton 1

Usability evaluation

• Without users - analytical techniques

• With users - survey and observational techniques


Usability Group@Brighton 2

Approach 1: Analytical evaluation

• cognitive walkthrough
• guideline-based analysis - in this case, heuristic analysis, but also standards inspection and consistency inspection


Usability Group@Brighton 3

Cognitive Walkthrough


Usability Group@Brighton 4

What is it?

• Evaluators look at the system from the user’s point of view

• They step through user tasks and predict where users will have problems

• They concentrate on learnability


Usability Group@Brighton 5

How do you do it? (1) Preparation:

• Identify users - we can use our personas
• Identify representative tasks - we can use the scenarios
• Per task, describe the correct action sequence - obtainable from the storyboard & site map
• Get a representation to work with - could be paper or rough/finished web pages
• Get evaluators


Usability Group@Brighton 6

How do you do it? (2)

• Evaluators walk through the correct action sequence
• For each action, they indicate whether it is a “success story” or a “failure story”

• They provide evidence for their decision


Usability Group@Brighton 7

Questions for each action

Will the user:
• Expect to have to take this action?
• Notice the control for the action?
• Recognise that the control produces the desired effect?

If the correct action is performed:
• Will progress be apparent?
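The per-action bookkeeping above can be sketched in code. This is a hypothetical sketch, not part of the method itself: the class and field names are invented, and only the four questions come from the slides.

```python
from dataclasses import dataclass, field

# The four walkthrough questions, taken from the slide above.
QUESTIONS = (
    "Expect to have to take this action?",
    "Notice the control for the action?",
    "Recognise that the control produces the desired effect?",
    "Will progress be apparent once the action is performed?",
)

@dataclass
class ActionRecord:
    action: str
    answers: dict = field(default_factory=dict)  # question -> (yes?, evidence)

    def record(self, question: str, yes: bool, evidence: str) -> None:
        self.answers[question] = (yes, evidence)

    @property
    def is_success_story(self) -> bool:
        # A step is a "success story" only if the evaluator has answered
        # all four questions and every answer is a yes.
        return (len(self.answers) == len(QUESTIONS)
                and all(yes for yes, _ in self.answers.values()))

step = ActionRecord("press the Send button")
for q in QUESTIONS:
    step.record(q, True, "button is visible and labelled")
```

Keeping the evidence string alongside each yes/no answer mirrors the slide's requirement that evaluators justify their decision, not just record it.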


Usability Group@Brighton 8

Group walkthrough

• Performed by a mixed team
• Capture information on group displays (like flipcharts)
• Perhaps videotape the whole process


Usability Group@Brighton 9

Expect to have to do this?

On a mobile phone, after entering a phone number, press the Send button.


Usability Group@Brighton 10

Notice the control?

• Raising the window shade


Usability Group@Brighton 11

Recognise the control?

Turning the volume down


Usability Group@Brighton 12

Will progress be apparent?

Is the system doing something?


Usability Group@Brighton 13

Cognitive Walkthrough of removing sound from an animation in PowerPoint

• Apples
• Pears
• Oranges
• Bananas


Usability Group@Brighton 14

Analytical method 2: Heuristic Evaluation


Usability Group@Brighton 15

Heuristic Evaluation

What is it?

“Expert” evaluation method based on general usability principles.

Heuristics = general rules about common properties of usable interfaces.


Usability Group@Brighton 16

How to do it

1. Create something to evaluate

Can be a paper prototype, an active prototype, possibly a site map


Usability Group@Brighton 17

2. Develop materials

Develop a set of tasks for evaluators to attempt - normally scenarios, focussed on crucial or problematic issues.

Ask evaluators to go through the site several times and inspect the various navigation and information elements.


Usability Group@Brighton 18

3. Select Evaluators

Select at least 3-5 evaluators. The more evaluators, the more problems are discovered, but the benefit/cost ratio decreases at about 5 evaluators.

Evaluators should not be associated with the project. Those with user interface or domain expertise find a greater percentage of actual problems (65%) and suggest a greater percentage of improvements than do developers (24%) or non-experts (12%).
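The diminishing-returns claim can be made concrete. Nielsen and Landauer model the proportion of problems found by n independent evaluators as 1 - (1 - λ)^n, where λ is the chance that a single evaluator spots a given problem; λ ≈ 0.31 is their aggregate figure, and the exact value varies per project, so treat the numbers below as an illustrative assumption:

```python
# Proportion of usability problems found by n independent evaluators,
# assuming each evaluator finds any given problem with probability lam.
# lam = 0.31 is Nielsen & Landauer's aggregate figure (an assumption here).
def proportion_found(n: int, lam: float = 0.31) -> float:
    return 1 - (1 - lam) ** n

for n in (1, 3, 5, 10):
    print(f"{n} evaluators -> {proportion_found(n):.0%}")
```

With these numbers, one evaluator finds about 31% of the problems, three find about two thirds, and five find about 84% - which is why adding evaluators beyond five buys relatively little.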


Usability Group@Brighton 19


Usability Group@Brighton 20

4. Carry out evaluation

Evaluators go through the site with the heuristics in mind, at least twice: once for individual pages, once for the site design as a whole.

They may instead go through once for each heuristic - it depends.

Compile notes and write up a report.

Decide on the relative importance of the problems and make a plan for tackling them.
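The compile-and-prioritise step can be sketched as follows. The severity scale (Nielsen's common 0-4 scale) and all the record fields and example findings are assumptions for illustration only:

```python
# Hypothetical sketch: merging evaluators' notes into a prioritised plan.
# Severity: 0 = not a problem ... 4 = usability catastrophe (assumed scale).
findings = [
    {"where": "checkout page", "heuristic": "Error prevention",
     "severity": 4, "note": "basket can be emptied with no confirmation"},
    {"where": "whole site", "heuristic": "Consistency and standards",
     "severity": 2, "note": "links use a non-standard colour"},
    {"where": "search page", "heuristic": "Visibility of system status",
     "severity": 3, "note": "no feedback while results load"},
]

# Tackle the most severe problems first.
plan = sorted(findings, key=lambda f: f["severity"], reverse=True)
report = [f"[{f['severity']}] {f['where']}: {f['note']}" for f in plan]
```

Sorting by an agreed severity rating is one simple way to turn individual notes into the plan the slide asks for; teams often also merge duplicate findings before ranking.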


Usability Group@Brighton 21

Which heuristics to use?

• Many lists exist
• Important that the list is not too long
• Nielsen’s list of 10 heuristics is widely used
• Keith Instone (handouts) gives examples of how to use Nielsen’s heuristics for Web designs


Usability Group@Brighton 22

Nielsen’s Heuristics

Visibility of system status

The system should always keep users informed about what is going on, through appropriate feedback within reasonable time.
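As a code-level illustration of this heuristic - the function, file names, and reporting style are invented for the example:

```python
# Hypothetical sketch: a slow operation that keeps the user informed,
# rather than going silent until it finishes.
def copy_files(filenames, report=print):
    total = len(filenames)
    for i, name in enumerate(filenames, start=1):
        # ... real copying work would happen here ...
        report(f"Copying {i} of {total}: {name}")  # visible system status
    report("Done.")

messages = []
copy_files(["a.txt", "b.txt", "c.txt"], report=messages.append)
```

The progress line answers the user's "is the system doing something?" question from the walkthrough slides as well: feedback arrives during the work, not only at the end.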


Usability Group@Brighton 23

Match between system & real world


Usability Group@Brighton 24

User control and freedom

Users often choose system functions by mistake and will need a clearly marked "emergency exit" to leave the unwanted state without having to go through an extended dialogue.


Usability Group@Brighton 25

Consistency and standards

Users should not have to wonder whether different words, situations, or actions mean the same thing.

Follow platform conventions, e.g. avoid custom link colours

From Adobe Acrobat Reader


Usability Group@Brighton 26

Error prevention

Even better than good error messages is a careful design which prevents a problem from occurring in the first place.
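In code, the same idea means constraining input so the error cannot occur, instead of detecting and reporting it afterwards. The shirt-size example is invented for illustration:

```python
# Hypothetical sketch: offer a fixed set of choices (as a dropdown would)
# so an invalid size can never be entered, rather than validating free text.
VALID_SIZES = ("S", "M", "L", "XL")

def size_options():
    # The UI renders these as a selection control; by construction,
    # whatever the user picks is already valid.
    return VALID_SIZES

def order_shirt(size: str) -> str:
    if size not in VALID_SIZES:  # defensive check, should be unreachable
        raise ValueError(f"size must be one of {VALID_SIZES}")
    return f"ordered one {size} shirt"
```

A dropdown of valid sizes prevents the error; a free-text field with a validation message merely reports it - the heuristic prefers the former.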


Usability Group@Brighton 27

Recognition rather than recall

Make objects, actions, and options visible. The user should not have to remember information from one part of the dialogue to another. Instructions for use of the system should be visible or easily retrievable whenever appropriate.


Usability Group@Brighton 28

Flexibility and efficiency of use

Accelerators may often speed up the interaction for the expert user such that the system can cater to both inexperienced and experienced users.

Allowing bookmarks gives efficiency, as do other browser functions


Usability Group@Brighton 29

Aesthetic and minimalist design

A graphical counter-example:

But on the Web, minimalist design also applies to text.


Usability Group@Brighton 30

Help users recognise, diagnose and recover from errors

Error messages should be expressed in plain language (no codes), precisely indicate the problem, and constructively suggest a solution.
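A code-level sketch of the difference; the file name, sizes, and message wording are invented for the example:

```python
# Hypothetical sketch: the same failure reported two ways.
bad = "Error 0x2F: operation failed"  # code only; no problem, no solution

def upload_error(filename: str, size_mb: int, limit_mb: int) -> str:
    # Plain language, precise problem, constructive suggestion.
    return (f'"{filename}" is {size_mb} MB, but the upload limit is '
            f"{limit_mb} MB. Try compressing it or splitting it into "
            f"smaller files.")

good = upload_error("holiday.mov", 120, 50)
```

The second message satisfies all three requirements from the slide: no code, a precise statement of the problem, and a concrete way out.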


Usability Group@Brighton 31

Help and documentation

Even though it is better if the system can be used without documentation, it may be necessary to provide help and documentation. Any such information should be easy to search, focused on the user's task, list concrete steps to be carried out, and not be too large.