Usability evaluation Big picture as lead into Heuristic Evaluation and situating think-aloud evaluation

Page 1:

Usability evaluation

Big picture as lead into Heuristic Evaluation

and situating think-aloud evaluation

Page 2:

Overview

• Core: Heuristic evaluation
  – What is it
  – How to do it
• Practice in preparation for homework and Assignment
• Beyond the basics
  – Broader picture of HE

• Homework

Page 3:

Core methods

What do the next graphs tell you?

Page 4:

Technology Transfer of Heuristic Evaluation and Usability Inspection
by Jakob Nielsen, June 27, 1995

http://www.nngroup.com/articles/technology-transfer-of-heuristic-evaluation/

Page 5:

Source as above

Page 6:

What is Heuristic Evaluation

• Heuristic Evaluation:
  – Select a set of design heuristics
  – Recruit experts in HE
  – For each expert:
    • Expert is taken through the tasks
    • Meanwhile a team member records violations of the heuristics
    • Expert reviews all the problems found and rates them
• Combine all expert reports
• Compare this with think-aloud:
  – How is this different?
  – What is similar?

Page 7:

Guidelines (heuristics) for design and evaluation

Page 8:

Heuristic evaluation aka Usability Inspection

and guidelines

Page 9:

About heuristics

Rules of thumb
Guidelines

Page 10:

Example

Copyright MKP. All rights reserved. 10

Page 11:

Aside

• These heuristics/guidelines
  – Are helpful for design
  – And for evaluation

Why is this so?

Page 12:

Core: Use Nielsen's heuristics to do HE

Hard for novices to use effectively

Page 13:

Cockton, Gilbert (2014): Usability Evaluation. In: Soegaard, Mads and Dam, Rikke Friis (eds.). "The Encyclopedia of Human-Computer Interaction, 2nd Ed.". Aarhus, Denmark: The Interaction Design Foundation. Available online at https://www.interaction-design.org/encyclopedia/usability_evaluation.html

Page 14:


Nielsen’s original heuristics

• Visibility of system status
• Match between system and real world
• User control and freedom
• Consistency and standards
• Error prevention
• Recognition rather than recall
• Flexibility and efficiency of use
• Aesthetic and minimalist design
• Help users recognize, diagnose, recover from errors
• Help and documentation

http://www.nngroup.com/articles/ten-usability-heuristics/

Page 15:

• Visibility of system status The system should always keep users informed about what is going on, through appropriate feedback within reasonable time.

• Match between system and the real world The system should speak the users' language, with words, phrases and concepts familiar to the user, rather than system-oriented terms. Follow real-world conventions, making information appear in a natural and logical order.

• User control and freedom Users often choose system functions by mistake and will need a clearly marked "emergency exit" to leave the unwanted state without having to go through an extended dialogue. Support undo and redo.

• Consistency and standards Users should not have to wonder whether different words, situations, or actions mean the same thing. Follow platform conventions.

• Error prevention Even better than good error messages is a careful design which prevents a problem from occurring in the first place. Either eliminate error-prone conditions or check for them and present users with a confirmation option before they commit to the action.

Page 16:

• Recognition rather than recall Minimize the user's memory load by making objects, actions, and options visible. The user should not have to remember information from one part of the dialogue to another. Instructions for use of the system should be visible or easily retrievable whenever appropriate.

• Flexibility and efficiency of use Accelerators -- unseen by the novice user -- may often speed up the interaction for the expert user such that the system can cater to both inexperienced and experienced users. Allow users to tailor frequent actions.

• Aesthetic and minimalist design Dialogues should not contain information which is irrelevant or rarely needed. Every extra unit of information in a dialogue competes with the relevant units of information and diminishes their relative visibility.

• Help users recognize, diagnose, and recover from errors Error messages should be expressed in plain language (no codes), precisely indicate the problem, and constructively suggest a solution.

• Help and documentation Even though it is better if the system can be used without documentation, it may be necessary to provide help and documentation. Any such information should be easy to search, focused on the user's task, list concrete steps to be carried out, and not be too large.

Page 17:

Class activity

Page 18:


Activity: Identify potential trade-offs

• Visibility of system status
• Match between system and real world
• User control and freedom
• Consistency and standards
• Error prevention
• Recognition rather than recall
• Flexibility and efficiency of use
• Aesthetic and minimalist design
• Help users recognize, diagnose, recover from errors
• Help and documentation

Page 19:


Main stages of heuristic evaluation

• 1. Preliminaries
  – Agreed set of heuristics to use;
  – Programming team member overviews the system;
  – And a team member is available throughout;
  – Set of tasks.

Page 20:


Main stages of heuristic evaluation

• 2. Evaluation:
  – Each expert works independently through the UI;
  – A team member records problems, so the expert can simply state them (a single person works well, so they bring all the details together);
  – Also answers any questions (e.g. the expert gets stuck, cannot find how to do a task, or may need help with domain-expertise aspects);
  – Multiple passes (overview, then detailed);
  – Results in a set of identified failures to match the heuristics (part of interface, violated heuristic).

Page 21:


Main stages of heuristic evaluation

• 3. Concluding summary
  – Summarise all the flaws
  – Rate these in terms of severity

Page 22:

Severity ratings
http://www.nngroup.com/articles/how-to-rate-the-severity-of-usability-problems/

• “The severity of a usability problem is a combination of three factors:

• The frequency with which the problem occurs: Is it common or rare?

• The impact of the problem if it occurs: Will it be easy or difficult for the users to overcome?

• The persistence of the problem: Is it a one-time problem that users can overcome once they know about it or will users repeatedly be bothered by the problem?”

Page 23:

Severity ratings
http://www.nngroup.com/articles/how-to-rate-the-severity-of-usability-problems/

• 0 = I don't agree that this is a usability problem at all
• 1 = Cosmetic problem only: need not be fixed unless extra time is available on project
• 2 = Minor usability problem: fixing this should be given low priority
• 3 = Major usability problem: important to fix, so should be given high priority
• 4 = Usability catastrophe: imperative to fix this before product can be released

Page 24:

Summarising the results

Description of problem | Heuristics violated | S1 | S2 | S3
Hard to find the Archive button | Consistency … | 0 | 1 | 2
Scroll bar not visible | Visibility…, Match system and real world | 3 | 2 | 4
User has to remember ID from last screen | Recognition rather than recall | 4 | 4 | 4
Interface uses language that is not meaningful to user – select bundle for student data | Match system and real world | 2 | 2 | 3
Instructions say “Circle all selections” but there are check boxes | Match system and real world | | |
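A summary table like this can be assembled programmatically once the individual ratings are in. A minimal sketch (the problem names and ratings are taken from the example table; summarising with the median is one common choice, not a prescribed part of the method):

```python
from statistics import median

# Severity ratings (0-4) given by evaluators S1, S2, S3 for each problem,
# using the example rows from the table above.
ratings = {
    "Hard to find the Archive button": [0, 1, 2],
    "Scroll bar not visible": [3, 2, 4],
    "User has to remember ID from last screen": [4, 4, 4],
}

# Summarise each problem by its median rating and list the most
# severe first, so the highest-priority fixes appear at the top.
summary = sorted(ratings.items(), key=lambda kv: median(kv[1]), reverse=True)
for problem, scores in summary:
    print(f"{median(scores)}  {problem}  (ratings: {scores})")
```

With the data above, the memory-load problem (all 4s) sorts to the top and the Archive-button problem to the bottom.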

Page 25:

Shows which evaluators found which usability problems in HE of a banking system. Each row is an evaluator (n=19) and each column is a flaw (n=16). Black squares show where the evaluator found the problem. The rows are sorted with the most successful evaluators at the bottom. The columns are sorted with the easiest-to-find problems at the right.

http://www.nngroup.com/articles/how-to-conduct-a-heuristic-evaluation/

What does this graph tell you?
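The double sorting described above can be reproduced on any evaluator-by-problem matrix. A sketch with made-up data (not the actual matrix from the banking study):

```python
# Toy evaluator-by-problem matrix: rows are evaluators, columns are
# problems; 1 means that evaluator found that problem. (Invented data.)
matrix = [
    [0, 1, 0, 1],
    [1, 1, 0, 1],
    [0, 0, 0, 1],
    [1, 1, 1, 1],
]

# Sort rows so the most successful evaluators end up at the bottom...
rows = sorted(matrix, key=sum)

# ...then sort columns so the easiest-to-find problems end up at the right.
col_totals = [sum(row[j] for row in rows) for j in range(len(rows[0]))]
order = sorted(range(len(col_totals)), key=lambda j: col_totals[j])
sorted_matrix = [[row[j] for j in order] for row in rows]

for row in sorted_matrix:
    print(row)
```

The sorted matrix shows the characteristic "staircase": no evaluator finds everything, and only the rightmost problems are found by nearly everyone.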

Page 26:

Give a definition of “easy” and “hard”.
State some reasons a problem may be easy … hard.
Is there any problem that every evaluator found?
What are the main lessons?

http://www.nngroup.com/articles/how-to-conduct-a-heuristic-evaluation/

Page 27:

www.id-book.com

Discount evaluation

• Heuristic evaluation is referred to as discount evaluation when 5 evaluators are used.

• Empirical evidence suggests that on average 5 evaluators identify 75-80% of usability problems.
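The curve behind this claim is usually modelled with Nielsen and Landauer's formula, found(n) = 1 - (1 - L)^n, where L is the average probability that a single evaluator finds a given problem (about 0.31 in their data; a slightly lower L reproduces the 75-80% figure above). A quick sketch:

```python
# Nielsen & Landauer's model for the proportion of usability problems
# found by n evaluators. single_rate (L) is the average chance that one
# evaluator finds a given problem; 0.31 is the value commonly cited
# from their studies.
def problems_found(n, single_rate=0.31):
    return 1 - (1 - single_rate) ** n

for n in (1, 3, 5, 10):
    print(f"{n} evaluators: {problems_found(n):.0%}")
```

The diminishing returns visible in the output are why five evaluators is the usual recommendation: each additional expert mostly re-finds problems already found.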

Page 28:


No. of evaluators & problems

Page 29:

http://www.nngroup.com/articles/how-to-conduct-a-heuristic-evaluation/

Page 30:

Critical to success

• Set of heuristics
• Set of tasks
• Prototype interface
• Experts!!!

Page 31:


Advantages and problems

• Few ethical & practical issues to consider because users are not involved.
• Can be difficult & expensive to find experts.
• Best experts have knowledge of the application domain & users.
• Can find both important and minor problems.
• Best if experts are not the designers.
• Biggest problems:
  – Important problems may get missed;
  – Many trivial problems are often identified;
  – Experts have biases;
  – May encourage tinkering with the interface details in the identified problems.

Page 32:

How would one validate a set of heuristics?

And how would you know about this?

Page 33:

How to select appropriate heuristics?

Context
Validity

Usability (broad versus detailed)

Page 34:

Summary of HE

• No users needed• Heuristics needed• Experts needed (NOT the designers…)• Relatively cheap

Page 35:

Class activity in your groups

• More practice in creating tasks
  – For CUSP, define 2 important abstract tasks
  – For each of these, define 2 concrete tasks
• Conduct HE on CUSP using pods
  – Using Nielsen’s heuristics
  – Using the above 4 tasks
  – As you are getting started, allocate different heuristics to team members to focus on
  – Write down problems as you see them (sketch the CUSP screen and annotate it)
  – Make a table of the problems and your assessment of their severity

Page 36:

About guidelines

Page 37:

H&P Conclusions about guidelines

• Be cautious using guidelines
• Use careful thought and interpretation
• In application, guidelines can conflict and overlap
• Guidelines do not guarantee a good user experience
• Using guidelines does NOT eliminate the need for a fuller UX evaluation


Page 38:

How to Write User-Friendly Content
http://www.usability.gov/how-to-and-tools/methods/writing-for-the-web.html

• Use the words your users use.
• Chunk your content.
• Front-load the important information.
• Use pronouns.
• Use active voice.
• Use short sentences and paragraphs.
• Use bullets and numbered lists.
• Use clear headlines and subheads.
• Use images, diagrams, or multimedia.
• Use white space.

Page 39:

• Use the words your users use. By using keywords that your users use, you will help them understand the copy and will help optimize it for search engines.

• Chunk your content. Chunking makes your content more scannable by breaking it into manageable sections.

• Front-load the important information. Use the journalism model of the “inverted pyramid.” Start with the content that is most important to your audience, and then provide additional details.

• Use pronouns. The user is “you.” The organization or government agency is “we.” This creates cleaner sentence structure and more approachable content.

• Use active voice. “The board proposed the legislation,” not “The regulation was proposed by the board.”

• Use short sentences and paragraphs. The ideal standard is no more than 20 words per sentence, five sentences per paragraph. Use dashes instead of semi-colons or, better yet, break the sentence into two. It is ok to start a sentence with “and,” “but,” or “or” if it makes things clear and brief.

• Use bullets and numbered lists. Don’t limit yourself to using this for long lists: one sentence and two bullets is easier to read than three sentences.

• Use clear headlines and subheads. Questions, especially those with pronouns, are particularly effective.

• Use images, diagrams, or multimedia to visually represent ideas in the content. Videos and images should reinforce the text on your page.

• Use white space. Using white space allows you to reduce noise by visually separating information.
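Some of these rules can be checked mechanically in draft copy. A rough sketch of the 20-words-per-sentence check (the sentence splitting is deliberately naive, and the sample text is invented):

```python
import re

# Flag sentences that exceed the "no more than 20 words" rule of thumb.
# Splitting on .!? is crude (it trips on abbreviations like "e.g."),
# but is good enough for a quick pass over draft web copy.
def long_sentences(text, limit=20):
    sentences = [s.strip() for s in re.split(r"[.!?]+", text) if s.strip()]
    return [s for s in sentences if len(s.split()) > limit]

copy = ("Use active voice. This very long sentence keeps adding words and "
        "clauses and qualifiers until it sails well past the twenty word "
        "limit that the guideline recommends for readable web content today.")
print(long_sentences(copy))
```

Only the second sentence is flagged; a real linter would also need to handle abbreviations, headings, and list items.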

Page 40:

http://guidelines.usability.gov/

• Section 1, Make Action Sequences Clear
• Section 2, Avoid Jargon
• Section 3, Use Familiar Words
• Section 4, Define Acronyms and Abbreviations
• Section 5, Use Abbreviations Sparingly
• Section 6, Use Mixed Case with Prose
• Section 7, Limit the Number of Words and Sentences
• Section 8, Limit Prose Text on Navigation Pages
• Section 9, Use Active Voice
• Section 10, Write Instructions in the Affirmative
• Section 11, Make First Sentences Descriptive

Page 41:

Ben Shneiderman's golden rules for dialogue

• Consistency
  – e.g. location of “quit”
• Short cuts for frequent users
• Informative feedback
  – HTTP Error 404 !!!
• Closure in dialogues
  – (i.e. clear when an action is complete)
• Simple error handling
• Easy reversal of actions
  – undo
• Support internal locus of control
  – Users should feel in control
• Reduce short-term memory load

Page 42:

Bruce Tognazzini
http://www.asktog.com/basics/firstPrinciples.html
• Anticipation
• Autonomy
• Color Blindness
• Consistency
• Defaults
• Efficiency of the User
• Explorable Interfaces
• Fitts' Law
• Human Interface Objects
• Latency Reduction
• Learnability
• Metaphors, Use of
• Protect Users' Work
• Readability
• Track State
• Visible Navigation

Page 43:

Many other guidelines

• Company specific
• Device/OS specific
  – e.g. https://developer.apple.com/library/mac/documentation/userexperience/conceptual/applehiguidelines/Windows/Windows.html
• ISO, ANSI Standards
• National Standards
• Military Standards
• Accessibility Standards

Page 44:

OS X Example

• “mental model your users have should infuse the design … support the user’s mental model by striving to incorporate the following characteristics”:
• Familiarity
• Simplicity
• Availability (functionality available)
• Discoverability
• https://developer.apple.com/library/mac/documentation/userexperience/conceptual/applehiguidelines/Intro/Intro.html#//apple_ref/doc/uid/TP30000894-TP6 (visited 2013)

Page 45:

Class activity

• Take the OS X set and map them to Nielsen’s
  – Where and how do they match
  – And not

Page 46:

Nielsen:
• Visibility of system status
• Match between system and real world
• User control and freedom
• Consistency and standards
• Error prevention
• Recognition rather than recall
• Flexibility and efficiency of use
• Aesthetic and minimalist design
• Help users recognize, diagnose, recover from errors
• Help and documentation

OS X:
• Familiarity
• Simplicity
• Availability (functionality available)
• Discoverability

Page 47:

Homework for next week
• Reading

– Reinecke, K., & Gajos, K. Z. (2014, April). Quantifying visual preferences around the world. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 11-20). ACM.

– Create concept map, including the concepts: methods, key results, links to HE

• Work on Assignment 1: Heuristic evaluation on the e-book
  – Same tasks as for TA
  – Each team member is an expert
  – Using Nielsen’s heuristics
  – Bring a printout of screenshots, annotated to show problems you have found
  – Make a table of the problems and your assessment of their severity