
Page 1: Ux tuneup

Tri UPA Workshop

Presented by Carol Barnum, Usability Center @ Southern Polytechnic

UX Tuneup

Page 2: Ux tuneup

How’d you get here?

Self-taught?
Read "the book"?
Educated?
Trained?
Some other way?

Page 3: Ux tuneup

Workshop agenda

Heuristic evaluation & usability testing

What still works
What needs a tuneup

Page 4: Ux tuneup

Heuristic evaluation

tradition to today

Morning focus

Page 5: Ux tuneup

Heuristic evaluation/Expert review

UPA survey says . . .

Survey year    % of respondents
2007           77%
2009           74%
2011           75%

Page 6: Ux tuneup

Why so popular?


Fast
Cheap
Easy
Effective
Convenient

Page 7: Ux tuneup
Page 8: Ux tuneup


"It can often be more expensive and difficult to find 3-5 usability professionals as it is to test 3-5 users."

Jeff Sauro, "What's the difference between a Heuristic Evaluation and a Cognitive Walkthrough?" Measuring Usability, Aug. 2, 2011

Page 9: Ux tuneup

Tradition: Nielsen's 10 heuristics

1. Visibility of system status
2. Match between system and real world
3. User control and freedom
4. Consistency and standards
5. Error prevention
6. Recognition rather than recall
7. Flexibility and efficiency of use
8. Aesthetic and minimalist design
9. Help users recognize, diagnose, and recover from errors
10. Help and documentation

J. Nielsen and R. Mack, eds., Usability Inspection Methods, 1994

Page 10: Ux tuneup

The Nielsen Method

• Small set of evaluators
  – 3 to 5 is the optimal cost-benefit
  – A single evaluator finds about 35% of problems
• Each evaluator inspects alone
  – 1 to 2 hours
  – Several passes through the interface
  – Inspection based on heuristics
  – If evaluators are not SMEs, hints can be given
  – Evaluator writes notes or a report
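The 3-to-5 guideline follows from Nielsen and Landauer's problem-discovery curve: n independent evaluators are expected to find 1 - (1 - p)^n of the problems, where p is the single-evaluator rate (about 35%, per the slide). A minimal Python sketch, purely illustrative:

    def share_found(n, p=0.35):
        """Expected share of problems found by n independent evaluators,
        assuming each finds a fixed share p of all problems."""
        return 1 - (1 - p) ** n

    for n in range(1, 8):
        print(f"{n} evaluator(s): {share_found(n):.0%} of problems")
    # 1 -> 35%, 3 -> ~73%, 5 -> ~88%: returns diminish quickly,
    # which is the cost-benefit case for stopping at 3 to 5.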

Page 11: Ux tuneup

The Nielsen Method

After individual evaluations are done, evaluators:
– Talk to each other, often with a facilitator
– Share reports/notes
– Collate findings
– Rank issues by severity
– Write a compiled report
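As an illustration of the collate-and-rank step, findings merged from all evaluators can be sorted on the common 0 (cosmetic) to 4 (catastrophic) severity scale. The record fields below are hypothetical, not a prescribed report format:

    # Hypothetical findings merged from several evaluators' notes.
    findings = [
        {"heuristic": "Error prevention", "issue": "Delete has no confirmation", "severity": 4},
        {"heuristic": "Visibility of system status", "issue": "No upload progress shown", "severity": 3},
        {"heuristic": "Consistency and standards", "issue": "Same action has two labels", "severity": 2},
    ]

    # Rank issues by severity, worst first, for the compiled report.
    for f in sorted(findings, key=lambda f: f["severity"], reverse=True):
        print(f"[severity {f['severity']}] {f['heuristic']}: {f['issue']}")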

Page 12: Ux tuneup

Nielsen variations on the method

• Supply a typical usage scenario, listing the steps a user would take to perform tasks
• Hold a design debrief with designers
• Use brainstorming to focus on possible solutions
• Include positive findings

Page 13: Ux tuneup

And the method is called…

"Discount Usability Engineering"

Page 14: Ux tuneup

So, what do you get?

• A list of potential problems
• Also (sometimes) the positive findings
• Tied to a heuristic or rule of practice
• A ranking of findings by severity
• (Sometimes) recommendations for fixing problems
• A report of findings

Page 15: Ux tuneup

What do you do?

Well, I always . . . .

Page 16: Ux tuneup

What do I do?

Well, I used to follow Nielsen

Page 17: Ux tuneup

Phase 1: Nielsen is my bible

Page 18: Ux tuneup

CUE-4, Hotel Pennsylvania, 2003

• Comparative evaluation of a reservation process
• 17 teams
  – 8 did expert review/heuristic evaluation
  – Only 1 team used Nielsen's heuristics
• Rolf's conclusions
  – Findings "overly sensitive": too many to manage
  – Need to improve classification schemes
  – Need more precise and usable recommendations

CHI 2003. Results available at Rolf Molich's DialogDesign: http://www.dialogdesign.dk/CUE-4.htm

Page 19: Ux tuneup

usability.spsu.edu UPA 2011


Page 20: Ux tuneup

After that, what did I do?

I got a little older and wiser

Page 21: Ux tuneup

Phase 2: Loose interpretation of Nielsen

Dropped his heuristics
Kept severity ratings
Added screen captures
Added callouts
Added recommendations

Page 22: Ux tuneup

Sample row from the findings table (H = Hyperspace; C = Cardiac Arrest; S = Shock):

Finding: Objectives/goals for the modules not clear (applies to H, C, and S)

Description of problem:
• Unclear reason content is being presented
• Lack of conciseness of presentation
• Definitions are required to work with the module/content
• Evaluation criteria and methods unclear
• Direct tie between content and assessment measure unclear
• Sequence of presentation does not follow logically from introduction
• Quizzes do not challenge users

Recommendation:
• Develop a consistent structure that defines what's noted in the bulleted points
• Avoid generic statements that don't focus users on what they will be accomplishing
• Advise that there is an assessment used for evaluation and indicate if it's at the end or interspersed in the module
• Connect ideas in the goals and objectives with outcomes in the assessment
• Follow the order of presentation defined at the beginning
• Develop interesting and challenging quiz questions
• Re-frame goals/objectives at the end of the module

Severity rating: 3

Hyperspace, Shock, and Cardiac Arrest all require more clearly defined goals and objectives.

Page 23: Ux tuneup


Page 24: Ux tuneup
Page 25: Ux tuneup

Then, what did I do?

I broke free!

Page 26: Ux tuneup

Findings stated in our terminology
Screen captures

Phase 3: I did it my way

Page 27: Ux tuneup


Page 28: Ux tuneup


A unique password between 6 and 16 characters was required. “Unique” is not defined. This is a problem with terminology.

Usually, passwords must be a combination of letters and numbers for higher security. An all-letter password—Heuristics—was accepted. A dictionary term is not a secure password and contradicts accepted conventions. The ability to input a dictionary word may be a component of trust for users.

The username and security question answer were rejected on submit.

This result is confusing as the name was confirmed on the previous screen. This relates to establishing conventions for the form of names/passwords on the input screen. Input formats need to be defined on the relevant page.Differences in spelling “username” vs. “user name” are subtle but are consistency issues.

The red banner is confusing as the user chose the gold (Free Edition). This is a consistency issue.

Page 29: Ux tuneup

User experience emerges in reviewer comments. . .

Page 30: Ux tuneup

Reviewer comment: I wanna click on the map, not the pulldown. WAH! Also, I've got no idea what the text on this page means.

Page 31: Ux tuneup

Why not tell the user’s story?!

Page 32: Ux tuneup

Strategy: Persona-based scenario review

• Ginny Redish and Dana Chisnell
• AARP report: 58 pages, 50 websites
  – Two personas: Edith and Matthew
  – Evaluators "channel" the user via persona and tasks/goals
  – The users' stories emerge

Available from Redish & Associates: http://www.redish.net/images/stories/PDF/AARP-50Sites.pdf

Page 33: Ux tuneup

While the clickable area is very large in the navigation blocks, Edith expected to click on the labels, so she was surprised when the menu appeared.

When trying to click an item in the menu above, Edith had trouble selecting because her mouse hovered close enough to the choices below to open that menu, obscuring the item she wanted to click.

Chisnell and Redish, Designing Web Sites for Older Adults: Expert Review of Usability for Older Adults at 50 Web Sites (for AARP)

Page 34: Ux tuneup

Engage in conversation with your reader

“Every use of every website is a conversation started by the site visitor.”

Ginny Redish, Letting Go of the Words, Morgan Kaufmann, 2007 (new edition coming)

Page 35: Ux tuneup

Tell the story of your user’s experience

“Stories organize facts in memorable ways.”

Whitney Quesenbery and Kevin Brooks, Storytelling for User Experience, Rosenfeld Media, 2010

Page 36: Ux tuneup

Options for report deliverables

No deliverable
Quick findings
Presentation
Detailed report

Page 37: Ux tuneup

Steve Krug's approach

• All sites have usability problems
• All organizations have limited resources
• You'll always find more problems than you have resources to fix
• It's easy to get distracted by less serious problems that are easier to solve . . .
• Which means that the worst ones often persist
• Therefore, you have to be intensely focused on fixing the most serious problems first

Rocket Surgery Made Easy, New Riders, 2010

Page 38: Ux tuneup


“Focus ruthlessly on a small number of the most important problems.”

Steve Krug

Page 39: Ux tuneup

"big honkin' report"

Page 40: Ux tuneup
Page 41: Ux tuneup

Lighten the load: start with Quesenbery's 5 E's

• Effective
• Efficient
• Engaging
• Error-tolerant
• Easy to learn

Whitney Quesenbery, wqusability.com

Page 42: Ux tuneup

Customize your heuristics

Page 43: Ux tuneup

Walk in your user’s shoes

Page 44: Ux tuneup

Your turn. Expert review.

• Scenario: You want to do user testing in Atlanta.
  – You heard there might be a lab at Southern Polytechnic State University, www.spsu.edu
  – See if you can find whether they have a lab and can rent the lab to you
• Your task for this review:
  – Work independently
  – Jot down findings
  – Then meet with a few others to organize findings
  – Discuss how you will report the top findings

Page 45: Ux tuneup

Lunch

Page 46: Ux tuneup

Usability testing

small studies with a twist

Afternoon focus

Page 47: Ux tuneup

82% do it

UPA survey says . . .

Page 48: Ux tuneup

• Lab
• Informal
• Contextual
• Remote
• Big
• Little
• In between

How do you do it?

Page 49: Ux tuneup

What’s a small study good for?

Research
Exploration
Testing prototypes
Understanding users
Answering arguments

Page 50: Ux tuneup

Why don't we always test?

Time
Cost
Ignorance
Can't get users
Agile!!!!

Page 51: Ux tuneup
Page 52: Ux tuneup

• First Fridays
• RITE method
• 5-second tests
• Man on the street (or coffee shop)

Faster, cheaper ways

Page 53: Ux tuneup

First Fridays

• http://www.howto.gov/web-content/usability/first-fridays

Page 54: Ux tuneup

RITE method

• Rapid Iterative Testing and Evaluation
• Developed by Microsoft's Game Studios
• Requires full team commitment
  – Observe
  – Analyze findings immediately
  – Change immediately
  – Retest
  – Do it again

Page 55: Ux tuneup

• Do it yourself
• http://fivesecondtest.com/

5-second test

Page 56: Ux tuneup

• Do it yourself
• http://www.usertesting.com

Remote testing

Page 57: Ux tuneup

Goals
Persona(s)
Tasks/scenarios
Protocol
Assessment

Basics of testing

Page 58: Ux tuneup

What’s in a day?

Page 59: Ux tuneup

Your turn. Option 1

• Goal: ease of use for finding an online graduate program that supports UX interests
• Create post-task questions
• Select one person in your group to be the user
  – User task: search for an online program in UX or a related field at www.spsu.edu
  – What are the requirements for admission?
  – What are the fees?
  – What is the next application deadline?
• Observers take notes
• Discuss findings
• Determine top findings

Page 60: Ux tuneup

• New device for mobile phone user
• Create a few tasks
• Write a few post-task questions
• Select a "new" user to be participant
• Observers take notes
• Discuss findings
• Determine top findings

Your turn. Option 2

Page 61: Ux tuneup

• Create your own
• Use SUS
• Use Product Reaction Cards
• Other?

Post-test feedback mechanisms

Page 62: Ux tuneup

Create your own

Page 63: Ux tuneup

Q 1

Q 2

Q 3

Let’s write some questions

Page 64: Ux tuneup

System Usability Scale
Each statement is rated from Strongly Disagree to Strongly Agree.

1. I think that I would like to use this website frequently.

2. I found this website unnecessarily complex.

3. I thought this website was easy to use.

4. I think that I would need assistance to be able to use this website.

5. I found the various functions in this website were well integrated.

6. I thought there was too much inconsistency in this website.

7. I would imagine that most people would learn to use this website very quickly.

8. I found this website very cumbersome/awkward to use.

9. I felt very confident using this website.

10. I needed to learn a lot of things before I could get going with this website.

This questionnaire is based on the System Usability Scale (SUS), which was developed by John Brooke while working at Digital Equipment Corporation. © Digital Equipment Corporation, 1986.
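The slide shows the questions only; Brooke's standard scoring works like this: odd-numbered items contribute (response - 1), even-numbered items contribute (5 - response), and the sum of the ten contributions is multiplied by 2.5 to yield a 0-100 score. A minimal sketch:

    def sus_score(responses):
        """responses: ten answers in questionnaire order, each from
        1 (Strongly Disagree) to 5 (Strongly Agree)."""
        assert len(responses) == 10 and all(1 <= r <= 5 for r in responses)
        raw = sum((r - 1) if i % 2 == 0 else (5 - r)  # odd vs. even items
                  for i, r in enumerate(responses))
        return raw * 2.5  # raw sum is 0-40; scale to 0-100

    print(sus_score([4, 2, 5, 1, 4, 2, 5, 1, 4, 2]))  # -> 85.0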

Page 65: Ux tuneup

What’s in the cards?

Page 66: Ux tuneup

Microsoft creates desirability toolkit

1. Faces Questionnaire (6 faces)

2. Product Reaction Cards (118 cards)

Page 67: Ux tuneup

How to deal the cards

• Spread them out on a table
• Instruct the user to
  – walk along the table and pick up cards that express the user's experience
  – share the meaning of the cards
  – the user's story emerges
• In remote testing, provide a table or Excel spreadsheet
  – User highlights selections
  – Explains choices
• Collate the results in clusters of similar/same cards
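For the collation step, here is a minimal sketch of clustering card selections across participants; the participants and picks below are invented for illustration:

    from collections import Counter

    # Each inner list holds the cards one participant picked up.
    selections = [
        ["Easy-to-use", "Fast", "Confusing"],
        ["Fast", "Useful", "Easy-to-use"],
        ["Easy-to-use", "Relevant", "Slow"],
    ]

    # Cluster identical cards and report the most-chosen first.
    tally = Counter(card for picks in selections for card in picks)
    for card, count in tally.most_common():
        print(f"{card}: chosen by {count} participant(s)")

Repeated picks ("Easy-to-use" three times here) become the themes reported from the study, as in the weather-site example later in the deck.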

Page 68: Ux tuneup
Page 69: Ux tuneup
Page 70: Ux tuneup

[Chart: positive/negative reaction-card counts for 3 TV weather websites. Station A: 26 positive, 13 negative; Station B: 39 positive, 5 negative; Station C: 24 positive, 17 negative.]

Page 71: Ux tuneup

Repeated positive card selections focused on ease of use, relevance, and speed:

Easy-to-use
Helpful
Straightforward
Fast
Relevant
Reliable
Useful

Page 72: Ux tuneup

Let’s try this

Page 73: Ux tuneup

Why do the most serious usability problems we uncover often go unfixed?

“But the light bulb has to want to change”

Steve Krug and Caroline Jarrett, #upa2012, Las Vegas

Page 74: Ux tuneup

Survey says…

• Legal department objected
• Disagreements emerged later
• Other events intervened before change could happen
• Technical team said it couldn't be done
• Required too big a change to a business process
• Team did not have enough power to make it happen
• No effective decision maker
• Too much else to do
• Not enough time
• Deferred until next major update/redesign
• Not enough resources
• Conflicted with decision maker's belief or opinion

(Counts show the number of times each reason was chosen, from 131 total usable responses.)

Page 75: Ux tuneup

Steve’s view: You can’t fix everything

© 2001 Steve Krug

Problems you can find with just a few test participants

Problems you have the resources to fix

Page 76: Ux tuneup

Jarrett/Krug theme: Do basic UX better

• Do testing earlier
• Make stakeholders watch the sessions
• Present results better
  – More explanations
  – Use video clips

Page 77: Ux tuneup

The one-two punch

Page 78: Ux tuneup

• What’s it good for?• When do you do it?• What do you do with the results?

Expert review

Page 79: Ux tuneup

• What’s it good for?• When do you do it?• What do you do with the results?

User testing

Page 80: Ux tuneup

Happiness is . . . using both methods

Page 81: Ux tuneup

Read the book.
Visit the website: www.mkp.com/testingessentials
Contact me: [email protected]

Page 82: Ux tuneup

Image credits

• Slides 1, 3: Jarrod Clark, optimisticjourney.com
• Slide 2: suitcasesandsippycups.com
• Slides 4, 46: lifehack.org
• Slide 7: smartevos.com
• Slide 13: thecampuscompanion.com
• Slides 15, 16: lkgh.today.msnbc.com
• Slide 17: estdevonmethodists.org.uk
• Slide 21: groundnotes.wordpress.com
• Slide 26: musicalstewdailywordpress.com
• Slide 29: spinsucks.com
• Slide 31: uncycloped.wikea.com
• Slide 39: momentsofserenitywordpress.com
• Slide 40: godaddyalternative.com
• Slide 42: thebolditalic.com
• Slide 43: en.wikipedia.com
• Slide 51: ourstage.com
• Slide 57: www.getelastic.com
• Slide 58: blogs.gifts.com
• Slide 62: psychologyface.com
• Slide 68: tricksandpranks.com
• Slide 77: gov.cbia.com
• Slide 80: fanpop.com