Tri UPA Workshop
Presented by Carol Barnum, Usability Center @ Southern Polytechnic
UX Tuneup
How’d you get here?
Self-taught? Read “the book”? Educated? Trained? Some other way?
Workshop agenda
Heuristic evaluation & usability testing
What still works
What needs a tuneup
Heuristic evaluation
tradition to today
Morning focus
Heuristic evaluation/Expert review
UPA survey says . . .
% of respondents by survey year
2007: 77%
2009: 74%
2011: 75%
Why so popular?
Fast Cheap
Easy Effective
Convenient
“It can often be more expensive and difficult to find 3-5 usability professionals as it is to test 3-5 users.”Jeff Sauro, “What’s the difference between a Heuristic Evaluation and a Cognitive Walkthrough?” Measuring Usability, Aug. 2, 2011
1. Visibility of system status
2. Match between system and real world
3. User control and freedom
4. Consistency and standards
5. Error prevention
6. Recognition rather than recall
7. Flexibility and efficiency of use
8. Aesthetic and minimalist design
9. Help users recognize, diagnose, and recover from errors
10. Help and documentation
J. Nielsen and R. Mack, eds. Usability Inspection Methods, 1994
Tradition—Nielsen’s 10 heuristics
• Small set of evaluators
  – 3 to 5 is the optimal cost-benefit
  – A single evaluator finds about 35% of problems (see the estimate below)
• Each evaluator inspects alone
  – 1 to 2 hours
  – Several passes through the interface
  – Inspection based on heuristics
  – If evaluators are not SMEs, hints can be given
  – Evaluator writes notes or a report
The Nielsen Method
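The 3-to-5 guideline follows from Nielsen's problem-discovery model: if each evaluator independently finds about 35% of the problems, the expected share found by n evaluators is 1 - (1 - 0.35)^n. A minimal sketch of that estimate (the 35% rate is the rule-of-thumb figure quoted above, not data from this workshop):

```python
# A sketch of the problem-discovery estimate behind the 3-to-5 guideline.
# Assumes each evaluator independently finds about 35% of the problems
# (the rule-of-thumb rate cited above, not data from this workshop).

def share_found(n_evaluators: int, rate: float = 0.35) -> float:
    """Expected proportion of usability problems found by n evaluators."""
    return 1 - (1 - rate) ** n_evaluators

for n in range(1, 6):
    print(f"{n} evaluator(s): ~{share_found(n):.0%} of problems found")
# Roughly 35% with one evaluator, ~73% with three, ~88% with five,
# which is why adding evaluators beyond five pays off less and less.
```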
After individual evaluations are done, evaluators:
– Talk to each other, often with a facilitator
– Share reports/notes
– Collate findings
– Rank issues by severity
– Write compiled report
The Nielsen Method
• Supply a typical usage scenario, listing the steps a user would take to perform tasks
• Hold a design debrief with designers
• Use brainstorming to focus on possible solutions
• Include positive findings
Nielsen’s variations on the method
And the method is called…
“Discount Usability Engineering“
So, what do you get?
• A list of potential problems
• Also (sometimes) the positive findings
• Tied to a heuristic or rule of practice
• A ranking of findings by severity
• (Sometimes) recommendations for fixing problems
• A report of findings
What do you do?
Well, I always . . . .
What do I do?
Well, I used to follow Nielsen
Phase 1: Nielsen is my bible
• Comparative evaluation of the reservation process
• 17 teams
  – 8 did expert review/heuristic evaluation
  – Only 1 team used Nielsen’s heuristics
• Rolf’s conclusions
  – Findings “overly sensitive”—too many to manage
  – Need to improve classification schemes
  – Need more precise and usable recommendations
CHI 2003Results available at Rolf Molich’s DialogDesign http://www.dialogdesign.dk/CUE-4.htm
CUE 4 Hotel Pennsylvania 2003
After that, what did I do?
I got a little older and wiser
Phase 2: Loose interpretation of Nielsen
Dropped his heuristics
Kept severity ratings
Added screen captures
Added callouts
Added recommendations
Finding: Objectives/goals for the modules are not clear
Description of problem:
– Unclear reason content is being presented
– Lack of conciseness of presentation
– Definitions are required to work with the module/content
– Evaluation criteria and methods unclear
– Direct tie between content and assessment measure unclear
– Sequence of presentation does not follow logically from introduction
– Quizzes do not challenge users
Recommendation:
– Develop a consistent structure that defines what’s noted in the points above
– Avoid generic statements that don’t focus users on what they will be accomplishing
– Advise that there is an assessment used for evaluation and indicate if it’s at the end or interspersed in the module
– Connect ideas in the goals and objectives with outcomes in the assessment
– Follow the order of presentation defined at the beginning
– Develop interesting and challenging quiz questions
– Re-frame goals/objectives at the end of the module
Applies to: H, C, S (H = Hyperspace; C = Cardiac Arrest; S = Shock)
Severity rating: 3
Hyperspace, Shock, and Cardiac Arrest all require more clearly defined goals and objectives.
Then, what did I do?
I broke free!
Findings stated in our terminology
Screen captures
Phase 3: I did it my way
A unique password between 6 and 16 characters was required. “Unique” is not defined. This is a problem with terminology.
Usually, passwords must be a combination of letters and numbers for higher security. An all-letter password—Heuristics—was accepted. A dictionary term is not a secure password and contradicts accepted conventions. The ability to input a dictionary word may be a component of trust for users.
The username and security question answer were rejected on submit.
This result is confusing as the name was confirmed on the previous screen. This relates to establishing conventions for the form of names/passwords on the input screen. Input formats need to be defined on the relevant page.Differences in spelling “username” vs. “user name” are subtle but are consistency issues.
The red banner is confusing as the user chose the gold (Free Edition). This is a consistency issue.
User experience emerges in reviewer comments. . .
Reviewer comment: I wanna click on the map, not the pulldown. WAH! Also, I’ve got no idea what the text on this page means.
Why not tell the user’s story?!
• Ginny Redish and Dana Chisnell
• AARP report—58 pages, 50 websites
  – Two personas—Edith and Matthew
  – Evaluators “channel” the user via persona and tasks/goals
  – The users’ stories emerge
Available from Redish & Associates: http://www.redish.net/images/stories/PDF/AARP-50Sites.pdf
Strategy—Persona-based scenario review
While the clickable area is very large in the navigation blocks, Edith expected to click on the labels, so she was surprised when the menu appeared
When trying to click an item in the menu above, Edith had trouble selecting because her mouse hovered close enough to the choices below to open that menu, obscuring the item she wanted to click
Chisnell and Redish, Designing Web Sites for Older Adults: Expert Review of Usability for Older Adults at 50 Web Sites (for AARP)
Engage in conversation with your reader
“Every use of every website is a conversation started by the site visitor.”
Ginny RedishLetting Go of the WordsMorgan Kaufmann, 2007 (new edition coming)
Tell the story of your user’s experience
“Stories organize facts in memorable ways.”
Whitney Quesenbery and Kevin BrooksStorytelling for User ExperienceRosenfeld Media 2010
No deliverable
Quick findings
Presentation
Detailed report
Options for report deliverables
• All sites have usability problems
• All organizations have limited resources
• You’ll always find more problems than you have resources to fix
• It’s easy to get distracted by less serious problems that are easier to solve . . .
• Which means that the worst ones often persist
• Therefore, you have to be intensely focused on fixing the most serious problems first
Rocket Surgery Made Easy, New Riders, 2010
Steve Krug’s approach
“Focus ruthlessly on a small number of the most important problems.”
Steve Krug
“big honkin’ report“
• Effective
• Efficient
• Engaging
• Error-tolerant
• Easy to learn
Whitney Quesenbery, wqusability.com
Lighten the load: start with Quesenbery’s 5 E’s
Customize your heuristics
Walk in your user’s shoes
• Scenario: You want to do user testing in Atlanta.
  – You heard there might be a lab at Southern Polytechnic State University, www.spsu.edu
  – See if you can find whether they have a lab and can rent the lab to you
• Your task for this review:
  – Work independently
  – Jot down findings
  – Then meet with a few others to organize findings
  – Discuss how you will report the top findings
Your turn. Expert review.
Lunch
Usability testing
small studies with a twist
Afternoon focus
82% do it
UPA survey says . . .
• Lab
• Informal
• Contextual
• Remote
• Big
• Little
• In between
How do you do it?
What’s a small study good for?
Research Exploration
Testing prototypes
Understanding users
Answering arguments
Why don’t we always test?
Time
Cost
Ignorance
Can’t get users
Agile!!!!
• First Fridays
• RITE method
• 5-second tests
• Man on the street (or coffee shop)
Faster, cheaper ways
First Fridays
• http://www.howto.gov/web-content/usability/first-fridays
• Rapid iterative testing and evaluation
• Developed by Microsoft’s Game Studios
• Requires full team commitment
  – Observe
  – Analyze findings immediately
  – Change immediately
  – Retest
  – Do it again
RITE method
Goals
Persona(s)
Tasks/scenarios
Protocol
Assessment
Basics of testing
What’s in a day?
• Goal—ease of use for finding an online graduate program that supports UX interests
• Create post-task questions
• Select one person in your group to be the user
  – User task: search for an online program in UX or related field at www.spsu.edu
  – What are the requirements for admission?
  – What are the fees?
  – What is the next application deadline?
• Observers take notes
• Discuss findings
• Determine top findings
Your turn. Option 1
• New device for mobile phone user
• Create a few tasks
• Write a few post-task questions
• Select a “new” user to be participant
• Observers take notes
• Discuss findings
• Determine top findings
Your turn. Option 2
• Create your own
• Use SUS
• Use Product Reaction Cards
• Other?
Post-test feedback mechanisms
Create your own
Q 1
Q 2
Q 3
Let’s write some questions
System Usability Scale
(Each statement is rated from Strongly Disagree to Strongly Agree)
1. I think that I would like to use this website frequently.
2. I found this website unnecessarily complex.
3. I thought this website was easy to use.
4. I think that I would need assistance to be able to use this website.
5. I found the various functions in this website were well integrated.
6. I thought there was too much inconsistency in this website.
7. I would imagine that most people would learn to use this website very quickly.
8. I found this website very cumbersome/awkward to use.
9. I felt very confident using this website.
10. I needed to learn a lot of things before I could get going with this website.
This questionnaire is based on the System Usability Scale (SUS), which was developed by John Brooke while working at Digital Equipment Corporation. © Digital Equipment Corporation, 1986.
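To turn the ten responses into the familiar 0-100 score, standard SUS scoring counts the response minus 1 for odd-numbered (positively worded) items, 5 minus the response for even-numbered items, and multiplies the sum by 2.5. A minimal scoring sketch, using hypothetical responses:

```python
# A minimal SUS scoring sketch. Assumes the ten items above, answered
# on the standard 1-5 scale (1 = Strongly Disagree, 5 = Strongly Agree).

def sus_score(responses):
    """Return the 0-100 SUS score for ten 1-5 responses, in item order."""
    if len(responses) != 10:
        raise ValueError("SUS needs exactly 10 responses")
    total = 0
    for item, r in enumerate(responses, start=1):
        # Odd-numbered items are positively worded, even-numbered negatively.
        total += (r - 1) if item % 2 else (5 - r)
    return total * 2.5

# Hypothetical participant:
print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # 85.0
```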
What’s in the cards?
Microsoft creates desirability toolkit
1. Faces Questionnaire (6 faces)
2. Product Reaction Cards (118 cards)
• Spread them out on a table
• Instruct the user to
  – Walk along the table and pick up cards that express the user’s experience
  – Share the meaning of the cards
  – The user’s story emerges
• In remote testing, provide a table or Excel spreadsheet
  – User highlights selections
  – Explains choices
• Collate the results in clusters of similar/same cards (see the tallying sketch below)
How to deal the cards
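Collating the chosen cards, the last step above, usually comes down to counting how often each card was picked and splitting the tally into positive and negative terms. A hypothetical sketch; the card names and the positive/negative split are illustrative, not the full 118-card set from the Microsoft toolkit:

```python
# A hypothetical collation sketch for Product Reaction Card selections.
# The card names and the positive/negative split below are illustrative;
# they are not the full 118-card set from the Microsoft toolkit.
from collections import Counter

POSITIVE = {"Easy-to-use", "Helpful", "Straightforward", "Fast",
            "Relevant", "Reliable", "Useful"}

def tally(selections_per_user):
    """Count how often each card was chosen, plus the positive/negative split."""
    counts = Counter(card for cards in selections_per_user for card in cards)
    positive = sum(n for card, n in counts.items() if card in POSITIVE)
    negative = sum(n for card, n in counts.items() if card not in POSITIVE)
    return counts, positive, negative

# Hypothetical sessions:
users = [["Easy-to-use", "Fast", "Confusing"],
         ["Helpful", "Easy-to-use", "Relevant"]]
counts, pos, neg = tally(users)
print(counts.most_common(3))          # most frequently chosen cards
print(f"{pos} positive / {neg} negative")
```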
3 TV weather websites
Positive/negative card counts for Stations A, B, and C: 26/13, 39/5, and 24/17.
Repeated positive card selections focused on ease of use, relevance, and speed
Easy-to-use, Helpful, Straightforward, Fast, Relevant, Reliable, Useful
Let’s try this
Why do the most serious usability problems we uncover often go unfixed?
“But the light bulb has to want to change”
Steve Krug and Caroline Jarrett, #upa2012 Las Vegas
Survey says…
Legal department objected
Disagreements emerged later
Other events intervened before change could happen
Technical team said it couldn’t be done
Required too big a change to a business process
Team did not have enough power to make it happen
No effective decision maker
Too much else to do
Not enough time
Deferred until next major update/redesign
Not enough resources
Conflicted with decision maker’s belief or opinion
Number of times each reason was chosen, from 131 total usable responses
Steve’s view: You can’t fix everything
© 2001 Steve Krug
Problems you can find with just a few test participants
Problems you have the resources to fix
Jarrett/Krug theme: Do basic UX better
• Do testing earlier
• Make stakeholders watch the sessions
• Present results better
  – More explanations
  – Use video clips
The one-two punch
• What’s it good for?
• When do you do it?
• What do you do with the results?
Expert review
• What’s it good for?
• When do you do it?
• What do you do with the results?
User testing
Happiness is . . . using both methods
Read the book.
Visit the website: www.mkp.com/testingessentials
Contact me: [email protected]