Usability Methods: Cognitive Walkthrough & Heuristic Evaluation
Dr. Dania Bilal
IS 588
Spring 2008
Purposes
Measures multiple components of the user interface
Addresses relationships between the system and its users
Bridges the gap between humans and machines
Purposes
Measures the quality of system design in relation to its intended users
Involves several methods, each applied at the appropriate time in the design and development process
Usability Attributes
As described by Nielsen:
Learnability
Efficiency
Memorability
Errors & their severity
Subjective satisfaction
Learnability
System must be easy to learn, especially for novice users
Hard-to-learn systems are usually designed for expert users
Learning curve differs for novice and expert users
Efficiency
System should be efficient to use, so that once the user has learned it, the user can achieve a high level of productivity
Efficiency increases with learning
Memorability
System should be easy to remember, especially by casual users
No need to learn how to use the system all over again after a period of not using it
Errors
System should have a low error rate
System should provide the user with a recovery mechanism
Minor errors
Major errors
Minor Errors
Errors that do not greatly slow down the user’s interaction with the system
User is able to recover from them through system feedback or through awareness of the error made
Major Errors
Difficult to recover from
Lead to faulty work if high in frequency
May not be discovered by the user
Can be catastrophic
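The error attribute above can be made concrete with a small bookkeeping sketch. This is an illustrative example, not part of the original slides: the logged errors, task names, and action count are invented, and the minor/major labels follow the distinction described above.

```python
# Illustrative sketch (invented data): logging errors observed during a
# usability session and computing a simple error rate.
observed_errors = [
    {"task": "search", "severity": "minor", "recovered": True},
    {"task": "search", "severity": "major", "recovered": False},
    {"task": "renew", "severity": "minor", "recovered": True},
]
total_actions = 50  # assumed total user actions in the session

# A low error rate is the goal; major errors deserve special attention
# because users may not recover from (or even notice) them.
error_rate = len(observed_errors) / total_actions
major_errors = [e for e in observed_errors if e["severity"] == "major"]
```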
Subjective Satisfaction
System should be likeable to users (affective)
Satisfaction varies with the purpose of the system and user goals
Assumptions
The designer’s best guess is not good enough
The user is always right
The user is not always right
Users are not designers
Designers are not users
More features are not always better
Minor interface details matter
Online help does not really help
Source: Nielsen, J. (1993). Usability Engineering. San Diego: Morgan Kaufmann.
Cognitive Walkthrough Method
Involves experts acting on behalf of actual users
Characteristics of typical users are identified & documented
Tasks focusing on aspects of design to be evaluated are developed
Cognitive Walkthrough Method
An observer (“experimenter”) is present:
Prepares tasks
Takes notes
Provides help, etc.
Coordinates and oversees the final report
Cognitive Walkthrough Method
Experts walk through the interface on each task
Each expert records problems that the user may experience
Assumptions about what would cause problems, and why, are noted
Benchmarks may be used for each task
Sample Questions for Walkthrough
Will the user know what to do to complete part of or the whole task successfully?
Can the user see the button or icon to use for the next action?
Can the user find a specific subject category in the hierarchy?
Cognitive Walkthrough
Each expert documents his/her experience of the walkthrough for each task
Critical problems are documented
Problems, and what causes them, are explained
Draft report/notes are compiled and shared with the other experts and the experimenter
Debriefing Session
Experts and experimenter meet & discuss findings
Experimenter shares his/her observational notes with experts
Findings include success stories & failure stories, as applicable
Consolidated report is generated
Walkthrough Report
Includes the questions experts addressed for each task and the consolidated answers
Uses benchmarks and maps out the findings for each task
See Assignment 4: Usability for additional information on benchmarks
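Mapping findings to per-task benchmarks, as the report slide describes, can be as simple as comparing observed performance against the benchmark value for each task. The sketch below is hypothetical: the task names and benchmark values (a maximum number of steps to completion) are invented for illustration.

```python
# Hypothetical per-task benchmark check: did the walkthrough of each task
# stay within its benchmark (here, a maximum number of steps)?
tasks = {
    "Find a book by subject": {"benchmark_max_steps": 4, "observed_steps": 6},
    "Renew a checked-out item": {"benchmark_max_steps": 3, "observed_steps": 3},
}

benchmark_met = {
    name: t["observed_steps"] <= t["benchmark_max_steps"]
    for name, t in tasks.items()
}

for name, met in benchmark_met.items():
    print(f"{name}: benchmark {'met' if met else 'missed'}")
```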
Heuristic Evaluation
Evaluators interact with an interface several times and map the interface to specific heuristics or guidelines
Example: Nielsen’s ten heuristics
Each evaluator generates a report
Reports are aggregated and a final report is generated
An observer may be present
Stages of Heuristic Evaluation
Stage 1: Briefing session
Experts are told what to do
Written instructions are provided to each expert
Heuristics are provided to each expert as part of the written instructions
Verbal instructions may be included
Stages of Heuristic Evaluation
Stage 2: Evaluation sessions
Each expert tests the system based on the heuristics
Expert may also use specific tasks
Two passes are taken through the interface
• First pass: overview and familiarity
• Second pass: focus on specific features & identify usability problems
Stages of Heuristic Evaluation
Stage 3: Debriefing session
Experts meet to discuss the outcome and compare findings
Experts consolidate findings
Experts prioritize the usability problems found & suggest solutions
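The consolidation and prioritization in Stage 3 can be sketched as pooling each expert’s severity ratings and ranking problems by mean severity. A minimal sketch, assuming Nielsen’s 0–4 severity scale (see the severity-rating link below); the evaluators, problems, and ratings are invented.

```python
from collections import defaultdict
from statistics import mean

# Invented ratings: (evaluator, problem, severity on Nielsen's 0-4 scale,
# where 0 = not a problem and 4 = usability catastrophe).
ratings = [
    ("evaluator_1", "Search button hard to find", 3),
    ("evaluator_2", "Search button hard to find", 4),
    ("evaluator_1", "No undo after delete", 4),
    ("evaluator_3", "Jargon in error messages", 2),
]

# Consolidate: group all severity ratings by problem.
by_problem = defaultdict(list)
for _evaluator, problem, severity in ratings:
    by_problem[problem].append(severity)

# Prioritize: rank problems by mean severity, highest first.
prioritized = sorted(
    ((mean(sevs), problem) for problem, sevs in by_problem.items()),
    reverse=True,
)
```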
Nielsen’s Heuristics
Ten heuristics found at http://www.useit.com/papers/heuristic/heuristic_list.html
For additional rules, see the Text.
Some heuristics can be combined under categories and given a general description.
Usability Heuristics
http://www.usabilityfirst.com/methods
http://www.useit.com/papers/heuristic/heuristic_evaluation.html (how to conduct a heuristic evaluation)
http://www.uie.com/articles (collection of articles)
http://www.uie.com/articles/usability_tests_learn/ (learning about usability testing, by Jared Spool)
http://www.useit.com/papers/heuristic/severityrating.html (severity ratings)