
System Evaluation: Usability Assessment Dr. Dania Bilal IS 582 Spring 2009


Page 1: System Evaluation: Usability Assessment Dr. Dania Bilal IS 582 Spring 2009

System Evaluation: Usability Assessment

Dr. Dania Bilal
IS 582
Spring 2009

Page 2:

What is Usability?

• An evaluation mechanism that measures multiple components of the design of the user interface
• Addresses the relationships between a system and its users
• Emerges from the field of human-computer interaction
  – based on user-centered interface design principles

Page 3:

Importance of Usability

• Bridges the gap between humans and machines
• Provides information about the user experience and goals
• Measures system effectiveness in relation to its intended users rather than to system specifications
• The sooner problems are found, the less expensive they are to fix
  – Saves money in product cost

Page 4:

Importance of Usability

• Learn about:
  – users and their goals
  – difficulty achieving tasks
  – system design problems that contribute to user failures
• Generate requirements for improving an existing user-centered design or creating a new one

Page 5:

Usability Attributes

• As described by Jakob Nielsen:
  – Learnability
  – Efficiency
  – Memorability
  – Errors and their severity
  – Subjective satisfaction

Page 6:

Learnability

• The system must be easy to learn, especially for novice users
• A hard-to-learn system is designed for expert users
• Command-driven systems such as Dialog are designed for users who are able to learn the commands and construct search strategies accordingly

Page 7:

Efficiency

• The system should be efficient to use, so that once the user has learned it, the user can achieve a high level of productivity
  – Efficiency increases with learning
  – Efficiency differs from effectiveness
  – Efficiency and effectiveness are indications of success

Page 8:

Memorability

• The system should be easy to remember
  – No need to learn how to use the system all over again after a period of not using it
• System features (searching, browsing, finding hidden features, etc.) are easy to remember in terms of:
  – How? how to find them
  – What? what they are and what they do
  – Where? where they are in the system

Page 9:

Errors

• The system should have a low error rate
• The system should provide the user with a recovery mechanism
  – how to correct a problem, what to do next, suggestions for correcting a problem, etc.
• Two types of errors:
  – Minor errors
  – Major errors

Page 10:

Minor Errors

• Errors that do not greatly slow down the user's interaction with the system
• The user is able to recover from them
  – through system feedback
  – through awareness of the error made
• Not considered catastrophic

Page 11:

Major Errors

• Difficult to recover from
• Lead to faulty work if high in frequency
• May not be discovered by the user
  – Considered catastrophic
• Affect productivity
• Cause negative affect
• Cause users to abandon the system

Page 12:

Subjective Satisfaction

• The system should be likeable by users
• The system should meet user goals
  – Satisfaction
  – Positive experience
  – Sense of achievement
  – Willingness to use the system again

Page 13:

Assumptions

• The designer's best guess is not good enough
• The user is always right
• The user is not always right
• Users are not designers
• Designers are not users
• More features are not always better
• Minor interface details matter
• Online help does not really help

Nielsen, J. (1993). Usability Engineering. San Diego: Morgan Kaufmann.

Page 14:

Jakob Nielsen's 10 Usability Heuristics

• Visibility of system status
• Match between system and the real world
• User control and freedom
• Consistency and standards
• Error prevention
• Error diagnosis and recovery
• Recognition rather than recall
• Flexibility and efficiency of use
• Aesthetic and minimalist design
• Help and documentation

Page 15:

Eight Golden Rules by Shneiderman & Plaisant (2009)

• Strive for consistency
• Cater to universal usability
• Offer informative feedback
• Design dialogs to yield closure
• Prevent errors
• Permit easy reversal of actions
• Support internal locus of control
• Reduce short-term memory load

Designing the User Interface, chapter 2, pp. 74-75.

Page 16:

Factors Influencing User Interaction

• Address the functionality of the system vis-à-vis:
  – user goals and tasks
  – user expectations
  – user cognitive processes, mental model, and conceptualization of system use
  – user methods for solving tasks
  – context
  – system interface design

Page 17:

Usability Methods

Page 18:

What Method(s) to Use?

• One method or a mix of methods can be used
• The choice depends on:
  – project usability goals
  – budget
  – time constraints
  – availability of experts for data collection and analysis
  – recruitment of users to participate in data collection
  – complexity of the system to be assessed

Page 19:

Cognitive Walkthrough Method

• Focuses on how a user might achieve a task and the problems he or she might experience
• Experts play the role of the user, putting themselves in the user's shoes while performing each task
• Experts perform tasks given by the experimenter (project leader or observer)
• Experts take notes while performing each task, keeping the intended user in mind

Page 20:

Cognitive Walkthrough Method

• Role of the experimenter/observer:
  – meets with experts prior to data collection to decide on the parts of the system interface to evaluate, broad tasks, and other matters as needed
  – prepares specific tasks
  – develops usability materials
  – provides an introduction about data collection
  – decides the time and location of data collection, individually or in consultation with the experts
  – schedules a debriefing session with the experts to discuss the collected data
  – develops benchmarks by which to evaluate success on each task performed by each expert
  – reviews the draft data collection compiled by each expert
  – analyzes the collected data
  – meets with the experts to discuss a draft of the usability report
  – compiles the final usability report
• See also Project #2 for additional information.

Page 21:

Heuristic Evaluation/Inspection

• Evaluators individually interact with an interface and evaluate it against predefined sets of heuristics/rules/guidelines
• Each evaluator goes through the interface and rates each usability problem or violation found on a severity rating scale

Page 22:

Heuristic Evaluation/Inspection

• Each evaluator compiles a usability report containing:
  – each feature evaluated in the interface
  – the severity rating of each usability problem found in the interface
  – an explanation of each usability problem saying why it is a problem
  – a suggestion for solving the problem
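The report contents listed above map naturally onto a simple record structure. A minimal sketch in Python; the field names and example values are illustrative, not prescribed by the slides:

```python
from dataclasses import dataclass


@dataclass
class Finding:
    """One usability problem recorded in a single evaluator's report."""
    feature: str      # the interface feature evaluated
    severity: int     # rating on Nielsen's 0-4 severity scale
    explanation: str  # why this is a problem
    suggestion: str   # proposed fix

# One illustrative report entry (hypothetical values)
finding = Finding(
    feature="search box",
    severity=3,
    explanation="No feedback is shown while a query is running",
    suggestion="Display a progress indicator during the search",
)
```

An evaluator's full report would then simply be a list of such records, one per problem found.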

Page 23:

Heuristic Evaluation/Inspection

• One of the evaluators (or the team leader) compiles the evaluators' reports
• The team leader aggregates the results of the reports (and should calculate the mean of the severity ratings given by the evaluators)
• The team leader generates the final usability report
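The aggregation step above (mean severity per problem across evaluators) can be sketched as follows. The data layout, with each report as a mapping from problem to severity rating, is an assumption for illustration:

```python
from statistics import mean

# Hypothetical data: each evaluator's report maps problem -> severity (0-4)
reports = [
    {"cluttered home page": 2, "no search feedback": 4},
    {"cluttered home page": 1, "no search feedback": 3},
    {"cluttered home page": 2, "no search feedback": 4},
]


def aggregate(reports):
    """Return the mean severity per problem across all evaluators who rated it."""
    ratings = {}
    for report in reports:
        for problem, severity in report.items():
            ratings.setdefault(problem, []).append(severity)
    return {problem: mean(scores) for problem, scores in ratings.items()}


mean_severity = aggregate(reports)
# e.g. "no search feedback" averages (4 + 3 + 4) / 3
```

Problems can then be ranked by mean severity when compiling the final usability report.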

Page 24:

Severity of System Problems

• A combination of three factors:
  – Frequency of problem occurrence
    • Is it common or rare?
  – Impact
    • Is the problem easy or difficult for the user to overcome?
  – Persistence
    • Is the problem common and one the user cannot overcome, or a one-time problem the user can overcome?

Page 25:

Severity Rating Scale of Usability Problems

0 = I don't agree that this is a usability problem at all
1 = Cosmetic problem only: need not be fixed unless extra time is available on the project
2 = Minor usability problem: fixing this should be given low priority
3 = Major usability problem: important to fix, so should be given high priority
4 = Usability catastrophe: imperative to fix this before the product can be released

Nielsen, Jakob. http://www.useit.com/papers/heuristic/severityrating.html
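Since mean ratings across evaluators are usually fractional, they can be mapped back to the nearest point on Nielsen's scale. A small sketch; the abbreviated labels are paraphrased from the scale above:

```python
# Nielsen's 0-4 severity scale, with abbreviated labels
SEVERITY_LABELS = {
    0: "not a usability problem",
    1: "cosmetic problem only",
    2: "minor usability problem",
    3: "major usability problem",
    4: "usability catastrophe",
}


def label(mean_rating: float) -> str:
    """Map a (possibly fractional) mean rating to the nearest scale label."""
    # round to the nearest integer and clamp to the 0-4 range
    return SEVERITY_LABELS[min(4, max(0, round(mean_rating)))]


print(label(3.4))  # major usability problem
```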

Page 26:

Class Activity

• Visit the Library of Congress OPAC.
• Select a component of the interface to evaluate against a combined list of the heuristics and golden rules above.

Page 27:

Sources

• http://www.usabilityfirst.com/methods
• http://www.useit.com/papers/heuristic/heuristic_list.html (Nielsen's usability heuristics)
• http://www.useit.com/papers/heuristic/heuristic_evaluation.html (how to conduct a heuristic evaluation)
• http://www.uie.com/articles (collection of articles)
• http://www.uie.com/articles/usability_tests_learn/ (learning about usability tests, by Jared Spool)
• http://www.useit.com/papers/heuristic/severityrating.html (Nielsen's Severity Rating Scale)