Chapter 12 cont.: Introducing Evaluation



Three evaluation case studies

• Improving a design: the HutchWorld patient support system.

• Multiple methods help ensure good usability: the Olympic messaging system (OMS).

• Evaluating a new kind of interaction: an ambient system.


The HutchWorld patient support system

• This virtual world supports communication among cancer patients.

• Privacy, logistics, patients’ feelings, etc. had to be taken into account.

• Designers and patients speak different languages.


[Screenshot: the HutchWorld 'My Appearance' view]


Evaluation

• Informal evaluation.

• Observation of users:
  – No critical mass.
  – Users preferred asynchronous communication.
  – Games were popular.

• Usability evaluation in labs.

• Evaluation of the revised version.

• When is it time to stop testing?


Multiple methods to evaluate the 1984 OMS

• Early tests of printed scenarios & user guides.

• Early simulations of the telephone keypad.

• An Olympian joined the team to provide feedback.

• Interviews & demos with Olympians outside the US.

• Overseas interface tests with friends and family.


Multiple methods to evaluate the 1984 OMS cont.

• Free coffee and donut tests (65 users).

• Usability tests with 100 participants.

• A ‘try to destroy it’ test (24 students).

• Pre-Olympic field test at an international event.

• Reliability tests of the system under heavy traffic (2,800 and 1,000 users).

• → cultural differences.


Evaluating an ambient system

• The Hello Wall is a new kind of system that is designed to explore how people react to its presence.

• What are the challenges of evaluating systems like this?


Key points

• Evaluation & design are closely integrated in user-centered design.

• Some of the same techniques are used in evaluation as for establishing requirements, but they are used differently (e.g., observation, interviews, and questionnaires).

• The three main evaluation approaches are: usability testing, field studies, and analytical evaluation.


Key points cont.

• The main methods are: observing, asking users, asking experts, user testing, inspection, and modeling users’ task performance.

• Different evaluation approaches and methods are often combined in one study.

• Triangulation involves using a combination of techniques to gain different perspectives, or analyzing data using different techniques (a small illustration follows this list).

• Dealing with constraints is an important skill for evaluators to develop.
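To make the triangulation point concrete, here is a minimal sketch in Python: it cross-checks task-success rates (as might be derived from interaction logs) against satisfaction scores from a questionnaire for the same participants. All participant IDs and numbers are invented for illustration.

```python
# Toy triangulation example: two independently collected measures
# for the same participants (all data invented).

# Task success per participant, e.g. derived from interaction logs (0.0-1.0).
log_success = {"P1": 0.90, "P2": 0.55, "P3": 0.80}

# Satisfaction per participant from a post-session questionnaire (1-5 Likert).
survey_satisfaction = {"P1": 5, "P2": 2, "P3": 3}

for pid in sorted(log_success):
    success = log_success[pid]
    satisfaction = survey_satisfaction[pid]
    # Where the two perspectives disagree, a follow-up (e.g. an interview
    # or a closer look at the logs) is often worthwhile.
    mismatch = (success >= 0.8) != (satisfaction >= 4)
    print(f"{pid}: success={success:.2f}, satisfaction={satisfaction}, "
          f"follow_up={'yes' if mismatch else 'no'}")
```

Agreement between independently collected measures strengthens a finding; disagreement is itself informative.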


Chapter 13: An evaluation framework


The aims are:

• To discuss the conceptual, practical and ethical issues involved in evaluation.

• To introduce and explain the DECIDE framework.


DECIDE: a framework to guide evaluation

• Determine the goals.
• Explore the questions.
• Choose the evaluation approach and methods.
• Identify the practical issues.
• Decide how to deal with the ethical issues.
• Evaluate, analyze, interpret and present the data.
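Since DECIDE is essentially a checklist, an evaluation plan can be recorded as a structured artifact with one field per stage. Below is a minimal, hypothetical sketch in Python; the field names and example content are illustrative, not part of the framework itself.

```python
from dataclasses import dataclass, field

@dataclass
class EvaluationPlan:
    """One field per DECIDE stage; empty fields mark unfinished planning."""
    goals: list[str] = field(default_factory=list)              # Determine
    questions: list[str] = field(default_factory=list)          # Explore
    methods: list[str] = field(default_factory=list)            # Choose
    practical_issues: list[str] = field(default_factory=list)   # Identify
    ethical_issues: list[str] = field(default_factory=list)     # Decide
    analysis: list[str] = field(default_factory=list)           # Evaluate

    def unfinished_stages(self) -> list[str]:
        # Any stage still empty has not been planned yet.
        return [name for name, value in vars(self).items() if not value]

plan = EvaluationPlan(
    goals=["Improve the usability of the ticket-booking flow"],
    questions=["Where do first-time users stumble?"],
    methods=["usability testing", "post-session interviews"],
)
print("Still to plan:", plan.unfinished_stages())
```

The `unfinished_stages()` check makes it obvious which parts of the plan still need attention before the study starts.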


Determine the goals

• What are the high-level goals of the evaluation?

• Who wants it and why?

• The goals influence the approach used for the study.

• Some examples of goals:

− Check to ensure that the final interface is consistent.

− Investigate how technology affects working practices.

− Improve the usability of an existing product.


Explore the questions

• All evaluations need goals & questions to guide them.

• E.g., the goal of finding out why many customers prefer to purchase paper airline tickets rather than e-tickets can be broken down into sub-questions (see the sketch after this list):
  – What are customers’ attitudes to these new tickets?
  – Are they concerned about security?
  – Is the interface for obtaining them poor?

• What questions might you ask about the design of a cell phone?
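One lightweight way to work with the goal/sub-question breakdown above is to pair each sub-question with a candidate data source. The sketch below uses the e-ticket example from this slide; the question-to-source pairings are illustrative assumptions, not prescriptions.

```python
# Goal broken into sub-questions, each paired with a plausible data source.
# The pairings are illustrative, not the "right" answer.
goal = "Find out why many customers prefer paper airline tickets to e-tickets"

sub_questions = {
    "What are customers' attitudes to e-tickets?": "questionnaire",
    "Are customers concerned about security?": "interviews",
    "Is the interface for obtaining e-tickets poor?": "usability test",
}

print(goal)
for question, source in sub_questions.items():
    print(f"  - {question} -> {source}")
```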


Choose the evaluation approach & methods

• The evaluation approach influences the methods used and, in turn, how data is collected, analyzed and presented.

• E.g., field studies typically:
  – Involve observation and interviews.
  – Do not involve controlled tests in a laboratory.
  – Produce qualitative data.


Identify practical issues

For example, how to:

• Select users
• Stay on budget
• Stay on schedule
• Find evaluators
• Select equipment
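As an example of handling the “select users” issue, the sketch below draws a balanced, reproducible sample from a pool of candidate participants, stratified by experience level so that novices and experts are both represented. The pool, group labels, and sample sizes are all invented.

```python
import random
from collections import defaultdict

# Invented candidate pool: (participant id, experience level).
pool = [("P01", "novice"), ("P02", "expert"), ("P03", "novice"),
        ("P04", "novice"), ("P05", "expert"), ("P06", "expert"),
        ("P07", "novice"), ("P08", "expert")]

def stratified_sample(candidates, per_group, seed=42):
    """Randomly pick `per_group` participants from each experience level."""
    rng = random.Random(seed)  # fixed seed keeps the selection reproducible
    groups = defaultdict(list)
    for pid, level in candidates:
        groups[level].append(pid)
    sample = []
    for level, pids in sorted(groups.items()):
        sample.extend(rng.sample(pids, min(per_group, len(pids))))
    return sample

print(stratified_sample(pool, per_group=2))
```

Fixing the random seed also makes the selection auditable if the study plan is questioned later.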


Decide about ethical issues

• Develop an informed consent form

• Participants have a right to:
  – Know the goals of the study;
  – Know what will happen to the findings;
  – Privacy of personal information;
  – Leave when they wish;
  – Be treated politely.


Ethical issues – Norway

• Norsk samfunnsvitenskapelig datatjeneste (NSD): http://www.nsd.uib.no/personvern/

• Collection of information about persons (personal data) must be reported to the Data Protection Official for Research (Personvernombudet for forskning).

• Personal data is information that can directly or indirectly identify a person.

• Directly identifiable personal data: name, national identity number, or other personal characteristics.

• Indirectly identifiable personal data: background information that could make it possible to trace the data back to an individual, for example municipality of residence or institutional affiliation combined with information about age, gender, occupation, nationality, diagnosis, etc.


An example – Norway

The purpose of this survey is to explore users’ experience of Internet services. The survey is part of a research project in which, among others, TV2/SUMO and SINTEF participate. There are 19 questions in total, taking about 10 minutes to answer. You choose whether you want to answer all of the questions.

If you would like to be included in the drawing for two gift cards, each worth 1000 kroner, you must provide your e-mail address.

We ask that only persons over 15 years of age answer this user survey.

SINTEF and TV2/SUMO guarantee that the collected data will be treated in strict confidence, and that the information about you will be anonymized no later than the end of the project on 15 May 2010. No individuals will be identifiable in future publications.


Our Experience

[Figure: bar chart; x-axis ‘Categories’ (Experimental Material; Task-performing Actions; Planning, Strategy and Reflection; Comprehension and Problems; Interaction observer), y-axis ‘Percentage’ (0%–90%); additional labels: COOL Technology, SE artefacts; series legend garbled beyond recovery]

• Developers don’t care about monitoring.
• People give away sensitive information (bank accounts, passwords).

Karahasanovic, A. (IS, 2003).


Evaluate, interpret & present data

• The approach and methods used influence how data is evaluated, interpreted and presented.

• The following need to be considered:
  – Reliability: can the study be replicated? (e.g., field studies versus experiments)
  – Validity: is it measuring what you expected? (e.g., average performance versus recording user errors)
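As a toy illustration of the reliability question, the sketch below compares mean task-completion times from an original run and a replication; all numbers are invented. If the gap between means is large relative to the spread within each run, the result may not be stable enough to generalize from.

```python
from statistics import mean, stdev

# Invented completion times (seconds) for the same task in two runs.
original_run = [42.0, 51.5, 38.2, 47.9, 44.1]
replication = [44.3, 49.0, 41.7, 52.6, 45.5]

for name, times in [("original", original_run), ("replication", replication)]:
    print(f"{name}: mean={mean(times):.1f}s, sd={stdev(times):.1f}s")

# A crude stability check: how far apart are the two means,
# relative to the spread within each run?
gap = abs(mean(original_run) - mean(replication))
print(f"difference between means: {gap:.1f}s")
```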


Considerations

• Biases: is the process creating biases?
  – e.g., expert evaluations.

• Scope: can the findings be generalized?
  – e.g., novices versus expert system users.

• Ecological validity: is the environment influencing the findings? (i.e., the Hawthorne effect)
  – Expectations of the users (placebo effect).


Key points

• There are many issues to consider before conducting an evaluation study.

• These include the goals of the study, the approaches and methods to use, practical issues, ethical issues, and how the data will be collected, analyzed and presented.

• The DECIDE framework provides a useful checklist for planning an evaluation study.


…and some lessons learned

• Encourage collaboration. Investigating motivation, user engagement, user involvement, co-experience and sociability at the level of communities and families is essential for applications aiming to support sharing and co-creation of UGC. Both tasks and evaluation methods should reflect this priority. Extending well-known methods such as interviews, focus groups, and group-based expert walkthroughs with hands-on sessions and the use of collaborative tasks has been very useful for capturing these factors.


• Start to evaluate UX as early as possible. Early feedback is very valuable to the developers, in particular feedback on motivation, emotions, and anticipated engagement. However, one should adapt both the methods and the measurements to the evaluation phase. As the project progresses, one can move towards finer-grained evaluation. For example, one can measure the emotions related to a general idea of a tool for collaborative writing early in a project, and emotions related to a particular function of the tool later in the project.


• Evaluation should be playful and provide added value for the participants. One cannot overemphasize the importance of providing a safe, comfortable and playful evaluation environment, and of giving ‘something extra’ to the study participants. The opportunity to learn and try something completely new, and to affect the development of new applications, is not only very stimulating and rewarding for the communities of users and experts participating in the evaluation, but also positively affects the usefulness of the evaluation methods. When working with communities, building a trustful relationship is very important for ensuring a successful long-term collaboration.


• Prepare for diversity. In-depth knowledge of your communities (the different groups of users and non-users) is essential for successful data collection. Different versions of questionnaires and focus-group guidelines should be prepared for different user groups (e.g., professional cabaret artists, amateur artists, and theatres), and evaluators/moderators should be able to speak ‘different languages’ (e.g., to talk to children, teenagers, and elderly people) at the same time.


• Be best friends with the developer. Good knowledge of the application under development is very important for the success of the evaluation. Evaluators/moderators should be able to explain the ideas behind paper prototypes and screenshots. Communicating the results of the evaluation clearly, and in formats understandable to the developers, is extremely important for uptake of the evaluation results.

Karahasanovic, A., and Obrist, M. (2010). Investigating the Usefulness of Methods for Evaluating User Experience of Social Media Applications. Proceedings of the International Workshop on the Interplay between User Experience and Software Development (I-UxSED 2010), NordiCHI workshop, Reykjavik, Iceland, October 2010.