Introduction to UX Methods


100-level presentation given at the 4th annual St. Louis Day of .NET conference. Not intended to be a comprehensive overview of all UX methods.


Introduction to User Experience Methods
Danielle Gobert Cooley
@dgcooley
06 August 2011, St. Louis Day of .NET (#STLDODN)


Rate This Session!

http://uxmethods.notlong.com/


About me

• 12 years as user researcher/usability specialist
• BE, Biomedical & Electrical Engineering
• MS, Human Factors in Information Design
• Selected Employers & Clients

danielle@dgcooley.com

@dgcooley


Important Things to Know About UX Methods


Please Remember

The purpose of these methods is to inform your design. They are not validation methods.


Let Me Repeat That

The purpose of these methods is to inform your design. They are not validation methods.


You Are Not Your User

(Slide image contrasting "YOU" with "NOT YOU")


Why Do It? To Avoid Ending Up Here


One More Thing…

The purpose of these methods is to inform your design. They are not validation methods.


Usability Study


How It’s Done

1. Recruit representative end users.

2. Observe impartially as they attempt to perform tasks with a prototype.

3. Typically, participants are asked to think aloud as they use the prototype to perform the tasks. This provides insight into WHY certain interface elements are confusing and what might work better.

Tips…
– Recruiting the right users is key!
– Avoid bias everywhere – in task phrasing, in your and your observers’ body language, and in verbal questions asked.
– Recordings are great, but huge time sucks.
– Quantitative studies often aren’t worth it.


A Note About Prototype Fidelity


Advantages

• Controlled setting means easier logistics.
• Recording and observing is easier, too.

• For the rare quantitative study, lab-based testing makes it easier to use such tools as Morae or Ovo.

• Lab-based testing has fewer variables to control, which can be a factor for more rigid studies.


Disadvantages

• Lab setting provides no context of use.
• Labs can be expensive to rent or build
  – (but they don’t have to be)
• Participants are sometimes timid in a lab setting.


Field Study


How It’s Done

1. Recruit representative end users.

2. Observe impartially as they attempt to perform tasks with a prototype, in the environment where the product will actually be used.

3. Collect artifacts.


Advantages

• Gathers contextual data
  – Ambient light, noise
  – Distractions

• Participants usually less intimidated

• Much more convenient for participants, so recruiting can be easier

Contextual Inquiry?
Though the terms are often used interchangeably, Contextual Inquiry is actually a type of field study that follows a very specific format.


Disadvantages

• Logistics are more difficult for researchers.
• Observation is more challenging.
• Recording is more challenging.
• Security issues sometimes prohibit photographs or other recording.


Card Sort


How It’s Done

1. Recruit representative end users.

2. Identify content items to be categorized.

3. Participants sort the content items into groupings that make sense to them.

Two types…
– In an OPEN card sort, participants create the categories.
– In a CLOSED card sort, the researcher establishes the categories.
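One common way to make sense of open card sort results is an item-by-item co-occurrence count: how many participants put each pair of items in the same group. A minimal Python sketch with invented data (the item names and the sort_results structure are illustrative assumptions, not data from this talk):

# Each participant's result is a list of groups; each group is a list of content items.
from collections import Counter
from itertools import combinations

sort_results = [  # hypothetical data from three participants
    [["Pricing", "Plans"], ["Docs", "Tutorials", "API Reference"]],
    [["Pricing", "Plans", "Docs"], ["Tutorials", "API Reference"]],
    [["Pricing", "Plans"], ["Docs", "API Reference"], ["Tutorials"]],
]

pair_counts = Counter()
for participant in sort_results:
    for group in participant:
        for a, b in combinations(sorted(group), 2):  # every pair placed in the same group
            pair_counts[(a, b)] += 1

n = len(sort_results)
for (a, b), count in pair_counts.most_common():
    print(f"{a} + {b}: grouped together by {count} of {n} participants")

Pairs that most participants group together are strong candidates to sit under the same category in the resulting information architecture.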


Advantages

• Incredibly inexpensive
• Done very quickly with remote evaluation tools.
• Asynchronous, so scheduling is not an issue. Participants take part at their convenience.


Disadvantages

• More complicated with large sets of cards.

• Really, there’s almost no reason NOT to do a card sort, unless you don’t plan to use the results.


Tree Test


How It’s Done

1. Recruit representative end users.

2. Set up the study with the information architecture (IA) to be evaluated.

3. Give participants specific content elements to find in that architecture.
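The numbers that usually come out of a tree test are per-task findability rates: what fraction of participants ended up at a correct node in the architecture. A minimal Python sketch with invented tasks and results (purely illustrative, not data from the talk):

# Each record: (task, node where the participant ended up, whether that node was correct).
from collections import defaultdict

results = [  # hypothetical data
    ("Find the return policy", "Support > Returns", True),
    ("Find the return policy", "Shop > Orders", False),
    ("Find the return policy", "Support > Returns", True),
    ("Update billing info", "Account > Billing", True),
    ("Update billing info", "Account > Profile", False),
]

totals = defaultdict(int)
successes = defaultdict(int)
for task, _node, correct in results:
    totals[task] += 1
    successes[task] += int(correct)

for task in totals:
    rate = successes[task] / totals[task]
    print(f"{task}: {successes[task]}/{totals[task]} ({rate:.0%}) found a correct location")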


Advantages

• Incredibly inexpensive
• Done very quickly with remote evaluation tools.
• Asynchronous, so scheduling is not an issue. Participants take part at their convenience.

Yep. Just like card sorting!


Disadvantages

• The full IA and nav structure must be created in order to execute a tree test, so there is significant investment in the “prototype,” if you will.

Tree Test vs. Card Sort
– An OPEN card sort generates an information architecture.
– A CLOSED card sort usually evaluates high-level labeling.
– A tree test evaluates findability in an existing information architecture.

OK. This one IS a validation method.


Survey


How It’s Done

1. Recruit participants.
2. Write survey.
3. Relax while the data rolls right in.


Advantages

• Cheap
• Fast
• Remote
• Easy data collection
• Large number of participants


Disadvantages

• Data are self-reported.
  – What people do is not the same as what people SAY they do.

• Good question curation is surprisingly challenging.


Expert Review


How It’s Done

• An experienced UX specialist analyzes the product, looking for common mistakes and for interface elements or interactions that are not consistent with best practices.

Heuristic Evaluation?
Though this term is thrown around a lot, a Heuristic Evaluation is really a specialized type of Expert Review.


Advantages

• Considerably less expensive than lab or field studies

• Often relatively fast – again, as compared to lab or field studies.


Disadvantages

• No actual end-user perspective.
• Experts vary.


Other Techniques


In No Particular Order…

• Journaling Studies – Users keep a journal of their interactions (good and bad) with the product.

• A/B Testing – Two different versions of a product are placed online and success rates analyzed (a small analysis sketch follows this list).

• Analytics – Web site or product metrics are analyzed to determine user success or failure.

• Personas – Descriptive profiles of representative end users. This is actually an output of field research.
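For the A/B Testing bullet above, "success rates analyzed" usually means checking whether the gap between the two versions' conversion rates is larger than chance would explain. A minimal sketch of a two-proportion z-test in Python, with invented counts (the numbers and names are placeholders, not real results):

# Compare the success (conversion) rates of versions A and B with a two-proportion z-test.
from math import erfc, sqrt

def two_proportion_z(success_a, n_a, success_b, n_b):
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)          # pooled success rate
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))  # standard error of the difference
    z = (p_a - p_b) / se
    p_value = erfc(abs(z) / sqrt(2))  # two-sided p-value under a normal approximation
    return p_a, p_b, z, p_value

# Hypothetical data: 120 of 1,000 visitors converted on A, 150 of 1,000 on B.
p_a, p_b, z, p = two_proportion_z(120, 1000, 150, 1000)
print(f"A: {p_a:.1%}  B: {p_b:.1%}  z = {z:.2f}  p = {p:.3f}")

A small p-value suggests the difference is unlikely to be noise; it still says nothing about why one version performed better, which is where the other methods in this deck come in.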


Recap & Additional Resources

• User Experience is important. Really.
• These are NOT validation techniques!
• There are a lot of methods to choose from.

http://uxmethods.notlong.com/
Nov 2011