
CMU Usable Privacy and Security Laboratory

http://cups.cs.cmu.edu/

A case study in UI design and evaluation for computer security

Rob Reeder

January 30, 2008


Memogate: A user interface scandal!!


Overview

Task domain: Windows XP file permissions

Design of two user interfaces: native XP interface, Salmon

Evaluation: Which interface was better?

Analysis: Why was one better?


Part 1: File permissions in Windows XP

File permissions task: Allow authorized users access to resources, deny unauthorized users access to resources

Resources: Files and folders

Users: People with accounts on the system

Access: 13 types, such as Read Data, Write Data, Execute, Delete


Challenges for file permissions UI design

Maybe thousands of users – impossible to set permissions individually for each

Thirteen access types – hard for a person to remember them all


Grouping to handle users

Administrators

Power Users

Everyone

Admin-defined


A problematic user grouping

[Diagram: users Ari, Bill, Cindy, Miguel, Xu, Yasir, and Zack divided between Group A and Group B]


Precedence rules

No setting = Deny by default

Allow > No setting

Deny > Allow

(> means “takes precedence over”)
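The three rules above can be sketched as a tiny resolver. This is a hedged illustration only, not Windows' actual implementation; the entry format and the names are invented for the example.

```python
# Illustrative sketch of XP's precedence rules (not Microsoft's real code).
# An entry is (principal, access_type, "allow"/"deny"). A user's effective
# permission combines entries for the user and every group they belong to.

def effective_permission(user, groups, access_type, entries):
    """Resolve one access type: Deny > Allow > no setting (deny by default)."""
    principals = {user} | set(groups)
    settings = {setting for principal, access, setting in entries
                if principal in principals and access == access_type}
    if "deny" in settings:   # Deny takes precedence over Allow
        return "deny"
    if "allow" in settings:  # Allow takes precedence over no setting
        return "allow"
    return "deny"            # no setting at all: deny by default

entries = [("Group A", "Write Data", "allow"),  # group-level Allow
           ("Wesley", "Write Data", "deny")]    # user-level Deny wins
print(effective_permission("Wesley", ["Group A"], "Write Data", entries))  # deny
print(effective_permission("Wesley", ["Group A"], "Read Data", entries))   # deny (no setting)
```

Note how a single user-level Deny overrides an Allow inherited from a group: that interaction is exactly what makes permission settings hard to predict by eye.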

Grouping to handle access types

[Diagram: the 13 access types grouped under broader labels such as Execute]

Moral

Setting file permissions is quite complicated

But a good interface design can help!


The XP file permissions interface

The Salmon interface

[Screenshot: the Salmon interface, shown for folder ProjectF]

Expandable Grid

Example task: Wesley
• Initial state: Wesley allowed READ & WRITE from a group
• Final state: Wesley allowed READ, denied WRITE
• What needs to be done: Deny Wesley WRITE


What’s so hard?

Conceptually: Nothing!

Pragmatically:
• User doesn’t know initial group membership
• Not clear what changes need to be made
• Checking work is hard

Learning Wesley’s initial permissions
1. Click “Advanced”
2. Click “Effective Permissions”
3. Select Wesley
4. View Wesley’s Effective Permissions

Learning Wesley’s group membership
5. Bring up Computer Management interface
6. Click on “Users”
7. Double-click Wesley
8. Click “Member Of”
9. Read Wesley’s group membership

Changing Wesley’s permissions
10. Click “Add…”
11. Deny Write
12. Click “Apply”

Checking work
13. Click “Advanced”
14. Click “Effective Permissions”
15. Select Wesley
16. View Wesley’s Effective Permissions

XP file permissions interface: Poor


Part 2: Common security UI design problems

Poor feedback

Ambiguous labels

Violation of conventions

Hidden options

Omission errors

Problem #1: Poor feedback
1. Click “Advanced”
2. Click “Effective Permissions”
3. Select Wesley
4. View Wesley’s Effective Permissions

Salmon: immediate feedback

[Screenshot: the Salmon interface, shown for folder ProjectF]

Grid: consolidated feedback

Problem #2: Labels (1/3)

Full Control

Modify

Read & Execute

Read

Write

Special Permissions


Problem #2: Labels (2/3)

Full Control

Traverse Folder/Execute File

List Folder/Read Data

Read Attributes

Read Extended Attributes

Create Files/Write Data

Create Folders/Append Data

Write Attributes

Write Extended Attributes

Delete

Read Permissions

Change Permissions

Take Ownership
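The generic labels from the previous slide are bundles of these fine-grained rights, which is why a single checked box can silently imply many settings. The bundles below are a simplified illustration, not guaranteed to match NTFS's exact mapping:

```python
# Hypothetical, simplified mapping of generic labels to fine-grained rights.
SPECIAL = {"Traverse Folder/Execute File", "List Folder/Read Data",
           "Read Attributes", "Read Extended Attributes",
           "Create Files/Write Data", "Create Folders/Append Data",
           "Write Attributes", "Write Extended Attributes",
           "Delete", "Read Permissions", "Change Permissions", "Take Ownership"}

READ = {"List Folder/Read Data", "Read Attributes",
        "Read Extended Attributes", "Read Permissions"}
WRITE = {"Create Files/Write Data", "Create Folders/Append Data",
         "Write Attributes", "Write Extended Attributes"}
READ_AND_EXECUTE = READ | {"Traverse Folder/Execute File"}
MODIFY = READ_AND_EXECUTE | WRITE | {"Delete"}
FULL_CONTROL = SPECIAL  # everything

# A checked box labeled "Modify" implies all of these rights at once:
print(len(MODIFY))  # 10
```

The usability point: a user who checks “Modify” has no cue which of the fine-grained rights they just granted, and the labels on the two levels of the interface do not obviously correspond.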

Salmon: clearer labels

[Screenshot: the Salmon interface, shown for folder ProjectF]


Grid: fewer, clearer labels

Problem #3: Violating interface conventions

Salmon: better checkboxes

[Screenshot: the Salmon interface, shown for folder ProjectF]

Grid: direct manipulation


Problem #4: Hidden options
1. Click “Advanced”
2. Double-click entry
3. Click “Delete” checkbox

Salmon: All options visible

[Screenshot: the Salmon interface, shown for folder ProjectF]

Grid: Even more visibility

Problem #5: Omission errors

Salmon: Feedback helps prevent omission errors

[Screenshot: the Salmon interface, shown for folder ProjectF]

Grid: No omission errors


FLOCK: Summary of design problems

Feedback poor

Labels ambiguous

Omission error potential

Convention violation

Keeping options visible


Part 3: Evaluation of XP and Salmon

Conducted laboratory-based user studies

Formative and summative studies for Salmon

I’ll focus on summative evaluation


Advice for user studies

Know what you’re measuring!

Maintain internal validity

Maintain external validity


Common usable security metrics

Accuracy – with what probability do users correctly complete tasks?

Speed – how quickly can users complete tasks?

Security – how difficult is it for an attacker to break into the system?

Etc. – satisfaction, learnability, memorability


Measure the right things!

Speed is often useless without accuracy (e.g., setting file permissions)

Accuracy may be useless without security (e.g., easy-to-remember passwords)


Measurement instruments

Speed – Easy; use a stopwatch, time users

Accuracy – Harder; need unambiguous definitions of “success” and “failure”

Security – Very hard; may require serious math, or lots of hackers

Internal validity

Internal validity: Making sure your results are due to the effect you are testing

Manipulate one variable (in our case, the interface: XP or Salmon)

Control or randomize other variables:
• Use same experimenter
• Experimenter reads directions from a script
• Tasks presented in same text to all users
• Assign tasks in different order for each user
• Assign users randomly to one condition or other
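The controls above can be turned into a concrete assignment plan. A minimal sketch, assuming a balanced two-condition design; the helper and the task names beyond Wesley and Jack are invented, not the study's actual tooling:

```python
# Sketch: random, balanced condition assignment plus a fresh task order
# per participant (task names other than Wesley/Jack are placeholders).
import random

TASKS = ["Wesley", "Jack", "task3", "task4", "task5", "task6", "task7"]

def assign(participants, seed=0):
    rng = random.Random(seed)      # fixed seed so the plan is reproducible
    shuffled = participants[:]
    rng.shuffle(shuffled)          # random order -> random condition assignment
    plan = {}
    for i, person in enumerate(shuffled):
        condition = "XP" if i % 2 == 0 else "Salmon"  # balanced split
        order = TASKS[:]
        rng.shuffle(order)         # each participant sees a different order
        plan[person] = (condition, order)
    return plan

plan = assign([f"P{n}" for n in range(24)])
print(sum(1 for cond, _ in plan.values() if cond == "XP"))  # 12
```

Shuffling the task order per participant spreads learning and fatigue effects evenly across conditions, so differences can be attributed to the interface rather than to task position.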


External validity

External validity: Making sure your experiment can be generalized to the real world

Choose real tasks. Sources of real tasks:
• Web forums
• Surveys
• Your own experience

Choose real participants:
• We were testing novice or occasional file-permissions users with technical backgrounds (so CMU students & staff fit the bill)


User study compared Salmon to XP

Seven permissions-setting tasks; I’ll discuss two:
• Wesley
• Jack

Metrics for comparison:
• Accuracy (measured as deviations in users’ final permission bits from correct permission bits)
• Speed (time to task completion)
• Not security – left that to Microsoft
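The accuracy metric, deviations of final permission bits from correct bits, can be sketched as a scoring function. The data and names here are illustrative, not the study's actual materials:

```python
# Sketch of the accuracy metric: count mismatches between a participant's
# final permission settings and the correct ones (values illustrative).
CORRECT = {("Wesley", "READ"): "allow", ("Wesley", "WRITE"): "deny"}

def deviations(final, correct):
    """Number of (user, access) bits that differ from the correct policy."""
    keys = final.keys() | correct.keys()
    return sum(1 for k in keys if final.get(k) != correct.get(k))

submitted = {("Wesley", "READ"): "allow", ("Wesley", "WRITE"): "allow"}
print(deviations(submitted, CORRECT))  # 1: WRITE was left allowed
```

A deviation count of zero defines a "successful completion"; counting deviations rather than a bare pass/fail also gives an unambiguous definition of partial failure, which the slide on measurement instruments calls for.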


Study design

Between-participants comparison of interfaces

12 participants per interface, 24 total

Participants were technical staff and students at Carnegie Mellon University

Participants were novice or occasional file permissions users


Wesley and Jack tasks

Wesley task:
• Initial state: Wesley allowed READ & WRITE
• Final state: Wesley allowed READ, denied WRITE
• What needs to be done: Deny Wesley WRITE

Jack task:
• Initial state: Jack allowed READ, WRITE, & ADMINISTRATE
• Final state: Jack allowed READ, denied WRITE & ADMINISTRATE
• What needs to be done: Deny Jack WRITE & ADMINISTRATE


Salmon outperformed XP in accuracy

[Bar chart: percent of users who correctly completed tasks]
• Wesley task: XP 58%, Salmon 83% (a 43% improvement)
• Jack task: XP 25%, Salmon 100% (a 300% improvement)


Salmon outperformed XP in accuracy

[Bar chart: percent of users who correctly completed tasks]
• Wesley task: XP 58%, Salmon 83% (p = 0.09)
• Jack task: XP 25%, Salmon 100% (p < 0.0001)
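The slides do not say which statistical test produced these p-values (accuracy was measured as bit deviations, so the authors likely used something richer than a bare success/failure comparison). Purely as an illustration of comparing success proportions between two independent groups of 12, a one-sided Fisher's exact test can be computed with the standard library:

```python
# Illustrative one-sided Fisher's exact test on raw success counts.
# This is NOT necessarily the test the authors used.
from math import comb

def fisher_one_sided(success_a, n_a, success_b, n_b):
    """P(group B has at least this many successes, margins fixed)."""
    total_success = success_a + success_b
    n = n_a + n_b
    p = 0.0
    for k in range(success_b, min(n_b, total_success) + 1):
        p += (comb(total_success, k) * comb(n - total_success, n_b - k)
              / comb(n, n_b))
    return p

# Jack task: 3/12 XP successes vs 12/12 Salmon successes
print(round(fisher_one_sided(3, 12, 12, 12), 6))  # about 0.000168
```

Even this crude recomputation shows the Jack-task difference is far beyond chance with only 12 participants per condition, while a 58% vs 83% split (Wesley) is much weaker evidence.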


Salmon did not sacrifice speed

[Bar chart: time to task completion, successful users only]
• Wesley task: XP 208s, Salmon 183s
• Jack task: XP 208s, Salmon 173s


Salmon did not sacrifice speed

[Bar chart: time to task completion, successful users only]
• Wesley task: XP 208s, Salmon 183s (p = 0.35)
• Jack task: XP 208s, Salmon 173s (p = 0.20)
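Note that the speed comparison averages over successful users only: a fast failure is not a fast completion. A minimal sketch of that design choice, with invented times for illustration:

```python
# Sketch: mean time-to-completion over successful participants only.
# The times below are made up for illustration, not the study's raw data.
from statistics import mean

# (condition, completed_correctly, seconds)
trials = [("Salmon", True, 183), ("Salmon", True, 162), ("Salmon", False, 240),
          ("XP", True, 208), ("XP", True, 195), ("XP", False, 300)]

def mean_success_time(condition, trials):
    times = [t for cond, ok, t in trials if cond == condition and ok]
    return mean(times) if times else None  # None: "insufficient data"

print(mean_success_time("Salmon", trials))  # 172.5
print(mean_success_time("XP", trials))      # 201.5
```

Returning None when no participant in a condition succeeded mirrors the "insufficient data" cells that appear in the results table at the end of the deck.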


Part 4: Analysis

What led Salmon users to better performance?


How users spent their time - Wesley

[Bar chart: average behavior time per participant for the Wesley task, broken out by All XPFP users, All Salmon users, Successful XPFP users, and Successful Salmon users]


Where Salmon did better - Wesley

[Bar chart: average behavior time per participant for the Wesley task]


Where XP did better - Wesley

[Bar chart: average behavior time per participant for the Wesley task]


How users spent their time - Jack

[Bar chart: average behavior time per participant for the Jack task, broken out by All XPFP users, All Salmon users, Successful XPFP users, and Successful Salmon users]


Where Salmon did better - Jack

[Bar chart: average behavior time per participant for the Jack task]


Where XP did better - Jack

[Bar chart: average behavior time per participant for the Jack task]


Common UI problems summary

Feedback poor

Labels ambiguous

Omission error potential

Convention violation

Keeping options visible


User interface evaluation summary

Know what you’re measuring

Internal validity: Control your experiment

External validity: Make your experiment realistic


Rob Reeder
reeder@cs.cmu.edu

CMU Usable Privacy and Security Laboratory

http://cups.cs.cmu.edu/


Results

Task type             Small accuracy    Small speed              Large accuracy    Large speed
                      (Grid / Windows)  (Grid / Windows)         (Grid / Windows)  (Grid / Windows)
View simple           89% / 56%         29s / 64s                61% / 56%         42s / 61s
View complex          94% / 17%         35s / 55s                100% / 39%        39s / 67s
Change simple         89% / 94%         30s / 52s                100% / 100%       50s / 42s
Change complex        61% / 0%          70s / insufficient data  67% / 17%         73s / 104s
Compare groups        89% / 83%         39s / 103s               67% / 83%         52s / insufficient data
Conflict simple       67% / 61%         55s / 103s               72% / 61%         105s / 116s
Conflict complex      89% / 0%          29s / insufficient data  100% / 6%         71s / 115s
Memogate simulation   100% / 94%        20s / 66s                94% / 78%         100s / 143s
Precedence rule test  89% / 94%         42s / 118s               78% / 78%         111s / 126s


Measure the right thing!

Keystroke dynamics analysis poses a real threat to any computer user. Hackers can easily determine a user’s password by recording the sounds of the users' keystrokes. We address this issue by introducing a new typing method we call "Babel Type", in which users hit random keys when asked to type in their passwords. We have built a prototype and tested it on 100 monkeys with typewriters. We discovered that our method reduces the keystroke attack by 100%. This approach could potentially eliminate all risks associated with keystroke dynamics and increase user confidence. It remains an open question, however, how to let these random passwords authenticate the users.
