
Enabling non-visual Interaction by Stephen Brewster, University of Glasgow


DESCRIPTION

The talk was given at the InfinIT event 'Temadag om Interaktionsdesign' (Theme Day on Interaction Design), held on 20 June 2013. Read more about the event (in Danish) here: http://www.infinit.dk/dk/hvad_kan_vi_goere_for_dig/viden/reportager/styr_din_mobiltelefon_med_et_nik.htm


Page 1

Enabling non-visual interaction

Stephen Brewster

Glasgow Interactive Systems Group School of Computing Science

University of Glasgow

[email protected]

June 2013

Page 2

Page 3

Multimodal interaction

Key area of work is Multimodality

More human way to work

Not everyone has all senses / control capabilities

May not be available all of the time

No one sense can do everything on its own

Using other senses/control capabilities to make up for lack of visual display

Page 4

Research areas

Novel multimodal interaction techniques

Touchscreen and mobile user interfaces: improving the usability and user experience

Tabletop interaction with phone

Interaction with 3D TV, phone + TV

User interfaces for cameraphones and digital cameras

Accessibility: blind users and visualisation; older adults, navigation, mobility

Multimodal home care

Mobile health apps / sports performance apps

Page 5

Modalities

Non-speech audio

Earcons, 3D sound, sonification, Musicons

Computer haptics

Force-feedback, pressure input, temperature output

Tactile (vibrotactile and pin arrays), ultrasound

Gestural interaction

On-screen, in-air, multi-touch, capacitive sensing

Smell

Page 6

Display-less interaction design?

No display at all?

Then there is only input, with no feedback

Difficult design problem

Visual display-less?

Make up for lack of visual display by the use of alternative input and output techniques

Page 7

Overview of talk

Motivation

Issues in mobile interaction

‘Eyes free’ and ‘hands free’

Alternative multimodal solutions

Gestures/pressure for input

Haptic and audio displays for output

Tools that you might use to replace visual displays

Might need new forms of input as well as output

Page 8

Mobile interaction problems

Mobile interaction takes place in the world

Users involved in other tasks, on the move

Contexts very varied

Hands and eyes may be busy

Visual display may not be accessible or appropriate

New forms of interaction are needed if keyboard and screen are not easily available

Page 9

Touchscreens

Wide application of touchscreens

Phones, tablets, TV remotes, ….

Larger display area, direct interaction with finger, more flexible use of device, no need for physical keyboard

Touchscreen problems

No tactile feedback - ‘feel’ is poor

Input difficult and error prone

Requires much visual attention

Two hands

‘Fat finger’ problem

Page 10

Solutions?

Need a set of tools to use when visual displays not available

Multimodality

Gestures/pressure for input

Haptic and audio displays for output

Page 11

GESTURE INPUT

Page 12

Why gestures for input?

Kinaesthetic perception means gestures can be ‘eyes free’

Types

On screen of the device

Device in hand

Different body locations

Self-contained, no screen or surface needed

Can be one-handed or even no-handed

Good if users are involved in something else, e.g. carrying, operating machinery

Many sensors included in devices already

Others easily added via Bluetooth

Page 13

Multi-touch gestures

On-screen gestures

Tactile guidance for gestures

T-bars

Dynamic feedback

Keep finger on target

File-o-feel

Touch-n-twist

Page 14

Head gesture interaction

Non-visual interface where users could nod at audio sources to select them (see the sketch after this list)

Hands-free

3D audio for output

Will discuss more of the audio design later

Worked well when users were mobile

People could easily nod and walk

Backward nods not ideal
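A minimal sketch of this interaction style (illustrative only, not the project's actual code): hypothetical audio sources are laid out by azimuth around the listener, and a detected nod selects whichever source the head is currently pointing at. The source names, angles and tolerance are assumptions.

```python
# Hypothetical audio sources laid out by azimuth around the listener
# (degrees, 0 = straight ahead, negative = to the left).
SOURCES = {"Inbox": -60.0, "News": 0.0, "Music": 60.0}

def angular_difference(a, b):
    """Smallest signed difference between two angles, in degrees."""
    return (a - b + 180.0) % 360.0 - 180.0

def facing_source(head_yaw_deg, tolerance_deg=30.0):
    """Return the source the listener is currently facing, or None."""
    name, azimuth = min(SOURCES.items(),
                        key=lambda item: abs(angular_difference(head_yaw_deg, item[1])))
    return name if abs(angular_difference(head_yaw_deg, azimuth)) <= tolerance_deg else None

def on_nod(head_yaw_deg):
    """Called by a (hypothetical) nod detector; selects the faced source, if any."""
    target = facing_source(head_yaw_deg)
    if target is not None:
        print("Selected", target)

on_nod(55.0)   # head turned roughly towards "Music" -> Selected Music
```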

Page 15

Wrist gestures

Can rotate wrist to control a remote cursor

Investigated whether users could select targets using wrist

Very effective

90% accuracy for 9° targets (see the sketch after this list)

Other interactions

Shoulder click, foot tap, head nod, body tap, ...

More info: www.gaime-project.org
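A minimal sketch of wrist-rotation target selection using the 9° target width reported above; the number of targets, the dwell rule and the sample values are assumptions rather than the study's design.

```python
TARGET_WIDTH_DEG = 9.0   # target width reported on the slide
NUM_TARGETS = 10         # illustrative number of targets

def target_under_cursor(wrist_roll_deg):
    """Map a wrist-roll angle onto a row of 9-degree-wide targets (None if off the end)."""
    index = int(wrist_roll_deg // TARGET_WIDTH_DEG)
    return index if 0 <= index < NUM_TARGETS else None

def select_with_dwell(samples, dwell_samples=15):
    """Dwell selection: a target is chosen once the cursor rests on it long enough.

    samples: wrist-roll angles captured at a fixed rate."""
    current, count = None, 0
    for angle in samples:
        target = target_under_cursor(angle)
        if target is not None and target == current:
            count += 1
            if count >= dwell_samples:
                return target
        else:
            current, count = target, 1
    return None

print(select_with_dwell([20.0] * 20))   # 20 degrees falls inside target 2
```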

Page 16

Do gesture systems work in the wild?

Gesture RSS System

Allows browsing of news feeds

7 participants using the system on their morning commute for a week

Interaction

Menu Up/down/select -> rotate/shake right wrist

Back up a level -> Shake left wrist

Gating the gestures -> rotate left wrist upside down (see the sketch after this list)

Non-visual interaction

Speech/Non-speech audio
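A sketch of the gesture-to-action mapping above, with the gating gesture stopping everyday arm movement from being treated as input. The event names and feed list are assumptions; the sensing and the speech/audio output of the real system are not shown.

```python
# Hypothetical gesture events arriving from wrist-worn sensors. The mapping
# follows the slide; the gate stops everyday movement being read as input.

class GestureRSS:
    def __init__(self, feeds):
        self.feeds = feeds
        self.index = 0
        self.gated = True                            # gestures ignored until un-gated

    def handle(self, event):
        if event == "left_wrist_upside_down":        # gate / un-gate the gestures
            self.gated = not self.gated
            return "gesture input " + ("off" if self.gated else "on")
        if self.gated:
            return None
        if event == "right_wrist_rotate_up":         # menu up
            self.index = (self.index - 1) % len(self.feeds)
        elif event == "right_wrist_rotate_down":     # menu down
            self.index = (self.index + 1) % len(self.feeds)
        elif event == "right_wrist_shake":           # select: read the item aloud
            return "speak: " + self.feeds[self.index]
        elif event == "left_wrist_shake":            # back up a level
            return "back"
        return self.feeds[self.index]                # current menu item

reader = GestureRSS(["BBC", "CNN", "TechCrunch"])
for event in ["left_wrist_upside_down", "right_wrist_rotate_down", "right_wrist_shake"]:
    print(reader.handle(event))
```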

Page 17

Page 18

PRESSURE INPUT

Page 19

Pressure input

Little studied in HCI, but a rich source of input and control

Musical instruments

Drawing, holding / grasping

Can we use pressure as another input mechanism?

Avoid the ‘fat finger’ problem by doing gestures in the z dimension

No need for (x, y) positioning of the finger, so it is easy to do eyes free (a pressure-menu sketch follows)
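To make the z-dimension idea concrete, here is an illustrative pressure menu: a continuous pressure reading is quantised into discrete menu levels, with a little hysteresis so the selection does not flicker at level boundaries. The level names, thresholds and hysteresis value are assumptions, not taken from the talk.

```python
# Illustrative pressure menu: a pressure reading in 0.0-1.0 picks one of
# four levels; hysteresis keeps the selection stable near level boundaries.

LEVELS = ["Contacts", "Messages", "Music", "Settings"]
HYSTERESIS = 0.03

def pressure_to_level(pressure, previous_level):
    """Return a menu index for the current pressure, sticky around boundaries."""
    width = 1.0 / len(LEVELS)
    raw = min(int(pressure / width), len(LEVELS) - 1)
    if previous_level is not None and raw != previous_level:
        boundary = max(raw, previous_level) * width
        if abs(pressure - boundary) < HYSTERESIS:
            return previous_level          # too close to the edge: keep the old level
    return raw

level = None
for p in [0.10, 0.24, 0.26, 0.40, 0.80]:
    level = pressure_to_level(p, level)
    print(p, LEVELS[level])
```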

Page 20

Pressure menus

Page 21

Grip and grasp

Can we use the way we grip a device to control it?

Can we use this for interaction?

Make a two-handed interaction into a one-handed version (sketched below)
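An illustrative sketch of turning grip into a one-handed control, assuming pressure sensors along the sides of the device and mapping squeeze strength to zoom. The sensor layout, thresholds and gain are assumptions, not necessarily how the study's prototype worked.

```python
# One-handed zoom from grip: assume a pressure sensor on each side of the
# phone. Squeezing harder than a resting grip zooms in, a deliberately light
# grip zooms out. All constants are made up for the example.

REST_GRIP = 0.30      # typical holding pressure (normalised 0-1)
DEAD_ZONE = 0.05      # ignore small fluctuations around the resting grip
ZOOM_GAIN = 2.0       # zoom change per unit of extra squeeze, per update

def zoom_delta(left_pressure, right_pressure):
    """Convert the average squeeze on both edges into a zoom increment."""
    squeeze = (left_pressure + right_pressure) / 2.0 - REST_GRIP
    if abs(squeeze) < DEAD_ZONE:
        return 0.0
    return ZOOM_GAIN * squeeze

zoom = 1.0
for left, right in [(0.31, 0.32), (0.55, 0.60), (0.20, 0.22)]:
    zoom = max(0.25, zoom + zoom_delta(left, right))
    print(round(zoom, 2))
```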

Page 22

Page 23

Grip results

Compared rotate and zoom

Pinch/rotate using multitouch and 2 hands

Grip

One handed grip equal to or better than traditional method

Less time hunting for small buttons

No finger occlusions

No ‘fat finger’ problem

Also works well when walking

Squeezing devices very effective for input

Page 24

HAPTIC FEEDBACK

Page 25

Haptic feedback

Haptics – to do with the sense of touch

Display to the skin

Many different components

Pressure, temperature, vibration, …

Has benefits over visual display

Eyes-free

Tactons, tactile icons

Page 26

Design of Tactons

Tactons are tactile messages that can be used to communicate non-visually

Encode information using parameters of cutaneous perception (see the sketch after this list):

Waveform

Duration/rhythm

Body location
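A minimal sketch of a Tacton as a structured message built from the three parameters listed above; the concrete vocabulary (which waveform, rhythm or location means what) is illustrative rather than a published encoding.

```python
from dataclasses import dataclass

@dataclass
class Tacton:
    """A tactile message described by the parameters named on the slide."""
    waveform: str        # e.g. "sine" (smooth) vs "square" (perceived as rough)
    rhythm: tuple        # on/off durations in milliseconds
    body_location: str   # e.g. "wrist", "waist", "upper arm"

# Two example messages: rhythm distinguishes the event type, waveform
# (perceived roughness) distinguishes its urgency. Purely illustrative.
NEW_EMAIL      = Tacton("sine",   (100, 100, 100), "wrist")
URGENT_EMAIL   = Tacton("square", (100, 100, 100), "wrist")
CALENDAR_ALERT = Tacton("sine",   (300, 200, 300), "wrist")

def describe(t: Tacton) -> str:
    return f"{t.waveform} wave, rhythm {t.rhythm}, on the {t.body_location}"

print(describe(URGENT_EMAIL))
```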

Page 27

Tactile button feedback

Touchscreen phones have no tactile feedback for buttons

More errors typing text and numbers

Compared performance of real buttons to touchscreen, to touchscreen+tactile

In lab and on Glasgow subway

Touchscreen+tactile as good as real buttons

Touchscreen alone was poor

Combining tactile + audio feedback

Page 28

Tactile feedback for typing

Previous studies showed adding tactile feedback to touchscreen typing increases performance

Can we use the tactile feedback to communicate more?

Ambient display

Change the feel of buttons based on external factors

Arrival of email, proximity of friend

Roughness and duration

Duration indicated proximity

Roughness indicated friend or family (see the sketch below)

Users could identify meaning while typing very accurately
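A sketch of the ambient encoding described above, mapping proximity to feedback duration and friend-vs-family to roughness; the distance bands, durations and roughness values are invented for illustration.

```python
# Ambient tactile keyboard feedback, following the encoding on the slide:
# duration encodes how near the person is, roughness encodes friend vs family.

def keypress_feedback(distance_m, relationship):
    """Return (duration_ms, roughness 0-1) for the per-keypress vibration."""
    if distance_m < 50:
        duration = 120           # very close: a long, noticeable pulse
    elif distance_m < 500:
        duration = 60
    else:
        duration = 30            # far away: keep the ordinary short click
    roughness = 0.8 if relationship == "family" else 0.2
    return duration, roughness

print(keypress_feedback(30, "family"))    # (120, 0.8)
print(keypress_feedback(800, "friend"))   # (30, 0.2)
```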

Page 29

Artex: Artificial textures from everyday surfaces

Lack of tactile feedback on touchscreens

Goal: multiple texture patches for texturing the screen

Aims to feel like a familiar texture, not a tactile effect

Texture with everyday textures: record the texture using a contact mic attached to a stylus

Process into a loopable audio file

Vary amplitude and playback rate with the user’s finger speed over the screen (sketched below)
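A sketch of the playback-side mapping just described: the looped texture recording is played louder and faster as the finger moves faster, and falls silent when the finger stops. The reference speed and scaling are assumptions.

```python
# Artex-style playback mapping (illustrative constants): the looped texture
# recording tracks the stroke speed and is silent when the finger is still.

BASE_RATE = 1.0         # playback rate at the speed the texture was recorded
REFERENCE_SPEED = 0.1   # finger speed (m/s) that maps to the base rate
MAX_AMPLITUDE = 1.0

def texture_playback(finger_speed):
    """Map finger speed over the screen to (playback_rate, amplitude)."""
    if finger_speed <= 0.0:
        return 0.0, 0.0                    # finger still: no texture
    rate = BASE_RATE * finger_speed / REFERENCE_SPEED
    amplitude = min(MAX_AMPLITUDE, finger_speed / REFERENCE_SPEED)
    return rate, amplitude

for speed in [0.0, 0.05, 0.1, 0.3]:
    print(speed, texture_playback(speed))
```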

Page 30

THERMAL FEEDBACK

Page 31

Temperature Based Interaction

Temperature an unused part of touch

Can we use it for communication?

Very strong emotional response to temperature

Humans are very sensitive to temperature

Key technique for determining material properties

Children’s hotter/colder game

Page 32

Temperature

Peltier device

4 heat pumps (2 pairs of hot and cold)

Can be mobile or desk based

Ran a detailed series of psychophysical studies to investigate ranges of temperatures that should be used

Also tested these while mobile to see more real-world effects

Page 33

Page 34

Indoor mobile thermal study

Page 35

Effects of changing environment

Front of School vs. Back of School

Page 36

Design Recommendations

Palm is most sensitive but wrist and arm are acceptable

Stimulus intensities should be at least 3°C to guarantee detection, but at most 6°C for cooling and below 6°C for warming to ensure comfort

Both warm and cool stimuli are detectable and comfortable but cool stimuli are preferred

Cool detected fastest

A moderate rate of change (2-3°C/s) provides good salience, but a lower rate of change is required for high-intensity stimuli (see the sketch below)
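The recommendations above, wrapped into a simple check on a proposed thermal stimulus. The limits come from the slide; the cut-off used for "high intensity" (5°C) is an assumption.

```python
# Check a proposed thermal stimulus against the design recommendations above.
# intensity_c is the temperature change from neutral skin temperature.

def check_thermal_stimulus(direction, intensity_c, rate_c_per_s):
    """direction: 'warm' or 'cool'. Returns a list of warnings (empty = OK)."""
    warnings = []
    if intensity_c < 3.0:
        warnings.append("below 3 deg C: detection not guaranteed")
    if direction == "cool" and intensity_c > 6.0:
        warnings.append("cooling beyond 6 deg C risks discomfort")
    if direction == "warm" and intensity_c >= 6.0:
        warnings.append("warming should stay below 6 deg C for comfort")
    if rate_c_per_s > 3.0:
        warnings.append("rate above the moderate 2-3 deg C/s range")
    if intensity_c >= 5.0 and rate_c_per_s > 2.0:   # 5 deg C cut-off is assumed
        warnings.append("use a lower rate of change for high-intensity stimuli")
    return warnings

print(check_thermal_stimulus("warm", 6.0, 2.5))
```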

Page 37

Applications

Thermal icons

Enhancing emotional experiences

Thermal feedback can enhance the experience of consuming media (images, music)

Notifications and warnings

Page 38

ULTRASOUND FEEDBACK

Page 39

Ultrasound haptics

New project with University of Bristol

Using a phased array of ultrasound emitters we can create pressure waves in the air (see the sketch after this list)

Create ‘feelable’ forces in the air above the emitters

Non-contact haptics

Can also be used to support (light) objects in the air
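A sketch of the basic focusing principle behind a phased array: each emitter is driven with a phase proportional to its distance from the focal point, so all the wavefronts arrive there in phase and add up into a feelable pressure spot. This is the general idea only, not the Bristol system's implementation; the array geometry and focal point are made up.

```python
import math

SPEED_OF_SOUND = 343.0   # m/s in air
FREQUENCY = 40_000.0     # Hz, a typical airborne ultrasound transducer frequency
WAVELENGTH = SPEED_OF_SOUND / FREQUENCY

def emitter_phases(emitters, focus):
    """Phase (radians) to drive each emitter so the waves add up at `focus`.

    With a drive signal sin(w*t + phase), every emitter's wave reaches the
    focal point as sin(w*t), so all contributions arrive in phase there.
    emitters and focus are (x, y, z) positions in metres.
    """
    k = 2 * math.pi / WAVELENGTH
    return [(k * math.dist(e, focus)) % (2 * math.pi) for e in emitters]

# A made-up 3 x 3 patch of emitters, 1 cm apart, focusing 10 cm above its centre.
emitters = [(x * 0.01, y * 0.01, 0.0) for x in range(3) for y in range(3)]
print([round(p, 2) for p in emitter_phases(emitters, (0.01, 0.01, 0.10))])
```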

Page 40

Page 41

Page 42

Page 43

Levitation!

Page 44

Ultrasound haptics

Challenges

Position array around the edges of a device to create feedback

Combine with Kinect depth camera for non-contact input and output

Texture design

Just beginning to see the applications

Page 45

AUDIO FEEDBACK

Page 46

Non-speech audio feedback

Music, structured sound, sound effects, natural sound

Why non-speech sound?

Icons vs text, non-speech vs speech

Good for rapid non-visual feedback

Trends, highly structured information

Earcons

Structured non-speech sounds

Musicons

Short snippets of well-known music used for interaction

People very good at recognising music

Page 47

3D audio interaction

Need to increase the audio display space

Deliver more information

Quickly use up display space

3D audio

Provides larger display area

Monitor more sound sources

Non-individualised HRTFs, headphones

Planar sound (2.5D)

‘Audio windows’

Each application gets its own part of the audio space (see the sketch below)
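A sketch of the ‘audio windows’ idea: the horizontal (2.5D) plane in front of the listener is divided into non-overlapping azimuth sectors, one per application, and each application's sounds are panned within its own sector. The applications, field width and equal-sized split are assumptions.

```python
# Illustrative "audio windows": each application gets its own azimuth sector
# on the horizontal plane in front of the listener (0 degrees = straight ahead).

FIELD = (-90.0, 90.0)   # usable azimuth range in degrees

def assign_windows(apps):
    """Give each application an equal, non-overlapping slice of the field."""
    start, end = FIELD
    width = (end - start) / len(apps)
    return {app: (start + i * width, start + (i + 1) * width)
            for i, app in enumerate(apps)}

def place_sound(windows, app, position=0.5):
    """Azimuth for a sound at a relative position (0-1) inside the app's window."""
    lo, hi = windows[app]
    return lo + position * (hi - lo)

windows = assign_windows(["Email", "Calendar", "Music"])
print(windows)
print(place_sound(windows, "Calendar"))   # centre of the middle window: 0 degrees
```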

Page 48

How do we use spatial audio?

Applications

Progress indicator

Diary

Pie Menus

Non-visual navigation

Combines well with gesture

Both spatial

Pointing/orienting towards a sound is natural

Page 49

AudioFeeds

Mobile application for monitoring activity in social media

Monitoring state of feeds

Spotting peaks of activity in one feed

Twitter, FaceBook, RSS

Spatialized sound

Placed each type of activity in a different location

Each type had a different sound

Within each type, different actions have related sounds

Page 50

AudioFeeds

Users able to monitor feeds and maintain overview

Even with complex soundscapes

When mobile

Sound mapping:

FaceBook (water): Inbox msg (splash), News feed (bubbles), Notification (pouring), Friend request (drops)

Twitter (birds): Friend feed (chirp), Direct msg (crow), Reference (junglefowl), Hashtag (canary)

RSS (abstract instruments): CNN (didgeridoo), BBC (zither), TechCrunch (wind chime), Uni News (pan flute)

Page 51

Pulse: an auditory display to present a social vibe

Presenting the ‘vibe’ or ‘pulse’ of an area while you move through it

‘Play’ geo-located tweets

Sonification

Presented around the user in 3D sound

Message volume (water splashes)

Message density (flow rate of river)

Topic diversity (bubbling sound), as sketched below

Tested in lab and in Edinburgh during the festival

Effective at giving awareness
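A sketch of the sonification mapping listed above, turning local Twitter statistics into the three water-sound parameters; the input measures and scaling constants are illustrative, not Pulse's actual values.

```python
# Pulse-style sonification mapping (illustrative): local Twitter statistics
# drive the three water sounds named on the slide. Outputs are 0-1 levels
# for a hypothetical sound engine.

def pulse_parameters(nearby_tweets, tweets_per_minute, unique_topics):
    """Map the Twitter activity around the user's location to sound parameters."""
    return {
        "splashes":   min(1.0, nearby_tweets / 100.0),      # message volume
        "river_flow": min(1.0, tweets_per_minute / 10.0),   # message density
        "bubbling":   min(1.0, unique_topics / 20.0),       # topic diversity
    }

print(pulse_parameters(nearby_tweets=40, tweets_per_minute=6, unique_topics=12))
```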

Page 52

Conclusions

Situations where eyes (and hands) not available

Current display/interaction techniques impossible to use

Multimodal interaction can provide new tools for designers

New input techniques needed with non-visual outputs

Gestures are good for input as they can be ‘hands free’

Sound and tactile feedback are ‘eyes free’

Hard to overcome lack of visual display

Multimodal interaction techniques provide new opportunities and applications

Page 53

Enabling non-visual interaction

Stephen Brewster

Glasgow Interactive Systems Group University of Glasgow

[email protected]

www.dcs.gla.ac.uk/~stephen

June 2013