
Haptic Navigation in Mobile Context

Hanna Venesvirta

University of Tampere

Department of Computer Sciences

Interactive Technology

Seminar “Haptic Communication in

Mobile Contexts”

October 2008


University of Tampere

Department of Computer Sciences

Interactive Technology

Hanna Venesvirta: Haptic Navigation in Mobile Context

Seminar paper, 16 pages, 5 indexes

December 2008

Haptic communication is intuitive and easy to learn, and it provides a new and unique interaction method for all users, especially for blind and visually impaired users. Several haptic navigation devices exist. In designing and implementing these products and prototypes, projects have been especially interested in visually impaired users and in users in special situations, such as pilots, firefighters and military personnel. Furthermore, these devices offer something extra for all users.

Key words and terms: Haptic Communication, Haptic Navigation, Haptic I/O, Mobile

or Wearable Devices


Table of Contents

1. Introduction
2. Haptic navigation: what, where, why and why not?
2.1. What is haptic navigation and where it can be used
2.2. Advantages of haptic navigation
2.3. Challenges of haptic navigation
3. Technologies used in haptic navigation
4. Haptic navigation: examples
4.1. MOMO: A haptic navigation device
4.2. Lead-Me Interface for a Pulling Sensation from Hand-held Devices
4.3. CabBoots – Shoes with integrated Guidance System
4.4. Vibro-Vest: A Wireless Haptic Navigation Device
4.5. ActiveBelt
4.6. Technojewelry
5. Conclusion
References


1. Introduction

Haptic interaction between the user and the interface or device has many advantages. Researchers have found many opportunities in haptic navigation, even as the main modality of communication between the user and the device, because haptics has been found to be an intuitive and natural way of communicating.

Haptics supports users' cognitive attentiveness and perception without disturbing other modalities. It has been thought to be an especially powerful communication method for visually impaired people, but also for situations where other modalities are occupied.

Haptic navigation is a navigation method that uses the sense of touch as an input or output channel. The device can interact with the user by vibrating, leaning in the desired direction, or pulling or pushing the user, among other methods.

Here, I will introduce haptic navigation in the mobile context. Section 2 leads the reader into the subject: what haptic navigation is. In Section 3, I discuss what interaction methods and technologies have been used in haptic navigation, and in Section 4 I introduce a few examples of haptic navigation. Finally, I conclude in Section 5.

2. Haptic navigation: what, where, why and why not?

In this section, I will discuss several things: what haptic navigation is; where it can be used; what the advantages of haptic navigation are compared to other navigation methods; and what its challenges are.

2.1. What is haptic navigation and where it can be used

The word navigation can be understood in several ways. It can refer to steering a (motor) vehicle, such as a car, an aeroplane, or a ship. It can also mean browsing the World Wide Web or a (graphical) interface. In some cases, the word describes a situation in which someone guides or controls someone else, or a machine. [OED]

In haptic navigation, the user has a device that leads them to the desired location using feedback based on the sense of touch. Feedback can be given through vibration (several haptic mice use vibration; see also [IDEO; Erp, 2001; Hamm; Wang and O'Friel]) or pulses, by pushing and pulling [Amemiya et al., 2008], or by leaning in the needed direction [Frey, 2007; Wang and O'Friel]. Sometimes the device can use several kinds of haptic feedback at the same time [Wang and O'Friel], or a combination of different modalities [IDEO]. In general, the haptic communication is one-sided: the device gives output, but the user does not communicate with the device, at least not through haptic interaction. Then again, it can be argued whether the user's responses to the information given (e.g. changing movement) are communication with the device; at the least, they are not very active communication.

One can use haptic navigation in a virtual environment, when browsing a graphical interface, or when moving in the real world. Some techniques have been produced for haptic navigation in graphical interfaces: for instance, it is possible to make large amounts of data more understandable by using haptics and haptic navigation, much as information visualization does. This is useful, for example, for visually impaired users, as it gives them access to information usually presented in visual or graphical form. It may also make browsing faster and easier, as navigation is not based on reading or on using screen readers. (For more about haptic navigation in graphical interfaces, see e.g. [Wall and Brewster, 2006].)

Moving in a virtual environment (VE) by means of haptic navigation is very much like moving in the real world: the user searches for a route and explores the VE, which means they might need help finding their way or controlling their route. (For more about haptic navigation in VEs, see e.g. [Erp, 2001] or [Nemec et al., 2004].)

Several haptic devices for navigating in the real world have been developed. Some have been made for visually impaired users (e.g. [Amemiya et al., 2008; Amemiya and Sugiyama, 2008]), some for special situations where the visual and auditory channels are occupied, such as driving a car, navigating a ship, or flying [e.g. Erp et al., 2005], and some for common use, in various situations and for various people [e.g. MOMO; IDEO; Frey, 2007].

From the viewpoint of mobile haptic navigation, it is reasonable to concentrate on the latter. Haptic navigation clearly seems to be needed, and to have potential, in real-world environments; this is why I will confine my later discussion to this particular subject.

2.2. Advantages of haptic navigation

Haptic communication between the device and the user has been noted to be effective, since the haptic modality can provide very intuitive interaction [Amemiya et al., 2008]; after all, communication based on touch is our very first way of communicating. When communicating with navigation devices, it is important that the communication between the device and the user remains intuitive and comfortable. For example, the designers of CabBoots [Frey, 2007] and the Lead-Me interface [Amemiya et al., 2008] regard their products as intuitive and easy to learn, as interaction in both is based on metaphors from real life.

Haptic information is non-verbal and cognitively less distracting than information from other channels [Amemiya et al., 2008]. When cognitive load is heavy, as when travelling in unfamiliar surroundings, it is very important to get information without disturbing or limiting important modalities such as sight or hearing. As haptic communication is non-verbal and non-visual, attentiveness is not disturbed, and more information channels remain available. In addition, visually impaired users are often mentioned as benefiting from haptic communication, since they prefer devices that do not narrow down their remaining communication channel, hearing. This is a really important benefit: for people whose sight is lacking, it is essential to retain the ability to communicate with other people and to stay aware of the environment, for their own safety. It has also been discovered that the haptic channel supports long-term location memory [Amemiya et al., 2008; Chapman et al., 2001].

Users can learn to use haptic navigation devices quickly, even after short practice. It has likewise been discovered that a haptic device can be used for navigation both in ordinary situations and in operational environments, such as flying a helicopter. [Erp et al., 2005]

Employing mobile devices for navigation makes using a navigation device easier, as these devices are carried in exactly the situations where navigation is often needed: on the road. However, both the mobile and the haptic elements pose challenges for navigation; some of these challenges are discussed in the next section.

2.3. Challenges of haptic navigation

In general, the way people use cognitive maps is a great challenge for navigation devices [Bradley and Dunlop, 2005]. People tend to build cognitive maps based on what they are looking at and what interests them. It has been speculated that (1) the time of day (night, day), (2) the season, and (3) the direction of travel all influence how people build cognitive maps [Jonsson, 2002]. Because people have different cognitive maps, it is challenging to take them into account when designing navigation devices. There has also been very little research on how visually impaired people use cognitive maps [Bradley and Dunlop, 2005]; since they are potential users of haptic navigation, it would be important to study this.

Using only haptics in navigation devices brings two communication problems: it is very difficult to code direction and, above all, distance using haptic feedback [Erp et al., 2005]. Coding direction is notably easier; even with only two coded directions, navigation has been shown to succeed [e.g. Bosman et al., 2003], and with more actuators, directional changes become easier to understand [Erp et al., 2005].

Coding distance is challenging regardless of how the device is attached to the user. Then again, when walking, it is not always important to know how great the distance is; users are more concerned with the direction. Still, according to Erp et al. [2005], users were pleased if distance and, above all, reaching the destination were coded. The problem with coding distance on a haptic navigation device is that people have no baseline telling them where exactly the destination is; all they can say is that they are approaching it. Only after a long period of using a device might people learn to estimate distance. It is also possible that this problem will never be solved, and in most cases designers have not even tried to implement any coding for it.
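Erp et al. [2005] studied distance coding with users, but one does not need their exact scheme to see the idea. A simple, illustrative approach is to shorten the vibration pulse interval as the destination gets closer and to reserve a distinct signal for arrival; the sketch below is my own, and all constants in it are assumptions rather than values from the paper.

```python
def pulse_interval_ms(distance_m, min_ms=200, max_ms=2000, range_m=500):
    """Map remaining distance to a vibration pulse interval.

    Closer targets pulse faster; beyond range_m the interval is capped at
    max_ms. A return value of 0 stands for a distinct continuous
    "destination reached" signal. The constants are illustrative only.
    """
    if distance_m <= 0:
        return 0  # arrival: continuous vibration instead of pulses
    fraction = min(distance_m, range_m) / range_m
    return int(min_ms + fraction * (max_ms - min_ms))
```

A distinct arrival signal addresses the baseline problem at least at the endpoint: the user may not be able to judge how far away the destination is, but they cannot miss reaching it.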

Mobility as such also brings challenges for haptic navigation, namely combining a mobile device with kinaesthetic interaction. According to Amemiya et al. [2008], most of the interfaces used in mobile devices today are not well suited to them. Mobile devices are often compact, so massive technologies cannot be used. Moreover, since these small devices are usually held in one hand, it is challenging to implement technologies that indicate direction; vibration, for instance, is not very informative when used with a mobile phone. Amemiya et al. have their own suggestion for solving this problem; it is discussed below, in Section 4.2.

Finally, there may be a problem that the literature has ignored: nearly all the products I found require the user either to wear them or to carry them in the hand. Wearable devices such as belts and vests appear uncomfortable and unsuited to everyday use. Hand-held devices, in turn, require the user to carry them, which limits the user's activities. GPS Toes (introduced in Section 4.6), designed by IDEO [IDEO], came closest to being a device that is comfortable and unnoticeable.

3. Technologies used in haptic navigation

In haptic navigation, both tactile and force feedback can be used. Tactile feedback, such as vibration, seems to be the most common today, but according to Amemiya et al. [2008], force feedback could be more understandable, effective and intuitive, especially in hand-held devices such as mobile phones.

As mentioned above, several haptic navigation devices use tactile feedback, usually vibration. Later in this paper, I will introduce the following examples that use vibration (and possibly other interaction methods) to communicate with the user: GPS Toes by IDEO [IDEO], Vibro-Vest [Hamm], ActiveBelt [Tsukada and Yasumura, 2004] and MOMO [Wang and O'Friel].

As examples of force feedback, I will introduce the Lead-Me interface by Amemiya et al. [2008] and CabBoots by Frey [2007]. It can be argued that MOMO [Wang and O'Friel] uses force feedback too, as it interacts with the user by vibrating and leaning in the needed direction.

Vibration feedback seems to have great potential to illustrate distance more intuitively than force feedback. For instance, Erp et al. have studied with users how distance could be coded in a wearable haptic navigation device [Erp, 2005; Erp et al., 2005].

Still, for more everyday situations, such as navigating an unfamiliar city, force feedback could be more useful, at least when implemented in a hand-held device. First, in an everyday situation like this the user rarely needs any coding for distance, but rather for direction. Furthermore, people usually would not wear a big, uncomfortable and heavy device; they would prefer a proper interface implemented in their own mobile phone.

4. Haptic navigation: examples

In this section, I will introduce six different kinds of haptic navigation devices. None of them are commercial products; they are more like scientific or artistic trials. Still, whether any of them will end up on the market remains to be seen. I have tried to choose varied examples, as it would be unwise to discuss similar devices here.

4.1. MOMO: A haptic navigation device

MOMO is a fully haptic, mobile navigation device designed by Wang and O'Friel [MOMO; Wang and O'Friel]. It navigates the user by vibrating and by leaning in the direction the user should go. There is no haptic input; the only interaction is output. Unfortunately, no explicit information is available that would comprehensively describe how the user can interact with MOMO. The designers state that they pre-programmed the routes they used to test the device, so it seems there is currently no way to give the device direct orders.

MOMO is twelve inches (about 30 centimetres) tall, its diameter is eight inches (about 20 cm), and it weighs one pound (less than half a kilogramme).

MOMO comprises a GPS module, a digital compass, an Arduino board, two servo motors and a vibration motor (see Figure 1, on the left), and its hardware is open. Its sweater was crocheted out of wool, cotton, and, as the designers mention, “love, creating a soft, huggable surface” (see Figure 1, on the right).


Figure 1: On the left, the mechanics of MOMO; on the right, MOMO in its clothing.

According to Wang and O'Friel, MOMO provides positive emotional experiences, enabling people to feel empowered in unfamiliar places. The designers pre-programmed the GPS coordinates of twelve New York City parks and used MOMO as a tour guide (see Figure 2).
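MOMO's control software is not documented in detail, but the leaning behaviour can be sketched: compare the compass heading with the GPS bearing to the next waypoint and decompose the error into tilt components for the two servos. The function below is my own illustration under that assumption; the tilt limit and servo arrangement are not MOMO's published design.

```python
import math

def lean_command(heading_deg, bearing_deg, max_tilt_deg=20):
    """Convert the error between compass heading and GPS bearing into
    forward/sideways tilt components for two orthogonal servos.

    The 20-degree tilt limit and the decomposition are illustrative
    assumptions, not values from the MOMO project.
    """
    # Wrap the heading error into (-180, 180] degrees
    error = math.radians((bearing_deg - heading_deg + 180) % 360 - 180)
    # Lean toward the target: decompose the unit direction vector
    forward = max_tilt_deg * math.cos(error)
    sideways = max_tilt_deg * math.sin(error)
    return round(forward, 1), round(sideways, 1)
```

With a scheme like this, a target straight ahead produces a pure forward lean, while a target to the user's right produces a pure sideways lean.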

At present, using MOMO for navigating to arbitrary places is evidently not possible. Yet I personally think that if MOMO's interface were expanded so that one could interact with it, for instance by giving it direct orders about the destination or about changes to the route, it could become a versatile navigation device.

MOMO has won an award in Art Hacks on Arduino Contest 2008.


Figure 2. MOMO working.

4.2. Lead-Me Interface for a Pulling Sensation from Hand-held Devices

The Lead-Me interface uses a metaphor of pulling and pushing when guiding the user. Amemiya et al. [2008] compare the metaphor to a parent leading a child: the child can look around, knowing that when the direction is about to change, the parent will push or pull their hand.

Figure 3. Lead-Me: Overview of the prototype of the haptic device.

Amemiya et al. [2008] have developed a design that uses “different acceleration patterns for the two directions to create a perceived force imbalance and thereby produce the sensation of directional pushing or pulling”. The prototype of the device is based on a crank-slider mechanism, which imparts a feeling of back-and-forth movement that is perceived as a sensation of pushing and pulling. Schematic drawings and an illustration of the prototype are shown in Figure 3.
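The quoted principle can be illustrated numerically. In the sketch below (my own illustration, not the authors' actual crank-slider kinematics), each cycle contains a short, strong acceleration pulse in the "pull" direction and a long, weak pulse back. The net impulse per cycle is zero, so the sliding mass returns to rest, but only the strong pulse is assumed to exceed the perceptual threshold, which is what creates the directional sensation.

```python
def asymmetric_accel_cycle(peak=4.0, fast_fraction=0.2, samples=100):
    """One cycle of an asymmetric acceleration profile (arbitrary units).

    A short, strong pulse in the "pull" direction is balanced by a long,
    weak pulse the other way, so each cycle has zero net impulse. The
    constants are illustrative; Amemiya et al. [2008] derive theirs from
    a crank-slider mechanism.
    """
    n_fast = int(samples * fast_fraction)
    n_slow = samples - n_fast
    # Weak-phase amplitude chosen so the cycle's net impulse is zero
    weak = -peak * n_fast / n_slow
    return [peak] * n_fast + [weak] * n_slow
```

Because perception of acceleration is nonlinear, the strong phase dominates what the hand feels, even though the motion averages out mechanically.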


Amemiya et al. have some ideas about how the interface could be used. Figure 4 shows an assisted navigation application (on the left). According to the designers, Lead-Me could be implemented in a hand-held device such as a mobile phone, which could use GPS together with Lead-Me to guide the user to the destination. They have also thought that Lead-Me could be used in a game controller (see Figure 4, on the right).

Figure 4. Proposed interfaces for Lead-Me: Assisted navigation application (on left and

in the middle) and an effective game controller (on right).

The metaphor of the interface seems to fit situations where only one navigation device (e.g. a mobile phone) is used. In such situations, the pushing-and-pulling metaphor could make it easier to indicate direction. At the moment, Amemiya et al. are testing whether the metaphor really works as effectively as expected.

There is no definite information on whether Lead-Me could be implemented in even smaller devices; if not, the device has to be hand-held, which will limit the user's activity in some cases.

4.3. CabBoots – Shoes with integrated Guidance System

With CabBoots (see Figure 5), the user walks along a virtual path that the shoes will not let them wander off [Frey, 2007; see also Frey]. The idea of the interface is to plan the most suitable route for the user, based on a known starting point and the desired destination. While walking, the user can trust that CabBoots knows the way.


Figure 5. CabBoots.

The designer uses the metaphor of walking on a path: the user walks along a virtual path, and if they tread on its edge, in other words are about to stray off course, CabBoots simulate the feel of the edge so that the user can correct their direction (see Figure 6).

Figure 6. CabBoots-tour.

The first prototype of CabBoots consists of a pair of shoes equipped with sensors and mechanics, wirelessly connected to a laptop computer that the user needs to carry around and that runs the control software. Servo motors connected to wooden flaps in the shoes can set the angle of the sole when needed. Several sensors deliver information about the actual state of the shoe, and thereby of the foot. The software is needed for laying out the paths and for providing a visual control panel for monitoring the shoes' spatial state. All this makes it possible to navigate along a virtual path. The designer argues that the communication metaphor is familiar because it is based on real-life experiences. [Frey, 2007]
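Frey [2007] does not publish the control logic, but the path-edge metaphor can be sketched as a simple mapping from the foot's lateral offset relative to the path (from whatever localisation the system provides) to a sole angle. The path width, tilt gain and tilt limit below are my own assumptions for illustration.

```python
def sole_tilt_deg(lateral_offset_m, path_half_width_m=0.5, max_tilt_deg=15):
    """Tilt the sole when the foot treads on the virtual path's edge.

    Inside the path no tilt is applied; beyond the edge the tilt grows
    with the overshoot, mimicking the slope at the side of a real trail.
    All constants are illustrative, not from Frey [2007].
    """
    overshoot = abs(lateral_offset_m) - path_half_width_m
    if overshoot <= 0:
        return 0.0
    tilt = min(max_tilt_deg, overshoot * 30.0)  # 30 degrees of tilt per metre
    # The tilt direction pushes the foot back toward the path centre
    return tilt if lateral_offset_m > 0 else -tilt
```

The design choice mirrors the metaphor itself: as long as the user stays on the virtual path, the shoes behave like ordinary shoes, and feedback appears only at the moment it is needed.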


The designer has made a second prototype in which the shoes host all the necessary parts themselves: mechanics, electronics, power supply and an RF link. The units can also be strapped onto any shoes (as long as the shoe is a suitable size) and are connected wirelessly via Bluetooth. The control software can now run not only on a computer but also on a PDA or a mobile phone, so huge progress was made between 2007 and 2008. [Frey]

4.4. Vibro-Vest: A Wireless Haptic Navigation Device

In the fall of 2002, Damon Hamm and three other designers built a wireless haptic navigation and guidance system called Vibro-Vest [Hamm]. The designers also produced a game, which they named “Human Pac-Man”: a player guides a person wearing the vest through a human-sized maze, and the person in the maze receives the player's commands as haptic feedback. (See Figures 7 and 8.)

Figure 7. The Vest hardware

Vibro-Vest is not technically a mobile navigation device; instead, a video game console is used to guide the wearer. However, Vibro-Vest exemplifies one of the many interesting possibilities of haptic navigation, e.g. in the context of social interfaces, and the designer mentions [Hamm] that the product has many possible applications.

Other vest and belt applications have also been produced [e.g. Erp et al., 2005; see also Erp, 2005]. In most cases feedback is given by vibration, and the number of actuators depends on the product. According to Erp et al. [2005], vibrotactile navigation is a powerful method, especially for illustrating direction. Yet it is questionable whether these kinds of products are mobile: the navigation device itself can be wireless, but when another device is needed to specify the route, the mobility of the concept is arguable.


Figure 8. Video game console (on the left) and the human-sized maze (on the right).

4.5. ActiveBelt

ActiveBelt is a novel belt-type wearable tactile display that can transmit directional information to the user. According to Tsukada and Yasumura [2004], this kind of haptic navigation device has at least three advantages: (1) as there are several actuators on the belt (their prototype uses eight), the user can easily match the information given by the device to directions in the real world; (2) as users usually wear a belt anyway, there is no need to wear anything extra, since the actuators can be attached to the user's own belt; and (3) there are several applications the device can support. The applications Tsukada and Yasumura have developed are discussed below.

Figure 9 shows the prototype of the device. It has four components: the hardware (version 1) (1), a GPS (2), a direction sensor (3) and a microcomputer (4). Tsukada and Yasumura also state that they have designed version 2 of the device so that the belt fits any user: the actuators can be moved, so the device can be worn by several users.


Figure 9. ActiveBelt: Prototype (version 1)

Tsukada and Yasumura have developed four applications for ActiveBelt: (1) FeelNavi for human navigation, (2) FeelSense for location-aware information services, (3) FeelSeek for finding lost property, and (4) FeelWave for entertainment. Here, I will introduce the first three. (See also Figure 10.)

Figure 10. The basic concepts of proposed applications: on left, FeelNavi; in the

middle, FeelSense; and on right, FeelSeek.

FeelNavi is an application for navigation, in which direction is illustrated by vibration. The prototype registers the destination as a latitude and longitude; the device combines this information with the user's current position and orientation, and activates the actuator that indicates the direction.
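Tsukada and Yasumura do not give their algorithm here, but the idea can be sketched: compute the bearing from the current position to the destination, subtract the user's orientation, and activate the nearest of the eight actuators. The code below is my own illustration of that principle, not their implementation.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing (degrees) from point 1 to point 2."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(y, x)) % 360

def active_tactor(heading_deg, target_bearing_deg, n_tactors=8):
    """Index of the belt actuator pointing toward the destination,
    relative to the direction the user is facing (tactor 0 = front)."""
    relative = (target_bearing_deg - heading_deg) % 360
    return round(relative / (360 / n_tactors)) % n_tactors
```

For example, a destination due east of a user facing north would activate tactor 2, on the user's right hip.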

FeelSense is a location-aware information system. The user can pre-register information, such as which shops they are interested in, and the application can then notify the user when something matching their interests is nearby.

FeelSeek is an application that reminds users of valuables left behind and can lead the user back to the spot where the item was left. It uses a combination of ActiveBelt, the FeelSeek application and RFID tags.

4.6. Technojewelry

In 2001, IDEO [IDEO] carried out a project whose purpose was to bring two new, yet at the same time very ordinary, technologies closer to everyday use. Penta Phone and Ring Phone are concepts for mobile phones, and GPS Toes is a navigation device (see Figure 11). The idea behind producing these concepts was to prove that new technology does not need to look unfamiliar or uncanny; it can, and should, be integrated into our world and into the user's personal style.

GPS Toes (see Figure 11, on the right) uses low-power, nano-derived technology. It communicates with a GPS receiver nearby, such as one in the user's purse. The device indicates direction by vibrating and lighting up to signal upcoming direction changes: a ring on the left toe signals a turn to the left, and one on the right toe a turn to the right. According to the designers, GPS Toes can be used when driving a car, walking the streets, or hiking in the countryside.
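The left-ring/right-ring idea is simple enough to sketch directly. IDEO describes only the concept, so the threshold below, used to suppress signals for small course corrections, is my own assumption.

```python
def ring_signal(turn_angle_deg, threshold_deg=10):
    """Decide which ring (if any) should vibrate and light up for a turn.

    Negative angles are left turns, positive are right turns; small
    course corrections below the threshold trigger nothing. The threshold
    is an illustrative assumption, not part of IDEO's concept.
    """
    if turn_angle_deg <= -threshold_deg:
        return "left"
    if turn_angle_deg >= threshold_deg:
        return "right"
    return None
```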

Unfortunately, these products were concepts only, and they cannot be bought.

Figure 11. Technojewelry. GPS Toes on the right.


5. Conclusion

What I most longed for during my survey was a product somewhat like IDEO's Technojewelry concept, because in my opinion GPS Toes is, of all the products I was able to find, the only one imperceptible, mobile, handy and ordinary enough to be genuinely useful when navigating, especially if it could use a promising interface such as Lead-Me. Most of the devices presented in this paper appear to limit the user's movement. Vest and belt devices, or at least many of them, are large and seemingly unwieldy. Hand-held devices of all kinds, such as mobile phones, require the user to keep them in hand while in use. Both MOMO and CabBoots may be too extraordinary for everyday use; moreover, MOMO has to be held with both hands, which greatly encumbers the user's activities. Hopefully, in the future there will be hand-held or wearable haptic navigation devices with an ordinary enough appearance. For this purpose, ActiveBelt could be a promising option, given the many interesting application ideas its developers have.

Haptics opens up the possibility of navigation that is novel and cognitively lighter. Haptic navigation has great potential, both in virtual environments and in the real world, and the benefits it offers, for users with special needs and for everyone else, are considerable. Yet more research and development is needed before haptic navigation can come into common use. Currently, haptics can be one element in navigation devices, used together with other modalities. Perhaps in the future, too, several modalities will be needed in communication between the user and the navigation device; after all, as users are multimodal, the device should be as well.


References

[Amemiya et al., 2008] Amemiya, T., Ando, H. and Maeda, T., 2008. Lead-me interface for a pulling sensation from hand-held devices. ACM Trans. Appl. Percept., 5(3), pp. 1-17.

[Amemiya and Sugiyama, 2008] Amemiya, T. and Sugiyama, H., 2008. Design of a haptic direction indicator for visually impaired people in emergency situations. Computers Helping People with Special Needs. Springer, pp. 1141-1144.

[Bosman et al., 2003] Bosman, S., Groenendaal, B., Findlater, J.W., Visser, T., Graaf, M. and Markopoulos, P., 2003. GentleGuide: an exploration of haptic output for indoors pedestrian guidance. Proceedings of Mobile HCI 2003, Springer, pp. 358-362.

[Bradley and Dunlop, 2005] Bradley, A. and Dunlop, D., 2005. An experimental investigation into wayfinding directions for visually impaired people. Personal Ubiquitous Comput., 9(6), pp. 395-403.

[Chapman et al., 2001] Chapman, C.D., Heath, M.D., Westwood, D.A. and Roy, E.A., 2001. Memory for kinesthetically defined target location: evidence for manual asymmetries. Brain and Cognition, 46(1-2), pp. 62-66.

[Erp, 2005] Erp, J.B.F. van, 2005. Presenting directions with a vibrotactile torso display. Ergonomics, 48(3), p. 302.

[Erp, 2001] Erp, J.B.F. van, 2001. Tactile navigation display. Proceedings of the First International Workshop on Haptic Human-Computer Interaction, Springer-Verlag, pp. 165-173.

[Erp et al., 2005] Erp, J.B.F. van, Veen, H.A.H.C. van, Jansen, C. and Dobbins, T., 2005. Waypoint navigation with a vibrotactile waist belt. ACM Trans. Appl. Percept., 2(2), pp. 106-117.

[Frey, 2007] Frey, M., 2007. CabBoots: shoes with integrated guidance system. Proceedings of the 1st International Conference on Tangible and Embedded Interaction, ACM, pp. 245-246.

[Frey] Frey, M. CabBoots - Shoes with Integrated Guidance System. Available: http://www.freymartin.de/en/projects/cabboots [9/22, 2008].

[Hamm] Hamm, D. Vibro-Vest: A Wireless Haptic Navigation Device. Available: http://www.damonhamm.com/oldsite/vibrovest.html [9/22, 2008].

[IDEO] Technojewelry - Case Studies - IDEO. Available: http://www.ideo.com/work/item/technojewelry/ [9/30, 2008].

[Jonsson, 2002] Jonsson, E., 2002. Inner Navigation: Why We Get Lost and How We Find Our Way. Scribner, New York, pp. 27-126.

[MOMO] MOMO: a haptic navigation device. 2008. Libelium.

[Nemec et al., 2004] Nemec, V., Sporka, A.J. and Slavik, P., 2004. Haptic and spatial audio based navigation of visually impaired users in virtual environment using low cost devices. User-Centered Interaction Paradigms for Universal Access in the Information Society, Springer, pp. 452-459.

[OED] Oxford English Dictionary, 2008. Oxford University Press. Available: http://dictionary.oed.com/ [9/25, 2008].

[Tsukada and Yasumura, 2004] Tsukada, K. and Yasumura, M., 2004. ActiveBelt: belt-type wearable tactile display for directional navigation. Proceedings of UbiComp 2004, Springer, pp. 384-399.

[Wall and Brewster, 2006] Wall, S. and Brewster, S., 2006. Feeling what you hear: tactile feedback for navigation of audio graphs. CHI '06: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, ACM, pp. 1123-1132.