Mental Rotation of Tactile Stimuli: Using Directional Haptic Cues in Mobile Devices


    Brian T. Gleeson and William R. Provancher, Member, IEEE

    Abstract—Haptic interfaces have the potential to enrich users' interactions with mobile devices and convey information without burdening the user's visual or auditory attention. Haptic stimuli with directional content, for example, navigational cues, may be difficult to use in handheld applications; the user's hand, where the cues are delivered, may not be aligned with the world, where the cues are to be interpreted. In such a case, the user would be required to mentally transform the stimuli between different reference frames. We examine the mental rotation of directional haptic stimuli in three experiments, investigating: 1) users' intuitive interpretation of rotated stimuli, 2) mental rotation of haptic stimuli about a single axis, and 3) rotation about multiple axes and the effects of specific hand poses and joint rotations. We conclude that directional haptic stimuli are suitable for use in mobile applications, although users do not naturally interpret rotated stimuli in any one universal way. We find evidence of cognitive processes involving the rotation of analog, spatial representations and discuss how our results fit into the larger body of mental rotation research. For small angles (e.g., less than 40 degrees), these mental rotations come at little cost, but rotations with larger misalignment angles impact user performance. When considering the design of a handheld haptic device, our results indicate that hand pose must be carefully considered, as certain poses increase the difficulty of stimulus interpretation. Generally, all tested joint rotations impact task difficulty, but finger flexion and wrist rotation interact to greatly increase the cost of stimulus interpretation; such hand poses should be avoided when designing a haptic interface.

    Index Terms—Human information processing, haptic I/O, mental rotation, tactile direction cues, haptics, skin stretch


    1 Introduction

    HANDHELD devices capable of providing navigational information are now common, with graphical displays or spoken text to guide a user to their goal. In situations where the user must actively monitor their surroundings, however, cues from a navigational aid add to the demands on a user's visual and audio attention, potentially impacting the user's ability to maintain situation awareness. For users such as soldiers, drivers, and emergency workers, the burden of additional audio or visual information is, at best, an inconvenience, and at worst, a safety hazard.

    A promising alternative is haptic direction cues. Haptic communication is known to have certain cognitive advantages in situations with high visual or auditory workload [1]. Additionally, for common applications on devices such as mobile phones, directional haptic cues could provide a less obtrusive channel for human-computer interaction, benefiting general users and providing an important accessibility tool for the blind.

    In this paper, we address an important problem inherent to many types of haptic direction communication: mental rotation of haptically perceived stimuli. When haptic stimuli are perceived in one reference frame, but

    interpreted in another, the user must mentally transform the stimuli from the perceptual frame to the task frame. As an example of this problem, consider navigational information delivered to the fingertip by a portable device. The device delivers a direction to the fingertip via directional skin stretch (the haptically perceived frame) and a user would have to interpret that information as it relates to his or her surroundings (the task space). If the user's finger was not aligned with the task space, a potentially difficult mental transformation would be required (an example of such misalignment is portrayed in Fig. 1). Previous work conducted by Parsons on mental rotation of visual objects has shown that these rotations incur a cognitive cost and can burden spatial working memory [2], potentially interfering with other spatial tasks. To characterize this potential difficulty with handheld haptic devices, this paper presents three experiments exploring three aspects of mental rotation of haptic stimuli: 1) how haptic direction cues are instinctively interpreted; 2) how a user mentally rotates the cues into alignment with the environment; and 3) how this rotation task depends on finger orientation. The key contributions of this research are as follows; this paper:

    . presents evidence that directional haptic stimuli can be used in applications requiring mental rotation, for example, in a handheld device,

    . provides an improved understanding of the mental processes and constraints involved in these rotations, and proposes a model of cognitive cost,

    . gives indications that users must be carefully instructed or trained in correct stimulus interpretation,

    . presents design recommendations for a range of acceptable rotation angles,


    . B.T. Gleeson is with the Department of Computer Science, University of British Columbia, 308-2181 W. 10th Ave., Vancouver, BC V6T 1Z4, Canada. E-mail:

    . W.R. Provancher is with the Department of Mechanical Engineering, University of Utah, 50 S. Central Campus Dr., RM 2110, Merrill Engineering Bldg., Salt Lake City, UT 84112-9208. E-mail:

    Manuscript received 13 Sept. 2012; revised 21 Dec. 2012; accepted 29 Jan. 2012; published online 15 Feb. 2013. Recommended for acceptance by W. Bergmann Tiest. For information on obtaining reprints of this article, please send e-mail and reference IEEECS Log Number TH-2012-09-0075. Digital Object Identifier no. 10.1109/TOH.2013.5.

    1939-1412/13/$31.00 © 2013 IEEE. Published by the IEEE CS, RAS, & CES.

    . presents a description of finger poses that either complicate or facilitate mental rotation of haptic stimuli, useful when designing how a haptic device will be grasped by the user.

    1.1 Prior Work with Mental Rotation and Haptic Direction Perception

    Mental rotation of visual stimuli is well studied. The key characteristic of mental rotation of visual stimuli is the linear relationship between the time required to perform a rotation task and the required angle of rotation, as shown first by Shepard and Metzler [3]. Their interpretation of this result is as follows: when users must mentally transform a visually perceived object from one orientation to another, they mentally rotate a spatial representation of the object, at a constant rate, until it reaches the desired orientation. How one would complete a similar task using haptically perceived stimuli is less clear.
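    Shepard and Metzler's constant-rate account implies a simple linear response-time model; the following is a sketch of that implication (the symbols $t_0$ and $\omega$ are our labels, not notation from the original study):

    $$RT(\theta) = t_0 + \frac{\theta}{\omega},$$

    where $\theta$ is the angular misalignment between the perceived and target orientations, $\omega$ is the (constant) mental rotation rate, and $t_0$ collects all angle-independent components such as perception and response execution. A linear fit of response time against $\theta$ then estimates $\omega$ from the slope.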

    Of greater relevance to haptic researchers are observations that the mental processes and neural structures involved in rotation tasks are linked to perceptual systems. Several behavioral studies reveal that mental rotation of body-objects (images of hands, feet, bodies, etc.) involves embodiment effects, where participants projected their own reference frame onto the object and mentally simulated the motor actions that would achieve the desired rotation (e.g., [4], [5]). Neural imaging experiments confirmed these results, showing that the somatosensory system engages in motor simulation of body-relevant rotations (e.g., [6], [7], [8]). In all of the above embodiment studies, the relationship between rotation angle and response time was fundamentally different for body-relevant objects than for neutral objects, generally showing a nonlinear relationship between rotation angle and task completion time. In our experiments, which involve physical orientations of the hand, we expect to see similar embodiment effects.

    The few studies addressing mental rotation of tactile stimuli have generally produced results similar to those obtained in visual studies. Most of these experiments have involved an embossed shape pressed against a stationary fingertip, such as alphanumeric characters by Carpenter and Eisenberg [9], Rosler et al. [10], and Hunt et al. [11], or other abstract shapes by Marmor and Zaback [12] and Prather et al. [13]. While the above studies reported evidence of mental rotation following trends similar to those observed in visual studies, a study by Shimizu and Frost [14] using

    vibrating pins failed to produce evidence of mental rotation, and experiments by Dellantonio and Spagnolo [15] with Braille-like dots only showed mental rotation behavior under certain experimental conditions but not others. These differing results show that the stimulus type and the nature of the experimental task can determine how a participant processes spatial stimuli, but that in general, mental rotation processes are often used by participants when transforming passively perceived static stimuli.

    In a study by Kitada et al., participants felt and actively explored models of human hands, attempting to identify them as left or right hands. In this study, response times increased nonlinearly with the angle between the model hand and a canonical hand orientation [16], showing evidence of both mental rotation and embodiment. Because our experiment includes both changing hand orientation and tactile stimuli at the fingertip, we expect to see similar results, i.e., both embodiment effects and evidence of spatial rotation.

    Another potential challenge inherent to directional haptic stimuli delivered by a handheld device is the problem of skewed perceptual reference frames. When the orientation of the hand changes, the perceived orientation of stimuli changes. In studies of orientation perception, Volcic and Kappers [17] found systematic deviations from veridicality based on hand orientation. They concluded that we perceive stimuli in a reference frame that is a weighted average between the veridical allocentric (world-centered) reference frame and a second reference frame centered on the hand. This perceptual effect has been reproduced in subsequent studies, including [9] and [18]. The effects of hand orientation and changing perceptual reference frames extend to spatial processing and mental rotation [19]. The effects of hand orientation and changing haptic reference frame are also demonstrated in a study by Prather and Sathian [20] of haptic perception and comparison and a study of haptic control of virtual reality environments by Ware and Arsenault [21]. These results warn that hand orientation could potentially interfere with the accurate perception of haptic direction cues.

    1.2 Prior Work with Tactile Direction Cues

    Devices capable of rendering directional information are widespread in haptics research. Tan and Pentland [22], among others, have successfully communicated direction cues using arrays of vibrating motors sewn into vests. Van Erp [23] has achieved similar results with vibrating motors built into a belt. Additionally, Tan et al. [24] communicated direction using vibrating arrays in a chair,


    Fig. 1. A user of a handheld haptic interface must mentally rotate directional haptic stimuli from the finger-centered frame, where the stimuli are perceived, to a world-centered frame, where the information is to be applied.

    Fig. 2. Motion profile of the tactile stimulus. The contact element displaced the skin of the fingerpad 1 mm at ~10.5 mm/s, paused for 0.45 s, and returned to the starting position.

    while Kern et al. [25] produced directional cues with a vibrating steering wheel. Handheld or finger-mounted devices capable of delivering direction cues are also common. Winfree et al. [26] used inertial forces to communicate direction, while many other researchers used slip or skin stretch at the fingertip, for example, Tsagarakis et al. [27], Wang and Hayward [28], and Webster et al. [29].

    In our previous work, we evaluated the use of directional tangential skin stretch at the fingertip. We found this method of tactile communication to be highly effective; when the skin of the fingerpad was displaced 1 mm at 2 mm/s or faster, in one of four cardinal directions, participants were able to identify the direction of the stimulus with better than 99 percent accuracy [30].

    Our present study is unique in its exploration of mental rotation of active directional tactile stimuli. While previous studies of directional stimuli, both our studies and those of others, placed the stimuli and the responses in the same reference frame, our present work requires participants to perform a transformation between two rotated reference frames.


    Test participants completed three experiments on mental rotation of tactile stimuli. Participants passively perceived a tactile stimulus on the index finger of their right hand, which was often rotated with respect to the allocentric (world-centered) reference frame. They were then asked to respond to this stimulus with a world-aligned joystick in their left hand. The three experiments are summarized as follows: 1) intuitive mapping of stimuli, which sought to characterize the natural perceptual reference frame of tactile stimuli; 2) simple rotation, which investigated the relationship between response time and rotation angle around a single axis of rotation in the transverse plane; and 3) complex rotation, which examined the effects of rotations involving multiple joints and required participants to interpret tactile cues in the local fingertip reference frame.
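    The transformation participants must perform, rotating a cue direction from the finger-centered frame into the world frame, can be sketched in a few lines of Python. This is an illustration under our own naming, not code from the study; the cue vectors and function name are our assumptions.

    ```python
    import math

    # Cardinal cue directions as unit vectors in the finger-centered frame:
    # distal = away from the knuckle, proximal = toward it,
    # radial = toward the thumb side, ulnar = away from the thumb.
    FINGER_FRAME_CUES = {
        "distal":   (1.0, 0.0),
        "proximal": (-1.0, 0.0),
        "radial":   (0.0, 1.0),
        "ulnar":    (0.0, -1.0),
    }

    def to_world_frame(cue, finger_angle_deg):
        """Rotate a finger-frame cue direction into the world frame.

        finger_angle_deg is the finger's rotation in the transverse plane
        relative to the world frame; the cue inherits the same rotation.
        """
        x, y = FINGER_FRAME_CUES[cue]
        a = math.radians(finger_angle_deg)
        return (x * math.cos(a) - y * math.sin(a),
                x * math.sin(a) + y * math.cos(a))

    # With the finger rotated 90 degrees, a "distal" cue points along the
    # world direction that an unrotated "radial" cue would indicate.
    wx, wy = to_world_frame("distal", 90.0)
    ```

    The experiments probe how costly it is for a participant to perform this rotation mentally, as a function of the misalignment angle and the joints involved.
    
    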

    2.1 Participants

    Fifteen volunteers participated in all three experiments (10 male; 19-33 years old, mean age 25.9). Two participants were left-hand-dominant and 13 were right-hand-dominant, by self-report. All participants reported normal tactile sensitivity.

    2.2 Tactile Stimulus

    All experiments utilized directional skin stretch as a tactile stimulus. In all previous studies of mental rotation, participants perceived static shapes, while in the present

    study the fingerpad is stimulated with a moving device. With the finger held stationary, a 7-mm diameter, hemispherical, rubber contact element pressed against the fingerpad of the right index finger moved tangentially on the skin, displacing and stretching the skin of the fingerpad in a given direction. For each stimulus, the contact element completed a 1.0 ± 0.1 mm out-and-back motion, as shown in Fig. 2, in one of four cardinal directions: proximal (toward the knuckle), distal (away from the knuckle), radial (toward the thumb side of the hand), or ulnar (away from the thumb), as shown schematically in Fig. 3. In earlier perceptual studies, subjects were able to correctly identify the direction of these stimuli with 99% accuracy. For detailed information on stimulus rendering and perception, see our earlier work [30], [31].
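    As a rough check on the stimulus timing described above, the out-and-back motion profile (Fig. 2) can be sketched assuming constant-velocity strokes. The displacement, speed, and pause come from the text; the function name and the constant-velocity assumption are ours.

    ```python
    def stimulus_timeline(displacement_mm=1.0, speed_mm_s=10.5, pause_s=0.45):
        """Return (outbound, pause, return, total) durations in seconds
        for the out-and-back skin-stretch stimulus, assuming each stroke
        moves at constant velocity."""
        stroke_s = displacement_mm / speed_mm_s  # ~0.095 s per 1-mm stroke
        total_s = 2 * stroke_s + pause_s
        return stroke_s, pause_s, stroke_s, total_s

    out_s, pause_s, back_s, total_s = stimulus_timeline()
    # Under these assumptions, the whole stimulus lasts roughly 0.64 s.
    ```

    This brevity matters for the experiments: response-time differences across rotation angles can then be attributed to mental processing rather than to variation in stimulus duration.
    
    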

    2.3 Apparatus

    Test participants sat as shown in Fig. 4, with the tactile display device worn on the right index finger. All experiments utilized a custom-built tactile device capable of rendering directional skin stretch stimuli. This device is described in depth in our earlier work [...

