Mental Rotation of Tactile Stimuli: Using Directional Haptic Cues in Mobile Devices

Brian T. Gleeson and William R. Provancher, Member, IEEE

Abstract—Haptic interfaces have the potential to enrich users' interactions with mobile devices and convey information without burdening the user's visual or auditory attention. Haptic stimuli with directional content, for example, navigational cues, may be difficult to use in handheld applications; the user's hand, where the cues are delivered, may not be aligned with the world, where the cues are to be interpreted. In such a case, the user would be required to mentally transform the stimuli between different reference frames. We examine the mental rotation of directional haptic stimuli in three experiments, investigating: 1) users' intuitive interpretation of rotated stimuli, 2) mental rotation of haptic stimuli about a single axis, and 3) rotation about multiple axes and the effects of specific hand poses and joint rotations. We conclude that directional haptic stimuli are suitable for use in mobile applications, although users do not naturally interpret rotated stimuli in any one universal way. We find evidence of cognitive processes involving the rotation of analog, spatial representations and discuss how our results fit into the larger body of mental rotation research. For small angles (e.g., less than 40 degrees), these mental rotations come at little cost, but rotations with larger misalignment angles impact user performance. When considering the design of a handheld haptic device, our results indicate that hand pose must be carefully considered, as certain poses increase the difficulty of stimulus interpretation. Generally, all tested joint rotations impact task difficulty, but finger flexion and wrist rotation interact to greatly increase the cost of stimulus interpretation; such hand poses should be avoided when designing a haptic interface.

Index Terms—Human information processing, haptic I/O, mental rotation, tactile direction cues, haptics, skin stretch


1 INTRODUCTION AND BACKGROUND

Handheld devices capable of providing navigational information are now common, with graphical displays or spoken text to guide a user to their goal. In situations where the user must actively monitor their surroundings, however, cues from a navigational aid add to the demands on a user's visual and audio attention, potentially impacting the user's ability to maintain situation awareness. For users such as soldiers, drivers, and emergency workers, the burden of additional audio or visual information is, at best, an inconvenience, and at worst, a safety hazard.

A promising alternative is haptic direction cues. Haptic communication is known to have certain cognitive advantages in situations with high visual or auditory workload [1]. Additionally, for common applications on devices such as mobile phones, directional haptic cues could provide a less obtrusive channel for human-computer interaction, benefiting general users and providing an important accessibility tool for the blind.

In this paper, we address an important problem inherent to many types of haptic direction communication: mental rotation of haptically perceived stimuli. When haptic stimuli are perceived in one reference frame, but interpreted in another, the user must mentally transform the stimuli from the perceptual frame to the task frame. As an example of this problem, consider navigational information delivered to the fingertip by a portable device. The device delivers a direction to the fingertip via directional skin stretch (the haptically perceived frame) and a user would have to interpret that information as it relates to his or her surroundings (the task space). If the user's finger was not aligned with the task space, a potentially difficult mental transformation would be required (an example of such misalignment is portrayed in Fig. 1). Previous work conducted by Parsons on mental rotation of visual objects has shown that these rotations incur a cognitive cost and can burden spatial working memory [2], potentially interfering with other spatial tasks. To characterize this potential difficulty with handheld haptic devices, this paper presents three experiments exploring three aspects of mental rotation of haptic stimuli: 1) how haptic direction cues are instinctively interpreted; 2) how a user mentally rotates the cues into alignment with the environment; and 3) how this rotation task depends on finger orientation. The key contributions of this research are as follows, where this paper:

- presents evidence that directional haptic stimuli can be used in applications requiring mental rotation, for example, in a handheld device,

- provides an improved understanding of the mental processes and constraints involved in these rotations, and proposes a model of cognitive cost,

- gives indications that users must be carefully instructed or trained in correct stimulus interpretation,

- presents design recommendations for a range of acceptable rotation angles,

330 IEEE TRANSACTIONS ON HAPTICS, VOL. 6, NO. 3, JULY-SEPTEMBER 2013

- B.T. Gleeson is with the Department of Computer Science, University of British Columbia, 308-2181 W. 10th Ave., Vancouver, BC V6T 1Z4, Canada. E-mail: [email protected].

- W.R. Provancher is with the Department of Mechanical Engineering, University of Utah, 50 S. Central Campus Dr., RM 2110, Merrill Engineering Bldg., Salt Lake City, UT 84112-9208. E-mail: [email protected].

Manuscript received 13 Sept. 2012; revised 21 Dec. 2012; accepted 29 Jan. 2013; published online 15 Feb. 2013. Recommended for acceptance by W. Bergmann Tiest. For information on obtaining reprints of this article, please send e-mail to: [email protected], and reference IEEECS Log Number TH-2012-09-0075. Digital Object Identifier no. 10.1109/TOH.2013.5.

1939-1412/13/$31.00 © 2013 IEEE Published by the IEEE CS, RAS, & CES


- presents a description of finger poses that either complicate or facilitate mental rotation of haptic stimuli, useful when designing how a haptic device will be grasped by the user.

1.1 Prior Work with Mental Rotation and Haptic Direction Perception

Mental rotation of visual stimuli is well studied. The key characteristic of mental rotation of visual stimuli is the linear relationship between the time required to perform a rotation task and the required angle of rotation, as shown first by Shepard and Metzler [3]. Their interpretation of this result is as follows: when users must mentally transform a visually perceived object from one orientation to another, they mentally rotate a spatial representation of the object, at a constant rate, until it reaches the desired orientation. How one would complete a similar task using haptically perceived stimuli is less clear.
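The linear model described above can be sketched as a least-squares line fit of response time against rotation angle; the response-time values below are illustrative only, not data from this paper or from Shepard and Metzler.

```python
import numpy as np

# Hypothetical response-time data (seconds) for rotation angles (degrees),
# following the linear mental-rotation model: RT = intercept + slope * angle.
angles = np.array([0, 30, 60, 90, 120, 150, 180], dtype=float)
rts = 0.9 + 0.018 * angles  # illustrative values, not measured data

# Least-squares fit; the slope estimates the inverse of the rotation rate.
slope, intercept = np.polyfit(angles, rts, 1)

print(f"rotation rate ~ {1.0 / slope:.0f} deg/s, intercept {intercept:.2f} s")
```

Under this model, the reciprocal of the fitted slope gives the constant mental rotation rate.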

Of greater relevance to haptic researchers are observations that the mental processes and neural structures involved in rotation tasks are linked to perceptual systems. Several behavioral studies reveal that mental rotation of body-objects (images of hands, feet, bodies, etc.) involves embodiment effects, where participants projected their own reference frame onto the object and mentally simulated the motor actions that would achieve the desired rotation (e.g., [4], [5]). Neural imaging experiments confirmed these results, showing that the somatosensory system engages in motor simulation of body-relevant rotations (e.g., [6], [7], [8]). In all of the above embodiment studies, the relationship between rotation angle and response time was fundamentally different for body-relevant objects than for neutral objects, generally showing a nonlinear relationship between rotation angle and task completion time. In our experiments, which involve physical orientations of the hand, we expect to see similar embodiment effects.

The few studies addressing mental rotation of tactile stimuli have generally produced results similar to those obtained in visual studies. Most of these experiments have involved an embossed shape pressed against a stationary fingertip, such as alphanumeric characters by Carpenter and Eisenberg [9], Rosler et al. [10], and Hunt et al. [11], or other abstract shapes by Marmor and Zaback [12] and Prather et al. [13]. While the above studies reported evidence of mental rotation following trends similar to those observed in visual studies, a study by Shimizu and Frost [14] using vibrating pins failed to produce evidence of mental rotation, and experiments by Dellantonio and Spagnolo [15] with Braille-like dots only showed mental rotation behavior under certain experimental conditions but not others. These differing results show that the stimulus type and the nature of the experimental task can determine how a participant processes spatial stimuli, but that in general, mental rotation processes are often used by participants when transforming passively perceived static stimuli.

In a study by Kitada et al., participants felt and actively explored models of human hands, attempting to identify them as left or right hands. In this study, response times increased nonlinearly with the angle between the model hand and a canonical hand orientation [16], showing evidence of both mental rotation and embodiment. Because our experiment includes both changing hand orientation and tactile stimuli at the fingertip, we expect to see similar results, i.e., both embodiment effects and evidence of spatial rotation.

Another potential challenge inherent to directional haptic stimuli delivered by a handheld device is the problem of skewed perceptual reference frames. When the orientation of the hand changes, the perceived orientation of stimuli changes. In studies of orientation perception, Volcic and Kappers [17] found systematic deviations from veridicality based on hand orientation. They concluded that we perceive stimuli in a reference frame that is a weighted average between the veridical allocentric (world-centered) reference frame and a second reference frame centered on the hand. This perceptual effect has been reproduced in subsequent studies, including [9] and [18]. The effects of hand orientation and changing perceptual reference frames extend to spatial processing and mental rotation [19]. The effects of hand orientation and changing haptic reference frame are also demonstrated in a study by Prather and Sathian [20] of haptic perception and comparison and a study of haptic control of virtual reality environments by Ware and Arsenault [21]. These results warn that hand orientation could potentially interfere with the accurate perception of haptic direction cues.

1.2 Prior Work with Tactile Direction Cues

Devices capable of rendering directional information are widespread in haptics research. Tan and Pentland [22], among others, have successfully communicated direction cues using arrays of vibrating motors sewn into vests. Van Erp [23] has achieved similar results with vibrating motors built into a belt. Additionally, Tan et al. [24] communicated direction using vibrating arrays in a chair, while Kern et al. [25] produced directional cues with a vibrating steering wheel. Handheld or finger-mounted devices capable of delivering direction cues are also common. Winfree et al. [26] used inertial forces to communicate direction, while many other researchers used slip or skin stretch at the fingertip, for example, Tsagarakis et al. [27], Wang and Hayward [28], and Webster et al. [29].

Fig. 1. A user of a handheld haptic interface must mentally rotate directional haptic stimuli from the finger-centered frame, where the stimuli are perceived, to a world-centered frame, where the information is to be applied.

Fig. 2. Motion profile of the tactile stimulus. The contact element displaced the skin of the fingerpad 1 mm at ~10.5 mm/s, paused for 0.45 s, and returned to the starting position.

In our previous work, we evaluated the use of directional tangential skin stretch at the fingertip. We found this method of tactile communication to be highly effective; when the skin of the fingerpad was displaced 1 mm at 2 mm/s or faster, in one of four cardinal directions, participants were able to identify the direction of the stimulus with better than 99 percent accuracy [30].

Our present study is unique in its exploration of mental rotation of active directional tactile stimuli. While previous studies of directional stimuli, both our studies and those of others, placed the stimuli and the responses in the same reference frame, our present work requires participants to perform a transformation between two rotated reference frames.

2 GENERAL METHODS

Test participants completed three experiments on mental rotation of tactile stimuli. Participants passively perceived a tactile stimulus on the index finger of their right hand, which was often rotated with respect to the allocentric (world-centered) reference frame. They were then asked to respond to this stimulus with a world-aligned joystick in their left hand. The three experiments are summarized as follows: 1) intuitive mapping of stimuli, which sought to characterize the natural perceptual reference frame of tactile stimuli; 2) simple rotation, which investigated the relationship between response time and rotation angle around a single axis of rotation in the transverse plane; and 3) complex rotation, which examined the effects of rotations involving multiple joints and required participants to interpret tactile cues in the local fingertip reference frame.

2.1 Participants

Fifteen volunteers participated in all three experiments (10 male, 19-33 years old, mean age 25.9). Two participants were left-hand-dominant and 13 were right-hand-dominant, by self-report. All participants reported normal tactile sensitivity.

2.2 Tactile Stimulus

All experiments utilized directional skin stretch as a tactile stimulus. In all previous studies of mental rotation, participants perceived static shapes, while in the present study the fingerpad is stimulated with a moving device. With the finger held stationary, a 7-mm diameter, hemispherical, rubber contact element pressed against the fingerpad of the right index finger moved tangentially on the skin, displacing and stretching the skin of the fingerpad in a given direction. For each stimulus, the contact element completed a 1.0 ± 0.1 mm out-and-back motion, as shown in Fig. 2, in one of four cardinal directions: proximal (toward the knuckle), distal (away from the knuckle), radial (toward the thumb side of the hand), or ulnar (away from the thumb), as shown schematically in Fig. 3. In earlier perceptual studies, subjects were able to correctly identify the direction of these stimuli with greater than 99 percent accuracy. For detailed information on stimulus rendering and perception, see our earlier work [30], [31].
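As a rough illustration, the out-and-back motion described above (1 mm displacement at the ~10.5 mm/s speed given in Fig. 2, with a 0.45 s pause) can be sketched as a position-versus-time profile; the 1 kHz sampling rate is an assumption for illustration, not a parameter from the paper.

```python
# Sketch of the out-and-back skin-stretch motion profile.
# Parameters from the paper: 1 mm displacement at ~10.5 mm/s, 0.45 s pause.
DISPLACEMENT_MM = 1.0
SPEED_MM_S = 10.5
PAUSE_S = 0.45
DT = 0.001  # assumed sampling period (1 kHz), for illustration only

def motion_profile():
    """Return (time_s, position_mm) samples for one stimulus."""
    move_t = DISPLACEMENT_MM / SPEED_MM_S  # ~0.095 s each way
    times, positions = [], []
    t = 0.0
    # Outward stroke, pause at full displacement, return stroke.
    for phase_duration, start, end in [
        (move_t, 0.0, DISPLACEMENT_MM),
        (PAUSE_S, DISPLACEMENT_MM, DISPLACEMENT_MM),
        (move_t, DISPLACEMENT_MM, 0.0),
    ]:
        steps = int(round(phase_duration / DT))
        for i in range(steps):
            times.append(t)
            positions.append(start + (end - start) * i / steps)
            t += DT
    return times, positions

t, x = motion_profile()
print(f"duration {t[-1]:.3f} s, peak {max(x):.2f} mm")
```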

2.3 Apparatus

Test participants sat as shown in Fig. 4, with the tactile display device worn on the right index finger. All experiments utilized a custom-built tactile device capable of rendering directional skin stretch stimuli. This device is described in depth in our earlier work [31]. After receiving a tactile stimulus, participants would indicate their response using a four-direction joystick operated with their left hand. A monitor in front of the experimental setup delivered instructions to participants. Matlab, running the Psychophysics Toolbox [32], drove the tactile device and monitored joystick input with ±1.5 ms accuracy. Wooden fixtures determined the location and orientation of the tactile device and the participant's hand; see Fig. 4 (inset). A cover placed over the test environment blocked out visual stimuli while headphones, playing white noise, masked any audio stimuli.

All three experiments utilized the same apparatus but differed in the finger orientations tested and the instructions given to the participant.

2.4 Experimental Design

All participants completed the three experiments in numerical order, with Experiments 1 and 2 completed in the same session and Experiment 3 completed approximately one week later. Within each experiment, the presentation order of finger orientations was balanced between participants. To minimize the effects of learning or fatigue, each experiment tested each orientation twice, with the stimulus repetitions evenly divided between the two presentations. For example, in Experiment 3, a participant would respond to 12 repetitions of each stimulus in each orientation, and then repeat the same order of finger orientations, responding to another 12 repetitions in each orientation. The presentation order of the stimuli (i.e., the selection of stimulus direction) was pseudorandom, with an equal number of stimuli presented in each direction.

Fig. 3. Possible response directions on the joystick (left) and stimulus directions on the fingerpad (right).

Fig. 4. The experimental setup, as used in Experiment 2. Experiments 1 and 3 used a separate fixture to fix the orientation and position of participants' hands, shown inset.
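The balanced pseudorandom stimulus ordering described above might be constructed as in the following sketch; the function name and seed handling are illustrative, not the authors' actual implementation (which ran in Matlab).

```python
import random

# Sketch of balanced pseudorandom stimulus ordering: equal repetitions
# of each stimulus direction, shuffled within a block.
DIRECTIONS = ["proximal", "distal", "radial", "ulnar"]

def stimulus_sequence(reps_per_direction, seed=None):
    """Return a shuffled list containing `reps_per_direction` of each direction."""
    rng = random.Random(seed)
    seq = DIRECTIONS * reps_per_direction
    rng.shuffle(seq)
    return seq

# Example: 12 repetitions per direction, as in one Experiment 3 block.
block = stimulus_sequence(12, seed=0)
print(len(block), block.count("distal"))  # → 48 12
```

Shuffling a list that already contains the exact repetition counts guarantees balance regardless of the random order.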

3 EXPERIMENT 1: INTUITIVE MAPPING

3.1 Methods

Experiment 1 tested eight finger orientations in the horizontal plane and one out-of-plane orientation. The horizontal plane positions included all combinations of three joint rotations: finger flexion, wrist rotation, and rotation of the forearm about the elbow. This experiment evaluated rotation angles of 0 and 90 degrees, resulting in orientations A-H shown in Fig. 5. These eight orientations comprised a representative sample of the rotations and orientations possible in the horizontal plane. A ninth orientation featured the hand down at the participant's side with the finger pointing down, Fig. 5I. This orientation mimics a common posture for holding a portable device (e.g., mobile phone) and is of interest to tactile interface designers.

In Experiment 1, participants were not instructed how to map between tactile stimuli (on the right index finger) and the joystick (aligned with the allocentric reference frame, held with the left hand). They were told to respond "...in the direction that you feel best corresponds with the tactile stimulus."

Participants responded to eight repetitions of each of the four stimulus directions in each of the nine hand orientations, for a total of 288 stimuli. Participants required an average of 24 minutes to complete the experiment.

3.2 Results and Discussion

3.2.1 Consistent Response Patterns

Participants responded consistently to each finger orientation. That is, in a given orientation, participants generally gave the same response for any given stimulus direction. The median consistency, measured as the number of consistent responses divided by the total number of stimuli, was 93 percent.

3.2.2 No Single Dominant Response Pattern

We used a least-squares analysis to compare participant response patterns to three predicted patterns: the finger-aligned (egocentric) frame and two different world-aligned (allocentric) frames. In the finger-aligned frame, cues are interpreted with distal stimuli always corresponding to forward on the joystick. In the world frame, cues are mapped directly onto the joystick, with the participant always moving the joystick in the same absolute direction as the stimulus, regardless of finger orientation. In our experiment, however, the interpretation of the world frame is complicated when the plane of the fingerpad is not parallel to the plane of the joystick base (i.e., orientations B, C, D, F, G, H, and I). In these orientations, stimuli which move out-of-plane (down, toward the ground, or up, away from the ground) have no single unambiguous mapping to the joystick. In these cases, participants tended to choose one of two response patterns which differed only in their treatment of out-of-plane stimuli: 1) world frame 1, with downward stimuli mapped to the joystick direction coincident with the outward normal vector of the fingernail (in the participant's current finger orientation), and upward stimuli mapped to the joystick direction coincident with the outward normal vector of the fingerpad; and 2) world frame 2, with the opposite mapping of upward and downward stimuli. Note that some response patterns are equivalent for some conditions. For each participant and each condition, response patterns were classified according to the minimum least-squares difference between the data and the predicted patterns. Fig. 6 shows confusion matrices associated with each hand orientation, Fig. 7 illustrates examples of the three different reference frames, and Table 1 gives the number of participants classified into each perceptual pattern.
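The classification idea above, choosing the predicted pattern with the minimum least-squares difference from a participant's responses, can be sketched as follows; the particular stimulus-to-joystick mappings and the participant data shown are hypothetical, for one imagined finger orientation.

```python
# Sketch: classify a participant's response proportions by minimum
# sum-of-squared-differences against predicted one-hot response patterns.
STIMULI = ["distal", "proximal", "radial", "ulnar"]
JOYSTICK = ["N", "S", "E", "W"]

# Predicted patterns (stimulus direction -> joystick response); these
# example mappings are hypothetical, chosen for illustration.
PATTERNS = {
    "finger-aligned": {"distal": "N", "proximal": "S", "radial": "W", "ulnar": "E"},
    "world-aligned":  {"distal": "E", "proximal": "W", "radial": "N", "ulnar": "S"},
}

def classify(proportions):
    """proportions: {stimulus: {joystick_dir: fraction of responses}}."""
    best, best_err = None, float("inf")
    for name, mapping in PATTERNS.items():
        # Squared error vs. the one-hot pattern this mapping predicts.
        err = sum(
            (proportions[s].get(j, 0.0) - (1.0 if mapping[s] == j else 0.0)) ** 2
            for s in STIMULI
            for j in JOYSTICK
        )
        if err < best_err:
            best, best_err = name, err
    return best

# A hypothetical participant who mostly responds in the finger-aligned frame:
data = {
    "distal": {"N": 0.9, "E": 0.1},
    "proximal": {"S": 1.0},
    "radial": {"W": 0.8, "N": 0.2},
    "ulnar": {"E": 1.0},
}
print(classify(data))  # → finger-aligned
```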

The majority of participants interpreted the stimuli in the finger-aligned (egocentric) reference frame. In the second half of the experiment, where participants experienced each finger orientation for a second time, more people interpreted direction cues in the finger-aligned frame than in the first half of the experiment. This shift suggests that users more readily interpret cues in the finger-aligned frame as they gain experience with the skin stretch stimuli. For those participants that chose an allocentric frame, the up/down stimuli (out of the horizontal plane) that had no clear allocentric mapping were generally interpreted according to world frame 1 (downward stimuli mapped to the joystick direction coincident with the outward normal of the fingernail; for example, they would respond in the forward direction on the joystick in response to a downward cue in orientation D).


Fig. 5. Finger orientations tested in Experiments 1 and 3. Orientations A-D were tested with the forearm extended straight forward as follows: A: no joint rotation, B: finger flexed 90 degrees, C: wrist supinated 90 degrees, D: finger flexed 90 degrees and wrist supinated 90 degrees. Orientations E-H utilized the same finger and wrist rotations as A-D, but added a 90-degree rotation of the forearm about the elbow. In orientation I, the participant's arm was straight down at his/her side, with the index finger pointing at the ground.


In previous studies of tactile perception in rotated reference frames, other researchers have emphasized the importance of the hand-centered frame (e.g., [17], [18], [33]). However, in our study, when the finger-aligned frame and the hand-aligned frame differed (i.e., when the finger was bent), participants never interpreted the stimuli in the hand-aligned frame, choosing instead the finger-aligned frame or a world-aligned frame. In the previous studies showing the importance of the hand-centered frame, participants perceived stimuli by grasping them in the hand. In our study, where stimulation was limited to the fingerpad, it is not surprising that participants chose the finger-aligned frame over the hand-aligned frame.

While most participants interpreted stimuli in the finger-aligned (egocentric) frame, there is no universal intuitive mapping behavior. For designers of tactile interfaces, these results imply that it is not feasible to design an interface that is intuitive for all users in all finger orientations. Designers could capitalize on the observed patterns, but in general it will be necessary to instruct a user how to interpret directional tactile stimuli and how they are to be mapped to the task frame. This is a disadvantage to using directional stimuli in a handheld device, but establishing such interpretation conventions for somewhat ambiguous information is common in interface design.


Fig. 7. Examples illustrating dominant intuitive response patterns for three finger orientations. The directions are color-coded; for example, the red arrows in the Response columns indicate response directions corresponding to the stimulus shown with a red arrow in the Stimulus column. The response arrows pointing to the top of the page correspond to a response in the forward direction on the joystick. In the egocentric frame, responses are interpreted in the local reference frame aligned with the fingerpad. In the allocentric frame, responses are interpreted in the world-aligned reference frame. When the fingerpad was rotated out of the horizontal plane (e.g., orientation D), participants showed two different strategies for mapping out-of-plane stimuli onto the joystick in the world frame, shown here as world-aligned frames 1 and 2.

TABLE 1
Intuitive Response Pattern Classification

Each cell indicates the number of participants whose data best fit the response pattern. Bold numbers give the classification of all experimental data. Numbers in parentheses give the classification of participants' responses during the first and second half of the experiment, respectively. Cells combined between multiple conditions indicate that response patterns were equivalent for the given conditions. *The responses of one subject were equally divided between two patterns.

Fig. 6. Confusion matrices showing pooled stimulus-response correspondences in Experiment 1. The matrices show the number of responses in each response direction (N, E, S, W) resulting from stimuli in each of the stimulus directions (D, R, U, P; see diagram in lower right). Dominant response patterns are highlighted as follows: red rectangles = finger-aligned frame; blue rounded rectangles = world-aligned frame 1; green ovals = world-aligned frame 2.


4 EXPERIMENT 2: SIMPLE ROTATION

4.1 Methods

Experiment 2 tested six orientations in the horizontal plane, produced by rotating the forearm about the elbow. Hand orientations were evenly spaced between 0 and 90 degrees (every 18 degrees).

In this experiment, participants were given specific instructions on stimulus interpretation. Participants were told to interpret stimuli in the fingertip-centered reference frame, regardless of their finger orientation, and to respond as quickly and accurately as possible using the joystick in the allocentric frame.

Participants responded to 24 repetitions of each of the four stimulus directions in each of the six orientations, for a total of 576 stimuli. Participants required an average of 30 minutes to complete this experiment.

4.2 Data Analysis

To better see the effects of hand orientation and to eliminate baseline performance differences between participants, we used relative response times (relative RT) in our analysis of data from Experiment 2 (and Experiment 3). The time between the onset of a stimulus and the response is the response time (RT). A participant's baseline response time is the participant's mean response time in the baseline orientation (finger and joystick aligned, orientation A in Fig. 5). Baseline RTs were computed for each participant. The relative RT, the metric used in our analysis, is RT minus baseline RT, that is, the change in response time from the participant's baseline performance.
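The relative RT computation described above can be sketched as follows. This is a minimal illustration; the variable names and the trial data are ours, not taken from the study:

```python
# Relative RT: each trial's RT minus the participant's mean RT in the
# baseline orientation (orientation A, finger and joystick aligned).
# trials: (participant, orientation, rt_seconds) tuples -- hypothetical data.
trials = [
    ("p1", "A", 0.52), ("p1", "A", 0.56), ("p1", "D", 0.71),
    ("p2", "A", 0.60), ("p2", "A", 0.64), ("p2", "D", 0.69),
]

# Per-participant baseline: mean RT in orientation A.
baselines = {}
for pid in {t[0] for t in trials}:
    a_rts = [rt for p, o, rt in trials if p == pid and o == "A"]
    baselines[pid] = sum(a_rts) / len(a_rts)

# Relative RT = RT - baseline RT.
relative = [(p, o, rt - baselines[p]) for p, o, rt in trials]
```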

Except where specified below, data were pooled between all trials and all participants. Curve fits were applied to subject means for each orientation using a least-squares optimization.

All incorrect responses were rejected from the data set (3.0 percent of data from Experiment 2), along with outlier RT values (>3 standard deviations from the subject's mean for a given orientation, 2.1 percent from Experiment 2). Additionally, all data from one participant were rejected from Experiment 2, due to the participant's unusually high error rate (>3 standard deviations above the group mean).
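The outlier criterion above (RTs more than 3 standard deviations from a subject's mean for a given orientation) might be implemented as below; the helper function and its details are illustrative assumptions, not code from the study:

```python
def reject_outliers(rts, k=3.0):
    """Drop values more than k standard deviations from the mean.

    rts: RTs for one subject in one orientation (illustrative helper).
    """
    n = len(rts)
    mean = sum(rts) / n
    # Population standard deviation; the paper does not specify the estimator.
    std = (sum((x - mean) ** 2 for x in rts) / n) ** 0.5
    if std == 0:
        return list(rts)
    return [x for x in rts if abs(x - mean) <= k * std]
```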

4.3 Results and Discussion

4.3.1 High Accuracy

Participants were able to perceive the tactile stimuli and perform the requested mental rotation with high accuracy (mean = 97 percent). In general, participants made more errors at larger arm rotation angles, but there were too few errors to perform any meaningful quantitative analysis of error patterns.

4.3.2 Checking Error Rate and Reaction Time

Experiments 2 and 3 use response time (RT) as a measure of task difficulty. We tested the correlation between RT and error rate to ensure that the interpretation of our data was not confounded by speed-accuracy tradeoff effects (see [16]). This analysis of correlation addresses the following question: do larger RTs accurately indicate greater task difficulty, or are participants merely choosing a different point on the speed-accuracy continuum for each finger orientation?

The data show a positive correlation between RT and error rate, the opposite of what one would expect in the case of a speed-accuracy tradeoff, implying that RTs can be used as a measure of task difficulty. The positive correlation, as measured by Pearson's r, is statistically significant for all three experiments, Experiment 1: (r = 0.81, p = 0.008), Experiment 2: (r = 0.94, p < 0.001), Experiment 3: (r = 0.96, p < 0.001).
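The check above uses the standard Pearson correlation coefficient. As a self-contained sketch (equivalent to the r value returned by scipy.stats.pearsonr, without the p-value):

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    # Covariance numerator and the two standard-deviation factors
    # (the 1/n terms cancel, so they are omitted).
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)
```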

4.3.3 Evidence of Mental Rotation and Embodiment

The results of Experiment 2 demonstrate a clear relationship between RT and rotation angle, as shown in Fig. 8. Relative RT clearly increases with arm angle (Pearson's r = 0.92, p < 0.01). From this result, we conclude that participants were performing a mental rotation, using an analog spatial representation to rotate the tactile stimuli from the fingertip frame to the allocentric frame of the response joystick (see [3]).

We should note, however, that there are other possible explanations for the observed increase in RT, aside from the effects of mental rotation first proposed by Shepard and Metzler. The most likely alternative interpretation is that some other factor caused the task difficulty to increase as a function of angle. However, because error rates remained very low at all angles in our experiment, while RT reliably increased, we conclude that mental rotation effects are the most likely explanation for our results.

Fig. 8. Pooled data from Experiment 2, with a sinusoidal fit. Error bars depict 95 percent confidence intervals. For reference, the mean baseline value (absolute RT at 0 degrees) was 0.546 s.

The experimental data show a sinusoidal relationship between rotation angle and RT (R² = 0.41). Note that curves were fit to subject means, while Fig. 8 shows pooled means only, for clarity. This sinusoidal trend also holds on a per-subject basis (fit to single-subject means, median R² = 0.84). Other nonlinear curve shapes, for example, a parabola, fit similarly well, but a sinusoid was chosen as the most physically relevant model, following the example of prior studies [16]. While we chose a sinusoidal fit, we do not consider this specific curve shape to be of great importance; what is important is that the data follow a nonlinear trend, as opposed to a linear relationship. The nonlinear shape of our data allows us to place our results in context with earlier studies. Linear trends are seen in studies of mental rotation of abstract shapes perceived visually (e.g., [3]) or tactilely with the fingertip (e.g., [12], [10], [13]). Nonlinear and sinusoidal trends, however, most often appear when participants exhibit embodiment effects during the mental rotation of visually or tactilely perceived human hands (e.g., [34], [35], [16], [5]) or while controlling an object through physical hand rotation [21]. The implication, therefore, is that participants in our study were mentally rotating their hand into the allocentric frame (0 degree position), and then interpreting the stimulus in that orientation.
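A sinusoidal fit of this kind can be done with ordinary linear least squares, because a model of the form relative RT = a·sin(θ) + b is linear in its coefficients. The paper does not state its exact functional form beyond "sinusoidal", so the model below is an assumption, and the data points are invented for illustration:

```python
import math

# Hypothetical subject-mean relative RTs (s) at the six tested arm angles.
angles_deg = [0, 18, 36, 54, 72, 90]
rel_rt = [0.000, 0.046, 0.088, 0.121, 0.143, 0.150]

# Model: rel_rt ~ a*sin(theta) + b, solved via the normal equations.
xs = [math.sin(math.radians(t)) for t in angles_deg]
n = len(xs)
sx, sy = sum(xs), sum(rel_rt)
sxx = sum(x * x for x in xs)
sxy = sum(x * y for x, y in zip(xs, rel_rt))
a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
b = (sy - a * sx) / n

# Coefficient of determination R^2 for the fit.
mean_y = sy / n
ss_res = sum((y - (a * x + b)) ** 2 for x, y in zip(xs, rel_rt))
ss_tot = sum((y - mean_y) ** 2 for y in rel_rt)
r_squared = 1.0 - ss_res / ss_tot
```

The same machinery, fit to subject means rather than pooled means, would yield per-subject R² values like those reported.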

5 EXPERIMENT 3: COMPOUND ROTATION

5.1 Methods

Experiment 3 tested the same orientations as Experiment 1 (see Fig. 5), but gave explicit instructions to interpret direction cues in the reference frame of the fingertip, as was done in Experiment 2. Again, we asked participants to respond as quickly and accurately as possible. Each participant successfully completed a training session before the main experiment.

Participants responded to 24 repetitions of each of four stimulus directions in each of the nine orientations, for a total of 864 stimuli. Participants required an average of 41 minutes to complete the experiment.

Data analysis methods for Experiment 3 were similar to those used in Experiment 2, again focusing on relative RT. A total of 4.3 percent of data was rejected due to incorrect responses and 2.0 percent was rejected as outliers.

5.2 Results and Discussion

5.2.1 General Trends

Results from Experiment 3 indicate that rotations of the arm, finger, and wrist all increase RT, as seen in Fig. 9. Qualitatively, the hand poses featuring a single joint rotation (B, C, and E) all performed similarly, with relatively little impact on response time. Poses F and G, with two joint rotations each, performed similarly and with increased impact on RT, but pose D, which also involved two joint rotations, had a larger impact. Pose H, with three joint rotations, resulted in the slowest responses. Pose I, the out-of-plane pose, had similar RT to the one-rotation poses.

The trend in error rate followed the same pattern as the trend in RT, but again there were too few errors to conduct a meaningful analysis of error rate. Participants responded correctly to 95.7 percent of all stimuli, with the highest mean accuracy for orientation A (97.9 percent) and minimum mean accuracy for orientation H (89.7 percent). The increased error rate at the more extreme hand orientations could be due to the difficulty of the mental rotation task, or to the skewed perceptual reference frame of such hand orientations (e.g., [33]), or to a combination of these factors.

5.2.2 No Relationship to Intuitive Response Patterns

We hypothesized that participants who intuitively interpreted stimuli in the fingertip frame would respond faster during mental rotation tasks, but this was not the case. In general, a participant's mean Experiment 3 relative RTs did not correlate with the participant's tendency to respond in the fingertip frame in Experiment 1. In orientations A-H, the correlation between Experiments 1 and 3 was not significant (in all cases, p > 0.32), but was significant for orientation I (r = 0.55, p = 0.034). This result implies that the participants' ability to perform mental rotations is independent of their intuitive mapping behavior.

5.2.3 Quantitative Analysis: Cognitive Cost Is a Function of Summed Joint Angles, with Interactions

To understand which aspects of hand pose affect performance, and for a more quantitative understanding of our data, we considered five models of mental rotation performance, each testing different potentially important factors. We conducted an initial analysis of variance to determine which factors to consider in our models. The analysis tested the factors Participant, Arm Angle, Wrist Angle, Finger Angle, and all interactions. All factors showed main effects on relative RT (in all cases, p < 0.001), as did all interactions (in all cases, p < 0.003), with the exception of Wrist Angle × Finger Angle (p = 0.068). From this we concluded that all joint angles and most interactions were valid for consideration in our models.

5.2.4 On the Omission of the Out-of-Plane Pose (I)

All models were only applied to hand orientations A-H. Data produced by orientation I, with the hand rotated out of the horizontal plane, did not fit well in any model that fit the behavior observed in orientations A-H.

The poor fit of orientation I is predicted by the literature; the time required for mental rotation of images of hands depends on the axis and direction of rotation [34]. The RT for condition I is not significantly different than the RTs for conditions C and E, both of which feature a straight finger and a single rotation [F(2,4123) = 0.037, p = 0.20]. This suggests that rotating the arm down out of plane may not incur as large a cognitive cost as other rotations, although additional experiments are required for any definitive conclusions about out-of-plane rotations. The low cost of orientation I is encouraging for device designers, as this orientation represents a common carrying position for a handheld device.


Fig. 9. Results of Experiment 3, along with model predictions. Error bars indicate 95 percent confidence intervals. For reference, the mean baseline value (absolute RT in orientation A) was 0.484 s. Models were fit to subject means, while group means are shown here, for clarity. Note that models were not fit to orientation I.


5.2.5 Model Comparison

To gain a qualitative understanding of the effects of finger pose, we evaluated several likely models relating observed relative RT to finger orientation. For each model, we performed F-tests to evaluate the contribution of each parameter. In the case of nested models, we used a nested-model regression analysis, testing the contribution of each additional parameter for statistical significance. We rejected four models before accepting a final model relating relative RT to the sum of joint rotations with a single interaction parameter.

5.2.6 Rejected Models

Movement time. Movement time, the time required to physically move the hand between a given pose and the baseline position, was one model suggested by the literature. Behavioral and neural imaging studies suggest that we execute mental rotation of body parts by mentally simulating the movement of our limbs through the required rotation (e.g., [6]). Parsons compared the time required to perform mental rotations of hand images to the time required to physically execute the equivalent hand motion and found the two to be proportionally related [34]. In a small study with four participants, we measured the time required to physically move from each of our experimental orientations (B-H) to the baseline orientation (A), with results shown in Table 2. A model predicting RT as linearly proportional to movement time fit the data poorly (R² = 0.34). We, therefore, conclude that the cognitive cost of mental rotation is not related to physical movement time, and suggest that, in this case, embodiment and mental simulation do not fully account for the observed data.

Spatial rotation. A second model considered pure spatial rotation, considering the optimum rotation path without regard for principal axes or joint anatomy. RT was modeled as linearly proportional to the spatial angles shown in Table 2. For example, orientation D was modeled as a single rotation of 120 degrees around an oblique axis lying on the vector [1,1,1]. This model predicts RT only moderately well (R² = 0.53), implying that participants did not use simple spatial rotations in the experimental task. This result agrees with earlier work showing that people are unable to effectively perform mental rotations around oblique axes [36].
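The single equivalent rotation angle of a compound pose can be recovered from its rotation matrix via the trace formula θ = arccos((tr(R) - 1)/2). The sketch below illustrates this for a rotation of the kind described for orientation D; the specific matrix is our illustrative choice, not a value taken from Table 2:

```python
import math

def rotation_angle_deg(R):
    """Equivalent single-axis rotation angle of a 3x3 rotation matrix."""
    trace = R[0][0] + R[1][1] + R[2][2]
    # Clamp to guard against floating-point drift outside [-1, 1].
    c = max(-1.0, min(1.0, (trace - 1.0) / 2.0))
    return math.degrees(math.acos(c))

# The cyclic permutation x -> y -> z -> x is a 120-degree rotation
# about the oblique axis [1, 1, 1].
R_cycle = [[0, 0, 1],
           [1, 0, 0],
           [0, 1, 0]]
```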

Sum of joint rotations. We considered an anatomy-based model, predicting RT as linearly proportional to the sum of all joint rotation angles. This model assumes that mental rotations associated with different joint rotations all have the same cognitive cost. The improved fit of this model (R² = 0.55) suggests that mental rotations may be related to anatomical joint angles.

Individually weighted joint rotations. Because our ANOVA showed main effects of all joint angles, we considered an anatomy-based model where a separate proportional constant was fit to each of the three joints. While this model, using three parameters, fit the data well (R² = 0.58), the fit cannot pass a test for statistical significance. That is, the model overfits the data; the three parameters do not account for a statistically significant amount of the variance in the data [F(3,116) = 2.1, p = 0.11]. We therefore conclude that mental rotations of the finger, wrist, and arm do not have significantly different cognitive costs.

5.2.7 Final Model: Sum of Joint Rotations with Interaction

Fig. 9 shows that rotation of both the finger and wrist (i.e., conditions D and H) incurs a greater cost than the sum of the costs associated with the two individual rotations. We therefore tested a model with an interaction between the finger and wrist. Our ANOVA justified the choice of this interaction term; while all interactions are statistically significant, the interaction between the finger and wrist is the strongest [F(1,12137) = 107, p < 0.001]. The model is shown as follows:

RT = C_1 · Σ θ_joints + C_2 · θ_finger · θ_wrist,    (1)

where C_i is a constant and θ_x is a joint rotation. This model fits the data well (R² = 0.61), as seen in Fig. 9. Unlike the previous model ("Individually Weighted Joint Rotations"), this fit is statistically significant [F(2,117) = 43, p = 0.017]. Moreover, when treating this model and "Sum of Joint Rotations" as nested models, the addition of the interaction term is a statistically significant improvement [F(1,5) = 30, p = 0.0028]. That is, even though this model has one more parameter than the previous model, the addition of this parameter is statistically justified. Thus, we conclude that in the tested mental rotation task, the cognitive cost of the rotation is proportional to the sum of all joint angles and that all joints contribute an approximately equal cost. Our comparison of models also shows that rotations of the index finger and wrist interact, levying additional cognitive cost.
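The nested-model comparison used here has a standard form: F = ((RSS_reduced - RSS_full)/Δp) / (RSS_full/(n - p_full)), where RSS is the residual sum of squares of each fit. A self-contained sketch of that test statistic follows; the numbers in the example are invented for illustration and do not reproduce the paper's fits:

```python
def nested_f(rss_reduced, rss_full, p_reduced, p_full, n):
    """F statistic for comparing nested least-squares models.

    rss_*: residual sums of squares; p_*: number of fitted parameters;
    n: number of observations. The full model must nest the reduced one.
    """
    dp = p_full - p_reduced
    return ((rss_reduced - rss_full) / dp) / (rss_full / (n - p_full))

# Illustrative example: adding one parameter drops the RSS from 10.0 to 6.0
# with n = 12 observations.
f_stat = nested_f(rss_reduced=10.0, rss_full=6.0, p_reduced=1, p_full=2, n=12)
```

The resulting F value would then be compared against the F(Δp, n - p_full) distribution to obtain the p-value.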

6 CONCLUSION AND DESIGN IMPLICATIONS

Our results suggest that users are able to understand and use directional haptic cues even in situations requiring mental rotation. Participants identified the direction of haptic stimuli quickly and accurately, despite the significant difficulties of the task. We conclude, therefore, that directional haptic cues may be used effectively in mobile or handheld applications, despite the difficulties of mental rotation and skewed perceptual reference frames. We do, however, expect that explicit instructions or training (e.g., to interpret cues in a finger-aligned frame) will help users to correctly interpret rotated stimuli, as uninstructed users in our first experiment did not intuitively interpret stimuli in any one uniform manner.

Our results provide some insight into the mental process underlying the mental rotation of haptic stimuli. Participants appear to be using an analog spatial representation of the stimuli, in a process similar to that observed in studies of visual stimuli. This result is somewhat surprising, given the very simple, four-direction stimuli used in our experiment.

TABLE 2
Model Parameters

The data show a nonlinear relationship between angle and RT, suggesting that participants mentally rotated their hand into alignment with the allocentric frame and interpreted the stimuli in that frame. The shape of this nonlinear relationship provides important guidance for interface designers. Users can be expected to perform small rotations, for example, up to 40 degrees, easily and with little change in performance over this range of angles. At larger angles, however, performance costs are greater, and increase quickly with angle.

Based on our results, we propose the following guidelines for interface design: the effects of mental rotation need not be a major consideration for rotation angles below 40 degrees, and all such small rotation angles can be treated as approximately equal in terms of performance. Larger rotations should be avoided or minimized, as any further increase in rotation angle will impose an additional burden on the user and decrease performance.

The results of Experiment 3 showed that all individual joint rotations incur a similar cost, but that rotations of the finger and wrist interact to add additional task difficulty. This suggests that interface designers may have considerable freedom in allowing any joint rotation, but should work to prevent combined wrist and finger rotation. Our study of intuitive mapping between reference frames failed to find a single pattern common to all users. However, as participants' intuitive mapping behavior did not relate to their later performance, individual variations in mapping should not adversely impact a user's ability to interact with such a haptic interface.

7 FUTURE WORK

In future work, we will continue to refine the models relating finger orientation and cognitive cost by testing a greater range of rotation angles. Because our models were unable to accurately predict the results obtained from orientation I, we will conduct separate studies to investigate hand positions that lie outside of the transverse (horizontal) plane. Other opportunities for further research include the effect of continuously changing orientation and the interaction of mental rotation with other simultaneously executed tasks.

ACKNOWLEDGMENTS

The authors would like to thank Rebecca Koslover for her hard work preparing and proctoring the experiments. Thanks, also, to Dr. Astrid Kappers for her advice and comments. This work was funded, in part, by the US National Science Foundation grant IIS-0746914.

REFERENCES

[1] C.D. Wickens and J.G. Hollands, Engineering Psychology and Human Performance, third ed. Prentice-Hall, 2000.

[2] L.M. Parsons, "Serial Search and Comparison of Features of Imagined and Perceived Objects," Memory and Cognition, vol. 16, no. 1, pp. 23-35, 1988.

[3] R.N. Shepard and J. Metzler, "Mental Rotation of Three-Dimensional Objects," Science, vol. 171, no. 3972, pp. 701-703, 1971.

[4] L.M. Parsons, "Imagined Spatial Transformation of One's Body," J. Experimental Psychology: General, vol. 116, no. 2, pp. 172-191, 1987.

[5] L.A. Cooper and R.N. Shepard, "Mental Transformations in the Identification of Left and Right Hands," J. Experimental Psychology: Human Perception and Performance, vol. 104, no. 1, pp. 48-56, 1975.

[6] J.M. Zacks and P. Michelon, "Transformations of Visuospatial Images," Behavioral and Cognitive Neuroscience Revs., vol. 4, no. 2, pp. 96-118, 2005.

[7] J.M. Zacks, "Neuroimaging Studies of Mental Rotation: A Meta-Analysis and Review," J. Cognitive Neuroscience, vol. 20, no. 1, pp. 1-19, 2008.

[8] S.M. Kosslyn, G.J. DiGirolamo, W.L. Thompson, and N.M. Alpert, "Mental Rotation of Objects versus Hands: Neural Mechanisms Revealed by Positron Emission Tomography," Psychophysiology, vol. 35, no. 2, pp. 151-161, 1998.

[9] P.A. Carpenter and P. Eisenberg, "Mental Rotation and the Frame of Reference in Blind and Sighted Individuals," Perception and Psychophysics, vol. 23, no. 2, pp. 117-124, 1978.

[10] F. Rosler, B. Roder, M. Heil, and E. Hennighausen, "Topographic Differences of Slow Event-Related Brain Potentials in Blind and Sighted Adult Human Subjects during Haptic Mental Rotation," Cognitive Brain Research, vol. 1, no. 3, pp. 145-159, 1993.

[11] L.J. Hunt, M. Janssen, J. Dagostino, and B. Gruber, "Haptic Identification of Rotated Letters Using the Left or Right Hand," Perception and Motor Skills, vol. 68, no. 3, pp. 899-906, 1989.

[12] G.S. Marmor and L.A. Zaback, "Mental Rotation by the Blind: Does Mental Rotation Depend on Visual Imagery," J. Experimental Psychology: Human Perception and Performance, vol. 2, no. 4, pp. 515-521, 1976.

[13] S.C. Prather, J.R. Votaw, and K. Sathian, "Task-Specific Recruitment of Dorsal and Ventral Visual Areas during Tactile Perception," Neuropsychologia, vol. 42, no. 8, pp. 1079-1087, 2004.

[14] Y. Shimizu and B.J. Frost, "Effect of Orientation on Visual and Vibrotactile Letter Identification," Perceptual and Motor Skills, vol. 71, no. 1, pp. 195-198, 1990.

[15] A. Dellantonio and F. Spagnolo, "Mental Rotation of Tactual Stimuli," Acta Psychologica, vol. 73, no. 3, pp. 245-257, 1990.

[16] R. Kitada, H.C. Dijkerman, G. Soo, and S.J. Lederman, "Representing Human Hands Haptically or Visually from First-Person versus Third-Person Perspectives," Perception, vol. 39, pp. 236-254, 2010.

[17] R. Volcic and A. Kappers, "Allocentric and Egocentric Reference Frames in the Processing of Three-Dimensional Haptic Space," Experimental Brain Research, vol. 188, no. 2, pp. 199-213, 2008.

[18] R. Volcic, M. Wijntjes, E. Kool, and A. Kappers, "Cross-Modal Visuo-Haptic Mental Rotation: Comparing Objects between Senses," Experimental Brain Research, vol. 203, no. 3, pp. 621-627, 2010.

[19] R. Volcic, M.W.A. Wijntjes, and A.M.L. Kappers, "Haptic Mental Rotation Revisited: Multiple Reference Frame Dependence," Acta Psychologica, vol. 130, no. 3, pp. 251-259, 2009.

[20] S.C. Prather and K. Sathian, "Mental Rotation of Tactile Stimuli," Cognitive Brain Research, vol. 14, no. 1, pp. 91-98, 2002.

[21] C. Ware and R. Arsenault, "Frames of Reference in Virtual Object Rotation," Proc. Symp. Applied Perception in Graphics and Visualization, 2004.

[22] H.Z. Tan and A. Pentland, "Tactual Displays for Wearable Computing," Personal and Ubiquitous Computing, vol. 1, no. 4, pp. 225-230, 1997.

[23] J. Van Erp, "Presenting Directions with a Vibrotactile Torso Display," Ergonomics, vol. 48, no. 3, pp. 302-313, 2005.

[24] H.Z. Tan, R. Gray, J.J. Young, and R. Traylor, "A Haptic Back Display for Attentional and Directional Cueing," Haptics-e, vol. 3, article 1, 2003.

[25] D. Kern, P. Marshall, E. Hornecker, Y. Rogers, and A. Schmidt, "Enhancing Navigation Information with Tactile Output Embedded into the Steering Wheel," Proc. Int'l Conf. Pervasive Computing, 2009.

[26] K.N. Winfree, J. Gewirtz, T. Mather, J. Fiene, and K.J. Kuchenbecker, "A High Fidelity Ungrounded Torque Feedback Device: The iTorqU 2.0," Proc. Third Joint World Haptics EuroHaptics Conf. and Symp. Haptic Interfaces for Virtual Environment and Teleoperator Systems (World Haptics '09), pp. 261-266, 2009.

[27] N.G. Tsagarakis, T. Horne, and D.G. Caldwell, "Slip Aestheasis: A Portable 2D Slip/Skin Stretch Display for the Fingertip," Proc. Eurohaptics, 2005.

[28] Q. Wang and V. Hayward, "Biomechanically Optimized Distributed Tactile Transducer Based on Lateral Skin Deformation," Int'l J. Robotics Research, vol. 29, pp. 323-335, 2009.

[29] R.J. Webster III, T.E. Murphy, L.N. Verner, and A.M. Okamura, "A Novel Two-Dimensional Tactile Slip Display: Design, Kinematics and Perceptual Experiments," Trans. Applied Perception, vol. 2, no. 2, pp. 150-165, 2005.

[30] B.T. Gleeson, S. Horschel, and W. Provancher, "Perception of Direction for Applied Tangential Skin Displacement: Effects of Speed, Displacement and Repetition," IEEE Trans. Haptics, vol. 3, no. 3, pp. 177-188, July-Sept. 2010.

[31] B.T. Gleeson, S. Horschel, and W. Provancher, "Design of a Fingertip-Mounted Tactile Display with Tangential Skin Displacement Feedback," IEEE Trans. Haptics, vol. 3, no. 4, pp. 297-301, Oct.-Dec. 2010.

[32] D.H. Brainard, "The Psychophysics Toolbox," Spatial Vision, vol. 10, pp. 433-436, 1997.

[33] A.M.L. Kappers and J.J. Koenderink, "Haptic Perception of Spatial Relations," Perception, vol. 28, no. 6, pp. 781-795, 1999.

[34] L.M. Parsons, "Temporal and Kinematic Properties of Motor Behavior Reflected in Mentally Simulated Action," J. Experimental Psychology: Human Perception and Performance, vol. 20, no. 4, pp. 709-730, 1994.

[35] L.M. Parsons, "Imagined Spatial Transformations of One's Hands and Feet," Cognitive Psychology, vol. 19, no. 2, pp. 178-241, 1987.

[36] L.M. Parsons, "Inability to Reason about an Object's Orientation Using an Axis and Angle of Rotation," J. Experimental Psychology: Human Perception and Performance, vol. 21, no. 6, pp. 1259-1277, 1995.

Brian T. Gleeson received the BS degree in engineering physics from the University of Colorado in 2003 and the PhD degree in mechanical engineering from the University of Utah in 2011. He has worked at the Center for Astrophysics and Space Astronomy, Design Net Engineering, and Zhejiang SciTech University. He is currently a postdoctoral fellow in the Department of Computer Science at the University of British Columbia. His present research interests include communication design in human-robot interaction.

William R. Provancher received the BS degree in mechanical engineering and the MS degree in materials science and engineering, both from the University of Michigan, and the PhD degree from the Department of Mechanical Engineering at Stanford University, in the field of haptics, tactile sensing, and feedback. His postdoctoral research involved the design of bioinspired climbing robots. He is currently a tenured associate professor in the Department of Mechanical Engineering at the University of Utah. He teaches courses in the areas of mechanical design, mechatronics, and haptics. His active areas of research include haptics and tactile feedback. He is an associate editor of the IEEE Transactions on Haptics and cochair of the Technical Committee on Haptics. He received a US National Science Foundation CAREER Award in 2008 and has won best paper and poster awards at the 2009 and 2011 World Haptics Conferences for his work on tactile feedback. He is a member of the IEEE.

