Mental Rotation of Directional Tactile Stimuli
Brian T. Gleeson and William R. Provancher
Haptics and Embedded Mechatronics Lab, University of Utah
ABSTRACT
Several researchers have developed haptic devices capable of rendering directional stimuli. When these devices are integrated into mobile or handheld devices, it becomes possible for a user to hold the haptic device in any orientation and thereby receive directional stimuli that may be out of alignment with the rest of the world. In such cases, it becomes necessary for the user to perform a mental transformation of the directional stimuli, so that the stimuli may be understood in a fixed or global reference frame. This paper addresses two questions: 1. Can users perform such transformations and successfully interpret stimuli? 2. What cognitive processes are involved in these transformations? In our experiments, users performed timed identification of directional tactile stimuli with their hand in a variety of orientations around a single axis. The results show that: 1. users can identify directional stimuli both quickly and accurately, even when the stimuli are rendered in a rotated reference frame, and 2. these tasks involve the mental rotation of a spatial mental representation of the stimulus, and also show evidence of embodiment effects. Furthermore, small angles of rotation (up to ~40°) incur very little cognitive cost, suggesting that tactile direction stimuli delivered through a handheld device would be robust to variations in user hand orientation.
KEYWORDS: Directional Skin Stretch, Shear Feedback, Mental Rotation.
Index Terms: H.1.2 [Models and Principles]: User/Machine Systems--Human information processing; H.5.2 [Information Interfaces and Presentation]: User Interfaces--Haptic I/O
1 INTRODUCTION
Haptic direction cues are important because they have the
potential to improve safety and enhance the user experience for a
range of devices. For example, drivers in traffic, soldiers in
combat, and emergency workers in a disaster area must all devote
their visual and auditory attention to their surroundings in order to
maintain situation awareness and personal safety. In such
situations, a haptically-enabled device could provide important
directional or navigational information while leaving the user’s
eyes and ears free. In situations of visual and auditory information
overload, haptic communication can provide cognitive advantages
[1]. For all users, and particularly for the blind, haptic interfaces
could provide an unobtrusive means of receiving information
from common devices like portable phones or music players. As
haptic displays are integrated into mobile or handheld devices, it
becomes necessary to address the issue of mental transformation
of haptic stimuli.
In this paper, we address an important problem inherent to
many types of haptic communication: the mental transformation
of haptic stimuli between different reference frames. In common
laboratory evaluations of directional stimuli, the haptic device is
generally placed in the most natural orientation, that is, aligned
with some global reference frame. In application, however, it may
not be possible to guarantee such alignment, particularly as one
considers handheld or wearable devices.
As an example of this problem, consider navigational
information delivered to the fingertip by a portable device. The
device would render a direction stimulus on the fingertip (in the
reference frame of the finger) and the user would have to interpret
that information as it relates to his or her surroundings (in the task
space). If the user’s finger were not aligned with the task space, it
would be necessary for the user to mentally perform some spatial
transformation between the haptically-perceived reference frame
and the task-based, or global, reference frame (Fig. 1).
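
To make this transformation concrete, consider the planar case studied here: a cue rendered in the finger frame maps into the world frame through a rotation by the hand angle. The following minimal sketch (ours, for illustration; not from the original study) shows the idea in code:

```python
import numpy as np

def finger_to_world(direction_xy, hand_angle_deg):
    """Rotate a direction cue from the finger frame into the world frame.

    direction_xy:    unit vector of the cue in the finger frame
    hand_angle_deg:  orientation of the finger relative to the world frame
    """
    t = np.radians(hand_angle_deg)
    rot = np.array([[np.cos(t), -np.sin(t)],
                    [np.sin(t),  np.cos(t)]])
    return rot @ np.asarray(direction_xy)

# A distal cue (straight ahead in the finger frame), with the hand
# rotated 90 deg to the left, points left in the world frame:
print(finger_to_world([0.0, 1.0], 90.0))   # -> approx. [-1.  0.]
```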
Any time there exists a difference between the haptically-
perceived reference frame and the task-based reference frame in
which the cues are to be interpreted, some mental spatial
transformation will be required. Such transformations could have
a serious impact on the applied use of haptic stimuli; previous
work conducted on mental rotation of visual objects has shown
that these rotations incur a cognitive cost and can burden spatial
working memory [2], potentially interfering with other spatial
tasks. With the large and growing number of handheld or
fingertip-based direction displays, it is important to study the use
of such devices in situations where mental transformation is
required.
As an initial investigation of the mental transformation of
haptic stimuli, this paper addresses two questions: 1. Can users
successfully interpret directional stimuli when the orientation of
the hand changes and a transformation between reference frames
is required? 2. What are the cognitive processes underlying the
transformation of haptic stimuli between reference frames?
Fig. 1 When using a portable haptic interface, it may be necessary
to mentally transform haptic cues from the finger-centered
frame, where they are perceived, to the world-centered
frame, where they are to be applied.
Department of Mechanical Engineering, University of Utah, Salt Lake City, UT, 84112. E-mail: [email protected], [email protected].
IEEE Haptics Symposium 2012, 4-7 March, Vancouver, BC, Canada
2 BACKGROUND
2.1 Accurate Perception of Rotated Stimuli
Prior research suggests that one’s ability to accurately interpret
directional stimuli may be affected by the orientation of the hand.
In studies of orientation perception, Kappers et al. found variable
hand orientation to cause systematic errors in the perception of
stimulus orientation [3]. The effects of hand orientation also
impact spatial processing and the ability of people to mentally
rotate stimuli between reference frames [4]. The adverse impacts
of variable hand orientation are also demonstrated in studies of
haptic perception and comparison [5] and haptic control of virtual
reality environments [6]. These results warn that variable hand
orientation could potentially interfere with the accurate perception
of haptic direction cues, thus motivating our current study.
2.2 Cognitive Processes Underlying Mental Transformation
The question of how we mentally transform stimuli is ultimately a
question of how those stimuli are represented in the mind. A
directional stimulus could be represented in two ways:
symbolically or spatially. For example, if a haptic stimulus were
delivered to the fingertip indicating the direction left, this could be
represented in the mind symbolically, like the word ‘left’, or
spatially, like a vector pointing in a given direction. How
directional stimuli are represented is significant because spatial
representations can be subjected to continuous transformations,
while symbolic representations cannot.
The cognitive science literature contains a large body of work
addressing the mental transformation of stimuli and how those
stimuli are represented in the mind, although the majority of this
work discusses only visual stimuli. Speaking generally,
researchers have found that mental rotation of stimuli is complex
and greatly affects both the cognitive load of a task and the time
required to complete the task [7], while mental translation has
fewer clear costs [8]. As such, our initial investigations, presented
in this paper, will focus only on mental rotation of stimuli.
Many researchers have addressed mental rotation of visual
stimuli, beginning with a study showing that the time required to
complete a mental rotation task increases linearly with the
required angle of rotation [7]. This was understood to mean that
the participants were processing the stimuli using analog, spatial
mental representations (as opposed to symbolic representations)
and that they were performing the rotation task using a continuous
mental transformation.
Later studies, of greater relevance to haptic research, have
addressed the mental rotation of images of body parts (hands, feet,
etc.). These experiments have confirmed earlier results, showing
spatial representations and continuous mental transformations
(e.g., [9], [10]). In contrast to studies of abstract images (e.g., [7]),
experiments involving images of body parts show embodiment
effects, where participants mentally simulate moving the test
image as if the depicted hand or foot was a part of their own body.
Neuroimaging experiments confirmed these results, showing that
the somatosensory system engages in motor simulation of body-
relevant rotations (e.g., [11] [12] [13]). In our experiments, which
involve different physical orientations of the hand, embodiment
effects play a significant role.
The few studies addressing mental rotation of tactile stimuli
have generally produced results similar to those obtained in visual
studies. Most of these experiments have involved an embossed
shape pressed against an unmoving fingertip, such as
alphanumeric characters [14] [15] [16] or abstract shapes [17]
[18]. A study where participants felt models of human hands also
showed evidence of spatial representation and mental rotation
[19]. Other studies, however, have produced contradictory results.
A study using vibrating pins failed to produce evidence of mental
rotation [20], and experiments with Braille-like dots only showed
mental rotation behavior under certain conditions [21]. These
differing results show that both the stimulus type and the nature of
the experimental task can determine how the stimulus is
represented in the mind and how the participant performs the
rotation task. In all previous haptic studies, participants perceived
static shapes, so it remains uncertain how participants will
perform mental transformations in tasks involving active,
directional stimuli. To the best of our knowledge, the
transformation of active, directional stimuli is addressed for the
first time in the present study.
2.3 Prior Work with Directional Stimuli
Haptics researchers have developed a range of methods to
communicate directional information using tactile stimuli. Arrays
of vibrating motors can successfully communicate direction cues
and have been built into wearable devices including vests [22] and
belts [23]. Vibrotactile cues have also been delivered through a
chair [24] or a steering wheel [25]. Examples of hand-held or
fingertip-based devices include those that use inertial forces to
communicate direction [26], and others which use shear forces,
slip, or skin stretch at the fingertip (e.g., [27], [28] [29] [30]). In
our previous work, we evaluated the use of directional tangential
skin stretch at the fingertip. We found this method of tactile
communication to be highly effective; when the skin of the
fingerpad was displaced 1 mm at 2 mm/s or faster, in 1 of 4
cardinal directions, participants were able to identify the direction
of the stimulus with better than 99% accuracy [31]. In all previous
studies of directional tactile stimuli, the stimulus reference frames
were aligned with the response reference frames. In the present
study, we investigate the use of tactile stimuli in cases where the
participant must transform the stimuli between two rotated
reference frames.
3 GENERAL METHODS
In an experiment addressing mental rotation around a single axis,
participants perceived a directional tactile stimulus on the index
finger of their right hand and were asked to indicate the direction
of the stimulus using a joystick held in the left hand. The joystick
and left hand were aligned with the allocentric (world-centered)
reference frame, while the right hand and the tactile stimulus were
rotated into various fixed orientations. That is, stimuli were
perceived in various reference frames (right hand), and were
interpreted in a constant reference frame (left hand). The
foundations for our experiment were first established through a
pilot study.
3.1 Participants
The experiment was completed by 15 volunteer participants, 10
male, 5 female, aged between 19 and 33 years (mean = 25.9).
3.2 Tactile Stimulus
Our study utilized directional skin stretch as a tactile stimulus. A
custom haptic device stretched and displaced the skin of the fingerpad
in a plane tangent to the surface of the fingerpad (Fig. 2). With the
user’s finger held stationary, the haptic device moved a rubber
contact element that was pressed against the skin of the finger,
stretching the skin in one of four cardinal directions. The haptic
device, described in detail in [32], consisted of two servomotors, a
flexure stage that converted the rotational motion of the servos to
linear translation, a rubber contact element, and a thimble-type
finger restraint. The contact element, which delivered the stimulus
to the fingertip, was a ThinkPad TrackPoint (7 mm diameter
rubber hemisphere with sandpaper-like surface finish).
For each stimulus, the contact element completed a 1±0.1 mm
out-and-back motion, as shown in Fig. 3, in one of four cardinal
directions: proximal, distal, radial or ulnar. We limited the stimuli
to only four cardinal directions (90 degree increments) in order to
eliminate the confounding factor of precise stimulus angle
perception. The perception of the four-direction stimuli used in
this experiment is well understood; previous work has shown the
stimuli to be highly salient and to unambiguously communicate
direction. In earlier studies, subjects were able to correctly
identify the direction of stimuli with 99+% accuracy. For detailed
information on stimulus rendering and perception, see [31, 32].
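
For a concrete picture of the stimulus, the following sketch generates the commanded displacement profile described in the text and Fig. 3. The constant-velocity ramps are an assumption of ours; the actual device control is described in [32], and this function is illustrative only:

```python
import numpy as np

def stimulus_profile(amplitude_mm=1.0, speed_mm_s=4.0, hold_s=0.3, dt=0.001):
    """Commanded contact-element displacement for one out-and-back stimulus.

    Values follow the text and Fig. 3 (1 mm displacement, ~4 mm/s,
    0.3 s pause); constant-velocity ramps are an assumption here.
    """
    ramp_t = amplitude_mm / speed_mm_s                           # 0.25 s per ramp
    out = np.arange(0, ramp_t, dt) * speed_mm_s                  # 0 -> 1 mm
    hold = np.full(int(hold_s / dt), amplitude_mm)               # pause at 1 mm
    back = amplitude_mm - np.arange(0, ramp_t, dt) * speed_mm_s  # 1 -> 0 mm
    return np.concatenate([out, hold, back])                     # sampled at 1 kHz

profile = stimulus_profile()
print(f"{len(profile)} samples, peak {profile.max():.2f} mm")    # 800 samples, 1.00 mm
```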
The full perceptual nature of the stimulus is somewhat
complex, consisting of the primary skin stretch but also small
amounts of slip between the finger and contact element, as well as
the contact between the finger and the restraining thimble. The
stimulus is discussed in greater depth in our previous work, but
for the present experiment it is only important that the stimulus
provide unambiguous direction cues, as it has been shown to do
[31, 32]. Although we test only one stimulus type, we consider the
current work to apply broadly to the various types of directional
tactile stimuli discussed in Section 2.3; the skin stretch stimulus
used in our experiment is a representative example of a larger
class of haptic stimuli.
3.3 Apparatus
Test participants sat as shown in Fig. 4(left), with the tactile
display device worn on the right index finger. Wooden fixtures,
fastened to the test surface with hook-and-loop adhesive tape,
determined the location and orientation of the tactile device and
the participant’s hand. After receiving a tactile stimulus,
participants would indicate their response using a four-direction
joystick operated with their left hand. The joystick only allowed
the participant to respond in four cardinal directions so as to
eliminate any confusion arising from ambiguous responses.
Participants received instructions from a monitor positioned in
front of the test apparatus. A PC running Matlab and the
Psychophysics Toolbox [33] controlled the tactile device and
recorded participants' responses with ±1.5 ms timing accuracy.
During the experiment, participants wore headphones playing
white noise and the test environment was covered so that
participants were not able to see their hands.

Fig. 4 Left: The test environment. Repositionable fixtures provided
for repeatable positioning of the hand and finger. During
experiments, the test environment was covered with a
wooden lid so that participants could not see their hands.
Right: The six hand orientations tested.
4 PILOT STUDY
4.1 Pilot Study Procedure
A pilot study was conducted to characterize the natural response
to rotated stimuli. In this study, participants were not instructed
how to map between the stimulus and the joystick. They were told
to respond “…in the direction that you feel best corresponds with
the tactile stimulus.” The pilot study tested a range of pre-defined
hand orientations produced by combining 90° rotations of the
wrist, elbow and index finger (including the 0° and 90° positions
in Fig. 4(right)). Responses were not timed.
4.2 Pilot Study Results
The results of the pilot study show that there is no single, intuitive
mapping between reference frames in our experiment. Participant
responses showed two different mapping patterns: a finger-
aligned mapping and a static, world-aligned (allocentric)
mapping. In the finger-aligned mapping, participants interpreted
the stimuli in the reference frame in which they were rendered.
For example, a distal stimulus on the index finger mapped to the
forward direction on the joystick, even as the finger moved out of
alignment with the joystick. In world-aligned mapping,
participants responded in the direction most closely aligned to the
absolute, physical direction of the stimulus. For example, a
stimulus in a forward direction (away from the body) always
mapped to the forward direction on the joystick, regardless of the
orientation of the stimulus on the finger. Participant responses
were divided approximately evenly between the two mapping
patterns.
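
The two mapping patterns can be stated precisely. In the sketch below (ours, for illustration), directions are angles measured counterclockwise, with 0° meaning distal in the finger frame and forward on the joystick; the finger-aligned response repeats the finger-frame direction, while the world-aligned response is that direction rotated by the hand angle and snapped to the nearest joystick direction:

```python
def predicted_responses(stim_deg_finger, hand_angle_deg):
    """Predicted joystick response (deg CCW, 0 = forward) under the two
    mapping patterns observed in the pilot study.

    stim_deg_finger: stimulus direction in the finger frame (0 = distal)
    hand_angle_deg:  rotation of the finger relative to the joystick frame
    """
    # Finger-aligned: repeat the finger-frame direction on the joystick,
    # e.g., distal -> forward no matter how the hand is oriented.
    finger_aligned = stim_deg_finger % 360
    # World-aligned: respond in the physical direction of the stimulus,
    # snapped to the nearest of the four joystick directions.
    physical = (stim_deg_finger + hand_angle_deg) % 360
    world_aligned = (round(physical / 90) * 90) % 360
    return finger_aligned, world_aligned

# Distal stimulus (0 deg) with the hand rotated 90 deg to the left:
print(predicted_responses(0, 90))   # -> (0, 90): 'forward' vs. 'left'
```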
From these results we learn that there is no single intuitive
mapping pattern. This result has two important implications for
our primary experiment: first, participants must be trained to
respond in the desired manner, and second, that mapping stimuli
between reference frames is not an unambiguous, automatic
operation.
The presence of multiple mapping patterns forces us to address
the following question: if we are researching the ability of users to
correctly interpret directional stimuli, which mapping pattern do
we consider ‘correct’? We chose finger-aligned mapping for two
reasons: 1. Finger-aligned mapping is most practical for real-
world applications, as it does not require the interface device to be
capable of tracking its orientation with respect to the world-
aligned reference frame, and 2. Asking study participants to map
between the finger-aligned frame and the world-aligned frame
allows us to most directly investigate the problem of mental
rotation and is in accordance with earlier work on this subject
(e.g., [14] [15] [16] [17] [18]).
In our main experiment, presented below, we instructed and
trained participants to use a finger-aligned mapping. This study is
a first effort at understanding the mental rotation and mapping of
tactile direction cues.
Fig. 2. Photo of tactile display with printed circuit board (PCB)
control electronics.
Fig. 3 Tactile stimulus motion profile. The contact element moves
1 mm at approx. 4 mm/s, stretching the skin of the fingerpad.
After a pause of 0.3 s, the contact element returns to the
center position.
5 MAIN EXPERIMENT
5.1 Main Experiment Procedure
Our main experiment investigated rotation of tactile stimuli
around a single axis by measuring the time required to interpret
directional stimuli while the hand was placed in a variety of
orientations. Participants were tested on six hand orientations
produced by rotating the forearm about the elbow in the horizontal
plane, as shown in Fig. 4(right). Hand orientations were spaced
every 18° between 0° (right arm extended straight forward) and
90° (right arm pointing to the left). This range of hand orientations
was selected as the most practical means of investigating the
simple case of rotation about a single axis.
Participants were instructed to interpret the stimuli in the
reference frame of the fingertip and respond in the allocentric
(joystick) reference frame. That is, a distal stimulus on the
fingertip was to correspond with a forward response on the
joystick, regardless of the orientation of the finger. Participants
were instructed to respond as quickly as possible. A training
session before the experiment ensured that participants understood
the task and could respond with high accuracy.
Participants responded to 24 repetitions of each of the 4
stimulus directions in each of the 6 hand orientations, for a total of
576 stimuli. The presentation order of hand orientations was
balanced between participants. To minimize the effects of
learning or fatigue, each experiment tested each hand orientation
twice, with the stimulus repetitions evenly divided between the
two blocks. That is, participants responded to 48 stimuli in a given
orientation and then moved on to the next orientation, cycling
through all orientations twice. The order of the stimulus directions
was pseudorandom, with an equal number of stimuli presented in
each direction. Participants required an average of 30 minutes to
complete the experiment.
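
The trial structure described above translates into a simple generator. This sketch (ours; the original experiment ran in Matlab) builds one session of 576 trials with balanced, pseudorandom stimulus order; counterbalancing of orientation order across participants is omitted:

```python
import random

ORIENTATIONS = [0, 18, 36, 54, 72, 90]        # deg; arm rotated about the elbow
DIRECTIONS = ['proximal', 'distal', 'radial', 'ulnar']
REPS_PER_BLOCK = 12                           # 2 blocks x 12 = 24 reps/direction

def build_session(seed=None):
    """One participant's trial list: every orientation is visited twice,
    each block holding 48 stimuli with equal counts per direction in
    shuffled (pseudorandom) order."""
    rng = random.Random(seed)
    trials = []
    for orientation in ORIENTATIONS * 2:      # cycle through all orientations twice
        block = DIRECTIONS * REPS_PER_BLOCK   # 48 balanced stimuli
        rng.shuffle(block)
        trials += [(orientation, d) for d in block]
    return trials

print(len(build_session(seed=1)))             # 576 trials
```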
5.2 Data Analysis
Data analysis focused on relative response times (relative RT).
Response time (RT) is the time elapsed between the onset of the
stimulus and the participant’s joystick response. Relative RT is
(RT – baseline RT) where baseline RT is the average response
time in the baseline condition (0°, i.e., with the index finger
aligned with the joystick). Baseline RT values were calculated
individually for each participant. The use of relative RT in the
analysis eliminates baseline differences between participants and
makes the effect of hand orientation more apparent. Data were
pooled for each subject, producing a mean response time for each
hand orientation. The final analysis considered the resultant
collection of subject means.
All incorrect responses were rejected from the data set
(3.0% of data) along with outlier RT values (> 3 standard
deviations from the subject’s mean for a given orientation, 2.1%
of data). Additionally, all data from one participant were rejected,
due to the participant’s unusually high error rate (>3 standard
deviations above the group mean).
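
The preprocessing steps above can be summarized in a few lines of code. The following pandas sketch is our paraphrase of the described analysis, not the original scripts; the column names are assumptions, and the exclusion of the high-error participant is omitted:

```python
import pandas as pd

def preprocess(trials: pd.DataFrame) -> pd.DataFrame:
    """Drop errors and RT outliers, then compute relative RT.

    Assumes columns: participant, orientation (deg), correct (bool), rt (s).
    """
    # 1. Reject incorrect responses.
    ok = trials[trials['correct']].copy()
    # 2. Reject RTs more than 3 SD from the subject's mean for that orientation.
    grp = ok.groupby(['participant', 'orientation'])['rt']
    z = (ok['rt'] - grp.transform('mean')) / grp.transform('std')
    ok = ok[z.abs() <= 3].copy()
    # 3. Subtract each subject's mean baseline (0 deg) RT.
    baseline = ok[ok['orientation'] == 0].groupby('participant')['rt'].mean()
    ok['relative_rt'] = ok['rt'] - ok['participant'].map(baseline)
    return ok
```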
5.3 Direction Cue Error Rate and Reaction Time
Our experiment used response time (RT) as a measure of task
difficulty. As a precursor to our main data analysis, we tested the
correlation between RT and error rate to ensure that the
interpretation of our data was not confounded by speed-accuracy
trade-off effects (cf. [19]). This analysis of correlation addresses
the following question: do larger RTs accurately indicate greater
task difficulty, or are participants merely choosing a different
point on the speed-accuracy continuum for each finger
orientation? The data show a positive correlation between RT and
error rate, the opposite of what one would expect in the case of a
speed-accuracy trade-off. The positive correlation, as measured by
Pearson’s r, is statistically significant (r = 0.94, p < 0.001). This
positive correlation indicates that, under some conditions, the
participants were making more errors even though they were
spending more time on the task, implying that the difficulty of the
task was not constant. From this we conclude RTs may be used as
a measure of task difficulty.
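
The check itself reduces to a single correlation test. A sketch with scipy, using made-up numbers purely to demonstrate the computation (the real analysis correlated per-condition mean RTs with error rates and obtained r = 0.94):

```python
from scipy.stats import pearsonr

# Per-orientation mean RTs and error rates; these values are hypothetical,
# standing in for the experimental data.
mean_rt = [0.55, 0.57, 0.60, 0.64, 0.69, 0.73]     # s, one value per orientation
error_rate = [0.01, 0.02, 0.02, 0.03, 0.04, 0.05]

r, p = pearsonr(mean_rt, error_rate)
print(f"r = {r:.2f}, p = {p:.4f}")
# A positive r means participants made *more* errors where they were
# *slower* -- the opposite of a speed-accuracy trade-off.
```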
5.4 Accurate Perception of Directional Stimuli in Rotated Reference Frames
Participants responded quickly and accurately to the experimental
stimuli over the range of tested angles. The mean accuracy was
97% and the mean absolute response time was 0.606 s. These
results show that users can interpret directional stimuli quickly
and accurately, even in cases where the reference frame of the
stimuli is rotated with respect to an absolute reference frame.
From this we conclude that directional tactile stimuli may be used
effectively in cases where the haptic device moves with respect to
the environment, such as would be the case with a handheld
device.
5.5 Cognitive Processes Underlying Mental Transformation
A further analysis of our data addresses the question of how users
transform stimuli between reference frames. This analysis focuses
on the relationship between rotation angle and relative RT (there
were too few errors to perform any meaningful analysis of error
patterns). We hypothesized that the directional stimuli used in our
experiment would produce a spatial mental representation, and
that the experimental task would require a mental rotation of this
spatial representation. Based on previous studies (e.g., [7], [9],
[10]), we expected that a spatial mental representation and mental
rotation would be indicated in the data by an increase in relative
RT as a continuous function of the angle between finger frame
and the joystick (world) frame.
Fig. 5 summarizes our results and shows a clear correlation
between rotation angle and relative RT. That is, as the angle
between the finger and joystick reference frames increased,
participants required more time to respond. This correlation is
statistically significant (Pearson’s r = 0.602, p < 0.0001). This
increase in time implies increased task difficulty and a greater
amount of cognitive processing for greater rotation angles.

Fig. 5 Pooled data, along with a sinusoidal fit. Error bars indicate
95% confidence intervals. For reference, the mean baseline
value (absolute RT at 0°) was 0.546 s.
From this correlation, we conclude that participants were
forming a spatial representation of the stimuli and performing a
mental rotation, rotating the tactile stimuli from the fingertip
frame to the joystick frame (cf. [7]). This result is somewhat
surprising; given the simple stimulus used in our experiment (4
cardinal directions), it would have been reasonable to assume that
participants would perform the experimental task using a simple
symbolic strategy, e.g., by memorizing the relationship between
stimuli and responses, similar to a mental lookup table. This was
not the case, however. A non-spatial mental representation of
stimuli would not be sensitive to the spatial orientation of the
finger. Based on the findings in the large body of prior mental
rotation research, the observed correlation between rotation angle
and response time indicates a mental rotation process based on a
spatial representation of the stimuli (e.g., [7], [9], [10]).
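
The contrast between the two candidate strategies is easy to state in code. In this schematic sketch (ours), a symbolic strategy is a fixed lookup table whose output never depends on hand angle, so it predicts a flat RT-angle curve; the observed angle dependence is instead the signature of a spatial strategy:

```python
# A symbolic strategy amounts to a fixed lookup table; under the
# finger-aligned mapping the correct response never depends on hand
# orientation, so this strategy predicts a flat RT-vs-angle curve.
# (Direction labels assume the right hand palm-down; illustrative only.)
SYMBOLIC = {'distal': 'forward', 'proximal': 'back',
            'radial': 'left', 'ulnar': 'right'}

def symbolic_response(stimulus: str, hand_angle_deg: float) -> str:
    return SYMBOLIC[stimulus]       # note: hand_angle_deg is never used

# A spatial strategy instead manipulates an analog representation --
# e.g., mentally rotating a direction vector through hand_angle_deg --
# so its cost grows with the angle, which is the pattern observed.
```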
Further insight into the cognitive processes underlying the
experimental task can be gained by considering the shape of the
RT-angle relationship. Our data show a non-linear relationship
between rotation angle and RT, which we fit with a sinusoidal
curve (R² = 0.41). Other non-linear curve shapes, e.g., a
parabola, fit similarly well, but a sinusoid was chosen as the most
physically relevant model, following the example of a prior study
which also addressed tactile sensing and the mental rotation of
hands [19]. Note that curves were fit to the collection of all
participant means (6 data points per participant), while Fig. 5
shows pooled means only, for clarity.
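
Since the paper does not report the fitted equation, the following sketch shows one plausible way to perform such a fit with scipy, assuming a shifted sinusoid ΔRT = a·sin(θ + φ) + c, which can reproduce the flat-then-rising shape of the data:

```python
import numpy as np
from scipy.optimize import curve_fit

def sinusoid(theta_deg, a, phi, c):
    # One plausible parameterization; the paper reports a sinusoidal
    # fit (R^2 = 0.41) but does not give the exact functional form.
    return a * np.sin(np.radians(theta_deg) + phi) + c

def fit_rt_curve(angles_deg, rel_rt):
    """Fit relative RT vs. rotation angle and report R^2."""
    angles_deg = np.asarray(angles_deg, dtype=float)
    rel_rt = np.asarray(rel_rt, dtype=float)
    params, _ = curve_fit(sinusoid, angles_deg, rel_rt, p0=[0.1, 0.0, 0.0])
    pred = sinusoid(angles_deg, *params)
    ss_res = np.sum((rel_rt - pred) ** 2)
    ss_tot = np.sum((rel_rt - rel_rt.mean()) ** 2)
    return params, 1.0 - ss_res / ss_tot
```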
The non-linear shape of our data allows us to place our results
in context with earlier studies, suggesting the presence of
embodiment effects in our study. Prior studies of mental rotation
involving abstract shapes generally show linear trends (e.g., [17],
[15], [18]). Non-linear and sinusoidal trends, qualitatively similar
to our results, most often appear when participants exhibit
embodiment effects during the mental rotation of visually or
tactilely perceived human hands (e.g., [34], [35], [19], [10]) or
while controlling an object through physical hand rotation [6].
The implication, therefore, is that participants in our study were
performing some sort of embodied rotation, that is, mentally
rotating their hand into the baseline position (0° position), and
then interpreting the stimulus in that orientation. This result is
important because it allows the classification of our results into
the larger body of past research, shedding light on the cognitive
processing underlying the interpretation of rotated stimuli and
indicating a direction for future work. Because of the presence of
embodiment effects, mental rotation of tactile stimuli may not be
a matter of simple spatial transformation, but may also incorporate
physiological factors, such as the pose of the hand or the angle of
specific joints. A deeper investigation into the effect of hand pose
on mental transformation is the subject of ongoing research.
6 CONCLUSION
Our results suggest that directional haptic cues could be used
effectively in applications where some mental rotation would be
required. Despite the combined difficulties of the mental rotation
task and the skewed perceptual reference frames at rotated finger
orientations, participants were able to identify the direction of the
tactile stimuli quickly and with high accuracy.
The relationship between finger orientation and RT shows that,
even with simple four-direction stimuli, participants mentally
processed the stimuli using analog spatial representations. The
shape of the time-angle curve is of relevance to haptic interface
designers; rotations of small angles (say, 0-40°) can be executed
without great cost, but larger rotations may impede performance.
Additionally, the non-linear nature of the time-angle curve
indicates the presence of embodiment effects, suggesting a
possible relationship between the specific pose of the hand and the
difficulty of the mental transformation.
The results of the present study have the potential to impact a
broad range of haptic applications. Many haptic devices have been
designed to convey directional information. When these devices
are integrated into a mobile or handheld device, users will be
forced to contend with haptic stimuli rendered in the variable
reference frame of the hand. A thorough understanding of the
mental transformation of haptic stimuli will help engineers and
designers to develop haptic interactions that are suitable for
mobile and handheld applications.
Based on the results of this study, we propose the following
guidelines for the design of haptic devices and interactions: 1.
Designers may build devices and interactions that require users to
interpret spatial stimuli presented in a rotated reference frame;
however, 2. the interpretation of such stimuli is not unambiguous,
and users must be instructed on the proper mapping. 3. Designers
should expect the interpretation of these stimuli to come at the
cost of longer reaction times and higher cognitive demands for the
user, but 4. these costs can be minimized if devices and
interactions are designed to keep reference frame rotations under
approximately 40 degrees.
Our ongoing work continues to address the problem of rotated
haptic stimuli. Current experiments are addressing the question of
embodiment and the impact of specific hand poses on the
interpretation of haptic cues.
ACKNOWLEDGEMENTS
We thank Rebecca Koslover for her hard work in preparing and
proctoring the experiments. Thanks also to Dr. Astrid Kappers for
her advice and comments. This work was funded, in part, by the
US National Science Foundation under grant # IIS-0746914.
REFERENCES
[1] Wickens, C.D., and Hollands, J.G.: ‘Engineering Psychology and
Human Performance’ (Prentice-Hall Inc., 2000, 3rd edn.)
[2] Parsons, L.M.: ‘Serial search and comparison of features of
imagined and perceived objects’, Memory & Cognition, 1988, 16,
(1), pp. 23-35
[3] Volcic, R., and Kappers, A.: ‘Allocentric and egocentric reference
frames in the processing of three-dimensional haptic space’,
Experimental Brain Research, 2008, 188, (2), pp. 199-213
[4] Volcic, R., Wijntjes, M.W.A., and Kappers, A.M.L.: ‘Haptic mental
rotation revisited: multiple reference frame dependence’, Acta
Psychologica, 2009, 130, (3), pp. 251-259
[5] Prather, S.C., and Sathian, K.: ‘Mental rotation of tactile stimuli’,
Cognitive Brain Research, 2002, 14, (1), pp. 91-98
[6] Ware, C., and Arsenault, R.: ‘Frames of reference in virtual object
rotation’. Proc. Symposium on Applied perception in graphics and
visualization, Los Angeles, California, 2004
[7] Shepard, R.N., and Metzler, J.: ‘Mental Rotation of Three-
Dimensional Objects’, Science, 1971, 171, (3972), pp. 701-703
[8] Denis, M., and Kosslyn, S.M.: ‘Scanning visual mental images: A
window on the mind’, Cahiers de psychologie cognitive, 1999, 18,
(4), pp. 409-616
[9] Parsons, L.M.: ‘Imagined spatial transformation of one's body’, J
Exp Psychol Gen, 1987, 116, (2), pp. 172-191
[10] Cooper, L.A., and Shepard, R.N.: ‘Mental transformations in the
identification of left and right hands’, J Exp Psychol Hum Percept
Perform, 1975, 104, (1), pp. 48-56
[11] Zacks, J.M., and Michelon, P.: ‘Transformations of Visuospatial
Images’, Behavioral and Cognitive Neuroscience Reviews, 2005, 4,
(2), pp. 96-118
[12] Zacks, J.M.: ‘Neuroimaging Studies of Mental Rotation: A Meta-
analysis and Review’, Journal of Cognitive Neuroscience, 2008, 20,
(1), pp. 1-19
[13] Kosslyn, S.M., DiGirolamo, G.J., Thompson, W.L., and Alpert,
N.M.: ‘Mental rotation of objects versus hands: Neural mechanisms
revealed by positron emission tomography’, Psychophysiology,
1998, 35, (2), pp. 151-161
[14] Carpenter, P.A., and Eisenberg, P.: ‘Mental rotation and the frame of
reference in blind and sighted individuals.’, Perception &
Psychophysics, 1978, 23, (2), pp. 117-124
[15] Rösler, F., Röder, B., Heil, M., and Hennighausen, E.: ‘Topographic
differences of slow event-related brain potentials in blind and sighted
adult human subjects during haptic mental rotation’, Cognitive Brain
Research, 1993, 1, (3), pp. 145-159
[16] Hunt, L.J., Janssen, M., Dagostino, J., and Gruber, B.: ‘Haptic
identification of rotated letters using the left or right hand’, Percept
Mot Skills, 1989, 68, (3), pp. 899-906
[17] Marmor, G.S., and Zaback, L.A.: ‘Mental rotation by the blind: Does
mental rotation depend on visual imagery?’, Journal of Experimental
Psychology: Human Perception and Performance, 1976, 2, (4), pp.
515-521
[18] Prather, S.C., Votaw, J.R., and Sathian, K.: ‘Task-specific
recruitment of dorsal and ventral visual areas during tactile
perception’, Neuropsychologia, 2004, 42, (8), pp. 1079-1087
[19] Kitada, R., Dijkerman, H.C., Soo, G., and Lederman, S.J.:
‘Representing human hands haptically or visually from first-person
versus third-person perspectives’, Perception, 2010, 39, pp. 236-254
[20] Shimizu, Y., and Frost, B.J.: ‘Effect of orientation on visual and
vibrotactile letter identification’, Percept Mot Skills, 1990, 71, (1),
pp. 195-198
[21] Dellantonio, A., and Spagnolo, F.: ‘Mental rotation of tactual
stimuli’, Acta Psychologica, 1990, 73, (3), pp. 245-257
[22] Tan, H.Z., and Pentland, A.: ‘Tactual displays for wearable
computing’, Personal and Ubiquitous Computing, 1997, 1, (4), pp.
225-230
[23] Van Erp, J.: ‘Presenting directions with a vibrotactile torso display’,
Ergonomics, 2005, 48, (3), pp. 302-313
[24] Tan, H.Z., Gray, R., Young, J.J., and Traylor, R.: ‘A haptic back
display for attentional and directional cueing’, Haptics-e, 2003, 3, (1)
[25] Kern, D., Marshall, P., Hornecker, E., Rogers, Y., and Schmidt, A.:
‘Enhancing Navigation Information with Tactile Output Embedded
into the Steering Wheel’. Proc. Conf. on Pervasive Computing, Nara,
Japan 2009
[26] Winfree, K.N., Gewirtz, J., Mather, T., Fiene, J., and Kuchenbecker,
K.J.: ‘A high fidelity ungrounded torque feedback device: The
iTorqU 2.0’, Proc. World Haptics 2009, pp. 261-266
[27] Katsumi, S., Kazuyuki, T., and Shin, T.: ‘2DOF Flat Actuator for
Tangible Mouse’, Fuji Xerox Tech Rep, 2000, 1, (13), pp. 38-43
[28] Tsagarakis, N.G., Horne, T., and Caldwell, D.G.: ‘Slip Aestheasis: A
Portable 2D Slip/Skin Stretch Display for the Fingertip’. Proc.
Eurohaptics 2005
[29] Wang, Q., and Hayward, V.: ‘Biomechanically Optimized
Distributed Tactile Transducer Based on Lateral Skin Deformation’,
Journal of Robotics Research, 2009, 29, (4), pp. 323-335
[30] Webster III, R.J., Murphy, T.E., Verner, L.N., and Okamura, A.M.:
‘A novel two-dimensional tactile slip display: design, kinematics and
perceptual experiments’, Trans. on Applied Perception, 2005, 2, (2),
pp. 150-165
[31] Gleeson, B.T., Horschel, S.K., and Provancher, W.R.: ‘Perception of
Direction for Applied Tangential Skin Displacement: Effects of
Speed, Displacement and Repetition’, IEEE Trans. on Haptics, 2010,
3, (3), pp. 177-188
[32] Gleeson, B.T., Horschel, S.K., and Provancher, W.R.: ‘Design of a
Fingertip-Mounted Tactile Display with Tangential Skin
Displacement Feedback’, IEEE Trans. on Haptics, 2010, 3, (4), pp.
297 - 301
[33] Brainard, D.H.: ‘The Psychophysics Toolbox’, Spatial Vision, 1997,
10, pp. 433-436
[34] Parsons, L.M.: ‘Temporal and kinematic properties of motor
behavior reflected in mentally simulated action’, Journal of
Experimental Psychology: Human Perception and Performance,
1994, 20, (4), pp. 709-730
[35] Parsons, L.M.: ‘Imagined spatial transformations of one's hands and
feet’, Cognitive Psychology, 1987, 19, (2)