Grabity: A Wearable Haptic Interface for Simulating Weight and Grasping in Virtual Reality

Inrak Choi1, Heather Culbertson1, Mark R. Miller1, Alex Olwal2, Sean Follmer1

1 Stanford University, Stanford, CA 94305, USA
2 Google Inc., Mountain View, CA 94043, USA

[email protected], [email protected], [email protected], [email protected],[email protected]

Figure 1. Grabity is a novel, unified design based on the combination of vibrotactile feedback, unidirectional brakes, and asymmetric skin stretch. The gripper-style haptic device can simulate grasping motions with a real object (top) and in Virtual Reality (bottom). Grabity provides vibrotactile feedback during contact, high-stiffness force feedback during grasping, and weight force feedback during lifting.

ABSTRACT
Ungrounded haptic devices for virtual reality (VR) applications lack the ability to convincingly render the sensations of a grasped virtual object's rigidity and weight. We present Grabity, a wearable haptic device designed to simulate kinesthetic pad opposition grip forces and weight for grasping virtual objects in VR. The device is mounted on the index finger and thumb and enables precision grasps with a wide range of motion. A unidirectional brake creates rigid grasping force feedback. Two voice coil actuators create virtual force tangential to each finger pad through asymmetric skin deformation. These forces can be perceived as gravitational and inertial forces of virtual objects. The rotational orientation of the voice coil actuators is passively aligned with the real direction of gravity through a revolute joint, causing the virtual forces to always point downward. This paper evaluates the performance of Grabity through two user studies, finding promising ability to simulate different levels of weight with convincing object rigidity. The first user study shows that Grabity can convey various magnitudes of weight and force sensations to users by manipulating the amplitude of the asymmetric vibration. The second user study shows that users can differentiate different weights in a virtual environment using Grabity.

Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

UIST 2017, October 22–25, 2017, Quebec City, QC, Canada.
© 2017 Copyright is held by the owner/author(s).
ACM ISBN 978-1-4503-4981-9/17/10
https://doi.org/10.1145/3126594.3126599

ACM Classification Keywords
H.5.1 [Information Interfaces and Presentation]: Multimedia Information Systems - Artificial, Augmented, and Virtual Realities; H.5.2 [User Interfaces]: Haptic I/O

Author Keywords
Haptics; Virtual Reality; Mass Perception; Weight Force; Grasp

INTRODUCTION
In order to grasp and manipulate objects in the real world, humans rely on haptic cues such as fingertip contact pressure and kinesthetic feedback of finger positions to determine shape, and proprioceptive and cutaneous feedback for weight perception, among other modalities [26]. To create haptic interfaces that can provide a realistic grasping experience, we must support these same modalities and render similar forces to a user's hands. Current virtual reality (VR) systems can render realistic 3D objects visually, but most lack the ability to provide a realistic haptic experience. Grounded kinesthetic haptic devices can render many of these forces for either a single contact point [27] or even five fingers [14]. However, grounded haptic devices are large, mechanically complex, and have limited workspaces.


In the context of consumer VR, haptic devices must become more compact and mobile. However, it remains challenging to render forces, such as grasp and weight, using ungrounded haptic feedback devices. A variety of wearable and portable haptic devices have been investigated with different types of gloves, exoskeletons, and handheld controllers to provide an immersive experience to users during interaction in VR. While these devices provide haptic feedback for informational cues, 3D shape information, and stiffness of virtual objects, creating kinesthetic feedback, such as weight and inertia of virtual objects, still remains a challenge. This challenge comes from the fact that wearable and portable haptic devices are grounded to the user's body, not to the environment. There is no electromechanical structure from which to generate a kinesthetic force outside of the human body. While there are conflicting constraints of mobility and performance, an ideal haptic device would be a compact and lightweight mobile device with the ability to create the necessary forces.

We introduce Grabity, a mobile, ungrounded haptic device which can display various types of kinesthetic feedback to enhance the grasping and manipulation of virtual objects. This feedback includes gravity, force for inertia, as well as rigid stiffness force feedback between opposing fingers. The design combines a "gripper" style haptic device [5, 32], for providing opposing forces between the index finger and thumb, and a skin deformation mechanism for rendering inertia and mass of a virtual object. Previous work has demonstrated that asymmetric skin deformation enabled by linear vibratory motors can generate perceived virtual forces tangential to finger pads [40, 2, 12].

We apply this same principle to render the virtual gravity force of different virtual masses, and their associated inertia, in one degree of freedom per finger. To create the sensation of gravity and inertia, we adapt two voice coil actuators to a mobile gripper-type haptic device. We utilize different magnitudes of asymmetric vibrations to generate various levels of force feedback. The gripper element includes a unidirectional brake to create the rigid, high-stiffness, opposing forces between a finger and thumb.

While previous work has investigated the maximum perceived force of a single asymmetric skin deformation signal [40, 12], we specifically investigate weight perception and the ability to discriminate different levels of stimuli. We conducted a user study with 12 participants to measure the perceived virtual force, which showed that users can discriminate at least two levels of positive and negative force, and we characterized their associated magnitudes. We then applied these results to a VR environment allowing users to grasp and manipulate virtual objects with contact forces (vibration), pad opposition grip forces (unidirectional braking), and gravity and inertia (asymmetric skin deformation). A user evaluation with 5 participants highlighted the ease of manipulation and suggested that users could discriminate 3 levels of virtual weight.

CONTRIBUTIONS
• Design considerations for integrating voice coil actuators into a mobile haptic device for displaying weight and inertia of virtual objects.

• Combining grasp force feedback with asymmetric skin deformation in a virtual environment.

• An integrated VR system with haptic rendering of virtual gravity and inertia, contact forces, and grasp feedback.

• Quantification of perceived virtual forces generated through voice coil actuators, by controlling the magnitude of asymmetric vibrations.

• Verification that Grabity allows users to differentiate between virtual weights in VR.

RELATED WORK

Wearable and Handheld Haptic Devices
In contrast to grounded haptic devices, which ground forces externally, wearable and handheld haptic devices ground forces to a user's body. These wearable and handheld devices do not restrict the motion of a user, allowing the user to move around freely in space. Thus, such devices can support a much larger workspace.

Wand-based controllers are often used in VR environments, thus a number of haptics-enabled handheld devices have been investigated. These devices are ungrounded, so they cannot render external forces. They also do not allow users to perceive the shape of objects through enclosure, or to grasp objects naturally. They are, however, more compact and map easily to existing spatial input techniques. Many such devices (e.g., the HTC Vive controller) utilize vibrotactile feedback. NormalTouch and TextureTouch, on the other hand, are devices that can render texture or contact angle to a single finger [7] using a tilt platform and tactile array, respectively. Grasping in handheld devices has also been explored with some systems that provide variable stiffness feedback [16]. A recent wand-based controller gives various kinesthetic haptic feedback by shifting its center of mass [47].

Cutaneous force feedback displays stimulate mechanoreceptors in the skin that enable people to perceive local shape, texture, and other features. Researchers have developed wearable fingertip-based devices that render contact forces [29, 34], textures [45], and skin stretch [9, 41]. However, these devices cannot provide rigid grasping sensations as there is no force constraining finger motion.

Researchers have also explored glove-style, wearable exoskeletons to provide haptic feedback for grasping of virtual objects. CyberGrasp, a commercial device, grounded force feedback for each finger through a number of pulleys to the wrist [1]. More recently, Dexmo has taken a similar approach, using low-cost servo motors and mechanical linkages to provide force feedback for five fingers [15]. While those devices grounded forces to the wrist, the Rutgers Master II renders forces between the fingers and the palm [8]. Other gripper force feedback devices display forces directly between the index finger and thumb, which can be combined with other haptic modalities such as voice-coil actuators for contact and acceleration feedback [23]. Our Wolverine haptic device utilizes unidirectional brakes between three fingers and a thumb to render rigid body forces [10]. While these wearable devices


can provide kinesthetic feedback for grasping between fingers, they are unable to render external forces such as gravity.

Skin Stretch Feedback
Skin stretch (lateral or shear forces deforming the skin) can be perceived by different mechanoreceptors and is used in perception of texture, friction, slip, and force [21]. Skin stretch is also believed to aid in grasping and manipulating objects [20]. Cutaneous fingertip-based skin stretch displays for texture perception have been investigated by Hayward and collaborators using small (1 mm), low-displacement, vibrating piezo elements [17, 35]. Skin stretch has also been used to display friction [38].

Skin stretch can be perceived directionally, thus it has also been investigated to provide directional cues for applications such as driving guidance [36]. In addition, while many cutaneous haptic devices focus on fingertip feedback, skin stretch can also be used to display information to arms and other body locations [6, 19].

Skin stretch has also been used to display forces tangential to finger pads, to simulate gravity [29] or other forces [34, 43]. These devices use physical tactors in contact with the finger pad which are displaced laterally across the finger to stretch it. These tactors can be placed in a pen-type end effector to give force feedback and guidance cues [39], in a handheld device to create forces and torques [37], or mounted directly to the fingertip to render both contact force and weight [9, 41]. These devices can create large amounts of perceived skin stretch but rely on relatively large and bulky DC motors.

More recently, asymmetric, vibration-based skin stretch has been shown to enable perception of direction and pulling force [40]. This can be enabled by linear vibration actuators, such as piezo or linear resonant actuators, which can be compact and lightweight. Culbertson et al. modeled this behavior for voice coil actuators [12]. In addition to force output, asymmetric skin stretch can be used for wearable directional guidance cues [13]. To our knowledge, asymmetric skin stretch has not yet been investigated in the context of other haptic modalities, such as gripping force feedback.

Haptic Devices for Weight Simulation
Externally grounded haptic devices [27, 31, 14] can render external forces such as gravity. Another approach is to change the mass of the device by moving a fluid to an external reservoir [33]. Researchers have also moved the center of mass actively to give similar effects [47]. Other researchers have attempted to use ungrounded, wearable devices to simulate the weight of virtual objects using skin stretch [29, 41]. However, these devices utilize larger servo motors. Amemiya and Maeda created a slider-crank system to generate asymmetric vibrations and showed that these vibrations can be used to change the perceived heaviness of an object [3]. However, this system required the user to always keep the device oriented towards gravity, and the asymmetric vibrations were created using a bulky mechanical apparatus rather than a vibration actuator as in this paper.

DESIGN CONSIDERATIONS

Background: Grasping and manipulation
We receive a variety of haptic sensations while we grasp and manipulate an object. Imagine that there is a cup of coffee in front of you. First, you will notice that you made contact when you touch the cup with your fingers. Once you grasp the cup, you will notice its shape and size. Then, as you lift it, you will feel its weight. During this short interaction, a person perceives rich information about the cup through haptic feedback, such as texture, temperature, 3D shape, stiffness, and weight [25].

In this work, we focus on the force feedback (kinesthetic) aspects and do not explore stiffness, texture, or temperature components. Even though more information is needed for a fully immersive haptic experience, we believe basic force feedback for touching, grasping, lifting, and shaking can dramatically improve manipulation in current VR interfaces.

Force feedback is necessary for:

• Touching to recognize if we have reached or are in contact with an object.

• Grasping to recognize if we have successfully grasped an object, but also to stabilize our hand motion and haptically perceive object size.

• Gravity to recognize if we have lifted the object, but also to feel its weight.

• Inertia to recognize if we translate the object too fast.

Figure 2. Three directions of pad opposition grasp for holding an object.

Humans sense weight throughout the body [22]. Muscle spindles and Golgi tendon organs in the human arm sense weights proprioceptively [18]. At the same time, mechanoreceptors on finger pads sense weights by being pressurized or distorted laterally [21]. By combining proprioceptive and tactile (cutaneous) feedback, humans sense weights naturally. However, it would be desirable if we could create a weight sensation using just tactile feedback. Haptic systems creating proprioceptive weight sensation are generally bulky and heavy because they need to be grounded externally with multiple linkages. Minamizawa et al. have investigated the roles of proprioceptive and tactile (cutaneous) sensation in weight perception [30]. According to their work, tactile sensation without proprioceptive sensation provides certain perceptual cues that help differentiate weights.

While there are various ways to grasp objects, we have narrowed our focus to pad opposition grasps between the index finger,


middle finger, and thumb. We believe this grasp choice would be sufficient for basic manipulations, such as picking and placing virtual objects. In addition, research has yet to show that people can simultaneously integrate multiple directions of asymmetric skin stretch well [13]. This also reduces the weight and cost of the device. There are three directions in which the object can be held in the pad opposition type grip, as illustrated in Figure 2. We can see that the force direction created by object mass is parallel to the plane of the finger pads in Figure 2 (a) and (b). On the other hand, the force direction created by object mass is normal to the finger pad plane in Figure 2 (c). If we utilize cutaneous skin stretch as opposed to pressure or proprioceptive cues, then we cannot render weight in orientation (c).

To design a system which can provide feedback for touching, grasping, gravity, and inertia, we chose to combine a gripper-type device with cutaneous asymmetric skin stretch. To ensure the device performs well in our scenarios, we emphasize the optimization of the following design parameters:

• Weight. The device should be lightweight in order for the skin stretch to be perceived as weight, as the virtual forces have been shown to be low (< 30 g) [40] and weight perception acuity decreases as total weight increases [44].

• Motion range. A wide range of motion to grasp and manipulate different-sized objects.

• Mechanical complexity. Minimize the number of actuators,to reduce cost and weight.

• Anatomical alignment. The index finger and thumb should be parallel in alignment to receive consistent directional skin stretch cues. Misalignment can create confusion and unintentional torques.

• Stiffness. High stiffness for pad opposition forces. Gripforce of the human hand can exceed 100N [4].

• Performance. Accurate and fast position tracking to inte-grate into VR.

IMPLEMENTATION

Overall Structure
As shown in Figure 3, the device is composed of three rigid bodies: a base, a sliding part, and a swinging part. The base is mounted on the thumb, and it has retroreflective markers for an external optical motion capture system for tracking the thumb's position and orientation. The sliding part is mounted on the index finger and is connected with the base through a prismatic joint that is composed of two carbon fiber tubes. This single degree of freedom allows pinching motions for grasping objects. The brake mechanism on the sliding part contains a brake lever, a tendon, and a motor. The swinging part, which is connected to the sliding part and base through revolute joints (bearings), is composed of two voice coil actuators (Haptuator Mark II, Tactile Labs) and a prismatic joint made of carbon fiber tubes. Each voice coil actuator is placed close to the index finger and thumb pads so that it can transmit the desired vibration signals properly. The offset distance between

Figure 3. Overall mechanical structure and actuation mechanism. The index finger slides on rods for the pinching motion (range: 30 mm to 100 mm); the swinging part points downward due to its bearings, keeping the voice coils aligned with gravity; a small motor activates the brake for grasping haptic feedback; and the voice coil actuators vibrate for touching, gravity, and inertia haptic feedback. Total weight: 65 grams.

the revolute joint and the center of mass of the swinging part ensures that the direction of the voice coil actuators is always passively normal to the ground. The prismatic joint on the swinging part constrains the two angles of the voice coil actuators to be the same while allowing them to slide relative to one another. Most parts of the device are 3D printed using a Formlabs Form 2 printer (SLA technology), and the device weighs 65 grams.

Actuation for Force Feedback
Grabity contains two types of actuators: a brake mechanism and two voice coil actuators. The brake mechanism is used to create a rigid grasping force, while the voice coil actuators provide both a touch sensation at initial contact and the sensation of weight when lifting the object. Figure 4 shows the system diagram of Grabity and Figure 5 shows our custom circuit design for the voice coil actuators.

Touching: Conventional Vibration
When a user touches a virtual object without grasping, the voice coil actuators act like conventional vibrotactile transducers and play simple symmetric vibrations to indicate the point of initial contact. Without any haptic feedback it is difficult to detect contact with an object because users rely on visual feedback only. The contact vibration in our system was only played during transient events, not during steady-state contact. VerroTouch [24] showed that playing vibrations during transient events can render realistic contact sensations.

Figure 4. Mechatronic system block diagram.

Figure 5. Current-drive circuitry for two voice coil actuators.

The voice coil actuator on the index finger or the thumb vibrates individually depending on which finger is contacting the virtual object. While only contacting an object, the brake is open, so the sliding part floats on the base with some tolerance. Therefore, vibration transmitted to the other side of the device is not very perceptible.

Grasping: Unidirectional Brake Mechanism
When a user grasps a virtual object, the brake mechanism is activated to create a rigid passive force. We adapted this unidirectional brake mechanism from our Wolverine system [10]. The brake mechanism provides a locking force using a brake lever, which is activated by a small DC motor (6 mm). Once the brake is engaged, the motor is turned off and the user's own grasping force keeps the brake lever engaged. While the brake is engaged, it provides strong resistance that exceeds 100 N in the direction of the two fingers. However, when the user releases their grip, the brake disengages, allowing the user to open their hand. While Wolverine [10] used a rubber tendon to move the brake lever back to its original position when releasing a grip, we use two magnets (one on the brake lever and the other on the body of the device) to reset the lever, for more consistent and reliable performance.

Weight: Asymmetric Vibration
When a user lifts a virtual object, the voice coil actuators vibrate asymmetrically to generate the sensation of weight. If the magnet inside the voice coil actuator moves down quickly and moves up slowly, the skin on the user's finger pads is stretched asymmetrically. As described in the Overall Structure section and shown in the kinematic structure in Figure 3, the passive bearing mechanism orients the voice coils in the direction of gravity, thus we assume the asymmetric vibration is rendered normal to the ground. By changing the asymmetry of the vibration, Grabity simulates various weights.

Figure 6. Current signal generating asymmetric vibration. A is the amplitude of the current.

Figure 7. Block diagram of the VR application.

Figure 6 shows the shape of the commanded current pulses. The shape of these current pulses is designed to give the actuators a large step of current initially to cause a large acceleration in the magnet, then to ramp the current back down to slowly return the magnet to its starting position. To achieve this asymmetric actuation, a fast analog signal and current controller are required. Therefore, we used a Teensy 3.6 microcontroller (ARM Cortex-M4 at 180 MHz with two DACs) generating a 15 kHz analog signal output and a linear current-drive circuit. A current-drive circuit adds less effective damping to the system than a voltage-drive circuit, so it is more suitable for asymmetric vibration control [28]. Based on previous work from Culbertson et al., we chose a drive signal with a 40 Hz frequency (25 ms period) and 0.3 pulse width ratio (t1 = 7.5 ms and t2 = 17.5 ms) [12]. To simulate various weight sensations, we change the amplitude A of the signal, while keeping the frequency and pulse width ratio fixed.
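The drive signal parameters above (40 Hz frequency, 0.3 pulse width ratio, 15 kHz output rate) can be sketched as follows. The function name and the idealized square profile are illustrative; the actual pulse shaping in the device, including the slow ramp-down, may differ:

```python
import numpy as np

def asymmetric_current(A, freq=40.0, pulse_ratio=0.3, fs=15000.0, periods=3):
    """Sketch of the commanded current pulses (hypothetical implementation).

    A 40 Hz pulse train with a 0.3 pulse-width ratio: +A for t1 = 7.5 ms,
    then -A for t2 = 17.5 ms of each 25 ms period, sampled at the 15 kHz
    DAC output rate described in the text.
    """
    period = 1.0 / freq                        # 25 ms at 40 Hz
    t = np.arange(0.0, periods * period, 1.0 / fs)
    phase = np.mod(t, period)                  # position within the current period
    return np.where(phase < pulse_ratio * period, A, -A)
```

Only the amplitude `A` is varied to simulate different weights; frequency and pulse width ratio stay fixed, matching the text.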

Inertia: Asymmetric Vibration + Acceleration Feedback
When a user shakes or moves a virtual object quickly, the two voice coil actuators generate asymmetric vibrations proportional to the acceleration of the user's hand. This feedback control enables Grabity to render the inertia of the virtual object.
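A rough sketch of this feedback rule, mapping virtual mass and measured hand acceleration to a drive amplitude. The linear gain is a hypothetical stand-in for the paper's measured force-to-amplitude mapping, and the cap matches the largest amplitude shown in Figure 8:

```python
# Hypothetical gain from inertial force (N) to current amplitude (A);
# the real mapping comes from the paper's first user study.
K_AMP_PER_NEWTON = 1.0
A_MAX = 0.825  # largest drive amplitude shown in Figure 8

def inertia_amplitude(mass_kg, hand_accel_z):
    """Sketch: scale the asymmetric-vibration amplitude with the virtual
    inertial force F = m*a, capped at the maximum drivable amplitude.
    Direction (sign) handling and gravity are omitted here."""
    force = abs(mass_kg * hand_accel_z)   # inertial force magnitude
    return min(K_AMP_PER_NEWTON * force, A_MAX)
```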


Figure 8. Acceleration change by changing the magnitude of the asymmetric vibration, for current amplitudes A from 0.083 to 0.825. A is the amplitude of the current output.

However, with the arrangement of the voice coil actuators in Grabity, the asymmetric vibration can only generate virtual forces in the direction of the axis of the voice coils (1 DOF).

Sensing
An OptiTrack motion tracking system is used to track the position and orientation of the thumb. A magnetic encoder is attached to the index finger sliding assembly to track its position relative to the thumb. The encoder is friction driven and rolls on the prismatic joint (carbon fiber tube). Using the data from the motion tracking system and magnetic encoder, we can render the user's thumb and index finger in VR.

The resolution of the magnetic encoder and friction drive assembly was measured to be 2 mm of linear travel. This resolution is sufficient based on just noticeable difference (JND) results for finger-to-thumb distance perception [42]. Our Wolverine system [10] had a Time-of-Flight sensor to measure this distance with a resolution of 1 mm; however, it also had ±1 mm noise at a 100 Hz sampling rate. By adopting the magnetic encoder, we achieve much lower noise and a much higher sampling rate (kHz), at the cost of resolution.

Software: Virtual Reality Haptics Framework
The Grabity software framework is implemented in C++ and uses multiple software libraries. As a virtual haptic device, Grabity requires knowledge of its position as input and produces force as output. The information flow begins with position tracking of Grabity with the OptiTrack motion tracking system. The framework gets the 6-DOF pose through the Motive C++ API. The grasping distance is transferred over the Teensy's serial link (2,000,000 bits per second) from the encoder in the Grabity device's slider.

In CHAI3D (version 3.2.0), Grabity is represented as a subclass of cGenericHapticDevice that accesses both the device position and the grasping distance. CHAI3D's integration with the Open Dynamics Engine (ODE) renders the physical interactions. The display appears on the Oculus VR headset, and the force output given by ODE is passed along to the cGrabityDevice class. The force is further separated into its components as it is sent to the Teensy microcontroller.

Mass Simulation during Grasping
CHAI3D provides the output force, torque, and gripper force to the custom haptic device. Because Grabity has only two modes of actuation, these virtual values need to be converted into a voice coil signal and a command to lock the slider. The locking occurs when the gripper force (the force pushing apart the thumb and finger, i.e., from gripping a block) is greater than an empirically determined threshold of 0.7 N. This value was chosen to avoid locking when one finger strikes a block, but to trigger locking quickly when a block is grasped.
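The locking rule above can be sketched as a simple threshold test using the paper's 0.7 N value. The lower release bound is our added assumption (a small hysteresis band to avoid chattering near the threshold), not something the paper specifies.

```python
LOCK_THRESHOLD_N = 0.7    # empirically determined threshold from the paper
UNLOCK_THRESHOLD_N = 0.5  # assumed hysteresis band, not from the paper

def update_brake(gripper_force_n, locked):
    """Decide whether the uni-directional brake should be locked.

    Lock when the gripper force exceeds the threshold; release only once
    it falls below the lower bound, so brief force fluctuations around
    the threshold do not toggle the brake.
    """
    if not locked and gripper_force_n > LOCK_THRESHOLD_N:
        return True
    if locked and gripper_force_n < UNLOCK_THRESHOLD_N:
        return False
    return locked
```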

Determining the voice coil signal is more complex. First, we must assume the voice coil is always pointing downward. This assumption is not always correct, as the coils swing over a limited range and only in one dimension. However, we have found that most hand orientations the subjects use are sufficiently close to this approximation. As such, we use the output force's z-component and ignore the other two.

Second, we must separately extract the downward force for each of the two fingers. This information is encoded in the torque. As before, we approximate the voice coil directions as downward. We thus project the finger-to-thumb vector onto the ground plane, use that vector to convert the torque back to a force, and add it to the z-output force. This force value is transmitted over the serial link to the Teensy microcontroller.
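One way to realize the decomposition described above is sketched below. The function name, sign convention, and degenerate-case handling are ours; the paper only describes the projection and torque-to-force conversion in prose.

```python
def per_finger_downward_forces(force_z, torque, finger_to_thumb):
    """Split the net vertical force between the index finger and thumb.

    force_z: z-component of the net force from the physics engine (N)
    torque: torque about the horizontal axis perpendicular to the grasp (N*m)
    finger_to_thumb: vector from index finger pad to thumb pad (m)

    Assumes both voice coils point straight down, so only the horizontal
    separation between the pads acts as a lever arm for the torque.
    """
    # Project the finger-to-thumb vector onto the ground plane.
    gx, gy = finger_to_thumb[0], finger_to_thumb[1]
    lever = (gx * gx + gy * gy) ** 0.5   # horizontal distance between pads
    if lever < 1e-6:                     # pads vertically aligned: no torque split
        return force_z / 2.0, force_z / 2.0
    # A torque about the grasp center is realized as equal and opposite
    # vertical forces on the two pads separated by the lever arm.
    differential = torque / lever
    finger = force_z / 2.0 + differential
    thumb = force_z / 2.0 - differential
    return finger, thumb
```

For example, a 1 N downward net force with zero torque splits evenly between the pads, while a nonzero torque shifts force from one pad to the other without changing their sum.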

In the Teensy, the force is mapped to a voice coil signal. We use the data from the first user study, below, to construct the mapping from virtual force to amplitude. The amplitude of the signal is capped, so that forces larger than the voice coil can express are rendered at the maximum possible force.
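The capped mapping can be sketched as a clamped interpolation over the amplitude band used in the evaluation. The paper derives its actual mapping from the first user study; the linear shape and the maximum renderable force below are illustrative assumptions only.

```python
A_MIN, A_MAX = 0.248, 0.413   # amplitude range used in the evaluation
F_MAX_N = 0.3                 # assumed maximum renderable virtual force (N)

def force_to_amplitude(virtual_force_n):
    """Map a virtual force magnitude to a voice coil current amplitude.

    A linear map is assumed here for illustration. Forces beyond the
    renderable range are clamped, so they are expressed at the maximum
    amplitude the voice coil supports.
    """
    magnitude = min(abs(virtual_force_n), F_MAX_N)
    return A_MIN + (A_MAX - A_MIN) * magnitude / F_MAX_N
```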

EVALUATION
We evaluate the performance of Grabity in both quantitative and qualitative ways. First, we investigate acceleration changes versus the magnitude of asymmetric signals to find the optimal range of current to generate compelling virtual forces. Second, we quantify the magnitude of virtual forces with respect to the normal direction to the ground (down and up) in a user study. We also investigate whether we can change the perceived magnitude of virtual forces by changing the amplitude of the asymmetric vibrations. Third, in a second user study,


Figure 9. Acceleration asymmetry, abs(max acceleration) − abs(min acceleration) [G], vs. amplitude of current (A). A −1 G reference line is marked; asymmetry increases with amplitude over the middle range.

based on the quantitative data, we give different amounts of virtual force to users in a virtual environment and record qualitative feedback.

Acceleration Asymmetry vs. Current Magnitude
It is important that the users' finger pads receive the desired asymmetric vibration to feel the intended virtual force feedback. To evaluate the performance of the commanded asymmetric vibration, we attached an accelerometer (ADXL150) to the base part and measured the acceleration of the device while asymmetric vibrations were played. The device was fixed with clamps to the ground.

We measured acceleration at different magnitudes to determine the relationship between them. The voice coil's current amplifier operates between −0.825 A and 0.825 A (with a gain of 0.5 A/V and a −1.65 V to 1.65 V signal range from the Teensy). We evenly divided the amplifier's current range into 10 different magnitudes of current and played asymmetric vibrations at each magnitude through the voice coils. Figure 8 shows the resulting measured accelerations. Note that positive values are directed upward from the ground, and negative values are directed downward towards the ground.
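The commanded current waveform can be sketched as a biased pulse train: a short, strong lobe in one direction balanced by a longer, weaker lobe in the other, so the net impulse over a period is zero while the peak accelerations are asymmetric. The 25 ms period matches the pulse spacing visible in Figure 10; the exact lobe durations below are illustrative assumptions.

```python
def asymmetric_waveform(amplitude, t_ms, period_ms=25.0, pulse_ms=5.0):
    """Current command at time t_ms for a downward asymmetric vibration.

    A short strong pulse of length pulse_ms is followed by a longer weak
    pulse of opposite sign, scaled so the integral of current over one
    period is zero (no net drift, but asymmetric peak acceleration).
    """
    phase = t_ms % period_ms
    if phase < pulse_ms:
        return amplitude                              # strong, short lobe
    weak = amplitude * pulse_ms / (period_ms - pulse_ms)
    return -weak                                      # weak, long lobe balances impulse
```

Reversing the sign of the returned values yields the upward-directed variant; a symmetric vibration simply alternates equal-magnitude lobes.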

To more clearly see the asymmetry of the accelerations, we subtracted the absolute value of the minimum acceleration from the absolute value of the maximum acceleration. The results are shown in Figure 9. When the amplitude is small (A < 0.21), the current is not large enough to generate asymmetry in the accelerations. For current values in the middle range (0.21 < A < 0.42), the asymmetry of acceleration increases proportionally to the amplitude of current. After a certain magnitude (A > 0.42), an increase in current resulted in a decrease in asymmetry of the acceleration signal. The direction of the asymmetric acceleration switched when the magnitude increased even further (A > 0.7). This is due to a limitation of the voice coil actuator we chose. The coil and permanent magnet are connected with elastic membranes. Once the current exceeds a certain threshold (A > 0.42), spring forces from the membranes become so large that the magnet cannot move further, and it receives a large force pulling it backward. A voice coil actuator with a softer membrane, or none at all, would help this in the future. Based on this experiment, we chose amplitudes (A) of 0.248, 0.330 and 0.413 for the following evaluation.
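The asymmetry metric plotted in Figure 9 is straightforward to compute from an accelerometer trace:

```python
def acceleration_asymmetry(samples):
    """Asymmetry metric from Figure 9: abs(max) - abs(min) of the trace.

    samples: accelerometer readings in G over one or more periods.
    A negative value means the downward peak exceeds the upward peak.
    """
    return abs(max(samples)) - abs(min(samples))
```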

Figure 10. Virtual forces through asymmetric vibration were quantified in 3 conditions: Downward Asymmetric vs. Symmetric (top), Upward Asymmetric vs. Symmetric (middle), and Downward Asymmetric vs. Upward Asymmetric (bottom). Each panel plots current (A) over time (ms); 3 different current magnitudes were tested in each condition.

User Study 1: Virtual Force Measurement
The purpose of this task is to quantify the magnitude of virtual forces. First, we evaluate the perceived magnitude of the virtual force by changing the amplitude of the asymmetric signal. Second, we aim to measure the virtual force under three different conditions that create weight sensations: downward directed force, upward directed force, and the perceived force difference between downward and upward directed forces. Figure 10 shows the condition sets for virtual force quantification. To achieve the first goal, we use three different magnitudes (A ∈ {0.248, 0.330, 0.413}) chosen for their different acceleration profiles, as shown in Figure 9. To achieve the second goal, we compare downward asymmetric vibrations with symmetric vibrations, upward asymmetric vibrations with symmetric vibrations, and downward asymmetric vibrations with upward asymmetric vibrations. The two compared vibrations always have the same current amplitude.

We have two hypotheses:

Hypothesis 1: The magnitude of perceived virtual forces can be manipulated with the amplitude of the asymmetric vibration signal, for both downward and upward directed forces (normal to ground).

Hypothesis 2: Compared with a baseline upward virtual force reference, a downward virtual force will increase the perceived force difference.

Task and Procedure
During this task, participants compared two weights multiple times, relying on the virtual force sensation. We used the staircase method [11] to measure the amount of virtual force, with a reference weight of 65 grams (the device weight) and a step size of 5.4 grams, which was decided based on pilot tests. Physical weights were added to one of the comparison vibration patterns, as shown in Figure 11 (right), according to the staircase method. Participants were wearing a blindfold, earplugs, and


Table 1. Weight equivalence p-values, Bonferroni corrected. **p<0.005, ***p<0.0005; "+" means the adjusted p-value is greater than 1.

Signal       Low, Down   Mid, Down   High, Down  Low, Up     Mid, Up     High, Up    Low, Both   Mid, Both
Mid, Down    0.16
High, Down   0.0014**    +
Low, Up      0.23        <0.0001***  <0.0001***
Mid, Up      0.0006**    <0.0001***  <0.0001***  +
High, Up     <0.0001***  <0.0001***  <0.0001***  0.20        +
Low, Both    +           0.033       0.0002***   0.96        0.0043**    0.0001***
Mid, Both    0.49        +           +           <0.0001***  <0.0001***  <0.0001***  0.11
High, Both   0.0004***   +           +           <0.0001***  <0.0001***  <0.0001***  <0.0001***  +

Figure 11. Virtual force measuring experiment setup. Right: Device with added physical weights.

noise-canceling headphones playing white noise to eliminate possible effects that may arise from visual and audio cues. To avoid possible effects from different sizes, the device was always fixed to give users a 70 mm grasping distance.

First, participants were trained for 5 minutes on how to grasp the device correctly and were shown the different asymmetric vibration cues. Then, participants were guided through a staircase procedure to determine the perceived force. The experimenter handed the device to participants. Once the participants said they were ready to measure the weight, the experimenter set the baseline signal, and the participants moved their hands up and down three times to measure and remember its weight. Once they stopped the motion, the experimenter took the device, changed the vibration signal, and gave the device back to the participants. Participants repeated the motion, then decided which one was heavier. The experimenter then added the step-sized physical weight to the signal that participants determined felt lighter. This procedure was repeated until the "lighter" signal reversed (from baseline signal + weight to asymmetric signal or vice versa) three times. The recorded weight is the average of the three reversal points.
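The staircase bookkeeping above can be sketched as follows. This is a simplified one-sided model for illustration: it tracks a single signed "added weight" quantity, whereas in the study the physical weight could be added to either stimulus. The step size (5.4 g) and three-reversal stopping rule follow the paper.

```python
STEP_G = 5.4  # staircase step size in grams, from pilot tests

def run_staircase(responses, n_reversals=3):
    """Track added weight through a simplified staircase procedure.

    responses: iterable of booleans, True if the participant judged the
    weighted reference side heavier on that trial. Weight is added to
    whichever side felt lighter, so a True response decreases the added
    weight and a False response increases it. Returns the mean added
    weight at the first n_reversals reversal points.
    """
    weight = 0.0
    reversals = []
    last = None
    for heavier in responses:
        if last is not None and heavier != last:
            reversals.append(weight)
            if len(reversals) == n_reversals:
                break
        weight += -STEP_G if heavier else STEP_G
        last = heavier
    return sum(reversals) / len(reversals)
```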

Participants
12 participants (6 female, 6 male) were recruited for the experiment. All participants were right-handed and 24–29 years old. Their hand sizes (length from the bottom of the palm to the end of the middle finger) varied from 14–21 cm. Four participants responded that they had never tried haptic devices besides those in cellphones or commercial game controllers. The subjects received a nominal compensation ($15) for their participation.

Results
The results of this first user study are shown in Figure 12. We analyzed the results using a one-way repeated measures ANOVA. We found a significant effect of Signal on Weight (F(8,90)=12.1, p<0.001, η2=0.51 with CI=[0.30, 0.57]). Bonferroni-corrected significance measures are given in Table 1.

Figure 12. Result of the virtual force measurement task: effective weight (grams) for the Down, Up, and Both signals at Low, Mid, and High amplitudes.

Discussion
Our first hypothesis, that the magnitude of the virtual force can be manipulated with the amplitude of the asymmetric signal, was supported by the data. Most of the comparisons between upward and downward forces were significant. In addition, the low and the high amplitude signals in the downward force case were shown to be significantly different. However, there was not any statistical significance between the Mid Down and other Down conditions. We believe this could be due to the nonlinearity of the asymmetric responses at the three current levels. As shown in Figure 9, the higher current levels, which correspond to Mid Down and High Down, were closer in asymmetry. Another effect may be the time between the comparisons, as well as the fact that the step size for the reference weight was close to the just noticeable difference for weight perception. In addition, finger pad placement and grip force have a strong effect on the asymmetric skin stretch. Although we instructed the participants to keep a consistent grip throughout the experiment, given the duration of the study and the number of times the device was gripped, it seems likely that the grip was inconsistent within and certainly between users. Finally, finger size and other anthropometric factors seem to play a role in sensitivity to asymmetric skin


Figure 13. Integration of Grabity with the VR setup for the tasks.

stretch. Thus, while the results are encouraging, we believe further study is needed to collect more controlled data. However, these results should give a good impression of real-world performance, where many different users might use a generic haptic device.

However, we found no evidence to support our second hypothesis, that providing a baseline upward force to compare the downward force against would increase the perceived force difference. This result is particularly interesting, because we have modeled the effect of asymmetric skin deformation as an additive factor. This could be due to the time between the stimuli: it took some time for the experimenter to add weight, so changes were not instantaneous. An immediate up-then-down signal may provide a more noticeable delta. In addition, because the physical weights were added most often to the up reference signal, users had conflicting weight cues. Some users reported that they perceived the physical weights as mass and the up signal as a pulling force, not as less mass. This suggests that physical weight or mass cannot be "offset" by a virtual force.

User Study 2: Discriminating Weights in a Virtual Environment
In the first user study, we investigated whether different magnitudes of asymmetric vibrations create different perceived virtual forces. In this section, we test the ability of users to discriminate between different simulated weights with the VR setup shown in Figure 13. Furthermore, we gather qualitative feedback about using Grabity for VR applications.

Task and Procedure
In a virtual reality environment, the participant manipulates three blocks with different virtual masses using Grabity; see Figure 14. The experimenter instructs the participant to sort these blocks in the manipulation space from lightest on the left to heaviest on the right. The force of weight from a block is simulated by one of the three voice coil signals used during the first user study (A = 0.248, 0.330, 0.413). The participant completes 6 rounds of the sorting experiment. To avoid any effect of color on the perceived weight of objects, we randomized the colors for every round. We also informed users before starting the task that colors have no relationship with weights.

Figure 14. Weight sorting task.

Figure 15. Free activity using Grabity in VR for qualitative feedback.

After the 6 rounds, the experimenter changed the virtual reality environment to contain four objects: a small box, a large box, a flat box, and a cylinder. The participant was allowed to freely explore these objects for 3 minutes before being asked to fill out a qualitative questionnaire.

In order for the participants to become accustomed to the skin deformation actuator and the handheld haptic device, they were shown the maximum asymmetric vibration upward and downward cues before the study began. Then, in the first virtual environment in the study, there is one virtual block, which the experimenter instructs the subject to manipulate until they are comfortable picking up and moving it. As in the first user study, the participant wore a headset, earplugs, and headphones playing white noise.

The power of the signal the participant feels does not respond to acceleration or tilting, only to whether the user is grasping a block and the mass of that block. However, during the free-play exploration mode, the actuator simulates a continuous range of forces. During this mode, natural physical effects, like forces and torques due to acceleration, are rendered by Grabity. We wished to show the widest possible expressiveness of our device when receiving qualitative feedback.

Participants
5 participants (1 female, 4 male) participated in the task. Participants in this group did not overlap with participants of the first user study. All participants were right-handed and 23–28 years old. 2 participants responded that they had never tried VR applications. Participants did not receive compensation for this short task.

Results
The confusion matrix is shown in Figure 16.

Using the Kruskal-Wallis nonparametric test, we show a significant effect of virtual weight on ordering (χ2(2)=30.95, p < 0.001). A post-hoc test using Mann-Whitney tests with Bonferroni correction showed significant differences among all groups. The groups Light and Heavy are different (p < 0.001, r=1.06), Light and Medium are different (p < 0.001, r=0.79), and Medium and Heavy are different (p < 0.05, r=0.50).


Figure 16. Confusion matrix of sorting different weights task.

Figure 17. Medians from Likert scale survey questions (Comfort, Discernibility, Confidence, Ease of Manipulation, Weight over Vibration, Realism; 1–5 scale) in blue. Modes are marked by yellow circles.

The post-test questionnaire contains six Likert scale questions about the usability of the device, shown in Figure 17.

We collected user statements on the holistic experience of Grabity as well as on the sensation of grasping virtual objects. Every user mentioned that the weight-sorting task was difficult: “...oftentimes I could not tell the difference between the two weights,” and “distinguishing between similar sized weights was quite challenging.” Users appreciated the ungrounded quality of Grabity: “It is nice to be able to grab virtual objects without being tethered,” and were surprised at the virtual force sensation: “In the ‘heaviest’ setting it really felt like my hand was being pulled down, and on the ‘lightest’ my hand felt like it was floating up.”

Discussion
The users’ rankings showed significant differences among the different virtual weights. However, as evidenced by the qualitative results, users found it difficult to tell which block weighed the most. In addition, users felt stronger vibration cues than weight cues. The amplitude of the signal, as opposed to the virtual weight, is a confounding factor.

LIMITATIONS AND FUTURE WORK
While our results indicate that asymmetric skin stretch can indeed enhance grasping interactions in VR, we also identified numerous challenges, which we hope to address in future work. We are particularly interested in quantifying the effect of vibration on force rendering realism. While we are able to mask the vibration sounds using earplugs and noise-canceling

headphones, there is still a perceivable vibratory effect, and we would like to better understand its influence on the force sensation. Further, we would like to quantify how the perception of force rendering changes over time, as it is known that our mechanoreceptors gradually accommodate and cancel out vibration feedback. In addition, because Grabity uses a pair of voice coil actuators, torques could theoretically be rendered. While we have experienced this in the lab, further study to quantify and understand this effect is necessary.

We are also interested in exploring different types of voice coil actuators or other actuation technologies that can enable stronger effects. Additionally, sensing of grip force (through pressure sensors) and finger pad placement (through capacitive sensing arrays) could help provide more robust and consistent virtual forces for different users and across different grip styles. Attaching the device to the user's fingers (important for allowing them to open the gripper) also continues to pose problems, both for dampening and for transmitting vibrations to the dorsal area of the finger, which can limit the skin stretch cues. Better techniques and mechanical designs for finger pad grounding need to be explored. Beyond this, a large limitation remains that asymmetric skin stretch can only render virtual forces tangential to the finger pad, not normal to it. One approach may be to combine skin stretch with a rotational momentum-based pseudo-force rendering technique, such as motor rotational acceleration [46], which could provide the sensation of torques in an axis that asymmetric skin stretch cannot support.

Finally, we would like to formally evaluate our approach in comparison with grounded, stationary haptic devices. This would, for example, allow us to assess the impact of our limitation of only supporting lateral motions on the skin stretching surface. Through the insight gathered from additional experiments, we would hope to provide a broader set of design guidelines for mobile, ungrounded haptics.

CONCLUSIONS
In this work, we introduced Grabity, which focuses on enhancing the grasping sensation for virtual objects through kinesthetic feedback for contact, grasping, gravity, and inertia. We contribute a novel, unified design based on the combination of vibrotactile feedback, uni-directional brakes, and asymmetric skin stretch. Our gripper-style haptic device generates opposing forces between the index finger and thumb, and uses a skin stretch mechanism to render inertia and mass.

While portable devices exist that provide haptic feedback, they are generally limited in the type of sensation that can be generated, given the difficulty of simulating weight and inertia without mechanically grounded hardware. Grabity demonstrates how such limitations can be overcome in mobile, ungrounded mechanical designs, while advancing research into asymmetric skin stretch as a means of haptic force rendering.

ACKNOWLEDGMENTS
We would like to thank Chris Ploch, Alexa Siu, and Mathieu Le Goc for pilot studies, figures, and videos. This work was funded in part by a Google Faculty Research Award.


REFERENCES
1. CyberGrasp, CyberGlove Systems Inc. http://www.cyberglovesystems.com/cybergrasp/. (????). Accessed: 2017-04-03.

2. Tomohiro Amemiya and Hiroaki Gomi. 2014. Distinct pseudo-attraction force sensation by a thumb-sized vibrator that oscillates asymmetrically. In International Conference on Human Haptic Sensing and Touch Enabled Computer Applications. Springer, 88–95.

3. Tomohiro Amemiya and Taro Maeda. 2008. Asymmetric oscillation distorts the perceived heaviness of handheld objects. IEEE Transactions on Haptics 1, 1 (2008), 9–18.

4. AA Amis. 1987. Variation of finger forces in maximal isometric grasp tests on a range of cylinder diameters. Journal of Biomedical Engineering 9, 4 (1987), 313–320.

5. Federico Barbagli, Kenneth Salisbury, and Roman Devengenzo. 2004. Toward virtual manipulation: from one point of contact to four. Sensor Review 24, 1 (2004), 51–59.

6. Karlin Bark, Jason Wheeler, Gayle Lee, Joan Savall, and Mark Cutkosky. 2009. A wearable skin stretch device for haptic feedback. In EuroHaptics Conference, 2009 and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems. World Haptics 2009. Third Joint. IEEE, 464–469.

7. Hrvoje Benko, Christian Holz, Mike Sinclair, and Eyal Ofek. 2016. NormalTouch and TextureTouch: High-fidelity 3D Haptic Shape Rendering on Handheld Virtual Reality Controllers. In Proceedings of the 29th Annual Symposium on User Interface Software and Technology. ACM, 717–728.

8. Mourad Bouzit, Grigore Burdea, George Popescu, and Rares Boian. 2002. The Rutgers Master II-new design force-feedback glove. IEEE/ASME Transactions on Mechatronics 7, 2 (2002), 256–263.

9. Francesco Chinello, Monica Malvezzi, Claudio Pacchierotti, and Domenico Prattichizzo. 2015. Design and development of a 3RRS wearable fingertip cutaneous device. In Advanced Intelligent Mechatronics (AIM), 2015 IEEE International Conference on. IEEE, 293–298.

10. Inrak Choi, Elliot W Hawkes, David L Christensen, Christopher J Ploch, and Sean Follmer. 2016. Wolverine: A wearable haptic interface for grasping in virtual reality. In Intelligent Robots and Systems (IROS), 2016 IEEE/RSJ International Conference on. IEEE, 986–993.

11. Tom N Cornsweet. 1962. The staircase-method in psychophysics. The American Journal of Psychology 75, 3 (1962), 485–491.

12. Heather Culbertson, Julie M Walker, and Allison M Okamura. 2016. Modeling and design of asymmetric vibrations to induce ungrounded pulling sensation through asymmetric skin displacement. In Haptics Symposium (HAPTICS), 2016 IEEE. IEEE, 27–33.

13. Heather Culbertson, Julie M Walker, Michael Raitor, and Allison M Okamura. 2017. WAVES: A Wearable Asymmetric Vibration Excitation System for Presenting Three-Dimensional Translation and Rotation Cues. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems. ACM, 4972–4982.

14. Takahiro Endo, Haruhisa Kawasaki, Tetsuya Mouri, Yasuhiko Ishigure, Hisayuki Shimomura, Masato Matsumura, and Kazumi Koketsu. 2011. Five-fingered haptic interface robot: HIRO III. IEEE Transactions on Haptics 4, 1 (2011), 14–27.

15. Xiaochi Gu, Yifei Zhang, Weize Sun, Yuanzhe Bian, Dao Zhou, and Per Ola Kristensson. 2016. Dexmo: An Inexpensive and Lightweight Mechanical Exoskeleton for Motion Capture and Force Feedback in VR. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems. ACM, 1991–1995.

16. Sidhant Gupta, Tim Campbell, Jeffrey R Hightower, and Shwetak N Patel. 2010. SqueezeBlock: using virtual springs in mobile devices for eyes-free interaction. In Proceedings of the 23rd Annual ACM Symposium on User Interface Software and Technology. ACM, 101–104.

17. Vincent Hayward and M Cruz-Hernandez. 2000. Tactile display device using distributed lateral skin stretch. In Proceedings of the Haptic Interfaces for Virtual Environment and Teleoperator Systems Symposium, Vol. 69. ASME, 1309–1314.

18. James Houk and William Simon. 1967. Responses of Golgi tendon organs to forces applied to muscle tendon. Journal of Neurophysiology 30, 6 (1967), 1466–1481.

19. Alexandra Ion, Edward Jay Wang, and Patrick Baudisch. 2015. Skin drag displays: Dragging a physical tactor across the user's skin produces a stronger tactile stimulus than vibrotactile. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems. ACM, 2501–2504.

20. RS Johansson and G Westling. 1984. Roles of glabrous skin receptors and sensorimotor memory in automatic control of precision grip when lifting rougher or more slippery objects. Experimental Brain Research 56, 3 (1984), 550–564.

21. Kenneth O Johnson. 2001. The roles and functions of cutaneous mechanoreceptors. Current Opinion in Neurobiology 11, 4 (2001), 455–461.

22. Lynette A Jones. 1986. Perception of force and weight: Theory and research. Psychological Bulletin 100, 1 (1986), 29.

23. Rebecca Khurshid, Naomi Fitter, Elizabeth Fedalei, and Katherine Kuchenbecker. 2016. Effects of Grip-Force, Contact, and Acceleration Feedback on a Teleoperated Pick-and-Place Task. IEEE Transactions on Haptics (2016).

24. Katherine J Kuchenbecker, Jamie Gewirtz, William McMahan, Dorsey Standish, Paul Martin, Jonathan Bohren, Pierre J Mendoza, and David I Lee. 2010. VerroTouch: High-frequency acceleration feedback for telerobotic surgery. In International Conference on Human Haptic Sensing and Touch Enabled Computer Applications. Springer, 189–196.

25. Susan J Lederman and Roberta L Klatzky. 1987. Hand movements: A window into haptic object recognition. Cognitive Psychology 19, 3 (1987), 342–368.

26. Christine L MacKenzie and Thea Iberall. 1994. The grasping hand. Vol. 104. Elsevier.

27. Thomas H Massie, J Kenneth Salisbury, and others. 1994. The PHANTOM haptic interface: A device for probing virtual objects. In Proceedings of the ASME Winter Annual Meeting, Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, Vol. 55. 295–300.

28. William McMahan and Katherine J Kuchenbecker. 2014. Dynamic modeling and control of voice-coil actuators for high-fidelity display of haptic vibrations. In Haptics Symposium (HAPTICS), 2014 IEEE. IEEE, 115–122.

29. Kouta Minamizawa, Souichiro Fukamachi, Hiroyuki Kajimoto, Naoki Kawakami, and Susumu Tachi. 2007a. Gravity grabber: wearable haptic display to present virtual mass sensation. In ACM SIGGRAPH 2007 Emerging Technologies. ACM, 8.

30. Kouta Minamizawa, Hiroyuki Kajimoto, Naoki Kawakami, and Susumu Tachi. 2007b. A wearable haptic display to present the gravity sensation - preliminary observations and device design. In EuroHaptics Conference, 2007 and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems. World Haptics 2007. Second Joint. IEEE, 133–138.

31. Jun Murayama, Laroussi Bougrila, YanLin Luo, Katsuhito Akahane, Shoichi Hasegawa, Béat Hirsbrunner, and Makoto Sato. 2004. SPIDAR G&G: a two-handed haptic interface for bimanual VR interaction. In Proceedings of EuroHaptics, Vol. 2004. 138–146.

32. Zoran Najdovski and Saeid Nahavandi. 2008. Extending haptic device capability for 3D virtual grasping. Haptics: Perception, Devices and Scenarios (2008), 494–503.

33. Ryuma Niiyama, Lining Yao, and Hiroshi Ishii. 2014. Weight and volume changing device with liquid metal transfer. In Proceedings of the 8th International Conference on Tangible, Embedded and Embodied Interaction. ACM, 49–52.

34. Claudio Pacchierotti, Gionata Salvietti, Irfan Hussain, Leonardo Meli, and Domenico Prattichizzo. 2016. The hRing: A wearable haptic device to avoid occlusions in hand tracking. In Haptics Symposium (HAPTICS), 2016 IEEE. IEEE, 134–139.

35. Jérôme Pasquero and Vincent Hayward. 2003. STReSS: A practical tactile display system with one millimeter spatial resolution and 700 Hz refresh rate. In Proc. Eurohaptics, Vol. 2003. 94–110.

36. Christopher J Ploch, Jung Hwa Bae, Wendy Ju, and Mark Cutkosky. 2016. Haptic skin stretch on a steering wheel for displaying preview information in autonomous cars. In Intelligent Robots and Systems (IROS), 2016 IEEE/RSJ International Conference on. IEEE, 60–65.

37. William Provancher. 2014. Creating greater VR immersion by emulating force feedback with ungrounded tactile feedback. IQT Quarterly 6, 2 (2014), 18–21.

38. William R Provancher and Nicholas D Sylvester. 2009. Fingerpad skin stretch increases the perception of virtual friction. IEEE Transactions on Haptics 2, 4 (2009), 212–223.

39. Zhan Fan Quek, Samuel B Schorr, Ilana Nisky, William R Provancher, and Allison M Okamura. 2015. Sensory substitution and augmentation using 3-degree-of-freedom skin deformation feedback. IEEE Transactions on Haptics 8, 2 (2015), 209–221.

40. Jun Rekimoto. 2014. Traxion: a tactile interaction device with virtual force sensation. In ACM SIGGRAPH 2014 Emerging Technologies. ACM, 25.

41. Samuel B Schorr and Allison M Okamura. 2017. Fingertip Tactile Devices for Virtual Object Manipulation and Exploration. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems. ACM, 3115–3119.

42. Hong Z Tan, Xiao Dong Pang, Nathaniel I Durlach, and others. 1992. Manual resolution of length, force, and compliance. Advances in Robotics 42 (1992), 13–18.

43. Nikolaos G Tsagarakis, T Horne, and Darwin G Caldwell. 2005. Slip aestheasis: A portable 2D slip/skin stretch display for the fingertip. In EuroHaptics Conference, 2005 and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, 2005. World Haptics 2005. First Joint. IEEE, 214–219.

44. Ernst Heinrich Weber, Helen Elizabeth Ross, and David J Murray. 1996. E.H. Weber on the Tactile Senses. Psychology Press.

45. Vibol Yem, Ryuta Okazaki, and Hiroyuki Kajimoto. 2016a. FinGAR: combination of electrical and mechanical stimulation for high-fidelity tactile presentation. In ACM SIGGRAPH 2016 Emerging Technologies. ACM, 7.

46. Vibol Yem, Ryuta Okazaki, and Hiroyuki Kajimoto. 2016b. Vibrotactile and pseudo force presentation using motor rotational acceleration. In Haptics Symposium (HAPTICS), 2016 IEEE. IEEE, 47–51.

47. André Zenner and Antonio Krüger. 2017. Shifty: A Weight-Shifting Dynamic Passive Haptic Proxy to Enhance Object Perception in Virtual Reality. IEEE Transactions on Visualization and Computer Graphics 23, 4 (2017), 1285–1294.