



A Model for Sharing Haptic Interaction
M. Nakao, IEEE Member, R. Kitamura, T. Sato, and K. Minato, IEEE Member

Abstract— In this paper, we present new methods for sharing haptic interactions with other people when touching real-world objects. Unlike previous approaches, our system does not rely on a virtual model of the object being explored or on the integration of visual/auditory modalities to augment the user's perception. We developed a prototype system that captures active haptic interactions on a transmitting side and then displays them passively to a receiving side. To demonstrate the concept, we conducted human experiments using rubber samples and found that the relationship between the haptic sensations of the transmitting and receiving sides follows a power function, and that the parameters of this function can be calibrated for the users of the system.

Index Terms— haptic sharing, haptic transfer, interaction capture and presence


1 INTRODUCTION

Haptic sensations are perceived through direct contact with objects. Understandably, this makes it difficult for one person to experience the same sensations as another person at the same time. Physical limitations prevent two people from simultaneously pressing their fingers onto the same location. This simple constraint hinders efforts to accurately convey haptic information to other people. Once it becomes possible to accurately transmit and share haptic information among multiple people, systems can be developed to enrich human communication in activities such as education and entertainment. For example, haptic sharing could be used to transmit the sensations of technical tasks such as palpation by communicating 'the feel' of objects in distant locations [1][2]. To realize such systems, many researchers are exploring virtual reality and robotics technologies with which to share haptic information perceived by other people.

Earlier research into haptic sharing has focused on the sharing of haptic sensations by active or passive methods. In most haptic systems, the user can actively touch models constructed in virtual spaces [3]. In others, the user perceives the haptic sensations passively, sometimes through visual and auditory channels [4][5]. In both cases, the shared haptic sensations were generated as contact forces with virtual objects. However, haptic sharing through virtual objects necessarily limits instant, casual communication, as no object can be shared unless its physical characteristics have been accurately measured and modeled beforehand. We are therefore interested in direct haptic transfer without any use of virtual spaces or other sensory channels. Here, we note that two people do not always feel the same haptic sensation even if the same force is presented [6][7]. The sensation is affected by other physical conditions, for example, the manner of touch and the tension in the fingers. To minimize these factors, we specifically focus on the importance of compensating for the difference between active touch and passive touch.

This study proposes a model for sharing haptic sensations in the real world in real time. Our first target in this paper is to share interaction with soft, real-world objects without relying on virtual models of those objects. A new haptic transfer model is proposed to equalize the haptic sensation obtained at the transmitting side and the sensation presented at the receiving side. In addition, we designed a haptic sharing system that makes it possible to capture 3D interaction and to dynamically display the transferred force as a haptic sensation in another place. Human experiments were conducted to evaluate the proposed model and the system. The experiments confirm that the proposed system enables people to share haptic sensations sensed in the real world.

2 RELATED WORK

We surveyed previous studies on the sharing of haptic information. Several researchers have carried out sensation-sharing studies in which haptic information is reproduced by playing back stored information. Nakao et al. [3] developed an arteriosclerosis model and a virtual palpation environment for sharing the stiffness of organs. In order to play back the whole process of interaction, Williams II et al. [4] and Wang et al. [8] recorded the finger trajectory of an expert in virtual space. The studies by Fujita et al. [5] and Ishibashi et al. [9] sought to provide haptic information in real time. Fujita's group [5] investigated the passive sharing of the haptic sensations of an elastic object placed in a virtual space. In a similar fashion, Ishibashi et al. [9] constructed a remote calligraphy system that enabled users to write characters in a virtual space with a haptic device resembling a brush.



————————————————
M. Nakao is with Nara Institute of Science and Technology, Nara, Japan. E-mail: [email protected]
R. Kitamura is with Nara Institute of Science and Technology, Nara, Japan. E-mail: [email protected]
T. Sato is with Nara Institute of Science and Technology, Nara, Japan. E-mail: [email protected]
K. Minato is with Nara Institute of Science and Technology, Nara, Japan. E-mail: [email protected]

Manuscript received May 2, 2009; accepted November 6, 2009.



The feature of these sensation-sharing systems is that they utilize a pre-constructed model and a virtual space. Although this approach allows sharing of 3D positions and forces with other people, the setup of virtual environments is inevitably elaborate. In addition, the degree of reality of the 3D models and virtual worlds affects the presence of the haptic information. Particularly when touching elastic objects, accurate, real-time simulation of soft objects is required. These factors severely limit casual communication as well as the realism of haptic interactions. Hence, we are interested in a new haptic sharing framework that does not rely on modeling of target objects or virtual spaces.

In the context of direct haptic transfer without virtual spaces, we further refer to the related studies. Saga et al. [10] developed a haptic system that allows operators to learn how an expert applies force in handwriting. The operators can obtain the recorded force by actively resisting the presented force. We note that their system was designed for sharing a 2D trajectory and a 1D pressure in handwriting on stiff materials. In addition, since the force is recorded based on position and is only presented through the trainee's active contact, time-series information such as the movement of the manipulator was not reproduced. Kikuue et al. [7] developed a device for displaying both 1D position and force. Using the device, they examined the relationship between active perception and passive perception. However, they measured the force as static information based on 1D finger position and did not assume an arbitrary interaction case. Also, they did not present any design for capturing the user's interaction, and application to haptic sharing was outside the scope of their research.

In this paper, we try to provide a new environment where operators can share 3D interaction when touching soft materials in real time. Specifically, we focus on a haptic sharing model that equalizes haptic sensations between operators having different interaction styles, and we conduct several experiments to demonstrate how the model parameters can be determined for effective sharing of haptic interaction.

3 HAPTIC SHARING MODEL

3.1 Basic Design

We start with the basic problem in haptic sharing: when one person obtains a haptic sensation simply by pushing an elastic object with a forefinger, how can another person acquire the same haptic sensation? To approach this problem, we begin by developing methods for measuring the haptic information at the transmitting side and for presenting sensuously equivalent information at the receiving side.

The basic design of the haptic sharing model is illustrated in Figure 1. The key elements for reproducing haptic information are the three-dimensional trajectory of the fingertip and the contact force produced in response to the interaction. The trajectory is discretely modeled as a time series of 3D fingertip positions. Therefore, the system we designed for the transmitting side measures the contact force $F_{trans} \in \mathbb{R}^3$ and the position $P_{trans} \in \mathbb{R}^3$ of a fingertip in contact with an actual object. On the receiving side, an active approach [3][10] and a passive approach [5][7] are generally known for reproducing haptic information based on the measured data. We apply the passive approach because tracking the transmitter's trajectory actively and simultaneously is often difficult and mostly subject-dependent. Rather than constructing a virtual model of the object being measured at the transmitting side, our system transmits the haptic information directly to the receiving side. Our design allows simultaneous, direct display of the 3D trajectory and of the contact force to the forefinger, without relying on indirect channels such as visual effects or sounds.

Here, in this sharing model, we have to consider the different interaction styles adopted on each side. The user on the transmitting side actively touches the real object. Simultaneously, the other user on the receiving side passively obtains the reproduced information from our system. In other words, one of our interests is to seek the relationship between the contact forces on both sides in order to share realistic haptic sensations when touching soft objects.

3.2 Haptic Transfer Function

Subjective sensations perceived when an object is actively contacted differ from those perceived when an object is passively contacted, even if the same intensity of physical stimulation is actually produced [6][7]. There may also be differences in the way haptic information is perceived between a transmitting system and a receiving system as a result of the physical properties of the equipment itself (e.g., in the method of attaching the system that provides the haptic sensations). Even if the contact force measured on the transmitting system is provided to the receiving system with complete fidelity, the sensation will not necessarily be the same for the person receiving as it is for the person transmitting.

To equalize the subjective sensations and to minimize the effect of differences between other physical conditions on the transmitting and receiving sides of our system, we introduce a haptic transfer function that defines the relationship between the physical stimuli on both sides. The intensity of human perception is well known to vary exponentially, as described in Stevens' power law [11]. Applying this principle for the purpose of this study, we postulate that the power function in the following equation expresses a contact force sufficient to ensure that the subjective sensations on the transmitting and receiving systems are sensuously the same:

$$F_{receive} = k\,F_{trans}^{\,n} \qquad (1)$$

Figure 1. Conceptual illustration of our haptic sharing model. We assume a transfer function f between two sides.


where $F_{trans}$ is the contact force applied to the fingertip as a response to actively touching the real object on the transmitting side, and $F_{receive}$ is the force displayed to the fingertip through our interface on the receiving side.

This function is based on the hypothesis that $F_{receive}$, passively obtained on the receiving side, produces the same haptic sensation as $F_{trans}$, actively obtained on the transmitting side. $k$ and $n$ are user-specific coefficients determined by the method used to attach the system and by individual differences in how sensations are perceived.
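As a concrete illustration, the following Python sketch applies equation (1) to map a transmitted force to a displayed force. The coefficient values shown are hypothetical placeholders; in the actual system they would come from the per-user calibration described in Section 5.2.

```python
# Minimal sketch of the haptic transfer function in equation (1):
# F_receive = k * F_trans ** n, applied to the 1D normal force component.
# The coefficients below are hypothetical defaults; the real system uses
# per-user calibrated values (see Section 5.2).

def transfer_force(f_trans: float, k: float = 1.1, n: float = 0.85) -> float:
    """Map the force measured on the transmitting side (N) to the force
    displayed on the receiving side (N)."""
    if f_trans <= 0.0:
        return 0.0  # no contact, nothing to display
    return k * f_trans ** n

# Example: a 1.5 N contact force measured on the transmitting side
print(transfer_force(1.5))
```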

4 HAPTIC SHARING SYSTEM

The prototype haptic sharing system we designed consists of a transmitting system to measure the haptic information, a receiving system to display the transferred information, and a control PC. The transmitting system measures the operator's active interaction with real objects using a finger, that is, the finger position $P_{trans}$ for the proprioceptive sense and the contact force $F_{trans}$ at the fingertip for the cutaneous sense. Through the receiving system, the other user (the receiver) passively obtains the position $P_{receive}$ and the force $F_{receive}$ computed by the proposed haptic transfer function (1).

4.1 Capturing Interaction

Our prototype transmitting system is composed of two sensors: a 3DOF force sensor (TEC Gihan Co., Ltd.) and a 6DOF magnetic tracking device (3D Guidance MedSAFE sensor manufactured by Ascension Technology Corporation). The force-sensing unit is installed on a thin aluminum plate and measures the contact force $F_{trans}$ at a 1 kHz refresh rate. The magnetic tracking device measures the position $P_{trans}$ of the 1.3mm sensor part in a 40cm x 40cm x 30cm workspace. The spatial resolution is 0.5mm, which is sufficient for sensing the position of the finger because the maximum spatial resolution of a human finger is approximately 1mm [1]. Additionally, compared to optical tracking approaches [12], this magnetic system has the advantage of avoiding occlusion when capturing interactions with elastic objects. As shown in Figure 2, the force sensor and motion sensor are attached to a finger sack made of a flexible elastic material to avoid obstructing the contacting action of the operator. As the unit is mounted on the operator, not on the object, this approach can be applied to a wider range of objects in the real world. This feature greatly reduces setup costs and restrictions on the choice of target objects, compared to other approaches such as image-based pressure estimation or object-mounted sensors. Although the aluminum plate can affect the haptic sensation of the transmitter, we assume this influence is small for sensing the contact force, which is our primary focus in this paper.
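To make the capture pipeline concrete, the sketch below shows one possible representation of a captured sample that pairs the 1 kHz force readings with the magnetic tracker positions. The class and the driver-call placeholders are our own illustration and are not part of the sensors' actual APIs.

```python
# Hypothetical structure for one sample captured on the transmitting side:
# a timestamp, the 3D fingertip position P_trans from the magnetic tracker,
# and the 3D contact force F_trans from the force sensor.
from dataclasses import dataclass
from typing import Callable, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class HapticSample:
    t: float          # time in seconds
    p_trans: Vec3     # fingertip position in mm
    f_trans: Vec3     # contact force in N

def capture_sample(t: float,
                   read_position: Callable[[], Vec3],
                   read_force: Callable[[], Vec3]) -> HapticSample:
    """Pair the most recent tracker and force readings into one sample.
    read_position / read_force stand in for the device driver calls."""
    return HapticSample(t=t, p_trans=read_position(), f_trans=read_force())
```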

4.2 Displaying Interaction

Our system for providing haptic information on the receiving side is a combination of two subunits: a trajectory guidance unit to provide the position $P_{receive}$ and a force feedback unit to provide $F_{receive}$. We used a PHANToM Desktop device for the trajectory guidance unit and a PHANToM Premium 1.0 for the contact force feedback unit, respectively. Figure 3 shows the overall configuration of the receiving system. The stylus for trajectory guidance is fastened between the first and second joints of the forefinger with a hook-and-loop fastener, which controls the 3D position of the entire forefinger. The stylus for the contact force feedback is fastened to the fingertip with the finger pad of the PHANToM Premium 1.0. Since the displacement and the force are transmitted to different points of the forefinger, we assume the receiver can distinguish the two pieces of information separately, without the pushing manipulation by the operator being obstructed. This design is a key point of our haptic display unit, allowing the displacement and the force to be output to one finger simultaneously.

The function of the trajectory guidance unit is to keep the finger at the target position $P_{trans}$ measured by the transmitting system. Accordingly, in our system, the position of the receiving system $P_{receive}$ is controlled by PID control in accordance with the equations:

$$F_{control} = K_P P_{error} + K_D \frac{dP_{error}}{dt} + K_I \int P_{error}\,dt \qquad (2)$$

$$P_{error} = P_{trans} - P_{receive} \qquad (3)$$

where $F_{control}$ is the force generated by the trajectory guidance unit in the receiving system, $P_{error}$ is the error vector between the target position $P_{trans}$ and the current position $P_{receive}$, and $t$ is time.

Figure 2. User interface for capturing interaction. A 3DOF force sensor and a 6DOF motion sensor are fitted to the forefinger.

Figure 3. User interface for displaying haptic information on the receiving side.


The first, second, and third terms on the right-hand side of equation (2) are the proportional, differential, and integral terms, respectively. The parameters $K_P$, $K_D$, and $K_I$ are adjusted to ensure that the position of the receiving unit tracks the position of the transmitting unit without delay, both during movements in free space and when exploring an object. These values were consequently configured as $K_P$ = 0.06 N/mm, $K_D$ = 0.001 N sec/mm, and $K_I$ = 0.001 N sec/mm.
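A minimal Python sketch of this trajectory guidance loop is given below, implementing equations (2) and (3) with the gains reported above. The class structure, the time step handling, and the idea of calling it once per control cycle are assumptions for illustration; this is not the actual controller code of the PHANToM devices.

```python
# Sketch of the trajectory guidance controller of equations (2) and (3):
# F_control = KP*P_error + KD*dP_error/dt + KI*integral(P_error)dt,
# with P_error = P_trans - P_receive. Gains follow the values in the text.
import numpy as np

KP, KD, KI = 0.06, 0.001, 0.001  # N/mm, N*sec/mm, N*sec/mm (as reported)

class TrajectoryGuide:
    def __init__(self, dt: float):
        self.dt = dt                      # control period in seconds
        self.prev_error = np.zeros(3)
        self.integral = np.zeros(3)

    def control_force(self, p_trans: np.ndarray, p_receive: np.ndarray) -> np.ndarray:
        error = p_trans - p_receive                       # equation (3), in mm
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        # equation (2): force (N) sent to the trajectory guidance device
        return KP * error + KD * derivative + KI * self.integral
```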

The function of the contact force feedback unit is to produce the transferred force for the user based on equation (1). $F_{control}$ is displayed to the first joint of the forefinger, and $F_{receive}$ is directly applied to the fingertip. This design allows the receiving side to feel the shared haptic sensation more intuitively. In the current design, we apply the transfer function to the 1D normal component of the contact force. Supporting friction forces and the presence of texture is left for future work.

5 EVALUATION AND USER STUDY

We evaluated the haptic sharing design and the capacity of the developed system for sharing haptic interaction by conducting human experiments. For the appearance and a demonstration of real-time haptic sharing, see the photographs in Figure 4. In these experiments, the operator on the transmitting side moves the forefinger to draw a triangle-shaped trajectory while pushing a sponge. The receiving side gets the transferred force and the movement through the device in real time.

Five males and five females, 22- to 26-year-old graduate students, participated in the experiment. All subjects were right-handed and had no known hand impairments. They were briefly trained with the demo applications of a PHANToM Omni haptic device for three minutes. Throughout the experiments, the subjects were instructed to keep their fingers in a natural, relaxed state and not to resist the presented force. No visual or auditory information was presented in the experiments.

5.1 Finger Position

The positional error of the fingertip can be affected by the presented force and by the user's delayed operation. Therefore, the purpose of the first test was to confirm the position tracking capability of the receiving system. A pre-recorded trajectory was used in order to display the same stimuli to all subjects for quantitative evaluation. We prepared two test trajectories, one in the shape of a triangle and the other a circle (see the dotted lines in Figure 5), by capturing the 3D position of the experimenter's fingertip while pushing a sponge. The contact force was also measured and recorded in 3D.

The test was conducted by having the subject track the two trajectories in the same order and then determining the mean-square error (MSE), defined in equation (4), for the positional error in the XYZ coordinate system. All the subjects used their dominant hand.

$$\mathrm{MSE} = \frac{1}{n}\sum_{i=1}^{n}\left\| P_{error,i} \right\|^{2} \qquad (4)$$

where $n$ is the number of sampled points. The trajectory was reproduced at two different speeds to confirm the influence of the dynamic movement of the finger. The velocity was set to 20mm/sec or 40mm/sec, respectively. The results are summarized in Table 1. Figure 5 specifically shows the average of the measured trajectories of the 10 subjects in the 40mm/sec case. The MSE is affected by the velocity of the finger movement. The major part of the error was observed from 0.0 to 1.0 sec during manipulation. This error was caused by the delay between the presented force and the user's reaction to the first movement of the haptic device.

Figure 4. Snapshots of real-time haptic sharing. (Top) The transmitting side draws a trajectory and (Bottom) the receiving side gets the transferred force and movement.

Table 1. The average MSE and SD, quantifying the position tracking capability of the receiving system.

Average MSE mm (SD)   20mm/sec       40mm/sec
Circle                2.19 (0.58)    3.04 (0.88)
Triangle              2.65 (0.55)    3.86 (0.70)


While the maximum resolution at which human beings recognize movement is estimated to be around 2mm [1], the average errors measured from 1.0 to 4.0 sec were 1.4mm and 2.0mm in each case. Therefore, we concluded that the errors are acceptable in the stable manipulation state. More effective position control, including analysis of parameter settings for complex, irregular movements of the finger, will be our future work.
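For reference, the positional error metric of equation (4) could be computed as in the short sketch below; it assumes the two trajectories have already been resampled into time-aligned arrays of 3D positions, which is our own simplification.

```python
# Sketch of the mean-square positional error of equation (4), computed over
# n time-aligned 3D samples from the transmitting and receiving sides.
import numpy as np

def mse(p_trans: np.ndarray, p_receive: np.ndarray) -> float:
    """p_trans, p_receive: arrays of shape (n, 3), positions in mm."""
    p_error = p_trans - p_receive            # per-sample error vectors
    return float(np.mean(np.sum(p_error ** 2, axis=1)))
```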

5.2 Haptic Transfer Function

The purpose of the second test was to determine a transfer function that ensures the same subjective sensations on both sides. In the experiment, we mounted the receiving system and the transmitting system on the right and left index fingers of the test subject, applied a force to the subject's right hand via the receiving system, and asked the subject to apply the same force to an elastic object with his or her left hand simultaneously. The 1D forces that subjects passively felt on the right hand were 0.5N, 0.7N, 1.0N, 1.5N and 2.2N. Each one was displayed to the subjects three times in random order. This protocol is similar to the one used in prior force discrimination studies [13][14]. The difference was that we presented the passive force to the right hand first, and the subjects were then asked to find the force that was perceived as the same when actively pushing the real elastic object with the left hand. This protocol is directly utilized for the purposes of calibration in the developed haptic sharing system.
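The presentation schedule described above (five force levels, each shown three times in random order) could be generated as in the following sketch; this is purely illustrative and is not the experiment software actually used.

```python
# Sketch of the stimulus schedule for the force-matching experiment:
# five 1D force levels, each presented three times in random order.
import random

FORCE_LEVELS_N = [0.5, 0.7, 1.0, 1.5, 2.2]
REPETITIONS = 3

def make_schedule(seed: int = 0) -> list:
    schedule = FORCE_LEVELS_N * REPETITIONS
    random.Random(seed).shuffle(schedule)   # fixed seed only for reproducibility
    return schedule

print(make_schedule())  # e.g. [1.0, 0.5, 2.2, ...]
```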

Figure 6a shows the relation between the presented force on the receiving side and the average of the matched force on the transmitting side for the 10 subjects. As the relation is generally linear on the logarithmic axes, this result is consistent with the hypothesis that the transfer function can be expressed as equation (1). On this basis, the equation appears to be applicable to systems for transmitting trajectory and contact force to the fingertip in the manner we propose here, regardless of the configuration used for measurement of the haptic information. On the other hand, Figure 6a also shows that the parameters $k$ and $n$ vary too widely among individuals to determine a single set that is applicable to all users. The individual differences in the parameters are thought to result from user-dependent differences in perception, the manner in which the system is used, and the contact area on the finger pads of the haptic device and measurement unit (due to differences in the sizes of the users' fingers and hands).

Considering the individual differences, we determined the parameters $k$ and $n$ for each subject by curve fitting to his or her experimental data (the five points shown in Figure 6a). Figure 6b shows the matched forces of the 10 subjects calculated by the user-specific haptic transfer function using the parameters determined for each subject via curve fitting. The function was designed to reproduce a haptically equivalent contact force on the receiving side by transferring the force measured on the transmitting side. The linearity and conformity of the graph demonstrate that the haptic transfer function captures the individual characteristics and correctly compensates for the difference between the active and passive contacts.
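Since equation (1) becomes linear in log-log space, the per-subject coefficients k and n can be estimated by a least-squares fit to the five matched force pairs, as in the sketch below. The numeric arrays are placeholder values for illustration, not measured data from the study.

```python
# Sketch of per-subject calibration of equation (1): fit k and n by linear
# least squares in log-log space, log(F_receive) = log(k) + n * log(F_trans).
import numpy as np

def fit_transfer_function(f_trans: np.ndarray, f_receive: np.ndarray):
    """f_trans, f_receive: matched force pairs in N (one pair per stimulus)."""
    n, log_k = np.polyfit(np.log(f_trans), np.log(f_receive), 1)
    return float(np.exp(log_k)), float(n)   # returns (k, n)

# Placeholder calibration data (not actual measurements from the experiment)
f_receive = np.array([0.5, 0.7, 1.0, 1.5, 2.2])  # forces presented passively
f_trans = np.array([0.6, 0.8, 1.1, 1.7, 2.6])    # forces matched actively
k, n = fit_transfer_function(f_trans, f_receive)
print(k, n)
```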

Figure 5. (a) Average trajectories of the subjects' fingertips in the x, y, and z (depth) coordinates for the validation of the trajectory guidance unit. As shown in the figure, the MSE between the receiving and transmitting sides is small.

Figure 6. (a) The contact force required to produce a haptically equivalent sensation. (b) The force transferred by the haptic transfer function. This shows that the same force can be felt on the receiving side.


6 CONCLUSION AND FUTURE WORK

In this paper, we presented a model for real-time, direct sharing of haptic interaction with real-world objects. A new haptic transfer model was introduced to equalize haptic sensations between a transmitting side and a receiving side. In addition, we designed a haptic sharing system that makes it possible to capture an interaction and to present the transferred force as a haptic sensation in another place. We confirmed that the transfer function for contact force in the proposed system can be expressed as a power function. Our model captures individual characteristics and correctly compensates for the difference between active contact and passive contact.

Although the proposed transfer function successfully normalizes these factors by calibrating user-specific parameters beforehand, the interface design is still preliminary and can be optimized for interaction capture and presence. As recent studies [15] report on force discrimination in dynamic systems, further quantification of stiffness discrimination in haptic sharing will be a valuable field of study. On the transmitting side, the force sensor can also interfere with natural interaction with real-world objects. Using thinner, non-metallic materials for the body of the sensor will enable more natural interaction with real elastic objects. We are also interested in validating the proposed transfer function for friction forces. However, in order to do this, the receiving system should be improved to directly display friction forces to the finger. We will continue to seek a universal design with which everyone can robustly share accurate haptic sensations more naturally.

ACKNOWLEDGEMENTS

This study was supported by the Grant-in-Aid for Scientific Research for Young Scientists (A) (16680024) from the Ministry of Education, Culture, Sports, Science and Technology, Japan. We would like to express special thanks to all practitioners for the user study.

REFERENCES

[1] G. C. Burdea, Force and Touch Feedback for Virtual Reality, Wiley-Interscience, 1996.
[2] A. Haans and W. IJsselsteijn, "Mediated Social Touch: A Review of Current Research and Future Directions", Virtual Reality, Vol. 9, pp. 149-159, 2006.
[3] M. Nakao, T. Kuroda, M. Komori, H. Oyama, K. Minato and T. Takahashi, "Transferring Bioelasticity Knowledge through Haptic Interaction", IEEE Multimedia, Vol. 13, No. 3, pp. 50-60, 2006.
[4] R. L. Williams II et al., "Implementation and Evaluation of a Haptic Playback System", Haptics-e, Vol. 3, No. 3, pp. 1-6, 2004.
[5] K. Fujita and Y. Ikeda, "Remote Haptic Sharing of Elastic Soft Objects - Real-time Estimation and Display by Contact Area Control -", Proc. of World Haptics, Poster, 2005.
[6] R. L. Klatzky and S. J. Lederman, "Experimental Psychology", Handbook of Psychology, Vol. 4, John Wiley & Sons, pp. 147-176, 2003.
[7] R. Kikuue and T. Yoshikawa, "Haptic Display Device with Fingertip Presser for Motion/Force Teaching to Human", Proc. IEEE International Conference on Robotics and Automation, pp. 868-873, 2001.
[8] D. Wang, Y. Zhang and C. Yao, "Stroke-based Modeling and Haptic Skill Display for Chinese Calligraphy Simulation System", Vol. 9, No. 2, pp. 118-132, 2006.
[9] Y. Ishibashi and T. Asato, "Media Synchronization Control with Prediction in a Remote Haptic Calligraphy System", Proc. of ACM SIGCHI, 2007.
[10] S. Saga, N. Kawakami and S. Tachi, "Haptic Teaching using Opposite Force Presentation", Proc. of World Haptics, Poster, 2005.
[11] S. S. Stevens, "On the Psychophysical Law", Psychological Review, Vol. 64, No. 3, pp. 153-181, 1957.
[12] P. G. Kry and D. K. Pai, "Interaction Capture and Synthesis", ACM SIGGRAPH, pp. 872-880, 2006.
[13] S. S. Shergill, P. M. Bays, C. D. Frith and D. M. Wolpert, "Two Eyes for an Eye: The Neuroscience of Force Escalation", Science, Vol. 301, No. 5630, p. 187, 2003.
[14] L. A. Jones and I. W. Hunter, "A Perceptual Analysis of Stiffness", Experimental Brain Research, Vol. 79, No. 1, pp. 150-156, 1990.
[15] A. Israr, Y. Li, V. Patoglu and M. K. O'Malley, "Passive and Active Discrimination of Natural Frequency of Virtual Dynamic Systems", IEEE Trans. on Haptics, Vol. 2, No. 1, pp. 40-51, 2009.