Wearable Haptics for Human Guidance in Structured and Unstructured Environments

Tommaso Lisini Baldi and Domenico Prattichizzo

Dipartimento di Ingegneria dell'Informazione e Scienze Matematiche, Università di Siena, Siena, Italy.

Abstract— In real-world scenarios, the visual and auditory channels may be overloaded with a huge quantity of stimuli, in some cases becoming inefficacious or unusable. A possible solution is to provide information by exploiting an underutilized sense, i.e., the sense of touch. Like sound, a tactile stimulus consists of a signal with varying frequency and amplitude. Unlike auditory feedback, tactile sensation directly engages our motor learning system with extraordinary sensitivity and speed. Moreover, tactile communication can be used in situations where visual or auditory stimuli are distracting, impractical, or unsafe. In this abstract, we report our main results showing the capability of wearable haptic interfaces to guide humans in structured and unstructured environments.

I. INTRODUCTION

Let us assume that one or more humans want to reach a final location in a large environment. Demonstrative scenarios include, for instance, assisting older adults or visually impaired persons, and helping people in dangerous situations with poor visibility and no way of hearing clearly due to environmental noise. In these contexts, haptic guidance has been successfully exploited in recent years. Closely related is the research presented in [1] and [2]. In [1], a vibrotactile belt is used to guide humans toward goal points. Scheggi et al. in [2] presented a new paradigm to assist the navigation of mixed human-robot teams using haptic information. While kinesthetic feedback is common in haptic systems, we exploited wearable vibrotactile interfaces, since tactile devices are generally more portable, less encumbering, and have a wider range of action than kinesthetic ones [3]. Compared with existing strategies, our idea has the following advantages: (i) the user's hands are free, so other physical tasks can be accomplished; (ii) it is easy to extend the physical interaction to multiple users; (iii) since we are using wearable devices, the proposed approach can be extended to other body parts; (iv) the user is not constrained: directions and walking speed are only suggested.

II. DESCRIPTION OF THE HAPTIC INTERFACE

Tactile vibratory sensitivity is influenced by the spatial location on the body, the distance between the stimulators, the frequency of stimulation, and the age of the user. Studies have demonstrated that vibration is better sensed on hairy skin due to its thickness and nerve depth, and that vibrotactile stimuli are best perceived in bony areas [4]. In particular, wrists and spine are generally preferred for detecting vibrations, with arms and ankles next in line [5]. In light of these considerations, and since our aim is to design an intuitive and unobtrusive device which can be easily worn, we concentrated our effort on the development of a vibrotactile elastic band that can be worn on the ankles as well as on the wrists. Starting from the results presented by Scheggi in [6] and Lisini Baldi in [7], we decided to use the bilateral configuration, which requires two devices, one for each body part. Please refer to [6] for further details about study procedures and results. Technical aspects of the wearable devices and the evaluation of their positioning are reported in [7]. Our contribution consists in the

Fig. 1. Representative scenarios. In (a), a visually impaired subject is guided by a remote operator via two vibrotactile bracelets. A mobile phone streams the view of the blind user to the remote operator, who can thereby guide the subject to the desired location. In (b), the remote social walking system: the anklet device measures the user's gait cadence, this information is sent to a server via smartphone, and the server updates the companion's smartphone, commanding timed vibrations at the anklets.

development of approaches for suggesting directions and gait cadence by means of haptic interfaces. Complementary scenarios are considered: (i) the environment is unknown and not structured with sensors/cameras; (ii) a map of the environment is available and the user can be localized within it.
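The message flow of the remote social walking system in Fig. 1(b), from anklet to smartphone to server and on to the companion's smartphone, can be illustrated with a minimal relay. This is a hypothetical sketch: the `CadenceRelay` class, its queue-based API, and the conversion of cadence to an inter-vibration interval are illustrative assumptions, not the actual implementation.

```python
from collections import deque

class CadenceRelay:
    """Illustrative server that relays each walker's gait cadence
    (in steps/min) to the companion's smartphone."""

    def __init__(self):
        self.outbox = {"A": deque(), "B": deque()}  # per-user message queues

    def report_cadence(self, user: str, cadence_spm: float) -> None:
        # A smartphone posts its wearer's measured cadence; the server
        # enqueues it for the *other* walker's phone.
        companion = "B" if user == "A" else "A"
        self.outbox[companion].append(cadence_spm)

    def poll(self, user: str):
        # The companion's phone polls for updates and converts cadence
        # to a vibration period (seconds between anklet pulses).
        if not self.outbox[user]:
            return None
        cadence = self.outbox[user].popleft()
        return 60.0 / cadence  # inter-vibration interval in seconds

relay = CadenceRelay()
relay.report_cadence("A", 120.0)   # user A walks at 120 steps/min
print(relay.poll("B"))             # user B's anklets pulse every 0.5 s
```

In the deployed system this exchange runs over the internet via the users' smartphones; the in-memory queues above merely stand in for that transport.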

III. HUMAN GUIDANCE IN UNSTRUCTURED ENVIRONMENTS

Regarding guidance in an unstructured environment, we studied and developed two guidance policies. The goal of the former is suggesting directions to blind people; the latter aims at creating a system for "feeling" a partner remotely while walking [8]. Both strategies exploit an internet connection as the means to exchange vibrotactile cues. In the proposed scenarios, a map of the environment is not necessary: in the first case, a mobile phone streams a video of the surroundings to a remote operator, whereas in the second the users have no visual impairments and localization is not required.

A. Suggesting directions

As a first step, we present a remote guidance system for visually impaired people. Blind people encounter many difficulties in living an autonomous life, due to their reduced capability to perceive the surrounding environment. In particular, navigation and orientation are very challenging when moving in unknown domains. Trained guide dogs and canes are useful for collision avoidance and obstacle warning, but they do not provide directional information to guide the visually impaired toward a desired location. Based on these observations, we propose an effective assistive system for the visually impaired. The blind user is equipped with a mobile phone for video streaming, two vibrotactile bracelets, and a cane, which is used to avoid potential obstacles. Note that the proposed vibrotactile devices do not substitute the cane or the guide dog for obstacle avoidance; they are assistive devices which provide directional information in order to properly reach the desired location. A remote volunteer is in charge of watching the video stream and suggesting directions according to the final destination and the awareness of surrounding risks. The haptic cues are transmitted via TCP over a 3G/4G network to the user's smartphone, which communicates with the vibrotactile bracelets using the Bluetooth protocol.
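The cue path from operator to bracelets can be sketched as follows. We assume, purely for illustration, a one-byte wire encoding and that a "turn left" cue activates the left bracelet (and symmetrically for the right); the actual encoding and vibration policy are not specified in this abstract.

```python
# Hypothetical one-byte encoding of the three direction cues.
CUES = {"left": b"L", "right": b"R", "straight": b"S"}

def encode_cue(direction: str) -> bytes:
    """Encode the operator's direction as a one-byte TCP payload."""
    return CUES[direction]

def dispatch_to_bracelets(payload: bytes) -> dict:
    """Map a received cue to per-bracelet vibration commands
    (an assumed policy: lateral cues activate the matching side)."""
    cue = payload.decode()
    return {
        "left_bracelet": cue == "L",
        "right_bracelet": cue == "R",
        # "go straight" could, for instance, pulse both bracelets briefly.
        "both_pulse": cue == "S",
    }

print(dispatch_to_bracelets(encode_cue("left")))
# {'left_bracelet': True, 'right_bracelet': False, 'both_pulse': False}
```

The smartphone would receive such payloads over the 3G/4G link and forward the resulting commands to the bracelets over Bluetooth.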

B. Suggesting walking pace

Once the capability of steering humans via haptic stimuli was assessed, we explored the possibility of suggesting walking pace. Walking is an essential activity for a healthy life, and it is less tiring and more enjoyable when performed together. The difficulty we have in performing sufficient physical exercise, principally a lack of motivation, can be overcome by exploiting this social aspect. However, our lifestyle sometimes makes it very difficult to find time to go for a walk with others who live far away from us. In this respect, we developed a novel system enabling people to have a "remote social walk" by streaming the gait cadence between two persons walking in different places, to increase the sense of mutual presence. The system exploits the vibrotactile devices worn at the ankles, which provide timing cues for cadence synchronization. The gait cadence is measured using a pressure sensor embedded in a silicone insole and connected to one of the haptic interfaces. The detected steps are then sent to the user's smartphone, which communicates with a dedicated server. Finally, each social walker's smartphone receives the gait cadence updates from the server and adjusts the vibrations accordingly. A first experimental session confirmed that humans can follow a time-varying artificial rhythm perceived via anklet vibrations. With a dedicated set of experiments, we then assessed that this tracking performance is retained when the virtual reference is replaced with a human gait cadence. Finally, building on these results, we obtained experimental evidence that two humans walking simultaneously, but not in each other's proximity, synchronize their gait cadence using our system by perceiving the companion's rhythm.
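A simple way to picture the cadence measurement described above is threshold-crossing step detection on the insole pressure signal, followed by averaging the inter-step intervals. The threshold, sample rate, and signal shape below are assumed values for illustration, not those of the actual device.

```python
def detect_steps(pressure, threshold=0.5):
    """Return sample indices where the pressure signal rises through
    the threshold, i.e., one detection per foot strike."""
    steps = []
    for i in range(1, len(pressure)):
        if pressure[i - 1] < threshold <= pressure[i]:
            steps.append(i)
    return steps

def cadence_spm(step_indices, sample_rate_hz):
    """Average cadence in steps/min from consecutive step timestamps."""
    if len(step_indices) < 2:
        return 0.0
    intervals = [
        (b - a) / sample_rate_hz
        for a, b in zip(step_indices, step_indices[1:])
    ]
    return 60.0 / (sum(intervals) / len(intervals))

# Synthetic signal sampled at 100 Hz: a pressure pulse every 50 samples,
# i.e., one step every 0.5 s.
signal = [1.0 if (i % 50) < 10 else 0.0 for i in range(300)]
steps = detect_steps(signal)
print(cadence_spm(steps, 100.0))  # 120.0 steps/min
```

The cadence value produced at this stage is what the smartphone would forward to the server for the companion's rhythm display.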

IV. HUMAN GUIDANCE IN STRUCTURED ENVIRONMENTS

Our study of novel methods and policies also involved the capability of guiding people in structured environments. In this context, we initially developed an algorithm for suggesting directions [9]; we then proposed a system for regulating the walking cadence [10].

A. Suggesting directions

In this study, we employ an effective haptic policy to easily display directional stimuli. In this regard, we consider that human locomotion can be approximated by the motion of a unicycle system, i.e., nonholonomic constraints similar to those of mobile robots seem to be at work when a human is walking [11]. Without loss of generality, we assumed that the human is free to select her/his desired walking speed. Control signals (i.e., haptic stimuli) are sent to the users in order to steer their locomotion. The proposed method is evaluated in three different scenarios consisting of: (i) two users; (ii) two users and a static obstacle; and (iii) three users. In all scenarios, the users have to move toward their respective goal areas while avoiding reciprocal collisions and collisions with the environment. In order to provide easily recognizable cues, the device can elicit only three basic behaviors (turn left, turn right, and go straight). The algorithm we developed to safely navigate the users in dynamic environments is an extension of Optimal Reciprocal Collision Avoidance (ORCA), adapted for non-holonomic robots (NH-ORCA) [12]. The algorithm provides a sufficient condition for each agent to be collision-free for at least a fixed amount of time into the future. Each agent takes into

account the observed velocity and pose of the other agents in order to avoid collisions with them. A set of allowed holonomic velocities is computed; the optimal holonomic velocity is then selected via linear programming and mapped to the corresponding non-holonomic control inputs for the agent. Positions are estimated using a camera-based tracking system, combined with an Extended Kalman Filter which provides an estimate of the variance, and hence the standard deviation, of the measured quantities. These values are taken into account by the obstacle avoidance algorithm.
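The last two stages of this pipeline can be sketched under the unicycle model: project the optimal holonomic velocity onto unicycle inputs (forward speed v, turn rate omega), then quantize the turn command into the three cues the device can elicit. The gain and dead-band below are assumed tuning values, not those used in [9].

```python
import math

def to_unicycle(vx, vy, theta, k=1.5):
    """Map a holonomic velocity (vx, vy) to unicycle inputs (v, omega)
    for an agent with current heading theta (radians)."""
    desired = math.atan2(vy, vx)
    # Heading error wrapped to (-pi, pi].
    err = math.atan2(math.sin(desired - theta), math.cos(desired - theta))
    v = math.hypot(vx, vy) * math.cos(err)  # forward component of the velocity
    omega = k * err                          # turn toward the desired heading
    return v, omega

def haptic_cue(omega, dead_band=0.3):
    """Quantize the turn rate into the three behaviors the device elicits."""
    if abs(omega) <= dead_band:
        return "go straight"
    return "turn left" if omega > 0 else "turn right"

# User heading east (theta = 0); optimal velocity pointing 45 deg to the left:
v, omega = to_unicycle(1.0, 1.0, 0.0)
print(haptic_cue(omega))  # turn left
```

The dead-band keeps the system from chattering between lateral cues when the user is already roughly on course.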

B. Suggesting walking pace

As a conclusive step, we present a system to guide humans in structured environments, with the aim of simultaneously reaching a rendezvous point. Applications of the proposed approach are assistive and rescue scenarios, human-human collaboration, as well as rehabilitation. Cadence is suggested to the users via two vibrotactile elastic bands placed on the ankles. To reach the same point at the same time, users have to adapt their walking pace to the one displayed by the haptic interfaces. The user position is computed at predefined locations called checkpoints, where the walking parameters, and thus the rhythm, are updated. The user retains complete access to audio and visual information from the environment, so that he/she is ready to react to unexpected events (e.g., moving obstacles). We tested the system with the following setup: User1 is closer to the goal point than User2. To reach the rendezvous point at the same time, User1 has to keep a slow pace, while User2 has to increase the walking cadence. A checkpoint was positioned every 30 meters. All the participants were able to reach the rendezvous point with a low average error in both time and space.
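The checkpoint update above can be illustrated with a small worked computation: the required cadence follows from the remaining distance, the remaining time, and a step length. The step length and the numeric values are illustrative assumptions, not parameters reported in [10].

```python
def cadence_update(remaining_m, remaining_s, step_length_m=0.7):
    """Cadence (steps/min) needed to cover remaining_m meters
    in remaining_s seconds, given an assumed step length."""
    required_speed = remaining_m / remaining_s          # m/s
    steps_per_second = required_speed / step_length_m   # steps/s
    return 60.0 * steps_per_second                      # steps/min

# E.g., a user 210 m from the rendezvous with 150 s left must hold
# 1.4 m/s, i.e., 2 steps/s with a 0.7 m step:
print(round(cadence_update(210.0, 150.0)))  # 120 steps/min
```

At each checkpoint the system would recompute this value from the user's measured position and display the new rhythm through the anklet vibrations.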

REFERENCES

[1] A. Cosgun, E. A. Sisbot, and H. I. Christensen, "Guidance for human navigation using a vibro-tactile belt interface and robot-like motion planning," in Proc. IEEE Int. Conf. on Robotics and Automation. IEEE, 2014, pp. 6350–6355.

[2] S. Scheggi, M. Aggravi, F. Morbidi, and D. Prattichizzo, "Cooperative human-robot haptic navigation," in Proc. IEEE Int. Conf. on Robotics and Automation. IEEE, 2014, pp. 2693–2698.

[3] R. W. Lindeman, Y. Yanagida, H. Noma, and K. Hosaka, "Wearable vibrotactile systems for virtual contact and information display," Virtual Reality, vol. 9, no. 2-3, pp. 203–213, 2006.

[4] F. Gemperle, T. Hirsch, A. Goode, J. Pearce, D. Siewiorek, and A. Smailagic, "Wearable vibro-tactile display," 2003.

[5] I. Karuei, K. E. MacLean, Z. Foley-Fisher, R. MacKenzie, S. Koch, and M. El-Zohairy, "Detecting vibrations across the body in mobile contexts," in Proc. ACM Int. Conf. on Human Factors in Computing Systems, 2011, p. 3267.

[6] S. Scheggi, M. Aggravi, and D. Prattichizzo, "Cooperative Navigation for Mixed Human-Robot Teams Using Haptic Feedback," IEEE Trans. Human-Mach. Syst., vol. 47, no. 4, pp. 462–473, 2017.

[7] T. Lisini Baldi, G. Paolocci, and D. Prattichizzo, "Human guidance: Suggesting walking pace under manual and cognitive load," in Proc. Int. Conf. on Human Haptic Sensing and Touch Enabled Computer Applications. Springer, 2018, pp. 416–427.

[8] T. Lisini Baldi, G. Paolocci, D. Barcelli, and D. Prattichizzo, "Wearable Haptics for Remote Social Walking," IEEE Trans. Haptics, submitted, 2019.

[9] T. Lisini Baldi, S. Scheggi, M. Aggravi, and D. Prattichizzo, "Haptic Guidance in Dynamic Environments Using Optimal Reciprocal Collision Avoidance," IEEE Robot. Autom. Lett., vol. 3, no. 1, pp. 265–272, 2018.

[10] G. Paolocci, T. Lisini Baldi, and D. Prattichizzo, "Human rendezvous via haptic suggestion," in Haptic Interaction, H. Kajimoto, D. Lee, S.-Y. Kim, M. Konyo, and K.-U. Kyung, Eds. Singapore: Springer Singapore, 2019, pp. 262–267.

[11] G. Arechavaleta, J.-P. Laumond, H. Hicheur, and A. Berthoz, "On the nonholonomic nature of human locomotion," Autonomous Robots, vol. 25, no. 1-2, pp. 25–35, 2008.

[12] J. Alonso-Mora, A. Breitenmoser, M. Rufli, P. Beardsley, and R. Siegwart, "Optimal reciprocal collision avoidance for multiple non-holonomic robots," in Springer Tracts in Advanced Robotics, vol. 83, 2012, pp. 203–216.