Haptic Interaction for Mobile Assistive Robots

Huanran Wang, Student Member, IEEE, and Xiaoping P. Liu, Senior Member, IEEE

Abstract: This paper presents an assistive robotic system, for which the user can interact with a mobile robot using a haptic interface, for senior and disabled people. The system consists of a nonholonomic PeopleBot mobile robot and a Phantom Omni haptic interface. Several technical problems related to the system are addressed and solved. Specifically, the backstepping technique is used, and a new tracking algorithm is developed for the robot to follow the user in a 2-D space. An adaptive fuzzy logic-based windowing method is proposed to solve the velocity estimation problem in the controller. A simple haptic rendering algorithm is designed to generate the haptic feedback based on the position error. Experiments show that the user can guide the movement of the robot quite easily and smoothly using the haptic interface.

Index Terms: Assistant robot, haptics, nonholonomic mobile robot, tracking algorithm, velocity estimation.

    I. INTRODUCTION

THE RAPIDLY aging population and the existence of a large number of people with different kinds of disabilities pose a great challenge for engineers and researchers to design and build suitable and practical healthcare robotic systems. Assistive robotic systems are finding many applications in the healthcare domain and showing great potential. In particular, assistant robotic systems are designed for people who have partly or completely lost their mobility or sensory capabilities. These systems enable people, who would otherwise need help, to live independently, which is a very important index of the quality of modern life [1]. Reference [2] provides an excellent survey of the history and current research progress on assistive robots.

One problem for assistive robotic systems is that there is no universal or general solution to all of the different requirements, particularly from the clinical perspective. This means that the design and development of such a system must be application or task dependent. For instance, many different kinds of assistive robotic systems have been built for a variety of assistive tasks. A kitchen robot [3] was designed to help elderly and disabled users, particularly with tasks related to kitchen appliances. Helpmate [4] is one of the most successful medical robot applications, and it has been used by about 80 hospitals worldwide. As an autonomous transport system, this robot carries medical data records including X-ray photos, diagnostic samples, medical instruments, etc. The Wheelchair Navigation System [5] is an autonomous wheelchair system providing mobility to persons who cannot power a wheelchair due to motor, sensory, perceptual, or cognitive impairments. The system contains three operating modes: general obstacle avoidance, door passage, and automatic wall following. Work on a walking assistance system [1] began in 1994 to develop a mobility-aid robot to satisfy the needs of the frail blind. The system acts like a wheeled walker; it is equipped with different distance sensors (ultrasonic, laser, etc.) to detect obstacles in front of the system, and brakes are used to hold the user back from colliding with those obstacles.

Manuscript received November 18, 2010; revised April 17, 2011; accepted June 16, 2011. Date of publication August 4, 2011; date of current version November 9, 2011. This work was supported by the Natural Sciences and Engineering Research Council of Canada. The Associate Editor coordinating the review process for this paper was Dr. Atif Alamri.

The authors are with the Department of Systems and Computer Engineering, Carleton University, Ottawa, ON K1S 5B6, Canada (e-mail: hrwang@sce.carleton.ca; xpliu@sce.carleton.ca).

Color versions of one or more of the figures in this paper are available online at http://ieeexplore.ieee.org.

Digital Object Identifier 10.1109/TIM.2011.2161141

While these robotic systems are very promising in assisting people to live a more independent life, there are several drawbacks in the current systems:

1) The interaction between the user and the system is not sufficient or not intuitive. Most of the current systems are intended to be autonomous [3]-[5]. This reduces the burden on the user, which is of course a very desirable feature, but due to the low level of current artificial intelligence technology, most of the current systems cannot satisfy the varied and unstructured requirements of an independent living style. In addition, if the user lacks physical exercise, his or her physical and mental health will be adversely affected. Moreover, due to the various mobility problems of the user group, the input device should have a high number of degrees of freedom to track any input movement.

2) Human communication channels are not adequately used. In the literature, human-robot interaction is mostly through visual and sound (voice command) channels. However, the aged and people who need healthcare accommodation often lose partially or completely their visual or auditory capabilities; thus, it may be challenging for them to interact with the assistive robot using voice or to locate the robot visually. In addition, the changes in hearing and vision are the most dramatic compared to the other senses [6]. An assistive robot that has the capability of haptic interaction may compensate well in this regard.

3) While robotic systems aimed at the above assistive tasks have been quite intensively developed, some other tasks have received relatively little attention, such as moving heavy objects (groceries, furniture, etc.). For people whose upper-limb strength is too weak to manipulate heavy objects, the assistant robot should act as a power amplifier and carry most of the weight. The remaining weight is assigned to the user as an acceptable burden and a form of exercise.

    Fig. 1. Overview of the developed assistant system.

In this paper, a new assistive robotic system with haptic feedback is presented. The proposed system is developed to assist senior and disabled people who have partly or completely lost their mobility or sensory capabilities but need to move relatively heavy objects in a home-like environment. Our design objective is that the user can guide the robot to move in the 2-D plane in a very smooth manner and without much effort. In the proposed system, the user interacts with a mobile robot locally using a haptic interface to perform these tasks. For this purpose, several technical issues were addressed. Specifically, a nonlinear controller was developed to control the movement of the robot, a new method based on fuzzy logic was developed to estimate the velocity of the haptic device, and a haptic rendering algorithm based on the error information was designed to present force feedback to the user.

The paper is organized as follows: Section II gives an overview of the proposed system. Section III describes the details of the control algorithm. Section IV introduces the new velocity estimation method, and Section V presents the haptic rendering algorithm. The implementation is given in Section VI, and the experimental results and the analysis are shown in Section VII. The final part is the conclusion and future work.

    II. OVERVIEW OF THE PROPOSED SYSTEM

An overview of the proposed assistive robotic system is shown in Fig. 1. The system has two main hardware components: a mobile robot and a haptic interface. The mobile robot is a PeopleBot robot from MOBILEROBOTS, which is a very powerful robotic platform suitable for human-robot interaction. It is nonholonomic with two different types of wheels: two driving wheels and one omnidirectional wheel for balance [7]. The height of the mobile robot is 1115 mm, which is suitable for the user to manipulate the haptic device. The haptic interface is a Phantom Omni arm from Sensable Technology, which is lightweight and easy to mount on top of the robot. A special light wood panel is used to fix the Phantom Omni on the top of the robot.

    Fig. 2. Coordinate of the mobile robot and haptic device.

    III. CONTROLLER DESIGN FOR THE MOBILE ROBOT

One of the technical problems in the development of the proposed assistive robotic system is to design a stabilizing control algorithm for the mobile robot in the 2-D plane. In this paper, we formulate this control problem as a mobile robot tracking problem. Y. Kanayama conducted pioneering work on the nonholonomic mobile robot tracking problem [8]. Here, the backstepping methodology is adopted to design the controller. This methodology was first adopted for mobile robot control in [9] and then developed by several other researchers [10]-[13]. Because the mobile base is a differential-drive nonholonomic mobile robot, the control signals are translational and rotational velocities rather than motor torques. The controller is designed based on the kinematic model of the mobile robot.

The coordinate arrangement of the system is shown in Fig. 2. The points $P_h = [x_h, y_h]^T$ and $P_m = [x_m, y_m]^T$ are the reference positions of the human (also the position of the Phantom stylus) and the mobile robot in the world coordinate frame $XY$, respectively.

Consider the following kinematic model for the nonholonomic mobile robot:

$$
\dot{P}_m =
\begin{bmatrix} \dot{x}_m \\ \dot{y}_m \\ \dot{\theta}_m \end{bmatrix}
=
\begin{bmatrix} \cos\theta_m & 0 \\ \sin\theta_m & 0 \\ 0 & 1 \end{bmatrix}
\begin{bmatrix} v_m \\ \omega_m \end{bmatrix}
\qquad (1)
$$

where $P_m$ is the position and posture vector of the mobile robot, $x_m$ and $y_m$ are the Cartesian coordinates of the robot, $\theta_m$ is the angle between the robot heading direction and the $X$ axis, and $v_m$ and $\omega_m$ are the translational and rotational velocities of the robot, respectively.
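For readers implementing (1), the following is a minimal C++ sketch of how the differential-drive kinematics can be propagated forward in time with a simple Euler step. The `Pose` and `Twist` helper types, the 100-Hz sampling period, and the Euler integration scheme are illustrative assumptions on our part and are not prescribed by the paper.

```cpp
#include <cmath>
#include <cstdio>

// Pose of the mobile robot in the world frame XY (hypothetical helper type).
struct Pose {
    double x;      // x_m [m]
    double y;      // y_m [m]
    double theta;  // theta_m [rad], heading with respect to the X axis
};

// Velocity command of the differential-drive base.
struct Twist {
    double v;      // v_m, translational velocity [m/s]
    double w;      // omega_m, rotational velocity [rad/s]
};

// One Euler integration step of the kinematic model (1):
//   x_m_dot = v_m*cos(theta_m), y_m_dot = v_m*sin(theta_m), theta_m_dot = omega_m.
Pose integrateKinematics(const Pose& p, const Twist& u, double dt) {
    Pose next = p;
    next.x     += u.v * std::cos(p.theta) * dt;
    next.y     += u.v * std::sin(p.theta) * dt;
    next.theta += u.w * dt;
    return next;
}

int main() {
    Pose robot{0.0, 0.0, 0.0};
    const Twist cmd{0.2, 0.1};   // constant command: 0.2 m/s, 0.1 rad/s
    const double dt = 0.01;      // assumed 100-Hz control loop period

    for (int k = 0; k < 500; ++k)   // simulate 5 s of motion
        robot = integrateKinematics(robot, cmd, dt);

    std::printf("x=%.3f y=%.3f theta=%.3f\n", robot.x, robot.y, robot.theta);
    return 0;
}
```

Such a forward model is useful only for offline checks of the controller; on the real robot, $x_m$, $y_m$, and $\theta_m$ come from the odometry and gyroscope.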

We define the error $P_e$ as

$$
P_e =
\begin{bmatrix} x_e \\ y_e \\ \theta_e \end{bmatrix}
=
\begin{bmatrix} x_h - x_m \\ y_h - y_m \\ \theta_{VI} - \theta_m \end{bmatrix}.
\qquad (2)
$$

Here, $\theta_{VI}$ is a virtual angle compared with the real angle $\theta_m$; if $\theta_e = 0$, then $\theta_{VI} = \theta_m$. Now, we assume the velocity of the mobile robot is

$$
\begin{bmatrix} \dot{x}_m \\ \dot{y}_m \end{bmatrix}
=
\begin{bmatrix} v_m \cos\theta_{VI} \\ v_m \sin\theta_{VI} \end{bmatrix}.
\qquad (3)
$$

Consider the following candidate Lyapunov function:

$$
V_1 = \tfrac{1}{2} x_e^2 + \tfrac{1}{2} y_e^2. \qquad (4)
$$

The derivative of (4) is

$$
\dot{V}_1 = x_e \dot{x}_e + y_e \dot{y}_e
= x_e(\dot{x}_h - \dot{x}_m) + y_e(\dot{y}_h - \dot{y}_m)
= x_e(\dot{x}_h - v_m \cos\theta_{VI}) + y_e(\dot{y}_h - v_m \sin\theta_{VI}). \qquad (5)
$$

If we let

$$
\begin{bmatrix} v_m \cos\theta_{VI} \\ v_m \sin\theta_{VI} \end{bmatrix}
=
\begin{bmatrix} \dot{x}_h + c_x x_e + x_{VF} \\ \dot{y}_h + c_y y_e + y_{VF} \end{bmatrix}
=
\begin{bmatrix} \dot{x}_h + c_x x_e + \operatorname{sign}(x_e)\, c_{VF} \sqrt{x_e^2 + y_e^2} \\ \dot{y}_h + c_y y_e + \operatorname{sign}(y_e)\, c_{VF} \sqrt{x_e^2 + y_e^2} \end{bmatrix}. \qquad (6)
$$

Here, we introduce virtual force components $x_{VF}$ (on the $X$ axis) and $y_{VF}$ (on the $Y$ axis), which act like a force pushing the mobile robot in the direction that minimizes the error. This force uses the error from both the $X$ and $Y$ axes. $c_x$ and $c_y$ are the gains for the errors $x_e$ and $y_e$, respectively, and $c_{VF}$ is the gain of the virtual force.

Then, substituting (6) into (5), we get

$$
\begin{aligned}
\dot{V}_1 &= x_e(\dot{x}_h - v_m \cos\theta_{VI}) + y_e(\dot{y}_h - v_m \sin\theta_{VI}) \\
&= x_e\!\left(-c_x x_e - \operatorname{sign}(x_e)\, c_{VF} \sqrt{x_e^2 + y_e^2}\right)
 + y_e\!\left(-c_y y_e - \operatorname{sign}(y_e)\, c_{VF} \sqrt{x_e^2 + y_e^2}\right) \\
&= -c_x x_e^2 - \operatorname{sign}(x_e)\, x_e c_{VF} \sqrt{x_e^2 + y_e^2}
 - c_y y_e^2 - \operatorname{sign}(y_e)\, y_e c_{VF} \sqrt{x_e^2 + y_e^2} < 0. \qquad (7)
\end{aligned}
$$

For the rotational error, we have

$$
\theta_e = \theta_{VI} - \theta_m. \qquad (8)
$$

Then, we choose the composite candidate Lyapunov function as

$$
V = V_1 + \tfrac{1}{2} \theta_e^2. \qquad (9)
$$

Taking (8) into (9), we get

$$
\dot{V} = \dot{V}_1 + \theta_e \dot{\theta}_e
= \dot{V}_1 + \theta_e(\dot{\theta}_{VI} - \dot{\theta}_m)
= \dot{V}_1 + \theta_e(\dot{\theta}_{VI} - \omega_m). \qquad (10)
$$

Then, we can choose the rotational controller as

$$
\omega_m = \dot{\theta}_{VI} + c\,\theta_e. \qquad (11)
$$

Substituting (11) into (10), we have

$$
\begin{aligned}
\dot{V} &= \dot{V}_1 + \theta_e(\dot{\theta}_{VI} - \omega_m) \\
&= -c_x x_e^2 - \operatorname{sign}(x_e)\, x_e c_{VF} \sqrt{x_e^2 + y_e^2}
 - c_y y_e^2 - \operatorname{sign}(y_e)\, y_e c_{VF} \sqrt{x_e^2 + y_e^2}
 - c\,\theta_e^2 < 0. \qquad (12)
\end{aligned}
$$

The derivative of the composite Lyapunov function is negative. Based on Lyapunov stability theory, the system is stable.

From (6), we can obtain the controller for the translational velocity of the mobile robot. Together with the rotational controller (11), we have the controller for the mobile robot

$$
\begin{cases}
v_m = \sqrt{(\dot{x}_h + c_x x_e + x_{VF})^2 + (\dot{y}_h + c_y y_e + y_{VF})^2} \\
\omega_m = \dot{\theta}_{VI} + c\,\theta_e.
\end{cases} \qquad (13)
$$

Here

$$
x_{VF} = \operatorname{sign}(x_e)\, c_{VF} \sqrt{x_e^2 + y_e^2}, \qquad
y_{VF} = \operatorname{sign}(y_e)\, c_{VF} \sqrt{x_e^2 + y_e^2},
$$

$$
\theta_{VI} = \arctan\!\left(
\frac{\dot{y}_h + c_y y_e + \operatorname{sign}(y_e)\, c_{VF} \sqrt{x_e^2 + y_e^2}}
     {\dot{x}_h + c_x x_e + \operatorname{sign}(x_e)\, c_{VF} \sqrt{x_e^2 + y_e^2}}
\right).
$$

In controller (13), the actual controller inputs are the position errors $x_e$ and $y_e$ in Cartesian coordinates and the rotation error $\theta_e$. The position error signal is the difference between the robot position and the Phantom stylus position, which can be obtained from the robot's and the Phantom's encoders separately. The rotation error is the difference between the virtual angle $\theta_{VI}$ and the posture of the mobile robot, which can be read from the mobile robot's gyroscope.
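To make the structure of (13) concrete, here is a hedged C++ sketch of one control-loop iteration that evaluates the virtual-force terms, the virtual angle $\theta_{VI}$, and the velocity command. The `ControlGains` struct and its values, the use of `std::atan2` (to keep the quadrant of $\theta_{VI}$ well defined), the angle wrapping, and the backward-difference approximation of $\dot{\theta}_{VI}$ are implementation assumptions of ours; the paper does not specify these details.

```cpp
#include <cmath>

constexpr double kPi = 3.14159265358979323846;

// Tunable gains of controller (13); the values here are placeholders, not from the paper.
struct ControlGains {
    double cx  = 1.0;   // gain on x_e
    double cy  = 1.0;   // gain on y_e
    double cvf = 0.5;   // virtual-force gain c_VF
    double c   = 2.0;   // rotational gain in (11)/(13)
};

// Wrap an angle to (-pi, pi] so the rotation error stays small.
static double wrapAngle(double a) {
    while (a >  kPi) a -= 2.0 * kPi;
    while (a <= -kPi) a += 2.0 * kPi;
    return a;
}

static double sign(double v) { return static_cast<double>((v > 0.0) - (v < 0.0)); }

// One evaluation of controller (13).
//   xh, yh          : Phantom stylus position (human reference)
//   xh_dot, yh_dot  : estimated stylus velocities (Section IV)
//   xm, ym, thm     : robot pose from odometry/gyroscope
//   thetaVI_prev    : theta_VI from the previous cycle, used to approximate theta_VI_dot
//   dt              : control period (100 Hz -> 0.01 s)
void trackingController(double xh, double yh, double xh_dot, double yh_dot,
                        double xm, double ym, double thm,
                        double& thetaVI_prev, double dt,
                        const ControlGains& g,
                        double& vm_cmd, double& wm_cmd) {
    // Position errors from (2).
    const double xe = xh - xm;
    const double ye = yh - ym;

    // Virtual-force terms defined under (13).
    const double mag = std::sqrt(xe * xe + ye * ye);
    const double xvf = sign(xe) * g.cvf * mag;
    const double yvf = sign(ye) * g.cvf * mag;

    // Desired velocity components from (6) and the virtual angle theta_VI.
    const double ux = xh_dot + g.cx * xe + xvf;
    const double uy = yh_dot + g.cy * ye + yvf;
    const double thetaVI = std::atan2(uy, ux);

    // theta_VI_dot approximated by a backward difference (our assumption).
    const double thetaVI_dot = wrapAngle(thetaVI - thetaVI_prev) / dt;
    thetaVI_prev = thetaVI;

    // Rotation error and controller (13).
    const double the = wrapAngle(thetaVI - thm);
    vm_cmd = std::sqrt(ux * ux + uy * uy);
    wm_cmd = thetaVI_dot + g.c * the;
}
```

Note that with `atan2` the direction of the desired motion is carried entirely by $\theta_{VI}$, so $v_m$ can remain the nonnegative magnitude given in (13).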

IV. ADAPTIVE WINDOWING VELOCITY ESTIMATION BASED ON FUZZY LOGIC

One important issue for the designed controller is the estimation of the Phantom velocity in (13). $\dot{x}_h$ and $\dot{y}_h$ are the velocities of the Phantom, which also represent the velocity of the human input in each direction. The Phantom haptic device only provides encoder information, which can be converted to the position in Cartesian space. The velocity information is then derived from this position information, which is the so-called velocity estimation problem. This problem arises in a very wide range of control problems, such as motor control, robot manipulator control, etc. [14]. There are two general approaches to solving the velocity estimation problem: model-based or non-model-based techniques. A detailed literature review can be found in [15].

How to choose a suitable method for the velocity estimation problem depends on the specific system. For the proposed system in this paper, several perspectives are considered as distinguishing features compared with other systems.

The refresh rate of the mobile robot control loop is 100 Hz, a relatively low refresh rate compared to some real-time systems. Hence, a substantial amount of computation can be done at this refresh rate.

The development software is Visual Studio C++ 2005, which is a powerful platform for implementing non-model-based techniques, whereas the model-based approaches are more suitable for implementation in the MATLAB/Simulink environment.

    Fig. 3. Diagram of fuzzy logic windowing velocity estimation method.

Based on the above considerations, the non-model-based method is chosen. The velocity $V$ can be calculated using first-order differentiation as

$$
V = \frac{P_k - P_{k-n}}{nT}. \qquad (14)
$$

Here, $P_k$ is the current sampled position, $P_{k-n}$ is the position $n$ samples earlier, $n$ is the size of the window used to estimate the velocity, and $T$ is the sampling time.
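As a concrete illustration of (14), the following C++ sketch maintains a small ring buffer of recent position samples and computes the windowed finite difference for one axis. The `VelocityEstimator` class, its buffer capacity, and the per-axis treatment are illustrative assumptions; only the formula (14) itself comes from the paper.

```cpp
#include <array>
#include <cstddef>

// Windowed first-order velocity estimator implementing (14):
//   V = (P_k - P_{k-n}) / (n * T)
// for a single axis. One instance would be used per Cartesian axis.
class VelocityEstimator {
public:
    explicit VelocityEstimator(double samplePeriod) : T_(samplePeriod) {}

    // Push the newest position sample P_k and return the velocity estimate.
    double update(double position, std::size_t window) {
        buffer_[head_] = position;
        if (count_ < kCapacity) ++count_;

        // Clamp the requested window to the samples actually stored.
        std::size_t n = window;
        if (n >= count_) n = (count_ > 1) ? count_ - 1 : 0;
        if (n == 0) { advance(); return 0.0; }

        // Index of P_{k-n} in the ring buffer.
        const std::size_t oldIdx = (head_ + kCapacity - n) % kCapacity;
        const double v = (position - buffer_[oldIdx]) / (static_cast<double>(n) * T_);
        advance();
        return v;
    }

private:
    static constexpr std::size_t kCapacity = 64;  // assumed maximum window size
    void advance() { head_ = (head_ + 1) % kCapacity; }

    std::array<double, kCapacity> buffer_{};
    std::size_t head_ = 0;
    std::size_t count_ = 0;
    double T_;
};
```

In the proposed system, one such estimator per axis would be fed by the Phantom positions at the 100-Hz control rate, with the window size $n$ supplied by the adaptive law discussed next.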

One of the non-model-based methods for a haptic device is the discrete adaptive window technique [16]. The main idea of this technique is to use an adaptive law based on an uncertainty band to change the size of the window for velocity estimation. However, this adaptive law cannot be implemented in our proposed system: because the refresh rate of the mobile control loop is low, the Phantom velocity changes greatly during the sampling period compared with a high-refresh-rate system, and it is very hard to set a reasonable uncertainty band for this type of system.
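To give a sense of the uncertainty-band idea that [16] relies on (and that the adaptive law proposed below replaces), here is a rough C++ sketch of one common variant: the window is grown sample by sample as long as the straight line through the window endpoints stays within a band of half-width $d$ around every intermediate sample. The function name, the end-fit variant, and the parameter `band` are our assumptions; consult [16] for the exact algorithm.

```cpp
#include <cstddef>
#include <vector>

// Rough sketch of an uncertainty-band adaptive window (in the spirit of [16]):
// grow the window while the straight line through the two window endpoints stays
// within +/- band of every sample in between; return the largest such n.
std::size_t adaptiveWindowSize(const std::vector<double>& positions, // oldest..newest
                               double band,                          // uncertainty half-width d
                               std::size_t maxWindow) {
    if (positions.size() < 2) return 0;          // need at least two samples
    const std::size_t k = positions.size() - 1;  // index of the newest sample
    std::size_t best = 1;

    for (std::size_t n = 2; n <= maxWindow && n <= k; ++n) {
        const double pNew = positions[k];
        const double pOld = positions[k - n];
        bool ok = true;
        // Check each intermediate sample against the endpoint line.
        for (std::size_t i = 1; i < n; ++i) {
            const double expected =
                pOld + (pNew - pOld) * static_cast<double>(i) / static_cast<double>(n);
            const double actual = positions[k - n + i];
            if (actual > expected + band || actual < expected - band) { ok = false; break; }
        }
        if (!ok) break;
        best = n;
    }
    return best;   // use with (14): V = (P_k - P_{k-best}) / (best * T)
}
```

At a 100-Hz loop rate the choice of `band` is exactly the difficulty described above, which motivates replacing this rule with the fuzzy-logic adaptive law of this paper.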

In this paper, a novel adaptive law for changing the estimation window size is proposed. The current position error and its first-order differentia...
