Behavior-based Perceptual Navigation for Semi-autonomous Wheelchair Operations

Hajime Uchiyama and Walter D. Potter
Artificial Intelligence Center, University of Georgia, Athens, GA, USA

Abstract— This paper describes an overview of our semi-autonomous (SA) wheelchair prototype and emphasizes the design of the perceptual navigation system. The goal of our project is to increase independent mobility of individuals who are wheelchair-bound and severely vision impaired, and the scope of the project focuses on developing the perceptual navigation system that is equipped with sonar sensors, a motorized camera system, and a tactile feedback system. We present an overview of a customized Behavior-based control architecture: Behavior Cell and Behavior Network components, and implementation of behaviors such as obstacle notification, free-space finding, and doorway navigation.

Keywords: Robotics, Semi-autonomous Wheelchair, Perceptual Navigation, Behavior-based Control, Tactile Feedback

1. Introduction

Studies have shown that individuals with disabilities, regardless of their age, benefit substantially from access to independent mobility, including electrically powered wheelchairs ([1], [2], [3]). Independent mobility increases vocational or educational opportunities, reduces dependence on caregivers, and promotes a feeling of self-reliance and self-esteem. Unfortunately, some individuals are unable to independently operate a powered wheelchair due to impairments of motor, sensory, or perceptual capabilities, or a combination of these.

In order to assist those individuals, a number of studies have been conducted in the field of assistive technology which combine robotics and artificial intelligence to develop robotized wheelchairs; some try to achieve autonomous control over the wheelchairs, and others aim for semi-autonomous control. Most of these robotized wheelchairs are equipped with a computer and a set of sensors, such as cameras, infrared sensors, ultrasonic sensors, and laser rangers. This assortment of equipment is used to address a number of specific problems such as: obstacle avoidance, local environment mapping, and route navigation.

The CALL Centre of the University of Edinburgh has developed a wheelchair intended to be used by children who do not have the physical, perceptual or cognitive abilities to control an ordinary powered mobility aid [4]. The Smart Wheelchair exhibits collision detection followed by a maneuvering action, line following along tracks laid on the floor, and communication with the user via a speech synthesizer or other feedback system. The TAO series (Applied AI Systems Inc.) is mainly designed for research and development purposes. TAO-7 is equipped with a voice recognition/synthesis interface and performs free-space detection in a crowd and landmark-based navigation in addition to common tasks such as obstacle avoidance [5]. The Tin Man project at the Kiss Institute is aimed at the development of a low-cost robotic wheelchair to aid people with impaired mobility [6]. Tin Man II exhibits manual navigation with obstacle avoidance override, autonomous control, and manual mode. The Wheelesley project at MIT is intended for users unable to manipulate a standard motorized wheelchair and aims to establish a system for both indoor and outdoor navigation. Wheelesley was developed based on the Tin Man model and interacts with the seated user via a graphical user interface [7]. The NavChair system (University of Michigan), one of the most successful systems in the '90s, intends to provide mobility support to users who are unable to drive standard motorized wheelchairs by autonomously selecting three different modes (tasks): obstacle avoidance, door passage, and wall following [8].

Despite these efforts, very few robotized wheelchairs are currently observed in the market, and none are intended to be used outside of a research lab or a training facility. The above studies mainly pursue an autonomous control system. With autonomous control, the system probes the environment, detects an obstacle, plans a navigation route, makes decisions, and fully controls the mobility of the wheelchair, which leaves the user totally dependent upon the equipment.

While some users might be comfortable with an autonomous wheelchair transportation system, others want to be more involved with the process. It is essential for them to feel as if, or actually, they are in control, while being responsible for both the decision-making and motion of everyday transportation activities rather than being a passenger.

1.1 Semi-autonomous Wheelchairs

The purpose of developing a semi-autonomous (SA) wheelchair system, while it comprises hardware equipment similar to autonomous wheelchairs, focuses on providing just as much assistance as the user really needs, in order to enhance the level of autonomy of the user. Generally, the type of assistance varies from one user to another; therefore, the specification of an SA wheelchair system needs to be determined based on the user's physical conditions.

A number of research projects have been actively conducted to provide various types of semi-autonomous wheelchairs. Borgolte et al. provided omnidirectional maneuverability to allow an intuitive semi-autonomous control system to assist people with severe or multiple handicaps [9]. Argyros et al. presented a semi-autonomous navigation system comprising a panoramic vision system to support users with limited motor control of the upper extremities [10]. The Mobile Internet Connected Assistant (MICA) project designed a system in which the user either locally or remotely controls the chair by means of head-held sensors [11]. Although using a slightly different platform, a power-assisted manual wheelchair,1 Simpson et al. demonstrated the first prototype which attempts to provide users who have visual impairments with some basic navigation tasks such as collision avoidance [12].

However, none of these systems provide or enhance perceptual capabilities for the seated user who has severe visual impairment. The goal of our SA wheelchair project is to increase independent mobility of individuals who are wheelchair-bound and severely vision impaired by providing a perceptual navigation system. The scope of the project focuses on developing the perceptual navigation system with certain autonomy, but not on controlling the movement of the wheelchair, in order to enable the user to maintain the maximum control over the wheelchair movement.

2. Design

2.1 Behavior Cell Approach

Since the SA wheelchair system needs to be individualized to the user, our design principle shall be based upon modularity and flexibility. Further, due to the nature of the navigational tasks required during wheelchair operations, which must be resolved quickly and handled concurrently, the behavior-based control (BBC) architecture ([13], [14], [15]) is suitable for the base architecture of our design. Utilizing a customized BBC architecture, we define behaviors which emerge from an agent-environment interaction based on a number of loosely tied processes that run asynchronously and respond to problems in a parallel manner.

In our project, behaviors are realized to accomplish their tasks through user-machine cooperation. Two cooperating behaviors, the Perceptual Behaviors and Navigational Behaviors, acquire the state of the environment through sensors, interpret or respond to the state, and pass the navigational signals to the user. Finally, the user manipulates the wheelchair in combination with his/her own perception and judgment (Manipulative Behaviors).

1 Based on a manual wheelchair, the rear wheel hubs are replaced with motorized ones that magnify and/or adjust the propulsive force provided by the user.

Fig. 1: The basic structure of a Behavior Cell.

2.1.1 Behavior Cell

We present a unit of the behavioral structure, a Behavior Cell, which is based upon the BBC architecture with the extended input/output feature. A Behavior Cell consists of an input/output (I/O) component, a behavioral function component, and an internal storage component (Figure 1). It structurally resembles an artificial neuron; however, it has a logical gate in addition to widely extended functions such that the innervation link between cells can run by both Boolean and numeric means.

The I/O component consists of a subset of the I/O ports characterized as follows: Port-EB, excitatory inputs; Port-IB, inhibitory inputs; Port-DI, sensory/behavioral inputs; Port-RS, a reset signal; Port-IS, an innervation signal; Port-AB, an activation output; Port-EO, an effect output; and Port-AO, actuator outputs. The excitatory and inhibitory inputs are linked to the corresponding behaviors' activation output ports. When any activation/inhibition conditions are met, the behavior is activated/deactivated. Our architecture allows both Port-EB and Port-IB to specify activation (inhibition) conditions by using logical expressions.

Port-DI takes various types of data inputs from sensors or other behaviors (effect outputs). When Port-IS receives an innervation signal from outside, the behavior checks or sends its inputs and outputs. If Port-RS receives a reset signal, the behavior will clear all or specified dynamic data. Port-AB contains an activation value (binary) that is linked to the value of Port-EB. Port-EO contains an effect value that is derived from the behavioral function. If the behavior is connected to its effector(s), Port-AO sends Action Outputs to them.

The behavioral function component provides a flexible activation/computation functionality, such as algebraic sum, sigmoid, Gaussian, and logical expressions, as well as a simple by-pass function (e.g., a direct link between inputs and outputs). More complicated functionalities, such as fuzzy logic inference operators or artificial neural networks, can also be implemented.

The storage component provides a storing capability of the current state onto its dynamic data, which enables the behavior to achieve goals that contain temporal sequences. It may also contain internal static data which all instantiated behaviors can share and refer to, as well as individual constant data that a behavior utilizes as permanent reference information, such as threshold values or look-up tables.

The activation/computation process performed by a Behavior Cell is as follows (a code sketch follows the list):

1) When Innervation Input (Port-IS) receives a signal, check the value of Excitatory Inputs (Port-EB). If true, set Activation Output (Port-AB) to 1 (true) and go to the next step; otherwise return.

2) Check the value of Inhibitory Inputs (Port-IB) to see whether the behavior is inhibited. If false, go to the next step; otherwise set Activation Output (Port-AB) to 0 (false) and return.

3) Check the value of Reset Input (Port-RS), and if true, clear the dynamic data.

4) In case of using Port-EO: using the information from Sensor/Behavior Inputs (Port-DI), derive the return value from the behavioral function, write this value to Effect Output (Port-EO), and return. Store the necessary data in the internal memory if so designed.

5) In case of using Port-AO: similar to (4), derive the return action commands from the behavioral function, send the commands to the effectors via Action Outputs (Port-AO), and return. Store the necessary data in the internal memory if so designed.
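As a concrete illustration of the ports and the five steps above, here is a minimal Python sketch of a Behavior Cell; the class layout, hook signatures, and names are our illustrative assumptions, not the prototype's actual code.

from typing import Any, Callable, Dict, List, Optional

class BehaviorCell:
    """Minimal sketch of a Behavior Cell: I/O ports, a behavioral
    function component, and an internal storage component."""

    def __init__(self,
                 function: Callable[[Dict[str, Any], Dict[str, Any]], Any],
                 effectors: Optional[List[Callable[[Any], None]]] = None):
        self.port_eb: Callable[[], bool] = lambda: True    # excitatory inputs (logical expression)
        self.port_ib: Callable[[], bool] = lambda: False   # inhibitory inputs (logical expression)
        self.port_di: Dict[str, Any] = {}                  # sensory/behavioral data inputs
        self.port_rs: bool = False                         # reset signal
        self.port_ab: int = 0                              # activation output (binary)
        self.port_eo: Any = None                           # effect output
        self.function = function                           # behavioral function component
        self.dynamic: Dict[str, Any] = {}                  # storage component (dynamic data)
        self.effectors = effectors or []                   # targets of actuator outputs (Port-AO)

    def innervate(self) -> None:
        """Run the activation/computation process (signal on Port-IS)."""
        if not self.port_eb():          # 1) check excitatory inputs
            return
        self.port_ab = 1
        if self.port_ib():              # 2) check inhibitory inputs
            self.port_ab = 0
            return
        if self.port_rs:                # 3) reset dynamic data if requested
            self.dynamic.clear()
            self.port_rs = False
        # 4)/5) derive the result from the behavioral function, write it to
        # Port-EO, and forward it to any connected effectors via Port-AO.
        self.port_eo = self.function(self.port_di, self.dynamic)
        for effector in self.effectors:
            effector(self.port_eo)

A by-pass cell, for instance, would use function=lambda di, mem: di to copy its data inputs straight to its effect output.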

2.1.2 Behavior Network

Similar to other Behavior-Based architectures (for instance, [16]), our approach also enables behaviors to consist of other behaviors (Behavior Network). In a Behavior Network, behaviors communicate with each other through their port-to-port links, and precondition dependence characterizes the links; thus, the activation of a behavior is dependent on its pre-conditional links.

An individual Behavior Cell can connect to multiple Behavior Cells, and similarly, multiple Behavior Cells can be linked to a single Behavior Cell. This multiple-connectivity allows a Behavior Cell to be a member of multiple Behavior Networks, which makes component-like behaviors, such as interfaces to the sensors, reusable. Containing multiple task-oriented/reactive behaviors (functional cells) enables a Behavior Network to accomplish various tasks, such as command arbitration, learning, and planning, while asynchronously performing tasks within the distributed architecture.

In order for a Behavior Network to be structurally equivalent to a Behavior Cell when observed from outside, each Behavior Network must contain a set of specific types of Behavior Cells: Boolean I/O cells to handle the excitatory/inhibitory signals, and activation I/O cells for scheduling the sequence of activating functional cells in the Behavior Network.

Fig. 2: Schematic diagram of I/O links in a Behavior Network.

The functional cells deal with data and/or actuator communication. Connections between functional cells consist of excitatory links (between Port-AB and Port-EB), inhibitory links (between Port-AB and Port-IB), (sensory and/or behavioral) data links (between Port-EO and Port-DI), and actuator links (between Port-AO and effectors). A functional cell in a Behavior Network can also be another Behavior Network.

Figure 2 illustrates a generic Behavior Network that consists of I/O cells and functional cells. When an innervation cell first receives a signal, it sends signals to the Excitatory Inputs cell, Inhibitory Inputs cell, and Activation Output cell in this order. When the Activation Output cell receives the innervation signal, it also receives the excitatory and inhibitory signals from the Excitatory/Inhibitory Inputs cells; therefore, it can compute its activation value, which in turn represents the activation status of the whole Behavior Network. The activation value is written to Port-AB (activation output) of the Activation Output cell, and if the value is true, the Activation Output cell starts sending the innervation signals to the functional cells in a predefined order; otherwise the whole process will return, and thus the Behavior Network will be deactivated.
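Under the same illustrative conventions as the Behavior Cell sketch above, the propagation just described can be summarized as follows; reducing the Boolean I/O cells to callables is an assumption made for brevity.

class BehaviorNetwork:
    """Sketch of a Behavior Network; from the outside it exposes the
    same innervate() interface as a single Behavior Cell."""

    def __init__(self, excitatory, inhibitory, functional_cells):
        self.excitatory = excitatory              # Excitatory Inputs cell (as a callable)
        self.inhibitory = inhibitory              # Inhibitory Inputs cell (as a callable)
        self.functional_cells = functional_cells  # innervated in a predefined order
        self.port_ab = 0                          # activation status of the whole network

    def innervate(self) -> None:
        # The Activation Output cell combines the excitatory and inhibitory
        # signals into the activation value of the whole Behavior Network.
        self.port_ab = 1 if (self.excitatory() and not self.inhibitory()) else 0
        if not self.port_ab:
            return  # the whole network is deactivated
        # Otherwise, send the innervation signals to the functional cells in
        # the predefined order; a functional cell may itself be a network.
        for cell in self.functional_cells:
            cell.innervate()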

2.2 System Components

In this section, we briefly overview the system components of our SA wheelchair prototype. The major add-on hardware to the base wheelchair (Invacare Nutron® R32) comprises stationary ranging modules, motorized vision modules, and the tactile feedback module.

The stationary ranging module is a collection of ultrasonic sensors (Devantech SRF-08) which are attached to the wheelchair and constantly measure distances to any objects that are within the scope of the sensors. Six sonars are mounted on the front side, and two sonars are placed on the rear side, as illustrated in Figure 3.

Fig. 3: Schematic diagram of installation of sonars (front left/right, front-side left/right, side left/right, back left/right).

Fig. 4: Motorized vision module (pan/tilt servos, laser line generator, camera, sonar).

We designed a customized motorized vision module (Figure 4) which consists of low-cost components: a webcam (Logitech QuickCam® Pro 4000), a laser line generator, and tilt/pan servo motors. It is capable of tilting (0–135°) and panning (0–360°) and acquires depth data as well as scenery images.

The above sensor system is controlled by a micro-controller (Rowley Associates Ltd. CrossFire® LPC-2138 evaluation kit) which is connected to the host computer via a USB port.

The tactile feedback module is designed to convey vibrotactile signals to assist users in manipulating the SA wheelchair. We have developed a Vibrotactile Glove which consists of an array of vibrotactors inside the glove on the back side of the hand (Figure 5). Each vibrotactor is iconically mapped to represent the spatial positioning around the wheelchair, such as vibrotactor 1 for front; 3, left; 5, right; 7, rear; 4, center; and so forth. A vibrotactor generates a vibrotactile stimulus consisting of a rapid vibration lateral to the skin surface, and its characteristic is determined by the duration of the stimulus and the inter-stimulus interval. A sequential stimulus consists of a sequence of multiple vibrotactile stimuli generated by different vibrotactors. We also designed a motor-array controller which controls the vibrotactors based on commands sent from the host computer via RS-232C.

Fig. 5: Vibrotactors on the Vibrotactile Glove.
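To illustrate the iconic mapping and the stimulus parameters described above, the sketch below plays a sequential stimulus through a stand-in for the motor-array controller; the tactor indices follow the text, but the timing values and the send interface are invented for the example.

import time

# Iconic mapping from spatial position to vibrotactor index (from the text).
TACTOR = {"front": 1, "left": 3, "center": 4, "right": 5, "rear": 7}

def sequential_stimulus(positions, duration_s=0.2, interval_s=0.1):
    """A sequential stimulus: one (tactor, duration, inter-stimulus
    interval) triple per spatial position (timings are illustrative)."""
    return [(TACTOR[p], duration_s, interval_s) for p in positions]

def play(stimuli, send):
    """Drive the vibrotactors; send(tactor, on) stands in for the real
    RS-232C command protocol, which is not specified in the paper."""
    for tactor, duration_s, interval_s in stimuli:
        send(tactor, True)
        time.sleep(duration_s)
        send(tactor, False)
        time.sleep(interval_s)

# Example: cue "front-left" as front followed by left.
play(sequential_stimulus(["front", "left"]),
     send=lambda tactor, on: print(f"tactor {tactor} {'on' if on else 'off'}"))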

3. Behavior Implementation

The perceptual behaviors of interest in this paper are Obstacle notification, Free-space finding, and Doorway navigation, each of which is a Behavior Network consisting of a set of other Behavior Cells or Behavior Networks. The navigational behavior is represented as the Navigation command manager, and the Sensor command manager manages the commands for the sensor-related effectors (servo motors and the laser line generator). Figure 6 illustrates the schematic diagram of the perceptual and navigational behaviors accompanied by the sensor-effector systems. The thin arrows represent data links, and the thick arrows represent actuator links; other port-to-port links are not shown. The following subsections briefly describe the implementation of the Obstacle notification, Free-space finding, Doorway navigation, Sensor command manager, and Navigation command manager behaviors.

3.1 Obstacle Notification

The Obstacle notification behavior is a reactive behavior which takes sensor readings from the ultrasonic sensors and maps them directly into vibrotactile representations. It consists of one subordinate behavior, Sonar reader, and two internal functions: decodePacket and translateData (Figure 7). The Sonar reader behavior is a component-like behavior which reads ranging data from the ultrasonic sensors. The decodePacket function decodes the data packet from the Sonar reader behavior into an array of ranging data. The translateData function interprets a numeric value of ranging data into a discrete representation which expresses the distance of an obstacle within the scope of each sensor.
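A minimal sketch of the two internal functions, assuming an SRF-08-style packet of big-endian 16-bit centimeter readings and three distance bands; the packet layout and the thresholds are our assumptions, not the authors' values.

# Six front-side sonars, in an assumed packet order (cf. Figure 3).
FRONT_SONARS = ["front_left", "front_right", "front_side_left",
                "front_side_right", "side_left", "side_right"]

def decode_packet(packet: bytes) -> list:
    """Decode a raw data packet from the Sonar reader behavior into an
    array of ranging values (16-bit big-endian centimeters assumed)."""
    return [int.from_bytes(packet[i:i + 2], "big") for i in range(0, len(packet), 2)]

def translate_data(range_cm: int) -> str:
    """Interpret a numeric ranging value as a discrete representation of
    obstacle distance (band thresholds are illustrative only)."""
    if range_cm < 50:
        return "immediate"
    if range_cm < 150:
        return "near"
    return "clear"

# Example: one reading per sensor -> discrete obstacle map.
readings = decode_packet(bytes.fromhex("002000a0012c004000f00190"))
obstacle_map = dict(zip(FRONT_SONARS, map(translate_data, readings)))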

Fig. 6: Schematic diagram of the perceptual and navigational behaviors (thin arrows: data links, thick arrows: actuator links).

Fig. 7: Schematic diagram of the Obstacle notification behavior.

3.2 Free-space Finding

The Free-space finding behavior is, by default, another reactive behavior relating to the sensor readings, but its task is different from that of the Obstacle notification behavior. Instead of notifying of obstacles, it reports a list of orientations in which no immediate obstacles are found. This behavior is particularly effective for finding a way out when the SA wheelchair is surrounded by many obstacles.

It consists of an internal function, decodePacket, and three subordinate behaviors: Sonar reader, Range finder, and Way finder (Figure 8). The Free-space finding behavior primarily relies on sonar readings from the Sonar reader behavior, and utilizes the Way finder behavior to find a list of obstacle-free orientations. If the Way finder behavior finds that the sonar readings do not provide enough information, it will send a request to the Sensor command manager behavior to move the pan servo motor and will also invoke the Range finder behavior to acquire more accurate depth information.

Since the Range finder behavior is connected to multiple behaviors, the Sensor command manager behavior may not allow the Way finder behavior to control the Range finder behavior, in order to avoid conflicts. If that is the case, the Sensor command manager behavior will send an inhibitory signal to Port-IB of the Free-space finding behavior, which is directly connected to the same port of the Way finder behavior. The Way finder behavior will then return a list that contains a null value to indicate that no available orientations are found.
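The core of the Way finder behavior can be sketched as below, assuming each sonar covers a fixed bearing and that an orientation counts as free when its reading exceeds a clearance threshold; the bearings and the threshold are illustrative.

# Assumed bearing per sonar, in degrees (0 = straight ahead, negative = left).
SONAR_BEARINGS = {"side_left": -90, "front_side_left": -45, "front_left": -15,
                  "front_right": 15, "front_side_right": 45, "side_right": 90}

def find_free_space(ranges_cm: dict, clearance_cm: int = 150,
                    inhibited: bool = False) -> list:
    """Return the list of obstacle-free orientations (bearings, in degrees).
    When the behavior is inhibited via Port-IB, return a list containing a
    null value, as described in the text."""
    if inhibited:
        return [None]
    return sorted(bearing for sonar, bearing in SONAR_BEARINGS.items()
                  if ranges_cm.get(sonar, 0) >= clearance_cm)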

Fig. 8: Schematic diagram of the Free-space finding behavior.

Fig. 9: Schematic diagram of a maneuvering trajectory for passing through a doorway.

3.3 Doorway Navigation

The task of passing through a doorway, in general, comprises several subtasks addressing the dynamic determination of a maneuvering trajectory and decision making which relies on high-level cognitive processes. Some previous studies demonstrate the creation of a local map in order to generate a maneuvering trajectory for an autonomous vehicle ([17], [18]). Such maneuvering trajectories typically shape a smooth curvature, as illustrated in Figure 9. However, generating such trajectories requires much finer angular guidance than the vibrotactile signals can convey.

Our guidance approach utilizes the Pivotal Point/Zone, an intermediate goal point (area) from which the wheelchair can straightforwardly move toward the doorway, to create a Pivotal-zone trajectory that consists of a series of simplified wheelchair movements. In contrast with the maneuvering trajectory in Figure 9, the Pivotal-zone trajectory in Figure 10 consists of distinctly separated components: a series of nearly straight lines and a sharp turning point. By splitting the trajectory into these components, our approach enables each step of the wheelchair movement to be translatable into a simple and intuitive vibrotactile representation, such as "go front-left," "swivel clockwise," or "go forward."

Fig. 10: Schematic diagram of a Pivotal-zone trajectory for passing through a doorway.

Fig. 11: Schematic diagram of the Doorway navigation Behavior Network.

Figure 11 illustrates the Doorway navigation behavior, which comprises the following behaviors: Doorway detection, Range finder, Door validation, Object tracker, and Path planner. The Doorway detection behavior searches for a doorway candidate in the image stream by classifying the presence of a doorframe. If it finds a candidate, the Door validation behavior is invoked to confirm the validity of the candidate. The Door validation behavior then activates the Range finder behavior, which measures the depth information in the image by detecting the horizontally emitted laser line based on the triangulation principle. Both the Doorway detection and Range finder behaviors constitute computer-vision based algorithms to accomplish their tasks, but the details are beyond the scope of this paper.

Once the doorway is confirmed, the Object tracker behavior traces the door in order to keep the camera fixated on the doorway. In the meantime, the Path planner behavior is invoked to undertake navigation to a pivotal zone, which comprises a repetitive process of localizing the wheelchair, adjusting the path, and sending a signal indicating the orientation to the user, until the wheelchair reaches that zone. At the pivotal zone, the Path planner guides the user to swivel until the wheelchair straightforwardly faces the doorway and then to move forward.
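The repetitive localize/adjust/signal cycle can be sketched as a two-phase loop; localize() is a hypothetical routine returning the pivotal zone offset and the heading error relative to the doorway, and the tolerances and cue thresholds are illustrative.

import math

def guide_to_doorway(localize, signal, zone_radius_m=0.3, heading_tol_deg=10.0):
    """Pivotal-zone guidance loop (sketch). localize() is assumed to return
    (dx, dy, heading_deg): the pivotal zone offset in meters (dx forward,
    dy to the left) and the heading error to the doorway in degrees.
    signal(cue) forwards a cue to the Navigation command manager."""
    # Phase 1: localize, adjust the path, and cue the user until the
    # wheelchair reaches the pivotal zone.
    while True:
        dx, dy, _ = localize()
        if math.hypot(dx, dy) <= zone_radius_m:
            break
        bearing = math.degrees(math.atan2(dy, dx))  # positive = zone to the left
        if bearing > 22.5:
            signal("go front-left")
        elif bearing < -22.5:
            signal("go front-right")
        else:
            signal("go forward")
    # Phase 2: at the pivotal zone, swivel until the chair squarely faces
    # the doorway, then move straight through it.
    while True:
        _, _, heading_deg = localize()
        if abs(heading_deg) <= heading_tol_deg:
            break
        signal("swivel clockwise" if heading_deg > 0 else "swivel counterclockwise")
    signal("go forward")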

3.4 Sensor command manager

In case a functional cell connected to multiple cells must uniquely communicate with one cell at a time, for instance, if a camera-controlling cell is connected to two different, possibly conflicting Behavior Cells, the Sensor command manager behavior handles the task in a manner similar to a hardware interrupt, based on a predefined priority list of the linked Behavior Cells. The Sensor command manager behavior receives input commands from the Doorway navigation and the Free-space finding behaviors; more specifically, from the Object tracker and Range finder behaviors in the Doorway navigation behavior, and the Way finder and Range finder behaviors in the Free-space finding behavior.

Since the Doorway navigation and Free-space finding behaviors have different areas of interest for ranging, the Range finder behavior must be exclusively controlled by either one of them. The priority is given to the Doorway navigation behavior; therefore, when the Object tracker and Range finder behaviors in the Doorway navigation behavior are active, a control request of the servo motors from the Free-space finding behavior is suppressed. The Sensor command manager behavior achieves this task by sending an inhibitory signal to the Free-space finding behavior.
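The interrupt-like arbitration reduces to a predefined priority list and an inhibitory signal to whichever requester loses; the ordering follows the text, while the class interface is an invented illustration.

# Predefined priority, highest first: Doorway navigation wins over
# Free-space finding, as stated in the text.
PRIORITY = ["doorway_navigation", "free_space_finding"]

class SensorCommandManager:
    """Grant exclusive control of a shared resource (e.g., the Range finder
    and pan servo) to the highest-priority active requester."""

    def __init__(self):
        self.active = set()

    def request(self, behavior: str) -> bool:
        """Register a control request; a False result plays the role of the
        inhibitory signal sent to the losing behavior's Port-IB."""
        self.active.add(behavior)
        return behavior == min(self.active, key=PRIORITY.index)

    def release(self, behavior: str) -> None:
        self.active.discard(behavior)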

3.5 Navigation command manager

The Navigation command manager behavior accepts navigation commands from the Doorway navigation, Free-space finding, and Obstacle notification behaviors, arbitrates the commands, and generates a vibrotactile representation. It consists of four internal functions: gateSignals, selectSignal, encodePacket, and mergeCommands (Figure 12). Let Command_door denote the command from the Doorway navigation behavior; Command_free, the Free-space finding behavior; and Command_obs, the Obstacle notification behavior, respectively.

The gateSignals function accepts Command_free, which will be one of the following: a list of obstacle-free orientations, the "stop-the-chair" signal (no free space is found), or a null code (the Doorway navigation behavior is active). The "stop-the-chair" signal is sent to the selectSignal function, and the null code is sent to the mergeCommands function. If Command_free consists of a list of obstacle-free orientations, it will be sent to both.

By default, Command_obs is sent to the encodePacket function to generate a series of control codes (a vibrotactile representation) to be passed to the motor-array controller. However, when more than two immediate obstacles are found in Command_obs, the selectSignal function chooses the most appropriate orientation from the list of obstacle-free orientations in Command_free.

Fig. 12: Schematic diagram of the Navigation command manager behavior.

When the Doorway navigation behavior is activated, both Command_door and Command_free are sent to the mergeCommands function to find an orientation that satisfies both constraints. If Command_free consists of a null code, the mergeCommands function simply chooses Command_door to be sent to the encodePacket function. If Command_free consists of a list of obstacle-free orientations, the mergeCommands function tries to find the closest orientation to the one from Command_door. Once the mergeCommands function issues a new command, the encodePacket function only takes that command and ignores Command_obs.
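The mergeCommands logic reduces to two cases; a sketch under the same conventions as above, with orientations in degrees, None as the null code, and Command_door given as a single orientation (an assumption):

def merge_commands(command_door, command_free):
    """Reconcile the doorway-guidance orientation with the list of
    obstacle-free orientations from Free-space finding."""
    if command_free is None:
        # Null code: no free-space constraint, pass Command_door through.
        return command_door
    # Otherwise choose the obstacle-free orientation closest to the one
    # requested by the Doorway navigation behavior.
    return min(command_free, key=lambda o: abs(o - command_door))

# Example: guidance says 30 degrees right, free space at -45, 0, and 45.
assert merge_commands(30, [-45, 0, 45]) == 45
assert merge_commands(30, None) == 30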

The encodePacket function generates a control code to be executed by the Vibrotactile motor controller behavior to generate a vibrotactile signal on the Vibrotactile Glove. The vibrotactile signals consist of warning signals (e.g., "stop-the-chair"), spatial representations (the presence of obstacles), and directional guidance.

4. Current Status

Currently we are in the process of implementing the behaviors and are close to completion. Experimental field tests of the SA wheelchair operation are underway.

5. Conclusion

In this paper, we proposed a customized BBC architecture to design the perceptual behaviors for SA wheelchair operations. Although the experimental field tests are still underway, the Behavior Cell and Behavior Network components provide flexibility, extensibility, and polymorphism in design, which enables a more intuitive implementation of a planning-oriented task for our SA wheelchair project compared to a standard BBC architecture.

References

[1] J. Douglass and M. Ryan, "A pre-school severely disabled boy and his powered wheelchair: A case study," Child: Care, Health and Development, vol. 13, pp. 303–309, 1987.

[2] K. Paulsson and M. Christoffersen, "Psychosocial aspects of technical aids – how does independent mobility affect the psychosocial and intellectual development of children with physical difficulties?" in Proc. 2nd Int. Conf. of the Rehabilitation Engineering Society of North America, Ottawa, 1989, pp. 282–285.

[3] G. Verburg, L. Balfour, E. Snell, and S. Naumann, "Mobility training in the home and school environment for persons with developmental delay," in Final Report to Ontario Mental Health Foundation and Ministry of Community and Social Services' Research and Program Evaluation Unit, 1991.

[4] J. Odor and M. Watson, "Learning through smart wheelchairs: A formative evaluation of the effective use of the CALL Centre's smart wheelchairs as part of children's emerging mobility, communication, education and personal development," Final Report to The Nuffield Foundation and the Scottish Office Education Department, Tech. Rep., May 1994.

[5] T. Gomi and A. Griffith, "Developing intelligent wheelchairs for the handicapped," in Assistive Technology and Artificial Intelligence: Applications in Robotics, User Interfaces and Natural Language Processing, ser. Lecture Notes in Computer Science, V. O. Mittal, H. A. Yanco, J. M. Aronis, and R. C. Simpson, Eds., vol. 1458. New York: Springer, 1998, pp. 150–178.

[6] D. P. Miller and M. G. Slack, "Design and testing of a low-cost robotic wheelchair prototype," Autonomous Robots, vol. 2, no. 1, pp. 77–88, Mar. 1995.

[7] H. A. Yanco, "Wheelesley: A robotic wheelchair system: Indoor navigation and user interface," Lecture Notes in Computer Science, vol. 1458, p. 256, 1998.

[8] S. P. Levine, D. A. Bell, L. A. Jaros, R. C. Simpson, Y. Koren, and J. Borenstein, "The NavChair assistive wheelchair navigation system," IEEE Transactions on Rehabilitation Engineering, vol. 7, pp. 443–451, 1999.

[9] U. Borgolte, H. Hoyer, C. Bühler, H. Heck, and R. Hoelper, "Architectural concepts of a semi-autonomous wheelchair," Journal of Intelligent and Robotic Systems, vol. 22, no. 3, pp. 233–253, Jul. 1998.

[10] A. A. Argyros, P. Georgiadis, P. Trahanias, and D. P. Tsakiris, "Semi-autonomous navigation of a robotic wheelchair," Journal of Intelligent and Robotic Systems, vol. 34, pp. 315–329, Mar. 2002.

[11] S. Rönnbäck, J. Piekkari, K. Hyyppä, T. Berglund, and S. Koskinen, "A semi-autonomous wheelchair towards user-centered design," in ICCHP, ser. Lecture Notes in Computer Science, K. Miesenberger, J. Klaus, W. L. Zagler, and A. I. Karshmer, Eds., vol. 4061. Springer, 2006, pp. 701–708.

[12] R. Simpson, E. LoPresti, S. Hayashi, S. Guo, D. Ding, W. Ammer, V. Sharma, and R. Cooper, "A prototype power assist wheelchair that provides for obstacle detection and avoidance for those with visual impairments," Journal of NeuroEngineering and Rehabilitation, vol. 2, no. 1, p. 30, 2005.

[13] R. C. Arkin, Behavior-Based Robotics. Cambridge, MA: The MIT Press, 1998.

[14] R. A. Brooks, "Integrated systems based on behaviors," SIGART Bulletin, vol. 2, no. 4, pp. 46–50, 1991.

[15] M. J. Mataric, "Behavior-based control: Main properties and implications," in Proc. IEEE Int. Conf. on Robotics and Automation, Workshop on Architectures for Intelligent Control Systems, Nice, France, May 1992, pp. 46–54.

[16] M. N. Nicolescu and M. J. Mataric, "A hierarchical architecture for behavior-based robots," in Proceedings of the First International Joint Conference on Autonomous Agents and Multiagent Systems (AAMAS'02), M. Gini, T. Ishida, C. Castelfranchi, and W. L. Johnson, Eds. ACM Press, 2002.

[17] S. Patel, S.-H. Jung, J. P. Ostrowski, R. Rao, and C. J. Taylor, "Sensor based door navigation for a nonholonomic vehicle," in IEEE International Conference on Robotics and Automation, Washington, DC, May 2002, pp. 3081–3086.

[18] H. Surmann, A. Nüchter, and J. Hertzberg, "An autonomous mobile robot with a 3D laser range finder for 3D exploration and digitalization of indoor environments," Robotics and Autonomous Systems, vol. 45, no. 3–4, pp. 181–198, 2003.