
[IEEE 1976 Conference on Decision and Control including the 15th Symposium on Adaptive Processes, Clearwater, FL, USA, December 1-3, 1976]



WP4 - 4:30 PATTERN RECOGNITION AND CONTROL IN MANIPULATION*)

Antal K. Bejczy**)
Member of the Technical Staff
Jet Propulsion Laboratory
California Institute of Technology
Pasadena, California

Rajko Tomovic***)
Visiting Professor
Department of Computer Science
University of California
Los Angeles, California

Abstract

A new approach to the use of sensors in manipulator or robot control is discussed. The concept addresses the problem of contact or near-contact recognition of the three-dimensional forms of objects by proprioceptive and/or exteroceptive sensors integrated with the terminal device. This recognition of object shapes both enhances and simplifies the automation of object handling. Several examples have been worked out for the "Belgrade hand" and for a parallel jaw terminal device, both equipped with proprioceptive (position) and exteroceptive (proximity) sensors. The control applications are discussed in the framework of a multilevel man-machine system control. The control applications create interesting new issues which, in turn, invite novel theoretical considerations. An important issue is the problem of stability in control when the control is referenced to patterns.

Introduction

Non-contact pattern recognition, especially the classification of visual forms, has so far been the focus of research interest. This is understandable, since the visual channel transmits very rich and valuable information from the outside world. Tactile and force information are certainly less rich in content than visual information, but in certain instances their role becomes very important and critical. This is the case in robotics and manipulator control problems in general, and especially so in medical applications, where anthropomorphic robotic devices play an increasing role in rehabilitation engineering.

Advances in robotics and in manipulator control have stimulated pattern recognition research of the contact type. In general, man-environment physical interaction is detected by two types of sensors: proprioceptive and exteroceptive ones. The first type provides information about static and dynamic joint states such as joint angles, speed, acceleration, and forces or torques acting at the joints. Exteroceptive sensors provide information about the physical features of the environment: contact pressure distribution or distribution of contact forces and torques, temperature, stiffness, roughness, and slippage. Both types of sensors are needed in robotics and manipulator control, although their principles of operation and physical aspects may have nothing in common with corresponding biological transducers.

Certain types of proprioceptive and exteroceptive sensors have already been used in manipulator development. The effect of exteroceptive sensors on manipulator control performance has been reviewed elsewhere recently (Ref. 1). The importance of force feedback in industrial modular assembly has been extensively studied (Ref. 2). Shape recognition by a rotating belt terminal device for object orientation was proposed recently (Ref. 3). The use of non-contact, visual pattern recognition methods has been part of most projects in modern robotics (see e.g. Refs. 4 and 5).

In this paper a new approach to the use of sensors in manipulator or robot control is considered. Emphasis is laid on the use of contact or near-contact recognition of the three-dimensional forms of objects by proprioceptive and/or exteroceptive sensors integrated with the terminal device. This type of object shape recognition both enhances and simplifies the automation of object handling in two ways: (1) manipulator control algorithms can be referenced directly to "patterns," and (2) logical rather than state space models can be employed for control. The versatility of contact or near-contact pattern recognition and control algorithms depends directly on the interaction between the mechanical dexterity (degrees of freedom) of the terminal device and the richness of proprioceptive and exteroceptive information generated by sensors integrated with the terminal device.

The first part of the paper discusses pattern recognition by upper extremities. Examples are described which have been worked out for the "Belgrade hand" and for a parallel jaw terminal device, both equipped with proprioceptive (position) and exteroceptive (proximity) sensors. It is pointed out that the control applications create interesting new issues which, in turn, invite novel theoretical considerations. An important problem is stability when the control is referenced to patterns.

*) Part of this work represents one phase of research carried out at the Jet Propulsion Laboratory, California Institute of Technology, under Contract No. NAS7-100, sponsored by the National Aeronautics & Space Administration.

**) Senior Member IEEE. ***) Prof. of Control Engineering, Faculty of Electrical Engineering, University of Belgrade, Yugoslavia.



The second part of the paper briefly discusses pattern recognition by lower extremities. Although we are not used to thinking in this way, the foot is used, in fact, as another terminal device for pattern recognition, supported by the leg as the hand is by the arm. It is pointed out that artificial locomotion systems for neuromuscular dysfunctions of lower extremities depend on data processing generated by foot-ground contact (Ref. 6). Therefore, a new possibility of applying pattern recognition techniques to features such as level ground, slope, uneven terrain, etc. is discussed.

The third part of the paper deals very briefly with recognition of non-geometric patterns or features as they are related to manipulation, especially in medical applications.

In conclusion, the control examples and issues are briefly discussed in the framework of a multi- level man-machine system control.

A. Pattern Recognition by Upper Extremities

Pattern recognition research in this area has started from a very modest level. To understand this statement better, the pattern recognition problem for upper extremities is decomposed into several hierarchical levels according to the degrees of freedom of the terminal device and the availability of sensors.

The lowest level in hardware complexity can be illustrated as follows. The terminal device is just one rigid finger equipped only with a point-like contact sensor, as shown in Fig. 1. Since in the static case such a terminal device can provide only one bit of information to the operator or to the computer, the recognition process can produce only a simple answer (yes or no) as to whether there is free space or a barrier in front of the terminal device. However, by transforming the static situation into a dynamic one, more complex pattern recognition problems can be solved even with such a primitive terminal arrangement. The obvious solution to derive more information about the shape of an object is to use contour scanning in two or three dimensions. It is also clear that non-contact scanning methods such as, for example, electro-optical proximity sensors or ultrasonic devices can replace contact scanning.

Figure 1. Primitive Tactile Sensing
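The contour-scanning idea above can be made concrete with a short sketch. The grid model, function names, and dimensions below are all hypothetical illustrations, not from the paper; the point is only that a sequence of one-bit contact probes, taken over a scan, recovers an object outline:

```python
# Sketch: reconstructing an object outline by sweeping a one-bit
# contact/no-contact sensor over a 2-D grid of probe points.

def scan_contour(occupied, width, height):
    """occupied(x, y) returns True (barrier) or False (free space) --
    the single bit the point sensor provides per probe."""
    hits = set()
    for y in range(height):
        for x in range(width):
            if occupied(x, y):
                hits.add((x, y))
    # Keep only boundary cells: occupied cells with at least one free neighbor.
    contour = set()
    for (x, y) in hits:
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            if (x + dx, y + dy) not in hits:
                contour.add((x, y))
                break
    return contour

# Usage: a 3x3 square block occupying cells (2,2)..(4,4) in an 8x8 workspace.
contour = scan_contour(lambda x, y: 2 <= x <= 4 and 2 <= y <= 4, 8, 8)
```

One static probe yields one bit; the scan turns that bit into a shape description, which is exactly the static-to-dynamic transformation described above.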

At the next level of complexity one can use a terminal device with several degrees of freedom equipped with just one type of sensor (proprioceptive or exteroceptive). Research with such an arrangement has been started and is being extended to cover more sophisticated hardware (Ref. 7).

1. Anthropomorphic Hand with Proprioceptive Sensors

The basic component of an advanced contact pattern recognition system is the anthropomorphic terminal device having articulated fingers, as in Fig. 2. In the first experiments, the artificial hand was provided only with positional proprioceptive sensors. The following angles were used as proprioceptive data: x1, a measure of the relative position of the thumb and the forefinger; x2, a measure of the relative position of the ring finger and the middle finger; x3, a measure of the relative position of the little finger and the ring finger. Thus one gets the feature vector X = [x1, x2, x3].

Figure 2. Anthropomorphic Terminal Device with Articulated Fingers

The pattern recognition task was defined as a two-class problem. Two classes of objects, spheres and cylinders, had to be classified by one static grasp irrespective of their size (naturally, within the range of the hand opening). The essential advantages of the parallel grasp with respect to the previous one-digit case are easily understood by looking at Figs. 3a and 3b. They represent the components of the feature vector, X, as functions of the cylinder and sphere diameters. The characteristic function, F = a1x1 + a2x2 + a3x3, for cylinders and spheres related to the Belgrade hand is represented in Fig. 4. A simple decision criterion has been derived and used for automatic shape and size recognition of grasped objects, as described in detail in Ref. 7.
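The decision criterion above amounts to a thresholded linear characteristic function over the three inter-finger angles. The sketch below illustrates the scheme; the weights a1, a2, a3 and the threshold are invented placeholders, not the calibrated values from Ref. 7:

```python
# Sketch of the two-class decision: a linear characteristic function
# F = a1*x1 + a2*x2 + a3*x3 over the feature vector X = (x1, x2, x3),
# thresholded to separate spheres from cylinders.

def characteristic(x, a=(1.0, -2.0, 1.0)):
    """F = a1*x1 + a2*x2 + a3*x3 (weights are illustrative placeholders)."""
    return sum(ai * xi for ai, xi in zip(a, x))

def classify(x, threshold=0.0):
    # One static grasp produces X; the sign of F relative to a calibrated
    # threshold then names the class, with no scanning or search involved.
    return "sphere" if characteristic(x) > threshold else "cylinder"
```

For example, with these placeholder weights a grasp reading of (30.0, 10.0, 28.0) gives F = 38 and is labeled "sphere", while a uniform reading (20.0, 20.0, 20.0) gives F = 0 and falls to "cylinder".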

The above example was presented in order to demonstrate the interesting interaction between the terminal device and sensor availability. With the anthropomorphic terminal device and very poor proprioceptive information, one can solve in an elegant way pattern recognition problems which otherwise


Figure 3. Components of Feature Vector for Sphere and Cylinder: (a) cylinder and (b) sphere, grasping by finger tips

would require extensive data processing and scanning. In our example, static parallel grasp with proprioceptive sensors replaced three-dimensional scanning with one contact sensor. Speaking generally, the unique feature of the human hand, which is adaptive to objects of any shape, makes it at the same time a very efficient three-dimensional shape recognition system. Articulated fingers provided with positional proprioceptive sensors can, when in contact with the object, sense the imprint of the external forms. The role of positional proprioceptive feedback has been unduly overlooked for this application.

Figure 4. Characteristic Functions for Sphere and Cylinder

Working up to a more complex mechanism-sensor terminal system, one can think of the anthropomorphic hand provided with both proprioceptive and exteroceptive sensors. It is also possible to use the parallel grasp in a scanning way by moving the hand around the object in several discrete steps or even in a continuous way. Thus, an ever increasing amount of information may be generated which can then be processed by the computer for pattern recognition and, consequently, for control purposes.

2. Anthropomorphic Hand with Exteroceptive Sensors

Fig. 5 shows the Belgrade hand equipped with three miniaturized proximity sensors. (The lower part of the figure shows the hand mounted to the JPL/Ames anthropomorphic arm.) The three proximity sensors are mounted to the thumb, forefinger, and middle finger, respectively. The sensors have been developed at JPL and are described in detail in Ref. 8. This sensor is an electro-optical device and produces a voltage signal when the sensor's sensitive volume--which is permanently focused at a distance of about 10 cm in front of the sensor head--"touches" a solid surface as the terminal device approaches the surface. The sensor is a special exteroceptive sensor since it provides information about short range distances between designated points of the terminal device (e.g., fingertips) and the environment.

It is convenient to interpret the voltage output "V" of the proximity sensors in terms of distance regions "d" as indicated in Fig. 6 rather than in terms of specific ranges. Thus, for a set of i = 1, ..., m sensors on a terminal device, and for distance regions j = 0, ..., n uniformly assigned to each sensor, we have the proximity feature vector P = [dij]. Obviously, three single-point proximity sensors attached to the terminal device can define a plane relative to the terminal device.

A proximity feature vector constructed from the output of three single-point sensors has a rich information content and can be used to define
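The quantization of sensor voltage into distance regions can be sketched as follows. The voltage boundaries below are hypothetical calibration values (the paper gives only the region scheme of Fig. 6, not numbers):

```python
# Sketch: map each proximity sensor's voltage onto one of the distance
# regions j = 0..n of Fig. 6, then collect the m sensor readings into the
# proximity feature vector P = [d_i].

import bisect

# Voltage boundaries separating regions 0|1|2|3|4 -- placeholder calibration.
REGION_EDGES = [0.5, 1.5, 2.5, 3.5]   # volts

def distance_region(voltage):
    """Quantize one sensor voltage to a region index 0 (far/no return) .. 4 (near)."""
    return bisect.bisect_left(REGION_EDGES, voltage)

def proximity_feature_vector(voltages):
    """P as a list of region indices, one per sensor."""
    return [distance_region(v) for v in voltages]
```

With these placeholder edges, readings of 0.2 V, 1.0 V, and 3.8 V on three sensors yield P = [0, 1, 4]. Working in coarse regions rather than raw ranges is what lets the later control logic be written as simple equalities between region indices.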



Figure 5. Anthropomorphic Terminal Device with Miniaturized Proximity Sensors

Figure 6. Proximity Sensor Voltage Levels Versus Distance Regions

various characteristic functions. One category of characteristic functions is related to the mapping of contours or shapes of objects and environment. Another category is related to the position or alignment state of the terminal device relative to the objects and environment. This last category of characteristic functions can immediately be utilized to guide and control motion of the terminal device near solid objects, e.g., for the purpose of an undisturbed pick-up of a glass of liquid.

It should be pointed out that the motion of the articulated fingers and the thumb will cause a change in the orientation of the reference surface defined by the finger-mounted proximity sensors relative to the base frame of the Belgrade hand. Thus, increasingly richer feature vectors and characteristic functions can be constructed if the internal state of the Belgrade hand is monitored by proprioceptive sensors in conjunction with the monitoring of the exteroceptive (proximity) sensor signals. In that case, a new dimension can be added to the feature vectors and characteristic functions, namely the internal configurational state k = 1, 2, ... of the articulated fingers. That is, the proximity feature vector will have the form P = [dijk].

At the present state of development, the experimentation with the Belgrade hand equipped with proximity sensors is in a manual control mode. The feature vector is displayed to the operator, who also has a visual observation of the state of the hand. The operator then interprets the feature vector together with the visually observed scene and commands the motion necessary to complete a grasping task. The main purpose of the manual experiments is to arrive at a useful set of characteristic functions in terms of proximity feature vectors.

3. Parallel Jaw Hand with Exteroceptive Sensors

Fig. 7 shows a parallel jaw terminal device equipped with four proximity sensors, two sensors on each jaw (or finger).

Figure 7. Parallel Jaw Terminal Device with Proximity Sensors (sensitive volumes pointing forward and downward)

The figure also indicates the position and pointing direction of the four sensitive volumes relative to the fingers. The state of the fingers (opening-closing) is monitored by a proprioceptive sensor (position transducer). Thus, using the notation explained previously, for the k-th opening-closing state of the fingers we have the proximity feature vector

Pk = [dijk], i = 1, ..., 4    (1)

where j = 0, 1, 2 , 3, 4 (see Figure 6) and the sensor indexing is as follows: 1--forward left (as looking along the fingers from the hand base); 2--downward left; 3--forward right; 4--downward right. From the proximity feature vector given by Eq. 1, a variety of characteristic functions can be constructed for shape/environment recog- nition or for automatic positioning of terminal device for grasping an object.

A simple control example can best illustrate the utility of the proximity feature vector given by Eq. 1. Let the task be the grasping of a regular object (a block) at a fixed horizontal level above a table and so that the object remains stationary during the grasping action. This task involves the following control problems: (a) Horizontal align- ment of the grasp plane of the parallel jaw terminal device at a fixed level above the table. (b) Align- ment of the pointing direction of the terminal device (that is, alignment of the jaw) parallel to that side of the object where the object will be grasped. (c) Centering the terminal device on the object, that is, positioning the center axis of the terminal device along the center plane of the object (see Figure 8 ) . Obviously, control tasks (a) and (b) together are equivalent to a full three-dimensional alignment of the terminal device, and control tasks (a) and (c) together are equivalent to a two- dimensional positioning of the terminal device.

Figure 8. Grasping of a Regular Object Referenced to Proximity Features (showing the object center plane, the grasp plane and pointing direction of the terminal device, and pitch)

Let us assume that the fingers are closed initially and that the front wall of the object is wider than the separation of the two forward pointing proximity sensors' sensitive volumes. Let us also assume that the horizontal pointing direction of the terminal device can be established and controlled independently of proximity sensor signals. In that case, the characteristic function Fa constructed from the proximity feature vector defined by Eq. 1 and which satisfies control tasks (a) and (b) will be given by simple simultaneous equalities:

Fa: d1jk = d3jk, d2j*k = d4j*k    (2)

where the star on "j" signifies that, in general, the distance region of the downward pointing sensors can be different from that of the forward pointing sensors, and we still obtain a full three-dimensional alignment of the terminal device relative to the block.

Having satisfied the characteristic function Fa by appropriate motion commands, the characteristic function Fb which assures the centering of the terminal device on the object can be formulated in three motion steps directly as follows:

Step 1: Open fingers until F = [d1j = 0 or d3j = 0]. (Note that the terminal device is centered if d1j = d3j = 0 simultaneously.)

Step 2: Move the terminal device to the right if d1j = 0 or to the left if d3j = 0, until F = [d1j ≠ 0] or F = [d3j ≠ 0], respectively; then continue the opening of the fingers until F = [d1j = d3j = 0], and monitor the distance "y" by which the fingers have opened during this time. (Note: if the last F cannot be satisfied with full opening of the fingers, then the block is wider than the hand and cannot be grasped.)

Step 3: Move the terminal device by 0.5y in the direction opposite to the motion direction of Step 2. This last motion will center the terminal device on the object. Now the terminal device can be moved toward the object until a predefined range is reached, and then the fingers can be closed for grasping.

It is noted that the control procedure described above only monitors finger opening to measure the centering offset of the terminal device relative to the block, and does not require a directional integration of joint motions for that purpose during Step 2. It is also noted that the bias distances between sensor head mounts and the inner surface of the fingers must be properly accounted for in the implementation of the centering control procedure referenced to proximity feature vectors.
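The key property of the procedure, that the centering offset is recovered purely from finger-opening distances at which each forward sensor loses the object, can be checked in a one-dimensional simulation. Everything below (geometry, names, step size, and the simplified open-until-both-lost variant of the steps) is an illustrative assumption, not the paper's implementation:

```python
# Hedged 1-D simulation of the centering idea: open the fingers, note the
# half-opening at which each forward proximity sensor passes an edge of the
# object, and take half the asymmetry as the offset of the hand from the
# object's center plane. No integration of joint motion is needed.

def centering_offset(object_center, object_width, hand_center,
                     step=0.01, max_open=10.0):
    left_edge = object_center - object_width / 2
    right_edge = object_center + object_width / 2

    def sees(x):                            # one-bit forward proximity reading
        return left_edge <= x <= right_edge

    g_first = first_side = None
    for k in range(int(max_open / step) + 1):
        g = k * step                        # current half-opening of fingers
        left_on = sees(hand_center - g)     # left forward sensor
        right_on = sees(hand_center + g)    # right forward sensor
        if g_first is None and (not left_on or not right_on):
            g_first = g                     # first sensor has passed an edge
            first_side = "left" if not left_on else "right"
        if not left_on and not right_on:    # both sensors past the edges
            y = g - g_first                 # extra opening after the first loss
            sign = 1.0 if first_side == "left" else -1.0
            return sign * y / 2             # half the asymmetry is the offset
    return None                             # object wider than maximum opening
```

For a hand centered at 0 facing an object of width 4 centered at 1, the left sensor loses the object first and the returned correction is approximately +1.0, exactly the true offset; a centered hand returns approximately 0.
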

4. Stability in Pattern-Referenced Control

Pattern-referenced control can conceptually be described by a simple generalization of the well- known servo feedback diagram as shown in Fig. 9. The significant point shown in Fig. 9 is that, in the case of pattern-referenced control, the "feedback logic" is equivalent to the "feedback gain." Hence, by analogy, the stability in pattern-referenced control is essentially deter- mined by the design of the "feedback logic." It is, therefore, meaningful to ask the following questions: What overshoot or undershoot will a given "feedback logic" produce? Will a given

Figure 9. General Pattern-Referenced Control Diagram

"feedback logic" result in an oscillatory behavior of the system? To what extent does the stability property of a given "feedback logic" depend on the physical properties of the reference patterns and of the system to be controlled?

As of now, no general theories exist that are aimed at analyzing stability problems in pattern-referenced control, or at synthesizing stable "feedback logic." Each case is based on ad hoc considerations and requires extensive experimental or simulation work for verification. A most challenging task would be the creation of a theoretical framework for the analysis and synthesis of pattern-referenced control systems.
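The oscillation question raised above can be illustrated with a toy simulation. All parameters here are invented for illustration; the sketch only shows that a "feedback logic" which sees a region-quantized error and issues fixed-size commands can limit-cycle when its motion step is large relative to its dead zone:

```python
# Toy pattern-referenced loop: the feedback logic observes only which error
# region the state is in (inside / above / below a dead zone) and commands a
# fixed-size motion. A step much larger than the dead zone never settles.

def run(step, dead_zone, x0=1.2, iters=40):
    x, trace = x0, []
    for _ in range(iters):
        if x > dead_zone:        # feedback logic: region of the error only
            x -= step
        elif x < -dead_zone:
            x += step
        # inside the dead zone: no command issued, state holds
        trace.append(x)
    return trace

coarse = run(step=0.5, dead_zone=0.1)    # limit-cycles around the reference
fine = run(step=0.05, dead_zone=0.1)     # enters the dead zone and settles
```

The coarse loop keeps jumping across the dead zone forever (an answer to the "oscillatory behavior" question for this particular logic), while the fine loop parks inside it, which is the kind of case-by-case analysis the text refers to.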

B. Pattern Recognition by Lower Extremities

Although we are not used to thinking in this way, the foot is used, in fact, as another terminal device for pattern recognition, supported by the leg as the hand is by the arm. During locomotion, we rely very much on the pattern recognition capabilities of the lower limbs. It is true that they are much poorer than those of the upper extremities, but they are nonetheless important.

As a matter of fact, this phenomenon is very well recognized, but under a different name. The role of sensory feedback is essential in all our skeletal activities. By using the term pattern recognition we wish to emphasize that it is not just simple sensory feedback which matters. Pattern recognition is also involved in order to derive more elaborate information for skeletal control. Simple ground contact of the heel in locomotion already provides the information that a solid barrier is hit. Such information as level ground, positive or negative slope, uneven ground, hole, or staircase is essential for control purposes.

It will be shown that the combination of hardware-sensor elements for lower extremities can recognize ground patterns very efficiently. Fig. 10 is almost self-explanatory for the automatic discrimination between level ground and slope. In the case of level ground, the following simple logical expression can serve as decision criterion:

h · m · aa · ak · ah = 1 (level ground)    (3)

where

h = 1 or 0, depending on whether the heel is in contact with the ground or not,

m = 1 or 0, depending on whether the mid-foot is in contact with the ground or not,

aa, ak, ah are logical variables associated with the ankle, knee, and hip angles, respectively; for level ground aa = 90°, ak = 180°, ah = 180°.

Figure 10. Walking on Ground Level

The same expression, Eq. 3, is used to recognize positive or negative slopes. In this case aa ≠ 90° and the deviation of aa is proportional to the slope. Note that Eq. 3 is valid in dynamic situations as well. Evidently, in that case, the storage of aa requires a dynamic memory system.

Fig. 11 explains the logical expression for recognition of stairs and their height. In this case a simultaneous set of two logical equations must be solved:

hl · ml · alk · alh · aa = 1
hr · mr · ark · arh · aa = 1    (4)

where

alk = alh = 180°, aa = 90°
ark = arh ≠ 180°, aa = 90°.

Figure 11. Walking on Stairs

By combining the given elementary expressions and patterns, more complex environmental features can be derived.
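The logical decision criteria above can be sketched as a small classifier. The angle tolerance and the returned slope value below are illustrative assumptions, not quantities taken from the paper:

```python
# Hedged sketch of logical terrain recognition: terrain class derived from
# binary foot contacts (h: heel, m: mid-foot) and joint angles in degrees
# (ankle, knee, hip), in the spirit of the Eq. 3 conjunction.

def near(angle, target, tol=5.0):
    """Treat an angle as matching its nominal value within a tolerance band."""
    return abs(angle - target) <= tol

def classify_ground(h, m, ankle, knee, hip):
    # Both contacts present with straight knee and hip: standing on a surface.
    if h and m and near(knee, 180) and near(hip, 180):
        if near(ankle, 90):
            return ("level ground", 0.0)
        # Ankle away from 90 deg: the deviation is taken as the slope measure.
        return ("slope", ankle - 90.0)
    return ("other", None)
```

For example, contacts (1, 1) with angles (90, 180, 180) classify as level ground, while (1, 1) with (100, 180, 180) classify as a slope of measure 10. The appeal of such logical models, as the text notes, is that no differential equations or numerical integration are involved.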

C. Recognition of Non-Geometric Patterns

So far, the role of positional proprioceptive and special exteroceptive (proximity) sensors in pattern recognition by upper and lower limbs has been discussed. Additional information is, however, available. Consider that the human skin generates a large amount of exteroceptive information and plays an important part in pattern recognition in everyday human life. This fact should also be explored in order to improve manipulators.

In the previous text, the emphasis was on shape recognition. With exteroceptive sensors, it is possible to solve pattern recognition tasks of a different kind. They are not related directly to the geometry of objects but rather to physical properties of the contact surface such as roughness, stiffness, softness, and slippage. Evidently, this is also a pattern recognition problem, related to non-geometric properties of the environment. The term feature recognition may be conveniently used for this purpose.

Initial experiments in automatic softness detection have already been made. It was demonstrated that an efficient way to classify objects into several softness categories is to derive the curve F = f(s), where F is the force exercised on the extended forefinger (or any other finger) and s is the finger travel into the barrier after the tactile contact has been established (see Ref. 9). New types of slippage detectors for anthropomorphic terminal devices have recently been described in Ref. 10.
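The softness-classification scheme above can be sketched as fitting the measured F = f(s) curve and binning the resulting stiffness. The linear fit and the category thresholds below are hypothetical, not the values from Ref. 9:

```python
# Sketch: classify softness from samples of the force-vs-penetration curve
# F = f(s) by fitting a line through the origin and binning its slope.

def stiffness(forces, travels):
    """Least-squares slope k of F = k*s constrained through the origin."""
    num = sum(f * s for f, s in zip(forces, travels))
    den = sum(s * s for s in travels)
    return num / den

def softness_class(forces, travels, thresholds=(0.5, 2.0)):
    # Placeholder thresholds: low slope = compliant, high slope = rigid.
    k = stiffness(forces, travels)
    if k < thresholds[0]:
        return "soft"
    if k < thresholds[1]:
        return "medium"
    return "hard"

# A spongy sample: small force for large finger travel (slope k = 0.1).
print(softness_class([0.1, 0.2, 0.3], [1.0, 2.0, 3.0]))   # -> soft
```

A single-parameter fit is the simplest choice; a real probe could instead compare the whole curve shape, since compliant materials are typically nonlinear in F = f(s).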

It must be pointed out that feature detection by exteroceptive sensors is strongly conditioned by the complexity of the terminal device as was the case with proprioceptive sensors. An anthropomorphic terminal device requires exteroceptive sensors arranged in matrix form spread over the whole flexible surface so that it can be wrapped around skeletal type structures. A very convenient solution in this respect is to produce some kind of artificial skin in terms of sensor organization as well as certain mechanical properties. Advances in the research on artificial skin have been published in Ref. 11.

The examples above demonstrate the interaction between the complexity of the hardware and the richness of proprioceptive and exteroceptive information. The more degrees of freedom are available in hardware and in corresponding sensory information, the more complex features can be recognized, and with less data processing. This trade-off between the complexity of the terminal device and the simplicity of pattern recognition algorithms may have an optimum for the upper and lower extremities of man. Whether the biological system is really optimized in that way remains to be answered by further basic research.

It has already been pointed out that the control of locomotion can be greatly simplified if the system description is shifted from state space models to logical and algorithmic models (Ref. 12). As seen above, logical expressions again play an important role in anthropomorphic contact pattern recognition. On the basis of these results, as well as of neurophysiological facts, it is justified to assume as a working hypothesis that the control languages in living matter are basically different from those used in mechanics. While differential equations and numerical integration play an important role in the description and control of the mechanical world, similar operations, such as the control of skeletal activities and pattern recognition, may be described more efficiently by non-numerical expressions.
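To make the contrast concrete, the following is a minimal sketch of what a logical (finite-state) model of a sensor-driven grasp might look like. The states, events, and transitions are my own illustrative choices, not taken from Ref. 12; the point is that the controller is a transition table driven by sensor predicates, with no numerical integration anywhere.

```python
# Minimal sketch (my construction, not from Ref. 12): a grasp sequence
# expressed as a finite-state model driven by logical sensor predicates
# rather than by integrating equations of motion. State and event names
# are hypothetical.

TRANSITIONS = {
    ("APPROACH", "proximity"): "PRESHAPE",   # proximity sensor fires
    ("PRESHAPE", "contact"):   "GRASP",      # tactile contact established
    ("GRASP",    "slip"):      "GRASP",      # slippage detected: re-tighten
    ("GRASP",    "force_ok"):  "LIFT",       # grip force threshold reached
    ("LIFT",     "released"):  "DONE",
}

def run(events, state="APPROACH"):
    """Advance the automaton over a sequence of sensor events."""
    for ev in events:
        # Events irrelevant to the current state leave it unchanged.
        state = TRANSITIONS.get((state, ev), state)
    return state

print(run(["proximity", "contact", "slip", "force_ok", "released"]))
# -> DONE
```

The entire "model" is a small table of logical rules, which is exactly the kind of non-numerical description the working hypothesis above refers to.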

General studies have shown (Ref. 13) that three main levels can be distinguished in the man-machine control of complex multivariable systems. For manipulator control applications, this multilevel control organization can be summarized as shown in Fig. 12. The first or bottom level, represented by block D of Fig. 12, is embedded in the time continuum of the physical world. The actual motion of the manipulator, the interaction between the terminal device and the environment, and all sensor signals are generated at this level. This level outputs the greatest amount of information and involves both the control and the processing of continuous variables. The second level, represented by blocks B and C of Fig. 12, can be called the algorithmic level. At this second level, control and information variables are handled by a finite number of computer algorithms, and messages are transmitted to the lower and higher levels at a finite number of time instants. The transformations of the operator's task description both to actuator reference commands and to the control context of exteroceptive sensor data are generated by computer algorithms at this second control level. The third and highest control level, represented by block A of Fig. 12, is the human operator. In the context of Fig. 12, the main function of the human operator is task description and "supervisory" monitoring of task execution. He selects, quantifies, or modifies the algorithms for the lower control levels. Of course, this highest control level is based on the human attributes of thinking, learning, judging, and setting "goals" for a machine with known characteristics.
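The three-level organization can be caricatured in a few lines of code. Everything below, the task format, the scale-by-ten stand-in for kinematic transformation, and the proportional servo gain, is a hypothetical placeholder meant only to show how information passes from the operator level through the algorithmic level down to the continuous servo level.

```python
# Hedged sketch of the three-level organization of Fig. 12. The actual
# algorithms are not specified in the paper; every function body here is
# an illustrative placeholder.

def operator_task():                       # level A: human operator
    """Task description issued by the operator (hypothetical format)."""
    return {"goal": "move_to", "target": (0.3, 0.1, 0.2)}

def algorithmic_level(task):               # levels B and C: finite algorithms
    """Transform the task to joint references (placeholder kinematics)."""
    x, y, z = task["target"]
    return [x * 10.0, y * 10.0, z * 10.0]  # stand-in for inverse kinematics

def servo_level(joint_refs, joints):       # level D: continuous control
    """One proportional servo step toward the references."""
    gain = 0.5                             # illustrative gain
    return [q + gain * (r - q) for q, r in zip(joints, joint_refs)]

joints = [0.0, 0.0, 0.0]
refs = algorithmic_level(operator_task())
for _ in range(20):                        # servo loop converges to refs
    joints = servo_level(refs, joints)
print([round(q, 2) for q in joints])       # -> [3.0, 1.0, 2.0]
```

The essential structural point survives even this caricature: the top level speaks in tasks, the middle level in a finite set of algorithmic messages, and the bottom level in continuous variables.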

D. Conclusion

The paper was intended to draw attention to a new area of research in pattern recognition relevant to problems in robotics and manipulator control. In this new research area, the data to be processed for pattern recognition are not generated by the interaction of sensors and wave energy, but by physical contact of the mechanical-sensor system with the environment. As Fig. 12 indicates, the algorithmic control level plays a central role in the multilevel man-machine control organization.


Page 8: [IEEE 1976 IEEE Conference on Decision and Control including the 15th Symposium on Adaptive Processes - Clearwater, FL, USA (1976.12.1-1976.12.3)] 1976 IEEE Conference on Decision

[Figure 12 is a block diagram: the human operator (task selection, display observation) issues a task description; it is transformed to (1) an end-point move and (2) exteroceptive sensor gating, yielding a motion goal; the goal is transformed to (1) joint displacements and (2) branching logic, yielding actuator reference commands; these are transformed to actuator servoing for task execution.]

Figure 12. Multilevel Organization for Manipulator Control

Pattern-referenced control logic forms a substantial part of this algorithmic level. The detailed performance benefits of the algorithmic utilization of exteroceptive (proximity) sensor data for manipulator control have been shown and discussed elsewhere (Ref. 15).

Research in contact pattern recognition leads to important practical applications in industrial and medical robotics. Combined non-contact (visual) and contact pattern recognition systems for industrial manipulation, which at the present time hardly exist, should allow for highly versatile, hierarchical, human-oriented solutions. Such systems would provide for maximum autonomy of the manipulator and locomotion systems, leaving conscious decision-making to the man.

References

1. Bejczy, A. K., "Effect of Hand-Based Sensors on Manipulator Control Performance," Proceedings of the Second Conference on Remotely Manned Systems, Technology and Applications, University of Southern California, Los Angeles, California, June 9-11, 1975. (To appear in a special issue of Mechanism and Machine Theory.)

2. Nevins, J., Whitney, D., et al., "Exploratory Research in Industrial Modular Assembly," The Charles Stark Draper Laboratory, Inc., Cambridge, Massachusetts, Report R-800, March 1974 (and subsequent reports).

3. Birk, J., "A Computer-Controlled Rotating Belt Hand for Object Orientation," IEEE Trans. on Systems, Man and Cybernetics, pp. 186-191, March 1974.

4. Rosen, C., Nitzan, D., "Some Developments in Programmable Automation," Proc. IEEE Intercon 75, New York, April 1975.

5. Binford, T., "Survey of Computer Vision," Proc. 4th International Joint Conference on Artificial Intelligence, USSR, September 1975.

6. Rabishong, P., Tomovic, R., et al., "AMOLL Project," Proceedings of the Fifth Conference on External Control of Human Extremities, Dubrovnik, Yugoslavia, 1975, published by the Yugoslav Committee of Electronics-ETAN, Beograd, Yugoslavia, POB 356.

7. Stojiljkovic, Z., Saletic, D., "Tactile Pattern Recognition by Belgrade Hand Prosthesis," Proceedings of the Fifth Conference on External Control of Human Extremities, Dubrovnik, Yugoslavia, 1975, published by the Yugoslav Committee of Electronics-ETAN, Beograd, Yugoslavia, POB 356.

8. Johnston, A. R., "Optical Proximity Sensors for Manipulators," JPL TM 33-678, April 1, 1974.

9. Stojiljkovic, Z., "Softness Recognition by Terminal Device," Proc. 18th National Electronics Conference, 1974, Yugoslav Committee ETAN.

10. Tomovic, R., Stojiljkovic, Z., "New Type of Slippage Detector," Automatica, Pergamon Press, November 1975.

11. Clot, C., Stojiljkovic, Z., "Advances in Artificial Skin Design," Fifth Conference on External Control of Human Extremities, Dubrovnik, Yugoslavia, 1975.

12. Tomovic, R., McGhee, R., "Finite State Approach to the Synthesis of Bioengineering Control Systems," IEEE Trans. on Human Factors in Electronics, Vol. HFE-7, No. 2, pp. 65-69, June 1966.

13. Tomovic, R., "On Man-Machine Control," Automatica, Vol. 5, pp. 401-404, Pergamon Press, 1969.

14. Bejczy, A. K., "Issues in Advanced Automation for Manipulator Control," Proceedings of the 1976 Joint Automatic Control Conference, Purdue University, W. Lafayette, Indiana, July 27-30, 1976.

15. Bejczy, A. K., "Performance Evaluation of Computer-Aided Manipulator Control," 1976 IEEE Systems, Man and Cybernetics Conference, Washington, D. C., November 1-3, 1976.

