
Vision-Based Autonomous Walking in a Lower-Limb Powered Exoskeleton

Wenkai Bao Electrical and Computer Engineering

Southern Methodist University Dallas, TX, USA

[email protected]

Dario Villarreal Electrical and Computer Engineering

Southern Methodist University Dallas, TX, USA

[email protected]

J.-C. Chiao Electrical and Computer Engineering

Southern Methodist University Dallas, TX, USA [email protected]

Abstract— Lower-limb powered exoskeletons have the potential to help patients who suffer from mobility impairments to regain their independence. Unlike traditional exoskeletons with predefined gait patterns, some recently developed exoskeletons are capable of planning gait patterns autonomously based on environmental information and human perception. However, patients may have difficulty navigating through unfamiliar environments due to slow communication or miscommunication between the exoskeleton and the user. Verbal or manual adjustments cannot continuously or adaptively manage the mechanical motions either. In this study, we propose and demonstrate an autonomous walking pattern generator based on visual information shared between a human and a powered lower-limb exoskeleton. This control scheme aims to understand the user's intention during walking and to help patients with walking difficulty overcome obstacles. Human gaze positions in a 3-D environment are measured and evaluated via an eye-tracking system to detect objects in the walking path. A model predictive controller (MPC) is formulated, which generates and modifies the footsteps and the center of mass trajectory of the powered exoskeleton in real time considering the visual feedback. Preliminary experiments are performed by having a human subject walk toward a target sign indicating a physical obstacle. Our results show the feasibility of integrating the walking pattern generator with visual feedback to achieve autonomous control for a lower-limb powered exoskeleton.

Keywords— lower-limb exoskeletons, vision-based control, autonomous walking

I. INTRODUCTION

In the United States alone, there are 11.7 million individuals with difficulty walking [1]. As an emerging technology, lower-limb powered exoskeletons can physically help patients with mobility impairments to regain their independence [2], [3]. In recent years, research on lower-limb robotic exoskeletons has advanced to the point where the technology is being commercialized [4]. For example, lower-limb powered exoskeletons such as ReWalk [5], HAL [6], Ekso [7], and Indego [8] are playing an important role in healthcare and rehabilitation. Unlike laboratory environments, the real-life scenarios that patients face in daily use are more complex and unpredictable. Therefore, it is important for lower-limb exoskeletons to synchronize to the user's rhythmic walking and provide the user with human-like behaviors in scenarios where the surrounding environment may be unfamiliar [9], [10]. In addition, healthy people can generate motion intentions based on the perceived environment and transmit them through neural pathways to the corresponding muscles. For patients with lower-limb deficits who have difficulty conveying their motion intentions to the exoskeleton in a quick and accurate manner, appropriate control strategies are essential to help them communicate with the exoskeletons.

Currently, torque-based and position-based controls are commonly used by lower-limb powered exoskeletons to support patients [11]. Torque-based control is typically achieved by proportionally applying torques based on biological data captured from sensors such as EMG and EEG. Position-based control, in contrast, does not rely on real-time bio-signals and uses a series of predetermined gait parameters and joint trajectories to assist users. However, both torque-based and position-based control have shortcomings in real-world applications. Torque-based control is sometimes unreliable due to the low accuracy and stability of EEG and EMG signals, whereas position-based control cannot easily realize online adaptation of gait parameters. Thus, a combination of position-based and torque-based control may achieve transitions between several gait patterns in real time based on a set of bio-sensors.

Some studies have investigated the role of vision, hearing, and other senses during gait planning in human locomotion [12]–[14]. Among these senses, vision shows particular potential in providing the nervous system with feedforward information about the environment that may help users navigate unexpected environments, much as the eye, brain, and body function together as a system. In addition, Van der Kooij et al. demonstrated that a model predictive controller is capable of generating a stable gait trajectory in real time from basic gait parameters, and that it can incorporate biological signals such as vision to achieve autonomous walking for a lower-limb exoskeleton [15].

In this research, we propose a novel autonomous MPC walking pattern generator that integrates the visual information shared between a human and a lower-limb exoskeleton, as shown in Fig. 1. To detect and extract the features of objects in the walking path, an eye-tracking system is used to measure and evaluate 3-D human gaze positions. Then, an autonomous gait pattern planning mechanism based on the projected vision angle is proposed. Finally, a model predictive controller (MPC) is formulated, which modifies the footsteps and the center-of-mass (CoM) trajectory of the exoskeleton to track a reference walking pattern while considering the visual feedback.

Fig. 1. Autonomous gait pattern planning framework.

In Section II, we first review the principles of the modeling and footstep placement of the MPC walking pattern generator, and then describe how the visual feedback and autonomous gait pattern planning are integrated with the controller. In Section III, experiments with three different cases are presented, and the experimental results are analyzed, evaluated, and discussed.

II. METHODS

A. Modeling of a Human-Exoskeleton System

The dynamics of the human-exoskeleton system on one leg can be modeled simply as a double inverted pendulum connecting the foot and the center of mass of the exoskeleton. In this research, the Indego Explore Exoskeleton (Parker Hannifin Corporation, OH) is used to derive the model. The center of mass of the entire system is approximately at the hip. The thigh and shank of the exoskeleton are modeled as a double inverted pendulum, as shown in Fig. 2. The exoskeleton has only two degrees of freedom per leg, and we focus solely on bipedal locomotion in the sagittal plane. The hip and knee joint flexions can be represented as

$q = \begin{bmatrix} \theta_1 & \theta_2 \end{bmatrix}^T$ (1)

where $\theta_1$ and $\theta_2$ refer to the hip and knee joint angles, respectively.

The Euler-Lagrange equation [16] is used to model the locomotion in the sagittal plane:

$M(q)\ddot{q} + G(q) + C(q,\dot{q})\dot{q} = Bu$ (2)

where $M$ is the mass matrix, $G$ is the gravitational vector, $C$ is the Coriolis matrix, and $B$ is the input matrix that maps the external joint torques $u$ into the equations of motion.
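To make (2) concrete, the following Python sketch evaluates the dynamics terms for a planar double inverted pendulum with point masses at the link ends. The masses, link lengths, and the identity input matrix B are illustrative assumptions for a thigh-shank model, not parameters reported in this paper.

```python
import numpy as np

def forward_dynamics(q, dq, u, L1=0.44, L2=0.43, m1=8.0, m2=4.0, g=9.81):
    """Evaluate Eq. (2), M(q)*ddq + G(q) + C(q,dq)*dq = B*u, for a planar
    double inverted pendulum and return the joint accelerations ddq.
    Angles: theta1 = hip from the upward vertical, theta2 = relative knee."""
    th1, th2 = q
    dth1, dth2 = dq
    s2, c2 = np.sin(th2), np.cos(th2)

    # Mass matrix M(q) for point masses m1, m2 at the link ends
    M = np.array([
        [(m1 + m2)*L1**2 + m2*L2**2 + 2*m2*L1*L2*c2, m2*L2**2 + m2*L1*L2*c2],
        [m2*L2**2 + m2*L1*L2*c2,                     m2*L2**2],
    ])
    # Coriolis/centrifugal terms C(q,dq)*dq
    Cdq = np.array([
        -m2*L1*L2*s2*(2.0*dth1*dth2 + dth2**2),
         m2*L1*L2*s2*dth1**2,
    ])
    # Gravity vector G(q); signs reflect the inverted (upright) configuration
    G = np.array([
        -(m1 + m2)*g*L1*np.sin(th1) - m2*g*L2*np.sin(th1 + th2),
        -m2*g*L2*np.sin(th1 + th2),
    ])
    B = np.eye(2)  # hypothetical direct hip/knee actuation
    return np.linalg.solve(M, B @ u - Cdq - G)
```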

B. Model Predictive Control Scheme

The overall control scheme aims to understand the user's intention during walking and adjust gait patterns accordingly. The model predictive controller actuates the exoskeleton by constantly solving a feedforward optimization using feedback of the current states. The model predictive control scheme consists of three subsections: step planning, the MPC controller, and inverse kinematics, as shown in Fig. 3. The input and output of the system are the reference gait parameters and the real-time joint angles, respectively.
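Before detailing the three blocks, the receding-horizon idea at the core of the scheme can be illustrated by a minimal sketch. The generic condensed linear MPC below tracks a reference state trajectory for an assumed discrete-time model x_{k+1} = A x_k + B u_k; the matrices, cost weights, and horizon are placeholders, not the exoskeleton's actual formulation, which the paper specifies only at the block-diagram level.

```python
import numpy as np

def mpc_step(A, B, Q, R, x0, x_ref, N):
    """One receding-horizon update of a condensed linear MPC.
    Minimizes sum_k ||x_k - x_ref_k||_Q^2 + ||u_k||_R^2 over horizon N
    for x_{k+1} = A x_k + B u_k, and returns the first optimal input.
    x_ref has shape (N, n); the cost is an illustrative choice."""
    n, m = B.shape
    # Prediction matrices: X = Sx @ x0 + Su @ U, stacking x_1 ... x_N
    Sx = np.vstack([np.linalg.matrix_power(A, k + 1) for k in range(N)])
    Su = np.zeros((N * n, N * m))
    for i in range(N):
        for j in range(i + 1):
            Su[i*n:(i+1)*n, j*m:(j+1)*m] = np.linalg.matrix_power(A, i - j) @ B
    Qb = np.kron(np.eye(N), Q)       # block-diagonal state weights
    Rb = np.kron(np.eye(N), R)       # block-diagonal input weights
    # Unconstrained quadratic program in the stacked input sequence U
    H = Su.T @ Qb @ Su + Rb
    f = Su.T @ Qb @ (Sx @ x0 - x_ref.reshape(-1))
    U = np.linalg.solve(H, -f)
    return U[:m]                     # apply only the first input
```

At every sampling instant (200 Hz in Section III-A), only this first input is applied and the optimization is re-solved with the newly measured state; this is the receding-horizon principle.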

The first part is step planning, which generates the desired foot trajectory using a set of predefined footstep parameters, such as step length, step height, and step time. The model predictive controller is the core of the entire control scheme, where gait patterns are continuously updated and optimized considering real-time perturbations and systematic errors. In the last block, shown in Fig. 4, inverse kinematics is performed to calculate the joint angles in the sagittal plane. The vertical and horizontal equations are specified as

$p_f - x = -L_1 \sin\theta_1 - L_2 \sin(\theta_1 + \theta_2)$ (3)

$h = L_1 \cos\theta_1 + L_2 \cos(\theta_1 + \theta_2)$ (4)

where $h$ and $x$ are the vertical and horizontal displacements of the center of mass, $p_f$ is the horizontal displacement of the foot trajectory, and $L_1$ and $L_2$ are the lengths of the thigh and shank of one leg.
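Equations (3) and (4) admit a standard closed-form two-link inverse kinematics solution. The sketch below is one way to solve them; the knee-flexion branch (the sign of the arccosine) is an assumption, since the sign convention is not stated here.

```python
import numpy as np

def leg_ik(x, h, p_f, L1, L2):
    """Closed-form inverse kinematics for Eqs. (3)-(4): returns the hip
    angle theta1 and knee angle theta2 given the CoM position (x, h),
    the foot position p_f, and the thigh/shank lengths L1, L2."""
    X = x - p_f                      # horizontal hip-to-foot offset, from Eq. (3)
    Y = h                            # vertical hip-to-foot distance, from Eq. (4)
    c2 = (X**2 + Y**2 - L1**2 - L2**2) / (2 * L1 * L2)
    c2 = np.clip(c2, -1.0, 1.0)      # guard against unreachable targets
    theta2 = -np.arccos(c2)          # knee branch; sign convention assumed
    theta1 = np.arctan2(X, Y) - np.arctan2(L2 * np.sin(theta2),
                                           L1 + L2 * np.cos(theta2))
    return theta1, theta2
```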

C. Visual Feedback

As discussed above, utilizing vision can offer significant advantages in perceiving the environment in a quick and accurate manner. In this work, the visual feedback of the subjects is captured through the Pupil Labs eye-tracking system (Pupil Labs, Berlin, Germany). The walking environment is measured via an RGB-D camera (RealSense D415, Intel Corporation, USA), which records 3-D information in real time, and eye movement is estimated by two eye cameras that directly record images of the eyes. Using the calibration methods developed by Pupil Labs, the gaze point and direction can be mapped to the viewed scene based on the 3-D information and eye movement, as shown in Fig. 5.
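As a hedged illustration of how a controller can consume this gaze stream, the sketch below subscribes to gaze estimates through Pupil Capture's ZMQ/msgpack network API. The default port 50020, the "gaze." topic, and the gaze_point_3d field follow Pupil Labs' published interface, but they are assumptions here and should be verified against the installed software version.

```python
import zmq
import msgpack

# Connect to Pupil Remote (default port 50020) and query the SUB port.
ctx = zmq.Context()
pupil_remote = ctx.socket(zmq.REQ)
pupil_remote.connect("tcp://127.0.0.1:50020")
pupil_remote.send_string("SUB_PORT")
sub_port = pupil_remote.recv_string()

# Subscribe to gaze data published by Pupil Capture.
sub = ctx.socket(zmq.SUB)
sub.connect(f"tcp://127.0.0.1:{sub_port}")
sub.subscribe("gaze.")

while True:
    topic, payload = sub.recv_multipart()
    gaze = msgpack.loads(payload, raw=False)
    if gaze.get("confidence", 0.0) < 0.8:   # discard low-confidence samples
        continue
    # With a 3-D gaze mapper, gaze_point_3d gives the gaze point in
    # scene-camera coordinates (typically millimeters).
    point = gaze.get("gaze_point_3d")
    if point is not None:
        d = (point[0]**2 + point[1]**2 + point[2]**2) ** 0.5
        print(f"gaze distance d = {d / 1000.0:.2f} m")
```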

Fig. 2. Indego exoskeleton and double pendulum in the sagittal plane.

Fig. 3. The block diagram of the MPC control scheme.

Fig. 4. Inverse kinematics model.

Fig. 5. Gaze mapping.


The relationship between the gaze-point distance $d$, the eye-tracker height $H$, and the horizontal distance $D$ can be expressed as

$D = \sqrt{d^2 - H^2}$ (5)

To verify the effectiveness of the proposed method, two assumptions are made: (1) the height of the eye-tracking system is constant during walking; and (2) the horizontal position of the eye-tracking system coincides with that of the user's center of mass.

D. Autonomous Gait Pattern Generator

We propose an autonomous gait pattern generator integrating visual recognition of the 3-D environment, an autonomous decision-making mechanism, and parameterized gait pattern planning. The visual feedback recorded by the eye-tracking system serves as the input to the entire system. To simplify the algorithm, we design the autonomous decision-making mechanism based on the distance between a target and the patient. The details of gait pattern generation using the model predictive controller are discussed in Section II-B. Two gait patterns, Normal and Passing, are proposed to adjust the interaction as well as ensure user safety. These two patterns are defined as follows:

(1) Normal Pattern: The exoskeleton starts and keeps walking at a constant step length, step height, and step time.

(2) Passing Pattern: The exoskeleton changes to a new gait pattern with different footstep parameters when approaching or passing the target ground sign.

When the horizontal distance D is less than a certain value, the pattern switches from Normal to Passing. The decision value is set to twice the preprogrammed step length. The autonomous decision-making model determines the next step length, height, and time according to the continuous visual feedback.
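Combining (5) with this switching rule, a minimal sketch of the decision mechanism might look as follows. The Normal/Passing step lengths and times are taken from the Fig. 10 caption; the step-height value is a hypothetical placeholder.

```python
import math

SWITCH_DIST = 1.5  # m; twice the preprogrammed step length (Secs. II-D, III-C)

# Gait-parameter sets; step height is an assumed placeholder value.
NORMAL  = {"step_length": 0.50, "step_height": 0.05, "step_time": 1.0}
PASSING = {"step_length": 0.75, "step_height": 0.05, "step_time": 1.0}

def horizontal_distance(d, H):
    """Eq. (5): horizontal distance D from gaze distance d and tracker height H."""
    return math.sqrt(max(d * d - H * H, 0.0))

def select_pattern(d, H):
    """Switch from the Normal to the Passing pattern once the target
    ground sign is closer than twice the preprogrammed step length."""
    return PASSING if horizontal_distance(d, H) < SWITCH_DIST else NORMAL
```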

III. EXPERIMENTS AND RESULTS

A Simulink model is designed in Matlab (MathWorks, MA) to synchronize the exoskeleton with the results from the eye-tracking system. In the experiments, the model predictive controller (MPC) for the walking pattern generator is first verified by performing exoskeleton-assisted walking across different gait parameters. Then, the accuracy of the visual feedback system is evaluated via the depth perception of objects at various distances. Finally, we validate the autonomous gait pattern planning system by asking the human-exoskeleton system to recognize the signs, make decisions, and generate gait patterns accordingly. The validation is assessed by judging whether the generated patterns are correct.

A. MPC Walking Pattern Generator

The purpose of this experiment is to verify the accuracy of the walking pattern generator across different gait parameters, including step length and step time. The model predictive control algorithm is applied on the Indego Explore Exoskeleton using the Simulink model. The sampling frequency for the entire control system is 200 Hz. A healthy subject is asked to conduct fifteen repeated groups of walking experiments wearing the powered exoskeleton. In each group, six trials are performed. The first three trials use the same desired step length L = 0.75 m while the step time T varies from 0.5 s to 1.0 s in 0.25-s intervals. The last three trials use the same desired step time T = 0.75 s while the desired step length varies from 0.5 m to 1.0 m in 0.25-m intervals.

The hip and knee joint trajectories of all six groups are shown in Fig. 7. In Figs. 7(a) and (b), the variation ranges of the hip and knee joint angles increase proportionally within one gait cycle as the step length changes from 0.5 to 1.0 m. The general trends of both hip and knee joint trajectories remain consistent, which indicates that varying the step length with the MPC walking pattern generator does not interfere with rhythmic walking. Figs. 7(c) and (d) show the variation ranges as well as the actual step times for both hip and knee joints within one gait cycle as the desired step times increase from 0.5 to 1.0 s.

It is noted that some features and details disappear as the step time decreases. The desired and actual step times T and step lengths L are shown in Table I. The desired step times T and step lengths L are generally achieved for both hip and knee joints. Although the accuracy of the MPC walking pattern generator is relatively high, the average actual step lengths L and step times T for all six groups of experimental parameters are lower than the desired values, which may be caused by two factors. First, unlike a traditional humanoid robot, the lower-limb exoskeleton requires a human-machine interface in which the subjects wearing the exoskeleton are likely to limit the range of motion imposed by the exoskeleton, inducing lower values of the actual step length and step time. Second, systematic latency and mechanical perturbations can occur during the experiments and lead to errors.

Fig. 6. System framework of the vision-based lower-limb exoskeleton.


B. Visual Recognition of Ground Signs

In these experiments, the accuracy of ground sign recognition using the Pupil Labs eye-tracking system (Pupil Labs, Berlin, Germany) is validated. A healthy subject wearing the eye-tracking system is asked to remain static and recognize different target ground signs at various distances. Three objects with different shapes and colors, including a cardboard sign, a steel ramp, and ArUco markers, are chosen as the ground signs to mimic path destinations or obstacles that patients may encounter.

The actual and measured distances of the ground signs in all three scenes are presented in Fig. 8. The real values are measured manually with a ruler on the ground, and the measured values are the averages of five repeated trials using the eye-tracking system. As shown in Fig. 8, the measured distances are smaller than the real values in all three scenes, possibly due to the forward-leaning posture of the subject while walking with crutches. Moreover, recognition errors rise as the distances increase, which could result from insufficient visual information at farther distances. Although different calibration methods are recommended for the corresponding distances, only one of them can be applied within one set of trials, because the subject cannot estimate the distances ahead of time or change the eye-tracking calibration method in the middle of walking. Finally, it is noted that the recognition accuracies differ among the ground signs, with the ArUco markers and the steel ramp being the most and least recognizable objects, respectively.

C. Autonomous Gait Pattern Planning System

An effective decision-making mechanism is essential for the autonomous gait pattern generator, as it translates the visual feedback into real-time gait parameters. To simplify the mechanism, we set a distance of twice the preprogrammed step length, 1.5 m, as the criterion for switching from the Normal stepping pattern to the Passing pattern. To validate the performance of the autonomous decision-making mechanism, in this experiment a healthy subject is asked to perform fifteen walking routines with the exoskeleton while facing three different ground targets.

The system autonomously completes the tasks of recognition, decision-making, step planning, and actuation of the exoskeleton while approaching and passing the target ground signs. The subject wearing the eye-tracking system walks passively with the assistance of the exoskeleton and does not participate in decision-making. During the experiments, visual recognition and gait pattern decision results are recorded at three predetermined distances from the targets: 4, 2, and 1 m.

Fig. 7. Joint trajectories of the hip and knee for different desired step lengths and step times within one gait cycle. (a) Different step lengths over the hip joint. (b) Different step lengths over the knee joint. (c) Different step times over the hip joint. (d) Different step times over the knee joint.

TABLE I
THE DESIRED AND ACTUAL VALUES OF DIFFERENT STEP TIMES T AND STEP LENGTHS L

Gait Pattern               Desired Value    Actual Value
Variation in Step Length
  Fast                     1.00 m           0.93 ± 0.14 m
  Medium                   0.75 m           0.69 ± 0.10 m
  Slow                     0.50 m           0.45 ± 0.07 m
Variation in Step Time
  Fast                     0.50 s           0.47 ± 0.05 s
  Medium                   0.75 s           0.71 ± 0.07 s
  Slow                     1.00 s           0.94 ± 0.11 s

Fig. 8. Recognition results and errors in different scenes.


As shown in Fig. 9, when facing different obstacles at different distances, the system makes correct decisions and generates the corresponding gait patterns based on accurate visual recognition. In all fifteen trials, the autonomous gait planning system succeeds in switching the gait pattern between 2 and 1 m, which is consistent with the 1.5-m criterion we set.

Hip and knee trajectories over ten gait cycles are presented in Fig. 10, which shows that the system autonomously switches from the Normal pattern to the Passing pattern after the fifth gait cycle. The hip and knee joint trajectories are relatively stable before and after the gait pattern switch. However, it takes at least one additional full gait cycle after switching to reach a rhythmic and periodic walking pattern, as shown between the 10th and 14th seconds.

IV. CONCLUSION

In this research, we proposed a novel autonomous walking pattern generator based on visual information shared between a human and the worn lower-limb powered exoskeleton. The system improves the exoskeleton's adaptability to walking environments. The autonomous decision-making mechanism for the exoskeleton is established with eye-tracking feedback, and the appropriate gait pattern for the current walking environment is planned based on the closed-loop decision-making result. The demonstrations show that autonomous walking of the exoskeleton based on visual feedback can be achieved for various walking patterns. However, this research was conducted in a controlled indoor environment with limited environmental obstacles. The exoskeleton offers limited control inputs, restricting the flexibility to optimize motions, and the mechanism cannot adapt to more complex, outdoor, or fast-changing terrain. The visual feedback needs to provide more details about the surrounding environment and an indication of when the user desires to change direction. Moreover, only visual feedback is applied as the system input; human comfort and other command modalities are neglected in the system loop. Therefore, future work should focus on optimizing the human-machine interface by introducing more physiological signals into the feedback loop, as well as increasing the degrees of freedom of the exoskeleton to navigate more complex walking environments.

ACKNOWLEDGMENT

This work was sponsored by National Science Foundation grant CMMI-1929953.

REFERENCES

[1] K. Ziegler-Graham, E. J. MacKenzie, P. L. Ephraim, T. G. Travison, and R. Brookmeyer, “Estimating the prevalence of limb loss in the United States: 2005 to 2050,” Arch. Phys. Med. Rehabil., vol. 89, no. 3, pp. 422–429, 2008.

[2] S. Maggioni et al., “Robot-aided assessment of lower extremity functions: A review,” J. NeuroEng. Rehabil., vol. 13, no. 1, Aug. 2016.

[3] A. M. Dollar and H. Herr, “Lower extremity exoskeletons and active orthoses: Challenges and state-of-the-art,” IEEE Trans. Robot., vol. 24, no. 1, pp. 144–158, Aug. 2008.

[4] P. Gwynne, “Technology: Mobility machines,” Nature, vol. 503, no. 7475, pp. S16–S17, Nov. 2013.

[5] A. Esquenazi, M. Talaty, A. Packel, and M. Saulino, “The ReWalk powered exoskeleton to restore ambulatory function to individuals with thoracic-level motor-complete spinal cord injury,” Am. J. Phys. Med. Rehabil., vol. 91, no. 11, pp. 911–921, 2012.

[6] A. Tsukahara, Y. Hasegawa, K. Eguchi, and Y. Sankai, “Restoration of gait for spinal cord injury patients using HAL with intention estimator for preferable swing speed,” IEEE Trans. Neural Syst. Rehabil. Eng., vol. 23, no. 2, pp. 308–318, 2015.

[7] A. J. Kozlowski, T. N. Bryce, and M. P. Dijkers, “Time and effort required by persons with spinal cord injury to learn to use a powered exoskeleton for assisted walking,” Top. Spinal Cord Inj. Rehabil., vol. 21, no. 2, pp. 110–121, Mar. 2015.

[8] C. Hartigan et al., “Mobility outcomes following five training sessions with a powered exoskeleton,” Top. Spinal Cord Inj. Rehabil., vol. 21, no. 2, pp. 93–99, Mar. 2015.

[9] R. J. Farris, H. A. Quintero, S. A. Murray, K. H. Ha, C. Hartigan, and M. Goldfarb, “A preliminary assessment of legged mobility provided by a lower limb exoskeleton for persons with paraplegia,” IEEE Trans. Neural Syst. Rehabil. Eng., vol. 22, no. 3, pp. 482–490, 2014.

[10] J. Jang, K. Kim, J. Lee, B. Lim, and Y. Shim, “Assistance strategy for stair ascent with a robotic hip exoskeleton,” in Proc. IEEE/RSJ Int. Conf. Intelligent Robots and Systems (IROS), 2016, pp. 5658–5663.

[11] A. J. Young and D. P. Ferris, “State of the art and future directions for lower limb robotic exoskeletons,” IEEE Trans. Neural Syst. Rehabil. Eng., vol. 25, no. 2, pp. 171–182, 2017.

[12] A. E. Patla, “Understanding the roles of vision in the control of human locomotion,” Gait Posture, vol. 5, no. 1, pp. 54–69, Feb. 1997.

[13] B. R. Fajen, W. H. Warren, S. Temizer, and L. P. Kaelbling, “A dynamical model of visually-guided steering, obstacle avoidance, and route selection,” Int. J. Comput. Vis., vol. 54, no. 1–3, pp. 13–34, Aug. 2003.

[14] B. R. Fajen and W. H. Warren, “Behavioral dynamics of steering, obstacle avoidance, and route selection,” J. Exp. Psychol. Hum. Percept. Perform., vol. 29, no. 2, pp. 343–362, Apr. 2003.

[15] H. Van der Kooij, R. Jacobs, B. Koopman, and F. Van der Helm, “An alternative approach to synthesizing bipedal walking,” Biol. Cybern., vol. 88, no. 1, pp. 46–59, Jan. 2003.

[16] M. W. Spong, S. Hutchinson, and M. Vidyasagar, Robot Modeling and Control. Hoboken, NJ, USA: Wiley, 2006.

Fig. 9. Autonomous decision-making results while facing different target signs at different distances.

Fig. 10. Autonomous gait pattern switch from the Normal pattern (step time T = 1 s, step length L = 0.5 m) to the Passing pattern (step time T = 1 s, step length L = 0.75 m). The decision is made at the 10th second.