Vision-Based Motion Control of Robots

Azad Shademan, Guest Lecturer
CMPUT 412 – Experimental Robotics
Computing Science, University of Alberta
Edmonton, Alberta, Canada
March 4, 2008

Vision-Based Control

[Figure: left and right stereo images showing the current and desired positions of image features A and B.]

Vision-Based Control

Feedback from a visual sensor (camera) is used to control a robot; this is also called "visual servoing."

Why is it difficult? Images are 2D, while the robot workspace is 3D: 2D image data must be related to 3D geometry.

Where is the camera located?

Eye-to-hand: the camera observes the robot from outside (e.g., hand/eye coordination).

Eye-in-hand: the camera is mounted on the robot's hand (end-effector).

Visual Servo Control Law

Position-based: robust, real-time pose estimation combined with the robot's world-space (Cartesian) controller.

Image-based: the desired image features are specified as seen from the camera, and the control law is based entirely on image features.

Position-Based

[Block diagram: the control loop compares the desired pose with the pose estimated from the images.]

Image-Based

[Block diagram: the control loop compares the desired image features with the features extracted from the images.]

Visual-Motor Equation

[Figure: image features x1 … x4 extracted from the camera view of a manipulator with joint angles q = [q1 … q6].]

The visual-motor equation maps joint angles to image features; its Jacobian is important for motion control.

Visual-Motor Jacobian

The visual-motor Jacobian relates joint-space velocity to image-space velocity.

[Figure: image-space motion of features A and B induced by joint-space motion.]
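The equations on these two slides did not survive the text export. As a hedged reconstruction from the slide labels (features x, joints q, image-space and joint-space velocities), the visual-motor equation and its Jacobian can be written as:

```latex
% Visual-motor equation: image features as a function of joint angles
x = F(q), \qquad x \in \mathbb{R}^{m}, \; q \in \mathbb{R}^{n}
% Visual-motor Jacobian: image-space velocity induced by joint-space velocity
\dot{x} = J(q)\,\dot{q}, \qquad J(q) = \frac{\partial F(q)}{\partial q} \in \mathbb{R}^{m \times n}
```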

Image-Based Control Law

1. Measure the error in image space.
2. Calculate or estimate the inverse (or pseudo-inverse) of the Jacobian.
3. Update the joint values (see the sketch below).
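A minimal sketch of one iteration of this control law, assuming the image features have already been extracted and a Jacobian estimate J is available (the function and parameter names, such as lam for the gain, are illustrative and not from the slides):

```python
import numpy as np

def ibvs_step(q, x_desired, x_current, J, lam=0.1):
    """One image-based visual servoing update.

    q          -- current joint angles, shape (n,)
    x_desired  -- desired image features, shape (m,)
    x_current  -- measured image features, shape (m,)
    J          -- visual-motor Jacobian estimate, shape (m, n)
    lam        -- proportional gain
    """
    # 1. Measure the error in image space
    error = x_desired - x_current
    # 2. Calculate the (pseudo-)inverse of the Jacobian
    J_pinv = np.linalg.pinv(J)
    # 3. Update the joint values so the image error decays
    dq = lam * (J_pinv @ error)
    return q + dq
```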

Image-Based Control Law

[Block diagram: the image-space error between the desired and extracted image features drives the joint update.]

Jacobian Calculation

- Known model (calibrated): an analytic form is available.
- Unknown model (uncalibrated): the Jacobian must be estimated.

Image Jacobian (Calibrated)

The analytic form maps camera velocity to image-feature velocity and depends on depth estimates; the camera/robot transform is also required, so there is little flexibility.
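The analytic expression itself is not in this text export. For context, the widely used interaction matrix for a single normalized point feature (x, y) at depth Z, relating camera translational and angular velocity (v_c, ω_c) to feature velocity, is shown below; this standard form is added for reference and is not taken verbatim from the slides.

```latex
\begin{bmatrix} \dot{x} \\ \dot{y} \end{bmatrix} =
\begin{bmatrix}
  -1/Z & 0 & x/Z & x y & -(1 + x^{2}) & y \\
  0 & -1/Z & y/Z & 1 + y^{2} & -x y & -x
\end{bmatrix}
\begin{bmatrix} v_c \\ \omega_c \end{bmatrix}
```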

Image Jacobian (Uncalibrated)

A popular local estimator is the recursive secant method (Broyden update); a sketch follows.
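A minimal sketch of the first-rank Broyden (secant) update, assuming the previous estimate J, the executed joint step dq, and the observed feature change dx are available (variable names are illustrative):

```python
import numpy as np

def broyden_update(J, dq, dx, alpha=1.0):
    """First-rank (secant) Broyden update of the visual-motor Jacobian.

    J     -- previous Jacobian estimate, shape (m, n)
    dq    -- executed joint displacement, shape (n,)
    dx    -- observed change in image features, shape (m,)
    alpha -- update gain in (0, 1]
    """
    dq = dq.reshape(-1, 1)          # column vector (n, 1)
    dx = dx.reshape(-1, 1)          # column vector (m, 1)
    denom = float(dq.T @ dq)
    if denom < 1e-12:               # skip the update for a near-zero step
        return J
    # Rank-1 correction along the direction of the executed joint step
    return J + alpha * (dx - J @ dq) @ dq.T / denom
```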

Calibrated vs. Uncalibrated

Calibrated:
- Model derived analytically
- Global asymptotic stability
- Optimal planning is possible
- Requires a lot of prior knowledge of the model

Uncalibrated (relaxed model assumptions):
- Traditionally local methods: no global planning
- Difficult to show that the asymptotic stability condition is ensured
- The problem of traditional methods is their locality

Global model estimation (research result): brings optimal trajectory planning and a global stability guarantee to the uncalibrated setting.

Synopsis of Global Visual Servoing

Model estimation (uncalibrated): from the visual-motor Jacobian to a global visual-motor kinematics model, i.e., extending linear estimation (the visual-motor Jacobian) to nonlinear estimation.

Our contributions:
- K-NN regression-based estimation
- Locally least squares estimation

Local vs. Global

Local methods. Key idea: use only the previous estimate to update the Jacobian.
1. RLS with a forgetting factor: Hosoda and Asada '94
2. First-rank Broyden update: Jägersand et al. '97
3. Exploratory motion: Sutanto et al. '98
4. Quasi-Newton Jacobian estimation for a moving object: Piepmeier et al. '04

Global methods. Key idea: use all of the interaction history to estimate the Jacobian. This enables globally stable controller design and optimal path planning, which local methods do not provide.

K-NN Regression-Based Method

[Figure: to predict the image feature x1 at a query configuration (q1, q2), the method looks up the 3 nearest neighbors (3-NN) among previously visited joint configurations.]
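A minimal sketch of the k-NN regression idea over a memory of visited (q, x) pairs, as a generic reconstruction rather than the author's exact implementation (function and variable names are illustrative):

```python
import numpy as np

def knn_predict(q_query, Q_memory, X_memory, k=3):
    """Predict image features at q_query by averaging the k nearest stored samples.

    q_query   -- query joint configuration, shape (n,)
    Q_memory  -- visited joint configurations, shape (N, n)
    X_memory  -- corresponding image features, shape (N, m)
    """
    # Euclidean distance in joint space to every stored configuration
    dists = np.linalg.norm(Q_memory - q_query, axis=1)
    nearest = np.argsort(dists)[:k]
    # Simple (unweighted) k-NN regression: average the neighbors' features
    return X_memory[nearest].mean(axis=0)
```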

Locally Least Squares Method

[Figure: the K nearest neighbors KNN(q) of the query configuration, together with their stored (x, q) pairs, are used to fit a local least-squares model that predicts x1 at the query.]
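A minimal sketch of the locally least squares idea, assuming a local affine model x ≈ x0 + J (q − q_query) is fit over the k nearest stored samples; this is a generic reconstruction, not the exact formulation from the slides:

```python
import numpy as np

def lls_jacobian(q_query, Q_memory, X_memory, k=8):
    """Estimate a local visual-motor Jacobian by least squares over k neighbors.

    Fits x ≈ x0 + J (q - q_query) to the k stored samples nearest q_query
    and returns the estimated Jacobian J, shape (m, n).
    """
    dists = np.linalg.norm(Q_memory - q_query, axis=1)
    nearest = np.argsort(dists)[:k]
    dQ = Q_memory[nearest] - q_query               # (k, n) joint offsets
    dX = X_memory[nearest]                         # (k, m) neighbor features
    # Affine design matrix [dQ, 1] so the intercept x0 is fit jointly with J
    A = np.hstack([dQ, np.ones((k, 1))])
    coef, *_ = np.linalg.lstsq(A, dX, rcond=None)  # (n+1, m) coefficients
    J = coef[:-1].T                                # (m, n) local Jacobian
    return J
```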

Experimental Setup

- PUMA 560 manipulator
- Eye-to-hand configuration with stereo vision
- Features: projections of the end-effector position onto the two image planes (4-dimensional); a sketch of this feature vector follows
- 3 degrees of freedom used for control
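As a small illustration of the feature vector described above, the 4-dimensional feature can be formed by stacking the end-effector's projections in the two images (the helper below is hypothetical, not from the experimental code):

```python
import numpy as np

def stereo_feature(p_left, p_right):
    """Stack the end-effector's left and right image projections into one feature.

    p_left, p_right -- (u, v) pixel coordinates of the end-effector in each image
    Returns the 4-dimensional visual feature used for servoing.
    """
    return np.concatenate([np.asarray(p_left, float), np.asarray(p_right, float)])
```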

Measuring the Estimation Error


Global Estimation Error


Noise on Estimation Quality

[Plots: estimation error of the KNN and LLS methods versus noise level.]

With increasing noise level, the error decreases.

Effect of Number of Neighbors


Conclusions

- Presented two global methods to learn the visual-motor function.
- LLS (global) works better than KNN (global) and local updates.
- KNN suffers from the bias in local estimations.
- Noise helps system identification.

Eye-in-Hand Simulator

Mean-Squared-Error


Task Errors


Questions?


Position-Based

Robust, real-time relative pose estimation: an Extended Kalman Filter (EKF) is used to solve the nonlinear relative pose equations.

Cons:
- The EKF is not the optimal estimator.
- Performance and convergence of the pose estimates are highly sensitive to the EKF parameters.

Overview of PBVS

The relative pose is estimated from nonlinear 2D-3D point correspondences using an iterated EKF (IEKF). What kind of nonlinearity is involved?

T. Lefebvre et al., "Kalman Filters for Nonlinear Systems: A Comparison of Performance," Intl. J. of Control, vol. 77, no. 7, pp. 639-653, May 2004.

EKF Pose Estimation

[Equations: the state variable contains the relative pose (translation plus yaw, pitch, and roll); the process model carries process noise and the measurement model carries measurement noise.]

The measurement equation is nonlinear and must be linearized.
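A minimal sketch of one EKF measurement update with that linearization, assuming generic functions h (the nonlinear measurement model) and H_jac (its Jacobian); all names are illustrative and not taken from the slides:

```python
import numpy as np

def ekf_update(x_pred, P_pred, z, h, H_jac, R):
    """One EKF measurement update with a linearized measurement model.

    x_pred -- predicted state (e.g., relative pose: translation + yaw/pitch/roll)
    P_pred -- predicted state covariance
    z      -- measured image-point coordinates
    h      -- nonlinear measurement function, h(x) -> predicted image points
    H_jac  -- function returning the Jacobian of h evaluated at x
    R      -- measurement noise covariance
    """
    H = H_jac(x_pred)                       # linearize the measurement equation
    y = z - h(x_pred)                       # innovation
    S = H @ P_pred @ H.T + R                # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)     # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(len(x_pred)) - K @ H) @ P_pred
    return x_new, P_new
```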

Visual-Servoing Based on the Estimated Global Model


Control Based on Local Models


Estimation for Local Methods

In practice: Broyden first-rank estimation, RLS with a forgetting factor, etc.
