
Sensors for Autonomous Vehicles



1© Thomas Hellström 2009

Sensors for autonomous vehicles

Thomas Hellström

Dept. of Computing Science, Umeå University, Sweden

2© Thomas Hellström 2009

Problems with mobility

Autonomous Navigation
• ”Where am I?” - Localization
• ”Where have I been?” - Map building
• ”Where am I going?” - Path planning
• ”How do I get there?” - Path tracking

Additional problems:
• ”Will I hit anything?” - Obstacle detection
• ”What will I hit?” - Classification
• ”How can I avoid it?” - Obstacle avoidance

Sensors

3© Thomas Hellström 2009

Sensors are essential

• Object sensors for detection and identification
• Pose sensors for localization

4© Thomas Hellström 2009

Object sensors

• Ultrasonic sonars
• Regular cameras
• Laser scanners
• 3D cameras
• Radars

5© Thomas Hellström 2009

Ultrasonic sonars

• Emits a “chirp” (50 kHz) and “listens” for the bounce back

• Used to determine range based on time of flight

• The object can be anywhere on the lobe perimeter

6© Thomas Hellström 2009

Ultrasonic sonars

Calculate distance based on the time of flight t:

d = ½ c t

c = c0 + 0.6 T
c0 = 331 m/s
T = temperature in degrees Celsius

• Sensitive to dirt
• WIDE lobe
• Not very suitable for heavy outdoor use
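As an illustration of the formulas above, here is a minimal Python sketch of the time-of-flight range calculation; the 20 °C default temperature and the example round-trip time are assumptions for the example, not values from the slide:

def sound_speed(temp_celsius):
    # Speed of sound in air: c = c0 + 0.6*T, with c0 = 331 m/s
    return 331.0 + 0.6 * temp_celsius

def sonar_range(time_of_flight, temp_celsius=20.0):
    # Range d = 0.5 * c * t, since the chirp travels out and back
    return 0.5 * sound_speed(temp_celsius) * time_of_flight

# Example: a 10 ms round trip at 20 degrees C is roughly 1.72 m
print(sonar_range(0.010))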


7© Thomas Hellström 2009

Regular cameras

• An image is a huge array of values of individual pixels
• Taken individually, these numbers are almost meaningless
• A robot needs information like "object ahead", "table to the left", or "person approaching" to perform its tasks

8© Thomas Hellström 2009

Computer Vision

• is the conversion of low-level picture information into high-level information
• Separate field of study from robotics
• Includes analysis of signals from:
  • cameras
  • thermal sensors
  • X-ray detectors
  • laser range finders
  • synthetic aperture radar

9© Thomas Hellström 2009

Computer Vision

• Segmentation (where are the physical objects?)
• Classification (what are these objects?)
• 3D reconstruction (estimating ranges from 2D pictures)
• …

10© Thomas Hellström 2009

Range from Vision

• Stereo camera pairs

• Light stripers

• Laser scanners

• 3D cameras

11© Thomas Hellström 2009

Stereo Camera Pairs

• A 3D structure is computed from two or more images taken from different viewpoints
• Same technique as humans and animals use
• Heavy computations
• Delicate hardware
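As a hedged sketch of the geometry behind stereo range estimation (the slide only states that 3D structure is computed from images taken from different viewpoints): for a rectified camera pair with focal length f, baseline B and pixel disparity d, depth follows Z = f B / d. The numbers below are illustrative assumptions:

def depth_from_disparity(disparity_px, focal_px, baseline_m):
    # Depth Z = f * B / d for a rectified stereo pair
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Example: f = 700 px, baseline = 0.12 m, disparity = 14 px -> Z = 6.0 m
print(depth_from_disparity(14.0, 700.0, 0.12))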

12© Thomas Hellström 2009

Laser Scanner

• Measures distance in a plane
• 181 (or 361) thin laser beams are emitted in a plane
• The time of flight of the signal is measured
• Max distance: 50 m
• Resolution: 5 cm
• Most often eye safe


13© Thomas Hellström 2009

Laser Scanner example

[Figure: laser scanner coverage and photo coverage]

14© Thomas Hellström 2009

3D Laser scanner: Riegl LMS-Z210

• Measurement range up to 150-350 m depending on reflectivity
• Minimum range 2 m
• Measurement accuracy typ. +/- 2.5 cm
• Eye safety Class 1 for the scanned beam

15© Thomas Hellström 2009

3D camera

• CSEM SwissRanger (two other manufacturers exist)
• Works like this:
  • Illuminates the scene with frequency-modulated infrared light
  • Measures the phase shift of the reflected light
  • This gives a distance for each pixel
  • Reconstructs the relative xyz coordinates
• Maximum range of 7.5 m
• 144 x 176 pixels
• Field of view: 40 degrees
• Frame rate ~30 Hz
• No moving parts (except for a fan)
• 5000 euro
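A minimal sketch of how a phase-measuring 3D camera turns a phase shift into a per-pixel distance. The 20 MHz modulation frequency is an assumption (it reproduces the 7.5 m unambiguous range quoted above); the exact SwissRanger parameters are not given on the slide:

import math

C = 3.0e8        # speed of light [m/s]
F_MOD = 20.0e6   # modulation frequency [Hz] (assumption)

def distance_from_phase(phase_shift_rad):
    # d = c * phi / (4 * pi * f_mod), phi in [0, 2*pi)
    return C * phase_shift_rad / (4.0 * math.pi * F_MOD)

print(C / (2.0 * F_MOD))             # unambiguous range: 7.5 m
print(distance_from_phase(math.pi))  # half the range: 3.75 m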

16© Thomas Hellström 2009

3D camera CSEM

Standard IR intensity image

Distance for each pixel.

Blue: close. Red: distant.

17© Thomas Hellström 2009

Radar

• Principle of radar: the radar emits a radio signal (green) which is scattered in all directions (blue). The time-of-flight t for the signal back to the radar gives the distance d:

  d = c t / 2

• Typically used in ships and airplanes
• Also used as robot sensors

18© Thomas Hellström 2009

Radar

• If the object moves, the frequency of the scattered wave changes.
• A doppler radar measures the shift in frequency and computes the speed (in addition to distance)
  • Ignores all stationary objects
  • Used in back-up alarms and also as robot sensors
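A minimal sketch of the doppler relation such a radar relies on; the 24 GHz carrier and the example frequency shift are illustrative assumptions, not values from the slides:

C = 3.0e8  # speed of light [m/s]

def radial_speed(delta_f_hz, f0_hz):
    # For a reflecting target: delta_f = 2 * v * f0 / c, so v = c * delta_f / (2 * f0)
    return C * delta_f_hz / (2.0 * f0_hz)

# Example: a 24 GHz radar seeing a 1.6 kHz shift -> about 10 m/s
print(radial_speed(1600.0, 24.0e9))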


19© Thomas Hellström 2009

Radar

Sequential Lobing Radar (Tyco)
• Normally used as frontal radar for cars
• Can track 10 obstacles closer than 30 m
• Range calculated by time of flight
• Bearing calculated by comparing two different patterns sent out by the antenna

20© Thomas Hellström 2009

Sensor fusion

• Most object sensors are noisy and do not give crisp information about the presence of objects or pose
• Common to use a probabilistic approach: each sensor readout gives a probability for objects occupying areas or for a certain pose
• Many readouts are fused into a combined opinion

21© Thomas Hellström 2009

An occupancy grid is used to fuse object sensor readings and to create a map of the surrounding terrain.

• Sensor: laser scanner
• Cell size: 0.1 meter
• Map update frequency: 5 Hz
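One common way to implement the probabilistic fusion described above is a log-odds update per grid cell. The sketch below is not taken from the slides; the grid dimensions and the 0.7/0.4 sensor model are assumptions:

import math

CELL_SIZE = 0.1                              # [m], as on the slide
grid = [[0.0] * 100 for _ in range(100)]     # log-odds, 0.0 means p = 0.5

def logit(p):
    return math.log(p / (1.0 - p))

def update_cell(ix, iy, p_occupied):
    # Fuse one sensor readout by adding the log-odds of its occupancy estimate
    grid[iy][ix] += logit(p_occupied)

def probability(ix, iy):
    # Convert the fused log-odds back to an occupancy probability
    return 1.0 / (1.0 + math.exp(-grid[iy][ix]))

# Example: three "hit" readouts (p = 0.7) and one "miss" (p = 0.4)
for p in (0.7, 0.7, 0.7, 0.4):
    update_cell(10, 20, p)
print(probability(10, 20))   # clearly above 0.5 after fusion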

22© Thomas Hellström 2009

Sensors for localization

• A little more than ”Where am I?”
• A vehicle has (at least) 6 degrees of freedom (DOF) expressed by the pose:
  (x, y, z, φ, θ, ψ)
  - Position = (x, y, z)
  - Attitude = (roll, pitch, yaw) = (φ, θ, ψ) = (phi, theta, psi)

23© Thomas Hellström 2009

Pose

Definitions:
• Roll (a.k.a. ”bank angle”) φ: the angle between y’ and the x-y plane. -π < φ < π
• Pitch (a.k.a. ”elevation”) θ: the angle between x’ and the x-y plane. -π < θ < π
• Yaw (a.k.a. ”heading” or ”azimuth”) ψ: the angle between x and the projection of x’ on the x-y plane. -π < ψ < π

The signs of the angles are defined in a right-handed coordinate system.

24© Thomas Hellström 2009

Using the Right-Hand Rule

To remember positive and negative rotation directions:

- Open your right hand

- Stick out your thumb

- Aim your thumb in an axis positive direction

- Curl your fingers around the axis

- The curl direction is a positive rotation

[Figure: positive rotation directions for Roll, Pitch and Yaw]


25© Thomas Hellström 2009

Pose sensors

Absolute:
• GPS
• -

Dead reckoning (relative):
• Wheel odometry
• Accelerometers
• Gyroscopes

26© Thomas Hellström 2009

GPS

• Delivers position for a moving receiver: Latitude, Longitude, Altitude
• Speed and direction of movement can be estimated

27© Thomas Hellström 2009

Dead reckoning

Relative motion: advance the previous pose through displacement information.

• The simplest kind is called odometry: steering angle and velocity from actuators and shaft encoders.
• Accelerometers: use Newton's 2nd law F = ma, where a is the second derivative of the displacement, i.e. changes in (x, y, z).
• Gyroscopes: measure rotations, i.e. changes in (φ, θ, ψ)
• Cameras
• Laser scanners

28© Thomas Hellström 2009

Wheel Odometry

• Compute changes in the 2D pose (x, y, θ) from steering angle and velocity
  • Steering angle from angle sensor
  • Velocity from shaft encoders or speed sensor
• Translate steering angle and velocity to (x, y, θ): kinematics equations! (see the sketch below)
• Problems with wheel odometry:
  • The wheels move but not the robot (spinning)
  • The robot moves but not the wheels (slipping, sliding)
  • The ground is not flat
• Causes drift!

Dead reckoning
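The sketch below illustrates one possible set of kinematics equations (a simple bicycle model with Euler integration); the wheelbase value and the sample inputs are assumptions, not values from the slides:

import math

WHEELBASE = 2.5  # distance between front and rear axle [m] (assumption)

def odometry_step(x, y, theta, v, steering_angle, dt):
    # Advance the 2D pose (x, y, theta) one time step dt,
    # given speed v [m/s] and front-wheel steering angle [rad]
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += v / WHEELBASE * math.tan(steering_angle) * dt
    return x, y, theta

# Example: drive 1 s at 2 m/s with a small constant steering angle
pose = (0.0, 0.0, 0.0)
for _ in range(100):     # 100 steps of 10 ms
    pose = odometry_step(*pose, v=2.0, steering_angle=0.05, dt=0.01)
print(pose)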

29© Thomas Hellström 2009

Shaft encoders

Counts rotation steps for the wheels or engine

Dead reckoning

30© Thomas Hellström 2009

Localization with odometry

Position from GPS and odometry. Mean difference between the position from GPS and wheel odometry for a Valmet 830 forwarder.

Dead reckoning


31© Thomas Hellström 2009

Accelerometers

• A variant of ”dead reckoning” for measuring change in position (x, y, z)
• Common in airplanes, missiles and submarines since the 50's

1. F is measured with 3 accelerometers
2. F = m a
3. Compute the displacement by integrating a twice (see the sketch below)

Often implemented in silicon

[Figure: force components Fx, Fy, Fz acting on mass m]

Dead reckoning
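A minimal sketch of the double integration in step 3; the Euler integration and the sample data are assumptions, and a real system would also remove gravity and compensate for sensor bias:

def integrate_twice(accels, dt):
    # accels: list of (ax, ay, az) samples [m/s^2]; returns the displacement
    vel = [0.0, 0.0, 0.0]
    pos = [0.0, 0.0, 0.0]
    for a in accels:
        for i in range(3):
            vel[i] += a[i] * dt      # first integration: velocity
            pos[i] += vel[i] * dt    # second integration: position
    return pos

# Example: constant 1 m/s^2 along x for 2 s gives roughly 0.5*a*t^2 = 2 m
samples = [(1.0, 0.0, 0.0)] * 200
print(integrate_twice(samples, dt=0.01))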

32© Thomas Hellström 2009

Gyroscopes

• Measures rotation with one, two or three DOF
• The inertia of the spinning wheel gives a reference for rotations
• Estimates (φ, θ, ψ) by summing up gyroscope rotations
• Fiber optical or solid state

[Figure: spinning-wheel gyroscope with x, y, z axes]

Dead reckoning

33© Thomas Hellström 2009

INS (Inertial Navigation Systems)

Illustration: Greg Welsh, University of North Carolina at Chapel Hill; Eric Foxlin, InterSense

• 3 accelerometers and 3 gyroscopes
• Measures changes in the pose (x, y, z, φ, θ, ψ)
• Good accuracy for short periods
• I.e. the same problem as wheel odometry, but not sensitive to slipping and sliding

Dead reckoning

34© Thomas Hellström 2009

Ground Speed Radar

Measures the real ground speed

Change in position can be computed

Dead reckoning

35© Thomas Hellström 2009

Localization with Laser scanners

Scan from 1 meter
Scan from 5 meters

Basic idea: the pose change (∆x, ∆y, ∆Φ) between scans can be computed from the scans

36© Thomas Hellström 2009

Absolute localization using laser

• Manually drive the path
• Save laser snapshots and laser scanner poses in a database


37© Thomas Hellström 2009

Absolute localization using laser

Autonomously repeat the path. At each step:
1. Take a snapshot
2. Find the best matching snapshot in the DB
3. Find the optimal ∆x, ∆y, ∆Φ
4. Estimate the pose (x, y, Φ) of the robot from the stored pose (x’, y’, Φ’) and the offset (∆x, ∆y, ∆Φ)
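A minimal sketch of step 4, combining the stored snapshot pose with the scan-matching offset; it assumes the offset (∆x, ∆y, ∆Φ) is expressed in the snapshot's own coordinate frame, which the slide does not specify:

import math

def estimate_pose(stored, offset):
    # stored = (x', y', phi') from the database, offset = (dx, dy, dphi)
    xs, ys, phis = stored
    dx, dy, dphi = offset
    x = xs + math.cos(phis) * dx - math.sin(phis) * dy
    y = ys + math.sin(phis) * dx + math.cos(phis) * dy
    return x, y, phis + dphi

# Example with illustrative numbers: snapshot at (10, 5, 90 degrees),
# offset of 1 m "forward" in the snapshot frame
print(estimate_pose((10.0, 5.0, math.pi / 2), (1.0, 0.0, 0.0)))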

38© Thomas Hellström 2009

Other pose sensors

• Mechanical tilt sensors
  • Measure absolute attitude (φ, θ)
• Magnetic compass
  • Measures the earth's magnetic field along 3 axes. This gives three angles
  • Very sensitive to metal objects!
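A minimal sketch of extracting a heading from the horizontal components of a 3-axis magnetic field reading; it assumes the sensor is held level (in general the reading must first be tilt-compensated using φ and θ), and the sign convention is one common choice, not taken from the slide:

import math

def heading_from_mag(mx, my):
    # Heading (yaw) in radians from the horizontal field components
    return math.atan2(-my, mx)

# Example with illustrative field components
print(math.degrees(heading_from_mag(0.2, -0.2)))   # 45.0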