Robust Visual Odometry for Space Exploration
Dr. Andy Shaw
XROB VO Breadboard 24 June 2013 2
Presenter 1 & Presenter 2 Monday, 24 June 2013
Overview
• Background and Component Description
• Reference Requirements
• Visual Odometry Technical Description
• Test Site
• Platform
• Ground Truth
• Visual Odometry Results
• Conclusions
• Questions
Background – Autonomy Overview
[Autonomy architecture diagram: a MISSION layer decomposed into TASK and ACTION layers]

Autonomy management: Interaction Management, Goal Specification, Goal Arbitration, Resource State Estimation, Planning & Re-planning, Plan Dispatch, Plan Validation

Tasks: Capture Image, Localise, Navigate, ARM Placement, Science Assessment, Science Response, End Effector Exchange, Drill, Power Management, Analyse Sample

Actions (component interfaces):
• Camera: Set Parameters, Get Parameters, Get Image, Calibrate, Classify
• PTU: Set Limits, Get Limits, Set Angle, Get Angle
• Battery: Get State, Get Charge Rate
• Sensors: Compass, IMU, Solar Flux, Temperature, Star Tracker
• Locomotion: Get/Set Direction, Get/Set Distance
• ARM: DOF Move, Calculate, Align, Compliance
• EE: Disconnect, Reconnect, (De)Activate, Type
• Analytical Instruments: Spectrometer, Microscope, Raman/LIBS, X-ray, Organic Analyser, Radiation
• Drill: Get/Set Speed, Get/Set Depth, Get/Set Torque
• Radar: Get/Set Power, Get/Set Depth
Background – Localisation

[Localisation architecture diagram]

TASK: Localisation
ACTION: Platform-specific localisation sensors
• Internal sensors: Wheel Odometry, IMU/Tilt, Compass
• External sensors: Sun/Star Tracker, GPS/Radio Beacon, Visual Odometry
• Sensor interface: Sensor Configuration, Sensor Feedback, Safety Monitor, Sensor History
• Localisation: Sensor Fusion, Filtering, Local Map, Global Map, Trajectory Tracker
Reference Requirements
• Shall provide a 6-DOF relative pose estimate at a rate of 1 Hz. The pose includes translation and rotation information, with the x axis straight ahead, the y axis to the right, and the z axis downwards
• Shall provide a pose with an accuracy equal to or better than 1% of the distance travelled
  » Maximum rover speed of 10 cm/s
  » Maximum rover rotation of 10 deg/s
• Shall provide a pose with an accuracy equal to or better than 5% of the distance travelled
  » Maximum rover speed of 200 cm/s
  » Maximum rover rotation of 20 deg/s
• The rover shall know its relative location, w.r.t. the last stopping place where topographic information was acquired for path generation, to better than 10 cm at all times during its locomotion
• Pose accuracy shall be verified in a quarry with Martian analogue terrain and at the ESTEC site. Traverses shall consist of 10 m, 50 m, and 100 m ranges
• Shall provide an indicator to show the confidence that the estimate's accuracy level is within 1%
Reference Requirements – Cont.
• Cameras shall have automatic exposure, shutter, and gain adjustment without manual input
• The system shall not be sensitive to projected shadows
• Shall provide pose estimates in daylight conditions. During external trials the time duration will be influenced by seasonal and environmental conditions
• Shall provide a strategy to continue operations when the sun is in the FOV
• Shall have the following modes: health check and self-calibration, odometry acquisition, video acquisition, and image acquisition
• The VOS shall provide a pose estimate in the Vision Sensor reference frame. Note: the localisation system will perform the conversion between reference frames
Visual Odometry - High Level Overview
• Motion estimation from images can be achieved by tracking feature point locations that are easily identifiable and repeatable across multiple images, for example corners.
• By tracking a modest number of features (~100), very good performance can be achieved.
• This imposes a static-world assumption on the system, although in reality some amount of dynamic world motion can be tolerated.
Visual Odometry – Camera Selection
• In the estimation it is assumed that all observations of feature points within a single image happen at exactly the same time.
• Cameras are used with global shutter systems, instead of rolling shutters.
• Choosing to use a stereo camera alleviates the depth perception problem encountered with a mono camera
• When dealing with a stereo pair, it is important that both images are captured at exactly the same time.
• It is assumed that the rows across the pair of images are exactly aligned, as this aids feature matching.
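The row-alignment assumption is what makes stereo depth recovery simple: once a feature's horizontal disparity between the two images is known, its depth follows from z = f·b/d. A minimal sketch, using an illustrative focal length and baseline rather than the trial hardware's values:

```python
# Illustrative (assumed) calibration values, not those of the trial cameras.
FOCAL_PX = 700.0    # focal length in pixels
BASELINE_M = 0.12   # distance between the two camera centres in metres

def depth_from_disparity(disparity_px: float) -> float:
    """Depth (m) of a feature from its horizontal disparity in a rectified stereo pair."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a point in front of the rig")
    return FOCAL_PX * BASELINE_M / disparity_px

print(depth_from_disparity(42.0))  # a 42-pixel disparity maps to 2.0 m depth
```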
Visual Odometry – Image Rectification
• Images from cameras are not perfect:
  » They typically suffer from lens distortion. To fix this, both images are rectified.
  » Chromatic aberration is wavelength dependent (not an issue here).
• An example of image rectification carried out during the Tubney Quarry trials: the left image is un-rectified, the right image has been rectified.
Visual Odometry – Feature Extraction
• Feature points are identified within the image. In our system we use the FAST corner extractor. FAST produces a large number of candidate corners at small computational cost.
• Some of these corners will be better than others, so they are ranked based on a "good corner" score (here we use the Harris score).
• To achieve robustness to motion blur we run FAST at different scales of the original image, known as "pyramid levels".
• If we simply took the top N features, we would typically find them clustered around a few strong corners in the image. Instead we force the features to spread out across the image using a quad tree. This constrains the number of features in any particular region of the image, as well as the maximum number of features tracked.
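The spreading step can be sketched with a simple uniform grid standing in for the quad tree (same intent, fixed depth); the function name, grid size, and per-cell cap below are illustrative choices, not the component's actual parameters:

```python
from collections import defaultdict

def bucket_features(features, img_w, img_h, grid=8, per_cell=2):
    """Spread (x, y, score) corners across the image: keep at most
    `per_cell` top-scoring corners in each cell of a grid x grid
    partition, bounding both local density and total feature count."""
    cells = defaultdict(list)
    for x, y, score in features:
        cx = min(int(x * grid / img_w), grid - 1)
        cy = min(int(y * grid / img_h), grid - 1)
        cells[(cx, cy)].append((score, x, y))
    kept = []
    for cell in cells.values():
        cell.sort(reverse=True)                 # best "good corner" score first
        kept.extend((x, y) for _, x, y in cell[:per_cell])
    return kept
```

A cluster of strong corners in one cell contributes at most `per_cell` features, while an isolated corner elsewhere always survives.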
Visual Odometry – Feature Matching
• When features are initialised for the first time, they need to be found in both the left and right image of the stereo pair.
• A feature on row k in the left image can be assumed to line up with row k in the right image.
• A 1D search using mean SAD (sum of absolute differences) is used to compute the best matching score.
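That 1D search can be sketched as follows, assuming NumPy and toy single-row intensities; the patch size and names are illustrative:

```python
import numpy as np

def match_along_row(left_row, right_row, x_left, patch=5):
    """Find the best match for a patch centred at x_left in the left
    image's row k by a 1-D mean-SAD search along row k of the right image."""
    half = patch // 2
    template = left_row[x_left - half : x_left + half + 1].astype(np.float64)
    best_x, best_sad = None, np.inf
    for x in range(half, len(right_row) - half):
        window = right_row[x - half : x + half + 1].astype(np.float64)
        sad = np.mean(np.abs(template - window))   # mean sum of absolute differences
        if sad < best_sad:
            best_sad, best_x = sad, x
    return best_x, best_sad
```

A feature at column 32 in the left row whose pattern appears at column 22 in the right row yields a disparity of 10 pixels with a SAD score of zero.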
Visual Odometry – Temporal Matching
• Temporal matching is the process of matching features between frames from time t to t+1.
• The camera could have undergone arbitrary motion between the two capture times.
• A method called Binary Robust Independent Elementary Features (BRIEF) is used to match the features.
• BRIEF provides an integer pixel location within the image; however, better performance is achieved if we refine this estimate with sub-pixel matching. This refinement is performed using Efficient Second Order Minimization (ESM).
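The core idea of BRIEF can be sketched as a fixed set of random pixel-pair intensity comparisons packed into a bit string and compared by Hamming distance; the bit count, patch size, and helper names below are illustrative, not the component's settings:

```python
import random

def make_brief_pairs(n_bits=128, patch=15, seed=0):
    """Random pixel-pair offsets inside the patch, fixed once and reused."""
    rng = random.Random(seed)
    h = patch // 2
    return [((rng.randint(-h, h), rng.randint(-h, h)),
             (rng.randint(-h, h), rng.randint(-h, h))) for _ in range(n_bits)]

def brief_descriptor(img, x, y, pairs):
    """One bit per pair: set if the first pixel is darker than the second."""
    bits = 0
    for (dx1, dy1), (dx2, dy2) in pairs:
        bits = (bits << 1) | (img[y + dy1][x + dx1] < img[y + dy2][x + dx2])
    return bits

def hamming(a, b):
    """Descriptor distance: the number of differing bits."""
    return bin(a ^ b).count("1")
```

Matching a feature from frame t to frame t+1 then amounts to finding the candidate with the smallest Hamming distance, which is cheap to compute on binary strings.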
Visual Odometry – Motion Estimate
• Once a set of refined feature matches between two stereo frames has been generated, the 6-DoF motion between the two frames can be computed, starting with an initial RANSAC (RANdom SAmple Consensus) step that highlights and removes any outliers.
• A least-squares minimisation is then performed using an m-estimator for robustness. After this step, any new outliers are also removed.
• Once the motion estimation has been completed and outliers removed, new features are added in the appropriate locations.
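The RANSAC outlier-rejection step can be illustrated with a stripped-down sketch. For brevity it estimates only a 2-D translation between matched points (the real component estimates full 6-DoF motion and refines with an m-estimator); all names and thresholds are illustrative:

```python
import random

def ransac_translation(matches, iters=100, tol=0.5, seed=1):
    """RANSAC sketch: hypothesise a translation from one randomly chosen
    match, count how many matches agree with it, keep the best hypothesis,
    then refine over its inliers (mean translation = least squares here)."""
    rng = random.Random(seed)
    best_inliers = []
    for _ in range(iters):
        (ax, ay), (bx, by) = rng.choice(matches)     # minimal sample: one match
        dx, dy = bx - ax, by - ay
        inliers = [m for m in matches
                   if abs((m[1][0] - m[0][0]) - dx) < tol
                   and abs((m[1][1] - m[0][1]) - dy) < tol]
        if len(inliers) > len(best_inliers):
            best_inliers = inliers
    n = len(best_inliers)
    dx = sum(b[0] - a[0] for a, b in best_inliers) / n
    dy = sum(b[1] - a[1] for a, b in best_inliers) / n
    return (dx, dy), best_inliers
```

Given ten matches consistent with a translation of (3, -1) and two gross outliers, the outliers never gather a larger consensus set, so they are excluded from the refinement.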
Test Site
• UK site
• 300 x 200 m
• Sandy
• Little vegetation
Platform - INDIE
• 6 wheel drive, 2 wheel steer
• On-board processing with autonomous:
  » Localisation
  » Navigation
  » Planning
  » Sensor acquisition
• Passive suspension
• 3.5 hour operation time
• Good terrain traversability
• Stereo cameras on PTU
• IMU
• GPS data tagging
Ground Truth
• NovAtel DGPS, sub-centimetre accuracy @ 1 Hz
Visual Odometry Tests
• Generate a variety of tests that address the requirements
  » Traverse in a straight line over 100m distance
  » Traverse with 360 degree rotation along the path
  » Traverse into direct sunlight
  » Traverse across a slope to produce side slip
  » Traverse containing all previous components
  » Traverse in a "snake" motion
  » Traverse two large loops
  » Approach a gully face
  » Astronaut follow
  » Shadows changing in the field of view
• Indoor environment
  » Astronaut follow
  » 90m traverse and return to the start
Visual Odometry Results - 100m traverse
VOR - Small traverse with 360 degree rotation
[Figure: trajectory plots of the traverse comparing DGPS ground truth against the VO estimate (top-down path and elevation profiles); legend: DGPS, VO]
VOR - Traverse into direct sunlight
[Figure: trajectory plots of the traverse comparing DGPS ground truth against the VO estimate (top-down path and elevation profiles); legend: DGPS, VO]
VOR – 256m Traverse
[Figure: trajectory plots of the traverse comparing DGPS ground truth against the VO estimate (top-down path and elevation profiles); legend: DGPS, VO]
VOR – Snake Traverse
[Figure: trajectory plots of the traverse comparing DGPS ground truth against the VO estimate (top-down path and elevation profiles); legend: DGPS, VO]
VOR – Approach (numerous times)
VOR – Astronaut Follow
VOR - Shadows
VOR – TAS-I
VOR – TASI 90m
Visual Odometry - Summary

Description                   Total Distance (m)   RMS error (m)   Percentage
100m straight line traverse   106.39               0.33            0.31
Traverse with rotation        37.36                0.017           0.1
Direct sunlight               16.57                0.06            0.4
Slope                         7.68                 0.01            0.13
Around the quarry             256.13               0.99            0.38
Snake                         64.85                0.72            1.1
High Speed                    50                   1.65            3.3
Loops                         94.96                0.18            0.2
Gully                         60.87                0.17            0.3
Approach                      10.08                0.03            0.3
Astronaut follow              31.59                0.04            0.12
Long left                     51.48                0.14            0.27
Shadows                       0                    0.44            N/A
Return to tent                5.3                  0.26            0.47
Total                         1843.26 (total)      44.94 (total)   0.6 (average)
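The Percentage column above is simply the RMS error expressed as a fraction of the distance travelled; a one-line check reproduces two of the table's rows:

```python
def error_percentage(rms_error_m: float, distance_m: float) -> float:
    """Percentage column of the summary table: RMS error over distance travelled."""
    return 100.0 * rms_error_m / distance_m

print(round(error_percentage(0.33, 106.39), 2))  # 100m straight line row -> 0.31
print(round(error_percentage(1.65, 50.0), 2))    # High Speed row -> 3.3
```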
Further Work - Chile
Further Work - GFreeNav
Further Work – Representative Hardware
• Evaluation of the suitability of the OVO and localisation architecture on space-qualified hardware (ExoMars breadboard (75 MIPS LEON2) as a baseline).
• Testing on flight-representative hardware used image data captured during trials in the Atacama Desert, covering a distance of 0.85 km.
• Results showed a mean time to process image frames below the required 10 s per frame.
• Positional estimates provided less than ~1.5% RMS error.
Conclusions
• This work sought to evaluate a state-of-the-art visual odometry breadboard for EGP localisation.
• The extensive trial results demonstrate that the component satisfies the XROB requirements for mobile platform missions with a high level of performance.
• Tested in a variety of environments:
  » Local quarry trials carried out in the UK
  » Clean room at TAS-I
  » A complementary version of the component was tested in the Atacama Desert
• This provides a high degree of confidence in the proposed technology, as these environments offered the complete range of vision/terrain conditions expected on a Moon/Mars flight mission.
• Given the results of this activity, where the component was integrated as part of a complete GNC solution, the technology is at TRL 5 – "System/subsystem model or prototype demonstration in a relevant environment (ground or space)".
VOR – Chile
Chile: total distance 1050 m, RMS error 17.35 m, 1.6%