
Page 1

Introduction to Sensor Data Fusion: Methods and Applications

• Last lecture: “Why Sensor Data Fusion?”
  – Motivation, general context
  – Discussion of examples

• oral examination: 6 credit points after the end of the semester
• prerequisite: participation in the exercises, running programs
• continuation in Summer: lectures and seminar on advanced topics
• job opportunities as research assistant in ongoing projects, practicum
• subsequently: master’s theses at Fraunhofer FKIE, PhD theses possible
• slides/script: email to [email protected], download

Sensor Data Fusion - Methods and Applications, 1st Lecture on October 25, 2017

Page 2

Sensor & Information Fusion: Basic Task

information sources: defined by operational requirements

Page 3

Sensor & Information Fusion: Basic Task

information to be fused: imprecise, incomplete, ambiguous, unresolved, false, deceptive, hard to formalize, contradictory . . .

Page 4

Sensor & Information Fusion: Basic Task

information to be fused: imprecise, incomplete, ambiguous, unresolved, false, deceptive, hard to formalize, contradictory . . .

information sources: defined by operational requirements

Page 5

• single sensors/network: measurements
  – kinematical parameters
  – classification attributes

• data processing/fusion
  – temporal integration / logical analysis
  – statistical estimation / data association
  – combination with a priori information

• condensed information
  – objects represented by “track structures”
  – quantitative accuracy/reliability measures
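The “condensed information” described above is typically held in a track structure: a state estimate together with quantitative accuracy and reliability measures. A minimal Python sketch; the field names and example values are illustrative, not prescribed by the lecture:

```python
from dataclasses import dataclass, field

import numpy as np


@dataclass
class Track:
    """Minimal 'track structure': state estimate plus quality measures.

    Field names are illustrative, not prescribed by the lecture.
    """
    track_id: int
    state: np.ndarray        # kinematical parameters, e.g. [x, y, vx, vy]
    covariance: np.ndarray   # quantitative accuracy measure
    existence_prob: float    # reliability measure: probability the track is real
    class_probs: dict = field(default_factory=dict)  # classification attributes


# Example track: constant-velocity state with large initial uncertainty.
track = Track(
    track_id=1,
    state=np.array([0.0, 0.0, 10.0, 0.0]),
    covariance=np.eye(4) * 100.0,
    existence_prob=0.9,
    class_probs={"vehicle": 0.7, "unknown": 0.3},
)
```

Keeping the covariance and existence probability inside the track is what lets later processing stages (association, cancellation, classification) reason about how much the estimate can be trusted.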

Page 6

A Generic Tracking and Sensor Data Fusion System

[Block diagram] Sensor Systems (Sensor Data / Sensor Control) feed the Tracking & Fusion System:
– Sensing Hardware: Received Waveforms
– Signal Processing: Parameter Estimation
– Detection Process: Data Rate Reduction
– Track Extraction / Track Initiation: Multiple Frame
– Track Association: Sensor Data to Track File Storage
– Track Maintenance: Prediction, Filtering, Retrodiction
– A Priori Knowledge: Sensor Performance, Object Environment, Object Characteristics
– Track Processing: Track Cancellation, Object Classification / ID, Track-to-Track Fusion
– Man-Machine Interface: Displaying Functions, Object Representation, Interaction Facilities

Page 7

source of information: received waveforms (space-time)

Page 8

detection: a decision process for data rate reduction

Page 9

data fusion input: estimated target parameters, false plots

Page 10

core function: associate sensor data to established tracks

Page 11

track maintenance: updating of established target tracks

Page 12

initiation: establish new tracks, re-initiate lost tracks

Page 13

exploit available context information (e.g. topography)

Page 14

fusion/management of pre-processed track information

Page 15

user interface: presentation, interaction, decision support

Page 16

Target Tracking: Basic Idea, Demonstration

Problem-inherent uncertainties and ambiguities!
BAYES: processing scheme for ‘soft’, ‘delayed’ decision

sensor performance: • resolution conflicts • DOPPLER blindness
environment: • dense situations • clutter • jamming/deception
target characteristics: • qualitatively distinct maneuvering phases
background knowledge: • vehicles on road networks • tactics

Page 17

pdf: t_{k−1}

Probability density functions (pdfs) p(x_{k−1} | Z^{k−1}) represent imprecise knowledge on the ‘state’ x_{k−1}, based on imprecise measurements Z^{k−1}.

Page 18

pdf: t_{k−1}
Prediction: t_k

Exploit imprecise knowledge on the dynamical behavior of the object.

p(x_k | Z^{k−1}) = ∫ dx_{k−1} p(x_k | x_{k−1}) p(x_{k−1} | Z^{k−1}),
i.e. prediction = ∫ dynamics × old knowledge.
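The prediction integral can be sketched numerically on a 1-D grid; the random-walk dynamics model and all numbers below are assumptions for illustration, not from the lecture:

```python
import numpy as np

# Grid over a 1-D state x; p_old approximates the old knowledge p(x_{k-1} | Z^{k-1}).
x = np.linspace(-10.0, 10.0, 401)
dx = x[1] - x[0]
p_old = np.exp(-0.5 * (x - 1.0) ** 2)   # Gaussian, mean 1, std 1 (assumed)
p_old /= p_old.sum() * dx               # normalize on the grid

# Dynamics model p(x_k | x_{k-1}): a random walk with std q (an assumption).
q = 1.5
K = np.exp(-0.5 * ((x[:, None] - x[None, :]) / q) ** 2) / (np.sqrt(2.0 * np.pi) * q)

# Prediction: p(x_k | Z^{k-1}) = sum over x_{k-1} of
# p(x_k | x_{k-1}) * p(x_{k-1} | Z^{k-1}) * dx
p_pred = K @ p_old * dx

# Prediction dissipates knowledge: the variance grows by q**2.
mean_old = np.sum(x * p_old) * dx
var_old = np.sum((x - mean_old) ** 2 * p_old) * dx
mean_pred = np.sum(x * p_pred) * dx
var_pred = np.sum((x - mean_pred) ** 2 * p_pred) * dx
```

The broadening of the density (var_pred ≈ var_old + q²) is exactly the “knowledge dissipation” shown on the following slides.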

Page 19

pdf: t_{k−1}
t_k: no plot

missing sensor detection: ‘data processing’ = prediction
(not always: exploitation of ‘negative’ sensor evidence)

Page 20

pdf: t_{k−1}
pdf: t_k
Prediction: t_{k+1}

missing sensor information: increasing knowledge dissipation

Page 21

pdf: t_{k−1}
pdf: t_k
t_{k+1}: one plot

sensor information on the kinematical object state

Page 22

pdf: t_{k−1}
pdf: t_k
Prediction: t_{k+1}
likelihood (sensor model)

BAYES’ formula:

p(x_{k+1} | Z^{k+1}) = p(z_{k+1} | x_{k+1}) p(x_{k+1} | Z^k) / ∫ dx_{k+1} p(z_{k+1} | x_{k+1}) p(x_{k+1} | Z^k),
i.e. new knowledge = likelihood of the plot z_{k+1} × prediction, normalized.
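BAYES’ formula can be sketched on the same kind of 1-D grid; the predicted density, the plot value and the measurement error below are assumed for illustration:

```python
import numpy as np

# Grid over a 1-D state; p_pred approximates the prediction p(x_{k+1} | Z^k).
x = np.linspace(-10.0, 10.0, 401)
dx = x[1] - x[0]
p_pred = np.exp(-0.5 * (x / 2.0) ** 2)   # broad predicted density: N(0, 2^2)
p_pred /= p_pred.sum() * dx

# Likelihood (sensor model) of a single plot z with measurement std r (assumed).
z, r = 3.0, 1.0
lik = np.exp(-0.5 * ((z - x) / r) ** 2)

# BAYES' formula: new knowledge = likelihood * prediction, normalized.
p_post = lik * p_pred
p_post /= p_post.sum() * dx

mean_post = np.sum(x * p_post) * dx
var_post = np.sum((x - mean_post) ** 2 * p_post) * dx
# Analytically (Gaussian prior N(0,4), likelihood N(3,1)): mean 2.4, variance 0.8.
```

The posterior is pulled toward the plot and is narrower than the prediction: the plot adds knowledge.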

Page 23

pdf: t_{k−1}
pdf: t_k
pdf: t_{k+1} (Bayes)

filtering = sensor data processing

Page 24

pdf: t_{k−1}
pdf: t_k
t_{k+1}: three plots

ambiguities by false plots: 1 + 3 data interpretation hypotheses
(‘detection probability’, false alarm statistics)
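The 1 + 3 interpretation hypotheses can be weighted from the detection probability, the false-alarm density, and the measurement likelihoods. A PDA-style sketch (not necessarily the lecture’s exact weighting); all numbers are illustrative assumptions:

```python
import numpy as np

# Predicted measurement: Gaussian with mean z_hat and innovation std s (assumed).
z_hat, s = 0.0, 1.0
plots = np.array([-2.0, 0.5, 4.0])   # the three plots (illustrative positions)

p_d = 0.9     # detection probability
rho = 0.01    # false-alarm (clutter) density, per unit measurement volume

def gauss(z, m, sd):
    return np.exp(-0.5 * ((z - m) / sd) ** 2) / (np.sqrt(2.0 * np.pi) * sd)

# 1 + 3 data interpretation hypotheses:
#   h0   : target not detected, all three plots are false
#   h1-h3: plot i originates from the target, the other two are false
# Weights up to factors common to all hypotheses:
w = np.empty(4)
w[0] = (1.0 - p_d) * rho
w[1:] = p_d * gauss(plots, z_hat, s)
w /= w.sum()    # normalized hypothesis weights
```

The plot closest to the prediction dominates, but the missed-detection and far-plot hypotheses keep non-zero weight: a ‘soft’, ‘delayed’ decision.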

Page 25

pdf: t_{k−1}
pdf: t_k
pdf: t_{k+1}

Multimodal pdfs reflect ambiguities inherent in the data.

Page 26

pdf: t_{k−1}
pdf: t_k
pdf: t_{k+1}
Prediction: t_{k+2}

temporal propagation: dissipation of the probability densities

Page 27

pdf: t_{k−1}
pdf: t_k
pdf: t_{k+1}
t_{k+2}: one plot

association tasks: sensor data ↔ interpretation hypotheses

Page 28

pdf: t_{k−1}
pdf: t_k
pdf: t_{k+1}
Prediction: t_{k+2}
likelihood

BAYES: p(x_{k+2} | Z^{k+2}) = p(z_{k+2} | x_{k+2}) p(x_{k+2} | Z^{k+1}) / ∫ dx_{k+2} p(z_{k+2} | x_{k+2}) p(x_{k+2} | Z^{k+1})

Page 29

pdf: t_{k−1}
pdf: t_k
pdf: t_{k+1}
pdf: t_{k+2}

in particular: re-calculation of the hypothesis weights

Page 30

pdf: t_{k−1}
pdf: t_k
pdf: t_{k+1}
pdf: t_{k+2}

How does new knowledge affect the knowledge of a past state?

Page 31

pdf: t_{k−1}
pdf: t_k
Retrodiction: t_{k+1}
pdf: t_{k+2}

‘retrodiction’: a retrospective analysis of the past

Page 32

t_{k−1}, t_k, t_{k+1}, t_{k+2}

optimal information processing at present and for the past

Page 33

Multiple Hypothesis Tracking: Basic Idea

Iterative updating of conditional probability densities!

kinematic target state x_k at time t_k, accumulated sensor data Z^k

a priori knowledge: target dynamics models, sensor model, road maps

• prediction: p(x_{k−1} | Z^{k−1}) → p(x_k | Z^{k−1}) (via dynamics model, road maps)

• filtering: p(x_k | Z^{k−1}) → p(x_k | Z^k) (via sensor data Z_k, sensor model)

• retrodiction: p(x_{l−1} | Z^k) ← p(x_l | Z^k) (via filtering output, dynamics model)

– finite mixture: inherent ambiguity (data, model, road network)
– optimal estimators: e.g. minimum mean squared error (MMSE)
– initiation of pdf iteration: multiple hypothesis track extraction
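For a single linear-Gaussian dynamics and sensor model, the prediction / filtering / retrodiction iteration reduces to a Kalman filter plus a Rauch-Tung-Striebel retrodiction step. A minimal numpy sketch; the model matrices and the plots are assumptions for illustration:

```python
import numpy as np

# Linear-Gaussian special case: all pdfs stay Gaussian and the iteration reduces
# to Kalman prediction/filtering plus Rauch-Tung-Striebel retrodiction.
F = np.array([[1.0, 1.0], [0.0, 1.0]])   # constant-velocity dynamics model
Q = np.diag([0.1, 0.1])                  # process noise covariance
H = np.array([[1.0, 0.0]])               # position-only sensor model
R = np.array([[1.0]])                    # measurement noise covariance

def predict(x, P):
    # p(x_k | Z^{k-1}): propagate the estimate through the dynamics model
    return F @ x, F @ P @ F.T + Q

def filter_update(x, P, z):
    # p(x_k | Z^k): combine prediction and plot via the Kalman gain
    S = H @ P @ H.T + R                  # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    return x + K @ (z - H @ x), P - K @ S @ K.T

def retrodict(x_l, P_l, x_pred, P_pred, x_next, P_next):
    # p(x_l | Z^k), k > l: improve a past estimate with later data (RTS step)
    W = P_l @ F.T @ np.linalg.inv(P_pred)
    return x_l + W @ (x_next - x_pred), P_l + W @ (P_next - P_pred) @ W.T

zs = [np.array([0.1]), np.array([1.0]), np.array([2.1])]   # position plots
x, P = np.zeros(2), np.eye(2) * 10.0
predicted, filtered = [], []
for z in zs:
    xp, Pp = predict(x, P)
    predicted.append((xp, Pp))
    x, P = filter_update(xp, Pp, z)
    filtered.append((x, P))

# Retrodict the second-to-last state using the final filtering result.
xs, Ps = retrodict(*filtered[-2], *predicted[-1], *filtered[-1])
```

Retrodiction shrinks the past covariance: later data improve the estimate of a past state, exactly as on the preceding slides. The full MHT case replaces each Gaussian by a finite mixture over data interpretation hypotheses.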

Page 34

Difficult Operational Conditions

object detection:
– small objects: detection probability P_D < 1
– fading: consecutive missing plots (interference)
– moving platforms: minimum detectable velocity

Page 35

measurements:
– false returns (residual clutter, birds, clouds)
– low data update rates (long-range radar, e.g.)
– measurement errors, overlapping gates

Page 36

sensor resolution:
– characteristic parameters: band-/beam width
– group measurements: resolution probability
– important: qualitatively correct modeling

Page 37

object behavior:
– applications: high maneuvering capability
– qualitatively distinct maneuvering phases
– dynamic object parameters a priori unknown

Page 38

Demonstration: Multiple Hypothesis Tracking

Page 41

Tracking Application: Ground Picture Production

GMTI Radar: Ground Moving Target Indicator

wide area, all-weather, day/night, real-time surveillance of a dynamically evolving ground or near-to-ground situation

GMTI Tracking: Some Characteristic Aspects
backbone of a ground picture: moving target tracks

• airborne, dislocated, mobile sensor platforms
• vehicles, ships, ‘low-flyers’, radars, convoys
• occlusions: Doppler-blindness, topography
• road maps, terrain information, tactical rules
• dense target / dense clutter situations: MHT

Page 42

Examples of GMTI Tracks (live exercise)

Page 43

Research Institute for Communications, Information Processing, and Ergonomics (FKIE)
For OFFICIAL USE ONLY

Multiple Sensor Security Assistance Systems

General Task: Covert & Automated Surveillance of a Person Stream: Identification of Anomalous Behavior

Towards a Solution: Exploit Heterogeneous Multiple Sensor Systems.

Page 44

[Diagram] Sensor Data → Surveillance Data Fusion → Person Classification
– Kinematics: Where? When?
– Attributes: What? When?

Page 45

Security Applications: Consider Well-defined Access Regions.
(Tunnels / Underground, Escalators / Stairways)

Important Task: Detect persons carrying hazardous materials in a person flow.

Fundamental Problem: Limited Spatio-temporal Resolution of Chemical Sensors

Key to Solution: Compensate poor resolution by Space-time Sensor Data Fusion

Page 46

[Diagram]
– Laser-Range-Scanner Sensors → Track Extraction / Maintenance
– Chemical Sensors → Attributes
– Video Data → Supporting Information

EU Project HAMLeT: Hazardous Material Localization and Person Tracking

Page 47

Detection of Persons and Goods with High Threat Potential

• Exploit the full potential of specific sensors (attributes)
• Associate measured attributes/signatures to individuals
• Tracking and classification of individuals in person streams
• Covert operation: avoid interference with “normal” public life
• Avoid fatigue in situations with low frequency of suspicious events

Sensors: Laser, Video, Indoor Radar, Chemical Sensors, IR Camera

Page 48

Multiple Sensor Security Assistance Systems

• Experience of human security personnel remains indispensable.
• Technical assistance by focusing their attention on critical situations.
• Security assistance systems: covert operation, continuous time.
• Combination of strengths of automated and human data exploitation.
  – screening: real time analysis of large data streams
  – high decision confidence in individual situations

Page 49

On Characterizing Tracking / Fusion Performance

a well-understood paradigm: air surveillance with multiple radars

Many results can be transferred to other sensors (IR, E/O, sonar, acoustics).

Sensor Data Fusion: ‘tracks’ represent the available information on the targets associated to them, with appropriate quality measures, thus providing answers to:

When? Where? How many? To which direction? How fast, accelerating? What?

Page 50

By sensor data fusion we wish to establish one-to-one associations between:

targets in the field of view ↔ identified tracks in the tracking computer

Strictly speaking, this is only possible under ideal conditions regarding the sensor performance and underlying target situation. The tracking/fusion performance can thus be measured by its deficiencies when compared with this ideal goal.

Page 51

1. Let a target be detected at first by a sensor at time t_a. Usually, a time delay is involved until a confirmed track has finally been established at time t_e (track extraction). A ‘measure of deficiency’ is thus:

• extraction delay t_e − t_a.

Page 52: Introduction to Sensor Data Fusion Methods and …...Introduction to Sensor Data Fusion Methods and Applications • Last lecture: “Why Sensor Data Fusion?”’ – Motivation,

1. Let a target be detected at first by a sensor at time t

a

. Usually, a timedelay is involved until a confirmed track has finally been establishedat time t

e

(track extraction). A ‘measure of deficiency’ is thus:

• extraction delay t

e

� t

a

.

2. Unavoidably, false tracks will be extracted in case of a high falsereturn density (e.g. clutter, jamming/detection), i.e. tracks related tounreal or unwanted targets. Corresponding ‘deficiencies’ are:

• mean number of falsely extracted targets per time,• mean life time of a false track before its deletion.

Sensor Data Fusion - Methods and Applications, 1st Lecture on October 25, 2017

Page 53: Introduction to Sensor Data Fusion Methods and …...Introduction to Sensor Data Fusion Methods and Applications • Last lecture: “Why Sensor Data Fusion?”’ – Motivation,

1. Let a target be detected at first by a sensor at time t

a

. Usually, a timedelay is involved until a confirmed track has finally been establishedat time t

e

(track extraction). A ‘measure of deficiency’ is thus:

• extraction delay t

e

� t

a

.

2. Unavoidably, false tracks will be extracted in case of a high falsereturn density (e.g. clutter, jamming/detection), i.e. tracks related tounreal or unwanted targets. Corresponding ‘deficiencies’ are:

• mean number of falsely extracted targets per time,• mean life time of a false track before its deletion.

3. A target should be represented by one and the same track until leav-ing the field of view. Related performance measures/deficiencies:

• mean life time of tracks related to true targets,• probability of an ‘identity switch’ between targets,• probability of a target not being represented by a track.

Sensor Data Fusion - Methods and Applications, 1st Lecture on October 25, 2017

4. The track inaccuracy (error covariance of a state estimate) should be as small as possible. The deviations between estimated and actual target states should at least correspond with the error covariances produced (consistency). If this is not the case, we speak of a 'track loss'.

• A track must really represent a target!

Challenges:

• low detection probability • high clutter density • low update rate
• agile targets • dense target situations • formations, convoys
• target-split events (formation, weapons) • jamming, deception

Basic Tasks:

• models: sensor, target, environment → physics
• data association problems → combinatorics
• estimation problems → probability, statistics
• process control, realization → computer science

Summary: BAYESian (Multi-) Sensor Tracking

• Basis: In the course of time one or several sensors produce measurements of targets of interest. Each target is characterized by its current state vector, which is expected to change with time.

• Objective: Learn as much as possible about the individual target states at each time by analyzing the 'time series' constituted by the sensor data.

• Problem: imperfect sensor information: inaccurate, incomplete, and possibly ambiguous. Moreover, the targets' temporal evolution is usually not well-known.

• Approach: Interpret measurements and target state vectors as random variables (RVs). Describe by probability density functions (pdfs) what is known about them.

• Solution: Derive iteration formulae for calculating the pdfs! Develop a mechanism for initiation! In doing so, exploit all background information available! Derive state estimates from the pdfs along with appropriate quality measures!

How to deal with probability density functions?

• pdf p(x): Extract probability statements about the RV x by integration!

• naively: positive and normalized functions (p(x) ≥ 0, ∫dx p(x) = 1)

• conditional pdf p(x|y) = p(x,y)/p(y): impact of information on y on the RV x?

• marginal density p(x) = ∫dy p(x,y) = ∫dy p(x|y) p(y): enter y!

• Bayes: p(x|y) = p(y|x) p(x)/p(y) = p(y|x) p(x)/∫dx p(y|x) p(x): p(x|y) from p(y|x) and p(x)!

• certain knowledge on x: p(x) = δ(x − y) '=' lim_{σ→0} (1/(√(2π) σ)) e^{−(x−y)²/(2σ²)}

• transformed RV y = t[x]: p(y) = ∫dx p(y,x) = ∫dx p(y|x) p_x(x) = ∫dx δ(y − t[x]) p_x(x) =: [T p_x](y)  (T: p_x ↦ p, "transfer operator")
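The Bayes rule above can be sketched numerically on a discrete grid: multiply a prior p(x) by a likelihood p(y|x) and renormalize. The Gaussian prior N(0, 4²), the measurement y = 3, and the noise σ = 1 below are illustrative assumptions, not values from the lecture.

```python
import numpy as np

# Grid sketch of p(x|y) = p(y|x) p(x) / ∫dx p(y|x) p(x)
x = np.linspace(-15.0, 15.0, 60_001)
dx = x[1] - x[0]

prior = np.exp(-0.5 * (x / 4.0) ** 2)        # p(x) ∝ N(x; 0, 4²), unnormalized
prior /= prior.sum() * dx

y_meas, sigma = 3.0, 1.0                     # assumed measurement and noise level
likelihood = np.exp(-0.5 * ((y_meas - x) / sigma) ** 2)   # p(y|x), Gaussian sensor model

posterior = likelihood * prior
posterior /= posterior.sum() * dx            # divide by ∫dx p(y|x) p(x)

post_mean = (x * posterior).sum() * dx       # E[x|y]
```

For this conjugate Gaussian pair the posterior mean has a closed form, (y/σ²)/(1/16 + 1/σ²) = 48/17 ≈ 2.82, which the grid result reproduces.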

Characterize an object by quantitatively describable properties: object state

Examples:

– object position x on a straight line: x ∈ R
– kinematic state x = (r⊤, ṙ⊤, r̈⊤)⊤, x ∈ R⁹
  position r = (x, y, z)⊤, velocity ṙ, acceleration r̈
– joint state of two objects: x = (x₁⊤, x₂⊤)⊤
– kinematic state x, object extension X
  e.g. ellipsoid: symmetric, positive definite matrix
– kinematic state x, object class
  e.g. bird, glider, helicopter, passenger jet, ...

Learn unknown object states from imperfect measurements and
describe imprecise knowledge mathematically precisely by functions p(x)!

Interpret unknown object states as random variables, x [1D] or x, X [vector / matrix variate], characterized by corresponding probability density functions (pdf).

The concrete shape of the pdf p(x) contains the full knowledge on x!

Information on a random variable (RV) can be extracted from the corresponding pdf by integration.

at present: the one-dimensional case:

How probable is it that x ∈ (a, b) ⊆ R holds?

Answer: P{x ∈ (a, b)} = ∫_a^b dx p(x)  ⇒  p(x) ≥ 0

in particular: P{x ∈ R} = ∫_{−∞}^{∞} dx p(x) = 1  (normalization)

intuitive interpretation: "the object is somewhere in R"

loosely: p(x) dx is the probability that x has a value between x and x + dx

How to characterize the properties of a pdf?

specifically: How to associate a single "expected" value with a RV?

The maximum of the pdf is sometimes but not always useful! (→ examples)

instead: Calculate the centroid of the pdf!

E[x] = ∫_{−∞}^{∞} dx x p(x) = x̄   "expectation value"

more generally: Consider functions g: x ↦ g(x) of the RV x!

E[g(x)] = ∫_{−∞}^{∞} dx g(x) p(x)   "expectation value of the observable g"

Example: Consider the observable ½mx² (kinetic energy, x = speed)
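The kinetic-energy observable can be evaluated numerically as E[g(x)] = ∫dx g(x) p(x). The uniform speed density on [0, 10] and the mass m = 2 below are illustrative assumptions, not values from the lecture.

```python
import numpy as np

# E[½mx²] for an assumed uniform speed density p(x) = 1/10 on [0, 10]
m = 2.0
lo, hi, n = 0.0, 10.0, 1_000_000
dx = (hi - lo) / n
x = lo + (np.arange(n) + 0.5) * dx     # midpoints of n integration cells
p = 1.0 / (hi - lo)                    # uniform density value on its support

e_kin = (0.5 * m * x**2 * p).sum() * dx   # ∫ dx ½mx² p(x)

# analytically: ½m · E[x²] = ½m · (hi³ − lo³)/(3(hi − lo)) = 100/3
```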

An important observable: the "error" of an estimate

• Quality: How useful is an expectation value x̄ = E[x]?

Consider special observables as distance measures:

g(x) = |x − x̄|  or  g(x) = (x − x̄)²

quadratic measures: computationally more convenient!

'expected error' of the expectation value x̄:

V[x] = E[(x − x̄)²],  σ_x = √V[x]   variance, standard deviation

Exercise 2.1  Show that V[x] = E[x²] − E[x]² holds.

The expectation value of the observable x² is also called the "2nd moment" of the pdf of x.
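The identity of Exercise 2.1 can be checked with sample moments; the normal distribution used for the samples is an arbitrary example choice.

```python
import numpy as np

# Check V[x] = E[x²] − E[x]² on sample moments of an example distribution
rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=3.0, size=200_000)   # assumed N(2, 3²) samples

mean = x.mean()                        # estimate of E[x]
second_moment = (x ** 2).mean()        # estimate of E[x²]
variance = ((x - mean) ** 2).mean()    # estimate of V[x]

# the identity holds exactly for sample moments (up to rounding):
assert abs(variance - (second_moment - mean ** 2)) < 1e-8
```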

Exercise 2.2

Calculate expectation and variance of the uniform density of a RV x ∈ R on the interval [a, b].

p(x) = U(x; a, b) = 1/(b − a) for x ∈ [a, b], 0 otherwise
(x: random variable; a, b: parameters)

Pdf correctly normalized?  ∫_{−∞}^{∞} dx U(x; a, b) = 1/(b − a) ∫_a^b dx = 1

E[x] = ∫_{−∞}^{∞} dx x U(x; a, b) = (a + b)/2

V[x] = 1/(b − a) ∫_a^b dx x² − E[x]² = (b − a)²/12
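A quick numerical cross-check of Exercise 2.2, using midpoint-rule integration; the interval [1, 5] is an illustrative choice.

```python
import numpy as np

# E[x] = (a + b)/2 and V[x] = (b − a)²/12 for the uniform density
a, b = 1.0, 5.0
n = 1_000_000
dx = (b - a) / n
x = a + (np.arange(n) + 0.5) * dx      # midpoints of n cells covering [a, b]
p = 1.0 / (b - a)                      # U(x; a, b) on its support

mean = (x * p).sum() * dx              # ∫ dx x p(x)
var = (x**2 * p).sum() * dx - mean**2  # ∫ dx x² p(x) − E[x]²

assert abs(mean - (a + b) / 2) < 1e-9      # (a + b)/2 = 3
assert abs(var - (b - a)**2 / 12) < 1e-9   # (b − a)²/12 = 4/3
```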

Important example: x normally distributed over R (Gauß)

– wanted: probabilities concentrated around µ
– quadratic distance: ‖x − µ‖² = (x − µ)²/(2σ²)  (mathematically convenient!)
– the parameter σ is a measure of the "width" of the pdf: ‖x − µ‖² = 1/2 at x = µ ± σ
– for 'large' distances, i.e. ‖x − µ‖² ≫ 1/2, the pdf shall decay quickly.
– simplest approach: p̃(x) = e^{−‖x−µ‖²}  (> 0 for all x ∈ R; normalization?)
– normalized via p(x) = p̃(x)/∫_{−∞}^{∞} dx p̃(x). A formula collection delivers: ∫_{−∞}^{∞} dx p̃(x) = √(2π) σ

An admissible pdf with the required properties is obviously given by:

N(x; µ, σ) = (1/(√(2π) σ)) exp(−(x − µ)²/(2σ²))
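The formula-collection integral ∫dx e^{−(x−µ)²/(2σ²)} = √(2π) σ can be verified numerically; µ = 1.5 and σ = 2 are example values.

```python
import math
import numpy as np

# Midpoint-rule check of the Gaussian normalization integral
mu, sigma = 1.5, 2.0
lo, hi, n = mu - 10 * sigma, mu + 10 * sigma, 400_000   # ±10σ covers all but ~1e-22 of the mass
dx = (hi - lo) / n
x = lo + (np.arange(n) + 0.5) * dx

integral = np.exp(-0.5 * ((x - mu) / sigma) ** 2).sum() * dx

assert abs(integral - math.sqrt(2 * math.pi) * sigma) < 1e-6
```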

Exercise 2.3  Show for the Gauß density p(x) = N(x; µ, σ):

E[x] = µ,  V[x] = σ²

E[x] = ∫_{−∞}^{∞} dx x N(x; µ, σ) = µ

V[x] = E[x²] − E[x]² = σ²

Use substitution and partial integration!
Use ∫_{−∞}^{∞} dx e^{−x²/2} = √(2π)!
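A numerical cross-check of Exercise 2.3 by direct integration of the moments; µ = −1 and σ = 0.5 are example values.

```python
import numpy as np

# E[x] = µ and V[x] = σ² for p(x) = N(x; µ, σ), midpoint integration over ±10σ
mu, sigma = -1.0, 0.5
lo, hi, n = mu - 10 * sigma, mu + 10 * sigma, 400_000
dx = (hi - lo) / n
x = lo + (np.arange(n) + 0.5) * dx
p = np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (np.sqrt(2 * np.pi) * sigma)

mean = (x * p).sum() * dx              # ∫ dx x p(x)
var = (x**2 * p).sum() * dx - mean**2  # E[x²] − E[x]²

assert abs(mean - mu) < 1e-6
assert abs(var - sigma**2) < 1e-6
```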

How to incorporate certain knowledge into pdfs?

Consider the special pdf: an infinitely sharp peak at µ

δ(x; µ) '=' ∞ for x = µ, 0 for x ≠ µ,  with: ∫_{−∞}^{∞} dx δ(x; µ) = 1

Intuitively interpretable as a limit:

δ(x; µ) '=' lim_{σ→0} N(x; µ, σ) = lim_{σ→0} (1/(√(2π) σ)) e^{−(x−µ)²/(2σ²)}

Alternative: δ(x; a) '=' lim_{b→a} U(x; a, b)

For observables g this holds: E[g(x)] = ∫_{−∞}^{∞} dx g(x) δ(x; y) = g(y)

in particular: V[x] = E[(x − E[x])²] = E[x²] − E[x]² = 0
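The limit interpretation δ(x; y) '=' lim_{σ→0} N(x; y, σ) can be illustrated numerically: with g(x) = x², analytically E[g(x)] = y² + σ², which approaches g(y) = y² as σ shrinks. The value y = 2 and the σ sequence are illustrative.

```python
import numpy as np

# E[x²] under ever narrower Gaussians N(y, σ) converges to g(y) = y²
y = 2.0
errors = []
for sigma in (1.0, 0.1, 0.01):
    n = 1_000_000
    lo, hi = y - 12 * sigma, y + 12 * sigma
    dx = (hi - lo) / n
    x = lo + (np.arange(n) + 0.5) * dx
    p = np.exp(-0.5 * ((x - y) / sigma) ** 2) / (np.sqrt(2 * np.pi) * sigma)
    errors.append(abs((x**2 * p).sum() * dx - y**2))   # |E[x²] − g(y)| = σ²

# the deviation shrinks like σ² toward 0 as the peak sharpens
assert errors[0] > errors[1] > errors[2]
assert errors[2] < 1e-3
```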

Characterize an object by quantitatively describable properties: object state

Examples:

– object position x on a straight line: x ∈ R ✓
– kinematic state x = (r⊤, ṙ⊤, r̈⊤)⊤, x ∈ R⁹
  position r = (x, y, z)⊤, velocity ṙ, acceleration r̈
– joint state of two objects: x = (x₁⊤, x₂⊤)⊤
– kinematic state x, object extension X
  e.g. ellipsoid: symmetric, positive definite matrix
– kinematic state x, object class
  e.g. bird, glider, helicopter, passenger jet, ...

Learn unknown object states from imperfect measurements and
describe imprecise knowledge mathematically precisely by functions p(x)!