
Computational Neuroscience for Technology: Event-based Vision Sensors and Information Processing

Jörg Conradt

Neuroscientific System Theory
Center of Competence on Neuro-Engineering

Technische Universität München

http://www.nst.ei.tum.de

28.11.2014 Qualcomm Vienna

Neuromorphic Engineering

Neuro - morphic

“neuro”: to do with neurons, i.e. neurally inspired
“morphic”: structure or form

Discovering key principles by which brains work, and implementing them in technical systems that intelligently interact with their environment.

Examples:
• analog VLSI Circuits to implement neural processing
• Sensors, such as Silicon Retinae or Cochleae
• Distributed Adaptive Control, e.g. CPG-based locomotion
• Massively Parallel Self-Growing Computing Architectures
• Neuro-Electronic Implants, Rehabilitation

Computation in Brains

Getting to know your Brain
• 1.3 kg, about 2% of body weight
• 10^11 neurons
• Neuron growth: 250,000/min (early pregnancy) … but also … loss of 1 neuron/second

“Operating Mode” of Neurons
• Analog leaky integration in soma
• Digital pulses (spikes) along neurites
• 10^14 stochastic synapses
• Typical operating “frequency”: ≤ 100 Hz, typically ~10 Hz, asynchronous
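To make “analog leaky integration, digital spikes” concrete, here is a minimal leaky integrate-and-fire sketch; all constants (time constant, threshold, input statistics) are illustrative assumptions rather than measured biological values:

```python
import numpy as np

# Minimal leaky integrate-and-fire (LIF) neuron; all constants are illustrative.
TAU = 20e-3        # membrane time constant in seconds (assumed)
V_TH = 1.0         # spike threshold, arbitrary units (assumed)
DT = 1e-4          # simulation time step in seconds

v = 0.0            # membrane potential
spike_times = []
for step in range(10000):                # 1 s of simulated time
    i_in = np.random.uniform(0.0, 2.5)   # noisy input drive, arbitrary units
    v += DT / TAU * (-v + i_in)          # analog leaky integration in the soma
    if v >= V_TH:                        # threshold crossing -> digital spike
        spike_times.append(step * DT)
        v = 0.0                          # reset after the spike
print(f"{len(spike_times)} spikes in 1 s of simulated time")
```

With these numbers the neuron fires at a few tens of Hz, the same order of magnitude as the asynchronous, low-rate operation listed above.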

Computation in Brains ... and Machines

Getting to know your Computer’s CPU
• 50 g, irrelevant for most applications
• 2 × 10^9 transistors (Intel Itanium)
• ideally no modification over lifetime

“Operating Mode” of CPUs
• No analog components
• Digital signal propagation
• Reliable signal propagation
• Typical operating frequency: several GHz, synchronous

[Chart: number of neurons per organism on a logarithmic scale (10^0 … 10^9 and beyond): Sponge, C. elegans, Zebrafish, Honeybee, Frog, Mouse, Cat, Chimpanzee, Elephant, Human]


Research Area «Neuroscientific System Theory»

High-Speed Event-Based Vision

On-Board Vision-Based Control of Miniature UAVs

Distributed Local Cognitive Maps for Spatial Representation

Event-Based Navigation

Neuronal-Style Information Processing in Closed-Control-Loop Systems

• Distributed Local Information Processing
• Growing and Adaptive Networks of Computational Units
• Neuromorphic Sensor Fusion and Distributed Actuator Networks
• Event-Based Perception, Cognition, and Action

Distributed Sensing and Actuation

Retinal Inspired Dynamic Vision Sensor

[Photo: embedded DVS board, 52 mm × 30 mm, with DVS sensor, ARM7 μC, and UART/SPI interface]

• 128 × 128 pixels, each signals temporal changes of illumination (“events”)
• Asynchronous updates (no image frames)
• 15 μs latency, up to 1 Mevents/sec

Joint work with the Neuromorphic Sensors Group, ETH/University Zurich
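As a rough sketch of the pixel principle: each pixel emits an ON or OFF event whenever its log intensity has changed by more than a contrast threshold since its last event. The toy model below illustrates this on a frame sequence; the threshold value and the frame-based input are simplifying assumptions (the real sensor works continuously in analog hardware):

```python
import numpy as np

# Toy DVS pixel model: generate events from log-intensity changes.
THRESHOLD = 0.15                          # contrast threshold, illustrative

def dvs_events(frames, timestamps):
    """Convert a frame sequence into (t, x, y, polarity) events."""
    ref = np.log(frames[0] + 1e-6)        # per-pixel reference log intensity
    events = []
    for frame, t in zip(frames[1:], timestamps[1:]):
        log_i = np.log(frame + 1e-6)
        diff = log_i - ref
        for y, x in zip(*np.where(np.abs(diff) > THRESHOLD)):
            polarity = 1 if diff[y, x] > 0 else -1   # ON / OFF event
            events.append((t, x, y, polarity))
            ref[y, x] = log_i[y, x]       # reset reference at the event
    return events
```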

Event-Based Vision

Frame-based: discrete time, full frames
Event-based: continuous time, individual pixels

Advantages
• Sparse data stream: ~0.1 MB/s bandwidth
• ~10 μs response time
• Local contrast adjustment per pixel
• Automatic preprocessing

Challenges
• Static scenes (no motion means no events)
• New vision algorithms needed for tracking, localization, ...

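Because there are no frames, even visualizing the stream requires accumulating events over a short time window. A minimal sketch (the window length and rendering convention are arbitrary choices):

```python
import numpy as np

# Accumulate recent events into a displayable image.
# Events are (t_us, x, y, polarity) tuples; window length is an arbitrary choice.
def render_events(events, t_now_us, window_us=10_000, res=128):
    img = np.full((res, res), 128, dtype=np.uint8)   # gray background
    for t, x, y, p in events:
        if t_now_us - window_us <= t <= t_now_us:
            img[y, x] = 255 if p > 0 else 0          # white = ON, black = OFF
    return img
```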

An Asynchronous Temporal Difference Vision Sensor

Conradt et al, ISCAS 2009

[Photo: active LED target board, 26 mm × 36 mm, with target LED, rechargeable battery, microcontroller for PWM signal, and power switch]

Tracking the active marker LED, which blinks at a known rate (expected inter-event interval Δt_exp = 1000 μs):

• Incoming eDVS event: time t_E = 5612 μs at position P_E = (3, 2)
• A 128 × 128 memory stores each pixel’s previous event time; here t_M = 4600 μs
• Inter-event interval: Δt_E = t_E − t_M = 1012 μs
• Time deviation: t_D = Δt_E − Δt_exp = 12 μs
• A Gaussian G_T maps the time deviation t_D (in μs) to a relative weight w_T ∈ [0, 1]
• Position deviation from the currently tracked position P_T = (7, 3): P_D = ‖P_E − P_T‖² = 17
• A Gaussian G_P maps the position deviation P_D (in pixels) to a relative weight w_P ∈ [0, 1]
• Position update: P_{T+1} = (1 − η(w_T + w_P))·P_T + η(w_T + w_P)·P_E

[Plot: on-event count over pixel X / pixel Y, peaking at the tracked marker position]

Müller et al, ROBIO 2011
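A minimal sketch of this update rule applied per incoming event; the Gaussian widths and the learning rate η are illustrative assumptions (the paper gives the actual parameters):

```python
import numpy as np

# Event-driven LED-marker tracker; Gaussian widths and learning rate are assumed.
RES = 128                 # eDVS resolution
DT_EXP = 1000.0           # expected inter-event interval in us (1 kHz LED)
SIGMA_T = 50.0            # width of temporal weight Gaussian G_T in us (assumed)
SIGMA_P = 30.0            # width of spatial weight Gaussian G_P in pixels (assumed)
ETA = 0.05                # learning rate (assumed)

last_event_time = np.zeros((RES, RES))   # per-pixel previous event time in us
p_t = np.array([64.0, 64.0])             # currently tracked position P_T

def on_event(t_e, x, y):
    """Update the tracked position with one incoming event (t_E, P_E)."""
    global p_t
    dt_e = t_e - last_event_time[x, y]        # inter-event interval at this pixel
    last_event_time[x, y] = t_e
    t_d = dt_e - DT_EXP                       # deviation from expected blink period
    w_t = np.exp(-t_d**2 / (2 * SIGMA_T**2))  # temporal weight w_T from G_T
    p_e = np.array([x, y], dtype=float)
    p_d = np.sum((p_e - p_t)**2)              # squared position deviation P_D
    w_p = np.exp(-p_d / (2 * SIGMA_P**2))     # spatial weight w_P from G_P
    gain = ETA * (w_t + w_p)
    p_t = (1 - gain) * p_t + gain * p_e       # P_{T+1} update
```

Events with the wrong inter-event interval or far from the current estimate receive near-zero weight, so only the blinking marker moves the estimate.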

Application Example: Tracking an Active Object

[Photo: pan-tilt unit, 240 mm, with eDVS sensor, pan servo, tilt servo, laser diode, and marker]

[Video: marker tracked over 10 s, by Georg Müller]

Müller et al, ROBIO 2011

(Fun) Application Example: High Speed Balancing of Short Poles

Conradt et al, ISCAS 2009

Converting Events into Meaning

[Animation: incoming events vs. the current pole estimate]

(Fun) Application Example: High Speed Balancing of Short Poles

Conradt et al, ECV 2009

Extended Application Example: High Speed Tracking

Bamford et al, submitted EBCCSP’15

Application Example: SLAM for Event-Based Vision Systems
Framework: Particle Filter

Weikersdorfer et al, ROBIO 2012

Application Example: SLAM for Event-Based Vision Systems
Event-Based Localization

Weikersdorfer et al, ROBIO 2012
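A heavily simplified sketch of per-event localization in a particle-filter framework: every event slightly diffuses the pose hypotheses, reweights them against a map, and triggers resampling when the weights degenerate. The map representation, the likelihood, and all constants here are assumptions; the actual algorithm is described in the paper:

```python
import numpy as np

# Simplified per-event particle filter for localization; all values illustrative.
N = 100
particles = np.random.uniform(0, 10, size=(N, 3))   # (x, y, heading) hypotheses
weights = np.ones(N) / N

def on_event(event_bearing, likelihood_map):
    """Reweight pose hypotheses against one incoming event."""
    global particles, weights
    # Random-walk motion model: diffuse hypotheses slightly per event.
    particles[:, :2] += np.random.normal(0, 0.01, size=(N, 2))
    particles[:, 2] += np.random.normal(0, 0.005, size=N)
    # World point the event would originate from under each hypothesis.
    px = particles[:, 0] + np.cos(particles[:, 2] + event_bearing)
    py = particles[:, 1] + np.sin(particles[:, 2] + event_bearing)
    ix = np.clip(px.astype(int), 0, likelihood_map.shape[0] - 1)
    iy = np.clip(py.astype(int), 0, likelihood_map.shape[1] - 1)
    weights *= likelihood_map[ix, iy] + 1e-9          # per-event reweighting
    weights /= weights.sum()
    if 1.0 / np.sum(weights**2) < N / 2:              # resample when degenerate
        idx = np.random.choice(N, size=N, p=weights)
        particles[:] = particles[idx]
        weights[:] = 1.0 / N
```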

Application Example: SLAM for Event-Based Vision Systems
Event-Based Mapping

Weikersdorfer et al, ICVS 2013

Application Example: SLAM for Event-Based Vision Systems

Hoffmann et al, ECMR 2013

[Video: 5× real time]

Event-based 3D SLAM with a depth-augmented Dynamic Vision Sensor

• Sensor combination: event-based dynamic vision and PrimeSense object distance
• Sparse sensory data allows real-time processing and integration
• Localization update rates of > 100 Hz on a standard PC
• Suitable for small autonomous mobile robots in high-speed application scenarios

David Weikersdorfer, David B. Adrian, Daniel Cremers, Jörg Conradt

Technische Universität München, Germany

Weikersdorfer et al, ICRA 2014
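The “depth-augmented” part means each event pixel can be paired with a PrimeSense distance measurement and back-projected to a 3D point. A pinhole-model sketch; the intrinsics below are placeholder values, not calibrated eDVS parameters:

```python
import numpy as np

# Back-project an event pixel plus depth into a 3D point (camera frame).
# Pinhole intrinsics are placeholders, not calibrated values.
FX, FY = 100.0, 100.0     # focal lengths in pixels (placeholder)
CX, CY = 64.0, 64.0       # principal point of a 128x128 sensor (placeholder)

def backproject(x, y, depth_m):
    """Return the 3D point seen by event pixel (x, y) at the given depth."""
    return np.array([(x - CX) * depth_m / FX,
                     (y - CY) * depth_m / FY,
                     depth_m])
```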

An Autonomous Indoor Quadrocopter (eb3D SLAM)

Selected hardware components: eDVS (red), Mini-PC (blue), PrimeSense IR projector (yellow) and IR camera (green), and drone control board (magenta).

Parrot AR.Drone, diameter ca. 80 cm

An Autonomous Indoor Quadrocopter (eb3D SLAM)

[Plot: evaluation against ground truth: eb-3D SLAM estimate, desired trajectory, external tracker (OptiTrack Flex13)]

Outlook: An Autonomous Indoor Micro Quadrocopter (3D SLAM)

[Photo: Micro-eDVS board with DVS sensor, microcontroller, UART port, and lightweight lenses]

Key Specs (Micro-eDVS):
• 20 × 20 mm, 3 g
• 75 mW power
• 64 KB SRAM
• 32-bit, 64 MHz

Key Technical Specs (quadrocopter):
• Diameter 16 cm
• Weight 38 g
• Autonomy 14 min

Neuromorphic Computing Landscape

• A million mobile phone processors in one computer

• Able to model about 1% of the human brain…

• …or 10 mice!

SpiNNaker Project

SpiNNaker Chip

[Die photo: Mobile DDR SDRAM interface; multi-chip packaging by UNISEM Europe]

Distributed Neuronal Computation “on chip”
Asynchronous Spiking Neural Computation Hardware for low-power real-time operation in Closed-Loop Systems

864 Core “Rack Version”

72 Core Evaluation

18 Core Stand-Alone Prototype

• Multi-channel spiking input and output
• Stand-alone spiking computing system
• Simulates ~20,000 neurons in real time
• Small (~20 × 20 mm); low power (~600 mW)
• Flexibly configurable, extendable, stackable

... to simulate 1 Billion Spiking Neurons in real time

http://www.nst.ei.tum.de

The SpiNNaker – Robot Interface Board

SpiNNaker Interface Board integrates as an “additional chip”

[Diagram: SpiNNaker, Interface Board, IO Board, OmniBot, Retina, Balancer]

Sensory/actuator SpiNNaker packets

WLAN Interface

Embedded 768-core SpiNNaker Board

Stereo eDVS sensors

Interface Board

NST SpOmniBot

Denk et al, ICANN 2013

Interface to Compliant Robotic Arm

Application Example: Event-Based Optic Flow for Robot Control

Example Projects: SpOmniBee
• 2 laterally facing retinas
• Retina events feed into SpiNNaker
• Distributed computation of local optic flow
• Global combination of optic flow
• Computation of motor commands on SpiNNaker, sent to the robot

Goal: keep the SpOmniBot centered in a hallway with markings, similar to Srinivasan’s bee experiments (a control sketch follows below).
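A minimal sketch of the bee-inspired centering rule, with flow estimation abstracted away (assumed to come from the event-based SpiNNaker pipeline) and an illustrative steering gain: turn away from the side with larger image motion, since the faster-moving wall is the nearer one.

```python
# Bee-inspired centering: balance optic flow magnitudes on both sides.
# Flow inputs are assumed to come from the event-based processing pipeline.
K_STEER = 0.5        # steering gain (illustrative)

def centering_command(flow_left, flow_right, forward_speed=0.3):
    """Return (speed, turn rate); positive turn rate turns toward the left."""
    turn_rate = K_STEER * (flow_right - flow_left)   # steer away from nearer wall
    return forward_speed, turn_rate
```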

Example Projects: Ball Balancer
• One retina observes the platform in top-down view
• Two motors control the platform’s pan and tilt angles
• SpiNNaker Interface Board is connected to retina and motors

Goal: keep the ball (one or more!) on a given trajectory (a control sketch follows after the demo note below).

Live Demo at HBP (EU Flagship) Inaugural Summit, Oct. 2013, Lausanne
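A sketch of the control side under simple assumptions: a PD law maps the event-tracked ball position error to platform pan/tilt commands. The gains and the pixel-to-angle mapping are illustrative, not the deployed SpiNNaker implementation:

```python
import numpy as np

# PD control of platform pan/tilt from the event-tracked ball position.
# Gains and the pixel-to-angle scaling are illustrative assumptions.
KP, KD = 0.02, 0.01

prev_error = np.zeros(2)

def control_step(ball_px, target_px, dt):
    """Return (pan, tilt) angle commands from the pixel-space tracking error."""
    global prev_error
    error = np.asarray(target_px, float) - np.asarray(ball_px, float)
    d_error = (error - prev_error) / dt
    prev_error = error
    pan, tilt = KP * error + KD * d_error   # one command per platform axis
    return pan, tilt
```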

Neural Principles for System Control

http://www.nst.ei.tum.de