
Augmenting Film and Video Footage with Sensor Data



Page 1: Augmenting Film and Video Footage with Sensor Data

Augmenting Film and Video Footage with Sensor Data

N. Su, H. Park, E. Bostrom, J. Burke, M. Srivastava, D. Estrin

PerCom ’04: March 14-17, 2004

Presented by Herman Li, March 10, 2005

Page 2: Augmenting Film and Video Footage with Sensor Data


Augmented Recording System

Wireless sensor network application for filmmaking and media production:

• Seamless integration
• Mobility
• Increased expressiveness

Synchronizes sensor data with video footage; the sensor data enables post-processing of the video (a minimal sketch of per-frame records follows below).
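The "augmenting footage with sensor data" idea can be pictured as per-frame records keyed by SMPTE timecode. The sketch below is a minimal illustration under that assumption; the class and field names are invented here and are not the paper's actual data model.

```python
# Illustrative sketch: attach sensor readings to video frames by SMPTE timecode.
# The field names and structures below are assumptions for illustration only.
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class SensorReading:
    node_id: int        # mote that produced the reading
    sensor: str         # e.g. "light", "temperature", "acceleration"
    value: float


@dataclass
class AugmentedFrame:
    timecode: str                               # SMPTE timecode, e.g. "01:02:03:04"
    readings: List[SensorReading] = field(default_factory=list)


# One augmented recording = footage plus a per-frame map of sensor data.
recording: Dict[str, AugmentedFrame] = {}

def record_reading(timecode: str, reading: SensorReading) -> None:
    """Store a sensor reading under the frame it was captured in."""
    frame = recording.setdefault(timecode, AugmentedFrame(timecode))
    frame.readings.append(reading)


record_reading("01:02:03:04", SensorReading(node_id=7, sensor="light", value=312.0))
```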

Page 3: Augmenting Film and Video Footage with Sensor Data


Why not use image processing?

Image processing techniques can infer position, motion, lighting conditions, and so on.
For example: Van Helsing

[Image: infrared LED marker, from http://www.fxguide.com]

Page 4: Augmenting Film and Video Footage with Sensor Data


Why not use image processing?

[Image from http://www.fxguide.com]

Page 5: Augmenting Film and Video Footage with Sensor Data


Why not use image processing?

• Does not offer fine-grained data
• Cannot infer conditions outside the camera's view, or quantities such as wind speed or temperature

Page 6: Augmenting Film and Video Footage with Sensor Data


Architecture

• Sensor node neighbourhood
• Serial port server
• Timecode generator
• Sylph server middleware
• Jini client
• SQL database

Page 7: Augmenting Film and Video Footage with Sensor Data


Data-collection flow:

1. Sylph server lookup
2. Sylph translates the query and forwards it to the serial port server
3. Serial port server dispatches messages to the base stations
4. Sensors begin collecting data
5. Base station forwards data to the serial port server
6. Serial port server interpolates missing data
7. Sylph server announces new data
8. Jini client stores data in the database (a toy sketch of this flow follows below)
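A toy sketch of the eight-step flow above, with every class and method name invented for illustration; it is not the Sylph or Jini API, only the shape of the hand-offs between the components.

```python
# Toy walk-through of the eight-step flow; every name here is hypothetical.

class BaseStation:
    """Steps 3-5: receives commands, has its motes sample, reports back."""
    def collect(self, command):
        return [{"node": 1, "light": 300.0}]        # stand-in sensor data


class SerialPortServer:
    """Steps 3, 5, 6: dispatches to base stations and interpolates results."""
    def __init__(self, base_stations):
        self.base_stations = base_stations

    def dispatch(self, command):
        readings = []
        for station in self.base_stations:
            readings.extend(station.collect(command))
        return self.interpolate(readings)

    def interpolate(self, readings):
        return readings                              # missing-data fill-in omitted


class JiniClient:
    """Step 8: stores announced data in the database."""
    def on_new_data(self, data):
        print("storing", data)


class SylphServer:
    """Steps 1, 2, 7: translates the query and announces new data."""
    def __init__(self, serial_server, clients):
        self.serial_server = serial_server
        self.clients = clients

    def query(self, high_level_query):
        command = {"raw": high_level_query}          # step 2: toy translation
        data = self.serial_server.dispatch(command)  # steps 3-6
        for client in self.clients:                  # step 7
            client.on_new_data(data)                 # step 8


sylph = SylphServer(SerialPortServer([BaseStation()]), [JiniClient()])
sylph.query("READ LIGHT EVERY 30 SECONDS")
```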

Page 8: Augmenting Film and Video Footage with Sensor Data


Sensor node neighbourhood

• Uses Crossbow Mica 1 / Mica 2 motes
• Thermistor, light sensor, microphone, accelerometer
• Radio range of a few hundred feet, ~10 kbps

[Image from http://computer.howstuffworks.com/mote.htm]

• Runs on PALOS (Power Aware Light-weight Operating System)

Page 9: Augmenting Film and Video Footage with Sensor Data


Sensor node neighbourhood

Clustering
• Each base station is responsible for a few motes

Base station
• Potentiometer calibration
• Admits the closest X sensors

Neighbours
• Internal frame counter, counting from 2 to infinity
• Send data every 13 frames
• Filter redundant values (a sketch of this reporting loop follows below)
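A minimal sketch of the neighbour reporting rule described above: keep an internal frame counter, report only every 13 frames, and drop values that have not changed. The class name and data layout are assumptions for illustration.

```python
# Sketch of a neighbour mote's reporting loop: report every 13 frames and
# drop redundant (unchanged) values. Names and data layout are illustrative.

REPORT_INTERVAL = 13    # frames between reports, per the slide


class NeighbourMote:
    def __init__(self):
        self.frame_counter = 2          # internal counter starts at 2, per the slide
        self.last_reported = None

    def on_frame(self, value):
        """Called once per video frame with the current sensor value."""
        self.frame_counter += 1
        if self.frame_counter % REPORT_INTERVAL != 0:
            return None                 # not a reporting frame
        if value == self.last_reported:
            return None                 # filter redundant values
        self.last_reported = value
        return {"frame": self.frame_counter, "value": value}


mote = NeighbourMote()
for light in [300, 300, 305] * 10:
    report = mote.on_frame(light)
    if report:
        print(report)
```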

Page 10: Augmenting Film and Video Footage with Sensor Data


Serial port server

• Controls base stations via serial connections
• Takes SMPTE timecode and synchronizes it with the sensor data
• Combines all sensor data for a frame and sends it as one packet
• Interpolates missing data (a minimal interpolation sketch follows below)
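A hedged sketch of the missing-data step: the slides do not say which interpolation method is used, so simple linear interpolation between the nearest known frames is assumed here purely for illustration.

```python
# Sketch: fill in per-frame sensor values for frames with no reading.
# Linear interpolation is an assumption; the actual method is not specified here.

def interpolate_missing(per_frame, first_frame, last_frame):
    """per_frame: dict mapping frame number -> sensor value (some frames absent)."""
    known = sorted(per_frame)
    filled = {}
    for frame in range(first_frame, last_frame + 1):
        if frame in per_frame:
            filled[frame] = per_frame[frame]
            continue
        # Find the nearest known frames on either side and interpolate linearly.
        before = max((f for f in known if f < frame), default=None)
        after = min((f for f in known if f > frame), default=None)
        if before is None or after is None:
            continue                     # cannot interpolate at the edges
        t = (frame - before) / (after - before)
        filled[frame] = per_frame[before] + t * (per_frame[after] - per_frame[before])
    return filled


# Frames 11 and 12 are filled in between the known readings at frames 10 and 13.
print(interpolate_missing({10: 300.0, 13: 306.0}, 10, 13))
```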

Page 11: Augmenting Film and Video Footage with Sensor Data


Serial port server: time synchronization

• At most 7 µs of drift per frame
• One frame of error per 4766.7 frames
• Time sync every 2.5 min (a quick check of these figures follows below)

Latency test
• Shows a delay of at most 3 frames
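These figures fit together if the drift bound is taken as about 7 µs per NTSC frame (an assumed reading): at ~29.97 fps, one full frame of error then accumulates after roughly 4766.7 frames, i.e. about 2.65 minutes, which matches resynchronizing every 2.5 minutes. A quick check:

```python
# Consistency check of the timing figures, assuming NTSC (29.97 fps)
# and a worst-case drift of 7 microseconds per frame (an assumed reading).
FPS = 29.97
frame_duration = 1.0 / FPS          # ~33.37 ms per frame
drift_per_frame = 7e-6              # assumed 7 us of drift per frame

frames_until_one_frame_error = frame_duration / drift_per_frame
minutes_until_one_frame_error = frames_until_one_frame_error / FPS / 60

print(round(frames_until_one_frame_error, 1))    # ~4766.7 frames
print(round(minutes_until_one_frame_error, 2))   # ~2.65 min -> resync every 2.5 min
```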

Page 12: Augmenting Film and Video Footage with Sensor Data


Sylph server middleware

• UCLA project, used in Smart Kindergarten
• Allows queries on sensors
• Defines Jini attributes such as light, period, and command
• Jini query "READ LIGHT EVERY 30 SECONDS" is translated to "SET PERIOD 30 SECONDS", "SET COMMAND=STARTSENDING" (a toy translation sketch follows below)
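A toy sketch of the kind of translation described in the last bullet, mapping a "READ &lt;SENSOR&gt; EVERY &lt;N&gt; &lt;UNIT&gt;" query to the two lower-level commands; the pattern and function are invented for illustration and are not the real Sylph grammar or API.

```python
import re

# Toy translator for queries of the form "READ <SENSOR> EVERY <N> <UNIT>".
# This mimics the mapping shown on the slide; it is not the real Sylph grammar.

QUERY_PATTERN = re.compile(r"READ\s+(\w+)\s+EVERY\s+(\d+)\s+(\w+)", re.IGNORECASE)

def translate(query: str):
    match = QUERY_PATTERN.fullmatch(query.strip())
    if not match:
        raise ValueError(f"unsupported query: {query!r}")
    sensor, period, unit = match.groups()   # sensor selection omitted, as on the slide
    return [
        f"SET PERIOD {period} {unit.upper()}",
        "SET COMMAND=STARTSENDING",
    ]


print(translate("READ LIGHT EVERY 30 SECONDS"))
# -> ['SET PERIOD 30 SECONDS', 'SET COMMAND=STARTSENDING']
```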

Page 13: Augmenting Film and Video Footage with Sensor Data


Jini client

• Retrieves sensor data per frame and stores it in an MS Access database (a minimal storage sketch follows below)
• Provides playback features and data-collection controls
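A minimal sketch of the per-frame storage step, using SQLite from the Python standard library as a stand-in for the MS Access database the system actually used; the table layout is an assumption.

```python
import sqlite3

# Stand-in for the client's storage step: SQLite instead of MS Access,
# with an assumed table layout (timecode, node, sensor, value).
conn = sqlite3.connect(":memory:")
conn.execute(
    """CREATE TABLE sensor_data (
           timecode TEXT,      -- SMPTE timecode of the video frame
           node_id  INTEGER,   -- mote that produced the reading
           sensor   TEXT,      -- e.g. 'light', 'temperature'
           value    REAL
       )"""
)

def store_frame(timecode, readings):
    """Insert all readings collected for one frame."""
    conn.executemany(
        "INSERT INTO sensor_data VALUES (?, ?, ?, ?)",
        [(timecode, r["node_id"], r["sensor"], r["value"]) for r in readings],
    )
    conn.commit()


store_frame("01:02:03:04", [{"node_id": 7, "sensor": "light", "value": 312.0}])
print(conn.execute("SELECT * FROM sensor_data").fetchall())
```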

Page 14: Augmenting Film and Video Footage with Sensor Data


Evaluations

• Deployed 2 base stations with 4 sensors each
• The clustering algorithm took 2.59 min
• Experiment 1: gradual light-intensity changes

Page 15: Augmenting Film and Video Footage with Sensor Data


Evaluations

Experiment 2: delay measurement
• Some delays of ~10 frames
• Maximum of 20 frames

Page 16: Augmenting Film and Video Footage with Sensor Data


Future work

Dynamic control
• Real-time, on-the-fly adjustment of studio equipment

Semantic indexing of video streams
• Express interest in and query high-level events

Continuity management
• Allows checking for continuity across different footage

Page 17: Augmenting Film and Video Footage with Sensor Data


References

• Augmenting Film and Video Footage with Sensor Data. Norman Makoto Su, Heemin Park, Eric Bostrom, Jeff Burke, Mani B. Srivastava, Deborah Estrin. PerCom ’04, March 14-17, 2004.
• ARS: http://www.ee.ucla.edu/~hmpark/ars
• PALOS: http://deerhound.ats.ucla.edu:7777/pls/portal/docs/PAGE/CENS_REPOSITORIES/TECHNOLOGIES/PALOS-TUTORIAL.PDF
• Sylph: http://mmsl.cs.ucla.edu/sylph
• http://www.fxguide.com
• http://computer.howstuffworks.com/mote.htm