02 Fall09 Lecture Sept18web


Camera Culture
MIT Media Lab

Ramesh Raskar

http://CameraCulture.info/

Computational Camera & Photography:

Where are the ‘cameras’?

Poll, Sept 18th, 2009

• When will the DSCamera disappear? Why?
– Like wristwatches?

Taking Notes

• Use slides I post on the site
• Write down anecdotes and stories
• Try to get what is NOT on the slide
• Summarize questions and answers
• Take photos of demos + doodles on board
– Use laptop to take notes
– Send before next Monday

Synthetic Lighting (Paul Haeberli, Jan 1992)

Homework

• Take multiple photos by changing lighting and other parameters. Be creative.
• Mix and match color channels to relight
• Due Sept 25th
• Submit on Stellar (via link):
– Commented source code
– Input images and output images PLUS intermediate results
– CREATE a webpage and send me a link
• OK to use online software
• Update results on the Flickr (group) page
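Haeberli's synthetic-lighting observation behind this homework: light adds linearly, so photos of a static scene lit by one source at a time can be summed with arbitrary weights to synthesize new lighting, and color channels can be swapped between captures. A minimal NumPy sketch, with hypothetical 2x2 "photos" standing in for real captures:

```python
import numpy as np

# Two captures of the same static scene, each lit by a single source
# (values invented for illustration).
photo_light_a = np.array([[[0.8, 0.2, 0.1], [0.4, 0.1, 0.0]],
                          [[0.6, 0.3, 0.2], [0.2, 0.1, 0.1]]])
photo_light_b = np.array([[[0.1, 0.1, 0.5], [0.2, 0.2, 0.7]],
                          [[0.0, 0.1, 0.4], [0.1, 0.3, 0.6]]])

def relight(weights):
    """Synthesize a new photo as a weighted sum of single-light captures."""
    wa, wb = weights
    return np.clip(wa * photo_light_a + wb * photo_light_b, 0.0, 1.0)

# "Mix and match color channels": R from light A, G and B from light B.
channel_mix = np.stack([photo_light_a[..., 0],
                        photo_light_b[..., 1],
                        photo_light_b[..., 2]], axis=-1)

both_on = relight((1.0, 1.0))
```

Negative or per-channel weights give the more exotic relightings in Haeberli's paper; the clip just keeps values displayable.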

Debevec et al. 2002: ‘Light Stage 3’

Image-Based Actual Re-lighting

Film the background in Milan, measure the incoming light,
light the actress in Los Angeles,
matte the background.
Matched LA and Milan lighting.

Debevec et al., SIGG2001

Second Homework

• Extending Andrew Adams' Virtual Optical Bench

Dual photography from diffuse reflections: Homework Assignment 2
The camera's view (Sen et al., SIGGRAPH 2005)

Beyond Visible Spectrum

Cedip, RedShift

• Brief introductions
• Are you a photographer?
• Do you use cameras for vision/image processing? Real-time processing?
• Do you have a background in optics/sensors?
• Name, Dept, Year, why you are here
• Are you on the mailing list? On Stellar?
• Did you get email from me?

Mitsubishi Electric Research Laboratories Image Fusion for Context Enhancement Raskar, Ilie, Yu


Dark Bldgs

Reflections on bldgs

Unknown shapes


‘Well-lit’ Bldgs

Reflections in bldgs windows

Tree, Street shapes

Background is captured from the day-time scene using the same fixed camera

Night Image

Day Image

Context Enhanced Image


Mask is automatically computed from scene contrast


But, Simple Pixel Blending Creates Ugly Artifacts

Pixel Blending

Our Method: Integration of Blended Gradients
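The idea can be sketched in one dimension: blend the *gradients* of the two exposures, keeping the stronger one at each location, then integrate the result, instead of blending pixel values directly. A toy NumPy example with made-up 1-D "day" and "night" signals (the real method integrates in 2-D with a Poisson solver):

```python
import numpy as np

# Invented 1-D day/night signals standing in for image scanlines.
day = np.array([0.2, 0.2, 0.8, 0.8, 0.8, 0.3, 0.3])
night = np.array([0.1, 0.1, 0.1, 0.6, 0.6, 0.6, 0.1])

g_day = np.diff(day)
g_night = np.diff(night)

# Keep, at each location, the gradient with the larger magnitude.
fused_grad = np.where(np.abs(g_day) >= np.abs(g_night), g_day, g_night)

# Integrate back (cumulative sum), anchored at the day image's first pixel.
fused = np.concatenate([[day[0]], day[0] + np.cumsum(fused_grad)])
```

Because the seam lives in the gradient domain, integration smears any residual mismatch over the whole signal instead of leaving a hard edge, which is why this avoids the artifacts of direct pixel blending.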


René Magritte, ‘The Empire of Light’

Surrealism


Time-lapse Mosaics

Magritte Stripes (time axis t)

Range Camera Demo

http://www.flickr.com/photos/pgoyette/107849943/in/photostream/

Scheimpflug principle

Plan

• Lenses
– Point spread function
• Lightfields
– What are they?
– What are the properties?
– How to capture?
– What are the applications?
• Format
– 4 (3) Assignments
• Hands on with optics, illumination, sensors, masks
• Rolling schedule for overlap
• We have cameras, lenses, electronics, projectors, etc.
• Vote on best project
– Mid-term exam
• Test concepts
– 1 Final project
• Should be novel and cool
• Conference-quality paper
• Award for best project
– Take notes for 1 class
– Lectures (and guest talks)
– In-class + online discussion
• If you are a listener
– Participate in online discussion, dig up recent new work
– Present one short 15-minute idea or new work
• Credit
– Assignments: 40%
– Project: 30%
– Mid-term: 20%
– Class participation: 10%
• Pre-reqs
– Helpful: linear algebra, image processing, thinking in 3D
– We will try to keep math to the essentials, but some concepts are complex

Assignments: You are encouraged to program in Matlab for image analysis. You may need to use C++/OpenGL/visual programming for some hardware assignments.

Each student is expected to prepare notes for one lecture. These notes should be prepared and emailed to the instructor no later than the following Monday night (midnight EST). Revisions and corrections will be exchanged by email, and after changes the notes will be posted to the website before class the following week. 5 points.

Course mailing list: Please make sure that your email id is on the course mailing list. Send email to raskar (at) media.mit.edu

Please fill in the email/credit/dept sheet.
Office hours: Email is the best way to get in touch.
Ramesh: raskar (at) media.mit.edu
Ankit: ankit (at) media.mit.edu

After class: Muddy Charles Pub (Walker Memorial, next to the tennis courts)

2. Sept 18th: Modern Optics and Lenses, Ray-matrix operations
3. Sept 25th: Virtual Optical Bench, Lightfield Photography, Fourier Optics, Wavefront Coding
4. Oct 2nd: Digital Illumination, Hadamard Coded and Multispectral Illumination
5. Oct 9th: Emerging Sensors: high-speed imaging, 3D range sensors, femtosecond concepts, front/back illumination, diffraction issues
6. Oct 16th: Beyond Visible Spectrum: multispectral imaging and thermal sensors, fluorescent imaging, 'audio camera'
7. Oct 23rd: Image Reconstruction Techniques: deconvolution, motion and defocus deblurring, tomography, heterodyned photography, compressive sensing
8. Oct 30th: Cameras for Human-Computer Interaction (HCI): 0-D and 1-D sensors, spatio-temporal coding, frustrated TIR, camera-display fusion
9. Nov 6th: Useful Techniques in Scientific and Medical Imaging: CT scans, strobing, endoscopes, astronomy and long-range imaging
10. Nov 13th: Mid-term Exam; Mobile Photography, Video Blogging, Life Logs and Online Photo Collections
11. Nov 20th: Optics and Sensing in Animal Eyes: what can we learn from successful biological vision systems?
12. Nov 27th: Thanksgiving Holiday (No Class)
13. Dec 4th: Final Projects

• What are annoyances in photography?
• Why does a CCD camera behave retroreflectively?
• YouTube videos on camera tutorials (DoF etc.): http://www.youtube.com/user/MPTutor

Anti-Paparazzi FlashAnti-Paparazzi Flash

The anti-paparazzi flash: 1. The celebrity prey. 2. The lurking photographer. 3. The offending camera is detected and then bombed with a beam of light. 4. Voila! A blurry image of nothing much.

• Anti-Paparazzi Flash

Retroreflective CCD of cellphone camera

Preventing Camera Recording by Designing a Capture-Resistant Environment. Khai N. Truong, Shwetak N. Patel, Jay W. Summet, and Gregory D. Abowd. Ubicomp 2005

Auto Focus

• Contrast method: compares the contrast of images at three depths; if in focus, the image will have high contrast, else not.
• Phase method: compares two parts of the lens at the sensor plane; if in focus, the entire exit pupil sees a uniform color, else not.
– Assumes the object has a diffuse BRDF.
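The contrast method can be sketched numerically: compute a focus measure (here gradient energy, one common choice) at several candidate lens positions and pick the maximum. A toy NumPy example in which synthetic blur stands in for defocus:

```python
import numpy as np

def sharpness(img):
    """Contrast measure: total squared gradient; high when in focus."""
    gy, gx = np.gradient(img.astype(float))
    return float(np.sum(gx**2 + gy**2))

def blur(img, k):
    """Crude box blur repeated k times, standing in for defocus."""
    out = img.astype(float)
    for _ in range(k):
        out = (np.roll(out, 1, 1) + out + np.roll(out, -1, 1)) / 3.0
    return out

# A sharp synthetic target and two defocused versions of it,
# simulating three lens positions; the in-focus one wins.
rng = np.random.default_rng(1)
target = rng.random((16, 16))
stack = [blur(target, 4), target, blur(target, 2)]
best = int(np.argmax([sharpness(s) for s in stack]))
```

A real camera sweeps the lens and hill-climbs on this measure; the sketch just evaluates three fixed positions.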

Final Project Ideas

• User interaction device
– Camera based
– Illumination based
– Photodetector or line-scan camera
• Capture the invisible
– Tomography for internals
– Structured light for 3D scanning
– Fluorescence for transparent materials
• Cameras in different EM/other spectrum
– WiFi, audio, magnetic, haptic, capacitive
– Visible/thermal-IR segmentation
– Thermal IR (emotion detection, motion detector)
– Multispectral camera, discriminating (camel vs. sand)
• Illumination
– Multi-flash with lightfield
– Schlieren photography
– Strobing and colored strobing
• External non-imaging sensor
– Camera with gyro/movement sensors, find the identity of the user
– Cameras with GPS and online geo-tagged photo collections
– Interaction between two cameras (with lasers on-board)
• Optics
– Lightfield
– Coded aperture
– Bio-inspired vision
• Time
– Time-lapse photos
– Motion blur

Kitchen Sink: Volumetric Scattering

Direct Global

Volumetric Scattering: Chandrasekhar 50, Ishimaru 78

“Origami Lens”: Thin Folded Optics (2007)

“Ultrathin Cameras Using Annular Folded Optics,” E. J. Tremblay, R. A. Stack, R. L. Morrison, J. E. Ford. Applied Optics, 2007, OSA

Slides by Shree Nayar

Origami Lens

Conventional Lens vs. Origami Lens

Fernald, Science [Sept 2006]

(Eye types: shadow, refractive, reflective)

Tools for Visual Computing

Photonic Crystals

• ‘Routers’ for photons instead of electrons
• Photonic crystal
– Nanostructured material with an ordered array of holes
– A lattice of high-RI material embedded within a lower RI
– High index contrast
– 2D or 3D periodic structure
• Photonic band gap
– Highly periodic structure that blocks certain wavelengths
– (creates a ‘gap’ or notch in wavelength)
• Applications
– ‘Semiconductors for light’: mimics the silicon band gap for electrons
– Highly selective/rejecting narrow-wavelength filters (Bayer mosaic?)
– Light-efficient LEDs
– Optical fibers with extreme bandwidth (wavelength multiplexing)
– Hype: future terahertz CPUs via optical communication on chip

Schlieren Photography

• Images small index-of-refraction gradients in a gas
• Invisible to the human eye (subtle mirage effect)
• A knife edge blocks half the light unless the distorted beam focuses imperfectly
(Setup: collimated light, camera)

http://www.mne.psu.edu/psgdl/FSSPhotoalbum/index1.htm

Sample Final Projects

• Schlieren Photography (Best project award + prize in 2008)
• Camera array for Particle Image Velocimetry
• BiDirectional Screen
• Looking Around a Corner (theory)
• Tomography machine
• ...

Computational Illumination

• Dual Photography
• Direct-Global Separation
• Multi-flash Camera

Ramesh Raskar, Computational Illumination

Computational Illumination

Computational Photography

(Block diagram: Illumination, Generalized Optics, Generalized Sensor, Processing; Novel Cameras; 4D Light Field)

Computational Illumination

(Block diagram: Light Sources, Modulators, Generalized Optics, Generalized Sensor, Processing; Novel Cameras; Novel Illumination; 4D Light Field; Programmable 4D Illumination: field + time + wavelength)

Edgerton, 1930s

Not Special Cameras but Special Lighting

Multi-flash Sequential Photography

Stroboscope (Electronic Flash)

Shutter Open

Flash Time

‘Smarter’ Lighting Equipment

What Parameters Can We Change?

Computational Illumination: Programmable 4D Illumination Field + Time + Wavelength

• Presence or Absence, Duration, Brightness
– Flash/No-flash
• Light position
– Relighting: programmable dome
– Shape enhancement: multi-flash for depth edges
• Light color/wavelength
• Spatial Modulation
– Synthetic Aperture Illumination
• Temporal Modulation
– TV remote, Motion Tracking, Sony ID-cam, RFIG
• Exploiting (uncontrolled) natural lighting conditions
– Day/Night Fusion, Time Lapse, Glare

Multi-flash Camera for Detecting Depth Edges

Ramesh Raskar, Karhan Tan, Rogerio Feris, Jingyi Yu, Matthew Turk

Mitsubishi Electric Research Labs (MERL), Cambridge, MA; U of California at Santa Barbara; U of North Carolina at Chapel Hill

Non-photorealistic Camera: Depth Edge Detection and Stylized Rendering using Multi-Flash Imaging

Car Manuals

What are the problems with ‘real’ photo in conveying information ?

Why do we hire artists to draw what can be photographed ?

Shadows

Clutter

Many Colors

Highlight Shape Edges

Mark moving parts

Basic colors


A New Problem

Gestures

Input Photo Canny Edges Depth Edges

Depth Edges with MultiFlash. Raskar, Tan, Feris, Yu, Turk, ACM SIGGRAPH 2004

Depth Discontinuities

Internal and external: shape boundaries, occluding contour, silhouettes

Depth Edges

Our Method / Canny

Canny Intensity Edge Detection

Our Method

Photo Result

Mitsubishi Electric Research Labs. Raskar, Tan, Feris, Yu, Turk: MultiFlash NPR Camera

Imaging Geometry

Shadow lies along epipolar ray


Shadow lies along the epipolar ray; the epipole and the shadow are on opposite sides of the edge


Mitsubishi Electric Research Labs Raskar, Tan, Feris, Yu, TurkMultiFlash NPR Camera

Depth Edge Camera

Light epipolar rays are horizontal or vertical


(Figure: Left Flash and Right Flash inputs; normalized ratio images Left/Max and Right/Max; union of depth edges U{depth edges})


Negative transition along the epipolar ray is a depth edge (see plot)



% Max composite (MATLAB's max compares two arrays at a time)
maximg = max( max(left, right), max(top, bottom) );

% Normalize by computing ratio images
r1 = left ./ maximg;  r2 = top ./ maximg;
r3 = right ./ maximg; r4 = bottom ./ maximg;

% Compute confidence map
v = fspecial('sobel'); h = v';
d1 = imfilter(r1, v); d3 = imfilter(r3, v); % vertical Sobel
d2 = imfilter(r2, h); d4 = imfilter(r4, h); % horizontal Sobel

% Keep only the transition polarity expected for each flash direction
silhouette1 = d1 .* (d1 > 0);
silhouette2 = abs( d2 .* (d2 < 0) );
silhouette3 = abs( d3 .* (d3 < 0) );
silhouette4 = d4 .* (d4 > 0);

% Pick the max confidence at each pixel
confidence = max( max(silhouette1, silhouette2), max(silhouette3, silhouette4) );
imwrite(confidence, 'confidence.bmp');

No magic parameters!
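The same pipeline, reduced to a left/right flash pair, can be written in NumPy. This is a sketch on synthetic data: a simple horizontal difference stands in for the Sobel filter, and the shadow band is painted in by hand rather than captured:

```python
import numpy as np

def depth_edge_confidence(left, right):
    """Left/right-flash version of the slide's MATLAB, in NumPy.

    In the ratio image (flash image / max composite), the shadow cast
    by each flash appears as a dark band next to a depth edge; a signed
    transition along the (horizontal) epipolar line marks the edge.
    """
    eps = 1e-6
    maximg = np.maximum(left, right)
    r_left = left / (maximg + eps)
    r_right = right / (maximg + eps)
    # Forward difference along the epipolar (horizontal) direction.
    d_left = np.diff(r_left, axis=1, prepend=r_left[:, :1])
    d_right = np.diff(r_right, axis=1, prepend=r_right[:, :1])
    s_left = d_left * (d_left > 0)             # keep one transition polarity
    s_right = np.abs(d_right * (d_right < 0))  # opposite polarity
    return np.maximum(s_left, s_right)

# Synthetic pair: the left flash casts a shadow band in columns 6-8.
left = np.ones((4, 12)); left[:, 6:9] = 0.1
right = np.ones((4, 12))
conf = depth_edge_confidence(left, right)
```

The confidence peaks where the shadow band ends along the epipolar ray, which is exactly the transition the MATLAB's sign tests isolate.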

Depth Edges

Left Top Right Bottom

Depth Edges / Canny Edges

Gestures

Input Photo Canny Edges Depth Edges

Flash Matting

Flash Matting, Jian Sun, Yin Li, Sing Bing Kang, and Heung-Yeung Shum, Siggraph 2006

Multi-light Image Collection [Fattal, Agrawala, Rusinkiewicz], SIGGRAPH 2007

Input Photos
Shadow-free, enhanced surface detail, but flat look
Some shadows for depth, but lost visibility

Multiscale decomposition using a bilateral filter; combine detail at each scale across all the input images.

Fuse the maximum gradient from each photo; reconstruct by 2D integration across all the input images.

Enhanced shadows

Computational Illumination: Programmable 4D Illumination Field + Time + Wavelength

• Presence or Absence, Duration, Brightness
– Flash/No-flash (matting for foreground/background)
• Light position
– Relighting: programmable dome
– Shape enhancement: multi-flash for depth edges
• Light color/wavelength
• Spatial Modulation
– Dual Photography, Direct/Global Separation, Synthetic Aperture Illumination
• Temporal Modulation
– TV remote, Motion Tracking, Sony ID-cam, RFIG
• Exploiting (uncontrolled) natural lighting conditions
– Day/Night Fusion, Time Lapse, Glare

Los Angeles, CA August 2, 2005 Pradeep Sen

Dual Photography

Pradeep Sen, Billy Chen, Gaurav Garg, Steve Marschner, Mark Horowitz, Marc Levoy, Hendrik Lensch

Stanford University, Cornell University


The card experiment

(Setup: projector, book, card, camera; primal configuration)


The card experiment: primal and dual


Overview of dual photography

standard photograph from camera; dual photograph from projector


Outline

1. Introduction to dual photography

2. Application to scene relighting

3. Accelerating acquisition

4. Conclusions


Helmholtz reciprocity

(Diagram: swapping the light and the eye leaves the transferred intensity I unchanged; primal vs. dual, shown with a projector and a photosensor viewing the scene.)



Los Angeles, CA August 2, 2005 Pradeep Sen

Forming a dual image

(Diagram: the projector illuminates the scene one pixel at a time while the photosensor records one coefficient C0 … C7 per projector pixel, building the dual image.)




Physical demonstration

Projector was scanned across a scene while a photosensor measured the outgoing light

photosensor

resulting dual image


Related imaging methods

Example of a “flying-spot” camera built at the dawn of TV (Baird 1926)

Scanning electron microscope

Velcro® at 35x magnification,Museum of Science, Boston


Dual photography for relighting

(Diagram: a p×q projector and an m×n camera; in the dual, the camera acts as a dual projector and the projector as a dual camera, giving a 4D transport.)


Mathematical notation

(Primal: the p×q projector image, stacked as a pq×1 vector P, maps to the mn×1 camera image C through the transport matrix T of size mn×pq.)


Mathematical notation (primal): C = T P, where C is mn×1, T is mn×pq, and P is pq×1.


Mathematical notation: illuminating one projector pixel at a time, P = [1 0 0 0 0 0]^T, then [0 1 0 0 0 0]^T, then [0 0 1 0 0 0]^T, reads out the columns of T one by one.

Mathematical notation: C = T P. Little interreflection → sparse T; many interreflections → dense T.


Mathematical notation: in primal space, C = T P (T is mn×pq). In dual space the roles reverse: P' = T' C', with T' of size pq×mn. By Helmholtz reciprocity the dual transport is the transpose of the primal one, T' = T^T, i.e. element-wise T'_ji = T_ij.
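The transpose relation is easy to check numerically with a made-up transport matrix (a sketch; a real T is acquired by scanning the projector pixels):

```python
import numpy as np

# Toy transport demo: C = T P in the primal, P_dual = T^T C2 in the dual.
# Sizes follow the slides: the projector has pq pixels, the camera mn.
rng = np.random.default_rng(0)
pq, mn = 6, 4
T = rng.random((mn, pq))      # mn x pq light transport (invented values)

P = rng.random(pq)            # projector pattern
C = T @ P                     # what the camera records (primal)

# Dual: "illuminate" from the camera side with pattern C2 and form the
# image seen from the physical projector's viewpoint.
C2 = rng.random(mn)
P_dual = T.T @ C2             # pq x 1 dual image

# Helmholtz reciprocity in matrix form: the coefficient coupling
# projector pixel j to camera pixel i is the same in both directions.
i, j = 2, 5
assert np.isclose(T[i, j], T.T[j, i])
```

Illuminating a single projector pixel (P a basis vector) reads out one column of T, which is exactly the acquisition procedure the slides describe.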


Definition of dual photography

Primal space: C = T P, with T of size mn×pq.
Dual space: P = T^T C, with T^T of size pq×mn; the pq×1 dual image is what a camera at the projector's position would record.


Sample results

primal dual



Scene relighting

Knowing the pixel-to-pixel transport between the projector and the camera allows us to relight the scene with an arbitrary 2D pattern

primal dual


Photosensor experiment

With a single photosensor in place of the camera, the transport collapses to a 1×pq row vector: c = T P gives a scalar reading c. In dual space, P = T^T c: the pq×1 dual image is the transport row itself, scaled by the sensor reading.


2D relighting videos

Relighting book scene with animated patterns

Relighting box with animated pattern


Relighting with 4D incident light fields

2D + 4D → 6D Transport


From Masselus et al. SIGGRAPH ‘03


Advantages of our dual framework

• Acquisition of transport from multiple projectors cannot be parallelized
• Acquisition of transport using multiple cameras can!


“Multiple” cameras with mirror array


Conclusions

Dual photography is a novel imaging technique that allows us to interchange the camera and the projector

Dual photography can accelerate acquisition of the 6D transport for relighting with 4D incident light fields

We developed an algorithm to accelerate the acquisition of the transport matrix for dual photography


Computational Illumination: Programmable 4D Illumination Field + Time + Wavelength

• Presence or Absence, Duration, Brightness
– Flash/No-flash (matting for foreground/background)
• Light position
– Relighting: programmable dome
– Shape enhancement: multi-flash for depth edges
• Light color/wavelength
• Spatial Modulation
– Dual Photography, Direct/Global Separation, Synthetic Aperture Illumination
• Temporal Modulation
– TV remote, Motion Tracking, Sony ID-cam, RFIG
• Exploiting (uncontrolled) natural lighting conditions
– Day/Night Fusion, Time Lapse, Glare

Visual Chatter in the Real World

Shree K. Nayar

Computer Science, Columbia University

With: Guru Krishnan, Michael Grossberg, Ramesh Raskar

Eurographics Rendering Symposium, June 2006, Nicosia, Cyprus

Support: ONR

Direct and Global Illumination

(Diagram: source illuminates surface point P; camera observes it.)
A: Direct
B: Interreflection
C: Subsurface scattering
D: Volumetric scattering (participating medium, translucent surface)
E: Diffusion

Direct Global

Shower Curtain: Diffuser

L[c, i] = Ld[c, i] + Lg[c, i]   (radiance = direct + global)

Direct and Global Components: Interreflections

(Diagram: source illuminates surface point i; camera at c observes it.)

Lg[c, i] = Σj A[i, j] L[i, j], where A[i, j] captures BRDF and geometry.

High Frequency Illumination Pattern

(A fraction α of the source elements is activated.)

L+[c, i] = Ld[c, i] + α Lg[c, i]

High Frequency Illumination Pattern

(Complementary pattern: a fraction 1 − α of the source elements is activated; pixel i's own element is off, so there is no direct term.)

L−[c, i] = (1 − α) Lg[c, i]

Separation from Two Images (α = 1/2):

Lg = 2 Lmin   (global)
Ld = Lmax − Lmin   (direct)
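The two-image rule can be verified on synthetic data: build per-pixel lit and unlit measurements from known direct and global components, then recover them with min/max. A sketch; real captures use shifted checkerboard patterns and take the per-pixel max/min over the sequence:

```python
import numpy as np

# Simulate the two-image separation with alpha = 1/2: a lit pixel sees
# L+ = Ld + Lg/2, an unlit pixel sees L- = Lg/2, so Lg = 2*Lmin and
# Ld = Lmax - Lmin. Ground-truth components are invented.
rng = np.random.default_rng(42)
Ld = rng.random((8, 8))           # ground-truth direct component
Lg = rng.random((8, 8))           # ground-truth global component

alpha = 0.5
L_lit = Ld + alpha * Lg           # pixel's own source element on
L_unlit = (1 - alpha) * Lg        # pixel's own source element off

Lmax = np.maximum(L_lit, L_unlit)
Lmin = np.minimum(L_lit, L_unlit)

Lg_est = 2.0 * Lmin               # recovered global
Ld_est = Lmax - Lmin              # recovered direct
```

Since L_lit − L_unlit = Ld ≥ 0, the lit measurement is always the max, which is why the simple per-pixel min/max suffices.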

Other Global Effects: Volumetric Scattering

(Diagram: a participating medium between source, surface, and camera scatters light from element j toward pixel i.)

Scene: Direct and Global components

(Global effects present: diffuse interreflections, specular interreflections, volumetric scattering, subsurface scattering, diffusion. Panels F, G, C, A, D, B, E:)

A: Diffuse Interreflection (Board)

B: Specular Interreflection (Nut)

C: Subsurface Scattering (Marble)

D: Subsurface Scattering (Wax)

E: Translucency (Frosted Glass)

F: Volumetric Scattering (Dilute Milk)

G: Shadow (Fruit on Board)

Verification

V-Grooves: Diffuse Interreflections

Direct Global

concave convex

Psychophysics: Gilchrist 79, Bloj et al. 04

Mirror Ball: Failure Case

Direct Global

Real World Examples:

Can You Guess the Images?

Eggs: Diffuse Interreflections

Direct Global

Stick

Building Corner

Shadow

Lg = 2 Lmin   (global)
Ld = Lmax − Lmin   (direct)

3D from Shadows: Bouguet and Perona 99

Direct Global

Building Corner

Shower Curtain: Diffuser

Shadow

Mesh

Lg = 2 Lmin   (global)
Ld = Lmax − Lmin   (direct)

Direct Global

Shower Curtain: Diffuser

Kitchen Sink: Volumetric Scattering

Direct Global

Volumetric Scattering: Chandrasekhar 50, Ishimaru 78

Novel Image

Real or Fake ?

Direct Global

(Images labeled R = real, F = fake.)

Hand

Direct Global

Skin: Hanrahan and Krueger 93, Uchida 96, Haro 01, Jensen et al. 01, Igarashi et al. 05, Weyrich et al. 05

Hands

(Subjects: African-American female, Chinese male, Spanish male; Direct and Global components shown for each.)

Pink Carnation

Global, Direct

Spectral Bleeding: Funt et al. 91

Summary

• Fast and Simple Separation Method

• Wide Variety of Global Effects

• No Prior Knowledge of Material Properties

• Implications:

• Generation of Novel Images

• Enhance Computer Vision Methods

• Insights into Properties of Materials

(Image montage: for each example scene, direct + global = original photograph.)

www.cs.columbia.edu/CAVE

Computational Illumination: Programmable 4D Illumination Field + Time + Wavelength

• Presence or Absence, Duration, Brightness
– Flash/No-flash (matting for foreground/background)
• Light position
– Relighting: programmable dome
– Shape enhancement: multi-flash for depth edges
• Light color/wavelength
• Spatial Modulation
– Dual Photography, Direct/Global Separation, Synthetic Aperture Illumination
• Temporal Modulation
– TV remote, Motion Tracking, Sony ID-cam, RFIG
• Exploiting (uncontrolled) natural lighting conditions
– Day/Night Fusion, Time Lapse, Glare

(Image grid axes: day of the year vs. time of day.)

The Archive of Many Outdoor Scenes (AMOS)

Images from ~1000 static webcams, every 30 minutes since March 2006.

Variations over a year and over a day

Jacobs, Roman, and Pless, WUSTL, CVPR 2007

Analysing Time Lapse Images

• PCA
– Linear variations due to lighting and seasonal variation
• Decompose (by time scale)
– Hour: haze and cloud for depth
– Day: changing lighting directions for surface orientation
– Year: effects of changing seasons highlight vegetation
• Applications:
– Scene segmentation
– Global webcam localization: correlate time-lapse video over a month from an unknown camera with:
• Sunrise + sunset (localization accuracy ~50 miles)
• Known nearby cameras (~25 miles)
• Satellite image (~15 miles)

Mean image + 3 components from a time lapse of downtown St. Louis over the course of 2 hours

2-hour time lapse in St. Louis: depth from co-varying regions

Surface Orientation

False Color PCA images
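The PCA step can be sketched on a synthetic time lapse: frames that vary along a single lighting direction yield one dominant principal component, recovered here with an SVD of the centered frame matrix (all data invented for illustration):

```python
import numpy as np

# A tiny synthetic "time lapse": 24 frames of a 16-pixel scene whose
# brightness varies along one dominant lighting direction.
rng = np.random.default_rng(0)
mean_img = np.full(16, 0.5)
light_dir = rng.standard_normal(16)
light_dir /= np.linalg.norm(light_dir)
coeffs = np.linspace(-1, 1, 24)              # e.g. sun moving over 2 hours
frames = mean_img + np.outer(coeffs, light_dir)

X = frames - frames.mean(axis=0)             # center, as PCA requires
U, S, Vt = np.linalg.svd(X, full_matrices=False)
explained = S**2 / np.sum(S**2)              # variance per component
```

The first right singular vector recovers the lighting direction (up to sign), mirroring how the first PCA components of a real webcam sequence capture the dominant lighting variation.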


Image Fusion for Context Enhancement and Video Surrealism

Adrian Ilie, Ramesh Raskar, Jingyi Yu


Dark Bldgs

Reflections on bldgs

Unknown shapes


‘Well-lit’ Bldgs

Reflections in bldgs windows

Tree, Street shapes

Background is captured from the day-time scene using the same fixed camera

Night Image

Day Image

Context Enhanced Image

http://web.media.mit.edu/~raskar/NPAR04/

Ramesh Raskar, CompPhoto

Factored Time Lapse Video

Factor into shadow, illumination, and reflectance. Relight, recover surface normals, reflectance editing.

[Sunkavalli, Matusik, Pfister, Rusinkiewicz], Sig’07


Computational Illumination: Programmable 4D Illumination Field + Time + Wavelength

• Presence or Absence, Duration, Brightness
– Flash/No-flash (matting for foreground/background)
• Light position
– Relighting: programmable dome
– Shape enhancement: multi-flash for depth edges
• Light color/wavelength
• Spatial Modulation
– Dual Photography, Direct/Global Separation, Synthetic Aperture Illumination
• Temporal Modulation
– TV remote, Motion Tracking, Sony ID-cam, RFIG
• Exploiting (uncontrolled) natural lighting conditions
– Day/Night Fusion, Time Lapse, Glare