
Page 1: Image-Based Lighting cs129: Computational Photography James Hays, Brown, Spring 2011 © Eirik Holmøyvik Slides from Alexei Efros and Paul Debevec adapted

Image-Based Lighting

cs129: Computational Photography
James Hays, Brown, Spring 2011

© Eirik Holmøyvik

Slides from Alexei Efros and Paul Debevec

adapted by Klaus Mueller, added some slides from Jian Huang, UTK

Page 2:

Inserting Synthetic Objects

Why does this look so bad?
• Wrong orientation
• Wrong lighting
• No shadows

Page 3:

Environment Maps

Simple solution for shiny objects
• Models complex lighting as a panoramic image
• i.e., the amount of radiance coming in from each direction
• A plenoptic function!

Page 4:

[Figure: a viewer ray v hits a reflective surface with normal n and produces reflection vector r; a projector function converts the reflection vector (x, y, z) to environment texture image coordinates (u, v).]

Environment Mapping

Reflected ray: r = 2(n·v)n − v

Texture is transferred in the direction of the reflected ray from the environment map onto the object.

What is in the map?
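The reflection formula above is easy to sanity-check in code. A minimal sketch (NumPy; the function name is mine, not from the slides):

```python
import numpy as np

def reflect(n, v):
    """Reflect view vector v about unit normal n: r = 2(n.v)n - v.

    Both n and v are assumed to be unit vectors pointing away from
    the surface point (v toward the viewer).
    """
    n = np.asarray(n, dtype=float)
    v = np.asarray(v, dtype=float)
    return 2.0 * np.dot(n, v) * n - v

# Viewer looking straight at a surface facing +z:
r = reflect([0.0, 0.0, 1.0], [0.0, 0.0, 1.0])
# r == [0, 0, 1]: a head-on view reflects straight back at the viewer.
```

At a grazing angle (n·v = 0) the formula gives r = −v, i.e., the reflection continues past the surface, which matches the geometric picture in the figure.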

Page 5:

What approximations are made?
The map should contain a view of the world with the point of interest on the object as the Center of Projection.
• We can’t store a separate map for each point, so one map is used with the COP at the center of the object.
• This introduces distortions in the reflection, but we usually don’t notice.
• Distortions are minimized for a small object in a large room.

The object will not reflect itself!

Page 6:

Environment Maps
The environment map may take various forms:
• Cubic mapping
• Spherical mapping
• other

The form describes the shape of the surface on which the map “resides”, and determines how the map is generated and how it is indexed.

Page 7:

Cubic Mapping

The map resides on the surfaces of a cube around the object.
• Typically, align the faces of the cube with the coordinate axes.

To generate the map:
• For each face of the cube, render the world from the center of the object with the cube face as the image plane.
  – Rendering can be arbitrarily complex (it’s off-line).
• Or, take 6 photos of a real environment with a camera in the object’s position.

To use the map:
• Index the R ray into the correct cube face.
• Compute texture coordinates.

Page 8:

Cubic Map Example

Page 9:

Calculating the Reflection Vector

Normal vector of the surface: N
Incident ray: I
Reflection ray: R
N, I, R all normalized.

R = I − 2N(N·I)

The texture coordinate is based on the reflection vector, assuming the origin of the vector is always at the center of the cube environment map.

Page 10:

Indexing Cubic Maps

Assume you have R, the cube’s faces are aligned with the coordinate axes, and each face has texture coordinates in [0,1]×[0,1].
• How do you decide which face to use?
• Use the reflection-vector coordinate with the largest magnitude.

(0.3, 0.2, 0.8) → face in +z direction

Page 11:

Indexing Cubic Maps

How do you decide which texture coordinates to use?
• Divide by the coordinate with the largest magnitude.
• The result now ranges over [−1, 1].
• Remap it to a value between 0 and 1.

(0.3, 0.2, 0.8) → ((0.3/0.8 + 1)·0.5, (0.2/0.8 + 1)·0.5) = (0.6875, 0.625)
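The face-selection and remapping steps from the last two slides can be sketched together. The helper name is mine, and real cube-map conventions (e.g., OpenGL’s) additionally flip u or v per face, which this sketch ignores:

```python
import numpy as np

def cube_face_uv(r):
    """Pick the cube face hit by reflection vector r; return (face, u, v).

    The face is chosen by the component with the largest magnitude; the
    other two components are divided by it (range [-1,1]) and remapped
    to [0,1]. Per-face axis flips used by real APIs are omitted here.
    """
    r = np.asarray(r, dtype=float)
    axis = int(np.argmax(np.abs(r)))          # 0 = x, 1 = y, 2 = z
    face = ('+' if r[axis] > 0 else '-') + 'xyz'[axis]
    s, t = [r[i] for i in range(3) if i != axis]
    u = (s / abs(r[axis]) + 1.0) * 0.5
    v = (t / abs(r[axis]) + 1.0) * 0.5
    return face, u, v

print(cube_face_uv([0.3, 0.2, 0.8]))
# face '+z'; (u, v) ≈ (0.6875, 0.625), matching the slide's example
```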

Page 12:

A Sphere Map

• A mapping between the reflection vector and a circular texture

• Prepare an image/texture in which the environment is mapped onto a sphere

Page 13:

Sphere Mapping

To generate the map:
• Take a photograph of a shiny sphere.
• Or map a cubic environment map onto a sphere.
• For synthetic scenes, you can use ray tracing.

Page 14:

Sphere Map

Compute the reflection vector at the surface of the object.
Find the corresponding texture coordinate on the sphere map.
Use the texture to color the surface of the object.

Page 15:

Indexing Sphere Maps

Given the reflection vector R = (Rx, Ry, Rz), the coordinates (u, v) ∈ [0,1]² on the spherical map are

u = Rx/m + 1/2,   v = Ry/m + 1/2,   where m = 2·√(Rx² + Ry² + (Rz + 1)²)
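This indexing is easy to express directly (a sketch assuming the OpenGL-style sphere-map convention the formula matches; the helper name is mine). Note m = 0 when R = (0, 0, −1), the single direction a sphere map cannot represent:

```python
import math

def sphere_map_uv(rx, ry, rz):
    """(u, v) in [0,1]^2 on a sphere map for reflection vector R.

    m = 2*sqrt(Rx^2 + Ry^2 + (Rz+1)^2); u = Rx/m + 1/2, v = Ry/m + 1/2.
    Undefined (m = 0) for R = (0, 0, -1), the direction behind the map.
    """
    m = 2.0 * math.sqrt(rx * rx + ry * ry + (rz + 1.0) ** 2)
    return rx / m + 0.5, ry / m + 0.5

# A reflection straight back at the camera, R = (0, 0, 1), maps to the
# center of the sphere image:
print(sphere_map_uv(0.0, 0.0, 1.0))   # (0.5, 0.5)
```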

Page 16:

Indexing Sphere Maps

The normal vector is the sum of the reflection vector and the eye vector:

N = (Rx, Ry, Rz + 1)

Normalizing this vector gives

n = (Rx/m, Ry/m, (Rz + 1)/m),   where m = √(Rx² + Ry² + (Rz + 1)²)

If the normal lies on a sphere of radius 1, its x and y coordinates are also its location on the sphere map.

Finally, converting them to the range [0,1] gives

u = Rx/(2m) + 1/2,   v = Ry/(2m) + 1/2

Page 17:

Non-linear Mapping

Problems:
• Highly non-uniform sampling
• Highly non-linear mapping

Linear interpolation of texture coordinates picks up the wrong texture pixels.
• Do per-pixel sampling or use high-resolution polygons.
• Can only view from one direction.

[Figure: correct per-pixel sampling vs. linear interpolation]

Page 18:

Example

Page 19:

What about real scenes?

From Flight of the Navigator

Page 20:

What about real scenes?

from Terminator 2

Page 21:

Real environment maps
We can use photographs to capture environment maps.

How do we deal with light sources? Sun, lights, etc.?
• They are much, much brighter than the rest of the environment.

Use High Dynamic Range photography, of course!

Several ways to acquire environment maps:
• Stitching mosaics
• Fisheye lens
• Mirrored balls
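As an illustration of the mirrored-ball option: each pixel on the ball reflects one environment direction. A sketch under simplifying assumptions (orthographic camera looking down −z at a unit sphere; the helper name is hypothetical), using the same reflection formula as before:

```python
import math

def mirror_ball_direction(x, y):
    """Environment direction seen at normalized ball coords (x, y), x^2 + y^2 <= 1.

    Assumes an orthographic camera looking down -z at a unit mirrored
    sphere: the surface normal at (x, y) is (x, y, sqrt(1 - x^2 - y^2)),
    and reflecting the view vector v = (0, 0, 1) about it gives the
    direction the environment light arrived from.
    """
    z = math.sqrt(max(0.0, 1.0 - x * x - y * y))
    n = (x, y, z)
    v = (0.0, 0.0, 1.0)                    # from surface toward the camera
    ndotv = z                              # n . v
    return tuple(2.0 * ndotv * ni - vi for ni, vi in zip(n, v))

print(mirror_ball_direction(0.0, 0.0))   # (0.0, 0.0, 1.0): center reflects the camera
```

The silhouette of the ball (x² + y² = 1) reflects the direction directly behind it, which is why a single photo of a mirrored sphere sees (almost) the full environment.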

Page 22:

Stitching HDR mosaics

http://www.gregdowning.com/HDRI/stitched/

Page 23:

Scanning Panoramic Cameras

Pros:
• Very high resolution (10K × 7K+)
• Full sphere in one scan – no stitching
• Good dynamic range; some are HDR

Issues:
• More expensive
• Scans take a while

Companies: Panoscan, Sphereon

Page 24:

Fisheye Images

Page 25:

Mirrored Sphere

Page 26:
Page 27:
Page 28:

Sources of Mirrored Balls

2-inch chrome balls, ~$20 ea.
McMaster-Carr Supply Company
www.mcmaster.com

6–12 inch large gazing balls
Baker’s Lawn Ornaments
www.bakerslawnorn.com

Hollow spheres, 2 in – 4 in
Dube Juggling Equipment
www.dube.com

FAQ on www.debevec.org/HDRShop/

Page 29:

Calibrating Mirrored Sphere Reflectivity

[Figure: measured pixel values 0.34 and 0.58; 0.34/0.58 ⇒ 59% reflective]

Page 30:

Real-World HDR Lighting Environments

Lighting Environments from the Light Probe Image Gallery:
http://www.debevec.org/Probes/

Funston Beach

Uffizi Gallery

Eucalyptus Grove

Grace Cathedral

Page 31:

Acquiring the Light Probe

Page 32:

Assembling the Light Probe

Page 33:

Not just shiny…

We have captured a true radiance map.

We can treat it as an extended (e.g., spherical) light source.

We can use Global Illumination to simulate light transport in the scene, so all objects (not just shiny ones) can be lit.

What’s the limitation?

Page 34:

Illumination Results

Page 35:

Comparison: Radiance map versus single image

Page 36:

Putting it all together

Synthetic Objects + Real light!

Page 37:

CG Objects Illuminated by a Traditional CG Light Source

Page 38:

Paul Debevec. A Tutorial on Image-Based Lighting. IEEE Computer Graphics and Applications, Jan/Feb 2002.

Page 39:

Rendering with Natural Light

SIGGRAPH 98 Electronic Theater

Page 40:

RNL environment mapped onto the interior of a large cube

Page 41:

MOVIE!

Page 42:

Illuminating a Small Scene

Page 43:
Page 44:

We can now illuminate synthetic objects with real light.

How do we add synthetic objects to a real scene?

Page 45:

Real Scene Example

Goal: place synthetic objects on table

Page 46:

Light Probe / Calibration Grid

Page 47:

Modeling the Scene

[Figure: real scene and its light-based model]

Page 48:

The Light-Based Room Model

Page 49:

Modeling the Scene

[Figure: real scene = light-based model + local scene + synthetic objects]

Page 50:

The Lighting Computation

[Figure: synthetic objects (known BRDF); local scene (estimated BRDF); distant scene (light-based, unknown BRDF)]

Page 51:

Rendering into the Scene

Background Plate

Page 52:

Rendering into the Scene

Objects and Local Scene matched to Scene

Page 53:

Differential Rendering

Local scene without objects, illuminated by model

Page 54:

Differential Rendering (2)
Difference in local scene

[Figure: rendered with objects − rendered without objects = difference image]

Page 55:

Differential Rendering

Add differences to images – Final Result
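The compositing step described on these slides can be sketched as follows (a minimal version; the function and array names are hypothetical, each argument an HxWx3 float image, with obj_mask marking synthetic-object pixels):

```python
import numpy as np

def differential_composite(plate, with_objects, without_objects, obj_mask):
    """Differential rendering: add to the background plate the difference
    between the local scene rendered with and without synthetic objects
    (capturing shadows and interreflections); inside the object mask,
    use the object render directly."""
    diff = with_objects - without_objects
    final = plate + diff
    final[obj_mask] = with_objects[obj_mask]
    return np.clip(final, 0.0, None)    # keep radiance non-negative
```

Where the local-scene model is accurate, the difference is zero and the real photograph shows through untouched; errors in the estimated local BRDF only affect the difference term, not the plate itself.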

Page 56:

High-Dynamic Range Photography

• Standard digital images typically represent only a small fraction of the dynamic range (the ratio between the dimmest and brightest regions accurately represented) present in most real-world lighting environments.

Bright lights saturate the pixel values.

We need a high dynamic range image, which can be produced by combining images taken at different shutter speeds.
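The combining step can be sketched as a weighted average of per-exposure radiance estimates. This is a simplified version of standard HDR merging: it assumes the images are already linearized (no camera response curve to recover, which the slides' actual pipeline would need) and uses a hat-shaped weight to distrust saturated and underexposed pixels:

```python
import numpy as np

def merge_hdr(images, exposure_times):
    """Merge differently exposed, linearized images into an HDR radiance map.

    images: list of HxW float arrays in [0, 1], linear (no gamma).
    Each pixel's radiance estimate is value/exposure_time, weighted to
    trust mid-range values and ignore clipped or noisy ones.
    """
    num = np.zeros_like(images[0], dtype=float)
    den = np.zeros_like(images[0], dtype=float)
    for img, t in zip(images, exposure_times):
        w = 1.0 - np.abs(2.0 * img - 1.0)   # hat weight, peaks at 0.5
        num += w * img / t
        den += w
    return num / np.maximum(den, 1e-8)      # weighted average of estimates
```

A pixel reading 0.5 at a 1 s exposure and 0.25 at a 0.5 s exposure both imply the same radiance, and the merge returns exactly that value.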

Page 57:

High-Dynamic Range Photography

Page 58:

HDR images

Page 59:
Page 60:

IMAGE-BASED LIGHTING IN FIAT LUX
Paul Debevec, Tim Hawkins, Westley Sarokin, H. P. Duiker, Christine Cheng, Tal Garfinkel, Jenny Huang

SIGGRAPH 99 Electronic Theater

Page 61:
Page 62:

HDR Image Series

2 sec   1/4 sec   1/30 sec
1/250 sec   1/2000 sec   1/8000 sec

Page 63:

Stp1 Panorama

Page 64:

Assembled Panorama

Page 65:

Light Probe Images

Page 66:

Capturing a Spatially-Varying Lighting Environment

Page 67:

The Movie