An Optical System for Single-Image Environment Maps (sap 0450)

Jonas Unger∗
Linköping University

Stefan Gustavson†
Linköping University

Figure 1: a) A principal sketch of the mirror and lens setup, with reflected and refracted rays for the parallel projection case. b) An image of a checkered environment map, as displayed in c), as projected by the setup. The field of view of the central lens is about 30°, represented by the white circle in the cube map.

Abstract   We present an optical setup for capturing a full 360° environment map in a single image snapshot. The setup, which can be used with any camera device, consists of a curved mirror swept around a negative lens, and is suitable for capturing environment maps and light probes. The setup achieves good sampling density and uniformity for all directions in the environment.

Introduction   Image-based lighting, which uses captured real-world lighting as the source of illumination for renderings, was proposed in [Debevec 1998] and has been used successfully in recent years. In computer graphics applications, light probes are usually captured by photographing mirror spheres. A spherical mirror exhibits poor sampling, particularly for the forward-facing directions relative to the camera. The sphere is also a so-called non-single-viewpoint system, which means that the center of projection moves as a function of the position on the sphere. The properties of single and non-single viewpoint catadioptric systems have been investigated by [Baker and Nayar 1999] and [Swaminathan et al. 2006]. Non-single viewpoint systems are not optimal, but are still useful for imaging.

Optical System   The main goal of the setup is to achieve good and uniform sampling density for all directions in the environment. We use a swept mirror surface with a hole at the center, and a negative lens in the hole to yield a wide-angle view of the problematic forward-facing directions, see Figure 1 a).

The mirror reflects the backward- and forward-facing directions in the environment up to a certain limit. Assuming a parallel projection, the height z as a function of the radius 0 < r < 1 of our mirror profile is given by:

\[
\frac{dz}{dr} = \tan\!\left(\frac{\pi r}{2}\right)
\;\Longrightarrow\;
z = -\frac{2}{\pi}\,\ln\!\left(\cos\frac{\pi r}{2}\right)
\qquad (1)
\]

∗ e-mail: [email protected]   † e-mail: [email protected]

This yields a linear mapping of radius to reflection angle, similar to a fisheye lens and the "angular map" used in HDR imaging. Near forward-facing angles, the profile has a very steep slope and would be unwieldy to fabricate and use, but we truncate the profile to exclude the forward-facing angles covered by the central image. The example mirror profile covers about 330° of the environment, and the lens the remaining 30°. Some stretching can be noticed along the rim, but this non-uniformity is common to all circular wide-angle projections, and the radial sampling density in our setup is perfectly uniform. No contraction occurs, see Figure 1 b).
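To make the mapping concrete, the following Python sketch evaluates the mirror profile of Equation (1) and the resulting radius-to-angle relation for the parallel-projection case; the truncation radius and the sample count are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Mirror profile from Equation (1), parallel-projection case:
#   dz/dr = tan(pi*r/2)  =>  z(r) = -(2/pi) * ln(cos(pi*r/2)),  0 < r < 1.
# An axis-parallel camera ray reflected at normalized radius r leaves the
# mirror at polar angle theta = pi*r, i.e. radius maps linearly to angle.

def mirror_height(r):
    """Height z of the swept mirror profile at normalized radius r."""
    return -(2.0 / np.pi) * np.log(np.cos(np.pi * r / 2.0))

def reflected_angle(r):
    """Polar angle (radians, 0 = forward axis) of the ray seen at radius r."""
    return np.pi * r

# Illustrative truncation: drop the central ~30 degree cone covered by the
# negative lens, i.e. polar angles below 15 degrees (r < 1/12).
r_min = np.deg2rad(15.0) / np.pi
r = np.linspace(r_min, 0.99, 512)   # stop short of the singularity at r = 1
z = mirror_height(r)

# Numerical check that the profile slope matches tan(pi*r/2).
slope = np.gradient(z, r)
print(np.max(np.abs(slope[1:-1] - np.tan(np.pi * r[1:-1] / 2.0))))
```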

The setup with both a curved mirror and a lens gives two separate image formations of the environment, but these can easily be mapped to any convenient panorama representation. The lens is designed such that there is a small overlap between the reflected (mirror) and refracted (lens) fields of view, for better resampling and interpolation properties.
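A minimal sketch of the mapping involved in such a resampling is given below. The image-coordinate conventions, the mirror-image center (cx, cy), and the image radius R of the mirror rim are assumptions for illustration; only the linear radius-to-angle relation comes from the parallel-projection profile above.

```python
import numpy as np

def direction_from_mirror_pixel(x, y, cx, cy, R):
    """World direction (unit vector, +z = forward) seen at mirror pixel (x, y)."""
    dx, dy = x - cx, y - cy
    r = np.hypot(dx, dy) / R            # normalized radius in [0, 1]
    phi = np.arctan2(dy, dx)            # azimuth around the optical axis
    theta = np.pi * r                   # linear radius-to-angle mapping
    return np.array([np.sin(theta) * np.cos(phi),
                     np.sin(theta) * np.sin(phi),
                     np.cos(theta)])

def mirror_pixel_from_direction(d, cx, cy, R):
    """Inverse mapping: where to sample the mirror image for direction d."""
    theta = np.arccos(np.clip(d[2], -1.0, 1.0))
    phi = np.arctan2(d[1], d[0])
    r = theta / np.pi
    return cx + R * r * np.cos(phi), cy + R * r * np.sin(phi)
```

Building, for example, a latitude/longitude panorama then amounts to evaluating the inverse mapping for every output direction, sampling the central lens image instead for directions inside its field of view, and blending the two in the small overlap region.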

There is still a dead angle in this setup: the camera itself blocks out a small part of the view for back-facing angles. However, since the occluded region is small, this is less of a problem.

Our example uses parallel projection for clarity, and the same profile can be used with good accuracy for a telephoto lens. The profile can easily be changed to take perspective projection into account.

References

BAKER, S., AND NAYAR, S. 1999. A Theory of Single-Viewpoint Catadioptric Image Formation. International Journal of Computer Vision 35, 2 (Nov), 175–196.

DEBEVEC, P. 1998. Rendering synthetic objects into real scenes: bridging traditional and image-based graphics with global illumination and high dynamic range photography. In SIGGRAPH '98: Proceedings of the 25th annual conference on Computer graphics and interactive techniques, ACM Press, New York, NY, USA, 189–198.

SWAMINATHAN, R., GROSSBERG, M. D., AND NAYAR, S. K. 2006. Non-Single Viewpoint Catadioptric Cameras: Geometry and Analysis. International Journal of Computer Vision 66, 3 (Mar), 211–229.