
Interactive Indirect Illumination Using Voxel-Based Cone Tracing: An Insight

Cyril Crassin (Grenoble Univ, INRIA)
Fabrice Neyret (CNRS, Grenoble Univ, INRIA)
Miguel Sainz (NVIDIA)
Simon Green (NVIDIA)
Elmar Eisemann (Telecom ParisTech)

Figure 1: Real-time indirect illumination (25-70 fps on a GTX480): Our approach supports diffuse and glossy light bounces on complex scenes. We rely on a voxel-based hierarchical structure to ensure efficient integration of 2-bounce illumination. (Right scene courtesy of G. M. Leal Llaguno)

Abstract

Indirect illumination is an important element for realistic image synthesis, but its computation is expensive and highly dependent on the complexity of the scene and of the BRDF of the surfaces involved. While off-line computation and pre-baking can be acceptable for some cases, many applications (games, simulators, etc.) require real-time or interactive approaches to evaluate indirect illumination. We present a novel algorithm to compute indirect lighting in real-time that avoids costly precomputation steps and is not restricted to low-frequency illumination. It is based on a hierarchical voxel octree representation generated and updated on-the-fly from a regular scene mesh, coupled with an approximate voxel cone tracing that allows a fast estimation of visibility and incoming energy. Our approach can manage two light bounces for both Lambertian and glossy materials at interactive framerates (25-70 fps). It exhibits an almost scene-independent performance and allows for fully dynamic content thanks to an interactive octree voxelization scheme. In addition, we demonstrate that our voxel cone tracing can be used to efficiently estimate ambient occlusion.

1 Algorithm overview

Our approach is based on a three-step algorithm. We first inject incoming radiance (energy and direction) from dynamic light sources into the sparse voxel octree hierarchy, by rasterizing scene meshes and splatting a photon for each visible surface fragment. In a second step, we filter the values in the higher levels of the octree (mipmap). This is done efficiently in parallel by relying on screen-space quad-tree analysis. Finally, we render the scene from the camera. For each visible surface fragment, we combine the direct and indirect illumination. We employ an approximate cone tracing to perform a final gathering, sending out a few cones over the hemisphere to collect illumination distributed in the octree. Typically, for a Phong-like BRDF, a few large cones (~5) estimate the diffuse energy coming from the scene, while a tight cone in the reflected direction with respect to the viewpoint captures the specular component. The aperture of this cone is derived from the specular exponent of the material, allowing us to compute glossy reflections.
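To make the gathering step concrete, the following C++ sketch spells out the final-gathering loop under stated assumptions: traceCone() is a stub standing in for the pre-integrated voxel cone tracing of Section 3, the five diffuse cone directions are hardcoded for a surface whose normal is approximately +Z, and the aperture values and the exponent-to-aperture mapping are illustrative, not taken from the paper.

```cpp
#include <cmath>

struct Vec3 {
    float x, y, z;
    Vec3 operator+(const Vec3& b) const { return {x + b.x, y + b.y, z + b.z}; }
    Vec3 operator*(float s) const { return {x * s, y * s, z * s}; }
};

static float dot(const Vec3& a, const Vec3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Stub: a real implementation marches the mipmapped octree (see Section 3)
// and returns the radiance gathered inside the cone.
static Vec3 traceCone(const Vec3& origin, const Vec3& dir, float aperture) {
    (void)origin; (void)dir; (void)aperture;
    return {0.1f, 0.1f, 0.1f};
}

// Indirect lighting at a surface point; assumes the normal is roughly +Z so
// the hemisphere cone directions can be hardcoded (hypothetical layout).
Vec3 indirectLighting(const Vec3& pos, const Vec3& normal, const Vec3& reflectDir,
                      float specularExponent) {
    // Diffuse: a few (~5) large cones distributed over the hemisphere,
    // weighted by the cosine with the surface normal.
    const Vec3 coneDirs[5] = {
        {0, 0, 1}, {0.7f, 0, 0.7f}, {-0.7f, 0, 0.7f}, {0, 0.7f, 0.7f}, {0, -0.7f, 0.7f}
    };
    const float diffuseAperture = 1.0f;  // radians; illustrative value
    Vec3 diffuse = {0, 0, 0};
    for (const Vec3& d : coneDirs)
        diffuse = diffuse + traceCone(pos, d, diffuseAperture) * dot(normal, d);

    // Specular: one tight cone along the reflected view direction; a higher
    // specular exponent yields a narrower cone (illustrative mapping).
    float specAperture = 1.0f / (1.0f + specularExponent);
    Vec3 specular = traceCone(pos, reflectDir, specAperture);

    return diffuse * (1.0f / 5.0f) + specular;
}
```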

2 Dynamic sparse voxel octree structure

The core of our approach is built upon a pre-filtered hierarchical voxel version of the scene geometry. For efficiency, this representation is stored in the form of a compact pointer-based sparse voxel octree in the spirit of [Crassin et al. 2009]. To adapt to the specificities of the scenes, we use small 3³ bricks with values located in octree-node corners. This allows us to employ hardware interpolation, while using less than half the amount of memory compared to previous solutions. This structure exhibits an almost scene-independent performance. It is updated dynamically thanks to a fast GPU-based mesh voxelization and octree building system. This system handles dynamic updates of the structure, allowing for animated objects and dynamic modifications of the environment.
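As a rough illustration of the structure this section describes, here is a minimal C++ sketch of a pointer-based node carrying a 3³ brick with values at the node corners. The field names and the flat node pool are assumptions made for readability; the actual implementation packs bricks into GPU texture memory so that hardware interpolation applies.

```cpp
#include <array>
#include <cstdint>
#include <vector>

// Pre-filtered voxel values; the direction distribution is kept as a
// non-normalized vector (see Section 3).
struct Voxel {
    std::array<float, 4> rgba{};     // filtered color + opacity
    std::array<float, 3> dirDist{};  // non-normalized direction distribution
};

struct OctreeNode {
    // Index of the first of 8 consecutive children in the node pool below;
    // kNoChildren marks a leaf (sparse: empty space stores no children).
    static constexpr std::uint32_t kNoChildren = 0xFFFFFFFFu;
    std::uint32_t firstChild = kNoChildren;

    // 3x3x3 brick with values located at the node's corners, so adjacent
    // bricks share samples and interpolation is seamless across nodes.
    std::array<Voxel, 27> brick{};
};

// A compact pointer-based (here: index-based) sparse voxel octree.
struct SparseVoxelOctree {
    std::vector<OctreeNode> nodes;  // nodes[0] is the root
};
```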

3 Pre-integrated Voxel Cone Tracing

Our approach approximates the result of the visibility, energy and NDF estimation for a bundle of rays in a cone using only a single ray and our filtered (mipmapped) voxel structure. The idea is to perform volume integration steps along the cone axis with lookups in our hierarchical representation at the LOD corresponding to the local cone radius. During this step, we use quadrilinear interpolation to ensure a smooth LOD variation, similarly to anti-aliasing filtering [Crassin et al. 2009]. Our voxel shading convolves the BRDF, the NDF, the distribution of light directions and the span of the view cone, all considered as Gaussian lobes. These lobes are reconstructed from direction distributions stored in a compact way as non-normalized vectors in the structure.
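The following sketch makes the integration loop concrete: it steps along the cone axis, picks the mipmap LOD whose voxel footprint matches the local cone diameter, and composites samples front-to-back. Here sampleOctree() stands in for the quadrilinearly interpolated lookup into the structure of Section 2; the step size, the unit voxel size at the finest level, and the early-exit threshold are assumptions.

```cpp
#include <algorithm>
#include <cmath>

struct Sample { float rgb[3]; float alpha; };

// Stub for the quadrilinearly interpolated lookup (position + fractional LOD).
static Sample sampleOctree(const float pos[3], float lod) {
    (void)pos; (void)lod;
    return {{0.1f, 0.1f, 0.1f}, 0.05f};
}

// Marches one cone of half-angle `aperture` (radians) through the voxel
// mipmap and accumulates radiance front-to-back into outRgb.
void coneMarch(const float origin[3], const float dir[3],
               float aperture, float maxDist, float outRgb[3]) {
    float dist = 0.01f;   // small start offset to avoid self-sampling
    float alpha = 0.0f;   // opacity accumulated along the cone so far
    outRgb[0] = outRgb[1] = outRgb[2] = 0.0f;

    while (dist < maxDist && alpha < 0.99f) {
        // Cone diameter grows linearly with distance; choose the LOD whose
        // voxel size matches it (finest voxels assumed to have size 1).
        float diameter = std::max(2.0f * dist * std::tan(aperture), 1.0f);
        float lod = std::log2(diameter);

        float p[3] = { origin[0] + dir[0] * dist,
                       origin[1] + dir[1] * dist,
                       origin[2] + dir[2] * dist };
        Sample s = sampleOctree(p, lod);

        // Front-to-back compositing: new samples are attenuated by the
        // opacity already accumulated closer to the cone apex.
        for (int i = 0; i < 3; ++i)
            outRgb[i] += (1.0f - alpha) * s.alpha * s.rgb[i];
        alpha += (1.0f - alpha) * s.alpha;

        dist += diameter * 0.5f;  // step proportional to the local cone radius
    }
}
```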

4 Anisotropic Pre-Integration

In order to get a higher-quality visibility estimation and to limit light leaking when low-resolution mipmap levels are used, we propose to rely on an anisotropic pre-integration of voxel values, stored in a direction-dependent way in the structure. We use the 6 main directions, and values are reconstructed by linear interpolation of the 3 directions closest to the sampling direction. This provides a better approximation of the volume rendering integral, at the cost of 1.75x the storage requirement and a slightly slower sampling.
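As an illustration of this direction-dependent reconstruction, the sketch below stores 6 pre-integrated values per voxel (one per main axis direction) and blends the 3 copies selected by the sign of the sampling direction, weighted by its absolute components. The storage order and the sign convention are assumptions, not the paper's exact layout.

```cpp
#include <cmath>

// One pre-integrated value per main direction; order assumed +X,-X,+Y,-Y,+Z,-Z.
struct AnisoVoxel { float value[6]; };

// Reconstructs the voxel value seen along (unit) direction d by linearly
// interpolating the 3 directional copies closest to d, weighted by |d|.
float anisotropicSample(const AnisoVoxel& v, const float d[3]) {
    float wx = std::fabs(d[0]), wy = std::fabs(d[1]), wz = std::fabs(d[2]);
    float vx = v.value[d[0] >= 0.0f ? 0 : 1];
    float vy = v.value[d[1] >= 0.0f ? 2 : 3];
    float vz = v.value[d[2] >= 0.0f ? 4 : 5];
    // Normalize by the total weight so the result stays in the value range.
    return (wx * vx + wy * vy + wz * vz) / (wx + wy + wz);
}
```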

References

CRASSIN, C., NEYRET, F., LEFEBVRE, S., AND EISEMANN, E. 2009. GigaVoxels: Ray-guided streaming for efficient and detailed voxel rendering. In Proceedings of the ACM SIGGRAPH Symposium on Interactive 3D Graphics and Games (I3D).
