
04. The Rendering Pipeline


Video Game Development: The Rendering Pipeline A. Babadi 1 of 22

In The Name Of God

Video Game Development

Amin Babadi

Department of Electrical and Computer Engineering

Isfahan University of Technology

Spring 2015

The Rendering Pipeline

Outline

Graphics primitives

Video random access memory (VRAM)

A review of the rendering pipeline

Functions performed by modern 3D graphics hardware

Graphics Processors

A scene is composed of many separate objects.

Each object is represented by a set of vertices and a particular type of graphics primitive that indicates how the vertices are connected to produce a shape.

Most of the time, the surface of a 3D model is represented by a list of triangles.
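A triangle-list surface of this kind can be stored as a shared vertex array plus a list of index triples, as in this minimal sketch (the types and names are illustrative, not from the slides):

```cpp
#include <array>
#include <cstdint>
#include <vector>

// A vertex position in the model's own (object-space) coordinates.
struct Vertex { float x, y, z; };

// An indexed triangle mesh: each triangle stores three indices into the
// shared vertex array, so vertices on shared edges are not duplicated.
struct TriangleMesh {
    std::vector<Vertex> vertices;
    std::vector<std::array<uint32_t, 3>> triangles;
};

// A unit quad built from two triangles sharing the edge (0, 2).
TriangleMesh makeQuad() {
    TriangleMesh m;
    m.vertices  = {{0, 0, 0}, {1, 0, 0}, {1, 1, 0}, {0, 1, 0}};
    m.triangles = {{0, 1, 2}, {0, 2, 3}};
    return m;
}
```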

Graphics Primitives

The OpenGL library defines ten types of graphics primitives. The numbers in the figure indicate the order in which the vertices are specified for each primitive type.

Graphics Processing Unit (GPU)

A typical modern 3D graphics board possesses a GPU that executes instructions independently of the CPU.

The CPU sends rendering commands to the GPU, which then performs the rendering operations while the CPU continues with other tasks.

CPU-GPU Communication

The communications that take place between the CPU and GPU.

Video Random Access Memory (VRAM)

A 3D graphics board has its own memory core, which is commonly called VRAM.

The most important data stored in VRAM are the front and back image buffers.

The front image buffer contains the exact pixel data that is visible in the viewport.

The back image buffer is the location to which the GPU actually renders a scene.

Buffer Swap

Once an image has been completely rendered, the front and back image buffers are exchanged. This operation is called a buffer swap.
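The swap can be modeled as a pointer exchange between two buffers; nothing is copied. This is an illustrative sketch (the struct and names are not from the slides):

```cpp
#include <cstdint>
#include <utility>
#include <vector>

// Double buffering: the GPU renders into the back buffer while the front
// buffer is displayed. A buffer swap exchanges the two roles by swapping
// pointers, not pixel data.
struct FrameBuffers {
    std::vector<uint32_t> a, b;        // pixel storage for the two buffers
    std::vector<uint32_t>* front = &a; // currently displayed
    std::vector<uint32_t>* back  = &b; // currently being rendered into
    void swap() { std::swap(front, back); }
};
```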

Screen Tearing

The buffer swap is often synchronized with the refresh frequency of the display to avoid an artifact known as tearing. The shot is taken from Splinter Cell: Blacklist (2013).

Z-Buffer

Also stored in VRAM is a block of data called the depth buffer or Z-buffer, which records, for each pixel, the depth of the closest surface rendered so far and is used to determine which surfaces are visible.

The camera's local coordinate system (axes X, Y, and Z).
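A toy per-pixel depth test, assuming smaller z means closer and 1.0 is the far plane (names are illustrative, not from the slides):

```cpp
#include <cstddef>
#include <vector>

// Depth (Z-) buffer sketch: a fragment is written only if it is closer
// than the depth already stored for its pixel; passing fragments record
// their own depth so later, farther fragments are rejected.
struct DepthBuffer {
    std::vector<float> depth;
    explicit DepthBuffer(std::size_t pixelCount) : depth(pixelCount, 1.0f) {}

    // Returns true (and stores z) when the fragment passes the depth test.
    bool testAndSet(std::size_t pixel, float z) {
        if (z < depth[pixel]) {
            depth[pixel] = z;
            return true;
        }
        return false;
    }
};
```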

Stencil Buffer

An application may request that a stencil buffer be created along with the image buffers and the depth buffer.

The stencil buffer contains an integer mask for each pixel in the image buffer that can be used to enable or disable drawing on a per-pixel basis.
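One common stencil-test mode, comparing masked bits of the per-pixel stencil value against a reference, can be sketched as follows (real hardware offers several comparison modes; this shows only an equality test):

```cpp
#include <cstdint>

// Stencil test sketch: drawing is enabled for a pixel only where the
// masked bits of its stencil value equal the masked reference value.
bool stencilPasses(uint8_t stencilValue, uint8_t mask, uint8_t reference) {
    return (stencilValue & mask) == (reference & mask);
}
```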

Texture Mapping

VRAM usage is usually dominated by texture maps.

A 3D model shown (1) without textures and (2) with textures.

Vertex Transformation

The coordinate spaces appearing in the rendering pipeline.

Local, World & Camera Space

The vertices of a model are typically stored in object space, a coordinate system that is local to the particular model and used only by that model.

The position and orientation of each model are often stored in world space, a global coordinate system that ties all of the object spaces together.

Before an object can be rendered, its vertices must be transformed into camera space (also called eye space), the space in which the X and Y axes are aligned to the display and the Z axis is parallel to the viewing direction.
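The chain object space → world space → camera space can be sketched with translation-only matrices. This is a simplification for illustration: real model and view matrices also encode rotation and scale, and the specific positions below are made up.

```cpp
#include <array>

// Row-major 4x4 matrices acting on column vectors (x, y, z, w).
using Vec4 = std::array<float, 4>;
using Mat4 = std::array<std::array<float, 4>, 4>;

// Identity matrix with a translation in the last column.
Mat4 translation(float tx, float ty, float tz) {
    Mat4 m{};
    for (int i = 0; i < 4; ++i) m[i][i] = 1.0f;
    m[0][3] = tx; m[1][3] = ty; m[2][3] = tz;
    return m;
}

Vec4 mul(const Mat4& m, const Vec4& v) {
    Vec4 r{};
    for (int i = 0; i < 4; ++i)
        for (int j = 0; j < 4; ++j) r[i] += m[i][j] * v[j];
    return r;
}

// An object-space vertex is first placed in the world by the model matrix,
// then transformed by the view matrix (the inverse of the camera's world
// placement) into camera space.
Vec4 toCameraSpace(const Vec4& objectVertex) {
    Mat4 model = translation(2.0f, 0.0f, 0.0f);  // object's world placement
    Mat4 view  = translation(0.0f, 0.0f, -5.0f); // camera at world z = +5
    return mul(view, mul(model, objectVertex));
}
```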

An Example

Homogeneous Clip Space

Once a model’s vertices have been transformed into camera space, they undergo a projection transformation that has the effect of applying perspective so that geometry becomes smaller as the distance from the camera increases.

In homogeneous clip space, the X, Y, and Z coordinates of each vertex fall in the range [−1, 1], and graphics primitives are clipped to the boundaries of the visible region of the scene, ensuring that no attempt is made to render any part of a primitive that falls outside the viewport.
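Before the perspective divide, a homogeneous vertex (x, y, z, w) lies inside the view volume when each of x, y, z falls in [−w, w]; dividing by w then yields coordinates in [−1, 1]. A sketch of both steps (names illustrative):

```cpp
#include <array>
#include <cmath>

// A vertex in homogeneous clip space, after the projection transform.
struct Clip { float x, y, z, w; };

// Containment test applied before the divide: |x|, |y|, |z| <= w.
bool insideViewVolume(const Clip& v) {
    return std::fabs(v.x) <= v.w &&
           std::fabs(v.y) <= v.w &&
           std::fabs(v.z) <= v.w;
}

// Perspective divide: produces normalized device coordinates in [-1, 1]
// for vertices inside the view volume.
std::array<float, 3> perspectiveDivide(const Clip& v) {
    return {v.x / v.w, v.y / v.w, v.z / v.w};
}
```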

Window Space

The vertices must undergo one more transformation, called the viewport transformation, that maps the normalized coordinates to the actual range of pixel coordinates covered by the viewport.

The z coordinate is usually mapped to the floating-point range [0, 1].

After the viewport transformation, vertex positions are said to lie in window space.
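The viewport transformation can be sketched directly from the ranges above: x and y are mapped from [−1, 1] to the viewport's pixel rectangle, and z from [−1, 1] to [0, 1] (parameter names are illustrative):

```cpp
// Window-space position produced by the viewport transformation.
struct WindowCoord { float x, y, z; };

// Maps normalized device coordinates to a viewport whose lower-left
// corner is (x0, y0) with the given width and height in pixels.
WindowCoord viewportTransform(float ndcX, float ndcY, float ndcZ,
                              float x0, float y0,
                              float width, float height) {
    return {
        x0 + (ndcX + 1.0f) * 0.5f * width,
        y0 + (ndcY + 1.0f) * 0.5f * height,
        (ndcZ + 1.0f) * 0.5f, // depth remapped to [0, 1]
    };
}
```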

Rasterization and Fragment Operations

The process of converting a graphics primitive to a set of fragments.

Face Culling

An application may specify that face culling be performed as the first stage of this process.

Face culling is employed as an optimization that skips polygons facing away from the camera.
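A common backface test computes the triangle's signed area in window space. The sign convention below assumes counterclockwise front faces with the y axis pointing up; that convention is an assumption of this sketch, not something stated in the slides.

```cpp
// 2D window-space vertex position.
struct P2 { float x, y; };

// Signed area of triangle (a, b, c): positive when the vertices are in
// counterclockwise order.
float signedArea(P2 a, P2 b, P2 c) {
    return 0.5f * ((b.x - a.x) * (c.y - a.y) - (c.x - a.x) * (b.y - a.y));
}

// Cull back-facing (clockwise) and degenerate (zero-area) triangles.
bool shouldCull(P2 a, P2 b, P2 c) {
    return signedArea(a, b, c) <= 0.0f;
}
```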

Rasterization and Fragment Shading

The GPU calculates the depth, interpolated vertex colors, and interpolated texture coordinates for each pixel. This information, combined with the location of the pixel itself, is called a fragment.

The process of filling in the horizontal spans of pixels belonging to a primitive is called rasterization.

A graphics application specifies how the fragment data is used to determine the final color and final depth of each pixel during rasterization. This process is called fragment shading or pixel shading.

Fragment Operations

Most fragment operations determine whether a fragment should be drawn to the viewport or discarded altogether.

Most GPUs perform as many tests as possible before performing fragment shading calculations to avoid spending time figuring out the colors of fragments that will ultimately be discarded.

After fragment operations, the fragment’s final color is blended into the image buffer.
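One widely used blending mode is the standard "over" blend, where the fragment's alpha weights its color against the color already in the image buffer (the mode choice here is illustrative; applications configure blending explicitly):

```cpp
// RGB color with components in [0, 1].
struct Color { float r, g, b; };

// "Over" blending: result = src * alpha + dst * (1 - alpha).
Color blendOver(Color src, float srcAlpha, Color dst) {
    float inv = 1.0f - srcAlpha;
    return {src.r * srcAlpha + dst.r * inv,
            src.g * srcAlpha + dst.g * inv,
            src.b * srcAlpha + dst.b * inv};
}
```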

References

Lengyel’s textbook,

Wikipedia, and

Some other sources on the Internet.