Jan Kolomazník

Email: kolomaznik (at) cgg.mff.cuni.cz

Realtime Graphics on GPU (NPGR019) - summer 2023/2024

A lecture on real-time 3D graphics supported by GPUs.

Lab practices overview

Lab practices start in the second week (March 1st).

In order to pass the lab practices, you need to pick 2 assignments from the pool below, implement them, and submit them via e-mail by 30. 6. 2024, i.e., by the end of the summer semester examination period. Note that you need to obtain the grade for the practicals before you can sign up for the exam. Use a separate message for each project with a meaningful subject such as "NPGR019 - {name of the project}". I will send you an e-mail confirming delivery, followed by further questions (if any) and the result, or I may ask you for further improvements or fixes.


Sources for the lab tutorials: GitHub repository
  • Lab #1 (1st March):

    • Setup and compilation
    • Error handling
    • Resource management
    • Defining geometry
    • Simple shaders

  • Lab #2 (8th March):

    • Index buffer
    • Camera setup
    • Scene + renderer

  • Lab #3 (15th March):

    • Loading geometry
    • Using textures
    • Experiments with shaders and Z-buffer

  • Lab #4 (22nd March):

    • Experiments with deferred shading - compositing and shadow mapping preparation
    • The example 04_deffered now contains the simplest variant of the compositing step with shadow mapping
    • Play with the light position, shadow bias, etc., and see how they influence the shadow quality

  • Lab #5 (5th April):

    • Advanced shaders - material shaders using tangent space, geometry and tessellation shaders
    • Example 05_shaders

  • Lab #6 (12th April):

    • Experiments with 3D textures, raycasting, and maximum intensity projection (MIP)
    • Example 06_3d_textures
    • Download data here

Semester project assignments

  • Particle system:

    Implement a simple particle system that will represent some effect such as fire, smoke, etc. (be creative). Render the particles as sprites with an animated texture. Particles should be lit by a scene light.
    Requirements:

    • Simulated particle effect using GPU sprites (billboards)
    • Particles should be animated (both the particle's motion and its texture in UV space)
    • Particles should be affected by scene light
    Recommended approach:
    • Take the 05_shaders example and implement the particle object as a new type of scene object, in a similar way to the InstancedCube
    • Use instancing for the particles, or just a buffer of vertices rendered as GL_POINTS, expanding each particle into a quad in a geometry shader (see the sketch after this list)
    • Use a pregenerated 3D texture for the particle animation, or use example 07_noise to generate the texture procedurally.
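
    A minimal GLSL sketch of the geometry-shader variant follows: it expands each GL_POINTS vertex into a camera-facing quad (billboard). The uniform and varying names (uViewProjection, uCameraRight, uCameraUp, uParticleSize, gsTexCoord) are placeholders, not names from the lab framework.

        #version 430 core
        layout(points) in;
        layout(triangle_strip, max_vertices = 4) out;

        // Placeholder uniforms -- adapt these to the framework's naming.
        uniform mat4  uViewProjection;
        uniform vec3  uCameraRight;   // camera right vector in world space
        uniform vec3  uCameraUp;      // camera up vector in world space
        uniform float uParticleSize;

        out vec2 gsTexCoord;

        void main() {
            // Assumes the vertex shader passes the world-space particle
            // position through gl_Position.
            vec3 center = gl_in[0].gl_Position.xyz;
            // Emit the four billboard corners as one triangle strip.
            for (int i = 0; i < 4; ++i) {
                vec2 corner = vec2(i & 1, i >> 1) * 2.0 - 1.0; // (-1,-1),(1,-1),(-1,1),(1,1)
                vec3 worldPos = center
                    + uCameraRight * (corner.x * uParticleSize)
                    + uCameraUp    * (corner.y * uParticleSize);
                gl_Position = uViewProjection * vec4(worldPos, 1.0);
                gsTexCoord = corner * 0.5 + 0.5; // [0,1] UVs; offset these to animate the texture
                EmitVertex();
            }
            EndPrimitive();
        }

    Lighting can then be applied in the fragment shader the same way as for the other scene objects, using the scene light uniforms already passed to them.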

  • Screen Space Ambient Occlusion:

    Screen Space Ambient Occlusion (SSAO) is a very popular method for approximating the decrease of light intensity in corners and crevices. Such an effect is normally produced by global illumination, which is, however, impractical for real-time rendering, hence the approximation. The easiest and most straightforward way of doing this is to take the scene depth buffer and compute the depth difference of each texel against an average of samples around it (see the sketch after this list).
    Requirements:

    • Create a sufficiently complex scene that nicely shows the effect (with some corners, etc.)
    • Implement the most basic SSAO effect using just the scene depth buffer, random sampling and blurring
    • Apply the effect to the rendered scene
    Recommended approach:
    • Take the 04_deffered example as a starting point
    • You need an additional postprocessing step, so the compositing step also renders to a texture
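
    A minimal sketch of the depth-only sampling pass, assuming a full-screen quad and placeholder names (uDepth, uNoise, uRadius):

        #version 430 core
        // Naive depth-buffer SSAO: compare each texel's depth against random
        // samples in its neighbourhood. All names here are placeholders.
        uniform sampler2D uDepth;   // scene depth buffer
        uniform sampler2D uNoise;   // small tiling random-vector texture
        uniform float     uRadius;  // sampling radius in texels

        in vec2 fsTexCoord;
        out float occlusion;

        void main() {
            vec2 texel = 1.0 / vec2(textureSize(uDepth, 0));
            float center = texture(uDepth, fsTexCoord).r;
            float occluded = 0.0;
            const int SAMPLES = 16;
            for (int i = 0; i < SAMPLES; ++i) {
                // Random offset per sample; tiling the noise texture breaks banding.
                vec2 rnd = texture(uNoise, fsTexCoord * 4.0 + float(i) * 0.07).rg * 2.0 - 1.0;
                float sampleDepth = texture(uDepth, fsTexCoord + rnd * uRadius * texel).r;
                // Count samples noticeably closer to the camera than the center texel.
                occluded += step(0.001, center - sampleDepth);
            }
            // 1 = fully visible, 0 = fully occluded; blur this in a separate
            // pass before multiplying it into the composited scene color.
            occlusion = 1.0 - occluded / float(SAMPLES);
        }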

  • Cascaded shadow maps

    Implement a simple scene that uses a directional light to cast shadows. A common extension of the basic shadow map, used to fight perspective aliasing, is called Cascaded Shadow Maps: the scene is rendered into several shadow maps chosen by distance from the camera. The scene you create should be sufficiently large, e.g., a long alley of poles, and should provide varied geometry to show how you dealt with common shadow map artifacts like shadow acne and Peter Panning. It should contain at least some spheres, cylinders, or other curved surfaces.
    Recommended approach:

    • Take the 04_deffered example as a starting point, but it is not required to implement it using deferred shading - you can just take the shadow pass
    • Instead of a single shadow map you will have to create a shadow cascade (see the lookup sketch after this list)
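
    A GLSL sketch of the cascade lookup during shading, assuming the cascades are stored in a depth array texture; all names (uLightViewProjection, uCascadeSplits, uShadowCascade, uShadowBias) are placeholders:

        const int CASCADE_COUNT = 4;
        uniform mat4  uLightViewProjection[CASCADE_COUNT]; // one crop matrix per cascade
        uniform float uCascadeSplits[CASCADE_COUNT];       // far bound of each cascade
        uniform sampler2DArrayShadow uShadowCascade;       // depth-comparison sampler
        uniform float uShadowBias;

        float shadowFactor(vec3 worldPos, float viewDepth) {
            // Pick the first cascade whose split plane is beyond this fragment.
            int cascade = CASCADE_COUNT - 1;
            for (int i = 0; i < CASCADE_COUNT; ++i) {
                if (viewDepth < uCascadeSplits[i]) { cascade = i; break; }
            }
            vec4 lightPos = uLightViewProjection[cascade] * vec4(worldPos, 1.0);
            vec3 proj = lightPos.xyz / lightPos.w * 0.5 + 0.5; // to [0,1] shadow map space
            // sampler2DArrayShadow takes (u, v, layer, reference depth).
            return texture(uShadowCascade, vec4(proj.xy, float(cascade), proj.z - uShadowBias));
        }

    Splitting the view frustum logarithmically between the near and far plane is a common way to choose the uCascadeSplits values.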

  • Simulation of depth of field:

    In this assignment, you will implement a Depth of Field (DoF) post-processing effect for a deferred shading renderer. The objective is to simulate camera focus in a 3D scene, making objects outside a certain range appear blurred, while keeping those within the range sharp and in focus. This effect enhances the visual realism and can be used to direct the viewer's attention to specific parts of the scene.
    Implement the DoF effect as a post-processing step in the deferred shading renderer. You will need to:

    • Calculate the blur amount for each fragment based on its depth and a predefined focus range.
    • Apply a blur effect where necessary. The blur should be more pronounced for fragments further away from the focus range.
    Recommended approach:
    • Take the 04_deffered example as a starting point
    • You need an additional postprocessing step, so the compositing step also renders to a texture (see the sketch after this list)
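
    A minimal sketch of the post-processing pass, assuming a linearized depth texture and placeholder uniform names (uColor, uDepth, uFocusDistance, uFocusRange, uMaxBlurTexels):

        #version 430 core
        uniform sampler2D uColor;       // composited scene color
        uniform sampler2D uDepth;       // linearized view-space depth
        uniform float uFocusDistance;   // center of the in-focus range
        uniform float uFocusRange;      // half-width of the in-focus range
        uniform float uMaxBlurTexels;   // cap on the blur radius

        in vec2 fsTexCoord;
        out vec4 outColor;

        void main() {
            vec2 texel = 1.0 / vec2(textureSize(uColor, 0));
            float depth = texture(uDepth, fsTexCoord).r;
            // Circle of confusion: 0 inside the focus range, growing to 1 outside it.
            float coc = clamp(abs(depth - uFocusDistance) / uFocusRange - 1.0, 0.0, 1.0);
            float radius = coc * uMaxBlurTexels;
            // Naive 7x7 box blur scaled by the per-pixel radius; a separable
            // Gaussian would be the usual refinement.
            vec4 sum = vec4(0.0);
            for (int x = -3; x <= 3; ++x) {
                for (int y = -3; y <= 3; ++y) {
                    sum += texture(uColor, fsTexCoord + vec2(x, y) * texel * radius / 3.0);
                }
            }
            outColor = sum / 49.0;
        }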

  • Depth peeling:

    Depth peeling is an advanced rendering technique used to accurately display translucent objects in 3D scenes. Unlike traditional transparency methods, which can lead to sorting issues and inaccurate blending, depth peeling renders multiple overlapping translucent surfaces correctly by peeling away and rendering layers of geometry in sequence. In this task, you will implement depth peeling in an existing renderer to improve the rendering of translucent materials.
    Extend the renderer to support depth peeling. This includes:

    • Creating multiple framebuffer objects (FBOs) for peeling layers.
    • Implementing a sequence of rendering passes where each pass peels away one layer of geometry based on depth, rendering it to a texture.
    • Blending these layers back-to-front to achieve the final image.
    Recommended approach:
    • You will need your own custom renderer, but you can take the 04_deffered example as an inspiration for the framebuffer manipulation (see the shader sketch after this list)
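
    A sketch of the fragment shader for one peeling pass; uPrevDepth is a placeholder name for the depth texture produced by the previous pass:

        #version 430 core
        // Discard everything that is not strictly behind the layer peeled in
        // the previous pass; the regular depth test then keeps the nearest
        // remaining layer, i.e., the next one to peel.
        uniform sampler2D uPrevDepth;

        in vec4 fsColor; // shaded translucent color of this fragment
        out vec4 outColor;

        void main() {
            float prev = texelFetch(uPrevDepth, ivec2(gl_FragCoord.xy), 0).r;
            // A small epsilon avoids re-peeling the same surface due to precision.
            if (gl_FragCoord.z <= prev + 1e-6) {
                discard;
            }
            outColor = fsColor;
        }

    The first pass uses a previous-depth texture cleared to zero (so nothing is discarded); after the last pass, the stored layers are blended back-to-front with standard alpha blending.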

  • Scientific volumetric visualization:

    Extend the provided ray-casting renderer to support advanced rendering modes based on integration along the ray.
    Implement an integration scheme that accumulates color and opacity along each ray. This should take into account the material properties of the voxels, such as color and transparency.
    Use the visibility integration to render semi-transparent materials, allowing for the visualization of internal structures within the volumetric data.
    Examples:

    • Simulation of X-ray imaging from CT data.
    • Application of transfer functions.
    Recommended approach:
    • Use the mip shader in 06_3d_textures as a starting point and implement the integration of opacity and color along the ray (see the compositing sketch after this list).
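
    A GLSL sketch of the front-to-back compositing loop, assuming placeholder names (uVolume, uTransferFunction, uStepCount) and a transfer function stored in a 1D texture:

        uniform sampler3D uVolume;           // scalar volume data
        uniform sampler1D uTransferFunction; // density -> (color, opacity)
        uniform int uStepCount;

        vec4 integrate(vec3 rayStart, vec3 rayEnd) {
            vec3 delta = (rayEnd - rayStart) / float(uStepCount);
            vec4 accum = vec4(0.0); // premultiplied color in rgb, opacity in a
            vec3 pos = rayStart;
            for (int i = 0; i < uStepCount; ++i) {
                float density = texture(uVolume, pos).r;
                vec4 src = texture(uTransferFunction, density); // material color + alpha
                // Front-to-back "over" operator with premultiplied alpha.
                accum.rgb += (1.0 - accum.a) * src.a * src.rgb;
                accum.a   += (1.0 - accum.a) * src.a;
                if (accum.a > 0.99) break; // early ray termination
                pos += delta;
            }
            return accum;
        }

    An X-ray-like image instead sums the densities along the ray and maps the total through a brightness curve, without any opacity accumulation.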

  • Implicit surface rendering:

    Ray-casting can also be used to search for intersections with user-specified implicit surfaces. Let the user specify an implicit surface in the form F(x, y, z) = 0. While following the ray, evaluate the expression in the shader and find the point of the zero crossing. For correct shading of the surface you will need the surface normal; its direction is the same as the direction of the gradient vector. You can let the user also specify formulas for the partial derivatives, or you can compute the gradient numerically.
    Recommended approach:

    • Use the sphere_raycast shader in 06_3d_textures as a starting point and implement the isosurface intersection search for a general implicit surface (see the sketch below).
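
    A GLSL sketch of the zero-crossing search and the numerical gradient; F() is hard-coded to a unit sphere here, whereas the assignment asks for a user-supplied expression:

        float F(vec3 p) {
            return dot(p, p) - 1.0; // example surface: x^2 + y^2 + z^2 - 1 = 0
        }

        vec3 gradientF(vec3 p) {
            const float h = 1e-3; // central differences
            return vec3(
                F(p + vec3(h, 0, 0)) - F(p - vec3(h, 0, 0)),
                F(p + vec3(0, h, 0)) - F(p - vec3(0, h, 0)),
                F(p + vec3(0, 0, h)) - F(p - vec3(0, 0, h))) / (2.0 * h);
        }

        // March along the ray; on a sign change of F, refine the hit by bisection.
        bool findZeroCrossing(vec3 rayStart, vec3 rayDir, float tMax, out vec3 hit) {
            const int STEPS = 256;
            float dt = tMax / float(STEPS);
            float prev = F(rayStart);
            for (int i = 1; i <= STEPS; ++i) {
                float t = float(i) * dt;
                float curr = F(rayStart + t * rayDir);
                if (prev * curr < 0.0) {
                    float a = t - dt, b = t;
                    for (int j = 0; j < 8; ++j) {
                        float m = 0.5 * (a + b);
                        if (prev * F(rayStart + m * rayDir) < 0.0) b = m; else a = m;
                    }
                    hit = rayStart + 0.5 * (a + b) * rayDir;
                    return true;
                }
                prev = curr;
            }
            return false;
        }

    The surface normal for shading is then normalize(gradientF(hit)).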