This is a new course, NPGR042, that will be taught for the first time in the Summer Semester 2025/2026. Please enroll in SIS!
Study materials
Links to slides are available on the course page in SIS (you have to log in).
Source code and other files that I showed during the lectures are available on our GitLab (you have to log in).
Schedule
| Week | Date | Topics |
| --- | --- | --- |
| 1 | 19.2. | Introduction to shading: topics you can expect in NPGR042, rendering vs. shading, from shading to shaders |
| 2 | 26.2. | Introduction to shading (continued); procedural shading and textures; labs |
| 3 | 5.3. | Procedural shading and textures (continued): random numbers (PRNG), linear congruential generator (LCG), value noise, gradient noise / Perlin noise, hashing in shaders, fractal Brownian motion (fBM), Voronoi diagrams, combining random noise and Voronoi cells, brief mention of reaction–diffusion systems |
| 4 | 12.3. | Light–surface interactions & BRDF |
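To give a flavor of the Week 3 topics, here is a common GLSL sketch of lattice hashing, value noise, and fBM. This is one of many textbook variants (the hash constants are conventional, not canonical) and is not required for the assignments:

```glsl
// A common 2D hash built from a large-constant sine product (one of many variants).
float hash21(vec2 p) {
    return fract(sin(dot(p, vec2(127.1, 311.7))) * 43758.5453);
}

// Value noise: bilinearly interpolate hashed lattice values with a smoothstep fade.
float valueNoise(vec2 p) {
    vec2 i = floor(p);
    vec2 f = fract(p);
    vec2 u = f * f * (3.0 - 2.0 * f);        // smoothstep fade curve
    float a = hash21(i);
    float b = hash21(i + vec2(1.0, 0.0));
    float c = hash21(i + vec2(0.0, 1.0));
    float d = hash21(i + vec2(1.0, 1.0));
    return mix(mix(a, b, u.x), mix(c, d, u.x), u.y);
}

// Fractal Brownian motion: sum octaves of noise with doubling frequency, halving amplitude.
float fbm(vec2 p) {
    float sum = 0.0;
    float amp = 0.5;
    for (int i = 0; i < 5; i++) {
        sum += amp * valueNoise(p);
        p *= 2.0;
        amp *= 0.5;
    }
    return sum;
}
```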
Home assignments
Assignment 1
Overview
Your first goal is to find a real material around you, take photographs of it from different angles, and then try to replicate its patterns in two shaders.
Appropriate materials
Choose a material that can be used on a flat surface. The material needs to have both a regular pattern (grid, cells, lines, chevrons, etc.) and a stochastic pattern (random noise, dots, lines, gradients that are not visually repetitive). Here are examples of such materials that I randomly photographed at the university:

Make sure to choose your own material and take at least 3 photographs of it at varying angles and distances.
Implementation
Your goal is to implement the material in a shader. The virtual material should be visually similar to, or at least strongly inspired by, the real photos you took. For now, we are only interested in replicating the color texture, not any other properties (glossiness, bumps, etc. will be explored later in Assignment 2).
You must implement the same material twice:
- Once by programming a GLSL fragment shader suitable for real-time rendering. For that, use the provided Three.js template in which you can edit the fragment shader and GUI.
- Once in Blender, using shading node graphs. I strongly recommend using the visual nodes and built-in functions (checkers, bricks, Voronoi, noise, color ramp, mix, etc.). You may use Open Shading Language (OSL) for scripting if you want to, but it is not required.
It is up to you whether you start in Blender and then replicate the look in GLSL, or start in GLSL and then replicate it in Blender.
Additional criteria:
- You must expose at least 3 user-editable parameters.
- At least one of them has to be a color (e.g., base color of the material).
- At least one of them has to be related to size (e.g., size of the patterns).
- In Three.js, please use the GUI class to make the parameters accessible.
- In Blender, please use Group Nodes and their input sockets to make the parameters accessible.
Template
Please visit the study-materials repository. I strongly recommend starting with checkerboard.html (older GLSL versions) or checkerboard_glsl3.html (GLSL 3+), but you might also choose a different starting point (for example one with noise).
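For orientation, the checkerboard template implements something along these lines. This is a simplified sketch in older-style GLSL; the uniform and varying names are illustrative and may differ from the actual template:

```glsl
// Simplified checkerboard fragment shader (older GLSL style, as in checkerboard.html).
// Uniform names here are assumptions, not necessarily those used in the template.
uniform vec3 colorA;      // user-editable color parameter
uniform vec3 colorB;
uniform float cellSize;   // user-editable size parameter

varying vec2 vUv;         // UV coordinates interpolated from the vertex shader

void main() {
    vec2 cell = floor(vUv / cellSize);
    float checker = mod(cell.x + cell.y, 2.0);   // 0.0 or 1.0, alternating per cell
    gl_FragColor = vec4(mix(colorA, colorB, checker), 1.0);
}
```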
Submission
Prepare the following:
- .html file with the Three.js/GLSL shader based on the provided template
- .blend file with a simple plane with the Blender material assigned
- .md or .pdf report with:
- your material photos
- screenshots from Three.js and Blender
- brief explanation of how your shaders work
- declaration whether you used any AI assistants and how much of the work was your own
Upload these to a git repository under gitlab.mff.cuni.cz – each student will get a private repository created by me.
Deadline
The submission deadline has been extended to Sunday, March 15, 23:59. Please ensure you have committed and pushed your changes by that time.
Grading
- You get 100% if you meet the full assignment instructions.
- You can get an extra 25% bonus if your solution is exceptionally good.
- I will apply a 50% deduction for each important requirement that is not followed.
Example: You only implemented the shader in GLSL and did not expose any user-editable parameters → you receive 50% × 50% = 25%.
Assignment 2
Overview
In this assignment, we will work with view-dependent appearance; in other words, a material model whose appearance changes based on the illumination and camera angles. The goal is to implement a material whose roughness, specularity, metallicity, etc. vary across the material’s texture (see below). If possible, reuse the material from Assignment 1 that you already photographed. If not, photograph a new suitable material.
Real example
Consider how different parts of the following material from Assignment 1 look different when the view angle changes:
While the patterns are rough and look more diffuse, the flat parts between the patterns are very polished and specular.
This is why in Assignment 1, you should have taken at least 3 photographs of your material at varying angles and distances.
Implementation
Similarly to Assignment 1, you are expected to implement the material both in Blender and in GLSL.
Blender implementation
I suggest you start with Blender, which already comes with a specular material node.
Here is an example of a simple shader graph that modulates the base color (diffuse color), specular color, and roughness of a checkerboard. Notice how the golden squares are glossier and have lower roughness than the diffuse white squares.
While roughness is a float, not a color, you can still connect color nodes to it. Blender will simply read the color channels and convert them to a single float.
One way to map the roughness is to use a Color Ramp as above, but you can also use the Map Range node, which maps a float value from range [a,b] to range [x,y]. For example, if you have a texture in [0,1] but want your roughness values to be 0.7 and 0.4, you want to map 0→0.7 and 1→0.4, which you can do as follows:
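For the GLSL half of the assignment, the same remap is a one-liner: the built-in mix() performs exactly this linear interpolation. A sketch, with the general Map Range equivalent included for completeness:

```glsl
// Texture value t in [0,1] mapped to roughness in [0.7, 0.4]:
float roughness = mix(0.7, 0.4, t);

// General Map Range equivalent: maps v from [a,b] to [x,y].
float mapRange(float v, float a, float b, float x, float y) {
    return x + (v - a) * (y - x) / (b - a);
}
```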
To achieve a reasonable appearance, keep in mind that conductors (metals) can have a colorful specular component, but their diffuse color is usually black or dark gray. For dielectrics (such as paper or plastic), you want the specular component to be colorless (grayscale only), while the diffuse color can be arbitrary.
When working with specular materials, it might be useful to switch between environment illumination (which is the default Shading preview) and scene illumination (where by default you have a point light). You can switch between the viewport modes in the top-right corner:
GLSL implementation
In GLSL, the situation is more difficult. In Assignment 1, there was no illumination and the material was just a color. That changes now: you have to implement a specular microfacet BRDF shader that takes the light and view directions into account.
I recommend having a look at diffuse.html, which implements a simple diffuse BRDF shader with a point light source whose position you can change in the GUI. You can take this shader as your baseline and then copy-paste your Assignment 1 texture code into it as the vec3 albedo component. As a result, you will get a diffuse BRDF combined with your Assignment 1 material.
Then, you will have to add a microfacet BRDF component to the diffuse one. I recommend starting with something like this:

```glsl
vec3 f_diff = albedo / PI;                    // diffuse component
vec3 f_spec = …;                              // specular microfacet component
vec3 L_o = (f_diff + f_spec) * L_i * NdotL;   // outgoing (reflected) light
```
Remember that a specular microfacet BRDF is essentially:

```glsl
vec3 f_spec = (D * G * F) / (4.0 * NdotL * NdotV);
```

To ensure you are not dividing by zero, you might want to write something like:

```glsl
vec3 f_spec = (D * G * F) / max(4.0 * NdotL * NdotV, 0.001);
```
The D, G, F components are the normal distribution function (D), the masking–shadowing function (G), and the Fresnel term (F). For D, you can use the GGX formula, which depends on roughness and NdotH (the dot product between the normal vector and the half-vector). For G, you can use the so-called Smith–GGX formula, which depends on roughness, NdotL, and NdotV, where NdotL and NdotV are the dot products of the normal vector with the light vector and the view vector, respectively. For F, you can use Schlick’s Fresnel approximation, which depends on the F0 term (the Fresnel reflectance at 0°, essentially the specular color) and VdotH (the dot product between the view vector and the half-vector).
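As one concrete example, Schlick’s approximation is commonly written like this (the D and G terms are deliberately left to your own research):

```glsl
// Schlick's Fresnel approximation: F0 is the reflectance at 0° (the specular color).
vec3 fresnelSchlick(float VdotH, vec3 F0) {
    return F0 + (1.0 - F0) * pow(1.0 - VdotH, 5.0);
}
```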
Note that diffuse.html already has the code for the N and L vectors and NdotL. You can similarly compute:

```glsl
vec3 V = normalize(cameraPosition - vWorldPos);
vec3 H = normalize(L + V);
float NdotV = max(0.0, dot(N, V));
float NdotH = max(0.0, dot(N, H));
float VdotH = max(0.0, dot(V, H));
```
There, cameraPosition is a uniform injected by Three.js (you do not have to explicitly ask for it) and vWorldPos is a variable passed from the vertex shader (that is also already included in the demo code).
Implementing the D, G, F terms is left up to you and to your research in available graphics literature.
Once you have the whole BRDF implemented, try to replicate the material from Blender and from your photographs. Please note that minor differences are perfectly fine! The goal is not to have a 1:1 replica, just something similar.
Why are my renders different between Blender and GLSL?
One of the main things you will notice is that the specular highlights in your Three.js/GLSL render are significantly sharper than in Blender, where they appear smooth. Stay calm, this is not an error! What happens is that the point light’s falloff is very sharp, and if your light is bright, the circular highlight it creates is saturated and reaches values way above 1.0. In your image, any value above 1.0 is clamped and rendered as 1.0, so your specular highlights look odd. Blender, however, applies a tone-mapping curve to its renders, which smooths out the HDR (high dynamic range) pixel values. By default, Blender 5.0 uses AgX for tone mapping. You can disable it and switch to “Standard” to get an appearance closer to your Three.js/GLSL render.
Submission
Prepare the following:
- .html file with the Three.js/GLSL shader updated from Assignment 1
- .blend file with a simple plane with the Blender material assigned
- .md or .pdf report with:
- your material photos (or stating that you use the same as in Assignment 1)
- screenshots from Three.js and Blender
- brief explanation of how your shaders work
- declaration whether you used any AI assistants and how much of the work was your own
Upload these to a git repository under gitlab.mff.cuni.cz – each student will get a private repository created by me.
Deadline
Sunday, March 29, 23:59. Please ensure you have committed and pushed your changes by that time.
Grading
- You get 100% if you meet the full assignment instructions.
- You can get an extra 25% bonus if your solution is exceptionally good.
- I will apply a 50% deduction for each important requirement that is not followed.
The remaining assignments have not been released yet.