Help in understanding an example in wgpu-rs

layout(location = 0) in vec2 v_TexCoord;
layout(location = 0) out vec4 o_Target;
layout(set = 0, binding = 1) uniform texture2D t_Color;
layout(set = 0, binding = 2) uniform sampler s_Color;

void main() {
    vec4 tex = texture(sampler2D(t_Color, s_Color), v_TexCoord);
    float mag = length(v_TexCoord-vec2(0.5));
    o_Target = vec4(mix(tex.xyz, vec3(0.0), mag*mag), 1.0);
}

I know it's not Rust, but if there is someone who knows this shading language, please help. I didn't understand the first line in the main function. I want to know what the sampler2D function is doing, and what the texture function is doing. When I searched Google for sampler2D, it shows up as a type rather than a function, and that part confuses me.
For context, I was learning from the examples of wgpu-rs and got stuck here. This code is from there.
Also, I'm completely new to this language. If you could provide any links to tutorials or recommend any books, I'd be really happy.

  1. "Take this texure2D, this sampler and this pixel coordinate and give me the color."
    • texture: rectangular pixels or rhombus voxels.
    • texture2D: rectangular pixels (2-dimensional texture).
    • sampler: describes how to get pixels from a texture, especially in situations like sub-pixels (pixels between pixels). It can refer to any type of texture (there are 2D texture and 3D textures).
    • sampler2D(): "I have a sampler that can be used on any texture, please make it apply to this specific texture2D for me".
    • texture(): "I have a sampler2D, please grab this specific pixel from it"
    • vec4: also known as color4. Each pixel has 4 components (red, green, blue, alpha). A vec4 also has 4 components (x, y, z, w), so there's no reason you can't use a vec4 to represent a color4.
  2. "Determine the distance from half the texture"
    • vec2(0.5) creates a vec2 with (x = 0.5, y = 0.5). This also refers to the middle of any texture on the GPU. The GPU does not refer to textures based on their original size (including the texture that represents your screen), all textures are 1-by-1.
    • v_TextCoord-vec2(). The distance vector to the middle of the texture.
    • length(): The length of a vector.
  3. "Gradually fade to black (vignette) from the center of the texture)"
    • tex.xyz: is a swizzle. This takes the x, y and z components from tex (which are r, g and b respectively) and gives you a new vec3 containing them.
    • mix: "Fade between the color and black, based on the percentage mag*mag". Note that it isn't really a percentage, rather it's a value between 0 and 1 (where 0 is completely the color and 1 is completely black).
    • vec4(vec3, 1.0): makes a new vec4 by taking x, y and z from the vec3 and adding 1.0 as w (alpha/opacity). See the annotated sketch below for all three steps put together.
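
Putting the three steps together, here is a minimal sketch of the same main() with the intermediate values pulled out into named variables (the variable names are mine, not from the example):

    void main() {
        // Step 1: combine the texture and the sampler, then look up the color
        // at this fragment's texture coordinate.
        vec4 tex = texture(sampler2D(t_Color, s_Color), v_TexCoord);

        // Step 2: distance from the middle of the texture (coordinates run
        // from 0 to 1, so the middle is vec2(0.5)).
        vec2 from_center = v_TexCoord - vec2(0.5);
        float mag = length(from_center);

        // Step 3: fade the color toward black the further we are from the
        // middle (mag*mag makes the falloff quadratic), keeping alpha at 1.0.
        vec3 faded = mix(tex.xyz, vec3(0.0), mag * mag);
        o_Target = vec4(faded, 1.0);
    }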

Edit: I would recommend changing the shader to this, to see what's going on:

    o_Target = texture(sampler2D(t_Color, s_Color), v_TexCoord);

Then this:

    o_Target = vec4(length(v_TexCoord-vec2(0.5)));

I'd also recommend you start with OpenGL/WebGL and not Vulkan/WebGPU.


The line in question:

vec4 tex = texture(sampler2D(t_Color, s_Color), v_TexCoord);

Uses the texture function to sample a sampler at a coordinate (in the range [0, 1]²).
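
For instance (the coordinates here are just illustrative values, not from the example):

    // Texture coordinates are normalized: (0.0, 0.0) and (1.0, 1.0) are
    // opposite corners of the texture, whatever its size in pixels.
    vec4 corner = texture(sampler2D(t_Color, s_Color), vec2(0.0, 0.0));
    vec4 center = texture(sampler2D(t_Color, s_Color), vec2(0.5, 0.5));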

Wgpu takes a different approach to this than most OpenGL tutorials, since instead of passing a sampler combined with a texture, it passes a texture and a sampler separately.

A texture contains one or more layers of texture data, and in this example is passed in as

layout(set = 0, binding = 1) uniform texture2D t_Color;

A sampler defines rules when sampling a texture, such as clamping behaviour, and interpolation functions. In this example it's passed in as

layout(set = 0, binding = 2) uniform sampler s_Color;

A sampler2D is a combination of a sampler and a texture which allows you to actually query for a value. sampler2D is indeed a type; what you're seeing there isn't a function called sampler2D, but a constructor. This is similar to how you can say

value = vec4(something);

Or

int floored = int(my_float);

OpenGL defines these rules per texture-binding, hence when using OpenGL you define your textures as only

layout(binding = X) uniform sampler2D my_texture;

However, wgpu only happens to use the OpenGL shading language (glsl for short) to write shaders. Its stable backends currently do not include OpenGL, and instead are Vulkan, Metal, DX12 and WebGPU.
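
To make the difference concrete, here is a rough side-by-side sketch (the binding numbers and names are arbitrary, not from any particular example):

    // OpenGL-style GLSL: one combined binding carries both the texture data
    // and the sampling rules.
    layout(binding = 0) uniform sampler2D u_Color;
    // in main(): vec4 tex = texture(u_Color, v_TexCoord);

    // Vulkan-style GLSL (what the wgpu example uses): separate bindings,
    // combined at the point of use with the sampler2D constructor.
    layout(set = 0, binding = 1) uniform texture2D t_Color;
    layout(set = 0, binding = 2) uniform sampler s_Color;
    // in main(): vec4 tex = texture(sampler2D(t_Color, s_Color), v_TexCoord);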


While I agree with you to some extent, there are generally better docs for Vulkan than OpenGL (apart from GLSL functions, where there aren't any in Vulkan specs as far as I can tell, and the ones for OpenGL are still lacklustre). There are more tutorials for OpenGL however, which you could follow to learn the basics about the render pipeline and math when shading.


Thanks for the help, guys. That made the concept a lot clearer to me. So, I have another question: is there any way I can write shaders directly and test them online, like the Rust playground? Compiling from Rust and checking is tedious, especially when the code panics; I have to check whether the error is in the Rust code or the shader code.

Learning to write shaders is a bit of a trial and error process. However, if you compile your shader code prior to running your application, either in a makefile like in the wgpu-rs examples or in a cargo build script, then you can retry quickly. If you just want to get a feel for what functions are available, you could always browse shadertoy. If you want to write entire shaders in an IDE-like environment, then something like SHADERed should work.

