GLSL debuggers?

Apparently, GLSL does not have a printf.

I have a Rust application using the crate "glow". It is currently running on:

  • linux / x86_64
  • wasm32-unknown-unknown / browser

Are there GLSL debugger tools for either? (The basic OpenGL / WebGL setup is fine and basic shaders run fine; the main problem is that I'm doing something funky in a fragment shader that samples a texture for data, and something is going wrong there.)

You may be looking for RenderDoc: https://renderdoc.org/. I'm not sure about debugging stuff running under wasm, but it works wonders for desktop/native applications.

RenderDoc looks interesting. Here is my problem (which in retrospect I should have mentioned earlier):

  1. I can draw a basic OpenGL triangle fine

  2. I can draw a basic texture fine

  3. I can run a custom vert/frag shader fine

  4. I can run a custom vert/frag shader using vertex attribs & uniform buffers fine

  5. I have a situation where the data my frag shader needs to sample exceeds what the uniform buffer can handle; so instead, I am stuffing this data into a texture, and sampling from the texture

  6. suddenly, I'm now getting a black screen (somewhere along the way of uploading to the texture / the shader reading back from the texture, something went wrong)

  7. making things worse, the data in the texture is not a "natural texture"; instead it's a Vec encoding a bunch of control points for a bezier curve that the fragment shader then renders

Have you thought of stuffing your data into an SSBO instead of a texture? That allows you to have structured data without the wonkiness of textures getting in your way.
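
For reference, a minimal GLSL sketch of the SSBO approach (requires GLSL 4.30+; the block name, binding point, and vec2 element type are illustrative assumptions):

// SSBO: structured data the shader can index directly (GLSL 4.30+).
// Block name, binding point, and element type are illustrative.
layout(std430, binding = 0) buffer ControlPoints {
    vec2 points[];   // unsized array; length comes from the buffer bound at runtime
};

vec2 control_point(int i) {
    return points[i];  // plain indexing, no texture sampling or normalization
}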

Are you sure then that your size and alignment are as they should be? Along with padding to make sure you supply the right amount of data to fill the texture. GPU UB is still UB.

Take care to make sure your data fits accurately into your texture's range.

IIRC, wasm doesn't have access to f32 backed storage for textures, so you must make sure to encode the data between 0.0 and 1.0 when uploading as f32, or 0 and 255 when uploading as u8/i8.

Quoting that article:

  1. SSBOs can be much larger. The OpenGL spec guarantees that UBOs can be up to 16KB in size (implementations can allow them to be bigger). The spec guarantees that SSBOs can be up to 128 MB. Most implementations will let you allocate a size up to the limit of GPU memory.

This would solve so many of my problems. Unfortunately, it says "core since OpenGL 4.3" -- is this supported in WebGL 2.0?

Ah, I see, according to a user on the gamedev Discord server, it's not available. If that's the case, I'd do the texture thing with great care.

I think I almost have it working (or at least located the bug). The texture function we can call from GLSL is texture (see the OpenGL 4 Reference Pages), which returns floats for the bytes we store. I think this is some horrendous normalization where for a byte 0 <= b <= 255, it returns the float 0.0 <= b/255.0 <= 255.0. I think the way to 'recover' this is to just round(x * 255.0). However -- I would prefer to just read the byte in the first place.

When sampling textures in WebGL, can we just read off an unsigned byte, or are we limited to reading out floats?

In case anyone is curious, this is the GLSL code for reading out a Rust u32:

// Assumed context (not shown in the original post): the shader declares
//   uniform sampler2D u_texture;  // grid x grid RGBA8 texture holding raw bytes
//   uniform int grid;             // side length of the square data texture
uint from_tex_u32(int i) {
    // Half-texel offset so we sample the center of a texel, not its edge.
    float delta = 1. / float(2*grid);
    // Map the linear index i to normalized (x, y) texture coordinates.
    float x = float(i % grid) / float(grid) + delta;
    float y = float(i / grid) / float(grid) + delta;
    vec4 c = texture(u_texture, vec2(x, y));
    // Undo the [0, 1] normalization to recover the original byte values.
    uint r = uint(round(c.r * 255.));
    uint g = uint(round(c.g * 255.));
    uint b = uint(round(c.b * 255.));
    uint a = uint(round(c.a * 255.));
    // Reassemble the little-endian u32 from its four bytes.
    return (r << 0) + (g << 8) + (b << 16) + (a << 24);
}
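
A hypothetical usage sketch on top of that: if the Rust-side Vec held f32 control-point coordinates bit-cast to u32 (f32::to_bits), they can be recovered with uintBitsToFloat (GLSL 3.30+ / ES 3.00):

// Hypothetical: recover f32 control-point coordinates that were packed
// as u32 bit patterns on the Rust side via f32::to_bits.
vec2 control_point(int i) {
    return vec2(uintBitsToFloat(from_tex_u32(2*i)),
                uintBitsToFloat(from_tex_u32(2*i + 1)));
}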

I believe you are incorrect in that assertion, it should be 0.0 <= b/255.0 <= 1.0.

According to this doc page on sampler types, and this page on the texture function, if you call texture with a usampler2D, you get back a uvec4 (which I presume would contain the original 0-255 ranged data, and could be used to reconstruct the original u32).

Try declaring your sampler as usampler2D, ensuring that the backing storage is indeed r8g8b8a8 or equivalent (i.e., the order of r, g, b, a is irrelevant), and sampling using that to acquire a uvec4.
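
A minimal sketch of that suggestion, keeping the grid layout from the earlier snippet (uniform names are illustrative; note that integer textures are not filterable, so the texture's min/mag filters must be NEAREST or sampling returns nothing useful):

uniform highp usampler2D u_texture;  // must be backed by an integer format, e.g. RGBA8UI
uniform int grid;

uint from_tex_u32(int i) {
    float delta = 1. / float(2*grid);
    vec2 uv = vec2(float(i % grid) / float(grid) + delta,
                   float(i / grid) / float(grid) + delta);
    uvec4 c = texture(u_texture, uv);  // raw 0-255 values, no normalization
    return (c.r << 0) + (c.g << 8) + (c.b << 16) + (c.a << 24);
}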

You're right, this is a typo on my part.

  1. I got it working with the float sampler + round(x * 255.0)

  2. I believe the documentation you are pointing to is correct, and indeed it should be possible to do usampler2D and get back uvec4. I tried verifying this; for some stupid reason, it causes Xorg to freeze.

I am certain that my code is buggy, but if changing the frag shader from sampler2D to usampler2D causes Xorg to freeze, there may be some driver bugs we've just run into. :frowning:

I think I figured out what is wrong:

https://www.khronos.org/opengl/wiki/GL_EXT_texture_integer

There are combinations of usampler2D / image format that are allowed / not allowed (and it appears a plain normalized RGBA format does not get along with usampler2D; integer samplers need an integer internal format like RGBA8UI).
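
As a rough map of that rule (the sampler declarations are illustrative; the format names refer to the texture's internal format chosen on the upload side):

// The sampler type must match the class of the texture's internal format:
uniform highp sampler2D  tex_normalized; // RGBA8, ...   -> vec4 in [0, 1]
uniform highp usampler2D tex_unsigned;   // RGBA8UI, ... -> uvec4 of raw integers
uniform highp isampler2D tex_signed;     // RGBA8I, ...  -> ivec4 of raw integers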

Everything works now. Thanks @OptimisticPeach for all the debugging help. For anyone else running into this issue, here is what helped me:

  1. texelFetch (better than texture; takes an integer texel index instead of a normalized texture coordinate)

  2. usampler2D instead of sampler2D (returns a uvec4 of raw integers instead of a normalized floating-point vec4)

  3. texture internal format: RGBA8UI (8 for bits, UI for unsigned integer); format: RGBA_INTEGER (required because integer internal formats must be uploaded with a *_INTEGER pixel format, not plain RGBA); a combined sketch follows this list
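
Putting those together, a minimal shader-side sketch (uniform names follow the earlier snippets; it assumes the texture was created with internal format RGBA8UI, format RGBA_INTEGER, type UNSIGNED_BYTE, and NEAREST filtering):

uniform highp usampler2D u_texture;
uniform int grid;

uint from_tex_u32(int i) {
    // texelFetch takes integer texel coordinates plus a mip level, so the
    // half-texel offset and coordinate normalization are no longer needed.
    uvec4 c = texelFetch(u_texture, ivec2(i % grid, i / grid), 0);
    return (c.r << 0) + (c.g << 8) + (c.b << 16) + (c.a << 24);
}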
