How to convert an 8-bit texture to 32-bit using SDL2

Hi,

I stumbled upon some really strange behavior while using the sdl2 bindings. I started by loading some textures from a game file. At first I only wanted to see whether I had decoded the textures' alignment correctly, so I just rendered one with this code:

    let (_, tile) = tile_iter.next().unwrap();
    let texture_creator = canvas.texture_creator();

    // Create a static texture in the 8-bit RGB332 format.
    let mut texture = texture_creator
        .create_texture_static(
            Some(PixelFormatEnum::RGB332),
            IMAGE_SIZE as u32,
            IMAGE_SIZE as u32,
        )
        .unwrap();

    // Pitch is IMAGE_SIZE bytes per row, since RGB332 uses one byte per pixel.
    texture.update(None, &tile.0, IMAGE_SIZE).unwrap();
    canvas.copy(&texture, None, None).expect("Render failed");
    canvas.present();

After I was sure the textures loaded correctly, I tried to use palettes. It seemed I needed to use SDL2's Surface for this, so I tried that, and now the textures come out different and messed up. It looks like it's reading garbage or the wrong memory location. Here is the code:

    let (_, tile) = tile_iter.next().unwrap();
    let texture_creator = canvas.texture_creator();

    // Create an empty RGB332 surface...
    let surface = Surface::new(
        IMAGE_SIZE as u32,
        IMAGE_SIZE as u32,
        PixelFormatEnum::RGB332,
    )
    .unwrap();
    //surface.set_palette(&palette).unwrap();

    // ...and turn it into a texture.
    let mut texture = texture_creator
        .create_texture_from_surface(surface)
        .unwrap();

    texture.update(None, &tile.0, IMAGE_SIZE).unwrap();
    canvas.copy(&texture, None, None).expect("Render failed");
    canvas.present();

At first I thought I had messed up my loading again, but I have now run both snippets above with the same loading logic, and the results differ. I hope someone has an idea of what I'm doing wrong here.

Edit:

OK, it seems that I need to load the 8-bit texture into an RGB332 surface first. After that I should set my palette (with RGBA values) and convert the surface into an RGBA32 surface. Here is my current code:

    let mut tile = tile.0.clone();

    // Wrap the raw 8-bit pixel data in an RGB332 surface. Width, height, and
    // pitch are all IMAGE_SIZE, since RGB332 uses one byte per pixel.
    let mut surface = Surface::from_data(
        &mut tile,
        IMAGE_SIZE as u32,
        IMAGE_SIZE as u32,
        IMAGE_SIZE as u32,
        PixelFormatEnum::RGB332,
    )
    .unwrap();

    // Trying a palette with all entries set to white (the default for a freshly allocated palette)
    let palette = Palette::new(256).unwrap();
    surface.set_palette(&palette).unwrap();

    // Convert to a 32-bit format and upload it as a texture.
    let surface = surface.convert_format(PixelFormatEnum::RGBA32).unwrap();
    let texture = surface.as_texture(&texture_creator).unwrap();

    canvas.copy(&texture, None, None).expect("Render failed");
    canvas.present();

Now the content of the texture renders correctly, but I can't get SDL to use my palette.
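
For reference, here is how the palette could be filled with real colors instead of the all-white default. This is a minimal sketch, assuming the sdl2 crate's Palette::with_colors, and game_colors is a hypothetical 256-entry color table decoded from the game file:

    use sdl2::pixels::{Color, Palette};

    // `game_colors` stands in for the 256 RGBA entries decoded from the game file.
    let game_colors: Vec<Color> = vec![Color::RGBA(255, 0, 255, 255); 256];

    // Build the palette from the color table and attach it to the 8-bit surface
    // before converting it to RGBA32.
    let palette = Palette::with_colors(&game_colors).unwrap();
    surface.set_palette(&palette).unwrap();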

For future visitors with similar problems: it seems that my hardware doesn't support the RGB332 pixel format. Unfortunately, SDL2 silently substitutes a supported format for the one you asked for, without any notice. If you want to know which format is actually in use, you can query your texture with:

    texture.query()
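
The returned TextureQuery struct exposes the real format, so you can check it directly. For example:

    // Print the format SDL actually chose; it may differ from the requested RGB332.
    let query = texture.query();
    println!("actual texture format: {:?}", query.format);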