What FilterType to use for downscaling images?

Hello

I have a website where people can upload profile pictures. The problem is that a user could upload a 1000 x 1000px image, while these images are only displayed with a maximum dimension of 100 x 100px. So to optimize loading times and disk space on my website, I have written the following code to auto-resize images when they're uploaded.

use image::imageops::FilterType;
use image::GenericImageView; // needed for `dimensions()`
use image::ImageFormat;
use std::fs::File;

pub async fn downscale_image(file_path: &str, max_width_px: f64) {
    let image = image::open(format!(".{}", file_path)).unwrap();
    let (width, height) = image.dimensions();

    // Preserve the aspect ratio when computing the new height.
    let ratio: f64 = width as f64 / height as f64;
    let new_height = max_width_px / ratio;

    let new_image = image.resize(max_width_px as u32, new_height as u32, FilterType::Lanczos3);
    let format = ImageFormat::from_path(format!(".{}", file_path)).unwrap();

    let mut output = File::create(format!(".{}", file_path)).unwrap();
    new_image.write_to(&mut output, format).unwrap();
}
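As a side note, `DynamicImage::resize` already preserves the aspect ratio and scales to the largest size that fits within the given bounds, so the manual ratio computation above can be dropped. For reference, here is a pure-std sketch of the equivalent fit computation (`fit_within` is a hypothetical helper; its rounding may differ slightly from the crate's internals):

```rust
/// Compute target dimensions whose longer side is `max_px`, preserving
/// aspect ratio (hypothetical helper; roughly what `resize` does when
/// given square bounds).
fn fit_within(width: u32, height: u32, max_px: u32) -> (u32, u32) {
    if width >= height {
        let h = (height as f64 * max_px as f64 / width as f64).round() as u32;
        (max_px, h.max(1))
    } else {
        let w = (width as f64 * max_px as f64 / height as f64).round() as u32;
        (w.max(1), max_px)
    }
}

fn main() {
    assert_eq!(fit_within(1000, 1000, 100), (100, 100)); // square stays square
    assert_eq!(fit_within(2000, 1000, 100), (100, 50));  // landscape
    assert_eq!(fit_within(500, 1000, 100), (50, 100));   // portrait
}
```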

The image crate supports multiple FilterTypes:

FilterType::Nearest
FilterType::Triangle
FilterType::CatmullRom
FilterType::Gaussian
FilterType::Lanczos3

Currently I am using Lanczos3, which works fine but is a bit slow during upload. Does it really matter which filter I use? Which one is the fastest? Image quality does not really matter, but the file size should be as small as possible.

Thanks!

The docs page for FilterType includes a table of times for the test image along with sample outputs.

Nearest neighbor is fastest by a mile, but clearly destroys the test image. It may be visibly wrong even at small sizes. The next fastest is Triangle, which is an order of magnitude slower, but still more than twice as fast as your current strategy. The test image is a little fuzzy in that mode, but that's probably acceptable for images that are displayed at small sizes.

Instead of resizing the image, couldn't you just avoid the problem altogether by requiring a small enough image/file size in the first place?

Thanks. I'm going for Triangle.

@H2CO3 I will be limiting the file size in MB, but I want to keep the upload process as smooth as possible for the user, so I will only put limits on ridiculous sizes like >5 MB. Not all users necessarily know how to resize an image, so I want to do it for them.

ImageMagick has some good reads on the various filters:

The names come from the authors of the papers that first introduced them, so they mean the same thing across different libraries (assuming they're properly implemented).

Another reason to limit the image dimensions is that, if an image is large and highly compressed (low detail), then it might fit under a byte size limit, but use unnecessary memory when decompressed for viewing. And it might look worse because the viewing browser's choice of downsampling filter won't necessarily be high-quality.

That's why I wrote "image/file size" above. The file size is easier to determine but might be a worse indicator of image size. However, the actual image dimensions can still usually be read quickly, without decompressing (or even reading) the entire file, so limiting the pixel size to something like 128 x 128 is still a reliable option.
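For what it's worth, the image crate exposes this cheap check as `image::image_dimensions`, which reads only the file header. To illustrate why it's cheap, here is a pure-std sketch that pulls a PNG's dimensions out of its fixed-offset IHDR chunk (`png_dimensions` is a hypothetical, PNG-only helper; real code should use the crate function, which handles every supported format):

```rust
/// Read a PNG's pixel dimensions from its IHDR chunk without decoding
/// any image data (PNG-only sketch; `image::image_dimensions` does the
/// equivalent for all formats the crate supports).
fn png_dimensions(bytes: &[u8]) -> Option<(u32, u32)> {
    const SIG: [u8; 8] = [0x89, b'P', b'N', b'G', b'\r', b'\n', 0x1a, b'\n'];
    if bytes.len() < 24 || bytes[..8] != SIG || bytes[12..16] != *b"IHDR" {
        return None;
    }
    // Width and height are big-endian u32s at fixed offsets 16 and 20.
    let width = u32::from_be_bytes(bytes[16..20].try_into().ok()?);
    let height = u32::from_be_bytes(bytes[20..24].try_into().ok()?);
    Some((width, height))
}

fn main() {
    // Build just the fixed-size prefix of a PNG: signature, IHDR chunk
    // length and type, then a 1000 x 1000 dimension pair.
    let mut header = vec![0x89, b'P', b'N', b'G', b'\r', b'\n', 0x1a, b'\n'];
    header.extend_from_slice(&13u32.to_be_bytes()); // IHDR data length
    header.extend_from_slice(b"IHDR");
    header.extend_from_slice(&1000u32.to_be_bytes());
    header.extend_from_slice(&1000u32.to_be_bytes());
    assert_eq!(png_dimensions(&header), Some((1000, 1000)));
    assert_eq!(png_dimensions(b"not a png"), None);
}
```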

@ONiel Based on the above, I'd recommend a combination of the two approaches. Set a more reasonable file size limit (e.g. 512 kB should be plenty for any reasonable, non-malicious image that's intended to be resized to 100 x 100), and then apply a filter whose speed doesn't really matter at that point.
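The pre-check half of that combination needs only the standard library, since the file size is available from metadata without opening the file (`within_size_limit` and the 512 kB constant are illustrative assumptions):

```rust
use std::fs;
use std::io;
use std::path::Path;

/// Reject uploads larger than `max_bytes` before doing any image work
/// (hypothetical helper; reads only file metadata, not the contents).
fn within_size_limit(path: &Path, max_bytes: u64) -> io::Result<bool> {
    Ok(fs::metadata(path)?.len() <= max_bytes)
}

fn main() -> io::Result<()> {
    // Simulate an uploaded file of 1 KiB in the temp directory.
    let tmp = std::env::temp_dir().join("upload_check_demo.bin");
    fs::write(&tmp, vec![0u8; 1024])?;

    assert!(within_size_limit(&tmp, 512 * 1024)?); // under the 512 kB limit
    assert!(!within_size_limit(&tmp, 512)?);       // over a 512-byte limit

    fs::remove_file(&tmp)?;
    Ok(())
}
```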
