Does JS / Chrome have some type of limit at 16GB?

Chrome starts behaving weirdly (silent failures) when I have around 16GB worth of data spread out over lots of ArrayBuffers (of 1MB - 10MB each).

I'm fairly confident that I am not screwing up the wasm 4GB limit. I don't know if (1) I'm doing something silly on the Rust side or (2) Chrome hard-limits a single page to 16GB [not too unreasonable].
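For reference, the 4GB figure is just the wasm32 address space; a quick sanity check of the arithmetic (nothing Chrome-specific, and the "four memories in 16GB" observation at the end is pure arithmetic, not a claim about what Chrome does):

```rust
fn main() {
    // wasm32 pointers are 32 bits wide, so a single linear memory
    // tops out at 2^32 bytes.
    let wasm32_max_bytes: u64 = 1 << 32;
    assert_eq!(wasm32_max_bytes, 4 * 1024 * 1024 * 1024); // 4 GiB

    // Linear memory grows in 64 KiB pages, so the cap is 65,536 pages.
    assert_eq!(wasm32_max_bytes / (64 * 1024), 65_536);

    // 16 GiB is exactly four such memories' worth.
    println!("{}", (16u64 * 1024 * 1024 * 1024) / wasm32_max_bytes); // prints 4
}
```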

Anyone else run into this issue?

All I can do is search on the internet, but there are some useful results:

Error code: Out of Memory - How to allocate more memory to Google Chrome - 64 GB ram available - Super User

https://groups.google.com/a/chromium.org/g/chromium-dev/c/qsffxWuaNeQ

  1. Thanks, looks like 16GB is a hard limit somewhere, and I am running into that.

  2. In my case, curiously enough, it's not tabs crashing ... it's that WebWorkers silently stop sending each other messages via postMessage.

@Michael-F-Bryan : did you ever run into the above "16GB limit" with your ML in wasm work? If so, any tricks to get around this?

Not really. Our main target is browsers and edge devices where you'll often need to download a model over the internet, so people will deliberately try to keep their models small. In turn, that means you have smaller input tensors and therefore lower memory usage.

Besides, ML doesn't normally need excessive amounts of memory. As a ballpark figure, the pixel buffer for a 4K image is "only" about 35 MB (4096 × 2160 × 4 = 35,389,440 bytes), so something weird would be going on if a single WebAssembly module was using more than a GB of RAM.
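To put numbers on that back-of-the-envelope claim (using the DCI 4K resolution of 4096 × 2160 and 4 bytes per RGBA pixel, as in the figure above):

```rust
fn main() {
    // RGBA buffer for one 4K frame: width × height × 4 bytes per pixel.
    let bytes: u64 = 4096 * 2160 * 4;
    assert_eq!(bytes, 35_389_440); // ≈ 35 MB

    // Even ~30 such frames held live at once stay under 1 GiB.
    assert!(30 * bytes < 1_073_741_824);
    println!("{} MB per frame", bytes / 1_000_000); // prints "35 MB per frame"
}
```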

I remember we initially used wee_alloc as our allocator and ran out of memory because its free() doesn't do anything.


I just did an experiment on the desktop, and with Wasmer it looks like linear memory for a single WebAssembly module is capped at 4GB. All the code does is allocate a vec![0_u8; 100_000 * n] in an infinite loop and leak the memory (originally I was doubling every time, but that didn't give good enough resolution, so I switched to quadratic cumulative memory usage instead).

Source code
$ cd /tmp/memory_usage && tree
.
├── Cargo.lock
├── Cargo.toml
├── guest
│   ├── Cargo.toml
│   └── src
│       └── lib.rs
└── host
    ├── Cargo.toml
    └── src
        └── main.rs

4 directories, 6 files
# guest/Cargo.toml
[package]
name = "guest"
version = "0.1.0"
edition = "2021"

# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html

[lib]
crate-type = ["rlib", "cdylib"]

[dependencies]
dlmalloc = { version = "0.2.3", features = ["global"] }
// guest/src/lib.rs

#![feature(vec_into_raw_parts)]

use std::sync::atomic::{AtomicU64, Ordering};

#[global_allocator]
static ALLOCATOR: Allocator = Allocator(dlmalloc::GlobalDlmalloc);

extern "C" {
    fn update_bytes_allocated(bytes_allocated: u64);
}

#[no_mangle]
pub extern "C" fn malloc(size: u64) -> *const u8 {
    let buffer = vec![0; size as usize];
    let (ptr, _, _) = buffer.into_raw_parts();

    ptr
}

struct Allocator(dlmalloc::GlobalDlmalloc);

static BYTES_ALLOCATED: AtomicU64 = AtomicU64::new(0);

unsafe impl std::alloc::GlobalAlloc for Allocator {
    unsafe fn alloc(&self, layout: std::alloc::Layout) -> *mut u8 {
        let bytes_allocated = BYTES_ALLOCATED.fetch_add(layout.size() as u64, Ordering::Relaxed);
        update_bytes_allocated(bytes_allocated);
        self.0.alloc(layout)
    }

    unsafe fn dealloc(&self, ptr: *mut u8, layout: std::alloc::Layout) {
        let bytes_allocated = BYTES_ALLOCATED.fetch_sub(layout.size() as u64, Ordering::Relaxed);
        update_bytes_allocated(bytes_allocated);
        self.0.dealloc(ptr, layout);
    }
}
# host/Cargo.toml
[package]
name = "host"
version = "0.1.0"
edition = "2021"

# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html

[dependencies]
wasmer = "2.3.0"
// host/src/main.rs
use wasmer::{Function, Instance, NativeFunc, Store};

fn main() {
    let filename = std::env::args().nth(1).expect("Usage: host <filename>");

    let wasm = std::fs::read(&filename).unwrap();

    let store = Store::default();
    let module = wasmer::Module::new(&store, wasm).unwrap();
    let imports = wasmer::imports! {
        "env" => {
            "update_bytes_allocated" => Function::new_native(&store, |bytes_allocated: u64| {
                println!("{bytes_allocated}");
            }),
        }
    };

    let instance = Instance::new(&module, &imports).unwrap();
    let malloc: NativeFunc<u64, u32> = instance.exports.get_native_function("malloc").unwrap();

    for n in 0_u64.. {
        let size = 100_000 * n;
        if let Err(e) = malloc.call(size) {
            println!("{e}");
            return;
        }
    }
}
$ cargo +nightly build -p guest --target wasm32-unknown-unknown
$ cargo run -p host -- target/wasm32-unknown-unknown/debug/guest.wasm
0
100000
300000
600000
1000000
...
4248600000
4277800000
RuntimeError: unreachable
    at rust_oom (<module>[226]:0x13070)
    at __rg_oom (<module>[238]:0x1350b)
    at __rust_alloc_error_handler (<module>[165]:0x1152a)
    at alloc::alloc::handle_alloc_error::rt_error::h24c338322ccc2f66 (<module>[235]:0x134a0)
    at core::ops::function::FnOnce::call_once::h29cf5a21b0e47360 (<module>[234]:0x13492)
    at alloc::alloc::handle_alloc_error::he886c6482e100e1b (<module>[236]:0x134ae)
    at alloc::raw_vec::RawVec<T,A>::allocate_in::hfbe11fbb9e669738 (<module>[148]:0x10d8d)
    at <u8 as alloc::vec::spec_from_elem::SpecFromElem>::from_elem::hff0e825d4b1d6e8f (<module>[156]:0x111c7)
    at alloc::vec::from_elem::h3b3820ee29046ac8 (<module>[147]:0x109cc)
    at malloc (<module>[4]:0x5eb)

I also hacked together an equivalent host using Node.js, with identical results.

Source code
$ node --version
v18.6.0
const fs = require("fs").promises;

async function main() {
    const filename = process.argv[2];
    const wasm = await fs.readFile(filename);

    const imports = {
        env: {
            update_bytes_allocated: bytes => console.log(bytes),
        },
    };

    const { instance } = await WebAssembly.instantiate(wasm, imports);
    const malloc = instance.exports.malloc;
    let size = 0n;

    while (true) {
        size += 100000n;
        malloc(size);
    }

}

main();
$ node main.js target/wasm32-unknown-unknown/debug/guest.wasm
0n
100000n
300000n
...
4248600000n
4277800000n
wasm://wasm/007c911a:1


RuntimeError: unreachable
    at rust_oom (wasm://wasm/007c911a:wasm-function[226]:0x13070)
    at __rg_oom (wasm://wasm/007c911a:wasm-function[238]:0x1350b)
    at __rust_alloc_error_handler (wasm://wasm/007c911a:wasm-function[165]:0x1152a)
    at _ZN5alloc5alloc18handle_alloc_error8rt_error17h24c338322ccc2f66E (wasm://wasm/007c911a:wasm-function[235]:0x134a0)
    at _ZN4core3ops8function6FnOnce9call_once17h29cf5a21b0e47360E (wasm://wasm/007c911a:wasm-function[234]:0x13492)
    at _ZN5alloc5alloc18handle_alloc_error17he886c6482e100e1bE (wasm://wasm/007c911a:wasm-function[236]:0x134ae)
    at _ZN5alloc7raw_vec19RawVec$LT$T$C$A$GT$11allocate_in17hfbe11fbb9e669738E (wasm://wasm/007c911a:wasm-function[148]:0x10d8d)
    at _ZN63_$LT$u8$u20$as$u20$alloc..vec..spec_from_elem..SpecFromElem$GT$9from_elem17hff0e825d4b1d6e8fE (wasm://wasm/007c911a:wasm-function[156]:0x111c7)
    at _ZN5alloc3vec9from_elem17h3b3820ee29046ac8E (wasm://wasm/007c911a:wasm-function[147]:0x109cc)
    at malloc (wasm://wasm/007c911a:wasm-function[4]:0x5eb)

Node.js v18.6.0

lol, looks like I got nerd sniped :joy:


It's against my selfish best interest to nerd snipe someone as helpful as you. :slight_smile:
