I want to implement an Erlang NIF (Native Implemented Function) that decodes JSON.
Thanks to rustler and serde, I've implemented a decoder NIF that takes the whole JSON message as an argument. But Erlang expects each NIF call to return within 1 millisecond, so as not to interfere with scheduling.
So, for large messages, I want to break the decoding into smaller steps. My current idea is a deserializer with feed() and eof() functions: each NIF call passes a slice of the payload to feed(), and a final call to eof() retrieves the result.
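For concreteness, here is a rough sketch of the interface I have in mind (the names are mine). The catch is that this naive version only buffers in feed() and still does all the decoding at once in eof(), which is exactly the long blocking call I'm trying to avoid:

    use serde_json::Value;

    // Placeholder names for the intended interface; this version just
    // accumulates bytes in feed() and decodes everything in eof().
    struct ChunkedDecoder {
        buf: Vec<u8>,
    }

    impl ChunkedDecoder {
        fn new() -> Self {
            ChunkedDecoder { buf: Vec::new() }
        }

        // Each NIF call passes the next slice of the payload.
        fn feed(&mut self, chunk: &[u8]) {
            self.buf.extend_from_slice(chunk);
        }

        // The final NIF call retrieves the decoded value (or an error).
        fn eof(self) -> serde_json::Result<Value> {
            serde_json::from_slice(&self.buf)
        }
    }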
The serde_json crate can decode from a stream, which gets you an iterator:

    let stream = serde_json::Deserializer::from_reader(rdr).into_iter::<serde_json::Value>();
Each "next()" of the iterator "stream" gets you one decoded JSON value. You can write your own reader object which will get called as the deserializer needs more bytes.
The usual inside-out problem applies: next() is effectively a blocking operation if it has to wait for another feed(). One way around this is to have feed() push bytes onto an input channel, run the iterator on a separate thread that sends decoded JSON values back on a second channel, and call try_recv() on that channel after each feed() to see whether a new JSON object has emerged. Then the Erlang code never waits for the Rust code.
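A rough sketch of that arrangement, reusing the ChannelReader above and leaving out the NIF glue (names and error handling are illustrative only):

    use std::sync::mpsc::{channel, Receiver, Sender, TryRecvError};
    use std::thread;

    // Bytes go in on one channel, decoded values come out on another, and a
    // background thread runs the blocking serde_json iterator in between.
    struct StreamingDecoder {
        bytes_tx: Sender<Vec<u8>>,
        values_rx: Receiver<Result<serde_json::Value, String>>,
    }

    impl StreamingDecoder {
        fn new() -> Self {
            let (bytes_tx, bytes_rx) = channel::<Vec<u8>>();
            let (values_tx, values_rx) = channel();

            thread::spawn(move || {
                let reader = ChannelReader { rx: bytes_rx, pending: Vec::new() };
                let stream = serde_json::Deserializer::from_reader(reader)
                    .into_iter::<serde_json::Value>();
                for item in stream {
                    // Forward each decoded value (or error); stop if the
                    // receiving side has gone away.
                    if values_tx.send(item.map_err(|e| e.to_string())).is_err() {
                        break;
                    }
                }
            });

            StreamingDecoder { bytes_tx, values_rx }
        }

        // Called from each NIF invocation: push a chunk, then poll for a
        // result without blocking.
        fn feed(&self, chunk: Vec<u8>) -> Option<Result<serde_json::Value, String>> {
            let _ = self.bytes_tx.send(chunk);
            match self.values_rx.try_recv() {
                Ok(result) => Some(result),
                Err(TryRecvError::Empty) | Err(TryRecvError::Disconnected) => None,
            }
        }
    }

Dropping the StreamingDecoder (and with it bytes_tx) closes the byte channel, which the ChannelReader reports as EOF; an eof()-style call would drop the sender and then keep polling until the last value arrives.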