The json crate can convert only primitives directly; for vectors and objects there are macros, which means one has to do the nesting manually.
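As a quick illustration of that (a sketch of basic json crate usage, not taken from the original code): primitives convert directly, anything structured goes through the macros.
let flag: json::JsonValue = true.into();    // primitive: plain conversion via From/Into
let list = json::array![1, 2, 3];           // vector-like data: the array! macro
let item = json::object!{                   // objects: the object! macro
    answer: 42,
    valid: flag,                            // nesting: insert an already built JsonValue by hand
};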
This has advantages and disadvantages. The disadvantage is obvious: more lines of code, as each struct needs something like an as_json() method. The advantage is that one can shape the JSON while writing it. For example, the following code adds a type field which isn't in the Rust struct. The receiving side of the JSON needs it to recognize the type of the encoded structure; Rust recognizes it by the struct used:
use json; // use this in all code snippets here
pub struct TSParameterBool {
    pub default: bool,
    pub current: bool,
}

impl TSParameterValue for TSParameterBool {
    fn as_json(&self) -> json::JsonValue {
        json::object!{
            type: "bool", // <- not in Rust, but in JSON
            default: self.default,
            current: self.current,
        }
    }
}
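The TSParameterValue trait itself doesn't appear in these snippets; judging from how it is used, a minimal definition would be something like:
pub trait TSParameterValue {
    // Every concrete parameter type knows how to render itself as JSON.
    fn as_json(&self) -> json::JsonValue;
}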
One level higher, one encodes data of that level and calls as_json() from the lower level:
pub struct TSParameter<'a> {
    pub name: &'static str,
    pub description: &'static str,
    pub value: &'a dyn TSParameterValue, // <- lower level
}

impl<'a> TSParameter<'a> {
    pub fn as_json(&self) -> json::JsonValue {
        json::object!{
            name: self.name,
            description: self.description,
            value: self.value.as_json(), // <- lower level
        }
    }
}
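The parameter_set used in the next snippet isn't defined in the original code; for illustration (the names here are made up), it could simply be a vector of such parameters:
let bool_value = TSParameterBool { default: true, current: false };
let parameter_set = vec![
    TSParameter {
        name: "enabled",
        description: "whether the feature is switched on",
        value: &bool_value, // coerces to &dyn TSParameterValue
    },
];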
Same when collecting that into an array/vector:
let mut json_parameters = json::JsonValue::new_array();
for parameter in &parameter_set {
    json_parameters.push(parameter.as_json()).unwrap();
}
This done, make a string from that collected JSON stuff:
let result = json::stringify(json_parameters);
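If a human is supposed to read the result, the crate also offers a pretty-printing variant; a quick sketch as an alternative to the line above:
// Same data, but indented with 4 spaces per level instead of one compact line.
let result = json::stringify_pretty(json_parameters, 4);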
As this is kind of an advertisement for "manual" coding already, let me give an example where data shaping saves a lot of space. Imagine an application which wants to send stock trading candles to the web browser user interface. Its data structure (simplified):
pub struct Candle {
    pub timestamp: i64,
    pub open: f32,
}
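For comparison, the generic route would typically go through serde; a minimal sketch, assuming serde with the derive feature plus serde_json as dependencies (neither appears in the original code):
use serde::Serialize;

#[derive(Serialize)] // the only addition compared to the struct above
pub struct Candle {
    pub timestamp: i64,
    pub open: f32,
}

// history.candles is assumed to be a Vec<Candle>, as in the manual version further down.
let json_string = serde_json::to_string(&history.candles).unwrap();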
A generic JSON encoding would give something like this (I hope I don't mistype):
[
    {
        timestamp: 1690192800,
        open: 16200.969583458793
    },
    {
        timestamp: 1690196400,
        open: 16220.969583458793
    },
    // ... repeat 48 bytes 50,000 times.
]
See the redundancy? The parameter names timestamp and open get written over and over again. Also, floats get written with 12 digits after the decimal point, where 2 are entirely sufficient for the use case. A better data structure would look like this, at less than half the size:
{
    timestamps: [
        1690192800,
        1690196400,
        // ... repeat 12 bytes 50,000 times.
    ],
    opens: [
        16200.96,
        16220.96,
        // ... repeat 9 bytes 50,000 times.
    ]
}
Rust code to get this smaller data structure:
let mut json_timestamps = Vec::with_capacity(history.candles.len());
let mut json_opens = Vec::with_capacity(history.candles.len());

for candle in &history.candles {
    json_timestamps.push(candle.timestamp);
    json_opens.push(round_f32(candle.open, 2));
}

let json = json::object!{
    timestamps: json_timestamps,
    opens: json_opens,
};
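The round_f32() helper above is not part of the json crate; assuming it just rounds to the given number of decimal places, it could look like this:
fn round_f32(value: f32, decimals: u32) -> f32 {
    // Scale up, round to the nearest integer, scale back down.
    let factor = 10f32.powi(decimals as i32);
    (value * factor).round() / factor
}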
What to choose?
- If you have a complex structure, just want to write it somehow for re-loading later, and data size isn't an issue, generic serialization with serde is certainly a pragmatic choice.
- If the structure is rather simple, the JSON is received by some other application which prefers a certain formatting, and thousands of instances are needed, manual code with json gives quite some opportunities.