Opinion on using a lot of macros?


If I, for example, have a JSON value coming in as a String and want to convert it to f64, I'd have to use code like this: json["k"]["o"].as_str().unwrap().parse::<f64>().unwrap(). Which in my opinion looks unnecessarily long and messy.

I think that when someone reads my code, they should be able to understand the overall logic and structure, and not be distracted by details such as how a JSON string gets converted to f64, no?

So by defining a macro like this:

macro_rules! json_str_to_f64 {
    ($s: expr) => {
        $s.as_str().unwrap().parse::<f64>().unwrap()
    };
}
I could just use json_str_to_f64!(json["k"]["o"]) instead, which in my opinion makes the code more readable and makes programming easier.

What is your opinion on using macros like this? Is it for some reason not recommended? How many macros do you use in your code?

This should be a function. Macros are only needed when functions can't be used, or at least can't be used conveniently. There is no performance benefit to using a macro when a function will work.


As above, that should be a plain function. And in general I'd be suspicious of macro-heavy code; I'd see it as a "code smell", though that doesn't inherently mean it's wrong.

As the Python people say (though they don't follow it very well, IMO), "Explicit is better than implicit", and as the Go people say, "A little copying is better than a little dependency."

But why I really chimed in: please don't .unwrap() with JSON - it's hardly exceptional for JSON to be malformed somehow.

At the very least, use json["k"]["o"].as_str()?.parse::<f64>()?, which is already much shorter. And assuming you're assigning it to an f64 field, you can skip the turbofish, and you're down to json["k"]["o"].as_str()?.parse()? - about as short as your macro.

But ideally, use serde or one of the other libraries to give even better errors for your users (even if that user is just yourself).


Macros should mostly be a tool of last resort, since they are so hard to write, read and maintain. For example, while your simple example probably can't be affected by this issue, in general you can't rely on method calls in macros resolving to what you expect. A simple error in a macro definition or usage can cause a wrong type with a similarly named method to be called. Trait methods are also unavailable unless the relevant traits are imported explicitly, which for macros is usually non-obvious and confusing, since the exact traits are an implementation detail of the macro.

This means that a robust macro should name trait methods via explicit fully qualified call syntax. If you have trait Foo { fn bar(&self) }, you shouldn't call obj.bar() in a macro. You should write foocrate::someplace::Foo::bar(&obj), to ensure that the end user doesn't need to mess with their imports and use traits which aren't directly used in their code.
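A minimal sketch of that fully qualified style (all names here are made up for illustration, and the module is assumed to sit at the crate root):

```rust
mod traits {
    pub trait Foo {
        fn bar(&self) -> u32;
    }
}

struct S;

impl traits::Foo for S {
    fn bar(&self) -> u32 {
        42
    }
}

// Fragile version would expand to `$obj.bar()`, which only compiles if the
// caller happens to have `use traits::Foo;` in scope, and could silently
// pick up a different, similarly named method. Spelling out the trait
// path avoids both problems.
macro_rules! call_bar {
    ($obj:expr) => {
        crate::traits::Foo::bar(&$obj)
    };
}

fn main() {
    // Works without `use traits::Foo;` at the call site.
    assert_eq!(call_bar!(S), 42);
}
```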

In your json example, the macro is barely more readable than the original method chain, particularly if you use the proper ?-based error handling. Unless you're writing a simple one-off script, a better approach is to directly model the schema of your expected data.

use serde::{Deserialize, Serialize};
use serde_with::{serde_as, DisplayFromStr};

#[derive(Serialize, Deserialize)]
struct Foo {
    k: Bar,
    // other fields if required
}

#[serde_as]
#[derive(Serialize, Deserialize)]
struct Bar {
    #[serde_as(as = "DisplayFromStr")]
    o: f64,
    // other fields if required
}
This uses the serde crate for (de)serialization, and the serde_with helper crate to simplify parsing an f64 from a string field. You could use only serde and write the relevant conversion manually, but serde_with is easier to use.

Now, when you need to convert your untyped json into structured data, you just do this:

let foo: Foo = serde_json::from_value(json)?;
let field: f64 = foo.k.o; // use as needed

This uses the serde_json crate (which you're probably already using), specifically serde_json::from_value. In fact, depending on your application, you probably don't even need to parse your data into a serde_json::Value first, and can instead deserialize the original serialized string directly with serde_json::from_str.

This is more concise and readable than either your original code or your macro. It consolidates the error checking into a single function call, and gives even greater benefits if you intend to read anything more complex than a couple of nested fields. You also get the benefits of autocomplete and type inference on your strongly typed deserialized data.

Of course, this uses the macros from serde and serde-with, but that's the point. Macros are hard to write, use and maintain. Instead of spreading poorly tested and poorly documented macros around your codebase, use the commonly used, well-designed and well-supported macros from the ecosystem, and maximize their leverage.

