Better way to grab doc meta in a proc-macro (without proc-macro2 or syn)

I know there are probably good crates for merging doc metas, but syn takes about 2x the compile time of the main program, so I decided not to use syn or proc-macro2.

... and the resulting code seems ugly.

use proc_macro::{Delimiter, TokenTree};
use proc_macro::TokenTree::{Group, Ident, Literal, Punct};

fn parse_doc(meta: &mut String, prev_is_sharp: &mut bool, x: TokenTree) {
    // status: ..@.. (@ means "I'm here")
    if *prev_is_sharp { // status: # @..
        *prev_is_sharp = false;
        // check whether the current token is `[...]`
        let Group(x) = x else { return };
        if x.delimiter() != Delimiter::Bracket { return }
        // status: # @[...]
        let stream = x.stream().into_iter().collect::<Vec<_>>();
        // status: # @[stream]
        if stream.len() < 3 { return } // cannot satisfy `doc = literal`
        let Ident(ref ident) = stream[0] else { return };
        if ident.to_string() != "doc" { return }
        // status: # @[doc stream[1..]]
        let Punct(ref ch) = stream[1] else { return };
        if ch.as_char() != '=' { return }
        // status: # @[doc = stream[2..]]
        let Literal(ref lit) = stream[2] else { return };
        meta.push_str(&format!("\n#' {lit}"));
    } else if let Punct(x) = x {
        if x.as_char() == '#' {
            *prev_is_sharp = true;
        }
    }
}

There are too many `let Pat(x) = y else { return }` lines in the program. Is there a better way to avoid writing so many returns?



Understanding the Context:

Merging Doc Metas: You're aiming to merge documentation metadata from Rust code without using syn or proc-macro2 due to compile time concerns.
Code Structure: You've provided a code snippet (parse_doc) that extracts documentation comments (#[doc = "..."]) and merges them.
Concerns: The current code is considered "ugly" due to repetitive let Pat(x) = y else { return } checks.
Improved Code with Alternative Approaches:

Here's a refined parse_doc function with alternative approaches for clarity and efficiency:

// Assumes the same imports as before:
// use proc_macro::{Delimiter, TokenTree};
// use proc_macro::TokenTree::{Group, Ident, Literal, Punct};
fn parse_doc(meta: &mut String, prev_is_sharp: &mut bool, x: TokenTree) {
    if *prev_is_sharp { // Handle # @..
        *prev_is_sharp = false;
        match x {
            Group(inner) if inner.delimiter() == Delimiter::Bracket => {
                // Extract `doc = literal` if the structure matches
                let tokens = inner.stream().into_iter().collect::<Vec<_>>();
                if let [Ident(ref ident), Punct(ref eq), Literal(ref lit)] = tokens[..] {
                    if ident.to_string() == "doc" && eq.as_char() == '=' {
                        meta.push_str(&format!("\n#' {lit}"));
                    }
                }
            }
            _ => {} // Ignore non-group tokens
        }
    } else if let Punct(ref ch) = x {
        *prev_is_sharp = ch.as_char() == '#'; // Update flag for next iteration
    }
}

Pattern Matching: Utilizes pattern matching with match to extract relevant token types (group, identifier, punctuation, literal) in a more concise and readable way.

Structured Destructuring: Collects the group's inner stream into a Vec and checks the full shape (doc = literal) with a single slice pattern.

Graceful Fallback: Non-matching tokens are handled by the `_` placeholder arm rather than special-cased.

No Explicit Returns: Shapes that don't match simply fall through the match, avoiding the repeated `else { return }` checks.

Simplified Flag Update: Updates the prev_is_sharp flag directly within the else if block for better readability.
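To make the let-else vs. slice-pattern tradeoff concrete outside a proc-macro crate (where `proc_macro::TokenTree` isn't available), here is a minimal, self-contained sketch using a hypothetical `Tok` enum as a stand-in for `TokenTree`. The one slice pattern replaces three separate `let .. else { return }` lines:

```rust
// Hypothetical stand-in for proc_macro::TokenTree, so the pattern
// can be demonstrated in an ordinary (non-proc-macro) crate.
#[derive(Debug)]
enum Tok {
    Ident(String),
    Punct(char),
    Lit(String),
}

// One `if let` with a slice pattern checks the whole `doc = literal`
// shape at once; any non-matching shape just yields None.
fn extract_doc(tokens: &[Tok]) -> Option<String> {
    if let [Tok::Ident(name), Tok::Punct('='), Tok::Lit(lit)] = tokens {
        if name == "doc" {
            return Some(format!("\n#' {lit}"));
        }
    }
    None
}

fn main() {
    let toks = vec![
        Tok::Ident("doc".into()),
        Tok::Punct('='),
        Tok::Lit("\"hello\"".into()),
    ];
    assert_eq!(extract_doc(&toks).as_deref(), Some("\n#' \"hello\""));
    // A wrong shape needs no explicit early return:
    assert!(extract_doc(&[Tok::Punct('#')]).is_none());
    println!("ok");
}
```

Note the pattern also enforces `len() == 3` for free, which matches the exact shape of `#[doc = "..."]` attributes.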

I hope this solution helps.


I had forgotten about that kind of pattern matching.

Thank you for your suggestion:)

Actually, `.stream()` returns an iterator, which cannot be indexed directly.

But collecting it first with `.collect::<Vec<_>>()` and then matching on `[..]` works.
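For anyone reading along, the collect-then-slice-pattern trick can be shown standalone (with plain integers standing in for `TokenTree`s, since the iterator itself has no `Index` impl but the collected `Vec` does):

```rust
fn main() {
    // An iterator can't be indexed or slice-matched directly...
    let v: Vec<i32> = (1..=3).collect();
    // ...but after collecting, `v[..]` is a slice place that
    // slice patterns apply to directly.
    if let [a, b, c] = v[..] {
        assert_eq!((a, b, c), (1, 2, 3));
    } else {
        panic!("unexpected length");
    }
    println!("ok");
}
```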