Parsing with slice patterns

So I was on day 2 of the Advent of Code and had to parse some strings into a series of tuples and group them together. At one point I had a very basic tokenizer and wanted to parse the tokens in a match expression. After some messing around I ended up using slice patterns, and thought the result looked a lot like EBNF.
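For anyone who hasn't tried this, here's a minimal sketch of the general idea. This is not the library's actual API; the `Token` variants and `parse_pair` function are made up for illustration. The point is that a match arm over a token slice reads almost like a grammar rule:

```rust
enum Token {
    Number(u32),
    Word(String),
    Comma,
    Semicolon,
}

// Parse one "<count> <word>" pair from the front of the token slice,
// returning the pair and the remaining tokens.
fn parse_pair(tokens: &[Token]) -> Option<((u32, &str), &[Token])> {
    match tokens {
        // rule: pair = number word ("," | ";")
        [Token::Number(n), Token::Word(w), Token::Comma | Token::Semicolon, rest @ ..] => {
            Some(((*n, w.as_str()), rest))
        }
        // rule: pair = number word  (no trailing separator)
        [Token::Number(n), Token::Word(w), rest @ ..] => Some(((*n, w.as_str()), rest)),
        _ => None,
    }
}

fn main() {
    let tokens = vec![
        Token::Number(3),
        Token::Word("blue".to_string()),
        Token::Comma,
        Token::Number(4),
        Token::Word("red".to_string()),
    ];

    // Repeatedly peel pairs off the front of the slice.
    let mut rest: &[Token] = &tokens;
    while let Some((pair, tail)) = parse_pair(rest) {
        println!("{:?}", pair); // (3, "blue") then (4, "red")
        rest = tail;
    }
}
```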

So I spent the next day messing around with pattern rules and decided to move it into a library: https://github.com/Cooljawty/aoc23_parser/blob/main/src/lib.rs

You can see how I used it to solve the puzzle for day 2 in the parse_tokens function on line 56: https://github.com/Cooljawty/aoc23_parser/blob/main/src/day2.rs

I doubt I'll touch it again for a while, but my wishlist would be:

  • Custom keywords instead of having them defined in the library
  • More generic tokens. Possibly just having them defined by a regex pattern, but I wanted to see how much I could do with just the standard library
  • Cleaner error handling. I'd rather the closure for matching tokens not return an Option nested in a Result, but that might just be necessary given how I defined the closure

I mainly just wanted to share it, since it's the most complete project I've written in Rust!
I'm also curious whether anyone else has used match expressions in this way?
