Borrow checker error E0515, goes away when inlining

Hi there! I'm writing a parser for a peculiar subset of TeX/markdown. My initial version used owned strings all over the place. I'm trying to rewrite it to work on slices of the input string, but I'm running into borrow checker errors. All the previous errors
I could understand and deal with, but this one leaves me puzzled. Among other things, it disappears when I inline a particular function, so it doesn't seem to me that my design is inherently broken (or so).
I'd be grateful if anybody had an idea how to fix this :slight_smile:

I tried this code
A reduced code example can be found on the playground, but I've also put it below.

/// `'input` is the lifetime of the string we are scanning.
pub(crate) enum Lexeme<'input> {
    String(&'input str),
    Backslash,
    // MANY other variants omitted here
}

impl<'a> Into<&'a str> for &'a Lexeme<'a> {
    fn into(self) -> &'a str {
        match self {
            Lexeme::Backslash => "\\",
            Lexeme::String(s) => s,
        }
    }
}
/// `'input` is the lifetime of the string we are tokenising.
enum Token<'input> {
    /// Represents a `TeX` command in my actual program (slightly simplified).
    Command(&'input str),
    /// A string which is not a `TeX` command name.
    FreeString(&'input str),
}

/// Recognise `TeX` commands (reduced toy version).
fn recognise_commands(input: Vec<Lexeme>) -> Vec<Token> {
    let mut result = Vec::new();
    let mut iter = input.iter();
    while let Some(pt) = iter.next() {
        match pt {
            // If we have a backslash, look ahead to see if we have a command.
            Lexeme::Backslash => {
                let token = match iter.as_slice().get(0) {
                    Some(Lexeme::String(s)) => Token::Command(s),
                    Some(other) => {
                        // Otherwise, a non-letter ends the current command.
                        // Inlining the into() implementation above works (but yields duplication).
                        Token::FreeString(other.into())
                    }
                    // This backslash is the last character in the input. Convert it to a string.
                    None => Token::FreeString("\\"),
                };
                result.push(token);
            }
            // This string is not part of a command.
            Lexeme::String(s) => result.push(Token::FreeString(s)),
        }
    }
    result
}

fn main() {}

I expected the code to compile without errors.

Instead, compiling this (with both nightly and stable Rust) yields an E0515 error.

Full output
   Compiling playground v0.0.1 (/playground)
error[E0515]: cannot return value referencing function parameter `input`
  --> src/
28 |     let mut iter = input.iter();
   |                    ----- `input` is borrowed here
50 |     result
   |     ^^^^^^ returns a value referencing data owned by the current function

error: aborting due to previous error

For more information about this error, try `rustc --explain E0515`.
error: could not compile `playground`

To learn more, run the command again with --verbose.

Closing thoughts
If I inline the call to into() into the match, the error goes away! However, this

  • would actually duplicate code. For some other parts of my code (not shown in my reduced example), I need to implement Display on Lexeme, and the into() method is just what I want for that purpose.
  • would be a layering violation. (In my actual code, Lexemes and Tokens are in separate modules, and displaying Lexemes should belong into that module.)
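To illustrate the Display point, here is a rough sketch of how that delegation looks with the reduced Lexeme from above (a toy version, not my real code, which is more involved):

```rust
use std::fmt;

// Reduced Lexeme and the Into impl from the example above.
enum Lexeme<'input> {
    String(&'input str),
    Backslash,
}

impl<'a> Into<&'a str> for &'a Lexeme<'a> {
    fn into(self) -> &'a str {
        match self {
            Lexeme::Backslash => "\\",
            Lexeme::String(s) => s,
        }
    }
}

// Display just delegates to the conversion, so the match is written once.
impl fmt::Display for Lexeme<'_> {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        f.write_str(self.into())
    }
}

fn main() {
    assert_eq!(Lexeme::Backslash.to_string(), "\\");
    assert_eq!(Lexeme::String("abc").to_string(), "abc");
}
```

(Display compiles fine even with the restrictive impl, because the formatter only needs the string for the duration of the call; the trouble only shows up when the converted slice has to outlive the Lexeme.)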

To pre-empt some suggestions, I cannot easily replace the match by a call to map: recognising a command consumes up to two additional tokens; I don't know how to model this with a simple map().

Oh, and blindly changing iter() to into_iter() didn't make things better.

Any thoughts or suggestions how to deal with this best? Thanks for reading!

This is too restrictive

impl<'a> Into<&'a str> for &'a Lexeme<'a>

Try this instead

impl<'a> Into<&'a str> for &Lexeme<'a>

Otherwise Rust thinks that you are borrowing from the Lexeme, when you are really copying out of it.


That worked like a charm, thanks so much!
