Fighting the Borrow Checker

I'm running into a curious problem with this code. The compiler insists that there is an active borrow where, as best I can tell, there should not be one. In each case, the next call after the supposed borrow is to a function that takes a mutable borrow of self, so things aren't compiling. There are several different instances of this occurring, but here's the archetypal example:

fn key_down_event(&mut self, ctx: &mut Context, keycode: KeyCode, _keymod: KeyMods, _repeat: bool) {
    let stack = self.read_stack(0);
    self.display(stack, true);
    // …
}

And the structs, enums, and functions that are involved with the above code look like this:

fn read_stack(&self, key: usize) -> Output {
    let mut stacks = vec![];
    for player in &self.players {
        if !player.has_busted() {
            // …
        }
    }
    for player in &self.players {
        if player.has_busted() {
            // …
        }
    }
    if key >= stacks.len() {
        // …
    } else {
        // …
    }
}

pub struct Player {
    name: String,
    hand: Vec<Card>,
    folded: bool,
    stack: i32,
    to_call: i32,
    to_play: bool,
    busted: bool,
}

impl Player {
    pub fn print_stack(&self) -> Output {
        Output::FormattedStack(self.name.clone(), self.stack, self.folded, self.busted)
    }
    // …
}

pub enum Output<'a> {
    FormattedStack(String, i32, bool, bool),
    // …
}

And the errors I'm getting look something like this:

error[E0502]: cannot borrow `*self` as mutable because it is also borrowed as immutable
--> src\
665 | let stack = self.read_stack(0);
| ------------------ immutable borrow occurs here
666 | self.display(stack, true);
| ^^^^^-------^^^^^^^^^^^^^
| | |
| | immutable borrow later used by call
| mutable borrow occurs here

I'm not seeing where there should be a borrow that holds over until the second line of code here. The Output variant that read_stack returns contains only owned types; there are no references to self or any of its fields that are returned. If anyone could shed some light on this, I would greatly appreciate it.

read_stack(&self, key: usize) -> Output is sugar for read_stack<'a>(&'a self, key: usize) -> Output<'a>

This is less confusing if you turn on the elided_lifetimes_in_paths lint, which flags such hidden occurrences of elided lifetimes.
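For reference, a minimal sketch (hypothetical names, not the poster's actual types) of what that lint surfaces:

```rust
#![warn(elided_lifetimes_in_paths)]

// A hypothetical enum with a borrowing variant, so it carries a lifetime.
pub enum Output<'a> {
    Borrowed(&'a str),
}

struct S;

impl S {
    // With the lint enabled, rustc warns on this signature ("hidden
    // lifetime parameters in types are deprecated") and suggests
    // writing `Output<'_>` so the elided lifetime becomes visible.
    fn get(&self) -> Output {
        Output::Borrowed("hi")
    }
}

fn main() {
    let s = S;
    if let Output::Borrowed(text) = s.get() {
        println!("{text}");
    }
}
```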

Also, you could put more effort into properly formatting your question :wink: Formatted error messages, and indented code would help a lot.


I understand where the lifetimes are elided here; what I'm not understanding is how that is convincing the compiler that self is still borrowed after read_stack returns.

Oh, well, that’s easy. As long as the Output<'a> value exists, the &'a self borrow is kept alive. After all, as far as the signature of read_stack is concerned, the type Output<'a> could be holding a reference borrowed from &'a self.

Apparently display takes &mut self, if I understand the error message correctly, and that’s the conflict: the &mut self and the &self-based Output<'_> can’t coexist.
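A minimal sketch (again with hypothetical names, not the actual game types) of why the compiler has to be conservative: a method with the same elided signature really can return a borrow of self, so the &self borrow must stay alive for as long as the Output<'a> does.

```rust
#[allow(dead_code)]
pub enum Output<'a> {
    Owned(String),
    Name(&'a str), // this borrowing variant is why the lifetime exists
}

struct Player {
    name: String,
}

impl Player {
    // Elided form: fn name_output(&self) -> Output
    // Desugared:   fn name_output<'a>(&'a self) -> Output<'a>
    fn name_output(&self) -> Output<'_> {
        Output::Name(&self.name) // genuinely borrows from self
    }

    fn rename(&mut self, new: String) {
        self.name = new;
    }
}

fn main() {
    let mut p = Player { name: String::from("alice") };
    let out = p.name_output();
    // p.rename(String::from("bob")); // error[E0502]: cannot borrow `p`
    //                                // as mutable while `out` is alive
    if let Output::Name(n) = out {
        println!("{n}");
    }
    p.rename(String::from("bob")); // fine once `out` is no longer used
}
```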


Ah. So redefining Output to not require a lifetime specifier should fix this? I have one variant in Output right now that takes an &str, which I could just turn into a String instead with no real headache.

That sounds like a reasonable approach to solve the problem, yes. [1]

  1. Alternatively, if there is a strong reason for keeping it a &str (and that &str is borrowed from some run-time values, i.e. not just a string literal), then you could also change the signatures of methods that don’t produce borrowing variants to be e.g. print_stack(&self) -> Output<'static> and read_stack(&self, key: usize) -> Output<'static> or something like that. ↩︎

Alternatively, if you find that variant useful but it's not actually being used in this case (i.e. read_stack never actually returns an Output<'_> borrowing something from &self), you could

-fn read_stack(&self, key: usize) -> Output<'_> {
+fn read_stack(&self, key: usize) -> Output<'static> {
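Putting the 'static suggestion together, here is a compiling sketch (hypothetical names once more): because Output<'static> promises the return value borrows nothing from self, the immutable borrow ends immediately and the following &mut self call is allowed.

```rust
#[allow(dead_code)]
pub enum Output<'a> {
    FormattedStack(String, i32, bool, bool),
    Borrowed(&'a str),
}

struct Game {
    stack: i32,
}

impl Game {
    // `Output<'static>` tells the caller this value cannot be
    // borrowing from `self`, so the `&self` borrow ends right away.
    fn read_stack(&self, _key: usize) -> Output<'static> {
        Output::FormattedStack(String::from("p1"), self.stack, false, false)
    }

    fn display(&mut self, out: Output<'_>, _pretty: bool) {
        if let Output::FormattedStack(name, stack, ..) = out {
            println!("{name}: {stack}");
        }
    }
}

fn main() {
    let mut g = Game { stack: 100 };
    let out = g.read_stack(0); // with `Output<'_>` this would pin `&g`
    g.display(out, true);      // ok: no outstanding immutable borrow
}
```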