Cannot infer an appropriate lifetime due to conflicting requirements

I have run into yet another lifetime-inference problem in Rust ...

The following code comes from a small playground code base for a neural network.
I mainly use the crate "ndarray" for the matrix work.
(The NeuralLayer trait, the ActivationFn type, the output_as_slice helper, and the activation_fn value used below are elided for brevity.)

struct Layer32 {
    weights: Array<f32, (Ix, Ix)>,
    outputs: Array<f32, Ix>
}

impl NeuralLayer for Layer32 {
    type Elem = f32;

    fn feed_forward<'a>(
        &'a mut self,
        input: &[Self::Elem],
        activation_fn: &ActivationFn<Self::Elem>
    )
        -> &'a [Self::Elem] // annotated lifetime!
    {
        // unimportant impl details ...
        self.output_as_slice()
    }
}

pub struct ConvolutionalNet32 {
    layers: Vec<Layer32>
}

impl NeuralNet for ConvolutionalNet32
{
    type Elem = f32;

    fn feed<'a, 'b>(&'a mut self, input: &'b [Self::Elem]) -> &'a [Self::Elem] {
        {
            // inner scope: the mutable borrow taken by iter_mut must end
            // before self.layers can be borrowed again below
            let mut out = input;
            for layer in self.layers.iter_mut() {
                out = layer.feed_forward(out, &activation_fn);
            }
        }
        self.layers.last().unwrap().output_as_slice()
    }
}

The above code works just fine.
However, I didn't like the imperative style of ConvolutionalNet32::feed, so I experimented with converting it into a (hopefully) equivalent functional version that uses a single fold.

impl NeuralNet for ConvolutionalNet32
{
    type Elem = f32;

    fn feed<'a, 'b>(&'a mut self, input: &'b [Self::Elem]) -> &'a [Self::Elem] {
        self.layers.iter_mut().fold(input, |out, layer| layer.feed_forward(out, &activation_fn))
    }
}

The compiler rejects this version with the following error:

src/prophet/convolutional_net.rs:130:26: 130:30 error: cannot infer an appropriate lifetime due to conflicting requirements [E0495]
src/prophet/convolutional_net.rs:130         self.layers.iter_mut().fold(input, |out, layer| layer.feed_forward(out, &activation_fn))
                                                                    ^~~~
src/prophet/convolutional_net.rs:129:2: 138:3 help: consider using an explicit lifetime parameter as shown: fn feed<'a>(&'a mut self, input: &'a [Self::Elem]) -> &'a [Self::Elem]
src/prophet/convolutional_net.rs:129     fn feed<'a, 'b>(&'a mut self, input: &'b [Self::Elem]) -> &'a [Self::Elem] {
                                         ^

Is there a way to fix this, or to help the compiler accept the more functional approach?

I think the issue here is that fold returns a value of the same type as its initial accumulator. In your case the initial value has type &'b [Self::Elem], but the signature of feed requires the return type to be &'a [Self::Elem], where 'a is the lifetime of the borrow of self.

You have two options. You could change the signature of feed to require 'b: 'a (which I assume you cannot do if it is part of a library trait). Alternatively, you could keep a Vec on your struct, which would live as long as the struct itself and thus satisfy the 'a requirement, and use that as the fold's initial value; you would have to copy the input into it first. Hope this makes sense.
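
For the second option, here is a minimal sketch of the buffer idea. The field name input_buffer is made up, and activation_fn is assumed to be in scope, as in the snippets above:

pub struct ConvolutionalNet32 {
    layers: Vec<Layer32>,
    input_buffer: Vec<f32> // scratch space that lives as long as the net
}

impl NeuralNet for ConvolutionalNet32
{
    type Elem = f32;

    fn feed<'a, 'b>(&'a mut self, input: &'b [Self::Elem]) -> &'a [Self::Elem] {
        // the one extra copy: move the input into storage owned by self
        self.input_buffer.clear();
        self.input_buffer.extend_from_slice(input);
        // split the borrow so the accumulator (borrowed from input_buffer
        // for 'a) and the mutable iteration over layers do not conflict
        let ConvolutionalNet32 { ref mut layers, ref input_buffer } = *self;
        layers.iter_mut()
              .fold(&input_buffer[..], |out, layer| layer.feed_forward(out, &activation_fn))
    }
}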

Thank you for your answer!

I am the author of the trait, so I was able to redefine it with your recommended lifetime signature.
However, I have yet to find out whether this is a good idea. :smiley:

Since performance is a high priority for this code, I do not want to introduce unnecessary copies. :S

This version works:
pub trait NeuralNet {
    type Elem: Float;

    fn feed<'a, 'b: 'a>(&'a mut self, input: &'b [Self::Elem]) -> &'a [Self::Elem];
}


impl NeuralNet for ConvolutionalNet32
{
    type Elem = f32;

    fn feed<'a, 'b: 'a>(&'a mut self, input: &'b [Self::Elem]) -> &'a [Self::Elem] {
        self.layers.iter_mut().fold(input, |out, layer| layer.feed_forward(out, &activation_fn))
    }
}

I think this is too restrictive to be practical.
If I understand it correctly, it means every input that you feed to the neural net must outlive the mutable borrow of the net, and with it the returned slice. In other words, you have to keep each input alive for as long as the net's output is in use...
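
For example, a minimal sketch of what the 'b: 'a bound rules out (net is some already-constructed net; the construction is elided):

let out = {
    let input = vec![0.5_f32; 4]; // hypothetical short-lived input
    net.feed(&input)
}; // error: input does not live long enough
// out borrows the net for 'a, and 'b: 'a forces the borrow of input
// to last at least as long, i.e. past the end of the block
println!("{:?}", out);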

It's probably much more practical to invert the relationship:

pub trait NeuralNet {
    type Elem: Float;

    fn feed<'b, 'a: 'b>(&'a mut self, input: &'b [Self::Elem]) -> &'b [Self::Elem];
}

which means you only have to keep each input alive for as long as you need the return value.
The borrow of the neural net also has to outlive the input borrow, but that is probably fine.
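
For example (a hypothetical usage sketch; classify and the input values are made up), a short-lived input is now fine, as long as the output is copied out while the input is still alive:

fn classify(net: &mut ConvolutionalNet32) -> Vec<f32> {
    let input = vec![0.5_f32; 4]; // lives only inside this function
    // the returned slice is tied to input (lifetime 'b), so copy it
    // out before input is dropped
    net.feed(&input).to_vec()
}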

Thank you for your comment!
I made the changes and it works perfectly fine. =D

Seems like I should re-read the Rust book's chapter on lifetimes; they still confuse me a lot in situations like these ...