How do you pass a 2D array to a function?


Suppose I have a 2D array like this:

let features = [[1.0], [2.0], [3.0], [4.0]];
let labels = [1.0, 2.0, 3.0, 4.0];

My function signature is:

pub fn batch(alpha: f32, features: &[&[f32]], labels: &[f32]) -> Vec<f32>

If I call my function like this:

gradient_descent::batch(0.001f32, &features, &labels)

I get this error:

11:70 error: mismatched types:
 expected `&[&[f32]]`,
    found `&[[f32; 1]; 4]`
(expected slice,
    found array of 4 elements) [E0308]

How can I pass a fixed size array of arrays as a slice of slices?



In your signature, the `features` parameter expects a reference to a slice of slices. What you are actually passing is a reference to an array of fixed-size arrays.

You may fix this just by listening to what the compiler is telling you :wink:.
This might work (haven’t tried):

pub fn batch(alpha: f32, features: &[[f32; 1]; 4], labels: &[f32]) -> Vec<f32>
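To illustrate, here is a sketch with that fixed-size signature and a made-up body (the real `batch` body isn't shown in the thread, so the computation below is purely illustrative):

```rust
// Sketch: the fixed-size signature compiles because the caller's array
// shape matches the parameter type exactly.
pub fn batch(alpha: f32, features: &[[f32; 1]; 4], labels: &[f32]) -> Vec<f32> {
    // Hypothetical body: scale each row's first element by alpha and
    // subtract the corresponding label, just so there is something to return.
    features
        .iter()
        .zip(labels)
        .map(|(row, label)| alpha * row[0] - label)
        .collect()
}

fn main() {
    let features = [[1.0f32], [2.0], [3.0], [4.0]];
    let labels = [1.0f32, 2.0, 3.0, 4.0];
    println!("{:?}", batch(0.001, &features, &labels));
}
```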



Sorry, I wasn’t very clear about the problem. I could specify the exact size of the input arrays in my function definition, but this would limit all inputs to be of that exact size. I am basically passing an (n x m) matrix where I don’t know n or m at compile time. This is why I wanted to use slices.

Thanks though


Fixed-size arrays are passed by value. You can pass an array of unknown length by reference if you use a slice. To do it the way you are trying to, you’d need to take slices of the inner arrays before you build the outer array of slices, using the slice syntax `&arr[..]`. You’d also have to create a binding for each interior array, because someone must own it.

I think what you really want is a vector of vectors, or, if you don’t want them to be growable, a boxed slice of boxed slices. That way the outer structure owns the interior structures and you don’t have to worry about bindings or lifetimes.


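Something along these lines shows both the borrowed-slices usage and the layered vectors (a sketch with a made-up function body, not tested against your real `batch`):

```rust
fn batch(alpha: f32, features: &[&[f32]], labels: &[f32]) -> Vec<f32> {
    // Hypothetical body, just to have something callable.
    features
        .iter()
        .zip(labels)
        .map(|(row, label)| alpha * row[0] - label)
        .collect()
}

fn main() {
    // Borrowed slices: each inner array needs its own binding so that
    // something owns it while the outer array of slices borrows from it.
    let (r0, r1, r2, r3) = ([1.0f32], [2.0f32], [3.0f32], [4.0f32]);
    let features: [&[f32]; 4] = [&r0[..], &r1[..], &r2[..], &r3[..]];
    let labels = [1.0f32, 2.0, 3.0, 4.0];
    println!("{:?}", batch(0.001, &features, &labels));

    // Layered vectors: the outer Vec owns the inner Vecs, so no extra
    // bindings or lifetime bookkeeping is needed.
    let owned: Vec<Vec<f32>> = vec![vec![1.0], vec![2.0], vec![3.0], vec![4.0]];
    let refs: Vec<&[f32]> = owned.iter().map(|v| &v[..]).collect();
    println!("{:?}", batch(0.001, &refs, &labels));
}
```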
If you create them as fixed-size arrays, you will need to specify the size of the inner array in the function definition.

The reason is that `&[&[f32]]` and `&[[f32; 1]]` are fundamentally different constructs - one is a slice of slice references, the other is a slice of inline arrays. Each element of the slice of references is a fat pointer storing `(pointer, length)`, while each element of the slice of arrays stores the array’s values directly.

You can create an `&[&[f32]]` by using `let features_refs = features.iter().map(|x| &x[..]).collect::<Vec<&[f32]>>();`, but this requires allocating a new `Vec` to store the references in.
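Put together, that conversion looks something like this (the `sum_rows` helper is made up for illustration):

```rust
fn sum_rows(features: &[&[f32]]) -> f32 {
    // Hypothetical helper: sum every cell across all rows.
    features.iter().flat_map(|row| row.iter()).sum()
}

fn main() {
    let features = [[1.0f32], [2.0], [3.0], [4.0]];
    // Allocate a Vec of fat pointers (pointer, length) into the rows.
    let features_refs = features.iter().map(|x| &x[..]).collect::<Vec<&[f32]>>();
    println!("{}", sum_rows(&features_refs));
}
```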


The other, more traditional approach for representing matrices would be as a struct holding a Vec<f32> and the dimensions. Then you’d just need a convenience method to access the elements in either row major or column major format. This is what you’ll want if you’re going to do any serious numerics. It’s something like the 2d analogue of a slice. Of course you could also implement a slice-like type on top of that, if you’ll be wanting to take submatrices.
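A minimal sketch of such a struct (all names here are made up for illustration):

```rust
// Row-major matrix backed by a single contiguous Vec, which keeps the
// data cache-friendly compared to a Vec of Vecs.
struct Matrix {
    data: Vec<f32>,
    rows: usize,
    cols: usize,
}

impl Matrix {
    fn new(rows: usize, cols: usize) -> Matrix {
        Matrix { data: vec![0.0; rows * cols], rows, cols }
    }

    fn get(&self, r: usize, c: usize) -> f32 {
        assert!(r < self.rows && c < self.cols);
        self.data[r * self.cols + c]
    }

    fn set(&mut self, r: usize, c: usize, value: f32) {
        assert!(r < self.rows && c < self.cols);
        self.data[r * self.cols + c] = value;
    }

    // A submatrix/view type could be layered on top in the same way
    // slices are layered on top of arrays; a single row comes for free:
    fn row(&self, r: usize) -> &[f32] {
        &self.data[r * self.cols..(r + 1) * self.cols]
    }
}

fn main() {
    let mut m = Matrix::new(4, 1);
    for r in 0..4 {
        m.set(r, 0, (r + 1) as f32);
    }
    println!("{} {:?}", m.get(2, 0), m.row(3));
}
```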

BTW, this advice is not based on experience with Rust, but rather on experience with languages other than Fortran. You pretty much always have to do the same song and dance if you care about performance, and thus memory locality.


Just to answer the question: it’s possible to use generics to take anything array-ish that contains anything array-ish:

fn batch<Matrix: AsRef<[Row]>, Row: AsRef<[f32]>>(features: Matrix) {
    for row in features.as_ref() {
        for cell in row.as_ref() {
            print!("{} ", cell);
        }
        println!();
    }
}

This basically says that Matrix is required to be representable as a slice of Rows and that Row is required to be representable as a slice of f32, and we don’t care if they are arrays, slices, Vecs or whatever.
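To see it accept different container types, here’s a variant of the same idea (hypothetical name `total`, summing instead of printing so there’s a value to check):

```rust
fn total<Matrix: AsRef<[Row]>, Row: AsRef<[f32]>>(features: Matrix) -> f32 {
    // Sum every cell, accepting anything slice-like at both levels.
    features
        .as_ref()
        .iter()
        .flat_map(|row| row.as_ref().iter())
        .sum()
}

fn main() {
    let fixed = [[1.0f32], [2.0], [3.0], [4.0]];               // array of arrays
    let vecs = vec![vec![1.0f32, 2.0], vec![3.0, 4.0]];        // Vec of Vecs
    println!("{} {}", total(fixed), total(vecs));
}
```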

@droundy is of course right when it comes to how a matrix is most efficiently represented. That’s how you avoid unnecessary indirections and fragmentation.