When I use the debian:stable-slim Docker image to run my Rust binary, it always fails.
With rust:latest it does run. On a real Debian it runs as well.
Any ideas why that is and how it can be fixed?
Thx
thread 'main' panicked at 'range end index 30 out of range for slice of length 20', /rustc/9eb3afe9ebe9c7d2b84b71002d44f4a0edac95e0/library/alloc/src/vec/mod.rs:2004:36
stack backtrace:
0: rust_begin_unwind
at /rustc/9eb3afe9ebe9c7d2b84b71002d44f4a0edac95e0/library/std/src/panicking.rs:575:5
1: core::panicking::panic_fmt
at /rustc/9eb3afe9ebe9c7d2b84b71002d44f4a0edac95e0/library/core/src/panicking.rs:64:14
2: core::slice::index::slice_end_index_len_fail_rt
at /rustc/9eb3afe9ebe9c7d2b84b71002d44f4a0edac95e0/library/core/src/slice/index.rs:77:5
3: core::slice::index::slice_end_index_len_fail
at /rustc/9eb3afe9ebe9c7d2b84b71002d44f4a0edac95e0/library/core/src/slice/index.rs:69:9
It's hard to tell what's going on. It seems like the backtrace is cut off; the actual source of the error is below what you've posted, closer to the end of the backtrace.
A slice index out of bounds is a very basic error that could happen in any piece of code for lots of reasons. Without the full error message it's impossible to say what is using the slice and why.
This could be anything. It could be a different order of files on disk causing some bad build script to misbehave, or it could be terrible data corruption deep in some binary shipped in the image.
Please post the full error, because without more information it's impossible to tell and it's just needless guesswork.
No, don't use the rustc package! apt will give you a very outdated and practically unusable version of Rust. The Debian package may even miscompile code, because it uses Debian's LLVM, not Rust's LLVM.
Don't try to use Debian's Rust for anything. Get the official, up-to-date Rust version from https://rustup.rs or use the official Docker images with rustup-based Rust.
There shouldn't be, unless your Debian packages are messed up, and you've somehow managed to install Debian's version of cargo without installing Debian's version of rustc.
If you don't use Debian's packages and use the official Rust version instead, it will have everything, and rustup will set up the environment correctly.
You still haven't posted the full error message output, so again I'm just guessing based on no actual information about the error you're seeing.
From the backtrace it looks like it's a runtime issue in your program. The root cause is in r_news_poster_bot::hn_frontpage::HackerNewsFrontpage::get_fp_stories. I can't find the source for it, so I assume it's your private code.
Since it's a runtime issue, it's very unlikely to be affected by the presence of Debian's Rust compiler (unless you're trying to compile Rust code from an already-compiled Rust program, which would be weird).
Look for uses of [] in the get_fp_stories function, and find which one is incorrect.
Try not to use [] at all. Rust has iterators, so e.g. for i in 0..10 { foo[i] } is fragile and slow, while for item in foo.iter().take(10) {} is faster and can't panic.
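Roughly like this, with a made-up stories vector standing in for whatever your code actually builds:

```rust
fn main() {
    // Made-up data; in your case this presumably comes from the parsed frontpage.
    let stories = vec![1, 2, 3]; // fewer items than the code expects

    // Fragile: a hard-coded range like this is the kind of thing that panics with
    // "range end index 30 out of range for slice of length 20" when the vector
    // turns out shorter than expected.
    // let top = &stories[..30];

    // Safe: take() just stops early when there aren't enough items.
    for story in stories.iter().take(30) {
        println!("{story}");
    }
}
```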
If you have to use arbitrary indexing, instead of foo[i] you can use foo.get(i).expect("failed to get foo"), which will still panic but with a less generic error message. Even better, if you make your function return Result<…, Box<dyn Error>> and use foo.get(i).ok_or("missing item")?, the error can be handled by the caller without stopping the whole program.
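A sketch of that second variant, with made-up names (top_story, stories) since you haven't posted the real function:

```rust
use std::error::Error;

// Made-up stand-in for something like get_fp_stories.
fn top_story(stories: &[String]) -> Result<&String, Box<dyn Error>> {
    // get() returns an Option instead of panicking on a bad index,
    // and ok_or() turns the missing case into an error the caller
    // can handle or log instead of crashing.
    let story = stories.get(0).ok_or("frontpage had no stories")?;
    Ok(story)
}

fn main() {
    let stories: Vec<String> = vec![]; // e.g. the page parsed to nothing
    match top_story(&stories) {
        Ok(s) => println!("top story: {s}"),
        Err(e) => eprintln!("error: {e}"), // no panic, the program keeps running
    }
}
```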
If you're fetching something over the network, the difference between images can be caused by how much network access they allow, combined with a lack of error handling:
- To fetch over HTTPS, the image needs to have the ca-certificates package installed.
- It should have a reasonably recent OpenSSL version, and Debian doesn't have reasonably recent versions of anything.
- Docker must allow network connections.
If you're making a plain HTTP request, then anything intercepting the connection can replace the response with something else.
Also check the HTTP status before parsing the page, to be sure you're not trying to read data from an error page. Ensure the page structure is what you expect; your program could also be blocked for being a bot and fed garbage data.
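For example, assuming the fetch uses reqwest's blocking API (a guess, since you haven't shown which HTTP client you use):

```rust
// Assumes reqwest with the "blocking" feature enabled in Cargo.toml (a guess at your setup).
fn fetch_frontpage(url: &str) -> Result<String, Box<dyn std::error::Error>> {
    let response = reqwest::blocking::get(url)?;

    // Turn 4xx/5xx responses into errors instead of handing an error
    // page (or a bot-block page) to the parser.
    let response = response.error_for_status()?;

    let body = response.text()?;

    // Cheap sanity check that the page looks like what the parser expects.
    // "athing" is the class Hacker News uses for story rows; adjust this to
    // whatever your parser actually keys on.
    if !body.contains("athing") {
        return Err("page structure not as expected".into());
    }

    Ok(body)
}
```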