What is the correct parallelization approach if you do not want to allocate all parallelized items?

If you run the program under a debugger, it will tell you on which line the stack overflow occurred. For instance, you could try gdb.
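For example (a generic session sketch; `my_app` stands in for your actual binary, built with debug info):

```
$ rust-gdb target/debug/my_app
(gdb) run
# ... once it crashes:
(gdb) backtrace
```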


I'll try that :slight_smile:

Yay! I tried running the code in VSCode instead of IntelliJ, because VSCode supports a debugger... Merely running the app in VSCode already provided more information than IntelliJ did (sad). I finally got to see a message printed to the console:

Rayon: detected unexpected panic; aborting

However, there is no actual panic message. Even with the debugger I can't find the underlying panic message... Where could I find that?

As it turns out, when the task spawned via `thread_pool.spawn` panics, the whole process is aborted. Adding a panic handler to the thread pool prevents the abort, but of course this does not solve the problem. The synchronous version of the algorithm never panics, so the reason for the panic must somehow lie in the sending or receiving of the blocks.
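For anyone finding this later, here is a minimal sketch (not my actual code) of installing such a handler via rayon's `ThreadPoolBuilder::panic_handler`; the handler receives the panic payload, which is how the hidden message can be recovered:

```rust
use rayon::ThreadPoolBuilder;

fn main() {
    let pool = ThreadPoolBuilder::new()
        // Without this handler, a panic in a spawned task aborts the
        // process ("Rayon: detected unexpected panic; aborting").
        .panic_handler(|payload| {
            // panic!("...") payloads carry a &str or a String.
            if let Some(msg) = payload.downcast_ref::<&str>() {
                eprintln!("task panicked: {msg}");
            } else if let Some(msg) = payload.downcast_ref::<String>() {
                eprintln!("task panicked: {msg}");
            } else {
                eprintln!("task panicked with a non-string payload");
            }
        })
        .build()
        .unwrap();

    pool.spawn(|| panic!("boom"));

    // Give the detached task a moment to run; only needed in this demo.
    std::thread::sleep(std::time::Duration::from_millis(100));
}
```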

Edit: Through `println!` debugging I am now confident that sending fails and that unwrapping the resulting error is what panics... Yes! If any error occurs, the receiver is dropped, and the thread can no longer send the decompressed block. But the synchronous version did not produce any errors, so there should be no errors... right? (talking to myself)
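That mechanism can be reproduced in isolation with a plain `std::sync::mpsc` channel (my real code sends blocks from a rayon task, but the failure mode is the same):

```rust
use std::sync::mpsc;
use std::thread;

fn main() {
    let (tx, rx) = mpsc::channel::<u32>();

    // Simulate the consumer bailing out early because some operation
    // failed: dropping the receiver closes the channel.
    drop(rx);

    let worker = thread::spawn(move || {
        // send() now returns Err(SendError(42)), so unwrapping it
        // panics, exactly like the panic inside the spawned task.
        tx.send(42).unwrap();
    });

    assert!(worker.join().is_err()); // the worker thread panicked
}
```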

Edit: The test actually requires some operation to fail, so an error is expected even though the test itself does not fail. That was my erroneous assumption: I assumed no errors should ever happen. Now that I know this, the presence of an error is no longer a problem, and the code can be fixed by simply not panicking when sending fails.
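In sketch form (`send_block` and `Block` are made-up names, not my actual API), the fix boils down to:

```rust
use std::sync::mpsc::Sender;

type Block = Vec<u8>;

fn send_block(tx: &Sender<Block>, block: Block) {
    // If the receiver was dropped (the consumer hit an error and
    // stopped), nobody will use this block anyway. Instead of
    // unwrapping, which aborted the whole rayon pool, discard the
    // result and let the task finish quietly.
    let _ = tx.send(block);
}
```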

After talking to my rubber duck, we decided that your suggestion is pretty and works perfectly for our use case, alice. Thank you for your time <3


:duck:

