Choose FFI signature at runtime

Nowhere did I assert that kind of thing. Which part do you think says that? I don't even think I or anyone else in this thread proposed anything that doesn't work in practice. Even the dubious hacks enumerated above tend to "work" as "expected". My claim is exactly on the contrary: even the fact that a program apparently behaves as expected doesn't constitute proof of its correctness.

Consider what famously happened to OpenSSL a couple of years ago. It was the de facto cryptography library. Surely its authors must be right! Alas, as it turns out, they weren't. The OpenSSL code was full of bad practice and UB, one notable instance being uninitialized variables used to gather entropy and "random" values off the stack. It worked for a long time, until one day GCC got smarter and started exploiting said UB for some sort of optimization, with drastic consequences with respect to the correctness of OpenSSL.

This is the prototypical example of programming against a naïve mental model instead of staying within the explicitly allowed boundaries.

And, by the way, if you ask me: if I program against a standard, and the code doesn't work correctly, that is most definitely not my fault. I don't appreciate how this is relevant to the discussion at all, because again, we are arguing about the converse, but it's true nevertheless. This is not to say I won't try to work around a bug in a buggy compiler, for example, but starting with the assumption that compilers are so buggy that we might as well not obey any rules and write anything we imagine to be "correct" is a bizarre and nonsensically nihilist point of view.

That is grossly misrepresenting my claim, too. I don't recommend using obscure features. On the contrary, I recommend writing clear code, and simultaneously staying within what is defined to be correct. These two don't contradict each other. If anything, it is your recommendation of dynamically changing prototypes that could be considered somewhat obscure.

The claim that it's guaranteed by practice/experience is incorrect. While dlopen is indeed not in the ISO C standard, there absolutely is a standard that is of similar importance to C programmers: POSIX. The POSIX C library does define dlopen and the conversion between void * and function pointers. That is, for POSIX-conformant implementations, it is guaranteed, although not by the core language, but by a broader standard.

That is definitely not comparable to J. Random Person claiming that "compilers don't care about abused feature X and it will be just fine, trust me".

Care to show me a link? Because the only case that looks even slightly similar to what you describe was not a case of “GCC becoming smarter”, but rather of someone who had no idea what they were doing getting bolder.

It happens. And no, coding according to the standard wouldn't save you: compilers tend to emit warnings for fully standard-compliant code, too.

Give me a link and we can continue from there.

This topic was automatically closed 90 days after the last reply. We invite you to open a new topic if you have further questions or comments.