The problem is that C++ does not have a standard ABI; each compiler implements its own, with varying degrees of stability promises. So you need the C++ code to expose a C ABI if you want this to be foolproof.
Yeah. The reason everyone does FFI through a C ABI is not that C is awesome – it just has a stable and reasonably simple ABI to work with.
To some extent, this can be solved if we tightly integrate with a C++ compiler. For example, we could have the Clang frontend parse the headers and then query it about the precise layout of the data structures it is going to generate.
I’ve posted some ideas here the other day. If you think of something else, please add to that discussion.
Personally I don’t see any other ways for now. And template->generic transformation isn’t possible in the general case, because, as you know, templates are duck-typed and can be specialized in multiple ways: not only via template arguments, but also via SFINAE.
Sorry for resurrecting this thread, but I just started reading https://github.com/isocpp/CppCoreGuidelines/blob/master/CppCoreGuidelines.md after finishing this tutorial. I am amazed by how much overlap there is between Rust concepts and the guidelines put forward there. I would say that if you follow these guidelines, you should end up with safe C++ code. The problem, of course, is that the compiler doesn’t enforce them; you have to use an additional tool, which can only emit warnings.
That said, as a young programmer (born around the turn of the century), I am sympathetic to C/C++. Imagine yourself in 1972. Imagine how awesome it must have been to be able to write an OS in a high-level language. Even if we now think C is the worst language ever because of memory and security bugs, it must have been the thing back then. Remember that most computers weren’t networked, so the occasional system crash wasn’t as bad, and security vulnerabilities weren’t even on the radar. And I bet all those OSes and other low-level programs written in assembly before that crashed even more.
Then came C++ in 1985. I don’t know about you, but I don’t want to write a modern browser or OS in C. I like the higher level of abstraction that C++ provides. Still, it was not memory safe, and security bugs were easy to introduce. Yet again, this was before the Internet was widespread, so I believe security vulnerabilities weren’t as much of an issue as they are today. C++ is also most definitely safer than C.
Then came the 90s and Java with its garbage collector. It was wonderful if you could afford the performance hit, but if you needed speed, you still had to use C++ or, if even that wasn’t fast enough, C. Thus C++ carved out a niche for itself. This was also the time of the advent of the Internet, and security bugs suddenly became much more of an issue. But you needed fast code for your OS, so you still programmed in C or C++.
In the first decade of the 21st century, multithreading became a thing. C and C++ (and all the other languages) had to adapt, but they botched it. Race conditions were discovered, I believe, only after languages added support for multithreading. By this time, the computing landscape had changed dramatically from 1972, when C first came out; there is no way the designers could have foreseen all that. The result was a language with more and more pitfalls, because it wasn’t designed with the new developments in mind. Clearly a new direction was needed. The problem was that if you needed fast software, there was no “new direction” for you to take: C and C++ were still the only games in town.
Enter Rust: a new language developed with the hindsight of all the failures of C and C++ in the face of new technology and, more crucially, with knowledge of what those new technologies are. The development of a new language was almost unavoidable. The need arose to address the nightmares of use-after-free, double free, buffer overreads, buffer overflows, dangling pointers/references, null-pointer dereferences, use of uninitialized data, data races, and so on.
Do I say that C++ is a bad language? No, I think it was very good for its time. Do I believe it is past its prime? Maybe, but it isn’t dead yet. Do I believe that Rust could and should replace it as the language in which new high-performance code is written? Absolutely! But maybe the most important lesson we should learn from this anecdote is that while we believe Rust will still be used 40 years from now, we shouldn’t be so shortsighted as to close our eyes to, and turn our backs on, new languages developed to address problems arising from new technology we cannot foresee. We should realize that Rust will not necessarily live forever, just as C and C++ will not live forever. New hardware and tech will mandate new languages. May we as a community be open-minded enough to recognize those new languages when they start to make their appearance, and not cling to our beloved Rust the way C and C++ programmers sadly do today.
PS sorry for the long rant.
An interesting rant. Thanks for posting it.
As a not-so-young programmer who wrote his first program in 1963, I have one minor correction. We knew about race conditions much earlier than the 2000s. In the mid-1980s, I managed to generate plenty of them when using various dialects of parallel FORTRAN.
Would it be correct to say that race conditions weren’t as much of an issue back then because parallel programming was a very niche thing?
Race conditions were a problem for anybody trying to take advantage of shared memory multiprocessors. My work was in scientific programming, but people working with commercial software likely faced similar problems.