I'm going to ignore the correctness of one's algorithm/implementation. If you get that wrong there is not much that can be done by a language, its compiler or its runtime, as far as I know. If you used a "+" where you meant to use a "-", how is the compiler to know that is wrong? Yes, there are ways to do formal proofs of correctness and such, but I'm going to put such things on another, higher, level of abstraction.
When it comes to correctness in a programming language, let's start with the simple and obvious notion that instructions/expressions/commands etc. in the language should do what they are supposed to do or, if they cannot, produce a meaningful error/failure message. What a language should not do is just crash totally without any indication of the problem. Worse still, it should not silently produce an incorrect result and continue running.
When I write "x = x + 1" then after execution x had better be 1 bigger than it was before. Not some random other number, not some overflow/wrap-around result. If it overflows, execution should halt and say so.
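As an illustration of the behavior I mean, Rust makes the overflow case explicit. A minimal sketch using the standard library's checked arithmetic (a plain `x + 1` would also panic with an overflow message in a debug build rather than silently wrap):

```rust
fn main() {
    let x: u8 = 255;

    // Checked arithmetic: overflow is reported, never silently wrapped.
    match x.checked_add(1) {
        Some(v) => println!("x + 1 = {}", v),
        None => println!("overflow: {} + 1 does not fit in a u8", x),
    }
}
```

The point is that the "cannot do it" case is a value the program must handle, not a silent wrap-around.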
Similarly for all the other arithmetic and logical operators.
When I write "x = a[i]" then after execution x had better be the value of the i'th element of a. If there is no i'th element, x should not be some other random number. The program should not just crash without indicating a meaningful reason why.
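This is exactly how Rust treats indexing: a plain `a[i]` out of bounds halts with a message naming the index and the length, and the library also offers a checked lookup. A small sketch:

```rust
fn main() {
    let a = [10, 20, 30];
    let i = 5;

    // Safe lookup: `get` returns None instead of a garbage value.
    match a.get(i) {
        Some(x) => println!("a[{}] = {}", i, x),
        None => println!("no element {} in an array of length {}", i, a.len()),
    }
}
```

Either way, there is no path where x silently becomes some random number.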
When I dereference some pointer-like thing, x = *p, then after execution x had better be the value of whatever p is referencing, AND that thing had better have been properly defined prior to execution. Again, not some other random number, and not crashing blindly.
Similarly dereferencing NULL pointers and the like had better fail in a meaningful way.
Conversely, when writing through a dereferenced pointer, some defined object in the program had better be written to, not some random location in memory that has no meaning, an action that may crash my program or, worse still, silently corrupt my data.
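Rust's answer to the whole family of pointer problems above is that a reference always points at a live, initialized value, and "maybe absent" is spelled `Option<&T>`, which the compiler forces you to check before use. A minimal sketch (the `find_even` function is my own illustration, not from any particular codebase):

```rust
// There is no null reference to dereference; absence is a value
// of type Option<&T> that must be handled explicitly.
fn find_even(values: &[i32]) -> Option<&i32> {
    values.iter().find(|v| *v % 2 == 0)
}

fn main() {
    let values = [1, 3, 4, 7];

    // The compiler will not let us use the result without first
    // checking the None case, so there is no blind crash possible here.
    match find_even(&values) {
        Some(v) => println!("first even value: {}", v),
        None => println!("no even value found"),
    }
}
```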
I could continue, but you probably get the idea by now. Then of course there are all the issues around memory leaks, resource leaks and sharing data between concurrent processes.
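On that last point, a sketch of how Rust handles sharing data between threads: handing unsynchronized mutable data to two threads simply does not compile, so you are pushed toward the standard `Arc`/`Mutex` types (the counter example here is my own illustration):

```rust
use std::sync::{Arc, Mutex};
use std::thread;

fn main() {
    // Shared mutable state must be behind a lock; the compiler
    // rejects a bare `&mut` handed to more than one thread.
    let counter = Arc::new(Mutex::new(0));

    let handles: Vec<_> = (0..4)
        .map(|_| {
            let counter = Arc::clone(&counter);
            thread::spawn(move || {
                *counter.lock().unwrap() += 1;
            })
        })
        .collect();

    for h in handles {
        h.join().unwrap();
    }

    println!("count = {}", *counter.lock().unwrap()); // count = 4
}
```

The data race is ruled out at compile time rather than discovered as corrupted data at run time.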
Now, as simple and obvious as this idea of correctness is, most programming languages in use today do not implement it, which, if you think about it for a second, is an appalling, intolerable situation.
Languages like Java, C# and such go some way to achieving this correctness, but at a huge cost in performance, timing determinism, memory requirements and other problems.
The first programming language I learned was ALGOL, a compiled language that enforced such correctness. Since then the only languages I know that do so are Ada and now Rust.
The biggest joke of all is C++. A hugely complex and sophisticated language system, with its classes, inheritance, generics and now lambdas, closures, type deduction etc., etc., etc. Developed over decades. Yet at its core it is built on C, with all the same possibilities of incorrectness through and through. It's like having built a gigantic modern tower block on the rickety foundations of my grandfather's shed. Well, it would be a funny joke if it were not so tragically sad and disastrous.
In short, correctness in a programming language is simply: do the correct thing or halt with a meaningful reason why not.
No, I do not accept that documenting all such undefined behaviors in a language specification makes the resulting incorrect execution the programmer's fault and lets the language off the hook. We do our best to check for user errors in the operation of application software; why not view a compiler as an application in the same way and check for user (programmer) errors as best we can?
And this is why I'm here. Having had to deal with many languages over many years, I find Rust is the first one since Ada that tackles the correctness problems whilst still being able to produce small and fast executables the way C/C++ do. I can see this taking off in the embedded systems and IoT world, where correctness, security, performance and size are all important.