Why does Rust have no tilde (~) operator as in C?

In C, ~ is used for bitwise NOT and ! for logical NOT.
Rust uses ! for both bitwise NOT and logical NOT.

printf("%d", !2);  // C: prints 0
printf("%d", ~2);  // C: prints -3
println!("{}", !2);  // Rust: prints -3
println!("{}", ~2);  // Rust: Does not compile

Why is that?

The only type for which logical negation makes sense is bool, and for a single bit, logical negation is the same as bitwise negation. So there's no need for two separate symbols.

C abuses "logical" negation for integers with an implicit meaning of "different from zero". That's more clearly expressed in any language as value != 0.
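To make that concrete, here is a small sketch showing that Rust's single ! covers both cases, and how C's integer "logical not" is spelled explicitly with a comparison:

```rust
fn main() {
    // On bool, ! is logical negation.
    assert_eq!(!true, false);

    // On integers, ! is bitwise negation: !2 == -3 in two's complement.
    assert_eq!(!2i32, -3);
    assert_eq!(!0u8, 0xFF);

    // C's `!value` ("is it zero?") is written explicitly in Rust:
    let value = 2;
    assert_eq!(value == 0, false); // C: !value
    assert_eq!(value != 0, true);  // C: !!value
}
```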

22 Likes

Early in the language, ~ was used as an operator for heap allocation, but that was removed well before Rust 1.0.

2 Likes

To add to this, the reason C "needs" separate logical negation (!) and bitwise negation (~) is that it didn't[1] have a boolean type. Before C99, conventionally BOOL was a macro (or typedef) for int, FALSE for 0, and TRUE for 1. (For legacy/backcompat reasons[2], you'll often still see these used.) The controlling expression of an if or while is then tested by comparing it against zero.

So without a boolean type, if you want boolean negation, you need an operator that does boolean negation to any integral type. C !a is defined to have type int with value 0 if a compares unequal to literal zero and 1 if a compares equal to literal zero. C's usual arithmetic conversions paper over the gaps.
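That definition of C's !a can be sketched in Rust (the function name c_logical_not is mine, purely illustrative):

```rust
// C's `!a`: an int that is 1 if `a` compares equal to zero, else 0.
fn c_logical_not(a: i32) -> i32 {
    (a == 0) as i32
}

fn main() {
    assert_eq!(c_logical_not(2), 0); // C: !2 == 0
    assert_eq!(c_logical_not(0), 1); // C: !0 == 1
    // C's `!!` idiom "normalizes" any nonzero value to 1:
    assert_eq!(c_logical_not(c_logical_not(5)), 1);
}
```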

Rust doesn't do that. Rust makes you ask for (most[3]) coercions instead of making them implicit. Thus there's only one "not" operator needed. If it weren't for the lazy/short-circuiting behavior of logical &&/|| in a procedural language with mutation, the same unification could've also applied to "and"/"or".
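Indeed, Rust already unifies the non-lazy variants: & and | work on bool just as they do on integers, and && and || stay separate only because they short-circuit. A small sketch (side_effect is an illustrative name):

```rust
fn side_effect() -> bool {
    println!("evaluated");
    true
}

fn main() {
    // One `!` for both worlds:
    assert_eq!(!false, true);                      // logical
    assert_eq!(!0b0000_1010u8, 0b1111_0101u8);     // bitwise

    // `&` and `|` are single operators over bool and integers alike:
    assert_eq!(true & false, false);
    assert_eq!(0b1100u8 & 0b1010u8, 0b1000u8);

    // `&&` short-circuits: the right side never runs here.
    let _ = false && side_effect(); // prints nothing
    // Non-lazy `&` evaluates both sides:
    let _ = false & side_effect();  // prints "evaluated"
}
```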


  1. C99's _Bool is an actual boolean type, with <stdbool.h> providing bool/true/false macros, and C23 makes bool/true/false into keywords in their own right. ↩ī¸Ž

  2. And being very pedantic, int is a more guaranteed type to pass over ABI boundaries than _Bool is, depending on just how paranoid about minimally defined calling conventions you want to be — int is historically just "a register" for the purpose of an assembly-level call, whereas _Bool has to additionally define how a single bit is converted to asm-level parameters. But any halfway reasonable "stable" calling convention in the modern era will define this. ↩ī¸Ž

  3. As with all practical systems, there are exceptions. Ask your compiler today if deref/reborrow coercions, unsizing coercions, and lifetime variance are right for you. ↩ī¸Ž

9 Likes