Calin said:
Diversity is good for fine-tuning and optimisation. But it's hard for newcomers to learn, and also very demanding and expensive.
This is always the case. Usually the lower-level the construct, the more the language must assume the programmer knows.
At the assembly programming level, the assembler assumes the programmer knows all the risks and is doing everything right.
In languages like C and C++ the compiler can do an awful lot of hand-holding and detect many conditions with diagnostic warnings, but it still ultimately assumes that if the programmer says "do it", the programmer knows what they're saying. Sometimes the costs of the abstractions are high, as when Boolean values occupy whole bool objects rather than single bits, or when bit manipulation must be emulated because the hardware doesn't support it directly. Even so, the language provides some options and usually enough to get the job done.
As you get even more abstract, there are tools and languages that will do the "correct" thing even when it's terrible at the hardware level. A value has a type, and you can run into enormous hidden conversions, as seen in languages like Python or JavaScript. Rust offers lots of protection for memory and expressions, but it comes at an implementation cost. These languages are easy for beginners to approach, but the naive approach can have unexpected performance problems. In all of these languages it's easy to write code that hums along in one scenario, then suddenly falls off a performance cliff with no obvious reason why.
Coming back to C++ and the original topic, that's a tradeoff the language has made. The original language didn't deal with Boolean logic; it was made as a portable way to program at a higher level than machine code. Thus, the original language treated a char as the smallest addressable unit, and if you wanted bit manipulation you could do it yourself. Adding Boolean logic meant additional rules around conversions and promotions, and the decision to keep bool a single addressable object meant requiring a full byte (or more) to hold what could be stored in a single bit. That's potentially wasteful from a memory perspective but good for processing performance, and it ensures correct behavior of Boolean logic. The alternative is a bitfield, which is trivially implemented but requires additional processing work, sometimes significant processing work, to regain that space in the time/space exchange. The compiler still holds your hand and ensures correct behavior, but you're spending processing time to reclaim storage space, a common optimization choice.
Choosing between the two requires more knowledge and is less beginner friendly, but it offers all the options to those who know to take them. The default, easier-to-use approach, a bool data type encoding Boolean logic in a single machine-addressable byte, typically wastes 7 bits per value but processes faster, and it is usually the right approach.