
OMG C++ sucks!

Started by September 20, 2010 07:38 PM
49 comments, last by Way Walker 14 years, 1 month ago
Quote: Original post by MrCoolsman
Third, if your code is taking a while to compile, turn off optimizations and debugging support.


This is a total non-option for most production projects in the game industry. You either need debugging support for finding code issues, or you need optimizations so you can actually play the damn game at a reasonable framerate.

Quote: Fourth, template error hell. Yep, that's the price you pay for a Turing-complete generics system. Of course, it didn't need to be that way. Would you have preferred generics the same way C# and Java do it? Or have the power to do the things that Boost does all the time? Everything's a choice.


I'm with incin on this one - Lisp-style macros/compile-time metaprogramming is a far superior option. Granted, C++ was created when that kind of hackery wasn't reasonable for resource and performance reasons, but that's no reason not to explore alternatives.

This is in fact one of the major goals of Epoch.


Quote: Fifth, an import system. Personally, there isn't much difference between "#include <somelib.h>" and "import somelib;". You still have to have the path set correct, which is where most of the pain comes from.


I strongly disagree.

In my experience, much of the pain comes from the fact that #includes are essentially mechanical versions of copy/pasting code between files. This can lead to all kinds of subtle problems if you do something naughty in a header, like use a using declaration.

The other big source of pain in C++ is the inability to use a class without having its definition visible in each translation unit. If we had a real module system, we could write code once instead of splitting between a definition in a header and a redundant copy of the interface in the .cpp implementation file. This kind of thing made sense in the 1970's when memory was scarce and holding every module of a large program in RAM during compilation was a non-starter. In 2010, it's fucking stupid.
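
To make the "naughty header" point concrete, here's a contrived sketch (the file names and the counter-example are made up, not from the original post) of a using directive leaking out of a header:

// naughty.h -- hypothetical header doing something naughty
#pragma once
#include <algorithm>
using namespace std;   // textually pasted into every file that #includes this header

// client.cpp -- never asked for namespace std, but gets it anyway
#include "naughty.h"

int count = 0;                       // fine on its own...
int bump() { return count + 1; }     // error: reference to 'count' is ambiguous
                                     // (::count vs std::count from <algorithm>)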


Quote: Sixth, intellisense. Yup, it sucks. Nothing I can say about that.


We had a big discussion a while back on why exactly this is. The bottom line is that the C++ grammar is so idiotically hard to parse that building reliable incremental parsers for it is very, very difficult. This is the real reason languages like C# (or even Java) can offer faster and more informative syntax completion and similar features.
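
For a taste of why (standard textbook examples, not a compilable program): the same token sequence can mean entirely different things depending on declarations the parser may not have seen yet.

// The same tokens parse differently depending on what the names turn out to be:
x * y;          // multiplication... or a declaration of y as pointer-to-x, if x is a type
Foo bar();      // a function declaration, not a default-constructed Foo (the "most vexing parse")
a < b, c > d;   // two comparisons... or a declaration of d with the template type a<b, c>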

Quote: Seventh, concurrency. Every imperative language fails at it.


Again, this is a problem that we are directly (and I daresay successfully) attacking in Epoch.


Quote: Eighth, memory management. The fact that I can overload global new and control memory allocation in critical situations is a plus. However, those instances are few and far between, so there isn't much of a reason to have that feature in other languages.


Except it means that we can't implement things that do need manual memory management, unless we blend C++ and Some Other Language. This is messy, impractical for many projects, and difficult for most programmers (believe it or not). Keeping everything within a unified, single-language framework makes a lot of sense. Once again, see Epoch.
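
For anyone unfamiliar with the feature being discussed, this is roughly what overloading global new looks like -- a minimal sketch with a made-up byte counter, not anyone's production allocator:

#include <cstdlib>
#include <new>

static std::size_t g_bytes_allocated = 0;   // hypothetical bookkeeping

void* operator new(std::size_t size)
{
    g_bytes_allocated += size;               // route every allocation through our counter
    if (void* p = std::malloc(size))
        return p;
    throw std::bad_alloc();
}

void operator delete(void* p) noexcept
{
    std::free(p);                            // size isn't tracked on free in this sketch
}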

Quote: Ninth, data types. The standard type uint32_t is guaranteed to be 32 bits. This is no longer a portability issue.


Except as you noted the bulk of C++ code is actually legacy stuff anyways, which predates uint32_t. If the language had been standardized on type sizes to begin with (which would have made it useless for portable systems programming back in the day and so arguably would have been a bad decision at that time) then these problems wouldn't have arisen.
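
For reference, the fixed-width types come from C99's <stdint.h>, exposed to C++ as <cstdint>; the legacy assumption they replace looks like the second line here:

#include <cstdint>

std::uint32_t packed = 0xDEADBEEFu;   // exactly 32 bits wherever the type exists
unsigned long legacy = 0xDEADBEEFu;   // 32 bits on some platforms, 64 on others --
                                      // the kind of assumption old code bakes in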

Wielder of the Sacred Wands
[Work - ArenaNet] [Epoch Language] [Scribblings]

Quote: Original post by ApochPiQ
One of the things I want to do properly from day one in Epoch is to support very direct mapping of code to hardware concepts. For instance, integer is always 32 bits, and integer8, integer16, and integer64 types will all be provided. There is no support for type munging or implicit conversions; if you want to convert a 32-bit integer into an 8-bit integer, you have to explicitly cast it; depending on the cast operator used, you can optionally throw an exception if the conversion overflows (or permit truncation if that's more sensible, as is sometimes the case when bit-twiddling). No more accidentally losing sign bits or other valuable data because the compiler implicitly fucked your types for you.

...

Moreover, there will be support for adding your own custom type constructors and other data directly to the VM, so if you want to write a port of Epoch for a platform with a 13-bit byte, there's nothing stopping you - and that code will be visibly different from, say, 32-bit or 64-bit code, because the types will be integer13 instead of some mysterious "char".


The way I'm solving this in my language is to have @int (er, types are prefaced with an @) take an integer template parameter. Like @int<32>, or @int<64>, etc. Templated types using all the default parameters can drop the brackets, so @int would use the default parameter, which would probably be the word size of the machine (32 or 64 bits, probably). Technically it doesn't take an integer, as that's hilariously circular, it takes a @bitwidth, but whatever.

Finally, as a command-line argument, you'd preface compilation by telling the compiler which bitwidths were acceptable.
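
A rough C++ analog of that design, for readers who don't speak _goat's syntax -- the Int alias here is my own invention, just sketching the width-as-template-parameter idea with a defaulted word size:

#include <cstdint>
#include <type_traits>

// Map a bit width to a concrete integer type; default to the machine word size.
template <int Bits = sizeof(void*) * 8>
using Int = std::conditional_t<Bits == 8,  std::int8_t,
            std::conditional_t<Bits == 16, std::int16_t,
            std::conditional_t<Bits == 32, std::int32_t, std::int64_t>>>;

Int<32> a = 0;   // analogous to @int<32>
Int<64> b = 0;   // analogous to @int<64>
Int<>   c = 0;   // analogous to plain @int -- falls back to the word size
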
[ search: google ][ programming: msdn | boost | opengl ][ languages: nihongo ]
Quote: Original post by ApochPiQ
Except as you noted the bulk of C++ code is actually legacy stuff anyways, which predates uint32_t. If the language had been standardized on type sizes to begin with (which would have made it useless for portable systems programming back in the day and so arguably would have been a bad decision at that time) then these problems wouldn't have arisen.


Arguably, types such as uint32_t (or uint32 - gah, I hate the _t annotation) should have been part of the spec from the beginning. Too late for that now.

[OpenTK: C# OpenGL 4.4, OpenGL ES 3.0 and OpenAL 1.1. Now with Linux/KMS support!]

Quote: Original post by _goat
er, types are prefaced with an @


Can I ask why?

What advantage does that give to the language?


One of the biggest things that annoys me about C and C++ is the use of & and * when dealing with pointers. I would much rather type out a 3- or 4-character keyword for either operation and have it be completely clear when I come back to the language after not touching it for months, without even having to think about which one does what.
Old Username: Talroth
If your signature on a web forum takes up more space than your average post, then you are doing things wrong.
Quote: Original post by Talroth
One of the biggest things that annoys me about C and C++ is the use of & and * when dealing with pointers. I would much rather type out a 3- or 4-character keyword for either operation and have it be completely clear when I come back to the language after not touching it for months, without even having to think about which one does what.

template < typename T >       T& deref(       T* ptr ) { return *ptr; }
template < typename T > const T& deref( const T* ptr ) { return *ptr; }
template < typename T >       T* ptr  (       T& ref ) { return &ref; }
template < typename T > const T* ptr  ( const T& ref ) { return &ref; }

That said, if you're seriously having trouble keeping them apart, consider completing a small project abusing pointers for pass-by-reference, and possibly for hand rolling arrays everywhere, instead of abusing the above. This may also help you learn what to look for when trying to unravel or debug poorly written code.


Noting this (somewhat odd) dichotomy may also help:
type* defines a pointer type          *expression results in a reference value
type& defines a reference type        &expression results in a pointer   value


Use it enough and a few months won't dull your memory of it.
Quote: Original post by MaulingMonkey
Use it enough and a few months won't dull your memory of it.


But that is the thing: I have no reason or desire to use C that much. Pointers are easy enough to understand; it is the second-guessing when scanning a page of code, wondering whether you have things the right way around in your head. Then it usually comes down to looking it up to double-check, or just testing it in a scrap script.

Not to mention the number of different ways to format the code and its layout. I think that was the key part that made me like Python.
Old Username: Talroth
If your signature on a web forum takes up more space than your average post, then you are doing things wrong.
Quote: Original post by Talroth
Quote: Original post by MaulingMonkey
Use it enough and a few months won't dull your memory of it.


But that is the thing: I have no reason or desire to use C that much. Pointers are easy enough to understand; it is the second-guessing when scanning a page of code, wondering whether you have things the right way around in your head. Then it usually comes down to looking it up to double-check, or just testing it in a scrap script.

So you dislike C's syntax because you don't want to learn C, basically? This is rather circular reasoning. Taken to its logical extreme, the argument could apply to any and all syntax, which is self-defeating: languages need some form of syntax. Even Python ;)

With C, you at least have a legitimate complaint in the symbols having multiple purposes (* being dereference, pointer declaration, and multiplication -- & being address-of, reference declaration, bitwise AND, and part of logical AND), to the point that keeping them straight requires slightly more than casual knowledge of C -- but I can't see that applying to e.g. "@int", which would presumably work to disambiguate to both parser and coder that this is a type and not e.g. a variable -- and nothing more.
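
Spelled out, the overloading in question (plain C++, nothing exotic):

int  value = 6 * 7;                       // * as multiplication
int* ptr   = &value;                      // * declaring a pointer, & taking an address
int& ref   = *ptr;                        // & declaring a reference, * dereferencing
int  mask  = value & 0xFF;                // & as bitwise AND
bool both  = (value > 0) && (mask > 0);   // && as logical AND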

To play devil's advocate, I can see two (IMO, weak) arguments against the @ prefix:
1) C# lets you use @ to name things after reserved keywords (@class, etc), which it could be confused with if you happen to know C# well enough to know that's even legal
2) Syntax should be kept to a bare minimum to avoid turning the language into Perl, J (sorry KnightTemplar!), or worse, APL.

(A stronger argument is that class names should be treatable as constant 'variables' as classes should be storable as values, and that such prefixes draw arbitrary distinctions between the two for no good reason -- but that doesn't seem to be related to your line of argument)
ApochPiQ

Out of interest, are you expecting anyone to use your language, or is it a research/hobby project only?
It's very much intended for real-world use.

Wielder of the Sacred Wands
[Work - ArenaNet] [Epoch Language] [Scribblings]

Quote: Original post by MaulingMonkey
So you dislike C's syntax because you don't want to learn C, basically?


No, I dislike C's syntax because parts of it are needlessly ambiguous, and I have better things to do with my time. (I.e., be productive and actually get things done. Programming already makes up only a small portion of what I need to do, and it is usually done in an easier-to-read-and-write language, like Python. C is the fallback when part of a Python script runs below acceptable levels.)

I like C and how nearly all of its syntax works, but I do not like the chosen 'keywords' used for pointers. Other than people not lining up their braces in the same column, or failing to indent properly, I can't think of anything else that annoys me about working in the language.
Old Username: Talroth
If your signature on a web forum takes up more space than your average post, then you are doing things wrong.

This topic is closed to new replies.
