
What am I Looking for in a computer

Started by February 17, 2024 07:02 PM
41 comments, last by Juliean 9 months, 1 week ago

JoeJ said:
No. C++ was built on top of K&R C. C is the foundation of C++. Nothing from C was removed from C++. So there is no similar foundation for both, one was extended to become the other.

Originally, yes, but C has also gained extensions over the years that are not part of C++. So what we have today is two very different languages, either way.

JoeJ said:
Do you mean the modern way is the intended way to use C++? If so, that's just an opinion. But from all C++ code i see, only a very small part is actually modern. (where modern means to me: code that i can not easily read : ) If you mean just ‘good modern C++’, then this does not change that C is still part of that.

I mean C++ that uses the full toolset that became available with the newer standards. If you are looking at C++ that still does manual new/delete everywhere instead of std::unique_ptr, that's not C++ anymore. It frankly doesn't matter what the particular C++ you look at does or doesn't do. Some people stick with older paradigms, for reasons valid or invalid, but more than enough people adopt the newer standards, and it's not logical to compare languages based on people who still code like it's 30 years ago.
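To make that concrete, here is roughly the kind of difference I mean (a trivial, made-up sketch; the Texture type and function names are hypothetical):

#include <memory>
#include <string>

struct Texture
{
    explicit Texture(std::string p) : path(std::move(p)) {}
    std::string path;
};

// Old style: manual ownership. Easy to leak on an early return or exception.
Texture* LoadTextureOld(const std::string& path)
{
    Texture* tex = new Texture(path);
    // ...anything that throws or returns early before the caller's delete leaks 'tex'...
    return tex; // the caller has to remember to call delete
}

// Modern style: ownership lives in the type, cleanup is automatic.
std::unique_ptr<Texture> LoadTexture(const std::string& path)
{
    auto tex = std::make_unique<Texture>(path);
    // early returns and exceptions can no longer leak the allocation
    return tex;
}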

JoeJ said:
All those languages adopt basic syntax, math and logic operations, function calls from C.

That's pretty reductive, IMHO. Math operators don't come from C, they are just what we use in regular math. Function calls have been around in languages long before C. Syntax? JavaScript doesn't even require ; at the end of lines. Comparing JavaScript to C is really stretching it. JavaScript is a dynamically/weakly typed language which bases its data storage on JSON-like objects. Ok, we have that 5% of syntactic similarity to C, but that's the end of it.

JoeJ said:
Nah. You should code those things out manually initially. Otherwise, it's a bit like using std::sort and then assuming you're done with learning about sorting.

But we may differ on the observed range of the learning process. I talk about the first weeks maybe. Not about becoming good or more effective, just about figuring out if you can do programming or not.

Well, let me ask you this: When you want to learn how to drive a car, do you learn how to drive that car or do you learn how to assemble a car? The former is what I'm saying you should do with programming; the latter is what you are suggesting when saying you should not start by using sort, but code it yourself. Sure, a driver's license education might include lessons about motors, maintenance and whatnot, but that's not the focus. Nothing you learn in any context starts out by going in depth into the nitty-gritty details. You always start with learning the broad picture, and how to solve the task at hand. Then, once you have a broader understanding, you can delve into the details. Doing it the other way around is possible, but much harder.
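Or, to stay with the sorting example: the “learn to drive first” version is something a beginner can write and understand on day one (a deliberately trivial sketch):

#include <algorithm>
#include <iostream>
#include <vector>

int main()
{
    std::vector<int> scores{ 42, 7, 19, 3, 88 };

    std::sort(scores.begin(), scores.end()); // solve the actual task first...

    for (int s : scores)
        std::cout << s << ' ';               // prints: 3 7 19 42 88
    std::cout << '\n';
}
// ...and study how sorting algorithms work internally later, once the broad picture is there.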

JoeJ said:
And what is a ‘modern high-level language’ at all? This does not mean something concrete, it sounds more like a marketing promise, or a goal. Even an ideology, in your case.

That means something very concrete:

https://en.wikipedia.org/wiki/High-level_programming_language

See the section about “Relative meaning”, which explains how C today is considered low level now.

JoeJ said:
Tbh, my adoption of newer features was so slow and smooth, maybe i just did not notice how much my coding has changed. Maybe i just simplify the experience down to ‘it's still just C, basically’, while somebody else, having the same experience, would notice a large difference.

Again, based on your own admissions and what I've seen of your code, you are not really writing modern C++ at all. Which is fine, but that's not the same experience as someone who picks up every new C++ standard that comes out and adopts all the features that make sense in their day-to-day work. You can correct me if I'm wrong here, but what do you even use of the newer C++ features?
I'm 100% convinced that that's the case, since again, from my own experience of pretty much using the entire range of C++23, the difference between C and C++ is night and day.

Juliean said:
Originally, yes, but C has also gained extensions over the years that are not part of C++. So what we have today is two very different languages, either way.

But i refer to the original C, not later variants such as C99. It's nitpicking anyway, since the difference between C, C99, C11, … is quite subtle and irrelevant to my point, which is: C++ is built on top of C, and thus the two languages are not different at all, as long as you only use the C feature set.

I assume you associate various programming paradigms with various languages and derive a difference from there. But that's subjective and not general, since you don't know which paradigms various programmers use or don't use.

Technically and historically, C++ extends C with new features, but those new features don't make the C foundation different or partial.
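For illustration, a tiny made-up fragment like this compiles unchanged as C and as C++, and that shared core is exactly the foundation i mean:

#include <stdio.h>

/* plain shared-subset code: valid C and valid C++ alike */
static int sum(const int* values, int count)
{
    int total = 0;
    for (int i = 0; i < count; ++i)
        total += values[i];
    return total;
}

int main(void)
{
    int values[] = { 1, 2, 3, 4 };
    printf("%d\n", sum(values, 4)); /* prints 10 */
    return 0;
}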

Juliean said:
If you are looking at C++ that still does manual new/delete everywhere instead of std::unique_ptr, that's not C++ anymore.

So you say people not using unique_ptr are not C++ programmers anymore, although they write C++ code and compile it with C++ compilers?

That's just wrong as well.
That's no different than saying ‘if you don't use the goto keyword, you're not a proper C/C++ programmer’.

Juliean said:
and it's not logical to compare languages based on people who still code like it's 30 years ago.

I do not compare languages based on people. Only you do this. I do not say ‘Code C like back then, it's good enough.’ But you say ‘Either you use and know all modern features, or you are outdated and inferior’, reading between the lines.

My proposal is to learn the core first, focusing only on the minimal feature set required to write a program, then learn the extended features later.

Juliean said:
That's pretty reductive

Exactly. Reduce to what's needed, so you have less to learn to understand how programs solve problems.

I do not say C invented math. But the syntax it introduced to express math and logic in programs has been adopted by the majority of languages.

Juliean said:
Comparing JavaScript to C is really stretching it.

Yep, it is stretching. But it works well enough to strengthen my point: C is the common denominator of all modern programming languages. It is thus a good idea to get started here.

Juliean said:
Well, let me ask you this: When you want to learn how to drive a car, do you learn how to drive that car or do you learn how to assemble a car? The former is what I'm saying you should do with programming; the latter is what you are suggesting when saying you should not start by using sort, but code it yourself.

Strongly disagree. People should know how the stuff they use works.

But in this case i may be indeed old fashioned.
People make good games in GameMaker.
People make programs passing the Turing Test, without knowing how it works.

Soon, people won't be able to code Pong from scratch.
But it does not matter. Just enter ‘code me a pong, dear assistant’ to the prompt.
And we don't need people to know how Pong or a computer works. The Machines know.

The problem: It turns out that people are no longer needed for anything. So we can just lay down and die.

Choose your ideology wisely.

Juliean said:
You can correct me if I'm wrong here, but what do you even use of the newer C++ features?

I use lambdas very heavily, templates pretty often.
Most of my structs and classes have some member functions, but a whole lot of them have no constructors.
I ignore compiler warnings telling me i should always initialize. That's for noobs. ; )

I use very little of features past C++11.
But that's not outdated. It is what most people do. At least that's my impression from looking up other code.

Actually i think that you are the exceptional case, not me.
You are the modern C++ guru here. Most of the modern C++ code i see is yours, actually.

It often makes me curious, and sometimes i adopt some of those newer ways you show.
Your contribution has great value, especially for people like me, who are too lazy to learn the latest stuff intentionally.

But your hyper-modern way is not the only way, no matter what the goal is.

Conclusion:
So far i have not heard any valid argument against getting started with C, so i'll keep recommending it.


JoeJ said:
But i refer to the original C, not later variants such as C99. It's nitpicking anyway, since the difference between C, C99, C11, … is quite subtle and irrelevant to my point, which is: C++ is built on top of C, and thus the two languages are not different at all, as long as you only use the C feature set.

Ookay, then let me bring up a serious point: If you use C++, and you only use the C feature set… then why in the love of fuck are you using C++? Why are you not using C? What's the point of C++ if you don't use any C++ features? Why would the language need to exist at that point? There is nothing that the shared C feature set in C++ does better than plain C. You are obviously also using C++ features, so you are invalidating your own point.

That's actually the main problem with your line of argumentation, which is why I'm just going to focus on this and then end this debate (because everything else is just splitting hairs at that point). Yes, if you only use the core features that are available in C (math, operators, if, switch, functions), then pretty much every language is “just like C”. But at the same time, you are discarding all the features that give those languages a reason to exist. Why would any language need to exist if the C feature set was all that's needed? Everybody would just use C. But it's obviously not enough. That's why new languages are developed. That's why new features keep getting added to languages. If you take the time to read through the papers and discourse that go into those new features, they are always based on some real-world scenario/workload: something that is either annoying to use, or cumbersome, or where a better way simply came up.

JoeJ said:
But that's not outdated. It is what most people do. At least that's my impression from looking up other code. Actually i think that you are the exceptional case, not me. You are the modern C++ guru here. Most of the modern C++ code i see is yours, actually.

Well, I really don't know where you are looking for code. Stack Overflow is full of C++ questions covering everything up to C++23, and there are tons of articles and projects. Just look at CppCon, where speakers who are industry veterans hold presentations about new and upcoming features. There will always be enough people stuck in the past, either willingly or unwillingly - just like there is still COBOL code around that can't be changed for legacy reasons. The German railway service was just looking for a Windows 3.11 administrator… jesus christ.
I'll admit that it's hard to prove with numbers who uses what and how often, but it is a fact that those modern standards are being used widely - proven by the simple fact that they keep being developed and that there are more than enough resources available about them. If you don't see that, you are living in a filter bubble, plain and simple.

JoeJ said:
So far i have not heard any valid argument against getting started with C, so i'll keep recommending it.

That's because you discard any valid argument, because you believe that learning it the hard way is the best way. If you think that learning how to sort is better done by coding it yourself than by using a library feature, then yeah, by that logic there is nothing wrong with using C in the slightest, and anything else is a distraction. That's utter nonsense in my mind though, so don't worry - I'll always be there to give the counter-argument 😉 You are free not to take it as valid if you want; for me it's always more about giving the people asking perspectives.

Juliean said:
which is why I'm just going to focus on this and then end this debate (because everything else is just splitting hairs at that point).

You must understand that splitting hairs is all this debate is and ever was - and, even more so, what your own contribution to this unsatisfying state of debate is.

Juliean said:
Ookay, then let me bring up a serious point:

No. That's not a serious point. You are kidding. The answer is too obvious to give once more.

Let's try something else.
There must be a way to empathize, followed by enlightenment for both of us.
I want to know why we quarrel although we love the same thing.
Tell me more about yourself. What was your path regarding programming languages, or your interest in computers altogether?

I'll lead the way.
My dad wanted to show me something. It was an Atari 2600. I was amazed by interacting with the image on the TV, and i wanted to do this too. I did not stop begging until i finally got it: the C64. It taught me BASIC programming with its manual. Took me some time to get it. But then i made simple games, an editor to draw animated sprites, programs to solve the penalty work from the math teacher.

It was nice, but it was not fast enough. I had read a lot about the C64, and i knew how it worked. BASIC did not let me make it work as intended. I needed an assembler. Finally i got one, from a disk magazine. Finally.
And i had this 1000-page book, All About the C64. I had all i needed to finally program for real, without those terrible abstractions of a BASIC interpreter, bwah. Nothing could stop me. Nothing could slow me down at this point.

I wrote my first assembly program. Omg, you should have been there. The sprite was moving so fast i could no longer see it! It was just a streaky blur across the screen, impossible!!!
I could not believe how fucking fast my C64 was! Finally i could do it - make the game of my dreams!
But then, sigh, my C64 died. My programming career was over. : (

I almost skipped the 16-bit era. Guitars, girls - i did not really miss computing that much.

But then my girlfriend wanted to show me something. It was a Doom clone on her B/W laptop.
So the same story again. I bought a big PC with a Pentium processor. And i had a CD-ROM with 1000 shareware programs and stuff, including Turbo C and those K&R book excerpts.
C was love at first sight. I could tell the computer precisely what it should do, like with assembly.
But C was high level and convenient! Assembly, by comparison, is like taking a shit while doing a handstand, you know? It's low level. C is high level.
I had all i needed to enter the new dimension of 3D.
After upgrading to the Watcom C++ compiler to get those sexy 32 bits, i started working on rendering textured triangles. Those triangles could clip each other, so i never had to touch a pixel twice.
It was fast enough to show simple 3D scenes at good fps. Not as fast as Quake, but i was on the right track…

After that, it was time to look at this other tutorial from the CD-ROM, about C++.
Pretty useful. Add member functions to structs - or classes, as they called them. Nice.
It also had a lot of other new options, like deriving classes from other classes. Why would i need this? I decided to ignore this one, and a few others as well.

Not much had actually changed. But as my projects became larger, C++ really helped with keeping things organized. I started to notice the advantage, and as the years went by, i adopted more of the features, which i still do to the current day.

Long story short, this is how i perceive the ‘increase in usability' of various programming languages. But i only list proper languages, ignoring all the redundant crap.

Machine code: 0%
Assembly: 30%
C: 100%
C++: 130%

Now my guess is: you are probably younger than me, your history and impressions differ, and you would give different ratings to the languages. You might not even know much about the low level. And thus you might not see the value and improvement of the C language.

On the other hand, i do not see the improvement of fully updated modern C++, since i do not use it yet. I can only make assumptions from impressions, likely overlooking some of its value.

By combining our gaps in knowledge, we narrow down to a pointless war about ideologies, which lack definition and purpose. The debate is flawed.

So maybe, by understanding the other better, we can learn a thing and do better in the future.
And then, but only then, is the right time to end a debate.

Correct me if i'm wrong.

So, I have not been programming for that long (5 years or so), and for the last two years I have given a few lectures to first-year students (people with zero programming experience). In general, my takeaway is that there are two ways to approach programming for beginners, and both are valid.

The first is starting from (almost) the complete beginning: an explanation of a computer with the von Neumann architecture, and some basic assembly instructions to give a general idea of how a computer works (or used to work).
Then we build the concepts of loops, functions and logic handling on top of those extremely simple instructions,
working our way up: explaining what a compiler is, what a linker is, what the OS (conceptually) does, what memory is, and so on. Moving up, next comes a language like C, and then you keep working your way up to a high-level language like Python. The advantage here is that you get a conceptual model of the whole thing and you learn the terminology, which helps you a lot when you need to google stuff. There are a few other advantages to this way, but moving on (this is what CS50 does - Harvard's introduction to programming; btw, it's free).

The other way is working your way from top to bottom, starting specifically with Python (not C#, or Java, or anything like that). Python is extremely intuitive for beginners, and usually there is very little extra code you need to type to get things done. Want to print something to the screen?
print("my message") is all you need to type to get something out. I feel Python is very intuitive.
This has the advantage that it is more fun, and you get your hands dirty on something nearly practical within like two hours; you can always experiment and play around (this is how I started, roughly in 2017 - MIT does this btw, in their free 2016 course on programming with Python).

The biggest drawback is that moving to a lower-level language like C is not easy. Python handles a lot of things for you, and switching to C makes you think: “why is it so annoying to work with strings or arrays, and why do I have to manage memory?”, to the point that learning C becomes a chore and you will not get anything out of it. I don't have good advice on how high-level programmers should learn C.

Now, why do you need to learn C? Again, while debatable, here are the reasons I have personally experienced and found useful for the students I taught.
The C language is limited; you have to do so many things yourself. Why is that a good thing? Because it forces you to solve problems, to think, and your mind gets used to the process: “Okay, I have this input and I want that output, how do I get there?” There is a mindset that traps some beginners, in which their way of thinking is (very) limited by how they use their language, or the first thing they do is look for a library that solves their problem (a library is someone else's code, published for others to use). C makes it harder to fall into that trap.
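For example (a toy exercise, not from any particular course): something a Python beginner would solve with one library call becomes a small hand-written loop in C, and that little bit of extra thinking is exactly the practice I mean.

#include <stdio.h>

/* Count how often a character occurs in a string - by hand, no library helper. */
static int count_char(const char* text, char wanted)
{
    int count = 0;
    for (; *text != '\0'; ++text)
        if (*text == wanted)
            ++count;
    return count;
}

int main(void)
{
    printf("%d\n", count_char("hello world", 'l')); /* prints 3 */
    return 0;
}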

Another advantage of using C (not just learning C) is that you get a lot more control than in 99% of all other programming languages. A C program may run 1000x faster than its equivalent Python program, and this is also the reason C (and C++) are used in game development. Your CPU is the same whatever language you use; C just gives you more control over how to direct that CPU power at your particular problem, allowing you to achieve crazy performance.


While unrelated to the question, I just found it easier - and that does not mean you have to do it this way - to learn C and then C++. Even though I used C++ first, I found that modern C++ hides so many things from me that I had no idea what the language was doing, so I eventually decided to learn C and then moved back to C++. I found this process very insightful (but everyone I told about it found it wasteful).



@JoeJ Thanks

AliAbdulKareem said:

While unrelated to the question, I just found it easier - and that does not mean you have to do it this way - to learn C and then C++. Even though I used C++ first, I found that modern C++ hides so many things from me that I had no idea what the language was doing, so I eventually decided to learn C and then moved back to C++. I found this process very insightful (but everyone I told about it found it wasteful).

So the difference is between quickly jumping in with a more complex and involved top-to-bottom approach, or a more foundational bottom-to-top one.

AliAbdulKareem said:
While unrelated to the question, I just found it easier - and that does not mean you have to do it this way - to learn C and then C++. Even though I used C++ first, I found that modern C++ hides so many things from me that I had no idea what the language was doing, so I eventually decided to learn C and then moved back to C++. I found this process very insightful (but everyone I told about it found it wasteful).

Well, that's probably because even in C, you don't actually know what the language is doing, because even C has layers upon layers of abstraction.

Take “malloc”. You might think that by calling malloc you know what's going on, compared to something like C++ making this automatic with its STL. Except you don't. The OS will attempt to allocate that memory on some heap. But even then, the address you get back is not actually an address into physical memory. Any OS nowadays has a virtual address space that is removed from the physical memory - especially in user-mode code. In fact, the OS might not even allocate any physical memory until you access the memory that has been “malloced”. Further, what goes on behind the scenes to then get this physical memory is even more involved. Now you have the BIOS, multiple buses, controllers, bridges… And if you go even further, you have the actual underlying mechanisms of how the hardware works - how electricity makes something like RAM do what it does. All of that is hidden from you.
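You can even observe that lazy behaviour yourself - hedged, because it depends on the OS and its configuration (overcommit on Linux, for example): in a sketch like the one below, the malloc usually returns immediately, and physical memory only becomes resident once the pages are written.

#include <cstdio>
#include <cstdlib>
#include <cstring>

int main()
{
    // 1 GiB: this typically only reserves virtual address space.
    const std::size_t size = 1024u * 1024u * 1024u;
    char* memory = static_cast<char*>(std::malloc(size));
    if (!memory)
        return 1;

    std::puts("malloc returned - resident memory is usually still tiny here");

    // Writing to the pages is what forces the OS to back them with
    // physical RAM, one page fault at a time.
    std::memset(memory, 0, size);

    std::puts("touched 1 GiB - now it is actually resident");
    std::free(memory);
    return 0;
}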

Even on the code side, you obviously have assembly. “malloc” might actually be a call, or it might be inlined. It might even be removed entirely by the compiler and replaced with some memory from the stack, if the size is known and the compiler can prove that the memory is not used beyond the current function (which might very well be the case after inlining) - to be fair, I don't 100% know if C compilers are allowed to do allocation elision, it might just be a C++ thing. But even then, you have no idea what the actual assembly that malloc executes is.
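As an illustration of that last point (for the C++ new case, since that's the one I'm sure is allowed; easy to check on Compiler Explorer, though not guaranteed for every compiler): with optimizations on, clang will typically boil this whole function down to returning 42, with no allocation left at all.

int NoAllocationLeft()
{
    int* value = new int(42); // the standard allows this allocation to be elided
    int result = *value;
    delete value;
    return result;            // optimized clang builds usually emit just: mov eax, 42 / ret
}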

But even if you do - it gets crazier. Modern CPUs don't actually execute the assembly they are given. They translate it into µops (micro-operations, technically written with the Greek µ), a process which can involve a multitude of things. One of them is register renaming: CPUs actually have way more physical registers than the assembly can express. So if you think

mov rax,0

actually sets the “RAX” register to zero, you are mistaken. The CPU will rename this to whatever physical register is currently unused. It might not even execute the instruction at all, if it has a register that it knows has already been zeroed and is not used again. It might transform instructions into different ones if it knows the other instruction is more performant. Any number of things, really, that make even assembly more of a “this is what I want to happen” language than a “tell the CPU to do exactly this” kind of language.
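(Small aside, and the details differ per microarchitecture: this is also why compilers prefer the xor idiom for zeroing - recent Intel/AMD cores recognize it at the rename stage and don't even need an execution unit for it.)

xor eax, eax   ; recognized zeroing idiom - typically resolved during register renaming
mov rax, 0     ; same architectural result, but a longer encoding that is not treated as specially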

Case in point - abstractions are everywhere in today's computers, and for good reasons. If you move from C++ to C, you might get a bit more of the picture. But you've actually just removed one tiny layer of abstraction from a process that is already abstracted beyond belief. Don't get me wrong - actually learning some of this stuff to get a better understanding can be helpful. It's just not universally helpful. Me learning assembly obviously cleared up some things and gives me a better understanding of how things work. But it's not something that is universally useful to everyone. Somebody who only wants to code some app doesn't need to know that stuff. If it helped you, good! - seriously. But then it becomes a matter of everybody deciding where their line in the sand is for how much you actually need to know, and where to stop - you obviously didn't go from C to assembly, but back to C++.
I also have to say that, in spite of this, I find your post much more reasonable and agreeable than what's been put forth in favour of learning C so far, even if I don't share the same conclusion. Aside from what I've already said, I'd just put a small asterisk on the performance part. Here's why:

While C(++) can and often will give you the best performance overall, it is also very dependent on actually knowing what you are doing, first of all. Somebody who has very little knowledge about the language, and about performance tuning, will probably achieve better performance in another language. At least if we are discarding languages like Python - while I'm pretty sure 1000x is an overstatement, Python (without a JIT) is an interpreted language, so it makes absolute sense that it's orders of magnitude slower than a natively compiled language.
Other languages like C# can actually still yield performance extremely close to C's, even beating C in certain scenarios. That's because a JIT (which is employed by C#) can compile code dynamically, based on the user's hardware and the current state of the app. That can allow it to select instructions that a C program cannot, because they are only available on very new CPUs (at least not unless you restrict your C program to run only on those sets of CPUs). It can de-virtualize based on which DLLs are currently loaded. It can remove branches that it knows cannot currently be executed. The downside is the overhead of the JIT compiler/logic. I haven't personally done many performance tests with those languages, to be fair. And I totally second that, as an expert, if you want maximum performance, and most importantly control over that performance, you use C++. But it's not as drastic as you put it, IMHO.

Juliean said:
Well, that's probably because even in C, you don't actually know what the language is doing, because even C has layers upon layers of abstraction.

You are making too many assumptions about me here.

Juliean said:
I don't 100% know if C compilers are allowed to do allocation elision, it might just be a C++ thing. But even then, you have no idea what the actual assembly that malloc executes is.

This should be a C++ thing (I have read the C99 spec, and I don't recall anything like that). And according to cppreference:

New-expressions are allowed to elide or combine allocations made through replaceable allocation functions.

This sounds to me like a new-operator thing.

Juliean said:
while I'm pretty sure 1000x is an overstatement

I worked professionally with Python, I built its source code with optimizations, and I can confirm it is on the order of 100x to 200x slower by default (this will vary depending on the program, but that is roughly the default state). Using default compiler optimizations on the C side, like -O3 alone on clang, will very probably use SIMD and makes the jump to 1000x probable and not an overstatement (again, it will vary, but I have seen it multiple times). I don't consider a JIT to be part of Python, because as far as I know (up until the last time I worked with it) the Python installer does not have, and never had, a JIT in it.
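To illustrate the SIMD part (a made-up micro-example, not a benchmark): a plain loop like the one below is routinely auto-vectorized by clang or gcc at -O2/-O3, while CPython executes the equivalent loop one interpreted step at a time.

#include <stddef.h>

/* clang/gcc at -O2/-O3 will typically turn this loop into SSE/AVX code automatically. */
void add_arrays(float* dst, const float* a, const float* b, size_t n)
{
    for (size_t i = 0; i < n; ++i)
        dst[i] = a[i] + b[i];
}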

Juliean said:
you obviously didn't go from C to assembly, but back to C++.

I did. But even back at university (EE), the first year was C++, the second year was C, and the third year was assembly.

Juliean said:
Other languages like C# can actually still yield performance extremely close to C's, even beating C in certain scenarios.

While I have heard such claims, e.g. from Chandler's CppCon talks, my limited experience with C# specifically showed me otherwise. For something as simple as copying RGB images, C# (even with fixed and unsafe) didn't come close to C; we are talking at least an order of magnitude here. (Inside Unity at least, the default C# copy was much slower, so I ended up coding it in C and loading it as a DLL.) I am not saying you are wrong, but this will probably require more tuning than it is worth.
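The native side of that can be tiny, by the way - roughly this shape (a simplified sketch, not my actual code; names made up), exported from a DLL and called from C# via P/Invoke:

#include <stdint.h>
#include <string.h>

#ifdef _WIN32
#define EXPORT __declspec(dllexport)
#else
#define EXPORT
#endif

#ifdef __cplusplus
extern "C" {
#endif

/* Copy a tightly packed RGB image (3 bytes per pixel) in one flat memcpy. */
EXPORT void CopyRgb(const uint8_t* src, uint8_t* dst, int32_t pixel_count)
{
    memcpy(dst, src, (size_t)pixel_count * 3);
}

#ifdef __cplusplus
}
#endif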

Juliean said:
That can allow it to select instructions that a C program cannot, because they are only available on very new CPUs

I am not exactly seeing your point about executing newer instructions; that sounds to me like a trade-off you definitely don't want to make for performance. If it may (or may not) execute faster on newer CPUs, it means it will definitely execute slower on older CPUs (and if your targets are mostly on newer CPUs, just make that the default). Usually you want a guarantee about the standard/worst-case scenario, rather than the maximum/lucky scenario.

BTW, Gentoo users (and I take their word for it, I only used it for about a week) compile apps from source instead of using prebuilt ones, to get those newer instructions.
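(Side note on the same theme: gcc and clang also let a single native binary check the CPU at run time and only take the newer-instruction path where it exists, so it doesn't have to cost you the older machines - a minimal, compiler-specific sketch:)

#include <stddef.h>

/* Tuned path: gcc/clang compile just this function with AVX2 enabled. */
__attribute__((target("avx2")))
static void add_avx2(float* dst, const float* a, const float* b, size_t n)
{
    for (size_t i = 0; i < n; ++i) /* auto-vectorized with AVX2 here */
        dst[i] = a[i] + b[i];
}

/* Baseline path that runs on any x86-64 CPU. */
static void add_baseline(float* dst, const float* a, const float* b, size_t n)
{
    for (size_t i = 0; i < n; ++i)
        dst[i] = a[i] + b[i];
}

void add(float* dst, const float* a, const float* b, size_t n)
{
    if (__builtin_cpu_supports("avx2")) /* query the running CPU once */
        add_avx2(dst, a, b, n);
    else
        add_baseline(dst, a, b, n);
}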


