
Something terribly wrong with everything

Started by Etnu, May 18, 2000 02:52 PM
32 comments, last by Etnu 24 years, 7 months ago
I was reading a few articles in the last week, and I noticed something. In 1969, we were able to put a man on the moon using something like less than 4 KB of memory in all the computers used. Today, NASA has more processing power than you would believe. Why? Did the distance from the earth to the moon get that much further?

Here is my theory. In the late seventies, around the time structured programming began, a sequence of events took place that triggered what is today's modern computing world. (My guess is probably monkeys with hammers.) When the first implementations of the structured languages came about (Turbo Pascal or later), someone forgot something, and caused the entire industry to forever be slaves to large amounts of overhead.

Why can't modern computers be run with much less than they do today? The way I see it, there is really no need for 500 MHz computers and tons of RAM. I honestly think the makeup of everything is backward (whether it is Win32, Linux, Mac OS or anything) and horribly bloated. Isn't it feasible that a PocketPC could power Quake3 at 90 FPS? After all, the computer is only drawing a 2D picture on the screen. The only objects even attempted to render are ones that are possibly visible. Why do I need a $300 video card to get that kind of performance? I think that I should be able to do this sort of thing (using MODERN GUIs and programming languages as advanced as VC++) without as much overhead as we have now. I think that there is some sort of thing that the hardware manufacturers missed that is preventing me from being able to do all that I want on a 386. (Yeah, I know there is a lot of background stuff too, but that too should follow a similar principle.) I know some price is going to be paid for better tools, but I think that when these tools were written, they were written wrong.

In my theory, we should be able to replace an entire Server Farm with a single PII 233 MHz. Maybe there was some meeting a long time ago where hardware and software vendors agreed to degrade performance in order to speed the growth of new technology (which sounds reasonable), but if that's the case, isn't now about the right time to fix those mistakes? Or is it too late?

This same theory can apply to things like fiber optic cables too. Why can't I send a 500 MB file across the country on a telephone wire instantly? Copper wire transfers electrical current much faster than that. I am just so damn fed up with it all. Even if you're writing in assembly you will have some performance issues (albeit the time spent programming doesn't, in my opinion, justify using it). It's the sort of thing that makes me want to stand up and SCREAM.

I think if properly done, Unreal Tournament should run at 80 FPS on a 133 MHz chip with No 3D card and 16 MB of RAM. Windows 2000? 33 MHz, 8 MB RAM. It goes on and on like this. If you don't see why, then I'd like you to explain. If you agree, then I applaud you.

Etnu

What is a man without goals? A dead man.

---------------------------Hello, and Welcome to some arbitrary temporal location in the space-time continuum.

You have to remember that most of the things humans have invented so far have a MAX of 30% efficiency. So a 133 MHz chip running at full efficiency equals a 443 MHz computer running at 30% efficiency (443 MHz × 30% ≈ 133 MHz), which sounds about right.

As for the modem problem, no copper is pure, so signals decay and become garbled over long distances. That's why they're sent slower, so things won't get too noisy.

However, I have to agree with you on how bad things are... I was reading a book about home computing in the '80s (read "gaming") and wished I had a simple OS like they had.

lntakitopi@aol.com | http://geocities.com/guanajam/
I see what you're saying, but I doubt we're even getting that. I think we're pushing something like 5% efficiency.

What is a man without goals? A dead man.

---------------------------Hello, and Welcome to some arbitrary temporal location in the space-time continuum.

"Did the distance from the earth to the moon get that much further?"

Funny quip, but irrelevant. The mission in 1969 was, by today's standards, very dangerous. Ask any person at NASA if they'd feel comfortable using an exact replica of the Apollo hardware, and you'd get a resounding "NO."

"When the first implementations of the structured languages came about(Turbo Pascal or later), someone forgot something..."

And that would be... What?

"Why cant modern computers be run with much less than they do today?"

Because they have to manipulate data that is several orders of magnitude more complex, and they also deal with exponentially greater amounts of it. The concept of a 32-bit 1600x1200 TARGA image didn't even exist in the mainstream back when I was using a Tandy 1000. Hell, today's 1 GHz machines are far more powerful than the machines that would fill several rooms at NASA back in the day.

"The way I see it, there is really no need for 500 MHz computers and tons of RAM."

You are mistaken.

"Honestly think the make up of everything is backward (weather it is Win32,Linux,Mac OS or anything) and horribly bloated."

Are you honestly trying to say that experienced software engineers with master's and PhD degrees are writing everything backwards? Excuse me?

"Isn't it feasible that a PocketPC could power Quake3 at 90 FPS?"

Not in the least. At least, not at this point in time.

"After all, the computer is only drawing a 2D picture on the screen."

By this statement it occurs to me that you have a horrible misunderstanding of how these things work. Go to college, learn how PCs work.

First, even if all you were doing was pasting a 2D image to the screen, you still have to account for copying all those bits when blitting. Do a search for "Dirty Rectangle", and you'll probably come up with some discussions about how moving a full-sized image (even at 800x600) isn't the fastest thing on Earth.
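
To put rough numbers on that, here is a quick sketch of the raw memory traffic involved in just copying a full screen around. The resolution, color depth, and frame rate are illustrative assumptions, not figures from this thread:

```cpp
#include <cstdio>

// Back-of-the-envelope cost of a full-screen blit.
// Assumes 800x600 at 32 bits per pixel, 60 frames per second;
// real numbers depend on the bus, the card, and the pixel format.
int main() {
    const long width = 800, height = 600;
    const long bytesPerPixel = 4;            // 32-bit color
    const long framesPerSecond = 60;

    long bytesPerFrame  = width * height * bytesPerPixel;
    long bytesPerSecond = bytesPerFrame * framesPerSecond;

    std::printf("Per frame:  %ld bytes (~%.1f MB)\n",
                bytesPerFrame, bytesPerFrame / (1024.0 * 1024.0));
    std::printf("Per second: ~%.1f MB/s just moving pixels\n",
                bytesPerSecond / (1024.0 * 1024.0));
    return 0;
}
```

That is on the order of 100 MB/s of copying before a single gameplay calculation happens, which is exactly why dirty-rectangle schemes exist.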

Besides, Quake3 is a 3D game. There are serious amounts of mathematical computation going on behind the scenes: hierarchical animation, collision detection (polygonal or bounding-box intersection testing), physics, lighting, processing audio, waiting on input, etc. All this, and you still need to transform your stuff into view space and paste it on the screen.
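
For a feel of what "transform your stuff into view space" means per vertex, here is a minimal sketch; the struct and layout are my own illustration, not engine code:

```cpp
// One vertex transformed by a 4x4 row-major matrix:
// 16 multiplies and 12 adds. A Quake3 scene repeats this
// (plus lighting, clipping, and projection) for thousands
// of vertices, every frame.
struct Vec4 { float x, y, z, w; };

Vec4 transform(const float m[4][4], const Vec4& v) {
    Vec4 r;
    r.x = m[0][0]*v.x + m[0][1]*v.y + m[0][2]*v.z + m[0][3]*v.w;
    r.y = m[1][0]*v.x + m[1][1]*v.y + m[1][2]*v.z + m[1][3]*v.w;
    r.z = m[2][0]*v.x + m[2][1]*v.y + m[2][2]*v.z + m[2][3]*v.w;
    r.w = m[3][0]*v.x + m[3][1]*v.y + m[3][2]*v.z + m[3][3]*v.w;
    return r;
}
```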

Why the hell can't we have a complete simulation of reality running on my fucking abacus?!? Linus, you dork, you can't even write an OS worth the bits it's made of!

Sorry. ;]

"The only objects even attempted to render are ones that are possibly visible."

Of course, but these consist of huge numbers of polygons each. Plus, you have to do testing for everything in the world just to see if it falls into the view frustum. Admittedly there are plenty of techniques for bulk arbitrary discarding, but there simply is no algorithm in existence that is going to give you 600 FPS in Quake3.
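
As a concrete example of that per-object testing, here is a sketch of a sphere-versus-frustum check, the kind of cheap rejection test an engine runs for every object, every frame. The plane and sphere representations are assumptions for illustration:

```cpp
struct Plane  { float nx, ny, nz, d; };  // normals point into the frustum
struct Sphere { float cx, cy, cz, r; };

// Reject an object if its bounding sphere is entirely outside
// any of the six frustum planes. Passing this test doesn't make
// the object free to draw -- it just earns the right to be processed.
bool sphereInFrustum(const Plane frustum[6], const Sphere& s) {
    for (int i = 0; i < 6; ++i) {
        float dist = frustum[i].nx * s.cx
                   + frustum[i].ny * s.cy
                   + frustum[i].nz * s.cz
                   + frustum[i].d;
        if (dist < -s.r)
            return false;  // fully outside this plane
    }
    return true;  // possibly visible
}
```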

"Why do I need a $300 video card to get that kind of performance?"

To offload much of the processing from software running on the CPU into a hardware environment where it can be done much faster.

"I think that I should be able to do this sort of thing (using MODERN GUIs and programming languages as advanced as VC++)without as much overhead as we have now."

If you can, then in my book you're a better programmer than myself, anyone I know, and even Tim Sweeney, Mike Dussault and John Carmack.

"In my theory, we should be able to replace an entire Server Farm with a single PII 233 MHz."

That's not a theory; that's just a statement without any sort of background whatsoever. In my theory, a unicorn impaled Bill Clinton and roasted him over an Orc's campfire to share with his friends... the current president is a Sorceress using illusionary spells to hide the truth.

"Maybe there was some meeting a long time ago where hardware and software vendors agreed to degrade performance in order to speed the growth of new technology(Which sounds reasonable)..."

Far from reasonable. That's highly unlikely; it isn't even rational thought. The only times hardware vendors have come together is to standardize protocols and features, and even then it was usually through a standards body that wasn't part of any particular hardware vendor.

"Why cant I send a 500MB file across the country on a telephone wire instantly?"

Blah blah blah. Obviously you can't even read a 500 MB file off of your hard drive "instantly", as the drive heads can't move that fast. Then you have to pipe it out to your network card and through the network. For both fiber optic and copper wiring, you have degradation due to "noise", which comes from a large number of outside influences, plus a simple cap on how much unique data can possibly be sent at one time.
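
To put a number on "instantly": a rough calculation assuming an ideal 56 kbit/s dial-up line with zero protocol overhead (real throughput would be worse):

```cpp
#include <cstdio>

// Time to push a 500 MB file through a 56 kbit/s modem link.
int main() {
    const double fileBytes      = 500.0 * 1024 * 1024; // 500 MB
    const double lineBitsPerSec = 56000.0;             // 56 kbit/s, ideal

    double seconds = fileBytes * 8.0 / lineBitsPerSec;
    std::printf("%.0f seconds (~%.1f hours)\n", seconds, seconds / 3600.0);
    return 0;
}
```

That works out to roughly 21 hours, and that's before noise, retransmission, and the hard drive on either end get a say.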

"I think if properly done, Unreal Tournament should run at 80 FPS on a 133 MHz chip with No 3D card and 16 MB of RAM."

Yeah, Tim Sweeney and Brandon Reinhart are a couple of lousy, know-nothing programmers. ;]

"If you agree, then I applaud you."

Again, you have provided absolutely no information or any analysis as to WHY you should be able to simulate a living person on a calculator.
Creativity is a bloody nuisance and an evil curse that will see to it that you die from stress and alcohol abuse at a very early age, that you piss off all your friends, break appointments, show up late, and have this strange bohemian urge (you know that decadent laid-back pimp-style way of life). The truly creative people I know all live lousy lives, never have time to see you, don't take care of themselves properly, have weird tastes in women and behave badly. They don't wash and they eat disgusting stuff, they are mentally unstable and are absolutely brilliant. (k10k)
You're forgetting a whole hell of a lot of background work that gets done in everything you mentioned.

Examples: Back in the day, going to the moon, vector displays were used, meaning it only took a couple of bytes to draw a picture, but that picture had to be made up of monochrome lines. Also, everything was pre-computed on those trips to the moon; they spent months and months before a trip planning for everything they could and having it ready for when the trip was actually made. Having all that planned out in advance makes a big difference.

As for a PocketPC powering Quake3? Well, for starters, Quake3 tends to be played on displays of 640x480x16-bit color and up, which takes 614,400 bytes of memory just to display, not to mention that the processor would need to draw each pixel several times over, making all kinds of fetches each time for blending and the like. Even in an ideal scene with zero overdraw and no lighting or alpha going on, that's still probably twice per frame if they clear it before each frame, not to mention the Z-buffer used for placing actors in the world. Then you've got all the non-visible processing going on. Sure, only the stuff that could be visible is attempted to be drawn, but you have to figure out what that stuff is before you can try to draw it. With the sheer number of polygons in a given Quake3 level, that's a heck of a lot of processing. And it gets done 30+ times a second. We're using all the power we have in today's games, plain and simple.
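
His framebuffer arithmetic checks out. Here is a quick sketch of the totals, using the two-writes-per-pixel and 30 fps figures from his post; the 16-bit Z-buffer size is my own assumption:

```cpp
#include <cstdio>

// Per-frame pixel traffic for 640x480 at 16-bit color.
int main() {
    const long w = 640, h = 480;
    const long frameBuffer = w * h * 2;  // 16-bit color: 614,400 bytes
    const long overdraw    = 2;          // clear + one write per pixel
    const long zBuffer     = w * h * 2;  // assumed 16-bit depth buffer

    long bytesPerFrame = frameBuffer * overdraw + zBuffer;
    std::printf("Framebuffer: %ld bytes\n", frameBuffer);
    std::printf("Per frame (overdraw + Z): %ld bytes\n", bytesPerFrame);
    std::printf("At 30 fps: ~%.1f MB/s of pixel traffic\n",
                bytesPerFrame * 30 / (1024.0 * 1024.0));
    return 0;
}
```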

Oh, and don't forget all the other stuff. What about playing sounds, and making those bots think? It's doing all that 30+ times a second too. And reading keyboards and mice and joysticks, and sending information back and forth over the internet. You didn't think they were actually just squandering all that extra processor time, didja?

Note that this is just games. I agree on some other applications, like maybe a browser or a word processor. I don't know where all that processing time is going. Probably to the little paper-clip.

Ok, rant done

Jonathan

P.S. Gotta love those same-time posts

Edited by - Jonathan on May 18, 2000 5:01:31 PM
OOOOOOOOOOH, revolver told you!

He's right. If they could do what you're suggesting, Etnu, then why don't they?
The astronauts going to the moon also had a UI consisting of a few buttons which were either lit up or not lit up.

You need a lot of that memory for something called a Graphical User Interface.

You know that web browser you're using right now? Your OS is doing several things for it:

1. holding the executable in memory
2. drawing it on the screen
3. waiting for you to click a menu
4. waiting for you to perhaps load some document, like a Word file or whatever.

These things do not happen magically. You need DLLs in memory for that. Otherwise you would sit there and wait five minutes for them all to be loaded on demand and the "unused" resources to be flushed out of memory, only to be re-loaded once again the next time you need them.
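
For the curious, this is roughly what loading on demand looks like. LoadLibraryA and GetProcAddress are real Win32 calls, but the DLL and function names here are made up for illustration:

```cpp
#include <windows.h>
#include <cstdio>

// Loading a DLL on demand instead of keeping it resident.
// Every on-demand load pays for disk I/O, relocation, and
// initialization -- which is why the OS keeps common DLLs
// sitting in memory.
typedef int (*SpellCheckFn)(const char*);

int main() {
    HMODULE dll = LoadLibraryA("spell.dll");   // hypothetical DLL name
    if (!dll) {
        std::printf("load failed\n");
        return 1;
    }
    SpellCheckFn check =
        (SpellCheckFn)GetProcAddress(dll, "CheckWord"); // hypothetical export
    if (check)
        std::printf("result: %d\n", check("teh"));
    FreeLibrary(dll);  // and now you pay the load cost again next time
    return 0;
}
```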

Now, I'm the first person to agree that Windows uses more memory than it should, but you actually DO need most of it.

Where do you think Quake is storing all of those models and textures that are flying all over the screen? On the hard drive? Nope. They're in memory, and there's lots of them.


Edited by - Buster on May 18, 2000 5:18:26 PM
The reason I say it should be possible is that the same speed issues of days past seem to grow exponentially. I am NOT saying that people like Sweeney and Carmack are bad programmers. What I'm saying is that the foundations of all programming languages, and the structure of computers in general, seem to be lacking a great deal of efficiency. I'm sure that if, from the ground up, from C++ down to assembly down to binary, a few missteps hadn't been made, Unreal Tournament could have been programmed with the same level of difficulty as it was, but using only a fraction of the resources. I DID account for the math going on in the background, but the math is the part that I think was first screwed up (maybe just calculating more decimal places than needed, but who knows). The very amount of effort required to draw the screen and refresh it seems like too much to me. Honestly, I think the roots lie in the very way computers were created (no, I'm not saying that the foundations of computers are completely wrong, nor could I do any better), which may have something wrong with it.

If you take one drop of food coloring, then gradually dilute it exponentially, you will eventually reach a certain level of dilution. Now, say that drop is .01 mm off of the volume used in the calculation; by the time you reach a few levels of dilution, the margin of error from the original calculation is very far off. The same idea applies to computers. If (theoretically) every second byte somehow had an extra bit tagged onto the end of it, just think how much that wasted memory accounts for by the time you get to gigabytes of data.

All modern programs have bugs. Do you think this is a new thing? Was the ENIAC bug free? Also, the hardware can be screwed up intentionally. Several receipt-type printers that I have seen have bugs that cause the paper to be cut several lines before the end of the receipt. To offset this, programmers make the paper cut several lines later. When the printer makers finally realize what has happened, they attempt to fix it, but realize that if they fix the bug, millions of programs will have to be rewritten to compensate for the fix (the result being a receipt that is several lines too long, wasting paper).

I think that the people getting PhDs are being taught to follow the same guidelines as those who teach them. Binary has not really been around long enough to fully determine whether everything is absolutely correct. Just like many mathematical formulas, it will take thousands of years before every little thing about a given technique can be justifiably accepted as 100% right. Do you think that "cavemen" had no system of counting, sorting, adding, etc.? Eventually, the primitive systems these people used were refined by discoveries made by the Egyptians, the Greeks, and many other cultures. Programming is part of computer SCIENCE, and like any science, things will be changed forever. Only a few hundred years ago, people had no idea of the notion of molecular particles, and eventually the ability to see these things became more evident. A few flaws 50 years ago could be what is preventing the unlocking of the full potential of the equipment we have made. Nature itself is still not getting 100% efficiency from everything it does, and we humans are far from nature.
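
His dilution analogy does have a real counterpart in computing: floating-point rounding error compounds over repeated operations. A minimal demonstration, with arbitrary illustrative numbers:

```cpp
#include <cstdio>

// 0.1 has no exact binary representation, so adding it a million
// times drifts measurably from the true answer -- tiny errors
// compound, much like the food-coloring example.
int main() {
    float sum = 0.0f;
    for (int i = 0; i < 1000000; ++i)
        sum += 0.1f;
    std::printf("computed: %f\n", sum);      // noticeably off
    std::printf("expected: %f\n", 100000.0f);
    return 0;
}
```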

What is a man without goals? A dead man.

---------------------------Hello, and Welcome to some arbitrary temporal location in the space-time continuum.

That is a nice guess you got there.

ECKILLER
You're right, nature is not 100% efficient. Nature created us, therefore we are not 100% efficient. Take, for example, the human brain. It's huge, and how much of it do we use? Would it not be better to make the brain smaller and use more of it? And give it the ability to grow new cells to replace old, defective ones?

This does boil down to computers, but not to the extent that you are believing. Adding line feeds so that the paper cuts in the right place is bad programming; it's a hack, a quick fix. Those do not live too long, unless it becomes too expensive to fix them.

Windows, I believe, has a bunch of bugs that came to be when large groups of people were making the thing and used different data types. In the end we get stuck with a bunch of code that translates the parameters between internal and external components of Windows. If someone sat down, I bet they could write an OS more efficient than DOS. (Oh wait, that would be Unix, right?)

As for precision, computers will never be 100% accurate. But hey, neither are you or anyone else in this world. What you think is green, someone might call emerald. However, we have tolerances: if something is not going to be noticeable, then why bother screwing with it?
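
A sketch of what such a tolerance looks like in code; the epsilon value is an arbitrary illustration:

```cpp
#include <cmath>

// Compare floats "close enough" instead of demanding exact
// equality. Picking the tolerance is a design decision,
// not a flaw in the machine.
bool nearlyEqual(float a, float b, float epsilon = 0.0001f) {
    return std::fabs(a - b) < epsilon;
}
```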

Oh, and as for games running on low-resource machines? Try consoles. The PSX only has 2 MB of RAM or something, and it was originally a 2D system too. It depends on what the machine is made for, and how big it is. A PC was made for general purposes. For 3D games you want a processor with a massive array of FPUs; that way you could transform the whole world at 130 MHz. Ultimately it will come down to that, as we push the barrier of the speed of electrons and shrink their pathways to the size of one atom. Then we will build parallel (did I spell that right?) processors for more speed.

There's my 42¢; I'm spent for the week.

OneEyeLessThanNone
Just remember, you only have two eyes. No more! So gouging your eye out will only impress your friends twice. No more!

This topic is closed to new replies.
