
Is there a way to calculate total performance of a hardware to theorize what you would need to run it in software?

Started by June 03, 2012 06:14 PM
2 comments, last by irreversible 12 years, 8 months ago
Gah, it's a hard question for me to write. I'll make some examples... so maybe it can be understood better, or at all. -.-''


Let's say that I want to make an Xbox (not 360) class game in my own made-up language, SlowScript, which is 10 times slower than C/C++.
The Xbox has a 733MHz CPU and a 233MHz GPU.

Let's say I want to recreate Halo, exactly as it was on the console, but in SlowScript, with the same GPU. Would I need a CPU with 10 times the "power"?

Is there a way to calculate that? What are the important numbers? What is it that you look for when you want gaming performance? Is it FLOPS?

And then, what if I don't even want to use a GPU, and run it all on the CPU? There must be a way to measure all of this, right?
I guess this also leads into the whole software rendering side too. I haven't seen SWR games in years! Well, other than Minecraft and its "clones"?


I've just been thinking about this all day. Like, how far are we from just waking up one day and making Fable 1 in BASIC (o.O), all running on a CPU?
I know it's nonsense, and we're slowly moving to OpenCL and the cloud and stuff, but I felt like finding the answer to this question that I have.

So, if you can help me, right on!

There is no single number that represents performance.

Some games may be heavy on one aspect of a CPU's capabilities (e.g. integer math ops) and others may be heavy in other areas (e.g. SIMD floating-point ops). Some games may be more affected by cache characteristics than others. Then there are multicore aspects to worry about, and the rest of the machine hardware (RAM, front-side bus, I/O peripherals, etc. etc.).
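To make that concrete, here's a toy model (my own sketch, not an established formula): if each kind of work in a frame slows down by a different factor, the overall slowdown is the time-weighted average, not any single component's factor. The workload fractions below are made-up numbers, purely illustrative.

```python
# Toy model: estimate the overall slowdown of a program when each kind
# of work slows down by a different factor.

def overall_slowdown(workload):
    """workload: list of (fraction_of_time, slowdown_factor) pairs.
    Fractions must sum to 1."""
    assert abs(sum(f for f, _ in workload) - 1.0) < 1e-9
    return sum(f * s for f, s in workload)

# Hypothetical frame: 40% game logic (10x slower in "SlowScript"),
# 30% engine code in C/C++ (unchanged), 30% waiting on the GPU (unchanged).
mix = [(0.4, 10.0), (0.3, 1.0), (0.3, 1.0)]
print(overall_slowdown(mix))  # → 4.6
```

So even a language that's 10x slower doesn't automatically demand a 10x-faster CPU; it depends entirely on where the time actually goes.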

For reference, compare the classic console emulators that exist to the power of the machines they try to duplicate. Look at the machine requirements of a popular emulator and compare that to the original hardware. That should give you a vague idea of how much overkill it takes to emulate a machine properly. (Hint: it's pretty hefty.)

Wielder of the Sacred Wands
[Work - ArenaNet] [Epoch Language] [Scribblings]

Also consider multithreading, and you enter a whole new world of unpredictability, where one thread may stall another, resulting in decreased performance that occurs only at (almost) random, timing-related points.

And sometimes it may never occur at all, until a very rare edge-case scenario. Welcome to the programmer's worst nightmare: the phantom freezes...
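A minimal sketch of one thread stalling another, assuming the simplest possible setup (two threads sharing one lock): whichever thread holds the lock forces the other to wait. The result is deterministic, but exactly *when* each thread stalls varies from run to run, which is why this kind of slowdown is so hard to measure or predict.

```python
import threading

counter = 0
lock = threading.Lock()

def worker(iterations):
    global counter
    for _ in range(iterations):
        with lock:  # the other thread blocks here while we hold the lock
            counter += 1

threads = [threading.Thread(target=worker, args=(100_000,)) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)  # → 200000 (correct, but the interleaving differs every run)
```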
Comrade, Listen! The Glorious Commonwealth's first Airship has been compromised! Who is the saboteur? Who can be saved? Uncover what the passengers are hiding and write the grisly conclusion of its final hours in an open-ended, player-driven adventure. Dziekujemy! -- Karaski: What Goes Up...
The closest I can think of is the Windows Performance Index. However, it's based on the lowest common denominator, meaning that if you have a monster of a machine, clocking in at 7.0, with a feeble GPU, clocking in at 2.0, then your score will be 2.0.
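A sketch of that lowest-common-denominator idea: the overall rating is simply the weakest subscore, so one feeble component drags the whole score down. (The subscore names below are illustrative, not the actual Windows categories.)

```python
# Lowest-common-denominator scoring: overall = the minimum subscore.

def overall_score(subscores):
    return min(subscores.values())

machine = {"cpu": 7.0, "ram": 6.8, "gpu": 2.0, "disk": 5.9}
print(overall_score(machine))  # → 2.0
```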

This topic is closed to new replies.
