
GPU drivers

Started November 26, 2011 01:49 AM
8 comments, last by Promit 12 years, 11 months ago
I just started Mass Effect 2. It ran at mainly 25 fps, but would jump up to 60 or down to 10 throughout play. The whole game was laggy. I updated my drivers and it's a solid 60 fps now. How is it that drivers are that bad? What exactly do they control to get that much more speed? That is a huge performance increase. Shouldn't drivers only need small updates, like new features? I would think they would get carried over to each new generation of cards and be fairly solid, but I'm obviously missing something if they fixed a bug or whatever and increased the card's performance by 250% in this game.

NBA2K, Madden, Maneater, Killing Floor, Sims


[quote]I just started Mass Effect 2. It ran at mainly 25 fps, but would jump up to 60 or down to 10 throughout play. The whole game was laggy. I updated my drivers and it's a solid 60 fps now. How is it that drivers are that bad? What exactly do they control to get that much more speed? That is a huge performance increase. Shouldn't drivers only need small updates, like new features? I would think they would get carried over to each new generation of cards and be fairly solid, but I'm obviously missing something if they fixed a bug or whatever and increased the card's performance by 250% in this game.[/quote]


For whatever ungodly reason you seem to be assuming that the drivers were written correctly in the first place.
[quote]... increased the card's performance by 250% in this game.[/quote]
I'd assume the 'bug' resulted in the driver requiring excessive CPU time, not actual GPU time. It's even possible that BioWare's QA/programming departments found and reported the performance issues, resulting in better driver-side performance in their game.
Heh.

So, I spent a little time at NVIDIA a few years ago. An enormous amount of the driver development effort is spent on getting new and recently released games up to full performance (I was there soon after the Vista transition, which made things worse). Any time you buy a new game, you're probably due for a driver update. And for all the major titles out there, the driver is reconfigured in subtle or not so subtle ways to make sure that game runs like it's supposed to. I know you're wondering why.

It's a fairly wide range of things, sometimes the driver's fault and sometimes the game's fault. Remember that there are a huge number of possible configurations out there, and games (like Mass Effect) are frequently brought over from the console world and carry assumptions from there. Other times they expose edges and corners in the driver, oddball code paths that aren't running quite right, or there's even a switch that has to be manually set for the correct fast path. Games also sometimes violate the specifications and get away with it on a particular driver version, vendor, or piece of hardware. Then a new GPU comes out, breaks the game, and the consumers are mad.

Some of the things I remember (vaguely and distantly):
* A number of games take DISCARD locks on buffers and blithely expect the data to still be there (see the sketch of the intended pattern after this list).
* One game never called BeginScene/EndScene. Not sure how they got away with that one.
* In one instance, a shader generator was in use, and the dev team discovered a bug that was emitting extra, pointless instructions. The trouble is they discovered it after going gold, so they came to NVIDIA, who then modified the compiler to hot-patch out the extra code just for that game.
* SLI situations are an unbelievable headache. I've personally never been convinced the technology was even worth developing for general use, and I would say that fully half of the bugs reported against the driver are SLI-related.
* Sometimes the driver skips normally mandatory steps when it can get away with it (for example, discarding a buffer on a lock that was NOT marked with the discard flag).
* Threading hassles. The driver is multi-thread aware, and depending on how things go that can be a win or a loss.
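To make the first bullet concrete, here's a minimal sketch of the lock pattern those games are supposed to follow, assuming a D3D9-style dynamic vertex buffer. The function, sizes, and names are made up for illustration:

// Minimal sketch (D3D9-style API): filling a dynamic vertex buffer with
// DISCARD/NOOVERWRITE locks. Buffer sizes and names are illustrative only.
#include <d3d9.h>
#include <cstring>

struct Vertex { float x, y, z; DWORD color; };

// Append 'count' vertices to a buffer created with D3DUSAGE_DYNAMIC.
HRESULT AppendVertices(IDirect3DVertexBuffer9* vb, UINT capacityBytes,
                       UINT& writeOffset, const Vertex* src, UINT count)
{
    const UINT bytes = count * sizeof(Vertex);
    DWORD flags = D3DLOCK_NOOVERWRITE;   // promise not to touch in-flight data
    if (writeOffset + bytes > capacityBytes)
    {
        // Buffer is full: ask the driver for a fresh block of memory.
        // After a DISCARD lock the previous contents are gone; reading them
        // back is exactly the "blithely expect the data to still be there" bug.
        flags = D3DLOCK_DISCARD;
        writeOffset = 0;
    }

    void* dst = nullptr;
    HRESULT hr = vb->Lock(writeOffset, bytes, &dst, flags);
    if (FAILED(hr)) return hr;

    std::memcpy(dst, src, bytes);        // write only; never read from 'dst'
    vb->Unlock();
    writeOffset += bytes;
    return S_OK;
}

A game that reads the mapped memory after a DISCARD lock only appears to work because some drivers happen to hand back the same pages, which is precisely the kind of assumption that breaks on new hardware or a new driver.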
SlimDX | Ventspace Blog | Twitter | Diverse teams make better games. I am currently hiring capable C++ engine developers in Baltimore, MD.
Doesn't the whole situation get even worse when driver writers are hacking together fixes for individual games and stuffing them into their drivers? Drivers would become a labyrinth of game specific special-case handlers.



#ifdef MASS_EFFECT
    FIX_MASS_EFFECT();
#endif

#ifdef CRYSIS_2
    FIX_CRYSIS_2();
#endif

// etc.
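In practice it's less a compile-time #ifdef and more a runtime lookup: the driver identifies the running executable and flips a profile of workaround flags for it. A purely illustrative sketch, with invented names and flags that don't come from any real driver:

// Illustrative only: per-application workaround profiles selected at runtime.
#include <string>
#include <unordered_map>

struct AppProfile {
    bool preserveDiscardedBuffers;  // tolerate reads after a DISCARD lock
    bool implicitBeginEndScene;     // tolerate a missing BeginScene/EndScene
    bool patchShaderOutput;         // strip known-bad generated instructions
};

// Hypothetical profile table keyed by executable name.
static const std::unordered_map<std::string, AppProfile> kProfiles = {
    { "MassEffect2.exe", { true,  false, false } },
    { "Crysis2.exe",     { false, false, true  } },
};

AppProfile LookUpProfile(const std::string& exeName)
{
    auto it = kProfiles.find(exeName);
    return it != kProfiles.end() ? it->second : AppProfile{};  // defaults otherwise
}

The point is simply that the special cases live behind a lookup on the running application rather than a wall of preprocessor conditions.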


[quote]* One game never called BeginScene/EndScene.[/quote]


How the flying f...rog did they manage that?
Why don't devs just code properly? How is this fair on vendors, or consumers? "hey nvidia, it's crytek here, can you fix our mistake please? we issued one draw call per triangle"
Probably the marketing departments are to blame as well, though. I mean, imagine the following conversation:


"hey is skyrim ready for release yet"
"no it still has more bugs than an anthill"
"but it won't be 11/11/11 for another 100 years, just stick it on the shelves, don't care what it's like"

Sorry. I'm up far too late, and felt like having a rant.

[quote name='hupsilardee' timestamp='1322369857' post='4888042']
Doesn't the whole situation get even worse when driver writers are hacking together fixes for individual games and stuffing them into their drivers? Drivers would become a labyrinth of game specific special-case handlers.
[/quote]
From what I remember, the NVIDIA driver clocked in at around four million lines of code supporting every GPU they ever made and practically every game worth mentioning. You'd be hard pressed to find any AAA title that large.
SlimDX | Ventspace Blog | Twitter | Diverse teams make better games. I am currently hiring capable C++ engine developers in Baltimore, MD.

[quote name='hupsilardee' timestamp='1322369857' post='4888042']
Doesn't the whole situation get even worse when driver writers are hacking together fixes for individual games and stuffing them into their drivers? Drivers would become a labyrinth of game specific special-case handlers.
From what I remember, the NVIDIA driver clocked in at around four million lines of code supporting every GPU they ever made and practically every game worth mentioning. You'd be hard pressed to find any AAA title that large.
[/quote]

I've worked on three major titles so far that were far larger than that. Big companies working on major games have huge numbers of libraries, all written from scratch. A few hundred thousand lines of networking code that runs lobbies, financial transactions, and VoIP, a few hundred thousand lines of rendering code that intelligently handles a huge number of cards, etc., and you quickly reach into the millions.

Then add in all those games' tools, servers, toolchains, and such; those are another few million lines.


[quote name='hupsilardee' timestamp='1322369857' post='4888042']
Doesn't the whole situation get even worse when driver writers are hacking together fixes for individual games and stuffing them into their drivers? Drivers would become a labyrinth of game specific special-case handlers.
From what I remember, the NVIDIA driver clocked in at around four million lines of code supporting every GPU they ever made and practically every game worth mentioning. You'd be hard pressed to find any AAA title that large.
[/quote]

They give every card the same driver? Why not cut out unnecessary code and rebuild for every card, thus saving memory? Or do they use #ifndef and so on to cut out huge swathes of code for different cards?

The whole situation just sounds horrifying and unsustainable to me.
Because they'd have a maintenance nightmare getting the right version installed. Particularly since they don't necessarily make the cards -- people use their chipsets and chip IP in other things. Plus, what's written on the card isn't necessarily enough to tell what the silicon is; if you have rev5 or above silicon, you don't need patch 687 in a routine because it was fixed in hardware...

It's a bit much to expect end users to know which revision of which chip design is installed with which memory architecture on the third-party-manufactured card fitted in a computer they might describe, if asked what kind they have, as "a white one".

It's much easier to have all the code in one lump, and then build internal structures, like function jump tables into the right lumps of code, by querying the card at boot time.
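Roughly the idea, as a minimal sketch (the device IDs, names, and table layout are invented for illustration, not any vendor's actual structure):

// Illustrative only: one driver binary selecting code paths per device at load time.
#include <cstdint>
#include <cstring>

struct BlitFuncs {
    void (*copySurface)(const void* src, void* dst, std::size_t bytes);
    void (*clearSurface)(void* dst, std::size_t bytes);
};

// Stand-ins for per-generation implementations, all compiled into one driver.
static void CopyGen4(const void* s, void* d, std::size_t n)  { std::memcpy(d, s, n); }
static void ClearGen4(void* d, std::size_t n)                { std::memset(d, 0, n); }
static void CopyGen5(const void* s, void* d, std::size_t n)  { std::memcpy(d, s, n); }
static void ClearGen5(void* d, std::size_t n)                { std::memset(d, 0, n); }

// Run once at load: query the device ID and wire up the jump table so the
// rest of the driver never has to branch on which silicon it's running on.
BlitFuncs SelectBlitFuncs(std::uint16_t pciDeviceId)
{
    if (pciDeviceId >= 0x0600)     // hypothetical "rev5 or above" cutoff
        return { CopyGen5, ClearGen5 };
    return { CopyGen4, ClearGen4 };
}

Everything ships in the one driver, but only the paths selected for the detected hardware ever run.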
Yeah, except all the builds aren't approved for all the cards. That's why the NVIDIA driver site makes you pick your exact GPU anyway.
SlimDX | Ventspace Blog | Twitter | Diverse teams make better games. I am currently hiring capable C++ engine developers in Baltimore, MD.

This topic is closed to new replies.
