Ok so I made this great scene, animated a character to act in it, created a storyboard, and recorded my own voice. (It's scary listening to your own voice.) So I start clipping the scene in real time and notice that my frame rates are pathetic, again! Have I just outgrown this graphics card? Coder said, "That card blows." While I'd never say that myself (it's been good to me), I do think it might be time to move on. The problem is that my computer has a 4x AGP slot and the card I'm running is 8x. Obviously it's backwards compatible, but does that mean I've only been getting half of the card's bus speed? And if so, it doesn't make much sense to get a new card until I get a new machine that can fully support it.
In retrospect, it may be time to step back from the graphics angle for a bit. I could always commit more time to my game. But I'm just the type that, if I'm not learning something new, I can't sleep at night. Not that I can sleep at night anyway. But the point is, I need a new angle. I hope I find the right one. Hahahahahahahahaha. Ok, so I guess comedy is out.
[edit] I just built a neural network class and a genetic algorithm class. This should get my mind off of graphics for a bit. Does anyone know of a good article describing the AI used in Black & White 1?
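For the curious, here's a rough sketch of how those two pieces can fit together: a tiny fixed-topology feedforward net whose flat weight vector is evolved by a genetic algorithm (with elitism) instead of backprop, shown here learning XOR. All names, sizes, and parameters are my own illustration, not the classes from the post, and this is not a claim about how Black & White itself worked:

```python
import math
import random

random.seed(1)  # deterministic demo

class NeuralNetwork:
    """Fixed-topology feedforward net: n_in -> n_hidden -> 1, tanh activations."""
    def __init__(self, n_in, n_hidden, weights):
        self.n_in, self.n_hidden, self.weights = n_in, n_hidden, weights

    @staticmethod
    def genome_length(n_in, n_hidden):
        # Each hidden unit: n_in weights + 1 bias; output unit: n_hidden weights + 1 bias.
        return n_hidden * (n_in + 1) + (n_hidden + 1)

    def forward(self, inputs):
        w = iter(self.weights)
        hidden = [math.tanh(sum(next(w) * x for x in inputs) + next(w))
                  for _ in range(self.n_hidden)]
        return math.tanh(sum(next(w) * h for h in hidden) + next(w))

class GeneticAlgorithm:
    """Evolves a population of flat weight vectors; elitism keeps the best two."""
    def __init__(self, genome_len, pop_size=40, mut_rate=0.15, mut_size=0.4):
        self.pop = [[random.uniform(-1, 1) for _ in range(genome_len)]
                    for _ in range(pop_size)]
        self.mut_rate, self.mut_size = mut_rate, mut_size

    def step(self, fitness):
        ranked = sorted(self.pop, key=fitness, reverse=True)
        next_pop = [list(g) for g in ranked[:2]]  # elitism: best two survive unchanged
        while len(next_pop) < len(self.pop):
            a, b = random.sample(ranked[:10], 2)  # breed from the top ten
            cut = random.randrange(len(a))
            child = a[:cut] + b[cut:]             # one-point crossover
            child = [g + random.gauss(0, self.mut_size)
                     if random.random() < self.mut_rate else g
                     for g in child]              # per-gene Gaussian mutation
            next_pop.append(child)
        self.pop = next_pop
        return ranked[0]

# Fitness: negative squared error on the XOR truth table (higher is better).
XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def fitness(genome):
    net = NeuralNetwork(2, 3, genome)
    return -sum((net.forward(x) - y) ** 2 for x, y in XOR)

ga = GeneticAlgorithm(NeuralNetwork.genome_length(2, 3))
start = max(map(fitness, ga.pop))
for _ in range(60):
    best = ga.step(fitness)
print("error before:", -start, "error after:", -fitness(best))
```

Because of elitism the best genome is never lost, so the error can only stay flat or shrink from generation to generation.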
OMG! Look, Ma, I'm being quoted!!
[grin]
Do you happen to use any vs2/ps2 shaders at all? The 5200 is known to have exceedingly bad performance in this area, so bad that it's viable for development use only. Games typically consider the 5200 DX8.1-class hardware and fall back to 1.1 shaders.
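For context on what that fallback looks like: under Direct3D 9 you compare the `PixelShaderVersion` field of `D3DCAPS9` against the `D3DPS_VERSION` macro and pick a shader profile accordingly. Below is a sketch of that decision in plain Python; the packed-version encoding mirrors the D3D macros from `d3d9caps.h`, but the profile names and the `pick_pixel_profile` helper are my own illustration:

```python
# Direct3D 9 packs pixel shader versions as 0xFFFF0000 | (major << 8) | minor
# (vertex shaders use 0xFFFE0000); this mirrors the D3DPS_VERSION macro.
def d3dps_version(major, minor):
    return 0xFFFF0000 | (major << 8) | minor

def pick_pixel_profile(caps_ps_version):
    """Choose the best shader profile the hardware claims to support."""
    if caps_ps_version >= d3dps_version(2, 0):
        return "ps_2_0"
    if caps_ps_version >= d3dps_version(1, 4):
        return "ps_1_4"
    if caps_ps_version >= d3dps_version(1, 1):
        return "ps_1_1"
    return "fixed_function"

# A GeForce FX 5200 reports ps 2.0 in its caps, so a naive check picks the
# 2.0 path; that's why games special-case it down to the 1.1 path for speed.
print(pick_pixel_profile(d3dps_version(2, 0)))  # ps_2_0 on paper...
print(pick_pixel_profile(d3dps_version(1, 4)))  # ps_1_4 (DX8.1-class)
```

The point of the comment above is exactly that the caps check alone isn't enough: the 5200 passes it, but runs 2.0 shaders so slowly that games override the result.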
Well, this only affects AGP data transfers (a good article about AGP can be found here), so it might be the culprit in your case, though I doubt it. Do you use huge numbers of dynamic buffers, updating them every frame?