My graphics card recently gave up, so I have the task of purchasing a new one, right before Christmas might I add, when money is slim.
Anyway, I usually just read the reviews and get an idea of what I want that way, rather than looking into the specifications. But this time I was curious and did look them up.
I currently have an 8800 GTX
and was looking at the GTX 480.
The number of cores seems to have quadrupled, and the memory bandwidth has more than doubled, but I was quite surprised to see that the fill rate had barely increased: 37 billion/sec on the 8800 versus 42 on the 480.
And in fact the 9800 GTX seems to have an even higher fill rate (43) than the 480.
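As a sanity check, those figures seem to line up with core clock times the number of texture units. Here's a quick C++ sketch of that arithmetic; the clocks and unit counts are the commonly published reference specs, so treat them as approximate:

#include <cstdio>

// Back-of-the-envelope check of the quoted fill rates. Texture fill rate
// is roughly core clock x texture units; the figures below are the
// commonly published reference specs, so partner cards may differ.
int main()
{
    struct Card { const char* name; double clockMHz; int texUnits; };
    const Card cards[] = {
        { "8800 GTX", 575.0, 64 },  // G80
        { "9800 GTX", 675.0, 64 },  // G92
        { "GTX 480",  700.0, 60 },  // GF100 with 15 of 16 SMs enabled
    };

    for (const Card& c : cards)
    {
        // MHz x units = millions of texels/sec; divide by 1000 for billions.
        double gtexels = c.clockMHz * c.texUnits / 1000.0;
        std::printf("%-8s ~%.1f Gtexels/s\n", c.name, gtexels);
    }
    return 0;
}

That reproduces the 37/43/42 figures almost exactly, so the spec sheet itself checks out.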
Was wondering if someone could shed some light on this?
Unless all you do is render fullscreen quads with an empty pixel shader, fill rate isn't going to be your bottleneck. It's a pretty meaningless statistic for modern GPUs, since the amount of work done per pixel is so much higher than it used to be in the olden days. For the one case where it might matter (depth-only rendering), Fermi's ROPs jump from 8 pixels per clock to 64 pixels per clock, so it's not an issue.
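To put rough numbers on that, here is a small C++ sketch comparing the pixel rate a game actually needs against the theoretical peak. The resolution, overdraw, and frame-rate values are made-up but typical assumptions:

#include <cstdio>

// Rough estimate of the fill rate a game actually consumes, versus the
// theoretical peak. Resolution, overdraw and fps are illustrative
// assumptions, not measurements.
int main()
{
    const double width    = 1920.0;
    const double height   = 1200.0;
    const double overdraw = 4.0;   // average times each pixel gets written
    const double fps      = 60.0;

    const double needed = width * height * overdraw * fps; // pixels/sec

    // GTX 480: 48 ROPs at a 700 MHz core clock.
    const double peak = 700e6 * 48;

    std::printf("Needed: ~%.2f Gpixels/s\n", needed / 1e9);
    std::printf("Peak:   ~%.1f Gpixels/s\n", peak / 1e9);
    return 0;
}

Even with generous overdraw you need around half a Gpixel/s against a peak of over thirty, which is why per-pixel shader cost dominates long before raw fill rate does.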
Either way you should look at actual benchmarks instead of theoretical numbers.
Before considering a 480, you should also read the discussions about the performance problems with these cards. There is something that appears to be a "deliberate issue" which makes texture upload and PBO download extremely slow, to the point where the 480 is considerably slower than a 9800GT in some scenarios.
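If you want to check whether your own workload is affected, timing the readback path is straightforward. A minimal C++ sketch, assuming a current OpenGL context and GLEW are already set up elsewhere (the helper name is made up):

#include <GL/glew.h>
#include <chrono>
#include <cstdio>

// Hypothetical helper: reads the framebuffer back through a pixel-pack
// buffer (PBO) and returns the elapsed milliseconds. Requires a current
// OpenGL context; error checking omitted for brevity.
double timePboReadback(int width, int height)
{
    GLuint pbo = 0;
    const GLsizeiptr size = GLsizeiptr(width) * height * 4; // BGRA8

    glGenBuffers(1, &pbo);
    glBindBuffer(GL_PIXEL_PACK_BUFFER, pbo);
    glBufferData(GL_PIXEL_PACK_BUFFER, size, nullptr, GL_STREAM_READ);

    const auto t0 = std::chrono::high_resolution_clock::now();

    // With a pack buffer bound, the last argument is an offset, not a pointer.
    glReadPixels(0, 0, width, height, GL_BGRA, GL_UNSIGNED_BYTE, nullptr);

    // Mapping blocks until the transfer finishes, so it ends up in the timing.
    if (void* ptr = glMapBuffer(GL_PIXEL_PACK_BUFFER, GL_READ_ONLY))
    {
        (void)ptr;
        glUnmapBuffer(GL_PIXEL_PACK_BUFFER);
    }

    const auto t1 = std::chrono::high_resolution_clock::now();

    glBindBuffer(GL_PIXEL_PACK_BUFFER, 0);
    glDeleteBuffers(1, &pbo);

    return std::chrono::duration<double, std::milli>(t1 - t0).count();
}

Run the same test on a 480 and on an older card; if the reports are right, the difference should be obvious.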
You can find a lengthy discussion about this over on the OpenGL forums.
It has been alleged (though of course nobody knows for sure) that this is an artificial limitation so people don't use GeForce cards for modelling, and nVidia gets to sell more Quadro cards. Apparently, mostly 3D modelling software is affected, but who knows. To what extent it will affect you, nobody can say; you'll have to find out yourself.
Personally, I'm considering buying an ATI card for the first time after being a total nVidia fanboy for over a decade. The reason being that what various people brought up on this account really looks like nVidia is deliberately crippling the card just to make more money, and that's not something I'm willing to support on principle.
Quote: Original post by samoth
Personally, I'm considering buying an ATI card for the first time after being a total nVidia fanboy for over a decade. The reason being that what various people brought up on this account really looks like nVidia is deliberately crippling the card just to make more money, and that's not something I'm willing to support on principle.

Agreed. Not to mention that I don't like their attitude towards double precision on consumer cards at all. The new 6000 series seems to be a good product (cheaper, mostly the same thing) for the time being.
Previously "Krohm"