
What's up with NVIDIA lately?

Started by November 21, 2010 07:21 PM
30 comments, last by zedz 14 years, 3 months ago
The GeForce 4MX was actually a rebranded, low-end GF2 with some minor upgrades. So it would give some poorer performance...
Quote:
Original post by Matias Goldberg
OpenGL readbacks are insanely slow
Worst of all, by all appearances this isn't even necessary. It really does seem to be something they do on purpose to discourage people from using their consumer cards for CAD and modelling. Except there are a hundred other valid reasons why you would want a readback, too. And except their consumer cards are expensive enough already.
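To be concrete about what a "readback" means here: pulling rendered or computed results from the GPU back into system memory, typically with glReadPixels. A rough sketch below (OpenGL 2.1-era C++; the sizes, formats and function names are placeholders of mine, not anything from Matias' post or NVIDIA's driver) shows the naive synchronous call next to the pixel-buffer-object route that is supposed to hide the stall:

#include <GL/glew.h>   // assumes a GL context and an extension loader are already set up
#include <cstring>

const int W = 1024, H = 768;        // placeholder framebuffer size
static GLubyte pixels[W * H * 4];   // destination in system memory

// 1) Naive synchronous readback: blocks until the GPU has finished rendering.
void read_back_sync()
{
    glReadPixels(0, 0, W, H, GL_BGRA, GL_UNSIGNED_BYTE, pixels);
}

// 2) Asynchronous readback via a pixel buffer object (PBO): glReadPixels
//    returns immediately; the data is mapped and copied a frame or so later.
static GLuint pbo = 0;

void read_back_async_begin()
{
    if (!pbo) {
        glGenBuffers(1, &pbo);
        glBindBuffer(GL_PIXEL_PACK_BUFFER, pbo);
        glBufferData(GL_PIXEL_PACK_BUFFER, W * H * 4, NULL, GL_STREAM_READ);
    }
    glBindBuffer(GL_PIXEL_PACK_BUFFER, pbo);
    glReadPixels(0, 0, W, H, GL_BGRA, GL_UNSIGNED_BYTE, 0);  // 0 = offset into the bound PBO
    glBindBuffer(GL_PIXEL_PACK_BUFFER, 0);
}

void read_back_async_end()              // call this a frame or two later
{
    glBindBuffer(GL_PIXEL_PACK_BUFFER, pbo);
    GLubyte* src = (GLubyte*)glMapBuffer(GL_PIXEL_PACK_BUFFER, GL_READ_ONLY);
    if (src) {
        memcpy(pixels, src, sizeof(pixels));
        glUnmapBuffer(GL_PIXEL_PACK_BUFFER);
    }
    glBindBuffer(GL_PIXEL_PACK_BUFFER, 0);
}

Mind you, the PBO path only hides latency; if the transfer rate itself is capped on GeForce, as people report, no amount of asynchronicity gets the bandwidth back.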
All of which would be OK with me if they were still the gold standard they once were. I don't mind paying 50-100 euros extra for the better product. But as a matter of fact, you pay three times as much as you would for an ATI card which performs better and has no deliberate quirks.
Apart from the few people who print money in their cellar, who can afford (or wants to afford) paying upwards of 3000 euros for a Quadro card that will need to be replaced in two years anyway? And I'm not even talking about consumers or indie developers.
Unless your company happens to be called Blizzard, that is just a ridiculous amount of money for something like a graphics card.

Quote:
insane amounts of energy
Oh come on, what's 250 watts between friends! Heck, that's nearly twice as much as my entire computer consumes at full load, according to APC's systray.
I wonder how those hardcore gamers deal with 4xSLI in their living room. That'll be like having an electric heater under your desk. Literally.

As much as I hate to see this development as a total nVidia fanboy, one really has to admit it. I've been sneering at ATI for decades because they always seemed to be the whiners making petty excuses for why their sucky cheap cards couldn't cope with "real cards", and because their drivers were such crap.
Think of simple things such as depth and stencil, or of when dynamic branching came out and ATI said "oh sure, nVidia is making a bit of hype, but it's all crap (because our cards can't do it)". Or remember when vertex texturing came out and ATI sold SM2 cards as SM3 and found all kinds of excuses for why their R2VB was so much better anyway (because their cards just couldn't do it). Half-way functional OpenGL? You wish.

And look at it today: ATI are the ones making the top-of-the-line cards. They work better, run faster, use less power, and cost a third of the price, too.
Quote:
Original post by JoeCooper
The GeForce 4MX was actually a rebranded, low-end GF2 with some minor upgrades. So it would give some poorer performance...


i remember friends who bought it thinking it was a low-end gf4. the lack of performance was understandable, the lack of features wasn't. no pixel shaders on that hardware meant most newer games of the time didn't run with all the details, which was the reason they bought a "gf4" in the first place: not the performance, but the feature set.


other things against nvidia: since the radeon 9700, ati released drivers bimonthly (or monthly?) that all went through microsoft certification. nvidia didn't for a very long time (do they regularly release microsoft-certified drivers now? i don't care enough about them anymore to check). i mean, how hard is it to get through the certification?

and the biggest thing imho: the crash statistics that came out of the vista class-action lawsuit showed nvidia's crappy driver as the number one cause of vista crashes in the os's first half year. it's not like they didn't have enough time to write a proper driver (ati did). they simply considered supporting that os unnecessary.

how ignorant does one have to be to say "certification from the os manufacturer of our main selling platform? we don't need that", or "the os manufacturer of our main selling platform is working on its next os? we don't need to support that".
If that's not the help you're after then you're going to have to explain the problem better than what you have. - joanusdmentia

My Page davepermen.net | My Music on Bandcamp and on Soundcloud

Quote:
Original post by JoeCooper
The GeForce 4MX was actually a rebranded, low-end GF2 with some minor upgrades. So it would give some poorer performance...


I was indeed aware of what it was when I bought it. Unfortunately, at the time my PC's AGP port was non-functional -- reading the manual years later, I found it claims to work only with 4x nVidia AGP cards, but I never bothered to try.

In any event, I hadn't recalled correctly -- I had first a 4MX and later a 5200, both PCI, and later still I purchased an MX 4000 (or some such) AGP card, just to have a cheapo AGP card handy. It served duty in a secondary PC for a while. That card was somewhat interesting -- it was a derivative of the GPU found in the original Xbox, from what I recall.

throw table_exception("(╯°□°)╯︵ ┻━┻");

I wanted to stay away from the GF4 MX and GeForce FX 5 series fiasco because it's too old. Furthermore, I believe NVIDIA has redeemed itself from those sins (with the 6, 7 and 8 series), and rumour had it the main reason for the fiasco was that their efforts were focused on the Xbox's chip.

Furthermore, between the 6 and 8 series NVIDIA did things right: they drove graphics innovation and supported everything they should, the way they should (unlike ATI... no VTF? Shader Model 2.0b?). The 8 series was a major step into the future, and their GL vp40/fp40 assembly profiles were awesome (IMHO it's too bad the ARB decided to go with GLSL). And their tools, NVPerfSDK and FX Composer, were new, unique and very useful.

Today those great tools are unmaintained, GLSL is a must in OpenGL (and NVIDIA's GLSL support sucks), and they're not driving innovation anymore.

The high-end GeForce 8800 had a high failure rate, but it was the beast no one could surpass even two years after its release, and the first of its generation.
Today... those problems persist, and there's no excuse. On top of that we have driver problems, and their underhanded practices are annoying (GPUs burning up with the 197.96 driver, the timebomb, deliberately degraded GPU readback).

So what I see is that they're shifting towards a not very bright future, even worse than in the FX 5 era.

BTW, you can track me as an NVIDIA user back to a Vanta card!
It's the great pendulum of a competitive marketplace. It's hard to be #1 forever.
Quote:
Original post by Matias Goldberg
Furthermore between 6 and 8 series, NVIDIA did things right


And that was pretty much the last time they did. With AMD's failure of a release in the R600, NV appeared to get too comfortable and began focusing on GPGPU.

At which point, out of the blue, AMD brought out the RV770 and pretty much changed the game: gone was the idea of huge chips, efficiency became the name of the game, and, well, NV have struggled to keep up ever since.

Well, I say 'keep up'; they haven't really tried. They've just kept on down the same road, and Fermi shows just how well that is working...
Quote:

NV appeared to get too comfortable and began their focus on GPGPU.

I agree with most of what has been said here, but to be fair nvidia's GPGPU efforts were IMHO (at least in their intensity) more a result of Intel's declaration of war than of too much comfort. Maybe what we are seeing now is just the result of this war on two fronts with Intel (GPGPU/embedded/ray tracing) on the one side and ATI/AMD (high-end cards/Fusion) on the other. I could imagine that stretched nvidia's resources quite a bit, at least as far as software is concerned.
I have to say, I don't get the concept of being "loyal" to something like a graphics card. Unless you are specifically targeting the capabilities of one line (which seems like a foolish thing to do), the choice of ATI or NVIDIA is purely dependent on performance and reliability.

I just decide on a rough budget, and pick the best performance/price point around that budget. I've swapped between NVIDIA and ATI several times.
if you think programming is like sex, you probably haven't done much of either.-------------- - capn_midnight
Quote:
Original post by ChaosEngine
I have to say, I don't get the concept of being "loyal" to something like a graphics card. Unless you are specifically targeting the capabilities of one line (which seems like a foolish thing to do), the choice of ATI or NVIDIA is purely dependent on performance and reliability.
Loyalty is crucial precisely because there is no difference. At the end of the day, individual titles will have unique performance characteristics that are mostly unrelated to the card itself and come down to a combination of many factors.

And while statistically one card will outperform the other, an individual user only cares about a single title.

Brand is a big deal.

The other side is price ranges. Brand loyalty may push sales in individual ranges higher, while new customers are acquired in the low and budget ranges. Winning the loyalty of those customers means they will upgrade later.

Then there is brand identity. When Title X starts up with the nVidia logo, it offers some comfort/recognition/... to those loyal to the brand.

Finally, there is brand goodwill. Nobody has ever spoken against Google. Everyone hates Microsoft. If your brand gets on the good side, you get a positive bias - not only from the media (which can be bought), but also from increasingly important viral branding: bloggers, enthusiasts, the geek in a group of friends whom people call when buying new machines.

Brand is all that still matters. Technology, reliability, performance, ... The majority of the market cannot evaluate those objectively anyway.

Here is an example of brand strength: "Macs do not crash and they never did." When did you last encounter people complaining about Macs crashing? "Oh, the bomb? It's cute." What do the numbers actually say - statistically, which crashes more often, Windows or Mac?

This topic is closed to new replies.
