
What's up with NVIDIA lately?

Started by November 21, 2010 07:21 PM
30 comments, last by zedz 14 years, 2 months ago
Hi all!

I'm a big fan of NVIDIA, but I can't help feeling things have been going downhill lately compared to its competitors, and I wanted to know if you share my thoughts:

With the latest GeForce 400 series, OpenGL readback transfers are insanely slow (mainly affecting top 3D modelling applications), along with some other weird problems.
Also, NVPerfSDK is stuck at version 6.5 and doesn't work properly with the latest 2xx.xx generation drivers. NVPerfHUD still has BSODs that have been in there for ages. Meanwhile, ATI's redesigned GPU PerfStudio 2.13 (shame it doesn't have DX9 support) and Intel's GPA look superb.
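For what it's worth, the usual workaround for stalling readbacks is to stream them through pixel buffer objects so the copy can overlap with other work. A minimal sketch of the idea (standard GL 2.1 + PBO calls via GLEW; the resolution constants and the double-buffering scheme are just my own illustration, nothing vendor-specific):

// Double-buffered asynchronous readback via pixel buffer objects (sketch).
// Assumes a GL 2.1+ context; WIDTH/HEIGHT stand in for the real framebuffer size.
#include <GL/glew.h>
#include <cstring>

static const int    WIDTH = 1024, HEIGHT = 768;
static const size_t FRAME_BYTES = WIDTH * HEIGHT * 4;
static GLuint pbo[2];

void initReadback()
{
    glGenBuffers(2, pbo);
    for (int i = 0; i < 2; ++i) {
        glBindBuffer(GL_PIXEL_PACK_BUFFER, pbo[i]);
        glBufferData(GL_PIXEL_PACK_BUFFER, FRAME_BYTES, 0, GL_STREAM_READ);
    }
    glBindBuffer(GL_PIXEL_PACK_BUFFER, 0);
}

// Call once per frame: start this frame's readback, consume last frame's.
void readbackFrame(int frame, void* dst)
{
    int current  = frame % 2;
    int previous = (frame + 1) % 2;

    // With a PACK buffer bound, glReadPixels targets the PBO and returns
    // immediately instead of blocking until the pixels reach client memory.
    glBindBuffer(GL_PIXEL_PACK_BUFFER, pbo[current]);
    glReadPixels(0, 0, WIDTH, HEIGHT, GL_BGRA, GL_UNSIGNED_BYTE, 0);

    // Map the PBO that was filled last frame; by now its DMA should have
    // finished, so mapping shouldn't stall the pipeline.
    glBindBuffer(GL_PIXEL_PACK_BUFFER, pbo[previous]);
    if (void* src = glMapBuffer(GL_PIXEL_PACK_BUFFER, GL_READ_ONLY)) {
        std::memcpy(dst, src, FRAME_BYTES);
        glUnmapBuffer(GL_PIXEL_PACK_BUFFER);
    }
    glBindBuffer(GL_PIXEL_PACK_BUFFER, 0);
}

The one-frame latency is the price for not stalling; a direct glReadPixels into client memory forces the driver to drain the pipeline first.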

They launched DX11 hardware about 4 months after ATI.
Not to mention that their latest cards draw insane amounts of power and produce far more heat than their competitors' do.

NVIDIA's OpenGL drivers used to be the best out there. Currently, their GLSL parser is so permissive that I'm better off compiling my GLSL shaders on my mobile ATI card and looking at the warnings and errors it produces.
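The only half-decent defence I've found is to always grab and print the shader info log, even when compilation nominally succeeds, and to keep testing against a stricter compiler. Just a sketch of the helper I mean (the commented-out constructs are the kind of Cg-isms that tend to slip through a lax front end; illustrative, not an exhaustive list):

// Compile a GLSL shader and always dump the info log, since a permissive
// driver may silently accept code that a stricter one (or another vendor's)
// will reject.
#include <GL/glew.h>
#include <cstdio>
#include <vector>

GLuint compileChecked(GLenum type, const char* src)
{
    GLuint shader = glCreateShader(type);
    glShaderSource(shader, 1, &src, 0);
    glCompileShader(shader);

    GLint status = GL_FALSE, logLen = 0;
    glGetShaderiv(shader, GL_COMPILE_STATUS, &status);
    glGetShaderiv(shader, GL_INFO_LOG_LENGTH, &logLen);
    if (logLen > 1) {                       // print warnings even on success
        std::vector<char> log(logLen);
        glGetShaderInfoLog(shader, logLen, 0, &log[0]);
        std::fprintf(stderr, "GLSL log:\n%s\n", &log[0]);
    }
    if (status != GL_TRUE) {
        glDeleteShader(shader);
        return 0;
    }
    return shader;
}

// Typical examples of code a strict #version 110 compiler flags but a lax
// one lets through (illustrative only):
//     float brightness = 1;      // implicit int -> float, not legal in 1.10
//     float gain = 2.0f;         // the 'f' suffix isn't valid GLSL 1.10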

I almost forgot to mention that NVIDIA's drivers prevent PhysX from running in GPU mode if they detect a non-NVIDIA GPU as the primary graphics device; working around that requires patched drivers. They've cited business and stability reasons, but including a time bomb like that is lame and disloyal, both to customers and to competitors.
Speaking of driver disasters, watch out for ForceWare 196.75, which can burn out your GPU. Nice QA department...

Besides, another reason NVIDIA cards used to be preferred was their excellent Linux support. Today Intel and especially ATI have caught up (NVIDIA is still a bit ahead, IMO), and with ATI you at least get the choice between the open source drivers (for which ATI releases extensive documentation of their hardware, unlike NVIDIA) and the proprietary one.
NVIDIA, on the other hand, has opted to kill the open source 2D-only 'nv' driver. It wasn't great, but it was a light in the dark on distros like Debian, where the only way to get something preinstalled and working automatically is if it is fully compliant with their philosophy.
Plus, it was a nice fallback for when the proprietary driver unexpectedly failed to load.

Last but not least: a year or two ago I used to read a lot of 3D whitepapers from their developer site. Currently it's all about GPGPU (which has its market, my respect for them), and I find myself reading more and more whitepapers from ATI and Microsoft (plus the usual GDC & SIGGRAPH papers published by other companies or individuals).

I have to admit there are benefits. NVIDIA has put a lot of effort into HPC (high performance computing) for those who can afford it, Nsight looks great, and Cg is awesome.

Soooooooooo, all things considered, I still use NVPerfHUD, which is still a keeper for me (though I have to stick to older drivers for it), and it's a shame ATI's GPU PerfStudio doesn't support D3D9.
But I'm finding fewer and fewer reasons to go for GeForce cards. Probably one of the few things left is brand loyalty (which is starting to fade with all the issues I'm describing), and I feel more and more attracted to Radeon GPUs.

As a gamer, brand loyalty and good performance are probably enough to keep me choosing their products, but as a developer I need a company that cares about my needs. Among those needs are tools that support and ease my work, drivers that guarantee those tools don't break or kill my card, respect for their users' choices (i.e. no time bombs), and research papers that spark my creativity and motivate people to share knowledge about graphics programming.

I don't want to start a flame war. I'm just trying to point out some facts that have been emerging lately and that I can no longer ignore. I had to share this.
Am I alone in this line of thought? What do you have to say? Am I missing something?

[Edited by - Matias Goldberg on November 21, 2010 10:20:59 PM]
Quote:
Original post by Matias Goldberg
I have to admit there are benefits. NVIDIA has put a lot of effort into HPC (high performance computing) for those who can afford it, Nsight looks great, and Cg is awesome.

Here's a little secret - GPUs in HPC are vastly overhyped.

There is a class of problems where they excel, but these are not as common or as useful as one would like. Many crucial algorithms are branch-and-bound, so the GPU's worst case (heavy branching) not only cannot be eliminated, it is required for the algorithmic gains.

The other problem is memory, and bandwidth in general. If something can be partitioned out, there are often cheaper ways. Readback is a big no-no.

It mostly comes down to whether you can cast your problem as a bunch of huge matrix operations that still fit into memory.
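A quick back-of-envelope model shows why (all the bandwidth and GFLOPS numbers below are illustrative assumptions, not measurements of any particular card):

// Toy model: a discrete GPU only pays off when compute time dwarfs the time
// spent shipping data over PCIe. Numbers are illustrative assumptions.
#include <cstdio>

int main()
{
    const double pcie_gbps   = 5.0;    // assumed effective PCIe bandwidth, GB/s
    const double gpu_gflops  = 500.0;  // assumed sustained GPU throughput
    const double cpu_gflops  = 40.0;   // assumed sustained quad-core throughput

    const double bytes       = 1e9;    // 1 GB shipped each way
    const double flops_per_b = 2.0;    // arithmetic intensity of the kernel

    double transfer_s = 2.0 * bytes / (pcie_gbps * 1e9);           // upload + readback
    double gpu_s      = bytes * flops_per_b / (gpu_gflops * 1e9);
    double cpu_s      = bytes * flops_per_b / (cpu_gflops * 1e9);

    std::printf("transfer %.3f s, gpu %.3f s, cpu %.3f s\n",
                transfer_s, gpu_s, cpu_s);
    // At low arithmetic intensity the transfers dominate, and the CPU, which
    // never pays them, stays competitive despite having far fewer GFLOPS.
    return 0;
}

Crank the arithmetic intensity up a couple of orders of magnitude (big dense matrix multiplies) and the picture flips, which is exactly the class of problem GPUs are sold on.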

Per watt, given equivalent code, a quad core frequently wins out. For large problems, a Hadoop (Java) cluster running on thousands of cloud machines will work out better/cheaper.

And the computing power available today is adequate for easily vectorizable problems. For the rest, GPUs don't do much, or CPUs are on par with them. And with software OpenCL, TBB/ABB and similar, there are much more affordable and competitive choices for many problems.
I have been an nVidia loyalist for many years, so I hate to have to agree with you!

nVidia's OpenGL implementation is the reason I am seriously considering an ATi card. I'm working on a project that will be cross-platform, so I'm stuck with GL. It really sucks that nVidia's GL implementation is so lax that I can royally screw something up and never know it.

No, I am not a professional programmer. I'm just a hobbyist having fun...

I read some of the links you gave; that is just sad...

No wonder certain software developers are focusing only on ATI's high-end cards, to the point where they have a package selling the software together with the card.

I don't know what to say on this matter...but I hope OpenCL changes everything.
Another long-time NVIDIA user planning to switch here. I've been thoroughly unimpressed with the last two cards I've had. They're stupidly loud, stupidly hot, the fans don't even speed up until ludicrous heat levels (read: they NEVER speed up without a third-party tool, so the cards just overheat and crash), NVIDIA prances around saying "everything is fine", and the prices are always way higher than competing ATI cards.

Unless there's some massive change in the situation I'm switching for sure next upgrade.
Quote:
Original post by MarkS
nVidia's OpenGL implementation is the reason I am seriously considering an ATi card.
+1 here. Oh how times have changed...
I've been an nVidia loyalist since my first Riva TNT card... I even stuck with them through the FX series when everyone else was picking up the Radeon 9600/9800 cards.

Time to change.

I've had 4 8800 series cards die on me and need replacing now, and one is about to go in the Mac Pro. Not only that, but nVidia has been concentrating so hard on growing the market for GPUs beyond games/graphics that they have fallen behind AMD in both hardware and driver quality.

So it's been over 10 years of me buying nVidia only as a brand loyalist, but the next card in the coming months will be an AMD.
Quote:
Original post by Saruman
I've been an nVidia loyalist since my first Riva TNT card... I even stuck with them through the FX series when everyone else was picking up the Radeon 9600/9800 cards.

Time to change.

I've had 4 8800 series cards die on me and need replacing now, and one is about to go in the Mac Pro. Not only that, but nVidia has been concentrating so hard on growing the market for GPUs beyond games/graphics that they have fallen behind AMD in both hardware and driver quality.

So it's been over 10 years of me buying nVidia only as a brand loyalist, but the next card in the coming months will be an AMD.


I'm in the same boat. Loved my TNT!

I bought a GeForce GTX 275 to play Crysis on, probably nearly 2 years ago. I was recently talking to a workmate who was buying a new PC, and was stunned to see that I can barely buy an nVidia card anywhere here; it's all ATI stuff. I thought our country must have decided that gaming cards were no longer worth selling, but a little research seems to confirm that nVidia has dropped the ball.

Off a cliff.
they lost me around the gf4 days, or so. too much proprietary stuff, too many hacks, and too much faking in their drivers. i still remember how they wrote a special futuremark benchmark mode into their driver that just omitted all geometry known not to be in a frame. when you enabled free look-around, half the scene was just not there.

not to say ati never did any cheating, but nvidia cheated much more, and tried harder to hide it.

and i liked ati's papers much more, even in the old days. much more technical, much less wow. i preferred reading those.

then again, by now i don't even use gpus anymore. but if i did, it would be ati.

It's interesting how different AMD's philosophy is from nVidia's. nVidia has always relied on the 'halo effect' of holding the performance crown to sell their mainstream and entry-level offerings. Unfortunately, as a result they've engineered architectures that draw far too much power, throw off too much heat and cost too much to produce, even when scaled down for the mainstream market.

I view nVidia's current approach as brute force, and AMD's as a more elegant one. AMD essentially designs their architecture around a $200 price point and simply concedes to putting two chips on a card to serve the high end. I think it's pretty clear that this strategy has paid dividends for AMD for 2-3 generations now. I've owned nVidia cards (2MX, 4MX, 9600GT) and ATI cards (9800 Pro, 5770 (current)), and while I've been satisfied with both, my ATI cards have always been more impressive, drawn less power, and generally been quieter.

Being the smaller company, AMD has to play smart with their GPUs -- in much the same way that they play it smart against Intel. It works for them: they're something like 1/10th the size of Intel in the CPU business and roughly the same size as nVidia in the GPU business -- yet they have survived for years and even thrived for periods. They're more focused -- they have to be -- and it's been great for us consumers.


